Whenever you open Instagram or TikTok, for example, you'll often notice that the videos and photos on your home feed match your interests. Weird, isn't it? It may seem strange, but it isn't: it happens because of recommendation algorithms.
These algorithms don't just feed you content that reinforces your own biases; they also isolate you from opposing views. The effect is so widespread that some sociologists worry it is becoming a form of "social control," a kind of "toxic personalization."
Take Facebook, for example. The company says its algorithm is built for your benefit, which isn't necessarily true. In practice, the algorithm is optimized for the company and its advertisers, showing you more ads than it otherwise would. Algorithms are not always neutral, and this is a case in point.
Algorithms can be useful when they give the right recommendations, but they become a frightening tool once they "predict" too much of our behavior and start steering us to do exactly what their operators want.
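To see how this reinforcement loop works, here is a minimal sketch of an engagement-maximizing feed ranker. Everything here is illustrative and hypothetical, not any platform's real algorithm: the idea is simply that posts on topics you already clicked on get ranked higher, so opposing views sink out of sight.

```python
from collections import Counter

# Hypothetical example data: each post belongs to one topic.
posts = [
    {"id": 1, "topic": "cats"},
    {"id": 2, "topic": "politics"},
    {"id": 3, "topic": "cats"},
    {"id": 4, "topic": "cooking"},
    {"id": 5, "topic": "cats"},
]

def rank_feed(posts, click_history):
    """Order posts by how often the user engaged with each topic."""
    interest = Counter(p["topic"] for p in click_history)
    # Posts on already-liked topics float to the top; everything
    # else (including unfamiliar or opposing content) sinks.
    return sorted(posts, key=lambda p: interest[p["topic"]], reverse=True)

# After clicking two cat posts, the feed leads with more cat posts.
history = [posts[0], posts[2]]
feed = rank_feed(posts, history)
print([p["topic"] for p in feed])
```

Each round of clicks feeds back into the next ranking, so the feed narrows over time: this is the filter-bubble dynamic in miniature.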