Information bubbles

Filter bubbles, alternative Internet, and attempts to define freedom of speech


Until recently, we broadly agreed on a description of the real world. The development of the Internet, however, seems to be eroding that shared narrative. Algorithms that choose the information we read and watch reinforce our beliefs and surprise us, time after time, with increasingly bizarre conspiracy theories. Their fans are ready to fight long online battles, for instance over the claim that the coronavirus is fake. In its early days, the Internet was meant to be a forum for exchanging views where everyone could be heard. We did not know that 30 years later it would become a tool for gathering people "in a bubble" around the idea of a flat Earth.

The name "filter bubble" describes the phenomenon of matching the content shown on the Internet to the recipient, adapting it to the profile of a specific user. In other words, we receive only the information that the algorithms of social networks such as Facebook, Twitter, and YouTube have identified as having the greatest influence on us.

Sometimes this will be content in line with our views; sometimes it will be something that annoys us so much that we waste time frantically exchanging comments about it. The way the algorithms and programs behind the filter bubble operate makes us passively consume content served by an algorithm designed not only by developers but also by teams of behavioral scientists, instead of actively searching for information and deepening our knowledge.
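The logic described above can be sketched in a few lines. This is a minimal, hypothetical illustration: real platform ranking models are proprietary, and the signals and weights below are illustrative assumptions, not any platform's actual algorithm. The key point is that both agreeable and enraging content raise the engagement score.

```python
from dataclasses import dataclass

@dataclass
class Post:
    title: str
    matches_user_interests: float  # 0..1, similarity to the user's profile
    predicted_outrage: float       # 0..1, likelihood of heated replies
    recency: float                 # 0..1, newer posts score higher

def engagement_score(post: Post) -> float:
    # Both content that confirms our views and content that enrages us
    # keep us on the platform, so both raise the score.
    # Neutral, merely informative content ranks lowest.
    return (0.5 * post.matches_user_interests
            + 0.3 * post.predicted_outrage
            + 0.2 * post.recency)

def rank_feed(posts: list[Post]) -> list[Post]:
    # Serve the most "engaging" posts first, regardless of accuracy.
    return sorted(posts, key=engagement_score, reverse=True)

feed = rank_feed([
    Post("Neutral news item", 0.2, 0.1, 0.9),
    Post("Confirms my views", 0.9, 0.2, 0.5),
    Post("Makes me furious", 0.3, 0.9, 0.5),
])
print([p.title for p in feed])
```

Note that the neutral item ranks last even though it is the freshest: under such a scoring scheme, recency alone cannot compete with content optimized for an emotional reaction.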

It is worth remembering that because of the filter bubble we see only a small, tailor-made portion of the available information. As a result, despite having access to virtually unlimited sources and resources, we learn about the world mainly from the places that present the vision that suits us best.

In this context it is worth mentioning confirmation bias: the tendency to prioritize information that meets our expectations over information that is inconvenient and contradicts our worldview. Building a bubble around ourselves is therefore consistent with our nature, but it is undoubtedly our nature's dark side, one that Homo sapiens in an age of relative enlightenment should recognize and reject.

The protagonist of the popular Netflix documentary "The Social Dilemma" talked in Joe Rogan's podcast about how content-filtering algorithms work. Listen to learn how deeply they manipulate the way we perceive the world and how this affects our emotional state as well.

Nowadays, algorithms that sort and profile content are present in almost every application that can be described as social. To illustrate how many aspects of our lives are subject to algorithmization, consider recent news about a popular music service. According to the music magazine Pitchfork, Spotify has obtained a patent for technology that would use recordings of a user's speech and background noise to determine what kind of music to recommend. Yes, we are supposed to let music streaming applications eavesdrop on us so that they can serve us the most fitting recommendations.

In practice, the application may keep asking us: "How are you? What do you feel like doing today?" to adjust the music to our current mood. It is hard to imagine a more bizarre situation than having our feelings monitored by an application whose job is to let us listen to the new record by the Golec Brothers.
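Stripped of the signal processing, the idea reported in the patent coverage reduces to a simple mapping from an inferred emotional state to a recommendation. The sketch below is purely illustrative: the mood labels, playlist names, and fallback behavior are assumptions for the example, not anything from Spotify's actual patent or API.

```python
# Hypothetical mood-to-playlist mapping; labels and names are invented.
MOOD_PLAYLISTS = {
    "happy": "Upbeat Pop",
    "sad": "Mellow Acoustic",
    "stressed": "Calm Ambient",
}

def recommend(inferred_mood: str, default: str = "Daily Mix") -> str:
    # Fall back to a generic mix when the mood signal is unrecognized.
    return MOOD_PLAYLISTS.get(inferred_mood, default)

print(recommend("sad"))       # a mood the system claims to detect
print(recommend("confused"))  # an unrecognized state falls back to the default
```

The controversial part is not this trivial lookup but its input: the "inferred_mood" argument would come from continuously analyzing recordings of our voice and surroundings.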

#learntocode or what is hate speech?

Over the last 20 years, the Internet has evolved into a place that is increasingly difficult to define in a uniform way. Regulators perceive it one way, the largest technology companies another, and users have yet another vision, one that is not even consistent among themselves. We do not know how to build a stream of information that is fair, inclusive, and realistic without restricting freedom of speech.

Social networks evolve virtually every day, changing their face. The more users a given medium has, the greater the challenges faced by the owners of Facebook and Twitter. Not so long ago, these services streamed information chronologically, without passing it through any filter. Facebook was the first to abandon this model, and Twitter followed in its footsteps in 2016, dropping the chronological newsfeed.

At the beginning of the first decade of the 21st century, the Internet was a kind of Wild West: content was removed only occasionally, and the notion of intellectual property practically did not exist. Over time, not only has the presentation of information changed, but also what may be presented at all. Bots that remove copyright-infringing content appeared on YouTube, along with the option to report violent or abusive material. Among the newest components of social networks are the fact-checking tools implemented before the US elections at the end of 2020.

The Internet has come to resemble the Greek agora, where everyone can say what they want, less and less, and is moving ever closer to a moderated television debate, where not everything may be said. Social networks are known for frequently updating their policies, which is why it is increasingly difficult to determine what is allowed and what is not.

An interesting example from two years ago is the mass campaign in which Twitter users sent journalists messages with the hashtag #learntocode. It was a response to news of layoffs at BuzzFeed and other Verizon Group companies. Some Twitter users considered it a good moment to teach a lesson to those who in the past had instructed others, for example truck drivers or miners, that it was high time to retrain. Judging the whole campaign remains one of the most complicated questions of what is allowed under freedom of expression.

On the one hand, the #learntocode action was mass harassment of individual journalists, who received hundreds of such messages (a matter addressed by, among others, Twitter CEO Jack Dorsey). On the other hand, there is the arbitrary decision of Twitter, a private technology company, to permanently remove many users who joined the campaign "encouraging" journalists to retrain. To this day, the case of #learntocode keeps appearing in discussions about freedom of speech on the Internet and on social platforms.

Interestingly enough, portals such as Facebook introduce fact-checking tools as part of the presidential campaign on the one hand, and on the other see nothing wrong in cultivating closed groups built around anti-vaccine subjects, alternative medicine, flat Earth, or even a theory about a secret pedophile ring trafficking children in the back rooms of pizzerias, to which Bill Clinton and Barack Obama, among others, are supposed to belong.

Alt-right and the alternative Internet

In the current social media environment, everyone seems to be a victim. On the one hand, voices are raised about private corporations restricting freedom of speech; on the other, many people complain that the platforms have become propaganda tools in political fights and propagators of conspiracy theories. While people with moderate views seem to find the status quo acceptable for now, in recent months movements linked to the so-called alternative right could stand it no longer.

As a result of a number of actions taken by Facebook, Twitter, and others, people with less "mainstream" views decided to try to redefine social media. The boiling point was the presidential election in the United States, toward the end of which every subsequent tweet by Donald Trump was flagged as misleading. To express their objection to the allegedly rigged election, which could not be discussed on Facebook and Twitter, people linked mainly to the alt-right began migrating to Gab and Parler. Moreover, former US President Donald Trump, who was impeached, also announced that he was going to create his own social platform.

At this point in history, we arrive at the most recent events. Back in mid-January, Parler was one of the fastest-growing applications in the App Store and Google Play. That lasted until Silicon Valley decided to speak in unison. Apple and Google banned the Parler mobile application from their application stores, and Amazon Web Services announced it would no longer host the Parler website, which led to the suspension of the service until a new host was found. The companies pointed to the continued presence of user posts inciting violence. As a result, even if it migrates to a new host, Parler will not be able to return to the App Store or Google Play "unless it abandons its identity as a platform whose content policies are as permissive as the First Amendment."

What’s next?

What will future alternatives to Facebook or Twitter, built by the dissatisfied, look like? We can only wait and see. Perhaps Albicla, the Polish alternative to Facebook created out of dissatisfaction with the policies of the largest social networks, is a certain indicator for foreign sites. Right after launch, users began reporting problems: activation e-mails failed to arrive for a long time due to high interest, the user database was left unsecured for several hours (it was possible to link users to their passwords), and it ultimately turned out that Albicla.com's terms of service were a copy of Facebook's. The portal has also gone through its first wave of deleting the accounts of users who failed to meet the arbitrary requirements of the service's administrators. Thus, both small and large players on the social media market handle criticism poorly and prefer centralized control.

Contact the Editor in Chief

Michał Serwiński

+48 698 059 620
michal.serwinski@frsi.org.pl


Sektor 3.0 is an initiative of the Polish American Freedom Foundation. It’s implemented by the Information Society Development Foundation.
