Public Dialog in a Networked Age

04.03.2020
The social network Facebook has launched a special mechanism to combat fake news, prompted by its recent spread and its negative effect on public opinion, as announced in the company’s official statement. One reason for this decision was the heavy criticism Facebook faced after Donald Trump’s victory in the US presidential election. According to many users, this result was a consequence of the large amount of fake web news disseminated by opponents of Hillary Clinton. During the presidential campaign, messages repeatedly appeared claiming that Internet search results and election-related news were being filtered in favor of one candidate. The company’s statement came after Barack Obama spoke out against the fake news appearing on major online platforms such as Facebook. Mark Zuckerberg rejected the idea that fake news on Facebook had influenced the US election and argued against the conclusion that people had voted on the basis of false information in social media (Frenkel). However, despite initially underestimating the impact of fake news on Facebook, Zuckerberg unveiled a plan to address the growing problem: Facebook will increase its efforts to combat fake information on the network. Under this plan, users who notice fake news in their newsfeed will be able to click on the upper right-hand corner of the post and select a mark indicating that the message distorts the truth. Facebook will also send the information to organizations involved in fact-checking. News that is doubtful will be labeled as questionable and will not appear in the top positions of users’ newsfeeds. Facebook representatives emphasize that it is important that the information users see on the social network is real and meaningful. The head of the social network said that independent organizations would be engaged to check the data.
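The report-and-downrank workflow described above can be sketched in a few lines. This is only a hypothetical illustration under stated assumptions, not Facebook’s actual implementation: the report threshold, the score penalty, and all class and function names are invented for the sketch.

```python
from dataclasses import dataclass

@dataclass
class Post:
    post_id: int
    text: str
    flags: int = 0          # user "this distorts the truth" reports
    disputed: bool = False  # set after a third-party fact-check
    base_score: float = 1.0 # baseline newsfeed ranking score

FLAG_THRESHOLD = 3  # assumption: reports needed before external review

def flag_post(post: Post) -> bool:
    """A user marks the post as distorting the truth. Returns True once
    enough reports accumulate to send the post to fact-checkers."""
    post.flags += 1
    return post.flags >= FLAG_THRESHOLD

def apply_fact_check(post: Post, verdict_false: bool) -> None:
    """An independent fact-checking organization returns a verdict;
    disputed posts are labeled and demoted rather than deleted."""
    if verdict_false:
        post.disputed = True

def ranking_score(post: Post) -> float:
    """Disputed posts are pushed out of the top feed positions."""
    return post.base_score * (0.2 if post.disputed else 1.0)
```

The key design point matches the announcement: flagged-and-confirmed posts keep circulating but lose feed prominence, so the penalty multiplier rather than outright removal does the work.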
The media coverage of the issue of fake news on Facebook cannot be considered comprehensive, as it does not give readers an understanding of related concepts such as the lock-in phenomenon, peer production, and the filter bubble. Therefore, this paper aims to examine such aspects of fake news in online networks as the lock-in phenomenon, peer production, and filter bubbles, and to consider how the related problems might be solved.

Network Age: The Relevant Details of Event and How the News Stories Are Covering It

All the news articles under consideration state that Mark Zuckerberg has announced his decision to fight fictitious news on Facebook. Their main provisions involve the following points. First, there is the claim that more than 99% of the content on Facebook is authentic (“Facebook Staff Mount Secret Push to Tackle Fake News, Reports Say”). Second, Zuckerberg denies that fake news on Facebook influenced the outcome of the US election (Peck). Third, Facebook is developing a service to fight fake web news (Frenkel). These three positions are rather contradictory: in order to deal with a supposedly minimal amount of false information, the company is introducing its best technical system for detecting fake news before users mark it. The company has also asked independent organizations to verify facts, which will facilitate the reporting of false news, and will examine posts that have been marked when people read or share them.

Analyzing the news reports on the event, one can see that fake news is taken very seriously by the social network, and discussion of the issue within Facebook has lasted for several months (“Facebook Staff Mount Secret Push to Tackle Fake News, Reports Say”). Facebook is working with the news industry to refine its fact-checking system. At the same time, no schedule showing when these projects will be put into action has been presented; the only information given is that some of these ideas will work well and some will not. A Facebook group is trying to stop fake news, but it is still unclear whether its work will become an official part of the decision. While the Facebook founder denies the influence of fake web news on the outcome of the vote, there is still no official evidence that the absence of filtering algorithms for fake news could not have affected the election results. While Facebook users are urged to mark fake news, guidelines for the removal of such material are not fully defined. The reader gets the facts, but not a solution. The aim of the publications seems to be to assure people that the content they find in the social network is the most important and most accurate news. An intention to strengthen the fight against the emergence and spread of fake news is the main message of the Facebook founder.

Explanation of the Issues at Stake

Filter Bubble

Facebook is constantly the target of criticism because of its algorithmic feed and the circulation of fake news. Modern information systems accumulate vast amounts of data about each user, analyze and structure it, and ultimately decide what information to display on the screen. On Facebook, posts on topics that interest each user appear first. Facebook remembers what users click on and next time shows them only the news that the social network predicts will interest them. The combination of these services, the personalization algorithms issuing individualized information, and the filters creates the so-called “filter bubble” (Pariser). The idea is that filtering and personalization rob people of new notions and important information and create the illusion that their own interests are, in fact, all that is significant in the world (Pariser). Users do not control what information they receive; at the same time, it is also people themselves who create filter bubbles.
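The narrowing mechanism described here can be shown with a minimal sketch, assuming a toy feed where past clicks on a topic drive the ranking. The function name, post structure, and feed size are invented for illustration; real feed ranking uses far more signals.

```python
from collections import Counter

def personalize(posts, click_history, feed_size=5):
    """Rank posts by how often the user clicked the same topic before.
    Topics the user never engages with sink to the bottom, which is
    the mechanism behind the so-called filter bubble."""
    clicks = Counter(click_history)  # topic -> number of past clicks
    ranked = sorted(posts, key=lambda p: clicks[p["topic"]], reverse=True)
    return ranked[:feed_size]

# A user who only ever clicks one topic ends up seeing mostly that topic:
feed = personalize(
    posts=[{"id": i, "topic": t}
           for i, t in enumerate(["politics"] * 4 + ["science"] * 4)],
    click_history=["politics"] * 10,
    feed_size=5,
)
```

Even though half the available posts are about science, the engagement-driven ranking fills almost the entire feed with politics, illustrating how the bubble forms without any deliberate censorship.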

The effects of the filter bubble lie primarily in the social context. Intellectually isolated in their information field, users receive less information that contradicts their own points of view (Pariser). This brings potential harm not only to the individual but to society as a whole: the filter bubble can undermine the very formation of civil opinion, making people more vulnerable to propaganda and manipulation and, globally, slowing down the democratic process worldwide (Pariser).


As an example of the negative effect of the filter bubble, consider the distorted picture of the recent presidential election. Because of algorithmic news feeds, Clinton followers saw only posts by Clinton supporters, while admirers of Donald Trump saw only those in support of their candidate. Zuckerberg countered that, compared with the information in newspapers or on television 20 years ago, Facebook actually tries to provide users with different perspectives; people simply do not read materials and opinions that differ from their own judgment (Seetharaman). People ignore much of what does not suit their mindset. This is supported by a recent study, according to which liberals and conservatives with algorithmic newsfeed settings enabled see only 1% fewer posts from the opposite point of view than without these settings (Seetharaman). However, this finding can be considered contradictory: in May 2016, an interactive chart was created showing how Facebook feeds change according to the political preferences of the user, and it turned out that supporters of different parties are offered completely different news (“Blue Feed, Red Feed”).

Fake news circulates on Facebook because human psychology makes people more prone to accept information that is in line with their outlook on life. If people cannot tell what is true and what is not, and cannot distinguish convincing arguments from propaganda, a problem arises: they take misinformation seriously. Unfortunately, the news feed is optimized for engagement, and lies are extremely attractive, so building in a bias toward truth is a nearly impossible task. It has now become clear that democracy suffers when the news environment encourages lies; this damages the whole community, and Facebook encourages and aggravates the process.

Peer Production

At first, Mark Zuckerberg denied the possibility that fake news on Facebook had influenced the election results, but eventually he announced a plan to combat fake information. Within a few hours of each other, Facebook and Google issued statements describing plans to deny advertising revenue to sites involved in the proliferation of fake web news; thus, the companies will no longer share revenue with sites that distribute fake news (Ghoshal). There are countless fake news sites, and their number increases every day. They can be divided into three groups. The first consists of completely bogus news sites that simply publish false information for fun or advertising revenue. The second group consists of sites affiliated with political organizations that provide the reader with only positive information about one side and negative information about the other party, represented by the opponent. The third and most dangerous group includes hybrid sites that publish fiction within a flow of real news copied from other sources. These portals create a web of misinformation: the news scatters across the Internet, and even when a reader tries to check the accuracy of the information, search engines produce dozens of sources, each of which pretends to be a real news site. Some entirely false stories appeared in Facebook news feeds as well.

Much of this fake news was written by several young people from Macedonia who, apparently, were not associated with right-wing forces in the United States or with alleged members of the Russian special services. Using domain name records and online searches, over 100 active websites on American politics, centered in Veles, were identified (Silverman and Alexander). The Facebook pages of the largest of them can count hundreds of thousands of subscribers. The teenagers created sites about American politics and earned millions of dollars. In the course of the race, a few of them launched fake news websites that actively criticized and ridiculed Trump’s rival. Stories from these sources spread widely through social networks, and their authors thus earned money from advertising (Silverman and Alexander).

The motives for creating fake web news were purely financial. These sites demonstrated the economic benefits of producing misinformation, especially for rich advertising markets, in particular Facebook, the largest social network in the world, and the largest online advertising systems, such as Google AdSense. The creators of fakes usually show a completely indifferent attitude toward politics; for them, it is just a tool to make money. When a political race is over, the news is no longer valuable, and collecting masses of clicks becomes expensive. However, this precedent has opened the way for a huge amount of fake web news, and the long-term risk is that, in the future, politicians may encourage the circulation of fabricated news. The upcoming elections in France and Germany open up new horizons for such sites: the model can be implemented anywhere, since fake news can be monetized, and many will want to repeat the success of the peer production from Macedonia.

Information the Public Really Needs

This issue shows how important correct and comprehensive coverage of the problem in the press is. Fake web news can distort the views of the community. After Trump’s victory, questions arose about Facebook’s responsibility for the dissemination of false information and its failure to fight the exchange of fake web news. A recent study by BuzzFeed News showed that nearly half of adult Americans rely on Facebook as their main news source (Frenkel). The three main center-left pages published false or misleading information in almost 20% of cases, while the three main right-wing pages on Facebook did so in 38% of cases throughout the campaign (Frenkel). Users should firmly understand that there is no special Facebook news department engaged in the production of content. Facebook is a general media company and, as a consequence, is not responsible for the published content; it is a tool for those who want to use it to express their preferences.

Thus, it is the duty of every user to check sensational information before sharing it, and the duty of public media platforms is obviously to eliminate fake stories. The dissemination of false news creates a problem because the news is transferred from reality to the sphere of the senses: the information causes an emotional response in users, which is an ideal opportunity for candidates who play with the facts. The people running these sites come up with stories (and especially titles) that play on the emotions of readers and their existing prejudices. The spread of false news is compounded by the fact that some users, as well as those around them, may read only websites of this kind.


Lock-in

The concept of lock-in also has an impact on the problem of spreading fake news on Facebook. The issue is that computers are now an indispensable part of human life, so designers should be extremely careful when creating the design of any media platform, as seemingly inconsequential design decisions have direct consequences. Lock-in removes design options on the basis of what is easier to develop and program, what looks better, what is more fashionable, and what is politically feasible or simply accidental (Lanier). The main problem is that lock-in eliminates ideas that do not fit perfectly into the winning digital representation scheme; it strips away all the shades, leaving ideas only black or white. This leads to a distortion of the truth, which in turn shapes public opinion on the basis of that distortion (Lanier). However, it should be understood that further lock-in is guaranteed unless an entirely new way of developing software appears. Thus, the modern user should be very careful in the decision-making process.

To sum up, Facebook is a relatively new tool for shaping social opinion. Coverage of the problem through Facebook is limited by two partially overlapping issues: on the one hand, social media makes it very easy to limit information by creating a circle of friends who share common views; on the other hand, there is a mercantile interest in users’ clicks that drives the circulation of fake news articles. Perhaps Facebook will now be able to adjust its algorithms so as to reduce the number of references to sources that are not credible. The company should understand its responsibility for publishing real information and the powerful impact it has on people’s minds. Some measures have already been taken: hundreds of Facebook employees around the world are engaged in monitoring the content published by users.
