
Read & Understand: Misinformation

This chapter introduces students to the reading, “Misinformation and Biases Infect Social Media, Both Intentionally and Accidentally,” through a vocabulary preview activity, a reading process activity, and a summary and response activity. It may be helpful to print a copy of “Misinformation and Biases Infect Social Media, Both Intentionally and Accidentally” so that you can make notes about vocabulary and annotate the article as you preview and read it.

Vocabulary Preview for “Misinformation and Biases Infect Social Media, Both Intentionally and Accidentally”

Purpose

The purpose of this activity is to build knowledge of vocabulary before reading a text in order to improve fluency and efficiency. You may also wish to practice using the new vocabulary in your writing. The preview is divided into three groups: (1) Academic Vocabulary, (2) Vocabulary with Other Common Meanings, and (3) Collocations and Informal Language.

Academic Vocabulary

The academic vocabulary words are in bold in the article “Misinformation and Biases Infect Social Media, Both Intentionally and Accidentally” and are also listed below. Prior to reading the article, familiarize yourself with the words using a dictionary. Several pieces of information are provided for each word and phrase:

  • The part of speech for the word according to how it is used in the article “Misinformation and Biases Infect Social Media, Both Intentionally and Accidentally”
    • Many words can take multiple parts of speech and have numerous definitions. Knowing a word’s part of speech in the sentence can help you to narrow down to the correct dictionary definition.
  • The sentence where the word is used in the article “Misinformation and Biases Infect Social Media, Both Intentionally and Accidentally”
    • The sentence provides context, which also helps to narrow down to the appropriate definition from the dictionary. The context is the situation in which the word is used.
  • The paragraph number where the word can be found in the reading
    • If you need additional context beyond the sentence, you can refer to the paragraph in the article for more information.

Using the information provided for each word, identify a relevant definition. You may also wish to note definitions and synonyms next to the words in the article to help you while you are reading. A synonym is a word which has a similar meaning to another word.

  1. algorithms (n.): But the fact that low-credibility content spreads so quickly and easily suggests that people and the algorithms behind social media platforms are vulnerable to manipulation. (Paragraph 2)
  2. amplify (v.): Then the bots can amplify false claims smearing opponents by retweeting articles from low-credibility sources that match certain keywords. (Paragraph 21)
  3. biases (n.): Cognitive biases originate in the way the brain processes the information that every person encounters every day. (Paragraph 4)
  4. cognitive (adj.): Cognitive biases originate in the way the brain processes the information that every person encounters every day. (Paragraph 4)
  5. connotations (n.): People are very affected by the emotional connotations of a headline, even though that’s not a good indicator of an article’s accuracy. Much more important is who wrote the piece. (Paragraph 6)
  6. credibility (n.): In the process, they learn to recognize signals of source credibility, such as hyperpartisan claims and emotionally charged headlines. (Paragraph 7)
  7. dense (adj.): When we drilled down on the misinformation-spreading accounts, we found a very dense core group of accounts retweeting each other almost exclusively – including several bots. (Paragraph 12)
  8. devolve (v.): This helps explain why so many online conversations devolve into “us versus them” confrontations. (Paragraph 10)
  9. disseminating (n.): Our analysis of the structure of these partisan communication networks found social networks are particularly efficient at disseminating information – accurate or not – when they are closely tied together and disconnected from other parts of society. (Paragraph 9)
  10. exploit (v.): That is why our Observatory on Social Media at Indiana University is building tools to help people become aware of these biases and protect themselves from outside influences designed to exploit them. (Paragraph 3)
  11. fabricated (adj.): Social media are among the primary sources of news in the U.S. and across the world. Yet users are exposed to content of questionable accuracy, including conspiracy theories, clickbait, hyperpartisan content, pseudo science and even fabricated “fake news” reports. (Paragraph 1)
  12. finite (adj.): The brain can deal with only a finite amount of information, and too many incoming stimuli can cause information overload. (Paragraph 4)
  13. homogeneity (n.): Because this is at the level of a whole platform, not of a single user, we call this the homogeneity bias. (Paragraph 16)
  14. hyperpartisan (adj.): Social media are among the primary sources of news in the U.S. and across the world. Yet users are exposed to content of questionable accuracy, including conspiracy theories, clickbait, hyperpartisan content, pseudo science and even fabricated “fake news” reports. (Paragraph 1)
  15. legitimacy (n.): The only times that fact-checking organizations were ever quoted or mentioned by the users in the misinformed group were when questioning their legitimacy or claiming the opposite of what they wrote. (Paragraph 12)
  16. lucrative (adj.): Spam and online fraud are lucrative for criminals, and government and political propaganda yield both partisan and financial benefits. (Paragraph 2)
  17. manipulation (n.): But in doing so, it may end up reinforcing the cognitive and social biases of users, thus making them even more vulnerable to manipulation. (Paragraph 13)
  18. originate (v.): Cognitive biases originate in the way the brain processes the information that every person encounters every day. (Paragraph 4)
  19. partisan (adj.): Spam and online fraud are lucrative for criminals, and government and political propaganda yield both partisan and financial benefits. (Paragraph 2)
  20. personalization (n.): These personalization technologies are designed to select only the most engaging and relevant content for each individual user. (Paragraph 13)
  21. propaganda (n.): Spam and online fraud are lucrative for criminals, and government and political propaganda yield both partisan and financial benefits. (Paragraph 2)
  22. reinforcing (n.): But in doing so, it may end up reinforcing the cognitive and social biases of users, thus making them even more vulnerable to manipulation. (Paragraph 13)
  23. stimuli (n.): The brain can deal with only a finite amount of information, and too many incoming stimuli can cause information overload. (Paragraph 4)
  24. suspicious (adj.): Players get more points for sharing news from reliable sources and flagging suspicious content for fact-checking. (Paragraph 7)
  25. tendency (n.): The tendency to evaluate information more favorably if it comes from within their own social circles creates “echo chambers” that are ripe for manipulation, either consciously or unintentionally. (Paragraph 10)
  26. vulnerable (adj.): But the fact that low-credibility content spreads so quickly and easily suggests that people and the algorithms behind social media platforms are vulnerable to manipulation. (Paragraph 2)

Vocabulary with Other Common Meanings

Words can have many different meanings in English. Some words that have common, everyday meanings also have specific meanings that are not used as often.

Consider the word factor as an example. In everyday use, factor refers to some element that influences an outcome, as in the following sentence: Students’ time management skills are factors in their academic success. In a mathematics class, however, factor has a less common meaning that relates to multiplication.

The words in this section have less common and often more abstract meanings in the article compared to their meanings in everyday situations.

As with the Academic Vocabulary list above, the vocabulary in this section includes the part of speech, the sentence from the article, and the paragraph number where the word can be found in the article. Using the information provided for each word, identify a relevant definition that fits with the context of how the word is used in the sentence. You may also wish to print a copy of the article and note definitions and synonyms next to the words in the article to help you while you are reading. A synonym is a word which has a similar meaning to another word.

  1. ecosystem (n.): Our research has identified three types of bias that make the social media ecosystem vulnerable to both intentional and accidental misinformation. (Paragraph 3)
  2. feed (n.): One cognitive shortcut happens when a person is deciding whether to share a story that appears on their social media feed. (Paragraph 6)
  3. flagging (n.): Players get more points for sharing news from reliable sources and flagging suspicious content for fact-checking. (Paragraph 7)
  4. smearing (adj.): Then the bots can amplify false claims smearing opponents by retweeting articles from low-credibility sources that match certain keywords. (Paragraph 21)
  5. steep (adj.): We have found that steep competition for users’ limited attention means that some ideas go viral despite their low quality – even when people prefer to share high-quality content. (Paragraph 4)
  6. yield (v.): Spam and online fraud are lucrative for criminals, and government and political propaganda yield both partisan and financial benefits. (Paragraph 2)

Collocations and Informal Language

This section of vocabulary includes collocations and informal language. A collocation is the frequent use of two or more words together, such as save time, which is a common phrase in English. Informal language may include conversational language that is less likely to be used in academic writing, as well as idioms. An idiom is an expression that cannot be defined based on the meanings of the separate words; instead, the combination of words has a different meaning altogether. For example, the idiom to open a can of worms has nothing to do with cans or worms; it means to create an especially challenging problem.

The collocations and informal language in this section include the sentence from the article and the paragraph number where the words can be found in the article. Prior to reading the article, familiarize yourself with the concepts using a dictionary or, if you cannot find an entry in the dictionary, by searching online. Identify a relevant definition for each. You may also wish to note definitions and synonyms next to the words in the article to help you while you are reading. A synonym is a word which has a similar meaning to another word.

  1. bots: When we drilled down on the misinformation-spreading accounts, we found a very dense core group of accounts retweeting each other almost exclusively – including several bots. (Paragraph 12)
  2. clickbait: Social media are among the primary sources of news in the U.S. and across the world. Yet users are exposed to content of questionable accuracy, including conspiracy theories, clickbait, hyperpartisan content, pseudo science and even fabricated “fake news” reports. (Paragraph 1)
  3. confirmation bias: For instance, the detailed advertising tools built into many social media platforms let disinformation campaigners exploit confirmation bias by tailoring messages to people who are already inclined to believe them. (Paragraph 14)
  4. conspiracy theories: Social media are among the primary sources of news in the U.S. and across the world. Yet users are exposed to content of questionable accuracy, including conspiracy theories, clickbait, hyperpartisan content, pseudo science and even fabricated “fake news” reports. (Paragraph 1)
  5. cut off from: Our analysis of the data collected by Hoaxy during the 2016 U.S. presidential elections shows that Twitter accounts that shared misinformation were almost completely cut off from the corrections made by the fact-checkers. (Paragraph 11)
  6. drilled down: When we drilled down on the misinformation-spreading accounts, we found a very dense core group of accounts retweeting each other almost exclusively – including several bots. (Paragraph 12)
  7. echo chambers: The tendency to evaluate information more favorably if it comes from within their own social circles creates “echo chambers” that are ripe for manipulation, either consciously or unintentionally. (Paragraph 10)
  8. emotionally charged: In the process, they learn to recognize signals of source credibility, such as hyperpartisan claims and emotionally charged headlines. (Paragraph 7)
  9. filter bubbles: These bots are able to construct filter bubbles around vulnerable users, feeding them false claims and misinformation. (Paragraph 21)
  10. go viral: We have found that steep competition for users’ limited attention means that some ideas go viral despite their low quality – even when people prefer to share high-quality content. (Paragraph 4)
  11. grassroots: However, some conceal their real nature and are used for malicious intents, such as boosting disinformation or falsely creating the appearance of a grassroots movement, also called “astroturfing.” (Paragraph 18)
  12. in conjunction with: Using Botometer in conjunction with Hoaxy, we analyzed the core of the misinformation network during the 2016 U.S. presidential campaign. (Paragraph 20)
  13. information overload: The brain can deal with only a finite amount of information, and too many incoming stimuli can cause information overload. (Paragraph 4)
  14. irrespective of: This also feeds into existing cognitive bias, reinforcing what appears to be popular irrespective of its quality. (Paragraph 17)
  15. political leanings: In fact, in our research we have found that it is possible to determine the political leanings of a Twitter user by simply looking at the partisan preferences of their friends. (Paragraph 9)
  16. social circles: The tendency to evaluate information more favorably if it comes from within their own social circles creates “echo chambers” that are ripe for manipulation, either consciously or unintentionally. (Paragraph 10)
  17. tailoring to: For instance, the detailed advertising tools built into many social media platforms let disinformation campaigners exploit confirmation bias by tailoring messages to people who are already inclined to believe them. (Paragraph 14)

Reading Process Activity

Purpose

The purpose of this activity is to activate your background knowledge and build your interest before reading an article so that you have a more engaging and efficient reading experience; to actively read the article; and to reflect on your reading process and understanding of the text.

Preview the Article

Print a copy of the article “Misinformation and Biases Infect Social Media, Both Intentionally and Accidentally” so that you can annotate it. Follow the steps below to preview the article. As you complete this activity, do not read the entire article; you will read it in full later, after you have previewed it. Focus on previewing only. As you preview the article, record your thoughts in the margins of the printed copy.

1. Read the title.

  • What does it make you think about?  What do you think the article is about?  What do you already know about the concepts mentioned in the article (misinformation, bias, social media)?  Record your ideas in the top margin of the printed article.
  • What questions do you have based on the title?  Record your questions on the printed article near the title.

2. Read paragraphs 1-3, which form the introduction to the article.  What predictions and questions do you have based on the introduction?  Record your predictions and questions in the margin of the printed article near the introduction.

3. The reading is divided into sections with headings.  Read each bold heading and the first sentence or two of each section.  What predictions and questions do you have based on your preview of each section?  Record your ideas and questions in the margins next to each section of the article.  You should note the following headings in the article:

  • Bias in the brain
  • Bias in society
  • Bias in the machine
  • Understanding complex vulnerabilities

4. Based on your preview of the article, what do you think is the central point of the article?  (Don’t worry if you are not sure.  This is just a prediction or guess – you do not have to be correct. You can confirm or adjust your predictions as you read.)

Actively Read and Annotate the Article

After previewing “Misinformation and Biases Infect Social Media, Both Intentionally and Accidentally,” actively read the article.  As you read the article, do the following:

  • Consider whether or not your predictions were correct.
  • Use the preview questions you wrote to guide your reading and answer them (if the answers are in the text).  You can record your responses directly on the article by annotating the text or by taking notes on a separate sheet of paper.
  • Paraphrase main points briefly in the margins.
  • Mark unfamiliar vocabulary.

Misinformation and Biases Infect Social Media, Both Intentionally and Accidentally

Giovanni Luca Ciampaglia is an Assistant Professor, Department of Computer Science and Engineering, University of South Florida. Filippo Menczer is a Professor of Computer Science and Informatics and the Director of the Center for Complex Networks and Systems Research at Indiana University. This article originally appeared in The Conversation.

As you read, take notes related to the questions you wrote and the predictions you made when you previewed the article.

1 Social media are among the primary sources of news in the U.S. and across the world. Yet users are exposed to content of questionable accuracy, including conspiracy theories, clickbait, hyperpartisan content, pseudo science and even fabricated “fake news” reports.

2 It’s not surprising that there’s so much disinformation published: Spam and online fraud are lucrative for criminals, and government and political propaganda yield both partisan and financial benefits. But the fact that low-credibility content spreads so quickly and easily suggests that people and the algorithms behind social media platforms are vulnerable to manipulation.

3 Our research has identified three types of bias that make the social media ecosystem vulnerable to both intentional and accidental misinformation. That is why our Observatory on Social Media at Indiana University is building tools to help people become aware of these biases and protect themselves from outside influences designed to exploit them.

Bias in the brain

4 Cognitive biases originate in the way the brain processes the information that every person encounters every day. The brain can deal with only a finite amount of information, and too many incoming stimuli can cause information overload. That in itself has serious implications for the quality of information on social media. We have found that steep competition for users’ limited attention means that some ideas go viral despite their low quality – even when people prefer to share high-quality content.

5 To avoid getting overwhelmed, the brain uses a number of tricks. These methods are usually effective, but may also become biases when applied in the wrong contexts.

6 One cognitive shortcut happens when a person is deciding whether to share a story that appears on their social media feed. People are very affected by the emotional connotations of a headline, even though that’s not a good indicator of an article’s accuracy. Much more important is who wrote the piece.

7 To counter this bias, and help people pay more attention to the source of a claim before sharing it, we developed Fakey, a mobile news literacy game (free on Android and iOS) simulating a typical social media news feed, with a mix of news articles from mainstream and low-credibility sources. Players get more points for sharing news from reliable sources and flagging suspicious content for fact-checking. In the process, they learn to recognize signals of source credibility, such as hyperpartisan claims and emotionally charged headlines.

Bias in society

8 Another source of bias comes from society. When people connect directly with their peers, the social biases that guide their selection of friends come to influence the information they see.

9 In fact, in our research we have found that it is possible to determine the political leanings of a Twitter user by simply looking at the partisan preferences of their friends. Our analysis of the structure of these partisan communication networks found social networks are particularly efficient at disseminating information – accurate or not – when they are closely tied together and disconnected from other parts of society.

10 The tendency to evaluate information more favorably if it comes from within their own social circles creates “echo chambers” that are ripe for manipulation, either consciously or unintentionally. This helps explain why so many online conversations devolve into “us versus them” confrontations.

11 To study how the structure of online social networks makes users vulnerable to disinformation, we built Hoaxy, a system that tracks and visualizes the spread of content from low-credibility sources, and how it competes with fact-checking content. Our analysis of the data collected by Hoaxy during the 2016 U.S. presidential elections shows that Twitter accounts that shared misinformation were almost completely cut off from the corrections made by the fact-checkers.

12 When we drilled down on the misinformation-spreading accounts, we found a very dense core group of accounts retweeting each other almost exclusively – including several bots. The only times that fact-checking organizations were ever quoted or mentioned by the users in the misinformed group were when questioning their legitimacy or claiming the opposite of what they wrote.

Bias in the machine

13 The third group of biases arises directly from the algorithms used to determine what people see online. Both social media platforms and search engines employ them. These personalization technologies are designed to select only the most engaging and relevant content for each individual user. But in doing so, it may end up reinforcing the cognitive and social biases of users, thus making them even more vulnerable to manipulation.

14 For instance, the detailed advertising tools built into many social media platforms let disinformation campaigners exploit confirmation bias by tailoring messages to people who are already inclined to believe them.

15 Also, if a user often clicks on Facebook links from a particular news source, Facebook will tend to show that person more of that site’s content. This so-called “filter bubble” effect may isolate people from diverse perspectives, strengthening confirmation bias.

16 Our own research shows that social media platforms expose users to a less diverse set of sources than do non-social media sites like Wikipedia. Because this is at the level of a whole platform, not of a single user, we call this the homogeneity bias.

17 Another important ingredient of social media is information that is trending on the platform, according to what is getting the most clicks. We call this popularity bias, because we have found that an algorithm designed to promote popular content may negatively affect the overall quality of information on the platform. This also feeds into existing cognitive bias, reinforcing what appears to be popular irrespective of its quality.

18 All these algorithmic biases can be manipulated by social bots, computer programs that interact with humans through social media accounts. Most social bots, like Twitter’s Big Ben, are harmless. However, some conceal their real nature and are used for malicious intents, such as boosting disinformation or falsely creating the appearance of a grassroots movement, also called “astroturfing.” We found evidence of this type of manipulation in the run-up to the 2010 U.S. midterm election.

19 To study these manipulation strategies, we developed a tool to detect social bots called Botometer. Botometer uses machine learning to detect bot accounts, by inspecting thousands of different features of Twitter accounts, like the times of its posts, how often it tweets, and the accounts it follows and retweets. It is not perfect, but it has revealed that as many as 15 percent of Twitter accounts show signs of being bots.

20 Using Botometer in conjunction with Hoaxy, we analyzed the core of the misinformation network during the 2016 U.S. presidential campaign. We found many bots exploiting both the cognitive, confirmation and popularity biases of their victims and Twitter’s algorithmic biases.

21 These bots are able to construct filter bubbles around vulnerable users, feeding them false claims and misinformation. First, they can attract the attention of human users who support a particular candidate by tweeting that candidate’s hashtags or by mentioning and retweeting the person. Then the bots can amplify false claims smearing opponents by retweeting articles from low-credibility sources that match certain keywords. This activity also makes the algorithm highlight for other users false stories that are being shared widely.

Understanding complex vulnerabilities

22 Even as our research, and others’, shows how individuals, institutions and even entire societies can be manipulated on social media, there are many questions left to answer. It’s especially important to discover how these different biases interact with each other, potentially creating more complex vulnerabilities.

23 Tools like ours offer internet users more information about disinformation, and therefore some degree of protection from its harms. The solutions will not likely be only technological, though there will probably be some technical aspects to them. But they must take into account the cognitive and social aspects of the problem.

Reflect after Reading the Article

Record your responses to the questions below in complete sentences.

  1. Now that you have read the article, what is the main point?  Write it in your own words.
  2. Why do you think Ciampaglia and Menczer wrote the article?  (What was their purpose?)
  3. Were your predictions about the article correct? Which ones were accurate, and which ones did you revise as you read the article?
  4. As you previewed the article, you wrote questions.  What questions do you still have after reading the article?  What else do you want to know about the article, the author, or topic of the reading?
  5. How did previewing the article help with your understanding of the text?

Reading & Response

Instructions:

  1. Read the article, “Misinformation and Biases Infect Social Media, Both Intentionally and Accidentally” by Giovanni Luca Ciampaglia and Filippo Menczer. As you read, annotate the article.  Take notes about the main idea, your reactions, and questions that you may have.
  2. After reading, complete a one-paragraph summary of the article. The summary should include the authors’ names, the article title, and the overall main idea. Additionally, it is helpful to focus on the who, what, where, why, when, and how of the article to develop your summary. The ideas should be paraphrased and written in your own words.
  3. Write a developed, one-paragraph response to the article. Develop a clear statement of your position or point of view on the ideas expressed in the article. Be sure to clearly explain and support your response. You may also consider using a particular quote from the article in your response. If using a quote, work to incorporate the quote smoothly into the response. Be sure to cite the quote using in-text citations.

As an example:

Ciampaglia and Menczer mention, “Another source of bias comes from society. When people connect directly with their peers, the social biases that guide their selection of friends come to influence the information they see.” I have witnessed this in my personal experience.

From there, expand on your ideas to explain and support why you agree with this statement.

Suggestions for Writing

  1. Plan your summary and response before writing them. Review the notes that you have made regarding the article. Then, use a writing process that you are comfortable with that can include brainstorming, free writing, listing, outlining, mapping, pre-thinking, pre-writing, etc.
  2. Aim to use conventional grammar and sentence structure and to make the tone of your essay professional, not casual.
  3. Consider the use of research in the authors’ essay. Does it seem appropriate and/or valuable?
  4. Edit your work before submitting it.
