Argument: Why Do People Fall for Fake News?

28 Grammar Focus: Fake News

This chapter focuses on the following grammar components found in the article “Why Do People Fall for Fake News?”

  • Using Independent Clauses to Help Determine Meaning
  • Analyzing Text for Verb Choice
  • Using Noun Clauses to State Position
  • Passive Voice & Modal Verbs
  • Hedging & Subject-Verb Agreement

Answer keys for each of the grammar activities are found in the answer key chapter.


Using Independent Clauses to Help Determine Meaning

Finding Independent Clauses to Help Determine Meaning

(*For more information about marker words and connector words, see the Sentence Structure Glossary and the “Every Sentence is a Tree” video)

 

Sentences and paragraphs in academic articles can be packed with a great deal of information, and the sentences themselves are often long and confusing. When you approach a difficult reading, it can be helpful to identify the independent clauses. An independent clause contains the main meaning of a sentence; the other parts connected to it give more information about it. The marker and connector words that connect these other parts indicate how each extra part relates to the meaning of the independent clause.

 

Consider this sentence: People are very affected by the emotional connotations of a headline, even though that’s not a good indicator of an article’s accuracy.

The independent clause in this sentence is People are very affected by the emotional connotations of a headline. The focus of the sentence is the emotional response that headlines cause.

EVEN THOUGH is an adverb clause (subordinate clause) marker word, which means the second half of the sentence is a dependent clause. The meaning of the dependent clause is that headlines do not always accurately reflect the content of the article.

EVEN THOUGH indicates a contrast.

 

We could restate this sentence as: Emotional headlines create intense feelings in people, but the facts of the actual article may or may not be true.

 

Reread the following paragraphs from the article and find subjects and verbs. Identify the independent clauses. Every sentence must have at least one independent clause. Next, find the dependent clauses, connector words, and phrases.

How do these extra parts relate to the meaning of the independent clause?

 

  1. These questions have become more urgent in recent years, not least because of revelations about the Russian campaign to influence the 2016 United States presidential election by disseminating propaganda through social media platforms.

 

2. One group claims that our ability to reason is hijacked by our partisan convictions: that is, we’re prone to rationalization.

 

3. Once we understand how much of the problem is a result of rationalization and how much a result of laziness, and as we learn more about which factor plays a role in what types of situations, we’ll be better able to design policy solutions to help combat the problem.

 

4. The rationalization camp, which has gained considerable prominence in recent years, is built around a set of theories contending that when it comes to politically charged issues, people use their intellectual abilities to persuade themselves to believe what they want to be true rather than attempting to actually discover the truth.

 

5. According to this view, political passions essentially make people unreasonable, even — indeed, especially — if they tend to be good at reasoning in other contexts. (Roughly: The smarter you are, the better you are at rationalizing.)

6. Some of the most striking evidence used to support this position comes from an influential 2012 study in which the law professor Dan Kahan and his colleagues found that the degree of political polarization on the issue of climate change was greater among people who scored higher on measures of science literacy and numerical ability than it was among those who scored lower on these tests.

 

7. Apparently, more “analytical” Democrats were better able to convince themselves that climate change was a problem, while more “analytical” Republicans were better able to convince themselves that climate change was not a problem.

 

8. Further evidence cited in support of this argument comes from a 2010 study by the political scientists Brendan Nyhan and Jason Reifler, who found that appending corrections to misleading claims in news articles can sometimes backfire: Not only did corrections fail to reduce misperceptions, but they also sometimes increased them.

 

9. For example, people who think more analytically (those who are more likely to exercise their analytic skills and not just trust their “gut” response) are less superstitious, less likely to believe in conspiracy theories and less receptive to seemingly profound but actually empty assertions (like “Wholeness quiets infinite phenomena”).

 

10. This body of evidence suggests that the main factor explaining the acceptance of fake news could be cognitive laziness, especially in the context of social media, where news items are often skimmed or merely glanced at.


Analyzing Text for Verb Choice


After reviewing the uses of present perfect and simple past, reread the following paragraphs from the article.

  • Underline present perfect and simple past verbs you see.
  • Why did the authors choose to use present perfect in some cases and simple past in others?
  • Notice the present tenses as well. When do the authors use simple present? When do they use present continuous? Why?

 

  1. What makes people susceptible to fake news and other forms of strategic misinformation? And what, if anything, can be done about it? These questions have become more urgent in recent years, not least because of revelations about the Russian campaign to influence the 2016 United States presidential election by disseminating propaganda through social media platforms. The rationalization camp, which has gained considerable prominence in recent years, is built around a set of theories contending that when it comes to politically charged issues, people use their intellectual abilities to persuade themselves to believe what they want to be true rather than attempting to actually discover the truth.

 

2. Some of the most striking evidence used to support this position comes from an influential 2012 study in which the law professor Dan Kahan and his colleagues found that the degree of political polarization on the issue of climate change was greater among people who scored higher on measures of science literacy and numerical ability than it was among those who scored lower on these tests. Apparently, more “analytical” Democrats were better able to convince themselves that climate change was a problem, while more “analytical” Republicans were better able to convince themselves that climate change was not a problem. Professor Kahan has found similar results in, for example, studies about gun control in which he experimentally manipulated the partisan slant of information that participants were asked to assess.

 

3. We found that people who engaged in more reflective reasoning were better at telling true from false, regardless of whether the headlines aligned with their political views. (We controlled for demographic facts such as level of education as well as political leaning.) In follow-up studies yet to be published, we have shown that this finding was replicated using a pool of participants that was nationally representative with respect to age, gender, ethnicity and region of residence, and that it applies not just to the ability to discern true claims from false ones but also to the ability to identify excessively partisan coverage of true events.

 

4. Our results strongly suggest that somehow cultivating or promoting our reasoning abilities should be part of the solution to the kinds of partisan misinformation that circulate on social media. And other new research provides evidence that even in highly political contexts, people are not as irrational as the rationalization camp contends. Recent studies have shown, for instance, that correcting partisan misperceptions does not backfire most of the time — contrary to the results of Professors Nyhan and Reifler described above — but instead leads to more accurate beliefs.


Using Noun Clauses to State Position


(*For more detailed information about how noun clauses work, check the Sentence Structure Glossary)

Noun clauses are often used in academic writing to state positions and make claims about the topics being discussed. These statements can:

  • Present a claim or belief
    Example: One group claims that our ability to reason is hijacked by our partisan convictions: that is, we’re prone to rationalization.
  • Show support or agreement
    Example: The good news is that psychologists and other social scientists are working hard to understand what prevents people from seeing through propaganda.
  • Show opposition or disagreement
    Example: The bad news is that there is not yet a consensus on the answer.
  • Present evidence as support
    Example: Some of the most striking evidence used to support this position comes from an influential 2012 study in which the law professor Dan Kahan and his colleagues found that the degree of political polarization on the issue of climate change was greater among people who scored higher on measures of science literacy and numerical ability than it was among those who scored lower on these tests.

Read the following paragraphs from the article. Find the noun clauses. How are they being used in each case? What verbs do they follow?

  • Be careful – THAT does not always mark a noun clause. Remember that a noun clause occurs after a verb. Also notice that there are no commas used with this type of dependent clause.
  1. The rationalization camp, which has gained considerable prominence in recent years, is built around a set of theories contending that when it comes to politically charged issues, people use their intellectual abilities to persuade themselves to believe what they want to be true rather than attempting to actually discover the truth. According to this view, political passions essentially make people unreasonable, even — indeed, especially — if they tend to be good at reasoning in other contexts. (Roughly: The smarter you are, the better you are at rationalizing.)

 

2. Apparently, more “analytical” Democrats were better able to convince themselves that climate change was a problem, while more “analytical” Republicans were better able to convince themselves that climate change was not a problem. Professor Kahan has found similar results in, for example, studies about gun control in which he experimentally manipulated the partisan slant of information that participants were asked to assess.

 

3. Further evidence cited in support of this argument comes from a 2010 study by the political scientists Brendan Nyhan and Jason Reifler, who found that appending corrections to misleading claims in news articles can sometimes backfire: Not only did corrections fail to reduce misperceptions, but they also sometimes increased them…We believe that people often just don’t think critically enough about the information they encounter.

4. A great deal of research in cognitive psychology has shown that a little bit of reasoning goes a long way toward forming accurate beliefs. For example, people who think more analytically (those who are more likely to exercise their analytic skills and not just trust their “gut” response) are less superstitious, less likely to believe in conspiracy theories and less receptive to seemingly profound but actually empty assertions (like “Wholeness quiets infinite phenomena”). This body of evidence suggests that the main factor explaining the acceptance of fake news could be cognitive laziness, especially in the context of social media, where news items are often skimmed or merely glanced at.

 

5. We found that people who engaged in more reflective reasoning were better at telling true from false, regardless of whether the headlines aligned with their political views. (We controlled for demographic facts such as level of education as well as political leaning.) In follow-up studies yet to be published, we have shown that this finding was replicated using a pool of participants that was nationally representative with respect to age, gender, ethnicity and region of residence, and that it applies not just to the ability to discern true claims from false ones but also to the ability to identify excessively partisan coverage of true events.

 

6. Our results strongly suggest that somehow cultivating or promoting our reasoning abilities should be part of the solution to the kinds of partisan misinformation that circulate on social media. And other new research provides evidence that even in highly political contexts, people are not as irrational as the rationalization camp contends. Recent studies have shown, for instance, that correcting partisan misperceptions does not backfire most of the time — contrary to the results of Professors Nyhan and Reifler described above — but instead leads to more accurate beliefs.

 

7. We are not arguing that findings such as Professor Kahan’s that support the rationalization theory are unreliable. Our argument is that cases in which our reasoning goes awry — which are surprising and attention-grabbing — seem to be exceptions rather than the rule. Reason is not always, or even typically, held captive by our partisan biases. In many and perhaps most cases, it seems, reason does promote the formation of accurate beliefs.

 

8. This is not just an academic debate; it has real implications for public policy. Our research suggests that the solution to politically charged misinformation should involve devoting resources to the spread of accurate information and to training or encouraging people to think more critically. You aren’t doomed to be unreasonable, even in highly politicized times. Just remember that this is also true of people you disagree with.


The Language of Hedging


Instructions: Review “The Language of Hedging” in the Supplemental Grammar Information section. Then read the following article, “Why Do People Fall for Fake News?” and highlight all hedging expressions that you can find.

1 What makes people susceptible to fake news and other forms of strategic misinformation? And what, if anything, can be done about it?

2 These questions have become more urgent in recent years, not least because of revelations about the Russian campaign to influence the 2016 United States presidential election by disseminating propaganda through social media platforms. In general, our political culture seems to be increasingly populated by people who espouse outlandish or demonstrably false claims that often align with their political ideology.

3 The good news is that psychologists and other social scientists are working hard to understand what prevents people from seeing through propaganda. The bad news is that there is not yet a consensus on the answer. Much of the debate among researchers falls into two opposing camps. One group claims that our ability to reason is hijacked by our partisan convictions: that is, we’re prone to rationalization. The other group — to which the two of us belong — claims that the problem is that we often fail to exercise our critical faculties: that is, we’re mentally lazy.

4 However, recent research suggests a silver lining to the dispute: Both camps appear to be capturing an aspect of the problem. Once we understand how much of the problem is a result of rationalization and how much a result of laziness, and as we learn more about which factor plays a role in what types of situations, we’ll be better able to design policy solutions to help combat the problem.

5 The rationalization camp, which has gained considerable prominence in recent years, is built around a set of theories contending that when it comes to politically charged issues, people use their intellectual abilities to persuade themselves to believe what they want to be true rather than attempting to actually discover the truth. According to this view, political passions essentially make people unreasonable, even — indeed, especially — if they tend to be good at reasoning in other contexts. (Roughly: The smarter you are, the better you are at rationalizing.)

6 Some of the most striking evidence used to support this position comes from an influential 2012 study in which the law professor Dan Kahan and his colleagues found that the degree of political polarization on the issue of climate change was greater among people who scored higher on measures of science literacy and numerical ability than it was among those who scored lower on these tests. Apparently, more “analytical” Democrats were better able to convince themselves that climate change was a problem, while more “analytical” Republicans were better able to convince themselves that climate change was not a problem. Professor Kahan has found similar results in, for example, studies about gun control in which he experimentally manipulated the partisan slant of information that participants were asked to assess.

7 The implications here are profound: Reasoning can exacerbate the problem, not provide the solution, when it comes to partisan disputes over facts. Further evidence cited in support of this argument comes from a 2010 study by the political scientists Brendan Nyhan and Jason Reifler, who found that appending corrections to misleading claims in news articles can sometimes backfire: Not only did corrections fail to reduce misperceptions, but they also sometimes increased them. It seemed as if people who were ideologically inclined to believe a given falsehood worked so hard to come up with reasons that the correction was wrong that they came to believe the falsehood even more strongly.

8 But this “rationalization” account, though compelling in some contexts, does not strike us as the most natural or most common explanation of the human weakness for misinformation. We believe that people often just don’t think critically enough about the information they encounter.

9 A great deal of research in cognitive psychology has shown that a little bit of reasoning goes a long way toward forming accurate beliefs. For example, people who think more analytically (those who are more likely to exercise their analytic skills and not just trust their “gut” response) are less superstitious, less likely to believe in conspiracy theories and less receptive to seemingly profound but actually empty assertions (like “Wholeness quiets infinite phenomena”). This body of evidence suggests that the main factor explaining the acceptance of fake news could be cognitive laziness, especially in the context of social media, where news items are often skimmed or merely glanced at.

10 To test this possibility, we recently ran a set of studies in which participants of various political persuasions indicated whether they believed a series of news stories. We showed them real headlines taken from social media, some of which were true and some of which were false. We gauged whether our participants would engage in reasoning or “go with their gut” by having them complete something called the cognitive reflection test, a test widely used in psychology and behavioral economics. It consists of questions with intuitively compelling but incorrect answers, which can be easily shown to be wrong with a modicum of reasoning. (For example: “If you’re running a race and you pass the person in second place, what place are you in?” If you’re not thinking you might say “first place,” when of course the answer is second place.)

11 We found that people who engaged in more reflective reasoning were better at telling true from false, regardless of whether the headlines aligned with their political views. (We controlled for demographic facts such as level of education as well as political leaning.) In follow-up studies yet to be published, we have shown that this finding was replicated using a pool of participants that was nationally representative with respect to age, gender, ethnicity and region of residence, and that it applies not just to the ability to discern true claims from false ones but also to the ability to identify excessively partisan coverage of true events.

12 Our results strongly suggest that somehow cultivating or promoting our reasoning abilities should be part of the solution to the kinds of partisan misinformation that circulate on social media. And other new research provides evidence that even in highly political contexts, people are not as irrational as the rationalization camp contends. Recent studies have shown, for instance, that correcting partisan misperceptions does not backfire most of the time — contrary to the results of Professors Nyhan and Reifler described above — but instead leads to more accurate beliefs.

13 We are not arguing that findings such as Professor Kahan’s that support the rationalization theory are unreliable. Our argument is that cases in which our reasoning goes awry — which are surprising and attention-grabbing — seem to be exceptions rather than the rule. Reason is not always, or even typically, held captive by our partisan biases. In many and perhaps most cases, it seems, reason does promote the formation of accurate beliefs.

14 This is not just an academic debate; it has real implications for public policy. Our research suggests that the solution to politically charged misinformation should involve devoting resources to the spread of accurate information and to training or encouraging people to think more critically. You aren’t doomed to be unreasonable, even in highly politicized times. Just remember that this is also true of people you disagree with.

Question: In Paragraphs 10 and 11, hardly any hedging language is used. Why do you think this is? To answer the question, think about the purpose of hedging.  


Subject-Verb Agreement – Error Correction


Read the following sentences. Find and correct errors in subject-verb agreement. 

  1. In general, our political culture seems to be increasingly populated by people who espouses outlandish or demonstrably false claims that often aligns with their political ideology. (2 errors)
  2. The good news are that psychologists and other social scientists are working hard to understand what prevent people from seeing through propaganda. The bad news are that there is not yet a consensus on the answer. (3 errors)
  3. Much of the debate among researchers fall into two opposing camps. One group claims that our ability to reason are hijacked by our partisan convictions: that is, we’re prone to rationalization. The other group — to which the two of us belong — claim that  the problem is that we often fails to exercise our critical faculties: that is, we’re mentally  lazy. (4 errors)
  4. However, recent research suggest a silver lining to the dispute: Both camps appears to be capturing an aspect of the problem. Once we understand how much of the problem is a result of rationalization and how much a result of laziness, and as we learn more about which factor play a role in what types of situations, we’ll be better able to  design policy solutions to help combat the problem. (3 errors)
  5. Some of the most striking evidence used to support this position come from an influential 2012 study in which the law professor Dan Kahan and his colleagues found that the degree of political polarization on the issue of climate change were greater among people who scored higher on measures of science literacy and numerical ability than it was among those who scored lower on these tests. (2 errors)
  6. But this “rationalization” account, though compelling in some contexts, do not strike us as the most natural or most common explanation of the human weakness for misinformation. We believe that people often just doesn’t think critically enough about  the information they encounter. (2 errors)
  7. A great deal of research in cognitive psychology have shown that a little bit of reasoning goes a long way toward forming accurate beliefs. For example, people who thinks more analytically (those who are more likely to exercise their analytic skills and  not just trust their “gut” response) is less superstitious, less likely to believe in  conspiracy theories and less receptive to seemingly profound but actually empty  assertions (like “Wholeness quiets infinite phenomena”). (3 errors)
  8. Our results strongly suggests that somehow cultivating or promoting our reasoning abilities should be part of the solution to the kinds of partisan misinformation that circulate on social media. And other new research provide evidence that even in highly political contexts, people is not as irrational as the rationalization camp contends. (4 errors)
  9. Recent studies has shown, for instance, that correcting partisan misperceptions do not backfire most of the time — contrary to the results of Professors Nyhan and Reifler described above — but instead lead to more accurate beliefs. (3 errors)
  10. We are not arguing that findings such as Professor Kahan’s that supports the rationalization theory is unreliable. Our argument is that cases in which our reasoning goes awry — which are surprising and attention-grabbing — seems to be exceptions  rather than the rule. (3 errors)

Passive Voice & Modal Verbs


Noticing Passive Voice and Modal Verbs

Instructions: Read the following passages from the article “Why Do People Fall for Fake News?” and underline all passive voice expressions that you can find. Highlight modal verbs.

Paragraph 1. What makes people susceptible to fake news and other forms of strategic misinformation? And what, if anything, can be done about it?

Paragraph 2. In general, our political culture seems to be increasingly populated by people who espouse outlandish or demonstrably false claims that often align with their political ideology.

Paragraph 3. Much of the debate among researchers falls into two opposing camps. One group claims that our ability to reason is hijacked by our partisan convictions: that is, we’re prone to rationalization. The other group — to which the two of us belong — claims that the problem is that we often fail to exercise our critical faculties: that is, we’re mentally lazy.

Paragraph 4. However, recent research suggests a silver lining to the dispute: Both camps appear to be capturing an aspect of the problem. Once we understand how much of the problem is a result of rationalization and how much a result of laziness, and as we learn more about which factor plays a role in what types of situations, we’ll be better able to design policy solutions to help combat the problem.

Paragraph 5. The rationalization camp, which has gained considerable prominence in recent years, is built around a set of theories contending that when it comes to politically charged issues, people use their intellectual abilities to persuade themselves to believe what they want to be true rather than attempting to actually discover the truth.

Paragraph 6. Some of the most striking evidence used to support this position comes from an influential 2012 study in which the law professor Dan Kahan and his colleagues found that the degree of political polarization on the issue of climate change was greater among people who scored higher on measures of science literacy and numerical ability than it was among those who scored lower on these tests. Apparently, more “analytical” Democrats were better able to convince themselves that climate change was a problem, while more “analytical” Republicans were better able to convince themselves that climate change was not a problem. Professor Kahan has found similar results in, for example, studies about gun control in which he experimentally manipulated the partisan slant of information that participants were asked to assess.

Paragraph 7. The implications here are profound: Reasoning can exacerbate the problem, not provide the solution, when it comes to partisan disputes over facts. Further evidence cited in support of this argument comes from a 2010 study by the political scientists Brendan Nyhan and Jason Reifler, who found that appending corrections to misleading claims in news articles can sometimes backfire: Not only did corrections fail to reduce misperceptions, but they also sometimes increased them. It seemed as if people who were ideologically inclined to believe a given falsehood worked so hard to come up with reasons that the correction was wrong that they came to believe the falsehood even more strongly.

Paragraph 9. This body of evidence suggests that the main factor explaining the acceptance of fake news could be cognitive laziness, especially in the context of social media, where news items are often skimmed or merely glanced at.

Paragraph 10. To test this possibility, we recently ran a set of studies in which participants of various political persuasions indicated whether they believed a series of news stories. We showed them real headlines taken from social media, some of which were true and some of which were false. We gauged whether our participants would engage in reasoning or “go with their gut” by having them complete something called the cognitive reflection test, a test widely used in psychology and behavioral economics. It consists of questions with intuitively compelling but incorrect answers, which can be easily shown to be wrong with a modicum of reasoning.

Paragraph 11. In follow-up studies yet to be published, we have shown that this finding was replicated using a pool of participants that was nationally representative with respect to age, gender, ethnicity and region of residence, and that it applies not just to the ability to discern true claims from false ones but also to the ability to identify excessively partisan coverage of true events.

Paragraph 12. Our results strongly suggest that somehow cultivating or promoting our reasoning abilities should be part of the solution to the kinds of partisan misinformation that circulate on social media. And other new research provides evidence that even in highly political contexts, people are not as irrational as the rationalization camp contends.

Paragraph 13. Our argument is that cases in which our reasoning goes awry — which are surprising and attention-grabbing — seem to be exceptions rather than the rule. Reason is not always, or even typically, held captive by our partisan biases. In many and perhaps most cases, it seems, reason does promote the formation of accurate beliefs.

Paragraph 14. Our research suggests that the solution to politically charged misinformation should involve devoting resources to the spread of accurate information and to training or encouraging people to think more critically. You aren’t doomed to be unreasonable, even in highly politicized times. Just remember that this is also true of people you disagree with.


Error Correction – Passive Voice and Modal Verbs

Instructions: The following sentences contain errors in the use of passive voice and modal verbs. Find and correct these errors.

Adapted from “Why Do People Fall for Fake News?”

  1. What makes people susceptible to fake news and other forms of strategic misinformation? And what, if anything, can done about it? (1 error)
  2. Once we understand how much of the problem is a result of rationalization and how much a result of laziness, and as we learn more about which factor plays a role in what types of situations, we’ll better able to design policy solutions to help combat the problem. (1 error)
  3. The rationalization camp is build around a set of theories contending that when it comes to politically charge issues, people use their intellectual abilities to persuade themselves to believe what they want to be true rather than attempting to actually discover the truth. (2 errors)
  4. Some of the most striking evidence use to support this position comes from an influential 2012 study. Apparently, more “analytical” Democrats better able to convince themselves that climate change was a problem, while more “analytical” Republicans better able to convince themselves that climate change was not a problem. Professor Kahan has found similar results in, for example, studies about gun control in which he experimentally manipulated the partisan slant of information that participants asked to assess. (4 errors)
  5. This body of evidence suggests that the main factor explaining the acceptance of fake news could cognitive laziness, especially in the context of social media, where news items are often skim or merely glance at. (2 errors)
  6. The study participants completed something called the cognitive reflection test, a test widely use in psychology and behavioral economics. It consists of questions with intuitively compelling but incorrect answers, which can be easily show to be wrong with a modicum of reasoning. (2 errors)
  7. In follow-up studies yet to be publish, we have shown that this finding was replicate using a pool of participants that was nationally representative with respect to age, gender, ethnicity and region of residence, and that it applies not just to the ability to discern true claims from false ones but also to the ability to identify excessively partisan coverage of true events. (2 errors)
  8. Our results strongly suggest that somehow cultivating or promoting our reasoning abilities should part of the solution to the kinds of partisan misinformation that circulate on social media. (1 error)
