8. Ethics
8.4. Veracity and Objectivity in Research
Victor Tan Chen; Gabriela León-Pérez; Julie Honnold; and Volkan Aytar
Learning Objectives
- Discuss the importance of objectivity in social scientific research.
- Identify possible political concerns related to research ethics.
As we describe in Appendix A: Presenting, Writing, and Publishing, it is critical for researchers to provide a full and detailed accounting of how we did our work (typically described in the methods section of a presentation or paper) and what our analysis concluded. When we disclose all this information, then those who draw on our work—say, to conduct other research, or to formulate and implement policies—can trust it and make the best possible decisions. This transparency also assures readers that we have conducted legitimate research and did not simply come to whatever conclusions we wanted to find. In the appendix, we talk about these practices as a matter of ensuring the quality of our research and contributing to the scientific conversation: we want fellow sociologists to know all the details so that they can assess our work fairly and fully, and so that others might be able to easily replicate or build on our work. But exhaustively providing details about the research we do is also a moral issue, one that has, sadly, continued to crop up in public debates over the truthfulness of scientific research.
Full transparency in research entails being completely honest about how we identified and recruited our research subjects, how we collected and analyzed our data, what the specific limitations of our methodological approaches were, and what our full and unvarnished findings were—even when they contradict what we expected or hoped, or what prominent scholars have concluded. Furthermore, we as researchers should be ready to share any data we collected (properly anonymized to protect people’s identities, of course) with the larger scientific community. Sometimes researchers leave out methodological details simply because there is not enough time or space to share them. This is often the case with news reports of research findings, which (as we also discuss in the appendix) usually must be condensed and simplified.
Researchers may also have less justifiable reasons to conceal information. To prevent others from challenging their work, they may be silent about the weaker parts of their studies, or hide or fudge findings they may not like. They may refuse to share their data—perhaps in the hopes that they can crank out as many scholarly publications as possible before other researchers can catch up. And they may jealously guard the details of how they did their work—because they fear copycats, perhaps, or because they want to hide flaws in their studies. Yet science relies on transparency to work its magic of generating verified knowledge and promoting creative and critical thinking. Good researchers are upfront about their work—or at least as upfront as they can possibly be, given other ethical obligations like protecting the identities of their participants. And as we’ve stressed previously, every research study—especially any ambitious one—has flaws and shortcomings. We, as serious scholars, have to own up to them. In fact, if you find that other researchers are less than open about their methods, data, and findings, you might want to question whether you should believe what they say.
As scientists, we have an obligation to uphold veracity in our research—put simply, to tell the truth about whatever we find. But for professional and even financial gain, sometimes scientists do not fulfill these obligations. There are many examples to pick from, but one relatively recent and egregious example is the fraudulent research conducted by Harvard evolutionary biologist Marc Hauser. An early sign of problems with Hauser’s research was a 1995 study he led, published in the prestigious Proceedings of the National Academy of Sciences, which concluded that cotton-top tamarins could recognize themselves in mirrors. Psychologist Gordon Gallup reviewed some of the available videos of Hauser’s experiment and saw “not a thread of compelling evidence” of such behavior among the tamarins featured in the recordings (Johnson 2010). When he asked for the remaining tapes, he was told they had been stolen (Economist 2010). In response to the critique by Gallup, Hauser first defended the study and clarified the criteria used to code the video data; in 2001, however, he said his subsequent work had failed to replicate the earlier finding that tamarins recognized themselves in mirrors.
Rumors circulated for years about Hauser’s dubious research, and in 2007, Harvard University opened up an investigation, ultimately finding him responsible for eight counts of scientific misconduct. Although details of the internal investigation remain secret, individuals who worked with Hauser have alleged that Hauser falsely coded video data of monkey behavior—even pressuring research assistants who wanted to accurately recode the data not to do so. In the face of these criticisms, Hauser resigned from Harvard in 2011, and a year later, the Office of Research Integrity, a U.S. federal agency overseeing researcher ethics, found him guilty of scientific misconduct.
Fraudulent research can cause harm that goes well beyond misled and irate colleagues. This is especially the case nowadays, when scientific findings are regularly attacked and dismissed. During the early years of the Covid-19 pandemic, many people refused to take the various vaccines that were quickly developed for the virus and its variants. Hesitation about any new technology is understandable, and every medical treatment has risks, but the international scientific community was flabbergasted by the extent to which nonscientists challenged its work, or latched onto unproven treatments for Covid that, at best, had support from small-n, one-off studies. Vaccines in general had a very safe track record, and studies of Covid vaccines had found only small risks associated with them, especially relative to the acute and chronic conditions caused by Covid. The incendiary public debates over Covid vaccines and treatments often drew from anecdotal evidence, misrepresentations of scientific studies, or outright false claims. And while opposition to Covid vaccines in the United States became closely associated with conservative politics, it built on a broader current of skepticism about vaccines that had thrived in the decades before Covid. Across the country, parents’ decisions not to vaccinate their children based on unfounded fears have regularly led to outbreaks of infectious diseases like measles, which had been declared eliminated in the United States in 2000 (Centers for Disease Control and Prevention 2020).
There is only so much that scientists can do in the face of misinformation (inaccurate information) or disinformation (deliberately misleading information), but at the very least, they can present research—whether or not it is their own, and whether or not it supports the findings of their past work—in an impartial and fair fashion. They can also help nonscientists understand how to evaluate research and think critically about the claims being made (Dyer and Hall 2019). Unfortunately, well-publicized scandals in science have eroded public trust and fed contrarian views with no real empirical support. The vaccine debate provides another relevant example. A 1998 paper published in the Lancet, a top-ranked British medical journal, made the claim that the MMR vaccine (which combines vaccines against measles, mumps, and rubella) was linked to the development of autism in children. Other studies failed to replicate this finding, however.
The reporting of investigative journalists eventually uncovered evidence that the lead researcher on this study, physician Andrew Wakefield, had fudged diagnoses and dates to fabricate a causal relationship between the vaccine and autism. Wakefield reportedly had been set to make millions from selling a diagnostic kit and a competing vaccine that would benefit from skepticism about the MMR vaccine. The Lancet and Wakefield’s coauthors retracted the paper, and Wakefield was discredited and stripped of his license to practice medicine. Although research in subsequent decades has continued to find no relationship between autism and the MMR vaccine—or any other vaccine—the Lancet study appears to have fueled skepticism about vaccines that continues to this day.
The scandal over the retracted Lancet study also underscores problems with bias in research. Wakefield clearly had a conflict of interest—a situation where a researcher would derive a financial or other personal benefit from a particular research finding, and therefore might favor such a finding (consciously or unconsciously) while conducting their research. He was studying whether the MMR combination vaccine contributed to the development of autism, yet finding a causal relationship would have brought him substantial financial gain—meaning his objectivity as a researcher was in question, and it would be hard to trust the conclusions of his study.
Typically, reputable journals and publishers require authors to disclose any conflicts of interest. While Wakefield’s conflict of interest was particularly sensational, more mundane but still troubling conflicts might include receiving grant funding from an organization that favors the policy, program, or intervention you are studying, or conducting research that would benefit a family member’s financial or career stake in the issue being studied. To track potential conflicts of interest, IRBs sometimes ask researchers to regularly report their financial interests (such as their holding of stock in certain companies), which are recorded officially and can be used to identify potential conflicts. When such situations actually do arise, a journal’s editors may decide not to publish a scientist’s work, or they may publish it with a statement (typically at the end of a journal article) that fully discloses the existence of that conflict of interest. In the latter case, an editor might think the researcher can still conduct the research in an impartial fashion even though the conflict exists. Yet this determination is always a subjective judgment call, and to some extent, the reader will need to decide how large of a grain of salt should accompany their take on the study’s results.
Conflicts of interest are exceedingly common in the social sciences, given that the work we do often has real-world consequences for policymaking. In fact, in the aftermath of the 2008 financial crisis that tanked the global economy, many economists got into trouble for failing to disclose the financial interests they had even as they commented on issues relevant to those interests. Some economists, for instance, produced papers and reports before the crash that dismissed some of the worrying signs of the economy’s fragility and opposed efforts to regulate the financial industries whose fraudulent and reckless practices ultimately brought about disaster. In the documentary Inside Job (2010), an Academy Award-winning analysis of the factors that brought about the financial crisis, filmmaker Charles Ferguson famously criticized a number of these economists, including scholars working for both Democratic and Republican administrations (Inman and Kingsley 2011). For example, in a contentious interview with Glenn Hubbard, former chair of President George W. Bush’s Council of Economic Advisers, Ferguson confronted the Columbia University economist about his strong advocacy for deregulating financial markets while also being paid hundreds of thousands of dollars by financial firms for serving on their boards or consulting for them—positions that were sometimes disclosed on his website, but not mentioned in his writings (Solman 2011). Afterward, Columbia strengthened its conflict-of-interest policies, and the American Economic Association, the professional association representing the country’s economists, adopted stronger standards for such disclosure in its code of ethics (Lahart 2011).
Bias in research does not have to be deliberate or devious. People are not always aware of how much their personal interests or background can influence their take on things, including their take on their own research: remember the power of confirmation bias in skewing our findings. For example, pioneering American and European anthropologists who immersed themselves in fieldwork in remote indigenous communities produced powerful accounts of life in small and close-knit villages without many modern technologies. Yet even luminaries like Margaret Mead—who in her classic ethnography Coming of Age in Samoa (1928) described Samoan youth as having sex in a free and open manner—have been accused of “romanticizing” or misrepresenting aspects of another culture, arguably because their Western perspective shaped how they interpreted what they observed.
Social scientists, particularly those who use qualitative methods, are more skeptical today about outsiders studying cultures they are not part of and do not fully understand. They worry that ethnographers do not always fully appreciate the impact of their work on those they study or the voyeuristic way they can approach less advantaged communities—for instance, how they personally profit through their academic publications from the “deviant” and “exotic” behaviors they observe in low-income neighborhoods. On the flip side, ethnographers have also been criticized for uncritically embracing the viewpoints of the communities they study—which, at an extreme, can reach the point of abetting immoral and even illegal activities. (We will discuss these various pitfalls at greater length in Chapter 9: Ethnography.) Columbia University sociologist Sudhir Venkatesh, who penned the blockbuster sociology trade book Gang Leader for a Day (2008), was subsequently criticized for contributing to the criminal enterprise of the drug gang he studied. Separately, Venkatesh came under fire for being careless with facts in his writing and not being able to account for missing funds at a center he oversaw at Columbia (Kaminer 2012).
Ethnographer Alice Goffman wrote another widely read book, On the Run (2015), which followed young African American men as they interacted with police in a low-income neighborhood in Philadelphia. Like Venkatesh, fellow sociologists at first praised her work as providing an unprecedented and empathetic look at individuals on the margins of society as well as the institutional forces that hemmed in their lives. However, Goffman later received criticism for becoming too close to her informants, such as when—according to her own account—she drove an armed informant around as he sought revenge for the murder of another of Goffman’s respondents. In addition, some academics questioned the accuracy of police practices described in her book and even alleged she had exaggerated or made up facts about her observations. Journalist Jesse Singal tracked down some of the families and individuals described with pseudonyms in Goffman’s work and confirmed they were real (itself an illustration of both the distinct professional norms of journalists and the difficulty of preserving confidentiality for participants). Singal concluded that “her book is, at the very least, mostly true” and that her critics’ accusations of embellishment or factual errors were likely due to the fact that Goffman “simply didn’t heed her own advice about credulously echoing sources’ stories” (Singal 2015b, 2015a).
This brings us back to the divide between the ethical tenets of professional journalism (as much as they are often savaged) and those of sociology. From a principled perspective, journalists see themselves as doggedly pursuing the truth—anyone else’s feelings be damned. As rock journalist Lester Bangs advised a cub reporter in the film Almost Famous (2000): “Be honest, and unmerciful.” For many journalists, the professional imperative is to hold people in power to account. Therefore, they are more likely to conduct adversarial interviews, challenging people’s opinions or narratives of events, laying down verified facts, and catching any lies or contradictions. By contrast, the model in sociology tends to be empathetic interviewing, in which the goal is to listen without judgment, build rapport with the interviewee, and gather as much detailed and in-depth information as the interviewee is willing to provide. Such an orientation toward interviewing fits well with our ethical obligations as sociologists not to exploit or harm participants, treating them with a recognition of our position relative to them and the power we wield in that relationship (see the discussion of manipulation in the sidebar The Journalist and the Ethnographer). One downside of this approach is that it may lead sociologists to naively parrot what their respondents tell them, without checking whether their claims really hold up. To manage this risk, it is a good idea to speak to multiple sources and triangulate one’s findings—a strategy we discuss further in later chapters. In rare cases, sociologists have gone so far as to hire professional fact-checkers to verify the details in their books, as Matthew Desmond did for his Pulitzer Prize-winning book Evicted (2016); fact-checking is expensive, however, and therefore out of reach of most researchers.
Of course, as we discussed in Chapter 3: The Role of Theory in Research, some social scientists dispute the idea of “truth” altogether, and they push back against notions like journalistic “objectivity” that can seem to them like a convenient excuse to impose a mainstream or establishment interpretation of the facts. It is also true that, as sociologists, we are not so much interested in individuals—say, the decisions of one specific leader—but rather in how the actions and thinking of those individuals reflect broader social forces. This places less weight on the “truth” of particular persons or situations, given that they are being used as representations of more abstract cultures or social structures. Sociologists might also pay more attention to who is telling a particular story. For instance, another criticism leveled against Goffman’s study of young African American men running from the law was that she is a white woman from a privileged background (in fact, she is the daughter of the influential sociologist Erving Goffman). Was it appropriate for someone like her to tell the story of the men she followed (Brown 2017)? Social scientists often disagree about these thorny ethical issues, and as we’ve suggested, journalists can legitimately take a very different approach based on the distinct professional norms they operate under. Even if we as sociologists take to heart the postmodernist critiques of “objectivity” and “truth,” it is always worth considering how fair our interpretation of reality is, and whether it includes a sufficient range of viewpoints and facts to capture what is going on in a reasonable, open-minded, and valid way.
Politics in Research
A final point we want to make about ethics is how much the political and cultural context we operate in can shape our ability as scholars to produce incisive, high-quality research. A repressive or polarized political context can have a chilling effect on important research that does not fall in line with popular ideologies.
Especially in countries with limited civil liberties, researchers may feel compelled to legitimize authoritarian practices, reinforce a regime’s narratives about politics and history, and even ignore or justify state crimes (Akçam 2007). Not doing so might mean a scholar’s ability to conduct research or advance in their career is stymied, given how many universities and research institutes are government-funded. Researchers who are closer to authorities may be given easier access to archives, opportunities to conduct larger surveys among the populace, and financial and other support from state-run institutions. In turn, scholars may omit important findings if they could be perceived as antagonizing those in power or challenging some of the myths the state is trying to propagate. After all, taking such a critical perspective can lead to serious personal consequences: throughout history, authoritarian regimes have targeted academics for imprisonment and even killing, seeing intellectuals as inherent threats to the docile, obedient order they desire. It is understandable why researchers who operate in repressive environments will toe the official line, but doing so amounts to a breach of the ethical obligations that social scientists have to be truthful in their research (Öztürk 2018).
Even in countries where scholars presumably have greater freedom to conduct research as they want to, governments regularly try to influence research in more or less innocuous ways. In the United States, scientific funding agencies like the National Science Foundation and the National Institutes of Health shape research agendas with the massive amounts of money they leverage to support specific lines of study. More troubling for many academics is the major role that military agencies play in funding research. For instance, the Stanford Prison Experiment that we described at the beginning of the chapter was funded by the U.S. Office of Naval Research. Henry Murray, the psychologist who subjected the Unabomber and other students to continuous humiliation in his Harvard laboratory, formerly served as a lieutenant colonel in the Office of Strategic Services (often considered to be the precursor to the CIA), where he conducted psychological experiments on mind control and psychedelics; bioethicist Jonathan Moreno (2006) and others have even argued that Murray’s Harvard experiments were part of the CIA’s classified Project MKUltra.
So far, we have been talking about how those with political power can silence or suppress the voices of academics. But it is important to note that the field of sociology can also reinforce political ideologies. As we discussed in Chapter 2: Using Sociology in Everyday Life, critics have pointed out the ideological conformity in the discipline, with one critic saying that sociology “is so dominated by the left that it is instinctively dismissed by the right” (Kristof 2014). As important as it is to have and hold tight to your own moral and political values, an intellectually honest student of social life quickly realizes how complex it is, and how a variety of different perspectives can be legitimately brought to bear to understand it. Ethically, social scientists should hold themselves to a higher standard than just echoing the acceptable opinions of people on their side of a debate. They should be open to being surprised and persuaded by what they uncover through empirical investigation. If you care about not just preaching to the choir but also convincing others with different perspectives, the most effective strategy may be to conduct your research as honestly and impartially as you can.
Studying Vulnerable Populations and Polarized Issues: A Q&A with Didem Danış
Didem Danış is an associate professor of sociology at Galatasaray University in Istanbul. She is also the chair of the Association for Migration Studies (GAR), a Turkish nongovernmental organization (NGO) she helped found. Born in Istanbul, Danış received her bachelor’s and master’s degrees at Turkish universities and then completed her doctorate at one of France’s top graduate academies—the School for Advanced Studies in the Social Sciences (EHESS), where she wrote her dissertation about the social networks of Iraqi transit migrants in Istanbul. An urban sociologist and scholar of migration, Danış studies how refugees and other migrants assert their agency in a context of structural constraints and deeply rooted historical inequalities. More broadly, she examines how transnationalism, social networks, and gender issues shape immigrant communities. In addition to her research in Turkey, Danış has collaborated on studies of migrants in France and the Middle East and has edited four volumes on topics relating to migration.
Why did you decide to study sociology?
When I was finishing high school, I was very confused. I couldn’t decide whether to study archeology, biology, or the social sciences. My grades were high, and I knew I could choose any major I wanted, but most of the people around me were trying to convince me to study business administration. At that time, globalization was at full throttle, and finance had replaced industry. Business administration was a major that seemed to guarantee high-salary white-collar jobs for its graduates.
Yet, I wanted to go against the current. I decided to study political science at Boğaziçi (Bosphorus) University in Istanbul, considered among the best institutions for social sciences in Turkey at that time. Political science was an exciting choice in a country whose political life was interrupted by military coups d’état almost every 10 years. However, at the end of my first year, I realized that it would not be possible to understand the political dynamics without understanding the society, and I decided to do a double major in sociology.
I became fascinated by sociology courses related to cities and the interaction between spatial and social dynamics. After all, I was living in Istanbul, an impressive metropolis and a global crossroads where different migration movements have intersected throughout history. My first readings on urban sociology made me better understand my family’s story, which is based on a migration journey, too—one within Turkey. As I read more sociological texts, I started to grasp the inequalities that various migrant groups encounter in the urban space. Istanbul’s population has been formed by various immigration waves, yet the city has not embraced every group with the same warmth or offered them equal opportunities.
Global cities are places of discrimination and injustice. But they are also sites of solidarity and struggle. Istanbul—a city of great opportunities and challenges, of opening as well as closing doors—has been my passion as a sociologist.
You have worked closely with national and international NGOs. How has your background as a sociologist influenced your work in the nonprofit sector? Have you found certain research methods to be particularly useful in these professional settings?
As sociologists engaged in the production of scientific knowledge, we have an important role to play in sharing our knowledge about society with the general public. I use the knowledge I have gained through scientific methods to strengthen the work of nonprofit organizations and answer the public’s questions about migration, one of the main challenges of our time. Since migration is a global issue, a bridge needs to be built between local, national, and international organizations. As an academic who specializes in the sociology of migration, I try to link these fields.
In environments such as Turkey where quantitative data are limited, inaccessible, or unreliable, qualitative research methods play a very valuable role, providing information that can support the efforts of NGOs and help them develop policy proposals. This is especially the case for migration issues. Informality and irregularity characterize the lives of many migrants. Researchers studying these communities therefore need to be flexible and adaptable—as Stephen Castles (2012:25–26) puts it, willing to “respond to the lessons of the field and to hear what respondents are saying.” In the course of research, frequent and unexpected problems arise. Good researchers reexamine and reformulate their original questions as needed.
You study Syrian, Afghan, and Iraqi refugees and immigrants in Turkey. We live in a world that is deeply divided on issues of immigration, with refugee flows in particular sparking anger and violence across countries. What role do you think research can play in helping societies address these demographic shifts in a healthy and constructive way?
In today’s public discussions of immigration, we see many features of the “post-truth” era. There are intense campaigns of disinformation and other efforts to manipulate people’s thinking about immigrants and refugees, especially on social media. By examining migration dynamics from the perspectives of both immigrants and host societies, sociologists can shed light on processes and experiences that are rendered invisible in everyday political debates. Take “replacement theory,” a conspiracy theory that is being peddled all over the world. Replacement theory argues that foreign powers bring immigrants into a country to disrupt its social structure. Over time, immigrants will gain demographic superiority and “destroy our homeland,” according to the theory’s proponents.
This idea is widespread in immigrant-receiving countries—even though such a drastic demographic shift is just not possible in many of these places. A sociological perspective helps us understand what makes these exclusionary discourses, nevertheless, so popular. People feel that others they consider to be different—minority groups—pose an ideological, social, or economic threat. They thus overstate the size of these groups. This sense of “symbolic threat” is stronger during periods of profound social change. As a sociologist, I try to clarify these dynamics, how they lead to outcomes like hate speech, exclusion, and violence toward immigrants, and what risks they pose for society.
You have received vicious attacks on social media because of your work with refugees. Have you been surprised by the backlash you’ve received? How have you been able to deal with this harsh public scrutiny, and how do you feel it affects your work as a scholar?
Xenophobic and racist rhetoric has become widespread online. Anyone who criticizes these discriminatory discourses and talks about migrants’ rights is targeted. Migration researchers get their share of these attacks because they examine the subject with scientific methods and produce evidence-based arguments against disinformation and manipulation. When I first encountered such offensive statements, I was surprised and frightened. Especially on social media, some people use a much more aggressive style—even prying into their targets’ private lives. They want to scare and silence scientists who speak out. Eventually, I developed a thicker skin against such attacks on social media, but what is still difficult for me is to hear the same unfounded claims in face-to-face interactions.
When we sociologists limit our speech to narrow scientific environments, we can comfortably express our ideas. But as someone who believes in public sociology, I think that we should share our knowledge with the wider public. To do so in the face of attacks, we need to use some tools that the sociological imagination provides—for example, drawing strength from online communities of colleagues who adopt similar values. The Facebook group Migration Researchers’ Platform, which now has more than 10,000 followers, has been a good support for me in this respect. The researchers involved with this group also founded the Association for Migration Research in Istanbul.
What sorts of ethical challenges have you encountered when you’ve conducted research on vulnerable populations like immigrants and refugees? Are there particular approaches you take to address those challenges?
A crucial issue in doing research with immigrants and refugees is being aware of the vulnerability and precarity within these communities. Researchers should do no harm—neither to the immigrants, nor to themselves. For me, this means being careful about my respondents’ privacy and anonymity and not revealing any confidential information I gathered, which may put them in danger if disclosed. Whenever I am in the field, self-reflexivity and positionality are my key considerations.
You live in a country where academics face intense political pressures regarding how they do their work. What impact does the political situation in Turkey have on you, as a researcher and otherwise?
In my country, freedom of expression has been seriously curtailed over the last ten years. The rise of authoritarianism places a heavy burden of fear on those who produce and disseminate critical knowledge. In particular, authoritarian regimes consider sociology to be a threat, since it is a discipline that studies society in a scientific and analytical way.
Political pressure is not the only danger. As in many other countries, scholars in Turkey are also dealing with the commodification of the academic world. The funding of research has become a mechanism of control and censorship. Both the public and private sector have an instrumentalist interest in the knowledge produced by researchers.
What advice would you give sociology students who want to do work similar to your own—research that helps disadvantaged communities and vulnerable populations?
I recommend that they pursue the subjects that they are sincerely curious about. Sociology has the potential to build bridges between the academic and the real world, and they should keep in mind their capacity to influence politics. As a sociologist, I want to study the society I live in, because I want to change it.
Key Takeaways
- Even though we sociologists often critique black-and-white notions of truth and objectivity, we can all follow standards of honesty and transparency in our research to ensure that our work is credible and does not contribute to misinformation.
- Political concerns and authoritarian pressures may hamper scientific research and further complicate notions of what is ethical and appropriate research.
Glossary
- Misinformation: Information that is inaccurate or misleading. (Compare to disinformation.)
- Disinformation: Information that is deliberately misleading. (Compare to misinformation.)
- Conflict of interest: Situations in which the financial or other personal benefits that researchers have received, or could receive, might compromise or bias their research by influencing their professional judgment and objectivity.
- Confirmation bias: A natural tendency to interpret data in ways that support, or “confirm,” one’s existing views, which can lead to flawed research findings that reflect the researcher’s personal biases.
- Triangulate: To use one research method to evaluate or extend the findings derived from another method.