Steven Pinker and a whole host of scientists write a thoughtful article on censorship in science, and while I think he underplays the importance of nefarious sources of censorship from the government and Big Tech, his focus on how scientists censor themselves and each other is perfectly justifiable.
Scientists do, in fact, censor other scientists for a variety of reasons, and in almost all cases this is, in my view, not only bad but extraordinarily arrogant. Scientists are placing their judgments about what others should be allowed to know above the search for truth.
They are, in other words, betraying the core value that drives science: truth.
Does science get censored? Yes. Who does it? Scientists. My new paper in PNAS (spearheaded by Cory Clark, w 38 coauthors). https://t.co/E2OIzlcNKP
— Steven Pinker (@sapinker) November 20, 2023
Pinker et al. call the reasons scientists often censor “prosocial,” a commonly used social science term. It means roughly what it sounds like: doing something for the good of society. In this case, what is being done is hiding a truth for the supposed good of others.
The fundamental principle of science is that evidence—not authority, tradition, rhetorical eloquence, or social prestige—should triumph. This commitment makes science a radical force in society: Challenging and disrupting sacred myths, cherished beliefs, and socially desirable narratives. Consequently, science exists in tension with other institutions, occasionally provoking hostility and censorship (1). In liberal democracies, government censorship of science is rare (although see ref. 2). The greatest threats to scientific openness are often more diffuse and disguised as legitimate scientific criticism (e.g., rejection of dangerous and false information) (3).
Let me be clear: there are cases in which hiding facts is justified. I just don’t think scientists should be the ones making those judgments. Keeping secret the means of making a world-destroying bomb or the identities of our spies is perfectly justified. But we aren’t talking about these edge cases; we are talking about the building blocks of knowledge.
Hard censorship occurs when people exercise power to prevent idea dissemination. Governments and religious institutions have long censored science (26). However, journals, professional organizations, universities, and publishers—many governed by academics—also censor research, either by preventing dissemination or retracting postpublication (27–31). Soft censorship employs social punishments or threats of them (e.g., ostracism, public shaming, double standards in hirings, firings, publishing, retractions, and funding) to prevent dissemination of research. Department chairs, mentors, or peer scholars sometimes warn that particular research might damage careers, effectively discouraging it (32). Such cases might constitute “benevolent censorship,” if the goal is to protect the researcher.
All of these examples happen frequently in academia. And while claims such as “97% of scientists agree” are usually overblown, at least sometimes a consensus exists in an area of inquiry simply because it is career-ending to NOT agree. This happens much more frequently than you would expect.
Science is done by human beings after all, and human beings act in these ways all the time.
People disproportionately search for (55), share (56), and remember (even falsely) preferred information (57). In addition, people are selectively skeptical of discordant information (58) and more negatively evaluate scientific methods when results are undesirable (59, 60). Similar patterns occur among scientists. For example, peer reviewers evaluate research more favorably when findings support their prior beliefs, theoretical orientations, and political views (61–63). Scientific papers describe ideological outgroup members more negatively than ingroup members (64). Scholars are likelier to reject papers ostensibly written by little-known authors than identical papers ostensibly written by prominent authors (65). In an analysis of scientific papers, 96% of statistical errors directionally supported scientists’ hypotheses, suggesting credulity among scholars toward favorable outcomes (66). In addition, a survey of Society of Experimental Social Psychology members revealed that perceived undesirability of an empirical finding corresponded with disbelief in that finding (67). Confirmation bias and other forms of motivated cognition (68) can fuel a self-reinforcing dynamic in which censorship and self-censorship discourage empirical challenges to prevailing conclusions, encouraging a false consensus that further discourages dissent.
The entire article deserves a deep dive, and I encourage you to give it one, but what drew my attention to it was an observation Pinker makes that too few people consider: scientists, like everyone else, often want to achieve certain goals in society, and they present their own work, and evaluate others’, in light of those goals.
Science is influenced by values; it is not value-neutral. That is important to keep in mind whenever scientific claims are made in areas where they may have an impact on human behavior or society.
Censorship research typically explores dark psychological underpinnings such as intolerance, authoritarianism, dogmatism, rigidity, and extremism. Authoritarianism (76, 77), on the political right and left (78, 79), is associated with censoriousness, and censorship is often attributed to desires for power and authority (11). Although citizens in liberal democracies support free speech in the abstract, they often support censorship in ideologically challenging cases (80, 81). Censorship may also signal in-group allegiances (82), as members denounce others to gain status and affirm their group’s superiority (83).
But censorship can be prosocially motivated (84). Censorious scholars often worry that research may be appropriated by malevolent actors to support harmful policies and attitudes (4). Both scholars and laypersons report that some scholarship is too dangerous to pursue, and much contemporary scientific censorship aims to protect vulnerable groups (4, 85, 86). Perceived harmfulness of information increases censoriousness among the public (3, 87), harm concerns are a central focus of content moderation on social media (88), and the more people overestimate harmful reactions to science, the more they support scientific censorship (86). People are especially censorious when they view others as susceptible to potentially harmful information (89, 90). In some contemporary Western societies, many people object to information that portrays historically disadvantaged groups unfavorably (60, 91), and academia is increasingly concerned about historically disadvantaged groups (92). Harm concerns may even cause perceptions of errors where none exist (53, 86).
Prosocial motives for censorship may explain four observations: 1) widespread public availability of scholarship coupled with expanding definitions of harm (93) has coincided with growing academic censorship (94); 2) women, who are more harm-averse and more protective of the vulnerable than men (95, 96), are more censorious (48, 77, 78); 3) although progressives are often less censorious than conservatives (86), egalitarian progressives are more censorious of information perceived to threaten historically marginalized groups (91, 97); and 4) academics in the social sciences and humanities (disciplines especially relevant to humans and social policy) are more censorious and more censored than those in STEM (98, 99).
As a practical matter, what is happening here is similar to what happens elsewhere: science that supports a preferred Narrative is emphasized, and science that fails to support it, or even undermines it, is tossed into the garbage bin. Even robust findings, such as the odd one that eating ice cream is associated with better outcomes for diabetics, can be ignored because they cut against the scientists’ preferences.
Science is a human activity, and as such we should be as skeptical of scientists as we are of politicians or the claims of any activist. Not because we distrust science itself, or because scientists are any worse than any group of people, but because they are, in the end, no better than any other group.
Science is a powerful tool, but it is a tool wielded by human beings and must be understood to be so.