In Science Fictions, psychologist Stuart Ritchie explores how the scientific enterprise systematically goes awry, and what can be done to right the ship. The book is focused on, essentially, the inverse of the problem that Heterodox Academy was created to solve. The key problem with science today, as Ritchie sees it, is not that there is too much orthodoxy, but rather that there is too much focus on critique and innovation at the expense of careful, iterative knowledge building.
Everyone wants to coin their own specialized terms and acronyms, to establish new paradigms, to pilot promising new interventions. Novelty, big effects and positive results are what determine whether research is perceived as valuable.
As a consequence, we end up with an overwhelming number of false positives (findings that do not replicate or generalize), misinformation cascades, and a failure to publish and learn from null results. We have people reinventing the same wheel, over and over, in slightly different form. Or coming up with the same null findings — which are never published (because positive results are what ‘matter’) — leading subsequent scholars to pursue the same dead ends again and again.
Staggering amounts of money, labor and other resources are wasted in the process. But really, that’s the least of it. People continue to suffer and die from problems that could be much more tractable in a world where scientists were not going in circles like this. Still more are harmed or killed by social, psychological and medical interventions based on these erroneous studies. These outcomes generate understandable skepticism towards science, technology, expertise and institutions of knowledge production. However, the good is often undermined alongside the bad, leading to still more adverse consequences as sound science is mistrusted, underutilized or defunded.
Of course, any critique of a status quo implies normative judgements of what should be the case. Ritchie recognizes this off the bat, and opens up the book by endorsing as his ideal the Mertonian norms of science: universalism, disinterestedness, communality and organized skepticism. The book is then organized as an exploration of how contemporary institutional structures and cultural trends often push scientists away from these ideals, as manifested in the central problems the book sets out to describe: fraud, bias, negligence and hype.
Lying for Truth, Exaggerating for Justice
Given the ubiquity of these problems, and their profound negative consequences, it would be easy for scientists to emerge almost as villains in Ritchie’s story. It is to the author’s credit that he invites readers to be sympathetic instead.
Yes, careerism is rampant in the sciences. The drive to ‘make a name’ for oneself, to defend one’s reputation or contributions, to rack up publications and citations in prestigious journals, to secure funding, etc. plays a central role in the story. However, academic corruption plays out in subtle ways – and it is very rare that scholars are motivated by outright disregard for the truth or others’ well-being. Instead, as Tetlock memorably put it, the road to scientific hell is often paved with good moral intentions.
For instance, in addition to careerist concerns, scholars who commit fraud often seem genuinely convinced that the narratives advanced by their papers are, in fact, true. Fraud is often motivated, in part, by a desire to amplify what scientists believe to be the truth when their experiments fail to provide the expected confirmatory data (see pp. 70-3). In other cases, scholars are convinced that a new treatment or intervention can help people, but feel like they need eye-popping results to draw attention or secure funding for it – leading them to either massage the data or overhype their findings.
With respect to negligence, ideological precommitments also play a key role. Scholars regularly catch errors in findings that seem to disconfirm their desired or expected results. However, when data instead confirm our preferred narratives or hypotheses, we tend to trust their reliability, and don’t scrutinize them much (pp. 126-7). This asymmetry is typically not cynical or intentional – it is a product of how our brains are wired. Hedging against these tendencies requires constant vigilance and institutional guardrails.
There are also structural problems at work, such as the privatization of academic journals, scientific data, key technologies and techniques (especially when funded by corporations), or the neoliberalization of higher learning (which exacerbates careerist impulses, and pushes scholars toward competition over collaboration and transparency). There is the drive for sensationalism in corporate media and social media. In many cases, there is state interference (Ritchie’s discussions on China are especially striking; the data are breathtaking). These restrict and distort the flow of information, create perverse incentives, and weed out people unwilling to ‘play the game.’ All of us have to navigate and work within these structural constraints; virtually all of us, at some point, end up making compromises of one kind or another in order to succeed. Ritchie recognizes and sympathizes with these realities – but insists we must not simply resign ourselves to them. The stakes are too high.
What Is to Be Done?
Detailing a problem is easy. Thinking through what, concretely, can be done about it is much harder. Given the enormity of the problems detailed in the book, it would be easy to think the situation is pretty much hopeless. However, throughout, Ritchie emphasizes that his text is not a screed against science, but a call to action for reforming it. The book’s final section explores what those reforms might look like.
Some of the proposals relate to individual scientists, and ways they can protect themselves against the temptations of fraud, hype, negligence and bias. However, there are limits to what any person can do on their own. Blind spots, motivated reasoning and perverse incentives are not something we can unilaterally overcome. The key insight of the scientific method is that, by putting people together with different experiences, priors and commitments, at different institutions, with different specializations, etc. – we can collectively transcend our individual limitations, and produce knowledge that is more complete, and more reliable, than any of us could produce on our own.
Hence, the concluding chapters of Science Fictions also explore institutional and cultural changes that could be made to address the problems of bias, fraud, negligence and hype. Some of Ritchie’s proposed solutions are in the process of being piloted or built out – and the book does a good job of drawing attention to the heroic efforts already underway. Others may be more difficult to accomplish because there are powerful economic and political interests arrayed against them – and it will be up to scientists to rally against those special interests, and to make the case to the public to do the same.
Of course, some of the proposed solutions may not be feasible for various reasons; others may not work out as expected if implemented. There may be important alternative tactics that are overlooked or ignored here. The author seems well aware of these possibilities. The goal of these final sections is not to provide a definitive roadmap or all the answers, but to mobilize more people to start thinking through these issues, collaborating with one another, and working towards reform.
Bias: Political and Otherwise
Let me close this summary by briefly discussing Ritchie’s treatment of bias – a topic much studied and discussed by Heterodox Academy and its sympathizers.
Ideological bias is sexy for the media. It also happens to be a problem that is particularly acute in social science, where most of those studying bias reside. Ritchie, a social scientist, does briefly discuss ideological bias in his book (pp. 115-119). However, most of the chapter on bias is focused on other forms which may seem more banal, but are also far more common and substantial – cutting across a much broader range of fields (from social science, to biomedical research, to the ‘hard sciences’). Consider, for instance, biases towards positive findings (including statistical significance), novelty, large effects, demonstrating interventions as efficacious, and delivering results that confirm the theories and methods we are heavily invested in, or that suggest a return on investment for those funding the research.
In his discussion of ideological bias, Ritchie does acknowledge homogeneity in social research fields as a problem – be it ideological or demographic. Yet he insists that the solution should not be trying to achieve some arbitrary ratio, or to produce unapologetically right-biased work as a ‘corrective’ to leftwing bias. Instead, he argues, the focus should be on removing unfair barriers to participation in the scientific enterprise, and on improving the process of knowledge production to better identify and account for biases across the board.
This has also been Heterodox Academy’s longstanding position. The goal is not necessarily to have the ideological and demographic makeup of higher ed institutions perfectly mirror base rates in the broader society – but instead to ensure that 1) as many people as possible feel as though they have a voice and a stake in these institutions, 2) no particular faction is in a position to ‘capture’ these institutions (imposing their views on others, censoring or purging dissent), 3) processes like hiring, promotion, admissions, peer review, IRBs, grant allocations, etc. are as rigorous and fair as they can be, and 4) scholars are mindful of the ways their own thinking and research can go awry.
With respect to this latter point, the chapter on bias is somewhat masterful. It opens with a vignette that seems to be a clear example of bias (pp. 81-4), but which is transformed at the conclusion of the chapter into a cautionary tale for those who would level such accusations against others (pp. 119-21). Scholars who accuse others of biases often have unexamined biases of their own, which are highly germane to the topic under consideration. In their zeal to point out others’ errors, debunkers often make their own; in attempting to correct excesses in one direction, they push things too far in another.
In short, the one lesson that everyone should take from the research on biases is that no one is immune to them.
And yet people generally view themselves as an exception to social rules. We think that the forces that bind and blind everyone else do not govern our own attitudes and behaviors to the same extent, if at all. There is a tendency to view oneself as smarter, more ethical, and less biased than most other people. In principle, social scientists should know better than to help themselves to these same assumptions. However, our positionality as social observers, ironically, often reinforces this sense of exceptionalism. As Koppl put it,
“In examining human society, we may easily forget that we too are humans in a society. We see society as an anthill and people as ants. We gaze down upon the anthill as if we were higher beings… The very act of theorizing society puts you in a spurious godlike position.”
(Expert Failure, p. 19)
Sociologist Andrew Abbott described this tendency among social scientists to ignore the conclusions of research when it comes to their own lives and decision making as ‘knowledge alienation’:
“As social scientists we are in the business of explaining other people’s behavior. But as humans we live our own lives as if we were free moral beings, Kantian individuals. We don’t explain our own lives, we live them: it is other people’s lives we explain.”
(Processual Sociology, p. 255)
This is why reflexivity and viewpoint diversity are so crucial for social research – they help us to reckon with the otherwise alienated knowledge that, in fact, we are no better than everyone else.
Being smart is no protection against bias. Instead, there is an abundant and growing literature suggesting that highly intelligent and educated people may, in many respects, actually be more prone to dogmatism and motivated reasoning than most.
Being involved with a movement like Heterodox Academy does not provide exemption either. If anything, a commitment to calling out groupthink, bias and agenda-pushing *in others* may divert our attention from taking a comparably hard look at ourselves and our allies.
In reality, there are no shortcuts. There are no exceptions. There is only the day-to-day, unglamorous work of trying to put these values into practice as best we can — prioritizing careful, iterative knowledge production over novelty, sensationalism and quick payoffs.
Read the full book: Ritchie, Stuart (2020). Science Fictions: How Fraud, Bias, Negligence and Hype Undermine the Search for Truth. New York, NY: Metropolitan Books.