Fast Science, Slow Science: Finding Balance in the Time of COVID-19 and the Age of Misinformation
By Daniel J. Dunleavy and Vincent F. Hendricks
The COVID-19 pandemic has placed politicians, community leaders, and everyday citizens in a difficult position. Decisions need to be made, but they are typically made with incomplete information, and often with a sense of anxiety and urgency. We discuss how the pandemic has changed science, for better and for worse, how this change affects decision-making, and how it interrelates with broader social phenomena, such as the hyperconnectivity of the information age and the consequent spread of misinformation. Taken together, these issues place us at high risk of implementing poor policies and making dangerous decisions about personal safety. We end by considering some ways that science can be made more rigorous, in order to inform policies related to the pandemic and other social problems, how scientists may play a more active role in policy decisions, and how citizens and community leaders can make better-informed decisions themselves.
The current crisis
Nearly nine months after its identification by Chinese authorities, the novel coronavirus (SARS-CoV-2), the virus behind COVID-19, has caused substantial harm and disrupted daily life globally. There have been roughly 34 million documented cases and nearly one million associated deaths. The long-term consequences for physical health are still being uncovered, but they appear to be substantial, at least for some infected persons. The economic and social costs of the pandemic are less clear but are by all accounts significant. Further, mitigation strategies, used to slow and prevent the spread of the virus, carry emerging costs and benefits, which may not be evenly distributed across socioeconomic and geographic groups, particularly when implemented uniformly and inflexibly. Simply put, the pandemic has caused loss and disruption of historic proportions.
The global community is in a precarious position. Decisions are being made with urgency, but often without complete information or scientific consensus. For example, stakeholders across all levels of society are tasked with deciding when and how to restrict travel and business, initiate and ease social distancing measures, reopen schools, and allocate relief funds. Any one of these decisions involves numerous considerations. The situation is made more difficult by twin problems: first, the deluge of research available to guide such decisions, and second, an information ecosystem that can produce, spread, and amplify falsehoods.
Fast science
Science has been turned upside down by the COVID-19 pandemic, for better and for worse. Research support structures, foundations, and governments have granted emergency funds for COVID-19 research, while academic publishers and professional societies have opened access to relevant publications. With these resources in hand, researchers have been working around the clock, often in international teams, scrambling to find effective treatments and a possible cure. But the pressure to succeed is constant. With each passing day, the number of cases and deaths grows while economies stall or contract; and external actors, such as state authorities, wait anxiously to approve vaccines, even before they have completed phase 3 trials, just to get the wheels turning again.
As finding effective treatments is a matter of extreme urgency, the speed at which science is conducted, vetted, and published has increased dramatically. Preprints (papers and materials posted online before traditional, formal peer review) have rapidly proliferated, while peer-review windows have narrowed, leaving scientists drowning in COVID-19 papers and results.
Increased speed has its benefits, but it is not necessarily conducive to truth, testing, and methodological rigor. It can lead to findings that cannot be independently reproduced and to support for theories without merit; support that may not be easily withdrawn. Science may then, unwittingly, become a key source for the spread of dis- and misinformation, phenomena that must be reckoned with not only on social media, but as part of an “infodemic” potentially taking hold of science proper; a problem highlighted by both Tedros Adhanom Ghebreyesus, the Director-General of the World Health Organization (WHO), and science publishers worldwide.
The attention economy
The unfortunate consequences of fast science and the problems of information overload are intimately connected to the trade-off between information on one end and attention on the other. As early as 1971, Herbert Simon, who would receive the 1978 Nobel Memorial Prize in Economics, said something prophetic about a coming age of information in abundance. He noted that “…in an information-rich world, the wealth of information means a dearth of something else: a scarcity of whatever it is that information consumes. What information consumes is rather obvious: it consumes the attention of its recipients.”
Attention is a very limited resource; after all, there are only 24 hours in a day. With the amount of information in the world growing at a staggering rate, attention becomes a very valuable asset. To attract attention, one may speculate on what sort of information users are willing to spend their precious attention on, thus realizing an attention economy in the information market. But whatever goes viral is not necessarily true, and whatever is true is not necessarily viral, even in science. This leaves open a market opportunity for circulating information (or scientific studies) that, while neither true nor tested, attracts scores and scores of attention.
There is, in turn, a lucrative market for fake news and manipulated information, particularly during the current pandemic. This market amplifies the reach and magnitude of misinformation, polarizes aspects of the COVID-19 pandemic, and permits everything from fraudulent personal advice on how to counter the virus to global conspiracy theories about its origin and purpose to set sail. Add to this mix the speed at which (mis)information travels across the web, and the perceived urgency of every decision pertaining to the virus, and the result is a toxic environment for slow, methodical science to have a voice and impact, and a seductive one for hasty, flashy headlines, in science and press alike.
Fanning the flames
In an information-rich environment, the crucial question is not whether we all (users, societies, companies, NGOs, etc.) have been granted a bullhorn to the world through profiles on social media, blogs, and platforms; it is whether our voices will be heard above the cacophony. Wild theories with little evidence or justification, catchy titles, and hasty conclusions that neglect the basic checks and balances of scientific methodology may drive an unfortunate wedge between the constitutive and regulative rules of science. Indeed, science may in general be compared to a game, the objective of which is to find a true, adequate, or correct model of relevant aspects of the world, using the methods of scientific inquiry as the constitutive framework. The regulative rules comprise everything from publication strategies and incentive structures to directives for research organization, management, and funding. In science, one may play according to the regulative rules while coming nowhere near a correct model, let alone a treatment or vaccine. Boosting results, stretching scientific findings beyond their explanatory value, and neglecting rigor and replicability may well be an optimal strategy for acquiring short-term social rewards, such as attention from funding agencies, prestige, or other research benefits, but it does nothing for the goal of knowledge acquisition. In fact, playing only according to the regulative rules while neglecting the constitutive ones may create science bubbles.
Rigorous science
The information age is part of our present reality and will continue to be part of our future. It will continue to shape our beliefs and sway our decisions, even ones based on scientific research. It will amplify the voices of those acting in “bad faith” and facilitate the spread of fake news. Despite this, there are ways that we, as scientists, policymakers, and global citizens, can exert control and make balanced, informed decisions; decisions founded on a more rigorous science.
Scientists and scholars can begin by opening and reforming the scholarly publication system. This can happen immediately. Researchers can make their data available (with few exceptions) for reanalysis by independent reviewers. Peer review, whether done before or after publication, can be made publicly accessible. Journals can encourage submission and review of manuscripts before data collection, as a means of prioritizing the strength of study design and methodology over flashy results. Together, these actions can improve the quality of research assessment, help prevent the validation and dissemination of fast but “sloppy” science, and take advantage of thoughtful deliberation, not merely by one or two experts but by a community of “truth seekers”. They also provide a more reliable foundation for decision-making and reorient researchers toward the constitutive rules of inquiry. While there is still much work to be done, many of these recommendations are already being put in place and are ready for widespread implementation.
Still further, scientists and scholars can use their unique skillsets and training to help others. As behavioral scientist Neil Lewis notes, scientists can help their local agencies and organizations collect, manage, and interpret data, or aid local officials and journalists in accessing and interpreting research studies. This may also include helping others sift through and combat (mis)information and COVID-19-related propaganda.
Likewise, there are steps that everyday citizens, community leaders, and policymakers can take. When faced with new information about a coronavirus treatment or cure, we can ask what the source of the claim is. Relying on multiple, reliable, and independent sources can help validate the accuracy of a story. Before rushing to judgment or action, reflect and ask yourself how this new information fits with your existing beliefs. Simply becoming aware of the nuts and bolts of attention economics and the market for fake news in the information age can go a long way toward sharpening our critical thinking faculties. And we need these faculties right now during the COVID-19 pandemic … in science and for society.
Authors’ note: This post was jointly written by Daniel Dunleavy and Vincent Hendricks in September 2020.
It is archived at Zenodo at: https://doi.org/10.5281/zenodo.4056908 and may be considered a companion piece to: Scientific Practice in the Time of COVID-19: It’s Time to Commit to Change.
Author information:
Daniel J. Dunleavy, PhD, Center for Translational Behavioral Science, Florida State University, Tallahassee, Florida, USA; Twitter: @Dunleavy_Daniel
Vincent F. Hendricks, PhD, Center for Information and Bubble Studies (CIBS) University of Copenhagen, Copenhagen, Denmark; Twitter: @infostorms