Brandolini's law
Brandolini's law, also known as the bullshit asymmetry principle, is an internet adage coined in 2013 that emphasizes the effort required to debunk misinformation compared with the relative ease of creating it in the first place. The law states the following:
The amount of energy needed to refute bullshit is an order of magnitude bigger than that needed to produce it.
The ease with which ideas can be popularized through the internet has greatly multiplied examples of the law, but the asymmetry principle itself has long been recognized.
Origins
The adage was publicly formulated in January 2013 by Alberto Brandolini, an Italian programmer. Brandolini stated that he was inspired by reading Daniel Kahneman's Thinking, Fast and Slow right before watching an Italian political talk show with former Prime Minister Silvio Berlusconi and journalist Marco Travaglio.
Examples
The persistent claim that vaccines cause autism is a prime example of Brandolini's law. Despite extensive investigation showing no relationship, the false claims have had a disastrous effect on public health, and decades of research and attempts to educate the public have failed to eradicate the misinformation.
In an example of Brandolini's law during the COVID-19 pandemic, a journalist at Radio-Canada said, "It took this guy 15 minutes to make his video and it took me three days to fact-check."
In another example, shortly after the Boston Marathon bombing, the claim that a student who had survived the Sandy Hook Elementary School shooting had been killed by the bombing began to spread across social media. Despite many attempts to debunk the rumor, including an investigation by Snopes, the false story was shared by more than 92,000 people and was covered by major news agencies.
The yoga scholar-practitioners Mark Singleton and Borayin Larios write that several of their colleagues have "privately" expressed their "aversion to public debate" with non-scholars because of Brandolini's law.
Environmental researcher Dr. Phil Williamson of the University of East Anglia implored other scientists in 2016 to get online and refute falsehoods about their work whenever possible, despite the difficulty per Brandolini's law. He wrote, "the scientific process doesn't stop when results are published in a peer-reviewed journal. Wider communication is also involved, and that includes ensuring not only that information (including uncertainties) is understood, but also that misinformation and errors are corrected where necessary."
Psychological effects
One of the psychological effects at play may be the continued influence effect: when a reader encounters false information first, that information becomes more difficult to correct or debunk. Even after the misinformation has been debunked or corrected, the reader may recall the misinformation rather than the correction. As Ullrich Ecker puts it, "the continued influence effect seems to defy most attempts to eliminate it."[1]
A second psychological effect is the implied truth effect: when a fact-checking platform labels some posts as misinformation, readers may assume that the posts the fact-checkers did not flag must be true. Because fact-checkers cannot catch every piece of misinformation on a platform, unflagged falsehoods can gain unwarranted credibility with readers.
Another psychological effect at play may be repetition: the more often a claim is repeated to a consumer, the more likely they are to remember it. This coincides directly with Brandolini's law. False information spreads quickly through the internet and is therefore encountered, and remembered, more easily than the correction. Because correct information is slower and costlier to produce, per Brandolini's law, the false information remains more readily available and visible.
Prevention tactics
Certain debiasing techniques can be used to slow the spread of false information and thus blunt the effect of Brandolini's law. One such technique is changing incentives: altering the feedback or rewards associated with a bias. Misinformation posted online is usually rewarded with an onslaught of likes, shares, and views; if spreading false information instead drew negative feedback, such as the spreader's account being penalized in some way, people would be prompted to be more careful and precise with the information they post. A related technique is increasing accountability, which reinforces the change in incentives: if accounts posting false information are held accountable for what they post, their owners may be more careful about what they contribute to the internet.[2]
"Pre-bunking" and active "debunking"[3] are further techniques for countering the effect of Brandolini's law. Debunking corrects false information after it has spread, aiming to revise misinformation and limit both its continued spread and its negative effects on individuals' beliefs; this approach must contend with the psychological barriers surrounding misinformation, such as the repetition effect. Pre-bunking is a proactive approach that aims to prevent the spread of misinformation in the first place: by inoculating individuals against falsehoods before they encounter them, it acts as a cognitive defense that can forestall effects such as the continued influence effect by preventing the initial exposure.
Similar concepts
In 1845, Frédéric Bastiat expressed an early notion of this law:
We must confess that our adversaries have a marked advantage over us in the discussion. In very few words they can announce a half-truth; and in order to demonstrate that it is incomplete, we are obliged to have recourse to long and dry dissertations. - Frédéric Bastiat, Economic Sophisms, First Series (1845)
In 1986, Harry G. Frankfurt wrote in his essay "On Bullshit":[4]
"Bullshit is unavoidable whenever circumstances require someone to talk without knowing what he is talking about. Thus the production of bullshit is stimulated whenever a person's obligations or opportunities to speak about some topic are more extensive than his knowledge of the facts that are relevant to that topic."
- Harry G. Frankfurt, On Bullshit
Building on Frankfurt's work in "On Bullshit", John V. Petrocelli proposed two hypotheses.[5] The first is the "obligation to provide an opinion" hypothesis, which holds that people bullshit when social norms make it acceptable in a situation and when others expect them to have an opinion. The second is the "ease of passing bullshit" hypothesis, which holds that people bullshit when they can get away with it, an effect amplified when the topic is not well known or not deeply felt by the audience. Petrocelli conducted two social experiments to test these hypotheses, and both supported them.
See also
- Big Lie
- Burden of proof
- False balance
- Gish gallop
- Hitchens's razor
- List of eponymous laws
- Poe's law
- Sealioning
- ^ Shane, Tommy (2020-07-14). "The psychology of misinformation: Why it's so hard to correct". First Draft. Retrieved 2023-11-20.
- ^ Guo, Yue; Yang, Yi; Abbasi, Ahmed (2022). "Auto-Debias: Debiasing Masked Language Models with Automated Biased Prompts". Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers). Stroudsburg, PA, USA: Association for Computational Linguistics. doi:10.18653/v1/2022.acl-long.72.
- ^ Ecker, Ullrich K. H.; Lewandowsky, Stephan; Cook, John; Schmid, Philipp; Fazio, Lisa K.; Brashier, Nadia; Kendeou, Panayiota; Vraga, Emily K.; Amazeen, Michelle A. (January 2022). "The psychological drivers of misinformation belief and its resistance to correction". Nature Reviews Psychology. 1 (1): 13–29. doi:10.1038/s44159-021-00006-y. ISSN 2731-0574.
- ^ Harrison, Jeff (2005-11-01). "On Bullshit. By Harry G. Frankfurt. Princeton: Princeton University Press, 2005". Literature and Theology. 19 (4): 412–414. doi:10.1093/litthe/fri049. ISSN 1477-4623.
- ^ Petrocelli, John V. (May 2018). "Antecedents of bullshitting". Journal of Experimental Social Psychology. 76: 249–258. doi:10.1016/j.jesp.2018.03.004.