Rachael Rettner
From the idea that drinking bleach can kill the coronavirus to a theory that the virus was created in a lab as a bioweapon, the COVID-19 pandemic has generated a flurry of misinformation, hatching more than 2,000 rumors, conspiracy theories and reports of discrimination, according to a new study.
Such false information can have serious consequences — the researchers found that COVID-19-related rumors were linked to thousands of hospitalizations and hundreds of deaths. For example, a myth that consumption of highly concentrated alcohol could kill the coronavirus has been linked with more than 5,900 hospitalizations, 800 deaths and 60 cases of blindness due to methanol poisoning (which can occur when people drink home-brewed or illegally manufactured alcohol), the report said. Many of these cases were in Iran, where alcoholic beverages are illegal. In India, 12 people, including five children, got sick after drinking liquor made from the seeds of the toxic plant Datura, believing it to be a cure for COVID-19, according to the new report.
“Misinformation fueled by rumors, stigma and conspiracy theories can have potentially serious implications on the individual and community if prioritized over evidence-based guidelines,” the authors wrote in their study, published Monday (Aug. 10) in the American Journal of Tropical Medicine and Hygiene. “Health agencies must track misinformation associated with … COVID-19 in real time, and engage local communities and government stakeholders to debunk misinformation.”
For the study, an international team of social scientists, doctors and epidemiologists reviewed content on social media, including posts on Twitter and Facebook, as well as newspaper and TV reports, from December 2019 to April 2020.
They identified more than 2,300 separate reports of rumors, conspiracy theories and stigma related to COVID-19 in 25 languages from 87 countries. Of these, most (89%) were classified as rumors, or unverified claims surrounding COVID-19; about 8% were classified as conspiracy theories, or beliefs about people working in secret with malicious goals; and 3.5% were classified as stigma, or reports of people experiencing discrimination due to illness, travel history, exposure to infected people or ethnic descent. (For example, the study identified 26 episodes of violence related to stigma, such as a case in Ukraine in which people hurled stones at buses carrying people evacuated from Wuhan, China.)
Just like the COVID-19 pandemic, this "infodemic" of misinformation has come in waves: the first between Jan. 21 and Feb. 13, the second between Feb. 14 and March 7, and the third between March 8 and March 31. The third wave was the largest in terms of number of reports, with reports peaking in mid-March, the authors said.
About a quarter of the claims were related to COVID-19 illness, transmission or mortality, and an additional 19% were related to treatments and cures for the disease. For example, there were rumors that drinking bleach, eating garlic, keeping the throat moist, avoiding spicy foods, taking vitamin C and even drinking cow's urine could prevent or cure the disease. (Clorox, on its website, has a pop-up message warning consumers of the dangers of drinking or otherwise ingesting bleach.)
About 15% of the infodemic was related to causes or origins of the disease. For example, some conspiracy theories suggested that COVID-19 had been engineered as a bioweapon.
“Governments and other agencies must understand the patterns of COVID-19–related rumors, stigma and conspiracy theories circulating the globe” so that they can better communicate COVID-19 information and debunk false information, the authors said.
The authors recommend that governments and health agencies continue to publish accurate scientific information about COVID-19 on their websites. In addition, agencies should not only identify and debunk COVID-19 rumors, but also engage with social media companies to “spread correct information,” they concluded.
- Originally published on Live Science.
– HERALD