“I’m willing to bet you know at least one girl that’s using steroids every single day,” starts a young man in a TikTok video.
He stares into the camera, continuing his big, yet notably false, reveal: “One in three girls these days is taking the birth control pill, and believe it or not, the birth control pill is actually an analog of the bodybuilding steroid nandrolone.”
Another face swiftly crowds the screen. Dressed in a white lab coat, debunker Mustafa Dhahir, a practicing pharmacist and medical student based in Australia, interrupts the video with his own commentary: “One of the most annoying things when it comes to busting misinformation is that the people who spread the misinformation use hints of truth to spread their lies.”
Dhahir explains what a steroid is and then goes point by point to illustrate why the original video – which claims oral contraception causes a medley of symptoms, including changes in sexual attraction – is inaccurate. “This guy is simply using scare tactics,” Dhahir tells the viewer, noting that there are many birth control options with varying sets of side effects.
Dhahir is part of a growing cohort of scientists, physicians, health care professionals and academics who debunk health misinformation on TikTok by “stitching” videos, which involves clipping existing videos into new ones and then offering one’s own input.
While social media platforms including TikTok have developed systems to flag vaccine misinformation, an ocean of other dubious health claims often goes unscrutinised – except when individual users like him, who have actual medical knowledge, push back.
“Misinformation impacts medical decisions and health,” said Dhahir, who began responding to false claims on TikTok at the start of the pandemic and has since amassed 9.5 million likes on his videos. He has debunked claims that contraception makes women infertile, that only “natural” medicine can be trusted and that Tylenol is linked to autism.
The work is often draining. Unqualified influencers posting misinformation far outnumber the experts debunking it, who are often harassed by other users for their efforts. “For every large creator who is genuinely evidence-based, you’ve got 50 or 60 big creators who spread misinformation,” said Dr Idrees Mughal, a Britain-based physician with an additional masters in nutritional research, whose account, @dr_idz, has 1 million followers.
He debunks fad diets, unsupported claims that food ingredients are “cancer-causing” and the myth that certain vegetables contain harmful “toxic” chemicals. Misinformation is so ubiquitous that Mughal said he is tagged in 100 to 200 videos a day from users requesting that he debunk claims. “People are looking for genuine science-based, evidence-based creators,” he said.
FERTILE GROUND FOR PSEUDOSCIENCE
Misinformation is widespread on all of the major social media platforms, but TikTok’s audio capabilities can give false claims particular longevity. Bits of misinformation clipped and saved as what TikTok calls sounds “operate like viral chain messages,” according to a 2021 blog post from the Institute for Strategic Dialogue, a London-based center that researches disinformation and extremism online. Even if a video is taken down, the original audio often survives in the work of users who have already borrowed it for their own content.
TikTok has enacted policies to flag such content, including adding informational banners to COVID-19 vaccine content, but one ISD study of more than 6,000 videos related to the vaccines found that 58 per cent lacked banners.
When it comes to combating health misinformation in general, all social media platforms face a daunting task, given the sheer volume of inaccurate posts. In a statement, TikTok wrote, “We work diligently to take action on content and accounts that spread misinformation, while also promoting authoritative content about vaccines through our COVID-19 information hub.”
When asked if TikTok was addressing general health misinformation, the company replied that it both removes violations of the platform’s policies and works “with credible voices to elevate authoritative content on topics related to public health.”
Abbie Richards, a misinformation and disinformation researcher and research fellow with the Accelerationism Research Consortium, an organisation aimed at understanding and addressing the threat of extremism, said TikTok’s video format was also advantageous in spreading conspiracies.
Creators speak directly into the camera as if they are on a video call with the viewer. “It feels more authentic than disembodied text,” Richards said, which can make it seem more credible. YouTube, which is still a far larger video destination than TikTok and also has audio capabilities, does not necessarily create the same sense of intimacy.
Timothy Caulfield, the Canada research chair in health law and policy at the University of Alberta, studies misinformation on social media and said that research had shown that a powerful personal anecdote, testimonial or narrative could overwhelm people’s ability to think scientifically. Coupled with creative visuals, “those elements have made it a very consumable and powerful way to spread misinformation,” he said.
Pseudoscience creators have built on the inherent velocity of social media platforms with tactics of their own. Many use shock and fear to make bogus health claims seem more urgent and credible. One video that Mughal debunked claimed that many people unknowingly have parasites living in their gut that cause chronic illnesses and that the solution is buying “parasitic cleansing kits.”
Then there is “science-washing,” a term debunkers use to describe how pseudoscience creators deploy scientific-sounding language to weigh in on health issues, cherry-pick studies to support false claims or cite studies that seem relevant but are not.
In TikTok’s beauty circles, Michelle Wong, a cosmetic chemist who runs Lab Muffin Beauty Science, a blog and social media accounts that explain the science behind skin care and cosmetic products, has made a new career out of fighting misinformation.
She often encounters creators who take ingredients out of context, sometimes conflating topical use with ingestion – an important distinction, as consumers do not drink their moisturisers.
Wong also sees pseudoscience creators who back up false, fearmongering claims about sunscreen with white papers the creators either do not have full access to or do not understand. “That in itself is quite convincing, because very few people are actually going to look up every single paper listed,” she said.
The lack of science literacy online was partly what inspired Katrine Wallace, a public health researcher and a professor at the University of Illinois at Chicago, to start debunking inaccurate content on TikTok.
At the start of the pandemic, she noticed that users were debating whether COVID was even real, and she has since debunked videos claiming, for example, that COVID vaccines cause death within six months and that microscopic worms or parasites are found in surgical face masks. That topic recirculates every six months, she said in an interview, adding, "People are obsessed."
Viewers are drawn to surprising or extreme statements, and highly engaged content is promoted by the app’s algorithm, which is why debunkers often find themselves responding to provocative assertions. For the same reasons, myth-busting response videos like Dhahir’s on oral contraception often get more views than simple explainers. “People like drama,” Wallace said.
In refuting claims, debunkers try to engage respectfully with other creators. Mughal said he refrained from insulting or attacking creators who disseminated misinformation and instead focused on addressing the health claims. Wallace takes a different approach.
She said she would first reach out privately to the original poster to explain why the video was problematic and urge them to take it down or publicly address the misinformation. "And if they block me or delete my comments," she said, "then I'm like, 'OK, it's on.'"
ON THE BRINK OF BURNOUT?
The business of debunking is time-consuming. Scripting, filming and editing, not to mention managing comments – which sometimes also breed misinformation when users share counterarguments – can take hours each day. To attract an audience, each video must accurately convey the science but must also be entertaining and approach the topic with nuance and sensitivity, all while grabbing the viewer’s attention within 15 seconds.
When Wong was a full-time science educator, she found herself working an extra 30 hours a week creating content for social media and her blog. “It was just destroying my personal life,” she recalled, adding that her relationship with her partner had ended partly because she was spending so much time on content creation.
It does not help that debunking often will not pay the bills, as many myth busters refrain from accepting sponsorships to prevent a conflict of interest. Wong accepts sponsorships, but like others who work with brands, she is selective, avoiding clients with deceptive marketing practices or cure-all claims.
“It is possible to work with brands and still remain fact-based and science-based,” she said, but she acknowledged that “part of it is necessity – because debunking was taking up so many of my hours.”
Wong quit her job in 2019 to devote herself full-time to Lab Muffin Beauty Science, but she still sometimes works up to 70 hours per week. “Science just takes so much longer than misinformation, because you have to do the research properly,” she said.
Once a debunker has an audience, the work of maintaining and building an account can also lead to burnout. Like most influencers, they put pressure on themselves to excel. As Dr Austin Chiang, a gastroenterologist with more than 500,000 TikTok followers, explained, they often blame themselves if their content underperforms. “We think, is it because my messaging isn’t good?” he said. “Is it because the quality of the video isn’t good?”
Wallace said the most exhausting element, though, was the harassment. Commenters repeatedly insult her, and when she posts in favour of vaccination, they accuse her of being a "shill for Big Pharma." "I block accounts every day," Wallace said. She also received threatening and sexually violent messages through her university email account – a situation that she said had required the university police to become involved early this year.
For health care professionals, harassment can also lead to professional consequences or the fear of them. “Many people’s institutions don’t want them to be attracting tons of negative attention,” said Renée DiResta, a misinformation expert and the technical research manager at the Stanford Internet Observatory, which studies internet propaganda.
Doctors are encouraged to treat patients. Scientists are encouraged to conduct research and submit their findings for peer review. To create content on TikTok? Less so.
Dhahir considered quitting TikTok after users found the address of his pharmacy and spread rumours about his professional and personal lives. He also had to meet with the dean of medicine at the University of Sydney and explain why the university had received complaints.
Dhahir said he felt supported by his university but worried that that could change quickly. “One wrong move, and then my work can fire me or the university can kick me out,” he said. “I have to make sure I don’t screw up.”
Mughal said he had heard from fellow doctors who balked at making educational content on social media lest complainers “get them into trouble.” “There’s not a lot of protection for health care professionals that make content for the public,” he said. Creators said they feared losing their licenses or memberships in professional associations. Doctors who work in private practice worry that critics will flood Yelp with negative reviews.
DiResta stressed the importance of health experts engaging with the public on social media, but until there is stronger support from institutions, she hesitates to recommend that they do so. “They have to know what’s going to happen to them when they do it. That’s the problem,” she said.
MANY SOLUTIONS, THOUGH NOT ALWAYS AGREEMENT
With debunker burnout on the horizon and a never-ending flow of pseudoscience, researchers said they needed more data to determine which strategies worked best for countering misinformation. Some say that TikTok should promote evidence-based health content on the “For You” page, the platform’s personalised feed of recommended content, so it does not get lost among similar-sounding videos pushing misinformation.
Not everyone is eager to see a platform artificially elevating health experts, though. Dr Karan Raj, a surgeon for Britain’s National Health Service who has garnered 4.8 million followers on TikTok by discrediting claims like the warning that holding in gas causes appendicitis or that desiccant tablets found in pregnancy tests are a form of Plan B, argued that top-down promotion could erode audience trust.
“People should want to watch content because they enjoy the content,” Raj said, insisting that videos needed to gain popularity of their own accord. “If I get on the ‘For You’ page, it’s because people like my content.”
Richards pointed out that ensuring that a video from, say, the World Health Organisation gets millions of views does not ensure that the information will affect users, noting that impact is gauged through a mixed methodology that analyses several factors, including engagement, watch time and shares.
Some researchers recommend that TikTok work with health organisations to identify experts who can explain complex health issues in layperson’s terms, as the platform did with COVID-19 efforts. Ciaran O’Connor, an analyst who studies disinformation at the Institute for Strategic Dialogue, suggested that TikTok offer experts verified badges as a stamp of authority.
But misinformation experts and creators alike argued that social media platforms are not solely responsible for amplifying accurate information. They want institutions and health departments to further invest in influencers. They also want stronger initiatives to support creators, like financial compensation, mental health resources or help handling harassment.
“For all of the experts who are on there trying to put out high-quality information, if we want that to be sustainable for them, we need to be building infrastructure that doesn’t rely on the platforms,” Richards said. Right now, she said, “it’s viewed as a hobby, almost like charity work.”
Despite the hurdles, debunkers do see their efforts paying off. Followers have told Wallace that they got vaccinated after watching her videos. Chiang heard from viewers who got screened for medical conditions they might have otherwise ignored. And Dhahir’s fans sometimes reach out to say thank you.
“They’ll say, ‘I appreciate everything,’ or, ‘You’ve inspired me,’” Dhahir said. “Then I’ll be like, ‘You know what? This is actually worth it.’”
By Rina Raphael © 2022 The New York Times
This article originally appeared in The New York Times.