A lot of false information about hurricanes in the US has spread online, mostly because social media values likes and shares over the truth.
The speed at which false rumors about Hurricane Helene and Hurricane Milton spread is unlike anything I've seen before.
Viral posts included simple questions about weather forecasts and rescue efforts, along with false claims—like the one repeated by Donald Trump—that hurricane relief money is being used for migrants who came to the US illegally.
Some shared fake images of destruction, including AI-generated pictures of children fleeing disaster, old footage from other storms, and computer-generated clips. Others pushed unfounded conspiracy theories about the government controlling the weather.
“Yes, they can control the weather,” wrote Congresswoman Marjorie Taylor Greene last week on X.
Most viral misinformation comes from social media accounts with blue ticks that often share conspiracy theories. Some accounts that spread false information about Hurricane Milton had previously posted about real events being faked, like elections and the pandemic.
I reached out to many of these accounts on X that shared misleading posts about hurricanes. They seem to go viral because of changes made since Elon Musk bought X. Blue ticks, which used to be for verified users, can now be bought, and this helps their posts get more attention. They can also make money from these posts, whether they are true or not.
X’s policy lets blue-tick users earn money from ads in their replies. Recently, they announced higher payouts based on user engagement, encouraging some to share whatever might go viral, even if it’s false. Many I contacted admitted they benefit from sharing attention-grabbing content.
While other social media platforms like YouTube and Instagram have rules against misinformation, X does not. Although it has some measures against fake AI content, it removed the feature that let users report misleading information.
Misinformation can easily spread from X into the comments on videos hosted on other platforms, showing how these narratives travel across social media.
“Wild Mother,” an influencer who shares unproven theories, noted that four years ago, her comments were mostly negative, but now many agree with her views, especially on recent conspiracy theories about hurricanes.
This misinformation can hurt trust in authorities, especially during important recovery efforts after Hurricane Milton. While misinformation has always existed during disasters, it now reaches more people—over 160 million views for just a few false posts. The false claims also have a stronger political angle due to the upcoming US presidential election.
Many viral posts come from accounts that support Donald Trump, according to the Institute for Strategic Dialogue (ISD). These posts often attack foreign aid and migrants.
Some videos even accuse relief workers of “treason” for being involved in made-up plots. This anger and distrust can hurt real rescue efforts. With an election coming up, it also risks damaging people’s trust in the government and overshadowing valid criticisms of its actions.
While Wild Mother and others see this as more people waking up to the truth, I see it as these conspiracy theories reaching more people.
She believes “a well-informed group is harder to control.” In other words, the more people who believe these baseless theories, the tougher it is to fight against them.
This issue is linked to how social media algorithms prioritize engagement. False claims and conspiracy theories can spread to many people before anyone notices they're untrue, and those who share them can gain views, likes, followers, or even money.