Just two days before Slovakia’s elections, an audio recording was posted to Facebook. On it were two voices: allegedly, Michal Šimečka, who leads the liberal Progressive Slovakia party, and Monika Tódová from the daily newspaper Denník N. They appeared to be discussing how to rig the election, partly by buying votes from the country’s marginalized Roma minority.
Šimečka and Denník N immediately denounced the audio as fake. The fact-checking department of news agency AFP said the audio showed signs of being manipulated using AI. But the recording was posted during a 48-hour moratorium ahead of the polls opening, during which media outlets and politicians are supposed to stay silent. That meant, under Slovakia's election rules, the post was difficult to widely debunk. And because the post was audio, it exploited a loophole in Meta's manipulated-media policy, which holds that only faked videos, in which a person has been edited to say words they never said, violate its rules.
The election was a tight race between two frontrunners with opposing visions for Slovakia. On Sunday it was announced that the pro-NATO party, Progressive Slovakia, had lost to SMER, which campaigned to withdraw military support for its neighbor, Ukraine.
Before the vote, the EU’s digital chief, Věra Jourová, said Slovakia’s election would be a test case of how vulnerable European elections are to the “multimillion-euro weapon of mass manipulation” used by Moscow to meddle in elections. Now, in its aftermath, countries around the world will be poring over what happened in Slovakia for clues about the challenges they too could face. Nearby Poland, which a recent EU study suggested was particularly at risk of being targeted by disinformation, goes to the polls in two weeks’ time. Next year, the UK, India, the EU, and the US are set to hold elections. The fact-checkers trying to hold the line against disinformation on social media in Slovakia say their experience shows AI is already advanced enough to disrupt elections, while they lack the tools to fight back.
“We’re not as ready for it as we should be,” says Veronika Hincová Frankovská, project manager at the fact-checking organization Demagog.
During the elections, Hincová Frankovská’s team worked long hours, dividing their time between fact-checking claims made during TV debates and monitoring social media platforms. Demagog is a fact-checking partner for Meta, which means it works with the social media company to write fact-check labels for suspected disinformation spreading on platforms like Facebook.