Meta Allows Deepfake of Irish Presidential Candidate To Spread for 12 Hours Before Removal
Meta removed a deepfake video from Facebook that falsely depicted Catherine Connolly withdrawing from Ireland's presidential election. The video was posted by an account called RTE News AI and viewed almost 30,000 times over 12 hours before the Irish Independent contacted the platform. The fabricated bulletin featured AI-generated versions of RTÉ newsreader Sharon Ní Bheoláin and political correspondent Paul Cunningham announcing that Connolly had ended her campaign and that the election scheduled for Friday would be cancelled.
Connolly responded in a statement that she remained a candidate and called the video a disgraceful attempt to mislead voters. Meta confirmed that the account violated its community standards against impersonating people and organizations. Ireland's media regulator, Coimisiún na Meán, contacted Meta about the incident and reminded the platform of its obligations under the EU Digital Services Act. An Irish Times poll published last Thursday found Connolly leading the race with 38% support.