Affidavit filed to support Minnesota deepfake law undermined by AI

By Graeme Hanna

A filing submitted in support of Minnesota's legislation on the misuse of artificial intelligence to interfere in the democratic process has been called into question over apparent signs of AI-generated content.

A federal lawsuit has been filed over the North Star State's "Use of Deep Fake Technology to Influence An Election" law, and an affidavit lodged to defend the statute appears to contain AI-generated text.

In the run-up to the recent presidential election, around half of U.S. states moved to regulate AI and deepfakes in elections.

The Minnesota Reformer detailed how Attorney General Keith Ellison tasked Stanford Social Media Lab founder Jeff Hancock with making the submission, but all is not as it seems.

The filed document shows indications of text generated by AI programs such as ChatGPT or another large language model (LLM), in particular the inclusion of non-existent sources.

Hancock's affidavit cited a 2023 study supposedly published in the Journal of Information Technology & Politics.

However, the study, "The Influence of Deepfake Videos on Political Attitudes and Behavior," does not exist in that journal, or in any other publication, according to the Reformer.

Calling the entire document into question

The same situation applies to another source listed in Hancock's submission, titled "Deepfakes and the Illusion of Authenticity: Cognitive Processes Behind Misinformation Acceptance".

Unsurprisingly, this has drawn intense scrutiny from opposing lawyers.

Representatives for Minnesota state Rep. Mary Franson and the conservative YouTube persona Mr. Reagan, aka Christopher Kohls, argued that the citation issues bear the hallmarks of an AI "hallucination".

Their filing stated: "Plaintiffs do not know how this hallucination wound up in Hancock's declaration, but it calls the entire document into question, especially when much of the commentary contains no methodology or analytic logic whatsoever."
