AI & Your Business: Libel Risks
Nov 14, 2024

Summary
As lawyers who often defend defamation suits, we know from experience that it’s not just media defendants who are sued for libel. Defamation suits are routinely filed against all sorts of businesses, arising from all kinds of content and communications. As the pressure to compete pushes more businesses to incorporate generative AI into their content-creation processes, it is important to be mindful of the different ways a libel lawsuit might arise. This applies to text, images, video, audio, and all other types of content and information.
We expect to see a flurry of cases stemming from AI-generated content in the coming years, falling into one of four general categories:
- Libel by juxtaposition: Truthful information about two different individuals or entities is juxtaposed in the generative AI output, making it appear that the output is about a single person or entity.
- Libel by hallucination: The AI output is simply not true.
- Libel by omission: The AI output is true, but a missing fact changes its meaning.
- Libel by misquote: The AI output gets a quote wrong (even by a word or two) or attributes it to the wrong person, and the result can be a libel lawsuit.
This insight details two of the first U.S. lawsuits arising from AI-generated libel, which illustrate the first and second of these scenarios: libel by juxtaposition and libel by hallucination.
BATTLE V. MICROSOFT CORP., NO. 1:23-CV-01822 (D. MD., FILED JULY 7, 2023)
This case alleges AI-generated libel by juxtaposition. According to the complaint, plaintiff Jeffery Battle, also known as “The Aerospace Professor,” is the president and CEO of Battle Enterprises and its subsidiary, The Aerospace Professor Company. He is an honorably discharged U.S. Air Force veteran and has been appointed as an adjunct professor at Embry-Riddle Aeronautical University. Battle holds a master of business administration in aviation and two bachelor of science degrees.
Battle’s complaint alleges that the Bing search engine generated false statements “conflating Mr. Battle with a person of a similar name, Jeffrey Leon Battle, who is a convicted terrorist, to the professional and personal detriment of Mr. Battle and his family.” The complaint further alleges:
Specifically, when someone uses Bing to search plaintiff’s name, “Jeffery Battle,” the search engine generates a small blurb at the top of the search results which states:
“Jeffery Battle, also known as The Aerospace Professor, However, Battle was sentenced to eighteen years in prison after pleading guilty to seditious conspiracy and levying war against the United States. He had two years added to his sentence for refusing to testify before a grand jury.”
The complaint goes on to allege that “this blurb falsely and detrimentally states that Mr. Battle was sentenced to prison for seditious conspiracy and levying war against the United States and links to a Wikipedia page for Jeffrey Leon Battle, not Jeffery Battle. These false statements are reporting, ‘This summary was generated using AI based on multiple online sources. To view the original source information, use the “Learn more” links.’”
STATUS
How did this alleged libel by juxtaposition case turn out? We may never know. On October 23, 2024, the United States District Court for the District of Maryland granted Microsoft’s motion to compel arbitration and stayed the matter pending the resolution of the parties’ arbitration.
WALTERS V. OPEN AI, LLC, NO. 23-A-04860-2 (SUPERIOR COURT OF GWINNETT COUNTY, STATE OF GA, FILED JUNE 5, 2023)
In this alleged generative AI libel-by-hallucination case, plaintiff Mark Walters is a Georgia-based radio personality. According to his complaint, journalist Fred Riehl, a subscriber of ChatGPT, provided ChatGPT with a link to a complaint that appeared on the Second Amendment Foundation’s website, and asked ChatGPT for a summary of the complaint. As the Walters complaint further alleges, ChatGPT responded that the complaint:
“[I]s a legal complaint filed by Alan Gottlieb, the founder and executive vice president of the Second Amendment Foundation (SAF), against Mark Walters, who is accused of defrauding and embezzling funds from the SAF. The complaint alleges that Walters, who served as the organization's treasurer and chief financial officer, misappropriated funds for personal expenses without authorization or reimbursement, manipulated financial records and bank statements to conceal his activities, and failed to provide accurate and timely financial reports and disclosures to the SAF's leadership.”
According to the Walters complaint, every statement of fact in ChatGPT’s summary was false: Walters was not a party to the lawsuit, the complaint did not allege that he served as SAF’s treasurer or chief financial officer, and he was not accused of any of the conduct described in the summary. The Walters complaint also alleges that Riehl asked ChatGPT to provide further information about the complaint and that this information was also fabricated.
The action was initially removed to federal court and then remanded to state court. OpenAI, the company that developed and provides ChatGPT, filed a motion to dismiss, which was ultimately denied. The case is now in the discovery stage.
For more information, please contact any of the authors listed or another member of our Media & First Amendment Team.
Related Practice Areas
- Media & First Amendment