
Law school professor calls for caution and care when using AI

Christina Frohock's upcoming paper outlines the dangers to the integrity of the legal system from fake legal citations.
School of Law professor Christina Frohock cautions against the pitfalls of using AI in legal writing. Photo: Catharine Skipp/University of Miami

Professor Christina Frohock's latest scholarship, "Ghosts at the Gate: A Call for Vigilance Against AI-Generated Case Hallucination," argues that as artificial intelligence-generated writing becomes widespread in modern life, hallucinations, the generation of false or misleading information, are appearing with alarming frequency in legal writing.

Frohock, whose paper will be published in the winter issue of the Penn State Law Review, calls for greater vigilance as judges impose penalties on attorneys for citing judicial opinions that do not exist. She writes that apparitions in the law date back a century or more, citing the case of a phantom town that appeared on a map from a respected map company.

The professor, who teaches J.D. and LL.M. students legal communications and research skills at the University of Miami School of Law, warns that writers should be wary, as hallucinated case citations can take on a life of their own, gaining traction and respect through repetition and reliance. She concludes that hallucinated cases lurk as ghosts at the gate, and attorneys must serve as gatekeepers of the law. 

What motivated you to write the paper? 

I started writing out of curiosity: How does artificial intelligence fit in the law? I want to understand—and to teach my students—both the opportunities and the risks of generative AI, in particular. One big risk is hallucinations. Fictitious case citations are appearing in search query results. As I researched case law on AI-generated case citations, I realized that these hallucinations are modern versions of older apparitions. A century ago, phantom settlements appeared on paper maps as copyright traps. So, I wrote this article as a cautionary tale. What happened to those copyright ghosts may happen to citation ghosts: they take on a life of their own. 

What surprised you? 

I was surprised by the link between past and present, as apparitions in the law connect obsolete paper maps with cutting-edge AI technology. Phantom settlements appeared long ago on maps as copyright traps, and phantom cases appear today on computer screens as AI hallucinations. Copyright traps are small errors intentionally added to a map, or perhaps a dictionary or encyclopedia or other reference work. The trap is designed to catch intellectual property thieves. When the same error appears on a rival cartographer’s map, that error proves the rival copied. Then the original cartographer can sue for copyright infringement. Today, AI hallucinations set new traps. Fake case opinions are ensnaring attorneys, who unwittingly cite those opinions in motions and briefs. Federal Rule of Civil Procedure 11 requires that court filings have legal support. A contention based on fictitious opinions violates Rule 11, and courts are imposing sanctions in the thousands of dollars. I don’t take a position on whether AI programs like ChatGPT intend to ensnare attorneys. But the traps are out there, and we have a duty to confirm the existence of cited cases. 

How does this exploration relate to other research you have done? 

The deep theme connecting all my scholarship is the power of language. This article places that theme front and center. Copyright traps took a surprising turn with the town of Agloe, New York, which happens to be near my hometown of Syracuse. In the 1930s, General Drafting Company invented Agloe as a phantom settlement on its map of upstate New York. When Rand McNally published its own map and included Agloe, General Drafting threatened to sue for copyright infringement. Rand McNally then pointed to the actual town of Agloe, right where General Drafting had placed the dot. The town came into existence because it had appeared on a respected map. The apparition became real, off the paper and onto the landscape. The same fate may await AI hallucinations. Phantom cases may become real, off the screen and into a judicial opinion. One court described that possibility as scary, and I agree. My article is a call for vigilance, especially given the distrust of authority and expertise in today’s world. We need to protect the law.

