Artificial Intelligence & Robotics

If law schools prohibit ChatGPT in writing, can they back it up?



Many lawyers interviewed by the ABA Journal found the University of Michigan Law School’s new policy on using artificial intelligence to write admissions essays surprising. Image from Shutterstock.

Using artificial intelligence to write admissions essays now comes with significant risks at the University of Michigan Law School, which recently asked applicants to certify that they did not use the technology for drafting purposes.

False statements could result in the cancellation of an admissions offer, expulsion or the rescinding of a degree, according to the certification language.

Likewise, if admissions office readers go over a candidate’s essay and suspect that technology did the writing, it would give them serious pause, even for a strong candidate, says Sarah C. Zearfoss, the law school’s senior assistant dean. The certification language was introduced this year, and she has not heard of other law schools with similar prohibitions on AI technology.

Many lawyers interviewed by the ABA Journal found the University of Michigan Law School’s new policy surprising. The penalties would be difficult to enforce because there are no good tools to detect AI-generated writing, they say. And even if there were, offerings such as ChatGPT will continue to evolve and likely outfox anything created to catch them.

With in-class writing tests and bar exams, testing software and human proctors monitoring the physical environment prevent examinees from using tools such as ChatGPT, says Greg Sarab, CEO of the exam software company Extegrity. Without those two safeguards, there is probably no way to know whether someone used the technology, according to Sarab.

Sarah C. Zearfoss is the senior assistant dean at the University of Michigan Law School.

“AI is basically a collaborator that is discreet. It won’t narc on you, and it’s quick. It would take very little editing to foil a detector,” Sarab adds.

Zearfoss says they are giving it their best shot, not with computers but with people power. She directs JD and LLM admissions at the law school. And with a team of two seasonal employees, both retired from careers in higher education, they plan to compare applicants’ essays with their Law School Admission Test writing samples to determine whether the writing styles are consistent.

They read approximately 6,000 applications per year. Zearfoss admits that they can’t be completely sure whether someone used the technology to write a personal statement.

“We will see essays and think, ‘Yep, this could be ChatGPT. Or it could just be a bland essay,’” she adds.

The Michigan prohibition includes first drafts. Zearfoss does not see a similarity between hiring an admissions consultant and using AI technology. A consultant gives feedback, she adds, and that’s not the same as writing the first draft. Also, the law school application asks that candidates certify that they have not had “a human do more than basic proofreading or general feedback” on submissions, she explains.

“One of the reasons we invest so much in our application reading process is because we care so much about writing abilities. We send people into the kind of jobs where you need to be an A-plus writer,” Zearfoss says.

If someone submits a personal statement generated by ChatGPT with no editing, a perceptive reader might notice, says David Kemp, an adjunct professor at Rutgers Law School. For instance, the final paragraph will often start with: “In conclusion.”

Citing a case that doesn’t exist would be another tip-off, Kemp explains, but it would be unusual to cite a case in law school application materials.

The Law School Admission Council, which designs the LSAT, tried to create technology that compares writing samples and determines whether they were written by the same person. It doesn’t work well for comparing law school application essays with LSAT writing samples because the conditions are different, says Troy Lowry, the LSAC’s senior vice president of technology products, chief information officer and chief information security officer.

“Think about a good personal essay statement. It takes a couple of weeks to write, and you’re thinking about what would be compelling. With the LSAT, you have 35 minutes,” Lowry says.
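As an illustration of the kind of comparison Lowry describes, here is a minimal, hypothetical sketch in Python. It is not the LSAC’s actual tool or method; it simply scores how similar two writing samples are by comparing their character trigram frequency profiles with cosine similarity, and the file names are placeholders.

```python
# Hypothetical sketch only, not the LSAC's tool: scores stylistic similarity
# between two writing samples using character trigram frequencies.
from collections import Counter
from math import sqrt


def char_ngrams(text: str, n: int = 3) -> Counter:
    """Count overlapping character n-grams in lowercased, whitespace-normalized text."""
    text = " ".join(text.lower().split())
    return Counter(text[i:i + n] for i in range(len(text) - n + 1))


def cosine_similarity(a: Counter, b: Counter) -> float:
    """Cosine similarity between two n-gram count vectors (0.0 to 1.0)."""
    dot = sum(a[g] * b[g] for g in a.keys() & b.keys())
    norm = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0


if __name__ == "__main__":
    # Placeholder file names for illustration.
    essay = open("personal_statement.txt", encoding="utf-8").read()
    lsat_sample = open("lsat_writing_sample.txt", encoding="utf-8").read()
    score = cosine_similarity(char_ngrams(essay), char_ngrams(lsat_sample))
    print(f"Style similarity: {score:.2f}")  # a weak signal at best, not proof of authorship
```

Even a high score from something like this measures only surface-level stylistic consistency, which helps explain why, as Lowry notes, an essay polished over weeks is hard to compare fairly with a timed 35-minute sample.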

He also predicts that using technology to write will become more accepted in legal education and compares the situation to calculators. For earlier generations of students, teachers often banned them, thinking that they would hinder learning math. Today, many secondary school supply lists include a graphing calculator, which can also be used on the SAT and the ACT, as well as high school Advanced Placement program tests.

At Michigan, the law school’s honor code does not prohibit students from using AI technology, and professors decide that individually, Zearfoss says.

Regarding ABA accreditation, honor codes are addressed in Standard 308(a). Among its requirements, law schools must have “sound” academic standards for academic integrity. Additionally, in August, ABA President Mary Smith announced the creation of the ABA Task Force on Law and Artificial Intelligence. It will study various issues, including legal education.

Robert Brain, a professor at Loyola Marymount University’s Loyola Law School, told the Journal that most, if not all, law schools currently have faculty committees considering the issue. Brain, who also serves on the board of the Association of Legal Writing Directors, spoke about the topic when the council of the ABA Section of Legal Education and Admissions to the Bar met in August.

“I don’t think anybody is seriously suggesting going back to handwritten blue book essays, but they could cut off take-home tests,” says Brain, who thinks that law schools should teach students to use AI technology.

“My personal view is we can’t stop them but also because lawyers are using it,” he says.

And it’s possible that law students use the technology differently than law school professors and administrators think, says Malak Tehaili, the national chair of the ABA Law Student Division.

“I don’t think they are using it to write research papers. To cut and paste things into a Word document, that would be very bold,” says Tehaili, a third-year student at the University of Detroit Mercy School of Law.

She adds that some law professors may not understand the usefulness of AI technology outside writing.

“You can use it to outline case briefs. It can be helpful if you are in a time crunch or are cold-called in class, and you just want to get a sense of the case,” Tehaili says.
