Legal Education

Class Is in Session: How some law schools are training students in generative AI

AI in the classroom illustration (Image from Shutterstock)

After initially taking a more skeptical approach following the introduction of ChatGPT in 2022, law schools are warming up to using artificial intelligence in curriculum and clinics.

As more AI software is made available to law schools and students, the moment of law schools watching and waiting is over. More options are emerging for AI coursework that not only examines the ethical uses of AI but also offers hands-on experiences that range from practicing contract negotiations and court appearances via AI-generated simulators to developing systems to handle divorce and end-of-life documents in legal clinics.

“Now is when the rubber meets the road,” says Daniel W. Linna, the law and technology initiatives director at Northwestern University Pritzker School of Law. “We’ve got to take some of these experiments and turn them into substantive changes in our courses; our curricula; and, in the case of clinics and legal services delivery, real tools with measurable results.”

The stakes for aspiring lawyers are high. As AI handles much of the work young associates do, the pressure is on to train future lawyers to quickly level up, says Megan Ma, executive director of Stanford Law School’s Legal Innovation through Frontier Technology Lab.

“How do we get them to the level of a senior lawyer and learn the skills of issue-spotting?” she asks.

At many law schools, the answer is offering more AI-centric opportunities. Last year, 55% of schools responding to the ABA Task Force on Law and Artificial Intelligence’s AI and Legal Education Survey reported having classes dedicated to AI; 32% offered formal opportunities to use AI via interdisciplinary arrangements with their university; and 83% reported opportunities, like clinics, where students could learn to use AI tools. The survey was sent to 200 law school deans via email, and 29 deans or faculty members responded.

Par for the course

In February, Case Western Reserve University School of Law became the first law school to make AI training mandatory, requiring all 1Ls to take an AI certification course.

The weekend program, called “Introduction to AI and the Law,” was developed with Wickard.ai, a company specializing in AI legal education, software and training. Students learn AI fundamentals with an eye toward large language models and ethical and regulatory guidelines, such as the ABA’s Formal Opinion 512.

The class offers hands-on training with programs such as Spellbook, CoCounsel and Gemini, says Holtzman Vogel AI Practice Group co-head Oliver Roberts, who teaches the course.

Roberts says he has led similar intensive programs at Washington University in St. Louis School of Law and the George Washington University Law School.

Those schools are not alone in bringing more AI-related courses into the law curriculum.

Suffolk Law School, for instance, added three AI-oriented classes this academic year—Generative AI and the Delivery of Legal Services, Artificial Intelligence and the Law, and Emerging AI Regulatory Frameworks, says Dyane O’Leary, director of the school’s Legal Innovation & Technology Center.

And at the University of Miami School of Law, the Miami Law & AI Lab developed a tool called ClassInsight to help professors assess student understanding of materials within minutes of finishing a lecture, says Or Cohen-Sasson, lab director.

During class, students are emailed a link requesting an answer—which won’t be graded—to a question related to the materials just presented. Students receive personalized feedback on their submissions, while the class’s answers are collated into a pie chart showing the percentage of correct versus incorrect responses, which is sent to the professor, Cohen-Sasson says.

ClassInsight is now being used in three courses, and professors are finding an unexpected benefit, he adds. “Students that didn’t use to participate before are now participating more in class. It makes sense,” he says, because once they receive positive feedback, students feel comfortable speaking up in class. “Some students are maybe shy or not secure enough to participate before they know their level of understanding.”

Hand writing the term AI on a chalkboard (Image from Shutterstock)

One reason AI has landed in the classroom is that generative AI tools “are finally starting to get deployed in the academic market,” says Mark Williams, a professor at Vanderbilt Law School and a founding co-director of the Vanderbilt AI Law Lab.

However, once AI classes are up and running, professors need to stay on top of their game and constantly update classes on AI as tools evolve, adds Williams, who teaches the popular AI in Law Practice class he co-created that is now in its fourth iteration.

“I tell students right up front that half of the substantive material that we cover in this class is probably going to be outdated by the time that you graduate,” he adds. But the frameworks he teaches for critically evaluating AI tools will remain constant, he says.

He sees the class as an opportunity for students to differentiate themselves with future employers.

“They will have an understanding of not only how to engage with AI as it currently exists but how to navigate it long-term as it evolves and continues to have an increasing impact on knowledge work of all kinds and society at large,” he says.

Living in a simulation

Just as student pilots train on flight simulators and medical students run surgical simulations, law students and new associates can practice legal proceedings and negotiations via AI-based tools.

The Stanford Center for Legal Informatics, or CodeX, developed the M&A Negotiation Simulator. The tool helps users refine skills needed for merger and acquisition negotiations in a virtual setting by leaning into generative AI’s ability to mimic characters and role-play, Ma says.

The goal is to “preview for young lawyers the experiences of senior partners, walk in their shoes and understand the nuance of those practices in a low-risk environment,” she says.

Users are presented with various negotiation scenarios with specific objectives and constraints, then taught how to refine communications and use strategic thinking and problem-solving skills against different types of personalities.

The program was built by Stanford’s computer science students, who interviewed senior partners in mergers and acquisitions.

“We built AI agents that reflect the personalities and the thinking of senior lawyers,” she adds. “We effectively downloaded their brains.”

While the developers took measures to reduce the number of hallucinations, “it still misinterprets certain facts and extrapolates in an incorrect way,” Ma says. But even that can make good practice, she adds, as students practice dealing with thorny lawyers who are bluffing.

The software is “out-of-the-box ready,” Ma says. “It just looks like a text editor. What you’re really trying to teach is the legal skills. You’re not focused on the AI skills.”

Currently, the program is used by law firms for new associates as Stanford Law School determines how to best deploy the simulator in its classes, says Ma, who works with the law school faculty committee for AI and education.

David Colarusso, a 2016 ABA Journal Legal Rebel, built some of the tools being used by his students at Suffolk Law. (Photo from Suffolk University)

In Suffolk Law’s Artificial Intelligence and the Law class, students used a handful of AI tools built by David Colarusso—co-director of the Legal Innovation & Technology Lab and a 2016 ABA Journal Legal Rebel—who led the course. The class aimed to teach students about AI technology and the law by using AI to study currently open cases that could inform AI regulations, he says.

Students looked at cases such as New York Times v. Microsoft and conducted tabletop simulations of motion practice and trial, he says.

“It’s the caselaw method, except it’s in simulation, and it’s growing and alive and vibrant,” he says.

Instead of simply being on call to discuss the case, students simulated being an attorney in it.

“It was a lot of fun, but we couldn’t have done that without some of the AI tech tools,” he adds.

Colarusso created a bot called Moot a Case that simulates the judge and allows students to practice their oral arguments before in-class arguments.

“The judge would question you based upon its understanding of having read the briefs,” Colarusso says.

Other Colarusso-created AI tools include Go Socrates, which conducts a Socratic method with cases; and Distill & Question, which allows students to upload case documents to the question bot, receive a summary and then conduct a question-and-answer session about a case, “something I like to call prereading,” he says.

Each week, students used Weekly Reflection, a tool that asks about that week’s work and their plans for their projects. They then shared two of that week’s GPT conversations with Colarusso, who says he gained new insights.

“For the first time, I could see what they’re thinking as they’re engaging with the case and their thought process in a way that I never before could,” he says. “It’s the exact opposite of what everyone worries these tools will do—that people are just going to use them to write the essay, and then you don’t know what they’re thinking.”

Clinical

Meanwhile, Suffolk Law’s Legal Innovation & Technology Lab, in collaboration with the American Arbitration Association, created the Online Dispute Resolution Innovation Clinic to help people in Massachusetts facing simple divorces as pro se litigants.

Dean Andrew Perlman of Suffolk Law says if the ODR divorce clinic is successful, they hope to replicate the program in other jurisdictions. (Photo by Michael Clarke)

The new clinic, announced last June, will offer an accessible method via generative AI to complete the required paperwork, file it directly into the courts and dramatically reduce the cost of ending a marriage, says Suffolk Law dean Andrew Perlman.

“Currently, the documents are a mess,” he says. “This will make it better for the courts, make it better for the individuals who need those kinds of services.”

It is the first clinic where Suffolk Law students will design, build, test and implement a workflow to help litigants deliver completed, correct documents to the court, O’Leary says.

Currently, law students in an Online Dispute Resolution Design Lab elective course are gathering community input about what is needed, she adds. The intuitive platform will offer interactive guided interviews and smart court forms powered by AI, as well as virtual mediation services.

These skills help students as they enter the job market, says Suffolk Law’s O’Leary. “It’s definitely a new skill set that a lot of people seem interested in and desperate for. It’s a bonus that we encourage our students to market carefully.”

The clinic will officially launch in the fall, O’Leary says.

After the Massachusetts beta test, the hope is to ultimately replicate the program in other jurisdictions, Perlman says.

Meanwhile, at Vanderbilt University’s AI Law Lab, students are building tools they hope will help ease the access-to-justice gap. They are creating an end-of-life AI planning tool that generates wills and advance directives for people in Tennessee, with the aim of offering it to legal aid organizations.

“One of the magical things about these tools is they can reach a whole new audience of people who, frankly, do not have access to legal help,” Williams says. “It’s not about taking legal work away from people. It’s reaching people who are never getting touched by the legal market, period.”

While learning coding and building AI tools, “you have to have deep subject knowledge of what you’re talking about” to evaluate a tool’s outputs and whether they offer truthful answers.

“There’s a lot of benchmarking and safeguarding and testing that needs to be done before you can deploy a tool in that manner,” Williams says.

All expect more AI tools to come.

“We’ve just come out a little bit from hesitation of the use of AI,” Ma says. “It takes time.”