Some of California's troubled bar exam was drafted by nonlawyers with help from artificial intelligence

Updated: Despite a multimillion-dollar contract with Kaplan to create its new bar exam, the State Bar of California recycled some questions and had its independent psychometrician draft portions of the new exam’s multiple-choice questions with help from artificial intelligence, according to a news release.

While the majority of the test’s 200 multiple-choice questions were created by Kaplan, “a subset of questions came from the First-Year Law Students’ Exam,” also known as the “baby bar,” which comprises 100 multiple-choice questions and is often given to first-year law students, and some were developed by ACS Ventures, the state bar’s independent psychometrician, according to the release. The release goes on to say that ACS used AI to assist in writing questions that were “reviewed by content validation panels and a subject matter expert.”

In its move to a hybrid and remote exam prompted by financial issues, the state bar last summer signed a five-year, $8.25 million agreement with Kaplan to write the exam. Launched in February, the exam was plagued by widespread technical and administrative problems.

The announcement drew the ire of California-based law faculty members contacted by the ABA Journal.

“It was irresponsible and reckless for the state bar to use questions drafted by ACS Ventures’ nonlawyer psychometricians in the February 2025 bar exam,” says Mary Basick, assistant dean of academic skills at the University of California, Irvine School of Law. She adds that the lack of transparency about the use of AI “is an egregious breach of trust,” and to her knowledge, the plan had not been approved by the state’s supreme court.

Martin Pritikin, dean of Purdue Global Law School, a fully online law school, also found the disclosure troubling. That “AI was going to be used by nonlawyers in drafting questions was not previously disclosed, and clear parameters for how AI would or wouldn’t be used in the process were not laid out ahead of time,” he wrote to the ABA Journal.

As part of its contract to write the new exam, Kaplan was required to exit the bar-prep business in California to avoid a conflict of interest.

But Pritikin says ACS’s work could be considered just that.

“I would think the company that is paid to help the state bar evaluate the validity of the exam and make recommendations about what the passing score should be would likewise have a conflict of interest in helping to draft portions of that very exam,” Pritikin says. “The potential incentive to paint a rosy picture in their assessment of an exam they helped write seems obvious.”

Basick agrees. “The nonlawyer psychometricians who, using AI, drafted the substandard questions with minimal (if any) review and oversight by bar exam subject matter experts were the same ones who determined the validity and reliability of those very questions—which is an actual conflict of interest,” she says.

After the announcement of the agreement with Kaplan, the quick turnaround to develop the new exam raised eyebrows among law school faculty. “It usually takes years to properly develop and vet multiple-choice questions,” Basick says. Ahead of the new exam, the practice materials released by the state bar and Kaplan were rife with errors, she adds.

Basick and other bar academics had offered to help, she says, and initially, they had been asked to look over the questions after the February administration. But one week before the scheduled review session, their invitations were pulled, she says. The state bar cited a conflict of interest since the academics had previously worked with the National Conference of Bar Examiners’ Multistate Bar Exam questions while preparing students, which could have led to a violation of NCBE’s intellectual property rights, she says.

The faculty members originally included in the review group were not current or former NCBE drafters, says Sophie Martin, NCBE director of communications.

“This made me suspicious,” Basick says, and she and her peers began questioning the state bar in public meetings about the quality and validity of the multiple-choice questions.

Remedies for the February examinees are still under discussion. On Friday, the Committee of Bar Examiners recommended lowering the raw passing score to 534 from the psychometrician’s recommendation of 560. The move must be approved by the California Supreme Court, which the board asked to act by April 28.

But the court was not made aware of the use of AI to draft some of the questions, Cathal Conneely, public affairs director of the Judicial Council of California, wrote in an email to the ABA Journal.

As a result, the court asked the state bar to explain “how and why AI was used to draft, revise, or otherwise develop certain multiple-choice questions, efforts taken to ensure the reliability of the AI-assisted multiple-choice questions before they were administered, the reliability of the AI-assisted multiple-choice questions, whether any multiple-choice questions were removed from scoring because they were determined to be unreliable, and the reliability of the remaining multiple-choice questions used for scoring,” the email continues.

On May 5, the committee will discuss other options, including provisional licensure, a supervised practice pathway and special admissions for attorneys licensed in other states, according to the release.

More than 5,600 candidates had originally registered to take the new exam, but about 1,300 withdrew after the troubled run-up to the exam, according to the state bar.

“On top of everything that went so wrong for the February 2025 bar exam-takers,” Pritikin says, “these new disclosures seem to add insult to injury by indicating that this cohort was somehow used as unwitting guinea pigs for a new way of generating exam questions using unvetted protocols and undisclosed drafters.”

Updated April 24 at 12:27 p.m. to add comment from Sophie Martin. Updated April 24 at 2:09 p.m. to add comment from Cathal Conneely.