Thousands of people took the new California bar exam in February, ready to join the ranks of the state’s 195,000 lawyers.
But a series of missteps by the institution responsible for licensing lawyers has thrown thousands of nascent legal careers into a frustrating limbo.
First, there was the faulty testing software used during the exam. Test takers had trouble logging in. The software often crashed or was missing critical functions like copy and paste, leaving many unable to complete the exam. The organization that administers the test, the State Bar of California, had to offer adjustments of test-takers’ scores and other remedies.
Then came the news that at least a handful of the multiple-choice questions had been developed with the help of artificial intelligence. To many of those who took the exam, it was hardly shocking — they already had suspicions that A.I. had been used, based on a few questions that they said had struck them as bizarrely worded or legally unsound.
And now, California’s future lawyers are likely to have to wait a little longer to find out if they made the cut.
The state bar said it would need more time to obtain approval from the Supreme Court of California to adjust test scores in light of the problems. The results of the February exam had been slated to be released on Friday, but that is likely to be delayed.
“I just wanted a fair chance to be an attorney,” Edward Brickell, a 32-year-old graduate of Southwestern Law School in Los Angeles who took the test, said in an interview. “And it just feels like every week there’s another thing that comes out and says, like, ‘We didn’t give you a fair chance.’”
Mr. Brickell and others who took the test have flooded Reddit and other social media sites with horror stories and with plans to organize protests and demand accountability. On Tuesday at a state bar committee meeting, a handful of test-takers used the public-comment period to voice their displeasure and frustration.
“You guys are the body that is determining if we are competent to earn a living,” one test-taker, Dan Molina, told the state bar’s contracts committee at the virtual meeting. “Finances are being destroyed. Lives are being destroyed, and are about to be destroyed even more.”
With a high threshold for passage, California’s bar exam had long been considered one of the hardest in the nation. That threshold had been lowered in recent years.
In October, the state bar obtained approval from the California Supreme Court to introduce a reworked exam, with questions developed by a new test provider and the option to allow the test to be taken remotely. The state bar made the change to save money.
The state bar had previously used exams developed and prepared by the National Conference of Bar Examiners, the organization behind the exams used by most states, which are considered the gold standard in the field. The N.C.B.E. does not allow remote testing.
Test takers in California were told that the new exam would not require any substantive changes in preparation, so many of them prepared the same way they would have for the N.C.B.E. version of the test.
In November, the state bar administered an experimental exam that functioned as a test run. Those who took it reported technical difficulties. Then, a study guide released by Kaplan, the new test provider, was rife with errors. That guide was quietly corrected and rereleased in the weeks before the exam in February.
Kaplan declined to comment.
In a sign that the state bar had anticipated some difficulties, it offered more than 5,000 registered test-takers the option to defer taking the exam until July, the next test date.
After the February exam, the state bar acknowledged the widespread technical failures.
“We know and have stated that these issues were, and continue to be for those still testing, unacceptable in their range and severity,” the State Bar of California said in a statement. “We apologize again, and we make no excuses for the failures that have occurred.”
The state bar added that it would evaluate whether Meazure Learning, the vendor that provided the technology and proctoring services to administer the exam, had failed to meet its contractual obligations. It also said it would enlist a psychometrician — a specialist who focuses on measuring intangible qualities such as knowledge or intelligence — to come up with score adjustments for test-takers who had experienced difficulties.
The state bar’s proposed test score adjustment was announced last week. The proposal lowered the raw passing score considerably.
That recommendation was filed with a request for approval from the State Supreme Court on Tuesday — three days before the results were set to be released. Given the late filing, the state bar told test-takers that the release of the exam results could be delayed, prolonging a dizzying stretch of uncertainty for many.
Buried deep in the announcement about the scoring adjustment was the new development: Some of the multiple-choice exam questions were developed not by Kaplan but by the state bar’s psychometrics provider, ACS Ventures, with the assistance of artificial intelligence.
ACS Ventures did not respond to a request for comment.
The state bar said that its Committee of Bar Examiners, the body that oversees the exam, had not previously been made aware of the use of A.I. The committee had been instructed by the State Supreme Court last year to explore changes to make the exam less expensive to administer, including the potential use of A.I.
“But the court has not endorsed, nor authorized, the broader use of A.I.,” Alex Chan, the chairman of the Committee of Bar Examiners, said in a statement. “While A.I. may eventually play a role in the future of exam development, absent specific judicial guidance, the Committee has neither considered nor approved its use to date.”
The Supreme Court said it had not been aware that the technology was used in the development of the exam and called for an investigation.
The state bar has not disclosed the details of how the technology was used by ACS Ventures to assist in developing exam questions.
For Mr. Brickell and others, the disclosure that A.I. was used at all seemed to offer an explanation for some of their confusion. Some questions, he and others who took the test said, did not read as though they had been drafted by a human; some listed only incorrect multiple-choice answers.
Ceren Aytekin, an aspiring entertainment lawyer, said she had also noticed peculiarities in some of the questions, but she at first refused to believe A.I. had been used.
“I initially thought, ‘Maybe I’m the wrong one,’” Ms. Aytekin said. “Maybe I’m putting blame on an organization that would never do this to their examinees.” She added: “All the issues I spotted make so much sense with A.I. being involved. We just didn’t want to believe it.”
Two other large state bar associations, in New York and Illinois, said they had never used A.I. to develop questions on their exams. The N.C.B.E., which prepares the exams for New York, Illinois and most other states, said it had never used A.I. for that purpose.
April Dawson, an associate dean at the Technology Law and Policy Center at the North Carolina Central University School of Law, said the use of A.I. in developing test questions was not an issue on its own. She said the problem was in the fact that it had been done without transparency.
“That you would have a licensing body engage in such irresponsible conduct, it really is kind of baffling,” she said.
If he doesn’t pass, Mr. Brickell is likely to take the exam in July. Those who fail the February exam will be able to take it then for free. The state bar has said it will not use any questions that have been developed with A.I. on the July exam.
Had the exam not been offered for free in July, Mr. Brickell had contemplated taking it in another state.
“I don’t want to give them my bar dues as an attorney for the rest of my life,” Mr. Brickell said of California’s state bar. “This has soured me so much.”
Content Source: www.nytimes.com