Showing posts with label lawyers. Show all posts

Friday, May 8, 2026

Prosecutor suspended by state supreme court for artificial intelligence use in court docs; ABA Journal, May 7, 2026

ABA Journal; Prosecutor suspended by state supreme court for artificial intelligence use in court docs

"A Georgia prosecutor who repeatedly filed documents with artificial intelligence-generated citations that referenced cases that were wrong or fictitious during a murder trial has been suspended for six months from practicing before the Georgia Supreme Court.

Law & Crime has the story." 

Wednesday, April 29, 2026

Copyright Infringement Suits Loom With Unchecked AI Vibe Coding; Bloomberg Law, April 29, 2026

 Christopher Suarez, Bill Toth, Anthony Pericolo, Bloomberg Law; Copyright Infringement Suits Loom With Unchecked AI Vibe Coding

"Deferring the job of software coding to artificial intelligence doesn’t immunize that code from copyright risk—it could even increase it, if the person directing the coding has limited oversight over the result.

This is particularly true with “vibe coding,” where developers use high‑level natural language prompts to generate code using AI models, often with limited manual review or modification of the resulting code.

Just as lawyers should check for “hallucinated” citations when writing with large language models, engineers and software development managers need to have human and technical monitoring protocols to account for infringement and licensing risks."

Friday, April 24, 2026

ABA Law Day events to focus on ‘The Rule of Law and the American Dream’; ABA Journal, April 21, 2026

ABA Journal; ABA Law Day events to focus on ‘The Rule of Law and the American Dream’

"The American Bar Association will host various events to mark Law Day 2026 that address the theme, “The Rule of Law and the American Dream.”

May 1 is designated as the official Law Day."

Thursday, April 23, 2026

Penalties stack up as AI spreads through the legal system; NPR, April 3, 2026

NPR; Penalties stack up as AI spreads through the legal system

""Recently we had 10 cases from 10 different courts on a single day," says Damien Charlotin, a researcher at the business school HEC Paris who keeps a worldwide tally of instances of courts sanctioning people for using erroneous information generated by AI...

The numbers started taking off last year, and Charlotin says the rate is still increasing. He counts a total of more than 1,200 to date, of which about 800 are from U.S. courts.

Penalties are also on the rise, he says. A federal court may have set a record last month with an order for a lawyer in Oregon to pay $109,700 in sanctions and costs for filing AI-generated errors.

The professional embarrassments even take place at the level of state supreme courts...

"I am surprised that people are still doing this when it's been in the news," says Carla Wale, associate dean of information & technology and director of the law library at the University of Washington School of Law. She's designing special training in AI ethics for students who are interested. But she also says the ethical rules aren't completely settled...

When lawyers get in trouble for using AI, it's because they've violated the long-standing rule that holds them responsible for the accuracy of their filings, regardless of how they were generated."

Wednesday, April 22, 2026

A.I. ‘Hallucinations’ Created Errors in Court Filing, Top Law Firm Says; The New York Times, April 21, 2026

The New York Times; A.I. ‘Hallucinations’ Created Errors in Court Filing, Top Law Firm Says

Sullivan & Cromwell apologized for submitting a court document that had fake citations created by artificial intelligence.

"An elite Wall Street law firm has apologized to a federal judge for submitting a court filing replete with errors created by artificial intelligence, including “hallucinations” that fabricated case citations.

The A.I.-generated errors came in a recent motion in U.S. Bankruptcy Court in Manhattan and were discovered by lawyers from an opposing firm, Andrew Dietderich, a partner at Sullivan & Cromwell, wrote in a letter to Judge Martin Glenn on April 18."

Friday, April 10, 2026

Lawyer sued for charging client for 34.5 hours of work in 1 day; ABA Journal, April 9, 2026

Amanda Robert, ABA Journal; Lawyer sued for charging client for 34.5 hours of work in 1 day

"An Australian lawyer has been sued for billing a client for 34.5 hours in a single day.

Keith Redenbach, the principal of Redenbach Legal in Sydney, billed the city council of Broken Hill in New South Wales, Australia, $10 million in Australian currency ($6.9 million in U.S. currency) after representing the group in a dispute with an architectural company, Law.com reports.

Among his charges, Redenbach claimed to work 34.5 hours on Sept. 19, 2019; 31.12 hours on Dec. 6, 2018; and 25.5 hours on April 18, 2019."

Friday, April 3, 2026

The One Thing Trump Wanted That Pam Bondi Failed to Deliver; The New York Times, April 2, 2026

The New York Times; The One Thing Trump Wanted That Pam Bondi Failed to Deliver

"But the core of Mr. Trump’s dissatisfaction with the attorney general was apparently her failure to serve his need for revenge against his enemies. She did not prosecute enough of Mr. Trump’s adversaries, and the cases she did bring were failures...

The worst consequence of the Justice Department’s pursuit of cases involving otherwise law-abiding but undocumented individuals is that it has led to untold suffering among those targeted, their families and the economies they support. Ms. Bondi’s lawyers have spent considerable time and money on the harassment, and worse, of people who have done no harm to anyone...

Perhaps worst of all, Justice Department lawyers under Ms. Bondi have often behaved in shockingly unethical ways. For decades, federal judges have looked at assistant U.S. attorneys and other Justice Department lawyers as something more than mere combatants. For good reason, judges assumed that federal lawyers told them the truth about the facts and the law of their cases. In legal terms, the actions of the Justice Department received a “presumption of regularity,” which the private bar did not enjoy. But based on the frequently appalling conduct — for instance, lying, gaslighting, hiding facts and evidence — of Justice Department lawyers in the Bondi era, many judges are no longer giving government lawyers the benefit of the doubt. Nor should they.

Replacing Ms. Bondi with her deputy, Todd Blanche, or the administrator of the U.S. Environmental Protection Agency, Lee Zeldin, to name two likely successors, will not solve this problem unless the new attorney general makes the commitment, unlikely under the circumstances, that the Justice Department will return to its tradition of honesty and integrity."

Sanctions ramping up in cases involving AI hallucinations; ABA Journal, April 2, 2026

Amanda Robert, ABA Journal; Sanctions ramping up in cases involving AI hallucinations

"The use of monetary sanctions against attorneys is seemingly on the rise as courts continue to address artificial intelligence-generated hallucinations in case documents."

Thursday, March 5, 2026

Trump Justice Dept. Seeks to Stall State Bar Discipline of Its Lawyers; The New York Times, March 4, 2026

Devlin Barrett, The New York Times; Trump Justice Dept. Seeks to Stall State Bar Discipline of Its Lawyers

The administration has no control over the disciplinary authorities of state bar associations, but a new proposal would let the attorney general ask them to suspend proceedings involving department lawyers.

"The Justice Department is seeking to intervene in state bar associations’ disciplinary proceedings against its lawyers, reflecting a growing fear among administration officials that attorneys who do their bidding could be punished by legal ethics organizations and lose their ability to practice law.

The department, in a notice posted online in the Federal Register, said it wanted priority in investigating any allegations of wrongdoing by its own lawyers in an effort to rein in the power of state bar authorities to investigate or discipline its lawyers.

But the department has no control over state bar disciplinary authorities, and the proposal envisions merely requesting that a state bar association “suspend any parallel investigations until the completion of the department’s review.”...

Melanie Lawrence, who served as the interim chief trial counsel for the California State Bar from 2018 to 2021, said that state bars played a critical role in the legal profession by enforcing ethics rules, even for senior Justice Department officials.

“None of these Department of Justice attorneys, from Pam Bondi to the lowliest line attorney, would have a job were it not for the license they have in a particular state,” Ms. Lawrence said. “The state bar holds the key to these people’s ability to wield their sword.”"

Thursday, February 19, 2026

Supreme Court adopts automated recusal software to avoid ethics conflicts; CNN, February 17, 2026

 Tierney Sneed, CNN; Supreme Court adopts automated recusal software to avoid ethics conflicts

"The Supreme Court said Tuesday that it will start using software to assist in justices’ decisions to recuse themselves from cases that present a potential conflict of interest.

A brief press release issued by the court described an electronic matching process already used by some lower courts to compare a case’s parties to lists judges assemble of individuals and organizations they have ties to. A 2023 code of conduct statement from the justices said they were considering adopting such a tool themselves.

“This software will be used to run automated recusal checks by comparing information about parties and attorneys in a case with lists created by each Justice’s chambers,” the press release said. “The system was designed and created by the Court’s Office of Information Technology in cooperation with the Court’s Legal Office and Clerk’s Office.”"

Saturday, February 14, 2026

Bar Punts on Ethics Complaint Over Application to Search Reporter’s Home; The New York Times, February 12, 2026

The New York Times; Bar Punts on Ethics Complaint Over Application to Search Reporter’s Home

A press freedom group accused a prosecutor of violating an ethics rule by not telling a judge about a law limiting searches for journalistic work product.

"The Virginia State Bar has told a press freedom organization that it is up to a judge to decide whether a federal prosecutor mishandled an application for a warrant last month to search the home of a Washington Post reporter as part of a leak investigation.

The group, Freedom of the Press Foundation, had filed a disciplinary complaint with the bar against the prosecutor, Gordon D. Kromberg. It cited his failure to alert the magistrate judge, who approved the search warrant, about the Privacy Protection Act of 1980, which limits searches for journalistic work product.

But in an unsigned letter viewed by The New York Times, the state bar said the judge, William B. Porter of the Eastern District of Virginia, had to evaluate the omission."

Friday, February 13, 2026

Lawyer sets new standard for abuse of AI; judge tosses case; Ars Technica, February 6, 2026

Ashley Belanger, Ars Technica; Lawyer sets new standard for abuse of AI; judge tosses case

"Frustrated by fake citations and flowery prose packed with “out-of-left-field” references to ancient libraries and Ray Bradbury’s Fahrenheit 451, a New York federal judge took the rare step of terminating a case this week due to a lawyer’s repeated misuse of AI when drafting filings.

In an order on Thursday, District Judge Katherine Polk Failla ruled that the extraordinary sanctions were warranted after an attorney, Steven Feldman, kept responding to requests to correct his filings with documents containing fake citations."

Tuesday, December 9, 2025

What Happens When a Lawyer Makes a Mistake?; ABA Journal, October 28, 2025

Jeanne M. Huey, ABA Journal; What Happens When a Lawyer Makes a Mistake?

"The Model Rules of Professional Conduct are clear about what must happen when a lawyer makes a “material mistake,” and the steps are grounded in the duty of competence, diligence, and communication owed to a current client.

The Ethical Framework

ABA Model Rule 1.1 requires legal knowledge and thoroughness. Rule 1.3 requires promptness, and Rule 1.4 mandates keeping clients informed about their matter and promptly responding to requests for information.

When a mistake has been made during a legal representation, these rules all come into play. If the error is “material,” it must be disclosed promptly. Hoping the client never finds out or quietly fixing it before disclosing is never a good idea as it can risk turning a simple lapse into a Rule 8.4(c) problem involving deceit or misrepresentation."

Thursday, November 27, 2025

Prosecutor Used Flawed A.I. to Keep a Man in Jail, His Lawyers Say; The New York Times, November 25, 2025

The New York Times; Prosecutor Used Flawed A.I. to Keep a Man in Jail, His Lawyers Say

"On Friday, the lawyers were joined by a group of 22 legal and technology scholars who warned that the unchecked use of A.I. could lead to wrongful convictions. The group, which filed its own brief with the state Supreme Court, included Barry Scheck, a co-founder of the Innocence Project, which has helped to exonerate more than 250 people; Chesa Boudin, a former district attorney of San Francisco; and Katherine Judson, executive director of the Center for Integrity in Forensic Sciences, a nonprofit that seeks to improve the reliability of criminal prosecutions.

The problem of A.I.-generated errors in legal papers has burgeoned along with the popular use of tools like ChatGPT and Gemini, which can perform a wide range of tasks, including writing emails, term papers and legal briefs. Lawyers and even judges have been caught filing court papers that were rife with fake legal references and faulty arguments, leading to embarrassment and sometimes hefty fines.

The Kjoller case, though, is one of the first in which prosecutors, whose words carry great sway with judges and juries, have been accused of using A.I. without proper safeguards...

Lawyers are not prohibited from using A.I., but they are required to ensure that their briefs, however they are written, are accurate and faithful to the law. Today’s artificial intelligence tools are known to sometimes “hallucinate,” or make things up, especially when asked complex legal questions...

Westlaw executives said that their A.I. tool does not write legal briefs, because they believe A.I. is not yet capable of the complex reasoning needed to do so...

Damien Charlotin, a senior researcher at HEC Paris, maintains a database that includes more than 590 cases from around the world in which courts and tribunals have detected hallucinated content. More than half involved people who represented themselves in court. Two-thirds of the cases were in United States courts. Only one, an Israeli case, involved A.I. use by a prosecutor."

Wednesday, November 26, 2025

AI, ethics, and the lawyer's duty after Noland v. Land of the Free; Daily Journal, November 24, 2025

Reza Torkzadeh, Daily Journal; AI, ethics, and the lawyer's duty after Noland v. Land of the Free

"Noland establishes a bright line for California lawyers. AI may assist with drafting or research, but it does not replace judgment, verification or ethical responsibility. Technology may change how legal work is produced -- it does not change who is accountable for it."

GEORGE C. YOUNG AMERICAN INNS OF COURT EXPLORES ETHICS AND PITFALLS OF AI IN THE COURTROOM; The Florida Bar, November 26, 2025

 The Florida Bar; GEORGE C. YOUNG AMERICAN INNS OF COURT EXPLORES ETHICS AND PITFALLS OF AI IN THE COURTROOM

"The George C. Young American Inns of Court continued its ongoing focus on artificial intelligence with a recent program titled, “The Use of AI to Craft Openings, Closings, and Directing Cross-Examination: Ethical Imperatives and Practical Realities.”...

Demonstrations showed that many members could not distinguish AI-generated narratives from those written by humans, highlighting the technology’s increasingly high-quality output. However, presenters also noted recurring drawbacks. AI-generated direct and cross-examinations frequently included prohibited or incorrect elements such as hearsay, compound questioning, and fabricated details — jokingly referred to as “ghost people” — distinguishing factual hallucinations from the better-known “phantom citation” problem.

The program concluded with a reminder that while AI may streamline drafting and help lawyers think creatively, professional judgment cannot be outsourced. The ultimate responsibility for accuracy, ethics, and advocacy remains with the lawyer."

Tuesday, November 18, 2025

A Native American leader who enlisted in the Union Army has been posthumously admitted to the New York bar after 176 years; CNN, November 15, 2025

CNN; A Native American leader who enlisted in the Union Army has been posthumously admitted to the New York bar after 176 years

"Ely S. Parker, a Tonawanda Seneca from western New York, never took no for an answer.

At the start of the Civil War, Parker’s offer to enlist was rejected outright by another New Yorker, Secretary of State William H. Seward, who – according to historians – told the Seneca leader the war dividing America “was an affair between white men and one in which the Indian was not called on to act.”

“Go home, cultivate your farm, and we will settle our own troubles among ourselves without any Indian aid,” Seward told Parker, who also unsuccessfully petitioned Congress to grant him US citizenship so he could enlist. Native Americans would not be made citizens until 1924.

But Parker had connections: He was a close friend of future Union Army Commander Ulysses S. Grant, who eventually intervened and endorsed his commission as captain. He would become a top aide to the Union Army’s most revered general.

On Friday, in a ceremonial courtroom in downtown Buffalo, supporters and direct descendants of Parker gathered for a celebration of his resiliency, with the New York Supreme Court Appellate Division, Fourth Department posthumously admitting him to the bar – 176 years after he had been denied because Native Americans were not considered US citizens.

“The posthumous admission to the bar is fitting and deserving of a man who lived his life with integrity,” said C. Joseph Genetin-Pilawa, an associate professor of history at George Mason University who has written extensively about Parker. “He didn’t give up. He continued to fight for what he believed in.”

Parker is the first Native American posthumously admitted to the bar in US history, according to legal experts. The petition for admission was made on behalf of his great-great-great-grandniece, Melissa Parker Leonard, whose father, Alvin, often played the chief in historical reenactments. The effort dates to 2020, when former Texas appellate Justice John Browning, a law professor at Faulkner University, first approached Alvin Parker, who died in 2022.

“Despite all the odds, all the adversity, the Seneca people still reside in western New York,” Parker Leonard, a 42-year-old educator and vice president of The Buffalo History Museum, told CNN."

Monday, November 17, 2025

Law firm Morgan & Morgan drops Disney lawsuit over Mickey Mouse ad; Reuters, November 12, 2025

Reuters; Law firm Morgan & Morgan drops Disney lawsuit over Mickey Mouse ad

"Personal injury law firm Morgan & Morgan on Wednesday voluntarily dismissed a lawsuit against Disney that sought to proactively defend its use of the early Mickey Mouse film "Steamboat Willie" in an advertisement.

Morgan & Morgan asked a Florida federal court to dismiss its case without prejudice, which means it can be refiled. Spokespeople for the firm did not immediately respond to a request for comment or for more information, including whether the parties settled."

Wednesday, November 12, 2025

Vigilante Lawyers Expose the Rising Tide of A.I. Slop in Court Filings; The New York Times, November 7, 2025

The New York Times; Vigilante Lawyers Expose the Rising Tide of A.I. Slop in Court Filings

"Mr. Freund is part of a growing network of lawyers who track down A.I. abuses committed by their peers, collecting the most egregious examples and posting them online. The group hopes that by tracking down the A.I. slop, it can help draw attention to the problem and put an end to it.

While judges and bar associations generally agree that it’s fine for lawyers to use chatbots for research, they must still ensure their filings are accurate.

But as the technology has taken off, so has misuse. Chatbots frequently make things up, and judges are finding more and more fake case law citations, which are then rounded up by the legal vigilantes.

“These cases are damaging the reputation of the bar,” said Stephen Gillers, an ethics professor at New York University School of Law. “Lawyers everywhere should be ashamed of what members of their profession are doing.”...

The problem, though, keeps getting worse.

That’s why Damien Charlotin, a lawyer and researcher in France, started an online database in April to track it.

Initially he found three or four examples a month. Now he often receives that many in a day.

Many lawyers, including Mr. Freund and Mr. Schaefer, have helped him document 509 cases so far. They use legal tools like LexisNexis for notifications on keywords like “artificial intelligence,” “fabricated cases” and “nonexistent cases.”

Some of the filings include fake quotes from real cases, or cite real cases that are irrelevant to their arguments. The legal vigilantes uncover them by finding judges’ opinions scolding lawyers."

Sunday, November 9, 2025

California Prosecutor Says AI Caused Errors in Criminal Case; Sacramento Bee via Government Technology, November 7, 2025

 Sharon Bernstein, Sacramento Bee via Government Technology; California Prosecutor Says AI Caused Errors in Criminal Case

"Northern California prosecutors used artificial intelligence to write a criminal court filing that contained references to nonexistent legal cases and precedents, Nevada County District Attorney Jesse Wilson said in a statement.

The motion included false information known in artificial intelligence circles as “hallucinations,” meaning that it was invented by the AI software asked to write the material, Wilson said. It was filed in connection with the case of Kalen Turner, who was accused of five felony and two misdemeanor drug counts, he said.

The situation is the latest example of the potential pitfalls connected with the growing use of AI. In fields such as law, errors in AI-generated briefs could impact the freedom of a person accused of a crime. In health care, AI analysis of medical necessity has resulted in the denial of some types of care. In April, a 16-year-old Rancho Santa Margarita boy killed himself after discussing suicidal thoughts with an AI chatbot, prompting a new California law aimed at protecting vulnerable users.

“While artificial intelligence can be a useful research tool, it remains an evolving technology with limitations — including the potential to generate ‘hallucinated’ citations,” Wilson said. “We are actively learning the fluid dynamics of AI-assisted legal work and its possible pitfalls.”"