Tuesday, April 7, 2026

A Judge Mistakes the Claude Chatbot for a Person


“A federal judge in Manhattan ruled in February that when a criminal defendant used an AI chatbot to prepare for his legal defense, he waived attorney-client privilege. The prosecution can now read every word he typed and the answers he received. If this reasoning stands, the consequences will reach far beyond artificial intelligence.


The defendant in U.S. v. Heppner wasn't a rogue litigant trying to replace his lawyers with a chatbot. He was represented by counsel and had already received privileged communications from his defense attorneys. His lawyers confirmed that he used Anthropic's Claude to organize and analyze that material in preparation for meetings with counsel. He then shared the AI's outputs with his attorneys, who used them in developing their strategy.


Judge Jed Rakoff held that the Claude transcripts were protected by neither the attorney-client privilege nor the work-product doctrine. The court's reasoning: By typing information into an AI platform, the defendant "shared" it with a third party, and because Anthropic's privacy policy permits data collection and potential further disclosure, no "reasonable expectation of confidentiality" existed.


The judge's error was straightforward: He treated an AI model like a person. Throughout his opinion, he refers to the software engaging in "communications" with the user. But AI isn't a person; it is a computing process. It can't be deposed, call the police or betray a confidence. The third-party disclosure rule exists because sharing information with a human being creates a risk that the human will further disseminate it.


That risk doesn't exist when the "third party" is a statistical model running on a server. Judge Rakoff considered, and dismissed, the obvious point that typing into an AI tool is no different from typing into cloud-based software such as Google Docs. His answer, that cloud computing "is not intrinsically privileged in any case," is a non sequitur. The question isn't whether Google Docs creates privilege. It's whether Google Docs destroys it. No lawyer in America thinks drafting a confidential memo in Google Docs waives the privilege over its contents. Judge Rakoff's opinion doesn't explain why the same act in another application does.


No court has ever gone this far. The American Bar Association concluded in 2017 that lawyers may use cloud computing without waiving privilege, provided they take reasonable security precautions. State bar authorities in New York, California and elsewhere have reached the same conclusion. The entire legal profession has operated on this understanding for more than a decade. Judge Rakoff's opinion doesn't cite, distinguish or acknowledge any of these authorities.


Other federal courts have reached the opposite of Judge Rakoff's conclusion. In Warner v. Gilbarco (2026), Judge Anthony P. Patti of the Eastern District of Michigan held that AI chatbots are "tools, not persons" and denied a motion to compel a litigant's AI materials. Last week Magistrate Judge Maritza D. Braswell of the District of Colorado reached the same conclusion in Morgan v. V2X. Her order posed the question Heppner never asked: Does anyone with a Google account forfeit all rights to confidentiality? Citing the Supreme Court's reasoning in Carpenter v. U.S., she held that routing information through a third-party system doesn't destroy privacy protections.


Consider the practical consequences of Heppner. Google's terms of service grant the company a broad license to process user content and reserve the right to disclose data in response to legal process. Microsoft's terms are comparable. So are Amazon's, Apple's and Dropbox's. Under Judge Rakoff's reasoning, every privileged document drafted in Google Docs, every confidential email sent through Gmail, every sensitive legal file stored in any cloud service has been "disclosed" to the provider. The logic is identical. The only difference is that the tool in this case had the letters "AI" attached to it.


The ruling also creates an extraordinary asymmetry in criminal proceedings. Federal prosecutors use AI tools every day for investigations, case preparation and legal research. Those uses are shielded by government privileges. But under Heppner, a defendant who uses the same technology to prepare his own defense has created a road map the prosecution can seize. Every question asked, every draft generated, every strategic pivot is preserved in the platform's logs and available on subpoena. The government gets to use AI. Defendants don't.


For litigants navigating the legal system without a lawyer, the impact is massive. Under this ruling, every interaction they have with an AI tool is fully discoverable. This hurts the population most in need of technological assistance and least equipped to absorb the consequences of AI log exposure.


We both use AI tools in our professional work, as does almost every lawyer, judge and executive in America. If using a computational tool to process information constitutes "disclosure" to a third party, the implications extend to everyone who has ever stored a confidential document in the cloud or sent an email. Much legal work is compromised.


Heppner is a single district court opinion. It binds no one beyond this case. But it grabbed public attention, and in law the opinion that gets read is the opinion that gets followed. Other courts may look to it for guidance. They should look elsewhere. The correct rule is straightforward: Using a tool to process information isn't the same as disclosing information to a person. AI is infrastructure, not an interlocutor. Courts should reject this reasoning before its logic spreads.


---


Ms. McCormack is president and CEO of the American Arbitration Association. She served as chief justice of the Michigan Supreme Court, 2019-22. Mr. Klapper is CEO of Learned Hand AI. He served as a law clerk on the Second U.S. Circuit Court of Appeals, 2020-21.” [1]


1. McCormack, Bridget; Klapper, Shlomo. "A Judge Mistakes the Claude Chatbot for a Person." Wall Street Journal, Eastern edition; New York, N.Y., 7 Apr. 2026: A15.
