Summary
- Anthropic's AI chatbot Claude helped reduce a hospital bill from US$195,000 to US$33,000 after identifying duplicate charges.
- Matt Rosenberg shared the story on Threads, saying the hospital used incorrect billing codes and marked up supplies by as much as 2,300%.
- Rosenberg also used ChatGPT to fact-check the legal letter drafted with Claude, resulting in a successful negotiation with the hospital.
A man in the United States reports that he reduced a hospital bill from US$195,000 (around R$1 million) to just US$33,000 (around R$176,800) after the death of his brother-in-law. On the social network Threads, user nthmonkey, identified as Matt Rosenberg, explains that he used Anthropic's chatbot Claude to analyze the itemized invoice and uncover serious errors.
According to the report, the original amount covered just four hours of intensive care before his brother-in-law's death; the man had lost his health insurance two months earlier. The initial invoice, Rosenberg says, was confusing and gave no cost breakdown, lumping US$70,000 under the single description “cardiology.”
After insisting, the hospital sent an itemized invoice with standard billing codes (CPT). It was at that moment that the user decided to use the chatbot. In the post, Rosenberg says he used the paid versions of Claude and ChatGPT (for review) during the process.
What did the AI discover?
According to the report, Claude's main finding was massive duplication in the charges: the hospital billed for the main procedure and, separately, for each of its individual components.
“This represented over a hundred thousand dollars in costs that Medicare [the US public health insurance program] would have refunded zero dollars,” Rosenberg wrote.
Beyond the duplicate billing, the AI identified other irregularities that inflated the bill:
- Incorrect codes: the hospital used a procedure code that only applies to inpatients, but the brother-in-law was in the emergency room and was never formally admitted.
- Regulatory violation: he was charged for use of a ventilator on the same day as a critical emergency admission, a practice that, according to Claude, is not permitted under Medicare rules.
- Overpricing: simple supplies, such as aspirin, were marked up 500% to 2,300% above what the public system would reimburse.
The AIs drafted the dispute
Rosenberg reports that the hospital “made up its own rules, its own prices, and figured it could just take money from lay people.” The institution even suggested that the family seek charitable assistance.
Despite acknowledging that AIs tend to hallucinate, Rosenberg relied on Claude (and, he says, a fact-check by ChatGPT) to write a letter in a legal tone. The document detailed the billing violations and threatened legal action, negative press coverage, and reports to legislative committees.
The letter proposed paying the amount Claude estimated Medicare would have reimbursed. The hospital counter-offered US$37,000. Rosenberg refused and negotiated the final amount down to US$33,000, which was accepted. “Hospitals know they are the criminals, and if you confront them properly, they will back down,” he posted.
With information from Tom’s Hardware
Source: https://tecnoblog.net/noticias/homem-recorre-a-ia-e-reduz-conta-medica-de-us-195-mil-para-us-33-mil/
