
Google has removed its AI Overviews for certain medical search queries following an investigation by The Guardian that found the feature could expose users to harmful and misleading health information.
The decision marks a rare rollback of the technology, which places AI-generated summaries at the top of search results and is increasingly used by people seeking quick answers to sensitive medical questions.
The investigation highlighted instances in which the summaries presented incomplete or inaccurate information, prompting concern among clinicians and patient advocates that users could be falsely reassured about serious conditions.
Misleading Information and Potential Harm
At the centre of the controversy were AI Overviews related to liver blood tests. Experts warned that the summaries provided lists of numerical ranges without sufficient context, failing to reflect how results can vary by age, sex, ethnicity or the specific tests performed.
In some cases, what Google's AI presented as normal differed substantially from accepted medical guidance.
Health professionals said this approach could lead individuals with significant liver disease to believe their results were normal and delay seeking follow-up care. Such false reassurance was described as dangerous, particularly given that liver conditions can progress without obvious symptoms.
Guardian Investigation Prompts Action
After The Guardian presented its findings to Google, the company removed AI Overviews for specific queries relating to liver function test ranges. A Google spokesperson said the firm does not comment on individual removals but confirmed that action is taken when summaries lack context or fall short of its policies.
However, the investigation also found that slightly reworded versions of the same queries could still trigger AI Overviews, raising questions about how effectively the problem has been addressed.
Reacting on X, one commentator wrote: "Google just pulled AI Overviews for sensitive medical queries like 'normal range for liver function tests' after a Guardian investigation showed they were giving misleading info, ignoring key factors like age, sex, ethnicity, etc. Health info is too important for half-baked AI."
— Love Web3 World (@WebThreeAI), 12 January 2026
Charities and Patient Groups Remain Concerned
Vanessa Hebditch of the British Liver Trust welcomed the removal of the summaries highlighted by The Guardian but warned that the underlying issue remains unresolved. She said AI-generated health information can still be inaccurate and confusing, particularly when it simplifies complex medical tests into bold lists of numbers without clear explanations.
Sue Farrington of the Patient Information Forum echoed those concerns, describing the move as a positive first step rather than a complete solution. She noted that many people already struggle to find reliable health information and that errors in prominent AI summaries risk undermining trust in online search results.
Google Defends Wider Use of AI Overviews
Google maintains that AI Overviews appear only when it has high confidence in their quality and says its summaries often link to reputable sources while encouraging users to seek professional advice. The company added that its internal clinicians review health-related content and that not all examples raised by The Guardian were found to be inaccurate.
Nevertheless, AI Overviews reportedly continue to appear for other health topics previously criticised by experts, including cancer and mental health, with some descriptions labelled by clinicians as seriously wrong.
Broader Questions About AI in Health Searches
Technology commentators say the Guardian investigation underscores the risks of placing AI-generated information at the top of search results, especially for medical topics where mistakes can carry serious consequences. As AI tools become more embedded in everyday searches, pressure is growing on Google to ensure accuracy, context and clear signposting to trusted medical sources.
Originally published on IBTimes UK