
Following calls to upgrade Grok's safeguards after it generated inappropriate images of minors, Elon Musk is not pulling back on the AI platform. Instead, the xAI CEO is urging users to 'try' Grok's medical features, quoting a post that encourages uploading blood tests, X-rays and MRI scans for instant analysis, while insisting he has 'not seen it be wrong yet'.
Try it! https://t.co/dnRSycuWXX
— Elon Musk (@elonmusk) January 3, 2026
While some users described Grok as life-changing support during medical crises, others questioned whether uploading intimate health data to a private platform crosses a line that medicine has spent decades trying to protect.
Musk: Grok is Helpful With Medical Stuff
Musk's quoted post described Grok as a powerful medical assistant, saying users can photograph blood work, upload scans, and receive explanations or diagnoses that match — or sometimes exceed — what doctors provide. He suggested AI's advantage lies in its ability to cross-check vast amounts of medical literature instantly, positioning Grok as a tool that can process data faster than humans. In the video, Musk stated that Grok's medical features are already functional and should be used by the public.
Elon Musk: "You take a photograph of your blood work, the page, upload it from your phone to Grok. It will understand what all the data results are and tell you if there's something wrong. I haven't seen it be wrong yet. I think AI could actually be very helpful with medical…" pic.twitter.com/DfMrcwA2iO
— X Freeze (@XFreeze) January 3, 2026
Stories of Grok Helping in Real Emergencies
Some of the most striking reactions came from users sharing personal medical experiences, treating Grok as a practical lifeline for their health problems. One detailed reply described how AI advice helped reverse a near-fatal situation when doctors believed nothing more could be done. The commenter said Grok recommended nutritional intervention via a feeding tube, advice they persuaded doctors to follow. Their partner regained consciousness, began eating again, and remains alive months later.
Others reported similar experiences, saying Grok's analysis 'largely overlapped with the doctor's' or was 'even more comprehensive' than their doctor's answers. A common theme was speed: users could understand results immediately rather than wait days or weeks for appointments.
'Assistant, Not a Replacement'
A number of comments, however, urged caution. These users welcomed AI in medicine but warned against treating it as a substitute for doctors.
Several noted that AI excels at spotting patterns, summarising studies and analysing probabilities, but lacks patient history, context and judgment. One summed it up plainly: medicine is 'not only data, it is judgment'. Another added that using AI blindly is dangerous, but using it to ask better questions is already a win. For them, Grok can act as a second brain, but should not be treated as the final authority on life-or-death decisions.
Privacy Fears and Data Misuse
Alongside enthusiasm came strong discomfort about sharing medical data. Many users warned against uploading identifiable information at all.
Some suggested stripping names and IDs before using AI, while others were more cynical, claiming the data would be sold to insurers. One comment boiled the fear down to a simple question: 'Do I really want to give you all my personal data?'
Others called for a healthcare-specific version of Grok, arguing there should be protections similar to medical privacy laws before anyone treats AI analysis as safe. There were also users who rejected Musk's premise entirely, pointing out that people have always been able to research lab results themselves without uploading sensitive data.
Taken together, the reactions show less about Grok alone and more about what people want from healthcare. Faster answers. Clear explanations. Less gatekeeping.
At the same time, reactions to Musk's encouragement reveal real fear about privacy and over-reliance on machines, especially on a platform that has already shown it can generate images that go against the law.