Recently Diagnosed or Relapsed? Stop Looking For a Miracle Cure, and Use Evidence-Based Therapies To Enhance Your Treatment and Prolong Your Remission

Multiple myeloma is an incurable disease, but I have spent the last 25 years in remission using a blend of conventional oncology and evidence-based nutrition, supplementation, and lifestyle therapies drawn from peer-reviewed studies that your oncologist probably hasn't told you about.


ChatGPT, Dr. Google and Myeloma


ChatGPT, Dr. Google, and Myeloma. Odd combination, right? Not if you’re a newly diagnosed multiple myeloma (NDMM) patient, it’s not.

First and foremost, the Internet in general, and artificial intelligence specifically, may be the single most important advancement ever for NDMM patients. But to benefit from them, NDMM patients must understand the limitations of Dr. Google and artificial intelligence.

Almost every blog post on PeopleBeatingCancer.org starts with a study about MM from the internet. Research is the heart of evidence-based medicine. However, myeloma patients and survivors need to understand the limitations of searching for MM information on the internet.

The video below is a humorous take on the negatives that can come from Dr. Google. Interestingly, many of the cancer “cures” mentioned in the video are complementary MM therapies. For example, intravenous vitamin C (IVC) has been shown to kill MM cells. But this doesn’t mean that IVC can cure MM.



The challenge faced by NDMM patients is that they don’t yet know enough to use Dr. Google or ChatGPT effectively. However, searching for basic information such as “what is multiple myeloma” or “how is myeloma staged” can be helpful. There is a lot of information to wade through, but NDMM patients can learn.

Unfortunately, multiple myeloma is a rare, incurable blood cancer that comes with a language all its own. Searching for some of the jargon may help the NDMM patient understand more about their blood cancer.

But asking Dr. Google “how long will I live with myeloma” or similar questions can only lead to general statistics that may or may not apply to you.

Email me at David.PeopleBeatingCancer@gmail.com to learn more about both conventional and non-conventional MM therapies.

David Emerson

  • MM Survivor
  • MM Cancer Coach
  • Director PeopleBeatingCancer

ChatGPT in Your Clinic: Who’s the Expert Now?

Patients arriving at appointments with researched information is not new, but artificial intelligence (AI) tools such as ChatGPT are changing the dynamics.

Their confident presentation can leave physicians feeling that their expertise is challenged. Kumara Raja Sundar, MD, a family medicine physician at Kaiser Permanente Burien Medical Center in Burien, Washington, highlighted this trend in a recent article published in JAMA.

A patient visited Sundar’s clinic reporting dizziness and described her symptoms with unusual precision: “It’s not vertigo, but more like a presyncope feeling.” She then suggested that the tilt table test might be useful for diagnosis.

Occasionally, patient questions reveal subtle familiarity with medical jargon. This may indicate that they either have relevant training or have studied the subject extensively.

Curious, Sundar asked if she worked in the healthcare sector. She replied that she had consulted ChatGPT, which recommended the tilt table test.

For years, patients have brought newspaper clippings, internet research, and advice from friends and relatives to consultations.

Suggestions shared in WhatsApp groups have become a regular part of clinical discussions. Sundar noted that this particular encounter was different.

The patient’s tone and level of detail conveyed competence, and the confidence with which she presented the information subtly challenged his clinical judgment and treatment plans.

Clinical Practice

It is not surprising that large language models (LLMs), such as ChatGPT, are appealing. Recent studies have confirmed their remarkable strengths in logical reasoning and interpersonal communication.

However, a direct comparison between LLMs and physicians is unfair. Clinicians often face immense pressure, including constrained consultation times, overflowing inboxes, and a healthcare system that demands productivity and efficiency.

Even skilled professionals struggle to perform optimally under adverse conditions.

In contrast, generative AI is functionally limitless. This imbalance creates an unrealistic benchmark; however, this is today’s reality.

Patients want clear answers; more importantly, they want to feel heard, understood, and reassured. “Unfortunately, under the weight of competing demands, that is what often slips for me, not just because of systemic constraints but also because I am merely human,” Sundar wrote.

Despite the capabilities of generative AI, patients still visit doctors. Though these tools deliver confidently worded suggestions, they inevitably conclude: “Consult a healthcare professional.”

The ultimate responsibility for liability, diagnostics, prescriptions, and sick notes remains with physicians.

Patient Interaction

In practice, this means dealing with requests, such as a tilt table test for intermittent vertigo, a procedure that is not uncommon but often inappropriate.

“I find myself explaining concepts such as overdiagnosis, false positives, or other risks of unnecessary testing. At best, the patient understands the ideas, which may not resonate when one is experiencing symptoms. At worst, I sound dismissive. There is no function that tells ChatGPT that clinicians lack routine access to tilt-table testing or that echocardiogram appointments are delayed because of staff shortages. I have to carry those constraints into the examination room while still trying to preserve trust,” Sundar emphasized in his article.

Sundar reported a new form of paternalism in his own thinking during his conversations with medical students. While he once thought they were probably using search engines or medical portals for diagnosis, his inner monologue has now shifted.

The new, equally dismissive thought is, “You must have asked ChatGPT and are now going to tell me what I should order.”

This reaction often reflects defensiveness from clinicians rather than genuine engagement, and it carries an implicit message: we still know best. “It is an attitude that risks eroding sacred and fragile trust between clinicians and patients. It reinforces the feeling that we are not ‘in it’ with our patients and are truly gatekeeping rather than partnering. Ironically, that is often why I hear patients turn to LLMs in the first place,” Sundar concluded.

Patient Advocacy

One patient said plainly, “This is how I can advocate for myself better.” The word “advocate” struck Sundar, capturing the effort required to persuade someone with more authority. Although clinicians still control access to tests, referrals, and treatment plans, the term conveys a sense of preparing for a fight.

When patients feel unheard, gathering knowledge becomes a strategy to be taken seriously.

In such situations, the usual approach of explaining false-positive test results, overdiagnosis, and test characteristics is often ineffective. From the patient’s perspective, this sounds more like, “I still know more than you, no matter what tool you used, and I’m going to overwhelm you with things you don’t understand.”

Physician Role

The role of physicians is constantly evolving. The transition from physician-as-authority to physician-as-advisor is intensifying. Patients increasingly present with expectations shaped by non-evidence-based sources, often misaligned with the clinical reality. As Sundar observed, “They arm themselves with knowledge to be heard.” This necessitates a professional duty to respond with understanding rather than resistance.

His approach centers on emotional acknowledgment before clinical discussion: “I say, ‘We’ll discuss diagnostic options together. But first, I want to express my condolences. I can hardly imagine how you feel. I want to tackle this with you and develop a plan.’” He emphasized, “This acknowledgment was the real door opener.”

Global Trend

What began as a US trend observed by Sundar has now spread worldwide, with patients increasingly arriving at consultations armed with medical knowledge from tools like ChatGPT rather than just “Dr. Google.”

Clinicians across health systems have reported that digitally informed patients now comprise the majority.

In a forum discussion, physicians from various disciplines shared their experiences, highlighting how previously informed patients are now the norm. Inquiries often focus on specific laboratory values, particularly vitamin D or hormone tests. In gynecologic consultations, Internet research on menstrual disorders has become a routine part of patient interactions, with an overwhelming range of answers available online.

‘Chanice,’ a Coliquio user who’s a gynecologist, shared, “The answers range from ‘It’s normal; it can happen’ to ‘You won’t live long.’ It’s also common to Google medication side effects, and usually, women end up experiencing pretty much every side effect, even though they didn’t have them before.”

How should doctors respond to this trend? Opinions are clear: openness, education, and transparency are essential and ideally delivered in a structured manner.

“Get the patients on board; educate them. In writing! Each and every one of them. Once it’s put into words, it’s no longer a job. Invest time in educating patients to correct misleading promises made by health insurance companies and politicians,” commented another user, Jörg Christian Nast, a specialist in gynecology and obstetrics.

The presence of digitally informed patients is increasingly seen not only as a challenge but also as an opportunity. Conversations with these patients can be constructive, but they can also generate unrealistic demands or heated debates.

Thus, a professional, calm, and explanatory approach remains crucial, and at times, a dose of humor can help. Another user who specializes in internal medicine added, “The term ‘online consultation’ takes on a whole new meaning.”

The full forum discussion, “The Most Frequently Asked ‘Dr. Google’ Questions,” can be found here.

Find out what young physicians think about AI and the evolving doctor-patient relationship in our interview with Christian Becker, MD, MHBA, University Medical Center Göttingen, Göttingen, Germany, and a spokesperson for the Young German Society for Internal Medicine.
