ChatGPT accurately answers questions about common cancer myths, Huntsman study shows

Dr. Skyler Johnson worked on a study with the Huntsman Cancer Institute to see whether ChatGPT returned accurate answers for common myths and misconceptions about cancer. It scored much better than social media.

(Spenser Heaps, Deseret News)


Estimated read time: 4-5 minutes

This archived news story is available only for your personal, non-commercial use. Information in the story may be outdated or superseded by additional information. Reading or replaying the story in its archived form does not constitute a republication of the story.

Editor's note: This is part of a series looking at the rise of artificial intelligence technology tools such as ChatGPT, the opportunities and risks they pose and what impacts they could have on various aspects of our daily lives.

SALT LAKE CITY — ChatGPT is not always great at providing accurate answers, but there's at least one realm in which it far surpasses social media: common cancer myths.

Dr. Skyler Johnson worked on a study with the Huntsman Cancer Institute days after the artificial intelligence chatbot was made available to see whether ChatGPT has accurate answers for common myths and misconceptions about cancer.

Johnson said the questions they fact-checked through the chatbot came from a list of commonly asked questions created by the National Cancer Institute. ChatGPT performed well: 97% of its answers matched those from the National Cancer Institute.

On social media, about one-third of all articles or sources contain misinformation, according to Johnson.

He said patients' decisions, such as refusing a prescribed, tested treatment in favor of something they read online or heard about from a friend, can sometimes lead to poor health outcomes. In a 2018 study, Johnson found that patients who choose unproven treatments have worse odds of survival: nearly a sixfold increase in the risk of death.

For a while, it has been clear that social media is a source for much of this misinformation about cancer treatment, Johnson said, adding that "the vast majority of those contain the potential to hurt cancer patients."

He said the amount of misinformation is scary, and it is not uncommon to see patients make decisions to go with an alternative treatment and then come back later with cancer that has spread further.

"That's always disheartening, and I lose a lot of sleep over those situations," Johnson said.

That track record makes it clear why Johnson was also interested in studying ChatGPT's accuracy. After seeing the results of this most recent study, he and other researchers noted that the answers the chatbot came up with, while accurate, were vaguer than the answers provided by the National Cancer Institute.

That had them concerned about whether patients could interpret the answers as being less definitive than they are. Because of this, Johnson said they would not recommend using ChatGPT as a resource.

He also said it's likely ChatGPT would not be as accurate when asked about less-common cancer myths.

"I do think that we have to continually monitor this new information environment that includes these AI chatbots … because there's a potential risk that it starts producing misinformation at some point," Johnson said.

He said things could change, too. The study was completed when ChatGPT was only a few days old, and there have since been multiple updates.

"I have concerns, where things are evolving so quickly in this space, that although it looks accurate right now, it may not be accurate in the future," Johnson said.

He said cancer patients look for alternative treatments because they want to have control, have autonomy over their medical care, actively participate in their care, and because of fear of side effects from treatment.

"There's no guarantees in cancer care, and some people want certainty. … They will often choose false certainties in the face of known uncertainties," Johnson said.

He said doctors can work to improve trust to help with this, spending more time with patients and communicating better. He said if doctors establish common goals with their patients, often to cure the cancer and reduce pain, then they can build trust.

Johnson splits his time between research and caring for patients directly. He said research allows for population-based changes, but as a physician he makes positive changes for individuals.

He encouraged patients to talk with their physicians about their questions and go to well-established websites for information, like the website for the Huntsman Cancer Institute or the National Cancer Institute.

"I think, a lot of times, patients have some fear that they might be judged by their physician or that their questions might be stupid, but that's rarely the case. Most physicians are very interested to know what patients are thinking about, and they want to help patients make the best decisions possible," he said.

Johnson said he is optimistic that AI could provide a way for cancer communication experts, doctors and organizations to help answer patients' questions accurately.

Emily Ashcraft is a reporter for KSL.com. She covers issues in state courts, health and religion. In her spare time, Emily enjoys crafting, cycling and raising chickens.