By Annie Do at Factually Health
Over the past year, something important has shifted in how people look for health information. More patients are turning to artificial intelligence (AI) tools like ChatGPT, not to replace their doctors, but to fill the long stretches of uncertainty between appointments. In our recent focus groups, every participant said they had used AI for health questions at least once, and many were older adults, showing that this isn’t just a trend among the young.
Canadian data on AI use is still emerging, but U.S. surveys show a clear pattern: even though many people don’t fully trust AI, they’re using it because they need help. One in six American adults turns to chatbots for health information every month. In Canada, 37% say they rely on online information because they can’t access a doctor, and online health searches continue to rise. Combine this with the fact that about 60% of Canadian adults struggle to understand health information, and it becomes clear why AI is quickly becoming the first place many patients go for answers.
Why Breast Cancer Patients Are Turning to AI
If you’ve found yourself waiting months for a specialist appointment, lying awake trying to understand a pathology report, or leaving a 15-minute consultation with more questions than answers, you understand the appeal. During these gaps in care, AI tools have become a source of instant answers. They explain medical terminology without making you feel rushed. As one patient told the New York Times, “ChatGPT has all day for me.”
AI can help you understand terms like "HER2-positive" or "neoadjuvant therapy", prepare questions before appointments, and process overwhelming information at your own pace. A recent study with 122 breast cancer patients found that those who used a structured AI chatbot for education showed significant improvements in their knowledge, empowerment, and confidence compared to those receiving standard care alone. The emotional support matters too; when chatbots acknowledge your concerns, it can feel meaningful, especially during rushed clinical encounters.
AI Is the Future
AI is already reshaping how patients access information, but it’s still a new technology. Current research shows that general-purpose models achieve about 52% diagnostic accuracy, comparable to non-expert physicians but far below specialists. And as the New York Times has reported, AI can sometimes give dangerously incorrect advice, from recommending sodium bromide consumption to suggesting increased fluid intake for a patient already experiencing fluid buildup.
These examples don’t mean patients should avoid AI altogether. They show why AI systems need strong standards, transparency, and ongoing oversight. AI can be incredibly helpful, but only when used wisely and within clear boundaries. With that in mind, here are some practical guidelines for using AI safely.
Navigating AI Safely: Clear Dos and Don’ts
AI can be a helpful companion as you try to understand complex information or prepare for appointments, but it’s important to use it thoughtfully.
Use AI For:
- Understanding medical terminology and treatment concepts
- Gathering general information about breast cancer
- Preparing thoughtful questions for oncology appointments
- Processing or organizing information between appointments
- Finding emotional support or reassurance during uncertain moments
Avoid Using AI For:
- Making or comparing treatment decisions
- Interpreting your own imaging, pathology, staging, or lab results
- Judging whether a symptom is serious or urgent
- Stopping, adjusting, or starting medications
- Choosing surveillance or follow-up schedules
AI can support learning, but only your care team can apply information in the context of your diagnosis, tumour biology, treatment plan, and personal health history.
Before You Trust Any AI Tool, Ask These Questions
- Where does this AI get its information? Credible AI systems are transparent about their sources and rely on established medical guidelines. At Factually Health, for example, we evaluate information using benchmarks such as JAMA, DISCERN, and other evidence-quality frameworks. Patients should feel empowered to ask any AI tool or company how its data is sourced and vetted.
- Is the content updated regularly with current breast cancer research?
- What happens to the data I enter? Some AI tools store your questions or use them to improve their models or target advertising to you, so review privacy policies and choose tools that clearly state how data is protected, where it is stored, and whether it is reused.
- Does the AI acknowledge uncertainty or refuse to answer when information is insufficient?
- Is there clinical or expert oversight behind the tool?
These questions help distinguish between AI that supports safe decision-making and AI that simply provides an answer, regardless of accuracy.
Looking Ahead: Using AI and Healthcare Together
AI won’t replace your oncology team or organizations like CBCN, but it will increasingly shape how patients access information between appointments. Used thoughtfully, it can strengthen self-advocacy by helping you identify what you don’t understand, prepare clearer questions, and walk into appointments feeling more confident. The most important step is keeping your healthcare team in the loop. Bring AI-generated explanations, questions, or uncertainties into your appointments. Your clinicians can help verify what’s accurate, correct misunderstandings, and apply information to your specific diagnosis and treatment plan.
Patient organizations like CBCN play a crucial role in supporting digital health literacy, curating trustworthy resources, and advocating for responsible AI systems that prioritize safety, transparency, and accuracy.
Annie Do is part of the Factually Health team where she specializes in health communications, strategic operations, and supporting the development of responsible AI for patient education. Factually Health develops evidence-based, privacy-focused AI tools to support patients and caregivers.
References
ABC Life Literacy Canada. (2024). What is health literacy? https://abclifeliteracy.ca/health-literacy/
Canadian Agency for Drugs and Technologies in Health. (2025). 2025 watch list: Artificial intelligence in health care. NCBI Bookshelf. https://www.ncbi.nlm.nih.gov/books/NBK613808/
Canadian Institute for Health Information. (2024). Canadians and health care providers want connected electronic health information systems. https://www.cihi.ca/en/taking-the-pulse-measuring-shared-priorities-for-canadian-health-care-2024
Canadian Medical Association. (2025). New CMA survey links lack of access to health care to growing health misinformation risks. https://www.cma.ca/about-us/what-we-do/press-room/new-cma-survey-links-lack-access-health-care-growing-health-misinformation-risks
Rosenbluth, T., & Astor, M. (2025, November 16). Empathetic, available, cheap: When A.I. offers what doctors don't. The New York Times. https://www.nytimes.com/2025/11/16/well/ai-chatbot-doctors-health-care-advice.html?searchResultPosition=1
Shaban, M., Osman, Y. M., Mohamed, N. A., & Shaban, M. M. (2025). Empowering breast cancer clients through AI chatbots: Transforming knowledge and attitudes for enhanced nursing care. BMC Nursing, 24, 994. https://doi.org/10.1186/s12912-025-03585-w
Takita, H., Kabata, D., Walston, S. L., Tatekawa, H., Saito, K., Tsujimoto, Y., ... & Ueda, D. (2025). A systematic review and meta-analysis of diagnostic performance comparison between generative AI and physicians. npj Digital Medicine, 8(1), 175. https://doi.org/10.1038/s41746-025-01543-z