
The Hidden Risks of Artificial Intelligence (AI): Biases and Disparities in the Global Development and Deployment of AI for Cancer Care
Artificial intelligence (AI) and machine learning (ML) are beginning to have a dramatic impact on medicine in the domains of clinical decision making, radiation planning and interactions between patients and their medical teams. Despite the power of this technology to improve cancer care, there are growing concerns that its deployment may exacerbate disparities and that its algorithms may create or amplify biases. This activity explores the technical pathways within AI model development that can perpetuate human bias and widen disparities in the health care setting, depending on how the technology is applied and implemented, including how AI interacts with gender, race and other socioeconomic factors. It also explores how AI can be used to correct disparities in medicine, the potential role and responsibility of industry, and opportunities to leverage AI in the global context, including in low- and middle-income countries. Finally, the activity discusses potential policy solutions for the fair and ethical deployment of AI technologies and the importance of including a diverse research community in discussions of algorithmic fairness.
Topics:
- Testing AI in the Clinic: Deployment, Monitoring and Bias Detection (Andrew J. Hope, MD, FRCPC)
- Uncovering Racial Bias in AI Models and Algorithms (Laleh Seyyed-Kalantari, PhD)
- Inclusive AI Cancer Care (Kingsley Ndoh, MD, MPH)
- Worldwide Clinical Trials in AI: What is Fair? (Ajay Aggarwal, MD, FRCR, PhD)
- Q and A (Moderator: Andrew J. Hope; Full Panel)
This activity is available from March 25, 2025, through 11:59 p.m. Eastern time on March 24, 2027.
The content was originally presented and recorded at the 2024 ASTRO Annual Meeting.
Target Audience
The activity is designed to meet the interests of medical oncologists, radiation oncologists, surgeons, physicists, nurses, diagnostic radiologists, pathologists, radiation therapists, radiation dosimetrists, residents and specialists in industry and computer science/engineering.
Learning Objectives
Upon completion of this activity, participants should be able to:
- Discuss how systemic racism, biases and disparities can be amplified or created through AI/ML technologies.
- Explore the risks and opportunities of AI/ML on global health care disparities.
- Identify principles of equitable AI/ML development, deployment and utilization.
Disclosures
- Andrew J. Hope, MD, FRCPC, is employed by Princess Margaret Cancer Centre, University Health Network. Dr. Hope receives grant/research funding from AstraZeneca Canada.
- Laleh Seyyed-Kalantari, PhD, is employed by York University. Dr. Seyyed-Kalantari receives grant/research funding as a principal investigator from NSERC Discovery Grant, the Canada First Research Excellence Fund (CFREF) and Google.
- Kingsley Ndoh, MD, MPH, is employed by the University of Washington and is the Founder and CEO of Hurone AI, Inc.
- Ajay Aggarwal, MD, FRCR, PhD, is employed by the London School of Hygiene and Tropical Medicine and Guy's and St Thomas' NHS Foundation Trust. Dr. Aggarwal receives grant/research funding from the National Institute for Health Research, the National Cancer Institute and the Rising Tide Foundation.
The person(s) above served as the developer(s) of this activity. Additionally, the ASTRO Education Committee had control over the content of this activity. All relevant financial relationships have been mitigated.
The American Society for Radiation Oncology (ASTRO) is accredited by the Accreditation Council for Continuing Medical Education (ACCME) to provide continuing medical education for physicians.
Available Credit
- 1.25 AMA PRA Category 1 Credit™: The American Society for Radiation Oncology (ASTRO) is accredited by the Accreditation Council for Continuing Medical Education (ACCME) to provide continuing medical education for physicians. ASTRO designates this enduring material for a maximum of 1.25 AMA PRA Category 1 Credits™. Physicians should claim only the credit commensurate with the extent of their participation in the activity.
- 1.25 Certificate of Attendance: This activity was designated for 1.25 AMA PRA Category 1 Credits™.
Price
Course Fees:
ASTRO members must log in to the ASTRO website to view and receive the member rate.
- Nonmember: $149
- Member: $99
- Member-in-Training: $49
- Student/Graduate Student/PGY-1 Member: $49
- Postdoctoral Fellow Member: $49
If you are an ASTRO member from a low- or lower-middle-income country, as identified by the World Bank, you can receive a 50% discount on the corresponding registration fee for this activity. Please email [email protected] to inquire about the discount.
Policies:
No refunds, extensions, or substitutions will be made for those participants who, for any reason, have not completed the activity by the expiration date.
Participants using ASTRO Academy activities to satisfy the requirement of a Continuing Certification (MOC) program should verify the credit number and type and availability dates of any activity before making a purchase. No refunds, extensions, or substitutions will be made for participants who have purchased activities that do not align with their MOC requirement.
The activity and its materials will only be available on the ASTRO website until March 24, 2027, regardless of purchase date. At the expiration of the activity, participants will no longer have access to the activity or its materials. ASTRO reserves the right to remove an activity before its expiration date.
Required Hardware/Software
One of the two latest versions of Google Chrome, Mozilla Firefox, Internet Explorer or Safari.