In this issue paper, URAC explores why health care organizations must clearly define the use of AI, establish responsibility for its outcomes and implement strong oversight when deploying these technologies. The paper highlights where AI is improving efficiency in health care, where errors and risks can emerge in patient care and why transparency about AI use matters. It also examines when organizations should disclose AI use to patients and what governance structures are needed to manage these risks.
One theme runs throughout: trust in health care AI is not established at design or deployment. It is earned over time through responsible oversight and accountable decision-making when patient care is on the line.
Artificial intelligence (AI), including machine learning and large language models, is rapidly reshaping health care. These technologies can help clinicians assess risk, inform clinical decision-making and help organizations allocate resources and structure workloads. Increasingly, they influence decisions that affect patient care.
But when AI is used in health care, accountability matters.
To better understand how organizations are approaching these challenges, URAC spoke with experts working at the forefront of health care AI.
Fill in the form to download the Issue Paper: Health Care AI: Accountability In Practice
The Experts

Demetri Giannikopoulos is Chief Innovation Officer at Rad AI, a health care AI company. He has contributed to national initiatives focused on AI governance and patient safety, including providing invited testimony to the United States Senate on the role of artificial intelligence in health care.
He is currently serving the final year of a five-year term as a patient representative on the American College of Radiology Commission on Patient- and Family-Centered Care, where he helped advance initiatives including the Scanxiety Toolkit to improve communication and support for patients undergoing imaging. He holds equity in Aidoc, Inc., a privately held health care AI company.
Shakira J. Grant, MD, is a board-certified physician executive working at the intersection of clinical care, health policy and artificial intelligence. As founder of CROSS Global Research & Strategy, she advises health care and life sciences organizations on operationalizing AI governance and responsible adoption in real-world clinical and clinical trial workflows, with particular emphasis on equity, oversight, AI literacy, patient safety and trustworthy performance.
With nearly two decades of clinical experience, Dr. Grant previously served as a health policy advisor to the U.S. House Committee on Ways and Means and as an Assistant Professor and principal investigator at the University of North Carolina at Chapel Hill. Her background spans patient care, policy and innovation, informing her work to align AI design, evaluation and deployment with clinical practice and improved outcomes, particularly for historically marginalized communities.
Dr. Jenn Richards, PhD, JD, PharmD, is the Senior Director of Product Management at URAC, where she oversees overall program management and the development and revision of URAC’s standards. She also oversees the volunteer standing advisory groups that assist URAC in defining best practices and quality standards.
Dr. Richards has a background in retail, prescription benefits management, specialty, compounding and hospital pharmacy. She has led numerous quality improvement projects, reshaping organizational structures to deliver better total patient care. Dr. Richards holds a Doctor of Pharmacy, a Juris Doctor and a PhD in Psychology with an emphasis in industrial and organizational psychology.

