Health care professionals may be using ChatGPT to support clinical work (e.g., patient triage, clinical decision support, medication management, guidelines), to develop policies and procedures, or to support medical education. However, this poses risks to patient safety and care quality.
ChatGPT is an artificial intelligence (AI) chatbot launched in November 2022. It is a natural language processing tool that generates responses based on the information it was trained on: openly available internet content up to about late 2021, which is not necessarily current or authoritative. The developers have acknowledged that it sometimes writes plausible-sounding but incorrect answers and that it should not be relied upon for anything important at this time. ChatGPT and other AI language tools are a rapidly emerging and evolving technology, and how they will transform patient care is still unfolding.
Relying on ChatGPT to answer clinical questions, to develop policies and procedures, or for medical education may result in harm to patients. ChatGPT may provide information that is outdated, inaccurate, or biased while appearing accurate and current. For example, when asked a clinical question, it may fabricate scholarly citations or claim that studies exist to support an evidence-based medical decision; these fabricated citations can appear authoritative even though the articles do not exist in the literature. Acting on such unverified information could lead to serious safety events.
Teammates should not rely on ChatGPT to answer clinical questions when evidence-based information is needed. Continue to use proven medical information resources provided by the medical library, such as peer-reviewed medical journals, or skills and decision support tools such as Dynamic Health or UpToDate. Teammates who need help finding medical evidence to support their clinical decision-making can consult our librarians at 414-389-5870 or firstname.lastname@example.org. If information is needed to support medical education, consult with medical education leaders.