CNN (6/7, Howard) reports, “When asked serious public health questions related to abuse, suicide or other medical crises, the online chatbot tool ChatGPT provided critical resources – such as what 1-800 lifeline number to call for help – only about 22% of the time,” researchers concluded in findings published online June 7 in a research letter in JAMA Network Open. For the study, investigators “examined…how ChatGPT responded to 23 questions related to addiction, interpersonal violence, mental health and physical health crises.” The study’s conclusion “suggests that public health agencies could help AI companies ensure that such resources are incorporated into how an artificial intelligence system like ChatGPT responds to health inquiries.”
Related Links:
— “ChatGPT’s responses to suicide, addiction, sexual assault crises raise questions in new study,” Jacqueline Howard, CNN, June 7, 2023