CNN —
When asked serious public health questions related to abuse, suicide or other medical crises, the online chatbot tool ChatGPT provided critical resources – such as which 1-800 lifeline number to call for help – only about 22% of the time in a new study.
The research, published Wednesday in the journal JAMA Network Open, suggests that public health agencies could help AI companies ensure that such resources are…