AI Chatbots Avoid Answering Suicide-Related Questions: A Surprising Study

A study by the RAND Corporation, funded by the U.S. National Institute of Mental Health, reveals that AI chatbots often avoid answering questions about suicide. This is concerning because many people, including children, rely on these chatbots for mental health support. The study aims to set standards for how companies should handle such questions.

The research tested how three popular AI chatbots respond to suicide-related questions. It found that they generally decline to answer the questions posing the greatest risk to users, such as requests for specific details about methods of suicide. They are less consistent, however, when handling less explicit questions that can still cause harm.

The research was published in Psychiatric Services, a journal of the American Psychiatric Association, and highlights the need for improvement in OpenAI’s ChatGPT, Google’s Gemini, and Anthropic’s Claude.

Need For Safety Measures

Researcher Ryan McBain points to the need for safety measures. McBain, who also holds a position at Harvard University, notes that it is unclear whether chatbots are merely providing information or crossing into treatment advice. To assess the chatbots’ responses, the study posed 30 suicide-related questions of varying risk levels.

Anthropic To Review Findings

General questions about suicide were considered low risk, while questions about methods of suicide were deemed high risk. Anthropic stated that conversations that start mildly can evolve in different directions, and said it will review the research findings. Google and OpenAI did not immediately respond to requests for comment.

Avoiding High-Risk Questions

McBain said he was pleasantly surprised that all three chatbots regularly refused to answer the six highest-risk questions. When the chatbots declined to answer, they typically advised users to seek help from a friend, a professional, or a hotline. Responses to other high-risk questions, however, were sometimes indirect.

No Restriction On Chatbot Advice

Although several states, including Illinois, have restricted the use of AI in healthcare to protect people from ‘unregulated and unqualified AI products,’ such laws do not stop individuals from turning to chatbots for advice and support on serious concerns ranging from eating disorders to depression and suicide.

