More than a million ChatGPT users express suicidal tendencies to the AI assistant, according to data released by its creator, the company OpenAI.
The US artificial intelligence (AI) company estimates that around 0.15% of ChatGPT users send messages suggesting “possible suicidal planning or intent”. Since OpenAI says it has 800 million weekly users, that translates to roughly 1.2 million people.
The company also estimates that around 0.7% of ChatGPT’s weekly active users (almost 600,000 people) show signs of mental health episodes related to psychosis or mania.
The issue has come to the fore following the death of a California teenager, Adam Raine. His parents recently filed a lawsuit against OpenAI, alleging that ChatGPT provided him with specific advice on how to kill himself.
Since then, OpenAI has strengthened parental controls and other safeguards, including urging users in crisis to seek help from experts via emergency hotlines.
It also announced that it has updated its model to better recognize and respond to users experiencing mental health issues. To that end, it says it is working with more than 170 mental health professionals to significantly limit responses that could encourage harmful behavior.
Source: Skai