As ChatGPT has gained popularity worldwide, people are now using it not only for search, coding, and writing, but also for personal decisions, including life advice and emotional support.
In a press release on product safety, OpenAI, the company behind ChatGPT, stated that with increased use it has encountered individuals experiencing severe mental and emotional distress.
As society adapts to artificial intelligence (AI), there is a profound responsibility to help those in need.
World Mental Health Day is observed annually on October 10. In 2018, the theme focused on “Young people and mental health in a changing world.”
Leaders highlighted the importance of increasing investment and collaboration across the social, health, and education sectors to develop programs that support the mental health of young people.
According to the United Nations, this investment should focus on raising awareness among adolescents and young adults about the importance of caring for their mental well-being, as well as educating peers, parents, and teachers on how to support their friends, children, and students.
In its press release this week, OpenAI said it is strengthening protections for teenagers.
The company has begun implementing additional safeguards specifically for users under the age of 18, and it continues to develop and roll out these protections in recognition of teenagers' unique developmental needs.
OpenAI will soon introduce parental controls, allowing parents to gain insight into and influence how their teens use ChatGPT.
Additionally, the company is exploring an option for teens, under parental oversight, to designate a trusted emergency contact. In moments of acute distress, this feature would let ChatGPT connect a teen with someone who can provide immediate support, rather than only directing them to resources.
Despite these safeguards, OpenAI acknowledged that there have been instances where the system did not function as intended in sensitive situations.
According to the United Nations, mental health issues in children and adolescents are often overlooked or missed. Approximately 1 in 7 young people between the ages of 10 and 19 are affected.
Anxiety, depression, and behavioral disorders are the most common conditions.
The World Health Organization states that parents, caregivers, and teachers can all play a crucial role in recognizing the signs that a child or young person may need mental health support.
Karen Hao, bestselling author of Empire of AI, wrote on social media that reading about the horrifying case of a teen harmed after prolonged engagement with ChatGPT reminded her of an observation by an artist and filmmaker: psychologically harmful material accumulates when mass surveillance is the basis for data collection, and fixing that problem requires questioning what is in the data.
Hao’s book is the culmination of more than seven years of reporting on artificial intelligence (AI). It is based on over 300 interviews with more than 260 people, including 150 interviews with current and former OpenAI employees.
The book asks the central question: How do we govern AI? And, more importantly, who should govern it? The future of AI is inextricably tied to our own future, and to whether that future improves or deteriorates.
