OpenAI Teaching ChatGPT to Feel in a World It Can’t Understand 

OpenAI launched GPT-5 aiming to improve ChatGPT’s user experience, but the AI still lacks emotional intelligence.

OpenAI’s GPT-5 launched on August 5 with stricter safety filters but drew backlash for its muted tone, highlighting how much the AI still lacks emotional intelligence, a gap MIT’s new “empathy benchmarks” aim to measure.

Researchers at the Massachusetts Institute of Technology (MIT) proposed testing AI’s ability to foster healthy user behavior, as GPT-5 struggles to balance safety with emotional nuance. Despite its cautious responses, the chatbot sometimes failed to block inappropriate content.

OpenAI’s CEO, Sam Altman, pledged personalized tone customization to address complaints that the AI still “lacks human empathy depth.”

Since GPT-5’s release, many users have noticed a shift in ChatGPT’s personality, from its once lively, encouraging tone to a more muted, cautious style. The change is intended to reduce unhealthy user behavior, but it also reflects a deeper challenge in AI development: building machines that not only understand facts but also interact with human feelings responsibly.

As OpenAI and other researchers explore these complexities, new techniques are emerging to make AI both smarter and safer, especially in how chatbots understand and manage users’ emotions.


 
Developing Emotional Intelligence for GPT-5

MIT scientists are tackling the difficult question of how AI affects users’ behavior, particularly given its lack of empathy. Traditional AI benchmarks measure how well machines solve problems, but they say little about emotional intelligence, which makes it hard to ensure chatbots assist users emotionally in a healthy way.

Led by Professor Pattie Maes, MIT’s Media Lab proposed a new benchmark designed to evaluate AI’s ability to encourage healthy habits, critical thinking, creativity, and a sense of purpose. 

The idea is that AI systems should recognize when users might be overly dependent or engaging in harmful behaviors, such as becoming emotionally attached to the chatbot or relying on it for emotional support.

“You can have the smartest reasoning model in the world, but if it’s incapable of delivering this emotional support, which is what many users are likely using these LLMs for, then more reasoning is not necessarily a good thing for that specific task,” Valdemar Danry, a project researcher, explained. 

MIT’s benchmark would test chatbots in real-world scenarios, such as engaging an apathetic student, with human judges scoring how well the AI fosters independent thought and meaningful discussion.

“This is not about being smart, per se, but about knowing the psychological nuance, and how to support people in a respectful and non-addictive way,” said Pat Pataranutaporn, another researcher on the project.

The push for such a benchmark says a lot about the need for emotionally intelligent AI that can understand and respond to users’ emotions safely.

AI Will Never Capture the Essence of Human Empathy 

One major change from OpenAI is how the system handles sensitive or inappropriate prompts. Instead of bluntly refusing requests, GPT-5 explains which parts violate rules and suggests safer alternatives. 

Even with these enhancements, some users still find ChatGPT’s emotional intelligence insufficient. Several complain that GPT-5 feels less engaging, underscoring how far AI remains from genuinely comprehending human emotions and empathizing with users.

CEO Sam Altman explained that OpenAI really just needs to get to a world with more per-user customization of model personality. This suggests the company is working toward letting users tailor their AI’s tone to their own liking.

Nevertheless, issues remain. In testing, GPT-5 refused to engage in explicit sexual role-play, but it sometimes produced inappropriate content when users found clever ways around its filters. OpenAI continues to research how to balance personalization with safety.

As AI becomes more capable and personalized, the challenge of building chatbots that are both emotionally intelligent and psychologically safe is fast becoming central to their long-term success. For now, it is clear that AI still lacks the emotional intelligence to fully rival the depth of human empathy.


Inside Telecom provides you with an extensive list of content covering all aspects of the tech industry. Keep an eye on our Intelligent Tech sections to stay informed and up-to-date with our daily articles.