Much has been made of how people turn to AI chatbots for emotional support, sometimes even forming relationships with them, and it would be easy to assume such behavior is commonplace.
A new report from Anthropic, the company behind the popular AI chatbot Claude, reveals a different reality: people rarely seek out companionship from Claude, and turn to the bot for emotional support and personal advice only a small fraction of the time.
“Companionship and roleplay combined comprise less than 0.5% of conversations,” the company emphasized in its report.
Anthropic says the study sought to unearth insights into the use of AI for “affective conversations,” which it defines as personal exchanges in which people talk to Claude for coaching, counseling, companionship, roleplay, or relationship advice. After analyzing 4.5 million conversations from users on the Claude Free and Pro tiers, the company found that the vast majority of Claude usage is related to work and productivity, with people mostly using the chatbot for content creation.

That said, Anthropic found that people do turn to Claude more often for interpersonal advice, coaching, and counseling. Most frequently, users seek advice on improving their mental health, on personal and professional development, and on studying communication and interpersonal skills.
However, the company notes that help-seeking conversations can sometimes shade into companionship-seeking when users are facing emotional or personal distress, such as existential dread or loneliness, or when they find it hard to make meaningful connections in their real lives.
“We also noticed that in longer conversations, counseling or coaching conversations occasionally morph into companionship, despite that not being the original reason someone reached out,” the company wrote.
Anthropic highlighted other insights as well, such as the fact that Claude itself rarely resists users’ requests. The company also said that conversations tend to become more positive over time when people seek coaching or advice from the bot.
The report is certainly interesting, and it does a good job of reminding us once again how often AI tools are used for purposes beyond work. Still, it’s important to remember that AI chatbots, across the board, are very much a work in progress: they are known to hallucinate, to readily provide false information and dangerous advice, and, as Anthropic itself has acknowledged, even to resort to blackmail.