51% found chatbots easy for mental health talks; 49% said so for doctors, 37% psychologists
Teenagers pose for a picture while looking at their phones, in Bonn, Germany, February 20, 2026. PHOTO: REUTERS
Nearly one in two young people in Europe have used AI chatbots to discuss intimate or personal matters, as the technology increasingly serves as a source of emotional support, an Ipsos BVA survey showed on Tuesday.
Of the 3,800 people surveyed, 51% said it was “easy” to discuss mental health and personal issues with a chatbot. Only 49% said the same about healthcare professionals and 37% about psychologists.
People close to them were at the top of the list, with 68% saying it was easy to discuss issues with friends and 61% with parents.
The survey, commissioned by France’s privacy watchdog CNIL and insurer Groupe VYV, was carried out among people aged 11 to 25 across France, Germany, Sweden and Ireland in early 2026.
The findings underscored growing concerns over young people’s mental health. About 28% of respondents met the threshold for suspected generalized anxiety disorder, the survey found.
Around 90% of those surveyed had used artificial intelligence tools before, with many citing their constant availability and non-judgmental nature. More than three in five users described AI as a “life adviser” or a “confidant”.
However, concerns over the psychological impact of AI tools have also grown over the past year, and experts have warned about the limitations of AI in detecting human emotions and safely providing emotional support.
Earlier this year, the family of a Florida man sued Google, alleging its Gemini AI chatbot contributed to his paranoia and eventual suicide.
The results of the survey were not a surprise, said Ludwig Franke Föyen, a psychologist and digital health researcher at Stockholm’s Karolinska Institutet.
Current large language models can produce high-quality responses, Franke Föyen told Reuters, adding that his research suggested even licensed professionals may struggle to distinguish AI-generated advice from that of human experts.
But he warned against relying on chatbots alone for mental health support, saying general-purpose AI systems were designed for engagement and companies’ goals may not align with mental healthcare needs.
“AI can offer information and support, but it should not replace human relationships or professional care,” Franke Föyen said. “If someone turns to a chatbot instead of speaking to a parent, a friend, or a mental health professional, that is a concern. We do not want technology to make people feel more alone.”