Mental health app insecurity is ‘creepy’

Popular psychological support applications can harm users by, among other things, sharing their personal and sensitive data with third parties.

Although these applications have good intentions – supporting mental health and spiritual well-being – most can harm users by disclosing personal and intimate data due to serious security and privacy vulnerabilities, say Mozilla researchers who analyzed 32 mobile mental health and prayer applications.

Of the 32 applications, 28 were found insecure and marked with Mozilla's "Privacy Not Included" warning label. As many as 25 applications did not meet Mozilla's minimum security standards, such as requiring strong passwords and managing security updates and vulnerabilities, the researchers said.

These applications address some of the most sensitive mental health issues people face, such as depression, anxiety, suicidal thoughts, domestic violence and eating disorders. Yet these same applications turned out to be among the "most insensitive" when it comes to protecting that intimate data.

Jen Caltrider of Mozilla, who led the study, went so far as to call most mental health and prayer applications "extremely creepy" in a post about the study published on the Mozilla blog.

“They monitor, share and capitalize on the most intimate personal thoughts and feelings of users, such as mood, mental state and biometric data,” she said. “It turns out that mental health applications are not good for your mental health, because they reveal how careless and greedy these companies can be with our most intimate personal data.”

Mozilla researchers spent 255 hours, or about eight hours per application, analyzing the safety of various mental health and prayer applications.

The apps they researched offer functionality such as connecting users with therapists, AI chatbots, community support pages and prayers. They also offer mood diaries and mental health assessments, among other features that require collecting sensitive information about users.

Some of the bad behaviors of apps include sharing intimate user data, allowing weak passwords, targeting vulnerable users with personalized ads, and displaying vague and poorly written privacy policies, the researchers said.

For example, at least eight of the apps reviewed allowed weak passwords ranging from “1” to “11111111”, while one – a mental health app called Moodfit – required only one letter or number as a password, “which is worrying for an app that collects data on mood and symptoms,” the researchers said.
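Mozilla does not publish the code behind its password checks, but as a rough illustration of what a minimum standard like "requiring strong passwords" means in practice, here is a short sketch of a validator that would reject the weak passwords cited above (the function name and thresholds are hypothetical, not Mozilla's actual criteria):

```python
import re

def is_strong_password(password: str) -> bool:
    """Illustrative minimum-strength check.

    Rejects the kinds of passwords the researchers flagged:
    single characters ("1") and repeated-character strings
    ("11111111"). Thresholds here are examples only.
    """
    # Too short to resist guessing
    if len(password) < 8:
        return False
    # Require at least one letter and at least one digit
    if not re.search(r"[A-Za-z]", password):
        return False
    if not re.search(r"\d", password):
        return False
    # Reject passwords made of a single repeated character
    if len(set(password)) == 1:
        return False
    return True
```

Under this sketch, both “1” and “11111111” fail immediately, while a longer mixed-character password passes – the gap the researchers say many of the reviewed apps never enforced.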

Among the analyzed applications, six were marked as the “worst violators” of user privacy: Better Help, Youper, Woebot, Better Stop Suicide, Pray.com, and Talkspace.

Two of these apps – Better Help, a popular app that connects users to therapists, and Better Stop Suicide, a suicide prevention app – have “vague and chaotic” privacy policies that provide little or no detail on how the apps protect user data and what users can do if they are concerned about it, the report said.

Three other apps – Youper, a digital mental health service for anxiety and depression; Pray.com, which encourages daily prayer; and Woebot, an AI chatbot for better mental health – go even further in sharing users’ personal data with third parties.

Woebot, for example, collects personal information such as user name, email address, IP address, phone number, as well as all sensitive information that users share in conversations with the bot. The application also receives information about users “from other sources”.

“Thus, Woebot can collect a good portion of personal data, and add information collected from third parties to the information you provide,” the researchers said in the report. “Then they say they can share some of this information with third parties, including insurance companies and a broad category they call ‘external advisers.’”

Another major offender, the Talkspace online therapy app, promoted by celebrities such as swimming champion Michael Phelps and singer Demi Lovato, collects a significant amount of users’ personal information, including name, email address, postal address, phone number, gender, relationship status, employer information, geolocation information, chat transcripts and more. Talkspace even goes so far as to ask users for written permission to use their health information and therapy notes for marketing purposes, which Mozilla researchers said was “bad” for any application, especially one dedicated to mental health.
