Lessons from end-users: how young people select mobile mental health applications

This blog post discusses findings from a peer-reviewed article titled “What criteria are young people using to select mobile mental health applications? A nominal group study”, published in Digital Health (2022, paper here).

Mental health apps: more accessible mental health support

According to the World Health Organization, as many as one in seven children and teenagers aged 10-19 experience mental health disorders (1). Despite the prevalence of mental health issues in this group, access to professional diagnosis and treatment remains low. As a result, children and teenagers are turning to smartphone applications to support their mental wellbeing. When young people search for apps to support their mental health in general or to address a specific problem, such as anxiety, they are faced with an overwhelming number of options.

How to pick a mental health app?

The smartphone applications available on platforms such as Google Play or the App Store are rated on general user experience or satisfaction, usually on a 1-to-5-star scale. This system results in apps being suggested based on popularity rather than on their content or their effectiveness in addressing mental health concerns. While some app-delivered mental health interventions are developed based on evidence, most apps on the market are not supported by science. Additionally, there is no regulatory oversight to prevent apps from promoting potentially harmful interventions, making false claims, or mishandling user data (2,3). Based on a search from 2016, only 2.6% of apps make effectiveness claims that are supported in any way (4). The high number of available apps, combined with popularity-based rankings alone, makes it difficult to choose apps that are safe and effective.

Young people’s criteria for selecting mental health apps

Since selecting mental health support apps is challenging, the Neuroscience Engagement and Smart Tech (NEST) lab at Neuroethics Canada, in collaboration with Foundry BC, set out to develop a tool that would make it easier to select an app best suited to a user's circumstances. A tool that is helpful to young people has to align with their needs and priorities. Thus, we conducted a series of nominal group meetings to identify the criteria that are important to young people when they select mental health apps. The infographic below summarizes the criteria that emerged in discussions with 47 young people aged 15-25 in four towns in British Columbia, Canada. These criteria will inform the development of an app-selection tool that will combine end-user priorities with expert input.

The future of mental health support

As mental health apps continue to grow in popularity, so do the diversity and complexity of the features they offer. For example, some mobile applications offer access to healthcare professionals via video or chat, but may also use AI chatbots to provide help or counselling. As we uncovered in the nominal groups, young people want apps to provide links to community services available in their area and to allow users to share the information the apps collect with their healthcare team. It is therefore critically important to identify the priorities of end-users to guide the ethical use of this innovative form of mental health support.

References

  1. Adolescent mental health [Internet]. World Health Organization. [cited 2022 Dec 16]. Available from: https://www.who.int/news-room/fact-sheets/detail/adolescent-mental-health
  2. Anthes E. Mental health: There’s an app for that. Nature News. 2016 Apr 7;532(7597):20.
  3. Robillard JM, Feng TL, Sporn AB, Lai JA, Lo C, Ta M, et al. Availability, readability, and content of privacy policies and terms of agreements of mental health apps. Internet Interventions. 2019 Sep 1;17:100243.
  4. Larsen ME, Nicholas J, Christensen H. Quantifying App Store Dynamics: Longitudinal Tracking of Mental Health Apps. JMIR mHealth and uHealth. 2016 Aug 9;4(3):e6020.


Can social robots improve children’s mental health? What we know and what we still need to learn

This blog post discusses some of the key findings from the article “Socially Assistive Robots as Mental Health Interventions for Children: A Scoping Review”, published in the International Journal of Social Robotics (2020, paper here).


What is a social robot?

Social robots are small robotic devices capable of social interactions such as cooperation, instruction and play. They can be shaped like animals (e.g., Aibo, the robotic dog; Fig. 1(a)) or human-like characters (e.g., the humanoid robot Nao; Fig. 1(b)). Among other functions, social robots can play a therapeutic role (1), serve as companions (2) and aid in education (3,4). One application area of social robotics is therapy for children with autism spectrum disorder. In this domain, robotic companions have the potential to improve a variety of behavioural outcomes, including social and language skills (5). Social robots are also used with older populations: robots like Paro (Fig. 1(c)), a baby harp seal lookalike, are deployed in elder care settings to help reduce loneliness and agitation among residents (6).

Fig. 1 Commonly used social robots Aibo (a)(7), Nao (b)(8), Paro (c)(9).

A new mental health intervention for children

Since social robots seem to have a positive impact on mental health in different populations, there is growing interest in using them as a tool to promote and improve mental health among children. As a result, a number of studies are being conducted to test social robots in this relatively new domain. In the Neuroscience Engagement and Smart Tech (NEST) Lab, we collected and analyzed the existing research studies that investigate the use of social robots to improve children's mental health (10), to get a fuller view of what interventions are being tested and how.

What we know: Feasibility and short-term effects

Using social robots to benefit children's mental health is a new and rapidly developing field. Hence, most currently available studies are exploratory, intended to establish what is possible and what could be effective in the future. The studies usually involve a single session with the robot, which captures only the short-term outcomes of the interaction. While this evidence does not allow for strong or long-term conclusions, the studies in our sample demonstrate that various robotic interventions are feasible: social robots can be introduced and deployed in therapy, clinical and other settings. Perhaps most crucial for determining whether robotic companions could be successful is that children participating in the studies usually responded positively to the robots and were engaged in the interaction, for example when a robot was used as a distraction during vaccination (11). This positive reception makes developments in the use of social robots promising.

What we still need to learn: Effectiveness and social impact

To draw conclusions about the effectiveness of robotic interventions, we need more evidence. Future research in this field needs to systematically address well-focused questions around specific outcomes (e.g., stress reduction). Additionally, the potential social impacts of the robots should be more carefully considered. Robots are intended to be introduced into different environments as social entities. For example, a robot present at a hospital to distract children during medical procedures will likely affect others around the child, such as parents and nurses. Moving forward, we need to learn more not only about which specific social robot interventions can be helpful, but also about how introducing social robots into new environments will affect social dynamics.

What about ethics?

Conducting child-robot interaction research comes with unique ethical concerns. In our scoping review of the literature, we found that the majority of studies in the sample provide only generalized statements about the assent process used (10). Transparency about how the robot is introduced and described to young participants is crucial, as children of different ages may hold different beliefs about the animacy of robots. Other notable ethical considerations include attachment and deception. For example, children could experience distress when the robot is taken away or mistreated (12). The key to proactively addressing these ethical issues could be using participatory approaches throughout the research process. Working together with children and parents will help minimize the risks and maximize the benefits of future social robot mental health interventions.

Acknowledgements to the leaders of this work, Dr. Julie Robillard and Dr. Tony Prescott.

References:

  1. Howard AM. Robots learn to play: robots emerging role in pediatric therapy. FLAIRS Conference. 2013 May; Available from: https://smartech.gatech.edu/handle/1853/49760.
  2. Abdi J, Al-Hindawi A, Ng T, Vizcaychipi MP. Scoping review on the use of socially assistive robot technology in elderly care. BMJ Open. 2018 Feb 1;8(2):e018815.
  3. Looije R, Neerincx MA, Peters JK, Henkemans OAB. Integrating Robot Support Functions into Varied Activities at Returning Hospital Visits. Int J of Soc Robotics. 2016 Aug 1;8(4):483–97.
  4. Ros R, Oleari E, Pozzi C, Sacchitelli F, Baranzini D, Bagherzadhalimi A, et al. A Motivational Approach to Support Healthy Habits in Long-term Child–Robot Interaction. Int J of Soc Robotics. 2016 Nov 1;8(5):599–617.
  5. Pennisi P, Tonacci A, Tartarisco G, Billeci L, Ruta L, Gangemi S, et al. Autism and social robotics: A systematic review. Autism Research. 2016;9(2):165–83.
  6. Pu L, Moyle W, Jones C, Todorovic M. The Effectiveness of Social Robots for Older Adults: A Systematic Review and Meta-Analysis of Randomized Controlled Studies. Gerontologist. 2019;59(1):e37–51.
  7. Entertainment Robot “aibo” Announced. Sony Group Portal – Sony Global Headquarters. [cited 2021 Jun 29]. Available from: http://www.sony.com/en/SonyInfo/News/Press/201711/17-105E/index.html
  8. Nao – ROBOTS: Your Guide to the World of Robotics. [cited 2021 Jun 29]. Available from: https://robots.ieee.org/robots/nao//
  9. Purchasing PARO seal. [cited 2021 Jun 29]. Available from: https://www.paroseal.co.uk/purchase
  10. Kabacińska K, Prescott TJ, Robillard JM. Socially Assistive Robots as Mental Health Interventions for Children: A Scoping Review. Int J of Soc Robotics. 2020 Jul 27;10.1007/s12369-020-00679-0.
  11. Beran TN, Ramirez-Serrano A, Vanderkooi OG, Kuhn S. Reducing children’s pain and distress towards flu vaccinations: A novel and effective application of humanoid robotics. Vaccine. 2013 Jun 7;31(25):2772–7.
  12. Kahn Jr. PH, Kanda T, Ishiguro H, Freier NG, Severson RL, Gill BT, et al. “Robovie, you’ll have to go into the closet now”: Children’s social and moral relationships with a humanoid robot. Developmental Psychology. 2012;48(2):303–14.