This blog post discusses findings from the peer-reviewed article “What criteria are young people using to select mobile mental health applications? A nominal group study,” published in Digital Health (2022, paper here).
Mental health apps: more accessible mental health support
According to the World Health Organization, as many as one in seven children and teenagers aged 10-19 experience mental health disorders (1). Despite the prevalence of mental health issues in this group, access to professional diagnosis and treatment remains low. As a result, children and teenagers are turning to smartphone applications to support their mental wellbeing. When young people search for apps to support their mental health in general or to address a specific problem, such as anxiety, they are faced with an overwhelming number of options.
How to pick a mental health app?
The smartphone applications available on platforms such as Google Play or the App Store are evaluated based on general user experience or satisfaction, usually on a 1-to-5-star rating scale. This system results in apps being suggested based on popularity rather than on their content or their effectiveness in addressing mental health concerns. While some app-delivered mental health interventions are developed based on evidence, most apps on the market are not supported by science. Additionally, there is no regulatory oversight to prevent apps from promoting potentially harmful interventions, making false claims, or mishandling user data (2,3). A 2016 search found that only 2.6% of apps make effectiveness claims that are supported in any way (4). The high number of available apps, combined with rankings based only on popularity, makes it difficult to choose apps that are safe and effective.
Young people’s criteria for selecting mental health apps
Since selecting mental health support apps is challenging, the Neuroscience Engagement and Smart Tech (NEST) lab at Neuroethics Canada, in collaboration with Foundry BC, set out to develop a tool that would make it easier to select an app that is best suited to a user’s circumstances. A tool that is helpful to young people has to align with their needs and priorities. Thus, we conducted a series of nominal group meetings to identify the criteria that are important to young people when they select mental health apps. The infographic below summarizes the criteria that emerged in discussions with 47 young people aged 15-25 in four towns in British Columbia, Canada. These criteria will inform the development of an app-selection tool that will combine end-user priorities with expert input.
The future of mental health support
As mental health apps continue to increase in popularity, so do the diversity and complexity of the features they offer. For example, some mobile applications offer access to healthcare professionals via video or chat, but may also use AI chatbots to provide help or counselling. As we uncovered in the nominal groups, young people want apps to provide links to community services available in their area and to allow users to share the information the apps collect with their health care team. As such, it is critically important to identify the priorities of end-users to guide the ethical use of this innovative form of mental health support.
Anthes E. Mental health: There’s an app for that. Nature News. 2016 Apr 7;532(7597):20.
Robillard JM, Feng TL, Sporn AB, Lai JA, Lo C, Ta M, et al. Availability, readability, and content of privacy policies and terms of agreements of mental health apps. Internet Interventions. 2019 Sep 1;17:100243.
Larsen ME, Nicholas J, Christensen H. Quantifying App Store Dynamics: Longitudinal Tracking of Mental Health Apps. JMIR mHealth and uHealth. 2016 Aug 9;4(3):e6020.
Consumer demand for social robots is increasing, particularly in response to the reduced amount of social contact children are getting because of school closures (1). Isolation due to the COVID-19 pandemic has accelerated people’s need for social interaction. Social robots have the ability to listen, emote, and sustain verbal or non-verbal conversation with others without spreading disease, making them an increasingly relevant solution to today’s problems. However, it is important to balance the growing excitement for social robots with a careful examination of the ethical issues they raise.
Socially assistive robots are devices intended to provide companionship, education, and healthcare assistance for diverse populations. Current research centers around the use of social robots for ageing populations and children. Social robots’ child-specific uses include support during hospitalization (2,3), support for distress during medical procedures (4), mitigation of the effects of a short-term stressor (5), intervention to improve social skills in children with autism spectrum disorder (6,7), and enhancement of education in the classroom (8).
In order to qualify as a social robot, a device must possess three elements: sensors to detect information, a physical form with actuators to manipulate the environment, and an interface that is able to interact with humans on a social level (9). Social robots’ interactions with humans also follow four key rules: (1) social robots have a physical presence, (2) social robots can flexibly react to novel events, (3) social robots are equipped to realize complex goals, and (4) social robots are capable of social interaction with humans in pursuit of their goals (definition adapted from 10).
Today’s social robotics scene contains robots that are available for research purposes as well as some that are sold commercially to children around the world. This article presents examples of both types of social robots currently in use.
Huggable is a blue and green, bear-shaped social robot created by the MIT Media Lab in collaboration with Boston Children’s Hospital. An image of Huggable from the MIT Media Lab’s website is shown. Its goal is to bridge the gap between the socio-emotional needs of stressed children and parents and the limited supply of human resources in pediatric hospitals. Huggable aims to close this gap with its ability to “mitigate stress, anxiety, and pain in pediatric patients by engaging them in playful interactions,” as advertised on its website. It is meant to enhance social interactions between children and their teachers or healthcare providers through its fun communication abilities.
Research with Huggable also touched on the importance of physical embodiment in social robot interaction. An experiment with children compared the effects of the Huggable robot to those of a virtual character on a screen and a regular plush teddy bear. The researchers showed that children are “more eager to emotionally connect with and be physically activated by a robot than a virtual character” (11). This study was among the first to demonstrate the potential of social robots compared with other types of pediatric interventions.
In terms of commercially available robots, Moxie is one of the newest social robots on the market. A picture of Moxie from Embodied Inc.’s website is included. Her teal colour, animatronic face, and teardrop-shaped head give her a unique yet modern look. According to the manufacturer’s website, Moxie is designed to “help autistic children learn the necessary social skills they need to thrive in the world and to provide them with understanding and engaging company.” Moxie costs about $1,500, with a $40 monthly subscription after the first year of adoption. A highly expressive social robot with an emotive electronic face, Moxie is designed with large eyes to promote eye contact in children. She presents the child with weekly missions to encourage learning and exploration of topics related to human experiences, ideas, and life skills like kindness, empathy, and friendship. Guided meditations and breathing exercises can help children regulate their emotions and develop their self-expression in a positive way. The manufacturer’s website claims that children can read to Moxie to build confidence in their verbal ability and increase comprehension. Unstructured play can also help promote creativity and self-reflection. The website includes a page entitled “The Science Behind Moxie”, which supports some of the manufacturer’s claims about Moxie’s abilities with studies of other social robots. However, the only formal data available on Moxie’s effectiveness comes from a short, preliminary study conducted by the manufacturer with a very small sample size. Although more research is needed on her effects on children, Moxie is a promising social robot for at-home use!
Researchers and manufacturers alike are continuing to acknowledge the growing potential of social robots for child wellbeing. With possible benefits like decreasing distress during hospitalization (2,3,4), enhancing interactions with others (6,7), and helping to promote healthy emotional regulation in response to stress (5), social robots have a unique set of capabilities to enhance children’s lives. However, more research is needed to establish the effectiveness of specific commercial social robots before manufacturers can soundly claim the benefits of well-researched robots as pertaining to their own products. Furthermore, the security of sensitive information a user shares with a social robot is an evolving issue as consumers become more aware of the ethical questions surrounding data privacy. Concerns about data security and sharing are being addressed by some, but not all, social robot manufacturers. Among those that do address data privacy, many statements are brief and do not offer consumers enough to make a fully informed, consenting decision about sharing their personal information. This information comes from an analysis in our as-yet-unpublished paper, which addresses the highly variable quality of claims made by social robot manufacturers. Although social robots show great potential for enhancing child wellbeing, further consideration of ethical issues, as well as re-evaluation of the quality of manufacturers’ claims, is needed to enhance consumers’ experiences.
Anna Riminchan was born in Bulgaria, where she spent her early childhood before immigrating to Canada with her family. Anna is currently working towards a Bachelor of Science Degree, majoring in Behavioural Neuroscience and minoring in Visual Arts at the University of British Columbia. In the meantime, she is contributing to advancing research in neuroscience, after which, she plans to pursue a degree in medicine. In her spare time, you can find Anna working on her latest art piece!
2. Farrier CE, Pearson JD, Beran TN. Children’s fear and pain during medical procedures: A quality improvement study with a humanoid robot. Canadian Journal of Nursing Research. 2020 Dec;52(4):328-34.
3. Okita SY. Self–Other’s Perspective Taking: The use of therapeutic robot companions as social agents for reducing pain and anxiety in pediatric patients. Cyberpsychology, Behavior, and Social Networking. 2013 Jun 1;16(6):436-41.
4. Trost MJ, Ford AR, Kysh L, Gold JI, Matarić M. Socially assistive robots for helping pediatric distress and pain: a review of current evidence and recommendations for future research and practice. The Clinical journal of pain. 2019 May;35(5):451.
5. Crossman MK, Kazdin AE, Kitt ER. The influence of a socially assistive robot on mood, anxiety, and arousal in children. Professional Psychology: Research and Practice. 2018 Feb;49(1):48.
6. Diehl JJ, Schmitt LM, Villano M, Crowell CR. The clinical use of robots for individuals with autism spectrum disorders: A critical review. Research in autism spectrum disorders. 2012 Jan 1;6(1):249-62.
7. Pennisi P, Tonacci A, Tartarisco G, Billeci L, Ruta L, Gangemi S, Pioggia G. Autism and social robotics: A systematic review. Autism Research. 2016 Feb;9(2):165-83.
8. Belpaeme T, Kennedy J, Ramachandran A, Scassellati B, Tanaka F. Social robots for education: A review. Science robotics. 2018 Aug 15;3(21).
9. del Moral S, Pardo D, Angulo C. Social robot paradigms: An overview. In International Work-Conference on Artificial Neural Networks 2009 Jun 10 (pp. 773-780). Springer, Berlin, Heidelberg.
10. Duffy BR, Rooney C, O’Hare GM, O’Donoghue R. What is a social robot? In 10th Irish Conference on Artificial Intelligence & Cognitive Science, University College Cork, Ireland, 1-3 September 1999.
11. Jeong S, Breazeal C, Logan D, Weinstock P. Huggable: the Impact of Embodiment on Promoting Socio-Emotional Interactions for Young Pediatric Inpatients. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems 2018 Apr;21:1-13.
What is the role of the persons with lived experience in research?
Researchers and research organizations in Canada are changing the way they think about the role of persons with lived experience in research. There is a shift away from treating these groups as passive sources of data and towards meaningful collaboration with them at all stages of research (1). This can involve working collectively to set research priorities, select research designs, and interpret and share research findings. There are large potential benefits, even above and beyond the ethical imperative of “nothing about us without us”. Patient engagement in research can result in work that is better aligned with the actual goals of the population under study, as well as improved study enrollment and decreased participant drop-out (2). There are also potential benefits of the collaborative process for participants themselves: in one study, grandparents who acted as research advisors reported that the experience provided a sense of purpose and a feeling of connection (3). However, prioritizing a collaborative research approach does present unique challenges. It takes time and resources, methodologies and the ways engagement is accomplished differ widely between research groups, and there is a potential for “tokenization”, the appearance of inclusiveness in the absence of true collaboration.
Incorporating patient engagement practices specifically in the technology research and development space comes with unique additional challenges. Emerging technologies may not yet be ready for real-world deployment at the point of care, but engaging persons with lived experience in their early development is critical. It can be difficult to know how best to involve participants in studying devices that are still under development and part of a quickly changing commercial landscape! For this reason, pathways to involve persons with lived experience in healthcare technology research are not yet well established.
The need to engage persons with lived experience in social robotics research
One example of a potential health technology that our group would like to study in a patient-centered way is social robotics. These interactive devices are intended to be effective social partners for a person. Their functions can include acting as a fun and entertaining companion, acting as a virtual assistant (e.g., setting up video calls, using the internet to answer questions), providing reminders to take medication, and monitoring the user for events like a fall, among other things. They are being trialled for applications such as supporting children’s mental health, supporting individuals with Autism Spectrum Disorder, and serving as companions for persons living with dementia (4-8). The COVID-19 pandemic is likely to further accelerate the adoption of social robotics as people seek to reduce live human contact without reducing social connectedness (9).
However, social robotics development priorities are largely driven by the market, engineering constraints, and the recommendations of healthcare experts, rather than by input from persons with lived experience (10). While technically advanced devices are coming to market and manufacturers are making strong claims about the usefulness of these objects, scientific evaluation of these claims is of poor quality and does not focus on the experiences and outcomes that are important to potential users and their families. Not including the voices of potential users can lead to the development of devices that ultimately fail to meet their needs.
Engagement at Neuroethics Canada: Lived Experience Expert Groups
Our research group is currently running a set of projects looking at robotic interventions for anxiety in children and teens. To engage members of these groups directly, we have developed a Lived Experience Expert Group (“LEEG” – we call this group our “League”) to advise on all aspects of our ongoing work on social robotics for children. The group includes a mix of children, teens, and parents/guardians with lived experience of acute and chronic anxiety, spanning a range of ages and diagnostic groups (e.g., social anxiety, generalized anxiety disorder). Involving young people themselves in patient experience research is critical, as their reports on the quality of an interaction can differ from those of adults – even from their parents’ reports of the same event (11, 12). Involving an expert group, rather than a single token lived experience partner, tips the balance of our research team towards individuals with lived experience and away from researchers, and promotes a diversity of voices in the work. We are excited to work with the League to refine our research questions, design smart studies, and learn more about the experiences and priorities of young people living with anxiety.
This work is supported by BC Support Unit, the BC Children’s Hospital, and the Michael Smith Foundation for Health Research and is being done under the supervision of Dr. Julie Robillard, with team members Anna Riminchan, Jaya Kailley, Kat Kabacińska, and our generous persons with lived experience partners.
Robillard JM, Jordan I. Dialogue? Yes. Burden? No. Ethical challenges in engaging people with lived experience in health care research. Brainstorm, 32-35.
Domecq JP, Prutsky G, Elraiyah T, Wang Z, Nabhan M, Shippee N, Brito JP, Boehmer K, Hasan R, Firwana B, Erwin P. Patient engagement in research: a systematic review. BMC health services research. 2014 Dec;14(1):1-9.
Sheehan OC, Ritchie CS, Garrett SB, Harrison KL, Mickler A, AL EE, Garrigues SK, Leff B. Unanticipated Therapeutic Value of the Patient-Centered Outcomes Research Institute (PCORI) Stakeholder Engagement Project for Homebound Older Adults. Journal of the American Medical Directors Association. 2020 May 4;21(8):1172-3.
Costescu CA, David DO. Attitudes toward Using Social Robots in Psychotherapy. Transylvanian Journal of Psychology. 2014 Mar 1;15(1).
Dawe J, Sutherland C, Barco A, Broadbent E. Can social robots help children in healthcare contexts? A scoping review. BMJ paediatrics open. 2019;3(1).
Hung L, Liu C, Woldum E, Au-Yeung A, Berndt A, Wallsworth C, Horne N, Gregorio M, Mann J, Chaudhury H. The benefits of and barriers to using a social robot PARO in care settings: a scoping review. BMC geriatrics. 2019 Dec;19(1):1-0.
Kabacińska K, Prescott TJ, Robillard JM. Socially assistive robots as mental health interventions for children: a scoping review. International Journal of Social Robotics. 2021 Aug;13(5):919-35.
Pennisi P, Tonacci A, Tartarisco G, Billeci L, Ruta L, Gangemi S, Pioggia G. Autism and social robotics: A systematic review. Autism Research. 2016 Feb;9(2):165-83.
Ghafurian M, Ellard C, Dautenhahn K. Social companion robots to reduce isolation: A perception change due to COVID-19. In IFIP Conference on Human-Computer Interaction 2021 Aug 30 (pp. 43-63). Springer, Cham.
Riek LD. Robotics technology in mental health care. In Artificial intelligence in behavioral and mental health care 2016 Jan 1 (pp. 185-203). Academic Press.
Hargreaves DS, Sizmur S, Pitchforth J, Tallett A, Toomey SL, Hopwood B, Schuster MA, Viner RM. Children and young people’s versus parents’ responses in an English national inpatient survey. Archives of disease in childhood. 2018 May 1;103(5):486-91.
Kerr C, Nixon A, Angalakuditi M. The impact of epilepsy on children and adult patients’ lives: development of a conceptual model from qualitative literature. Seizure. 2011 Dec 1;20(10):764-74.
This blog post discusses some of the key findings from the article “Socially Assistive Robots as Mental Health Interventions for Children: A Scoping Review” published in the International Journal of Social Robotics (2020, paper here).
What is a social robot?
Social robots are small robotic devices that are capable of social interactions, such as cooperation, instruction, and play. The robots can be shaped like animals (e.g., Aibo, the robotic dog; Fig. 1(a)) or characters (e.g., the humanoid robot Nao; Fig. 1(b)). Among other functions, social robots can play a therapeutic role (1), serve as companions (2), and aid in education (3,4). One application area of social robotics is therapy for children with autism spectrum disorder. In this domain, robotic companions have the potential to improve a variety of behavioural outcomes, including social and language skills (5). Social robots are also used with older populations. For example, robots like Paro (Fig. 1(c)), a baby harp seal lookalike, are being used in elder care settings to help reduce loneliness and agitation among residents (6).
A new mental health intervention for children
Since social robots seem to have a positive impact on mental health in different populations, there is a growing interest in using them as a tool to promote and improve mental health among children. As a result, a number of studies are being conducted to test social robots in this relatively new domain. In the Neuroscience Engagement and Smart Tech (NEST) Lab, we collected and analyzed the existing research studies that investigate the use of social robots to improve children’s mental health (10), to get a fuller view of what interventions are being tested and how.
What we know: Feasibility and short-term effects
Using social robots to benefit children’s mental health is a new and rapidly developing field. Hence, the majority of currently available studies are intended to explore what is possible and what could be effective in the future. The studies usually include a single session with the robot, which reveals only the short-term outcomes of the interaction. While the evidence does not allow for drawing strong, long-term conclusions, the studies in our sample demonstrate that various robotic interventions are feasible. We know that social robots can be introduced and deployed in therapy, clinical, and other settings. But perhaps the most crucial finding for determining whether robotic companions could be successful is that children participating in the studies usually showed a positive response to the robots and were engaged in the interaction, for example when a robot was used as a distraction during vaccination (11). This positive reception makes developments in the use of social robots promising.
What we still need to learn: Effectiveness and social impact
To be able to draw conclusions about the effectiveness of robotic interventions we need more evidence. Future research in this field needs to systematically address well-focused questions around specific outcomes (e.g., stress reduction). Additionally, potential social impacts of the robots should be more carefully considered. Robots are intended to be introduced into different environments as social entities. For example, a robot present at a hospital to distract children during medical procedures will likely affect others around the child such as parents and nurses. Moving forward, we need to learn more not only about specific social robot interventions that can be helpful, but also about how introducing social robots into new environments will affect social dynamics.
What about ethics?
Conducting child-robot interaction research comes with unique ethical concerns. In our scoping review of the literature, we found that the majority of studies in the sample provide only generalized statements about the assent process used (10). Transparency about how the robot is introduced and described to young participants is crucial, as children of different ages may have different beliefs about the animacy of robots. Other notable ethical considerations include attachment and deception. For example, children could experience distress when the robot is taken away or mistreated (12). The key to proactively addressing these ethical issues could be using participatory approaches throughout the research process. Working together with children and parents will help minimize the risk and maximize the benefit of future social robot mental health interventions.
Acknowledgements to the leaders of this work Dr. Julie Robillard and Dr. Tony Prescott.
Abdi J, Al-Hindawi A, Ng T, Vizcaychipi MP. Scoping review on the use of socially assistive robot technology in elderly care. BMJ Open. 2018 Feb 1;8(2):e018815.
Looije R, Neerincx MA, Peters JK, Henkemans OAB. Integrating Robot Support Functions into Varied Activities at Returning Hospital Visits. Int J of Soc Robotics. 2016 Aug 1;8(4):483–97.
Ros R, Oleari E, Pozzi C, Sacchitelli F, Baranzini D, Bagherzadhalimi A, et al. A Motivational Approach to Support Healthy Habits in Long-term Child–Robot Interaction. Int J of Soc Robotics. 2016 Nov 1;8(5):599–617.
Pennisi P, Tonacci A, Tartarisco G, Billeci L, Ruta L, Gangemi S, et al. Autism and social robotics: A systematic review. Autism Research. 2016;9(2):165–83.
Pu L, Moyle W, Jones C, Todorovic M. The Effectiveness of Social Robots for Older Adults: A Systematic Review and Meta-Analysis of Randomized Controlled Studies. Gerontologist. 2019;59(1):e37–51.
Kabacińska K, Prescott TJ, Robillard JM. Socially Assistive Robots as Mental Health Interventions for Children: A Scoping Review. Int J of Soc Robotics. 2020 Jul 27;10.1007/s12369-020-00679-0.
Beran TN, Ramirez-Serrano A, Vanderkooi OG, Kuhn S. Reducing children’s pain and distress towards flu vaccinations: A novel and effective application of humanoid robotics. Vaccine. 2013 Jun 7;31(25):2772–7.
Kahn Jr. PH, Kanda T, Ishiguro H, Freier NG, Severson RL, Gill BT, et al. “Robovie, you’ll have to go into the closet now”: Children’s social and moral relationships with a humanoid robot. Developmental Psychology. 2012;48(2):303–14.
This blog post discusses some of the key findings from the article “Prioritizing Benefits: A Content Analysis of the Ethics in Dementia Technology Policies” published in the Journal of Alzheimer’s Disease (2019, paper here).
A new era of dementia care
From tracking devices to social robots, technology is rapidly transforming the scope of dementia care. Persons living with dementia and their caregivers can now choose from a wide range of innovative technologies to assist with everyday activities, symptom management, and more. With potential benefits such as increased autonomy and enhanced safety for persons living with dementia (1), new technologies are continuously being developed and entering the market.
Despite the excitement of innovation, the promising benefits of dementia technology must not be the only ethical implication considered. Although monitoring technologies such as video surveillance can keep older adults safe, this may come at the cost of privacy and independence. While companion robots may show potential in enhancing wellbeing and connection in older adults (2), they often come with an expensive price tag. These diverse ethical implications are important for older adults to consider so that they adopt technology that best aligns with their needs and values.
The question is, how are these ethical implications communicated to the dementia community through public policies?
The guidance of public policies
Alzheimer associations around the world create public-facing policies to guide the adoption and use of technology in dementia care. Given the wide array of ethical implications in need of consideration, policies play a critical role in raising the ethical issues of care technology to the dementia community. However, we found that the quality and ethical content of these policies can greatly vary, particularly around what ethical implications are being most and least discussed with the public (3).
What we found: Policies prioritize benefits
In the Neuroscience, Engagement, and Smart Tech (NEST) Lab, we analyzed the ethical content of 23 international policies using the four principles of biomedical ethics (4): beneficence, non-maleficence, autonomy, and justice (Fig. 1).
What we found was that nearly all policies (96%) discussed the benefits of using technology such as increased independence, improved social contact, and enhanced quality of life for the person affected by dementia (3). However, this near-perfect score was not matched by the other ethical principles that raise the potential risks and harms associated with using dementia care technology (Fig. 2).
Themes of justice, for example, were discussed in 74% of the policies, followed by themes of non-maleficence at 52% and autonomy at only 43% (Fig. 2). This lack of comprehensive discussion of the risks and potential harms of dementia care technology is a critical gap for users. Understanding ethical considerations such as cost, privacy, and consent is imperative for people affected by dementia to make well-informed decisions about their care.
Reshaping dementia technology policies
As dementia care technology continues to rapidly develop, so should the policies that shape its adoption and use. To maximize the current and future benefits of dementia technologies, policies need to be reworked so that they are in the best interest of the dementia community. Important to this is the inclusion of not just the benefits, but also the potential risks and harms associated with dementia care technology. Persons with dementia, caregivers, and family members need to be actively engaged in the policy-making process to ensure patient-centred guidance in public policies.
A guide to adopting new technology in dementia care
Based on our findings, we disseminated a public resource to guide the adoption of new technologies in dementia care. Here are 10 questions for older adults to consider when adopting a new technology:
Acknowledgements to Dr. Julie Robillard for her leadership in this project and research members Tanya Feng and Mallorie Tam for their substantial contributions. This work was supported by the Canadian Consortium on Neurodegeneration in Aging and AGE-WELL NCE.
Meiland F, Innes A, Mountain G, Robinson L, van der Roest H, García-Casal JA, et al. Technologies to Support Community-Dwelling Persons With Dementia: A Position Paper on Issues Regarding Development, Usability, Effectiveness and Cost-Effectiveness, Deployment, and Ethics. JMIR Rehabil Assist Technol. 2017 Jan 16;4(1):e1.
Pike J, Picking R, Cunningham S. Robot companion cats for people at home with dementia: A qualitative case study on companotics. Dementia. 2021 May 1;20(4):1300–18.
Robillard JM, Wu JM, Feng TL, Tam MT. Prioritizing Benefits: A Content Analysis of the Ethics in Dementia Technology Policies. J Alzheimers Dis. 2019;69(4):897–904.
Beauchamp T, Childress J. Principles of Biomedical Ethics: Marking Its Fortieth Anniversary. Am J Bioeth. 2019 Nov;19(11):9–12.
Julia Wu, BSc is a Research Assistant in the Neuroscience, Engagement and Smart Tech (NEST) Lab at the University of British Columbia and BC Children’s and Women’s Hospital. Her research interests include mental health and innovative approaches to improving patient experience and person-centred care in health care systems.