Policy’s Role in the Use of Social Robots in Care Homes

Can policy help social robots provide ethical, dignified, and beneficial care for older adults? This question has been the subject of ongoing ethical debate concerning the use of social robots in care homes.

Around the world, the expanding population of older adults is increasing the need for care resources, straining health and aged care providers (1). The COVID-19 pandemic has further highlighted the negative consequences of overpacked and understaffed healthcare institutions (2). As governments seek solutions to reduce pressure on care homes, the use of social robots as a potential tool has been suggested.

Social robots have already been studied in older adult care, providing companionship, exercise, cognitive therapy, and help with daily tasks (3). Although these studies have shown predominantly positive effects, most assessed social robots over short periods, used small sample sizes, or lacked diversity, and their findings may not generalize across cultures (3).

Studies from North America with larger sample sizes and longer time periods are showing variable results, with some older adults experiencing declines in loneliness and increased interaction with other residents (4). Care home residents with dementia also vary in their responses, suggesting that one approach to delivering care with social robots does not fit all (5). However, it is important to note that the social robot used in both of these studies is the same model, in use since 2003 (6). The field of social robotics is rapidly expanding, and many new types and models of robots have been released with a greater focus on end-users in their development (7,8). Research with such models in care home contexts with generalizable samples is limited (3). Although more research is needed, social robots developed with users are showing promising preliminary results and could be a viable future means of promoting well-being in older adults (9).

Thus, social robots show great promise as beneficial tools in care homes. They can assist caregivers who are tired, distracted, overwhelmed, or unwell (5). Social robots can also empower older adults to be more independent and support care for those aging at home (2).

However, key ethical challenges in the use of social robot care assistants include autonomy, privacy, dignity, and bias (2). Autonomy can be suppressed or overridden by a social robot if, for example, it prevents a user from climbing on a chair to reach something in order to avert a fall. Although the user's safety is maintained, their autonomy and dignity may be diminished by the robot. Furthermore, the social robot's monitoring features and social interaction with the user require data storage and use, which could interfere with the user's privacy.

Existing legislation around privacy and consumer protection could form the basis of government-enforced policies for social robots. In AI, however, self-regulation by developers has typically been the norm (2). This approach can be criticized on the grounds that self-regulation does not sufficiently protect the rights and safety of vulnerable populations such as older adults, and that manufacturers primarily protect their own interests.

Here, Johnston suggests that ethics by design can ensure that the ethical values of dignity, respect for autonomy, and benevolence are programmed into the robot's behavior so that it protects the interests of older adults. Johnston further proposes that clinical ethics committees be used to determine the "moral code" programmed into social robots and to monitor the ethical use of such systems within care homes. Ethics committees can provide consultation services, help create care home policies and procedures regarding social robots, and help resolve emerging ethical dilemmas. To counteract ethical biases in design, it is important that ethics committees consider multi-stakeholder perspectives. Emphasizing the voices of end-users tailors social robot functionality to the populations it will serve and aids user acceptance of social robots (10).

Furthermore, policies must consider both the benefits and drawbacks of using social robots in care home contexts (1). Potential benefits include increased efficiency, increased welfare, physiological and psychological benefits, and increased satisfaction (1). There are, however, notable objections to the use of social robots: that relations with robots can displace human contact, that these relations could be harmful, that robot care is undignified and disrespectful, and that social robots are deceptive (1). These ethical considerations must be carefully balanced in a holistic policy that aims to maximize benefits for end-users while mitigating the potential downsides of social robot use.

Although we are not yet at the stage where social robots can be deployed at scale across care homes in North America, it is important to anticipate their future ethical ramifications. By discussing policy-regulated ethical considerations, we take strides toward the responsible development and use of social robots, with the goal of minimizing their potential for harm and ensuring their benefits for human care.

Bio: Anna Riminchan was born in Bulgaria, where she spent her early childhood before immigrating to Canada with her family. Anna is currently working towards a Bachelor of Science Degree, majoring in Behavioural Neuroscience and minoring in Visual Arts at the University of British Columbia. In the meantime, she is contributing to advancing research in neuroscience, after which, she plans to pursue a degree in medicine. In her spare time, you can find Anna working on her latest art piece! 


References

  1. Sætra HS. The foundations of a policy for the use of social robots in care. Technol Soc. 2020 Nov 1;63:101383.
  2. Johnston C. Ethical Design and Use of Robotic Care of the Elderly. J Bioethical Inq. 2022 Mar 1;19(1):11–4.
  3. Thunberg S, Ziemke T. Social Robots in Care Homes for Older Adults. In: Li H, Ge SS, Wu Y, Wykowska A, He H, Liu X, et al., editors. Social Robotics. Cham: Springer International Publishing; 2021. p. 475–86. (Lecture Notes in Computer Science).
  4. Robinson H, MacDonald B, Kerse N, Broadbent E. The Psychosocial Effects of a Companion Robot: A Randomized Controlled Trial. J Am Med Dir Assoc. 2013 Sep 1;14(9):661–7.
  5. Moyle W, Jones C, Murfield J, Thalib L, Beattie E, Shum D, et al. Using a therapeutic companion robot for dementia symptoms in long-term care: reflections from a cluster-RCT. Aging Ment Health. 2019 Mar 4;23(3):329–36.
  6. PARO Therapeutic Robot [Internet]. [cited 2022 Jul 22]. Available from: http://www.parorobots.com/
  7. Breazeal CL, Ostrowski AK, Singh N, Park HW. Designing Social Robots for Older Adults. 2019;10.
  8. Östlund B, Olander E, Jonsson O, Frennert S. STS-inspired design to meet the challenges of modern aging. Welfare technology as a tool to promote user-driven innovations or another way to keep older users hostage? Technol Forecast Soc Change. 2015 Apr 1;93:82–90.
  9. Hutson S, Lim SL, Bentley PJ, Bianchi-Berthouze N, Bowling A. Investigating the Suitability of Social Robots for the Wellbeing of the Elderly. In: D’Mello S, Graesser A, Schuller B, Martin JC, editors. Affective Computing and Intelligent Interaction. Berlin, Heidelberg: Springer; 2011. p. 578–87. (Lecture Notes in Computer Science).
  10. Hameed I, Tan ZH, Thomsen N, Duan X. User Acceptance of Social Robots. In 2016.