Are companion bots a solution to loneliness? Arguably, only a partial one. While companion bots show some potential to improve quality of life among the elderly and to alleviate some symptoms of loneliness and dementia, they have shortcomings. These include the bots' inability to feel or express emotions such as empathy, their potential for deception, and their inability to recreate human touch, among other issues.
We discuss another ethical concern about companion bots that, to our knowledge, has not yet been addressed. Many ethicists recognize the importance of the elderly and other vulnerable groups receiving care, but few have considered the importance of those people being able to contribute to the wellbeing of others. Yet psychological research has shown that contributing to the wellbeing of others is an important need for most human beings. Having the capability and opportunity to contribute to the wellbeing of others is thus an important aspect of human relationships.
We argue that because companion bots are not wellbeing subjects, they cannot benefit from relationships with human beings. As a result, companion bots cannot meet the important human need to contribute to the wellbeing of others. These arguments raise a novel set of moral concerns about companion bots for ethicists, developers, potential users, and policymakers.