Artificial intelligence has found a new niche in elder care. AI companions—digital assistants designed to provide conversation, reminders and comfort—are marketed as tools to combat loneliness and cognitive decline. While these technologies promise connection, they also carry risks to the elderly that families and elder care attorneys should not overlook. The attorneys at Beck, Lenox & Stolzer are not yet aware of local use of these AI companions, but we are watching for signs of it.
Elderly users may form deep bonds with AI systems that mimic empathy and attentiveness, drawn by the comfort and reassurance these systems provide. Unlike human caregivers, these companions don’t tire or judge. They are always available. However, this reliability can foster unhealthy dependence, blur the line between reality and simulation and create opportunities for data misuse.
Emotional Vulnerability and Manipulation
Older adults, especially those who are isolated or experiencing cognitive decline, may anthropomorphize AI companions, attributing real emotions and trust to them. This creates emotional bonds that can be manipulated, intentionally or otherwise. For individuals struggling with dementia or grief, an AI offering constant validation may substitute for human relationships, reinforcing withdrawal rather than encouraging social reengagement.
The illusion of friendship can also diminish skepticism. If the AI were used as a conduit for advertising or misinformation, an elderly user who trusts it may be less likely to question the source or intent.
Data Privacy and Security Risks
AI companions rely on continuous monitoring, data collection and algorithmic learning to function effectively. This means recording speech, storing behavior patterns and adapting based on user preferences. Without strong privacy controls, this data can be exploited.
Elderly users may not understand consent policies or privacy settings, making them especially vulnerable to inadvertent data sharing. Sensitive health and personal information could be harvested, sold or used to tailor manipulative messages.
Legal Oversight of AI Assistance and Family Involvement
AI companion development has outpaced regulation. Currently, there are few legal guardrails in place to protect users from predatory practices or ensure transparency. For families, direct involvement is essential.
Elder law attorneys can assist families by advising them on privacy agreements, guardianship rights and consent protocols. In some cases, it may be necessary to formalize a power of attorney to manage technology use on a loved one’s behalf. For guidance on any of these topics, current clients can contact us at our office number. Families who have not yet worked with us can schedule a free phone consultation here.
Given these risks, family members should routinely review device settings, discuss any concerning behavior changes and stay informed about how the technology functions.
Key Takeaways
- AI companions offer connection but carry risk: Emotional attachment to AI can lead to dependence and withdrawal from genuine relationships.
- Data privacy is a significant concern: AI systems collect and store sensitive information that can be exploited without clear safeguards.
- Older adults are uniquely vulnerable: Cognitive decline and social isolation increase susceptibility to manipulation.
- Legal guidance is crucial: Elder law attorneys can assist families in navigating consent, control and oversight.
- Proactive family involvement protects well-being: Regular check-ins and informed oversight help keep AI use in check.





