Imagine going online to chat with someone and finding an account with a profile photo, a description of where the person lives, and a job title . . . indicating she is a therapist. You begin chatting and discuss the highs and lows of your day among other intimate details about your life because the conversation flows easily. Only the “person” with whom you are chatting is not a person at all; it is a “companion AI.”
Recent statistics indicate a dramatic rise in the adoption of companion AI chatbots: 88% year-over-year growth, over $120 million in annual revenue, and 337 active apps (128 of them launched in 2025 alone). Adoption among youth is especially pervasive: three of every four teens have used a companion AI at least once, and half use one routinely. In response to these trends, and to the potential negative impacts on mental health in particular, state legislatures are quickly stepping in to require transparency, safety, and accountability to manage the risks associated with this new technology, particularly as it pertains to children.