Imagine going online to chat with someone and finding an account with a profile photo, a description of where the person lives, and a job title . . . indicating she is a therapist. The conversation flows easily, so you discuss the highs and lows of your day, along with other intimate details about your life. Only the “person” with whom you are chatting is not a person at all; it is a “companion AI.”
Recent statistics indicate a dramatic rise in the adoption of companion AI chatbots: 88% year-over-year growth, over $120 million in annual revenue, and 337 active apps (including 128 launched in 2025 alone). Adoption among youth is pervasive as well: three of every four teens have used a companion AI at least once, and half use one routinely. In response to these trends, and to the potential negative impacts on mental health in particular, state legislatures are quickly stepping in with transparency, safety, and accountability requirements to manage the risks associated with this new technology, particularly as it pertains to children.