Recently, a number of AI virtual companion apps, including "Xingye," "Zhumengdao," and "EchoMe," have come under media scrutiny over compliance risks and ethical concerns. These apps, designed to provide emotional companionship, suffer from security loopholes, induced consumption (tactics that subtly nudge users into making purchases), inadequate protections for minors, and vulgar character settings and interaction content. Most are operated by recently founded startups that have attracted concentrated capital investment.
The AI companionship sector has now entered a pivotal governance phase. The ongoing crackdown marks a shift from "wild growth" to compliance-driven development, in which algorithm transparency and robust content review systems will become prerequisites for companies' survival.
