FTC focused on protecting younger users from AI
By Dario Betti, CEO, Mobile Ecosystem Forum (MEF)
September saw the US Federal Trade Commission (FTC) open an inquiry into AI-powered chatbots designed to act as companions. These tools, which use generative AI to simulate conversational intimacy, are increasingly promoted as friends, coaches, or confidants. The concern is that these systems may have too much influence on vulnerable users, particularly children and teens. The Commission has issued information requests to seven major companies: Alphabet, Character Technologies, Instagram, Meta Platforms, OpenAI, Snap, and X.AI. The inquiry is not an enforcement action, but rather a fact-finding study focused on industry practices.
Concerns
Essentially, the Commission wants to understand how these companies design, monitor, and monetise their products, and what safeguards they have implemented to protect younger users. Regulators also want clarity on whether the companies disclose risks to users and parents, how they restrict or manage access by minors, and to what extent they comply with the Children’s Online Privacy Protection Act.
Another focus is monetisation. By asking companies to explain how they profit from user engagement, the Commission is signalling concern that business incentives may be driving longer and potentially more manipulative interactions with young audiences.
Advocacy groups concerned with children’s rights and online safety have largely welcomed the inquiry. Many see companion chatbots as posing greater risks than traditional social media, since they invite more personal disclosures and can simulate an emotional bond that may not be in the user’s interest.
At the same time, some industry voices are worried that regulatory pressure might slow an area of development with potential social value. Companion chatbots are being explored not only for entertainment but also for education, eldercare, and mental health support. From their perspective, the risk is that heavy regulation could discourage new entrants and leave only the largest firms with the resources to comply. Market analysts, however, suggest that regulation could ultimately increase trust in AI services, improving adoption where strong protections are demonstrated.
The broader mobile ecosystem
The inquiry carries implications that extend beyond the companies directly named. Companion AIs are distributed through the same infrastructure of app stores, messaging channels, and devices that underpin other parts of the digital economy. This means the expectations around accountability will eventually touch operators, integrators, vendors, and developers. These actors may not design chatbots themselves, but they play a central role in enabling access and monetisation.
Trust and compliance, therefore, will increasingly function as differentiators. Mobile operators and messaging providers who can demonstrate robust measures to support safe deployment will be better positioned as partners for AI developers and regulators alike. The way ecosystem players handle billing, parental controls, or content classification could become a reference point when regulators assess whether services are responsibly structured.
The inquiry’s attention to monetisation models also signals a shift. If regulators find that engagement-driven revenue creates risks for children, they may impose constraints on such models. That outcome could have knock-on effects for mobile monetisation practices more generally, particularly for companies that rely heavily on behavioural data or prolonged engagement as primary revenue drivers.
Lessons to learn
There are also parallels with existing messaging regulation. Digital ecosystems have already had to grapple with issues like grey route traffic, fraud prevention, and the need to establish clear trust frameworks. Those precedents may provide useful lessons for AI companion models. If regulators decide that companion characters or certain conversation contexts are higher risk, operators and vendors may need to build systems that flag or restrict such traffic, similar to how mobile players have adapted to fraud detection requirements.
There is also a wider responsibility to frame this discussion constructively. Companion AIs have the potential to deliver positive social value, but only if developed with foresight and adequate safeguards. Finding the balance between innovation and protection is not simply the job of regulators; it requires active engagement from across the ecosystem. Industry bodies like MEF can play a role in convening stakeholders, promoting principles such as consent-by-design, transparency, and authentication, and ensuring that mobile operators, messaging providers, and developers align around common standards.
The FTC inquiry should be seen as part of a longer process rather than a one-time intervention. Its findings will shape both the conversation within the US and the broader trajectory of how companion AI is developed and governed. For the mobile ecosystem, the lesson is that responsibility does not begin or end with the chatbot developers themselves. It is shared across the value chain. By helping to define trust standards and ensuring compliance in distribution, monetisation and safeguarding, the mobile industry can show leadership in building a sustainable environment for these new technologies.
In that sense, the development of AI companions is not just about software. It is about the systems, partnerships, and rules that allow innovation to deliver benefits without undermining user trust. The FTC’s inquiry is a reminder that ecosystem players must think proactively about their role in that balance.
ABOUT THE AUTHOR
Dario Betti is CEO of MEF (Mobile Ecosystem Forum), a global trade body established in 2000 and headquartered in the UK with members across the world. As the voice of the mobile ecosystem, it focuses on cross-industry best practices, anti-fraud and monetisation. The Forum, which celebrates its 25th anniversary in 2025, provides its members with global and cross-sector platforms for networking, collaboration and advancing industry solutions.
Web: https://mobileecosystemforum.com/
X/Twitter: https://x.com/mef
LinkedIn: https://www.linkedin.com/company/mobile-ecosystem-forum