
Talk, Trust, and Trade-Offs: How and Why Teens Use AI Companions

Information Technology · 2025-07-16 · Common Sense Media · 大王雪

Talk, Trust, and Trade-Offs: How and Why Teens Use AI Companions

COMMON SENSE MEDIA IS GRATEFUL FOR THE GENEROUS SUPPORT AND UNDERWRITING THAT FUNDED THIS RESEARCH SPOTLIGHT
Jennifer Caldwell and John H.N. Fisher
Patrick J. McGovern Foundation
Siegel Family Endowment

Overview

The rapid rise of AI companions—on platforms like CHAI, Character.AI, Nomi, Replika, and similar conversational AI systems—has created new digital social environments. While some of these platforms claim to be designed for users age 18 and older, they rely on ineffective self-reporting for age assurance, which allows easy access for younger users. Other platforms, such as Character.AI, are explicitly marketed to children as young as 13. These platforms, which may be presented as virtual friends, confidants, and even therapists, allow users to engage in conversations with AI entities designed to simulate humanlike interaction, and they can offer everything from casual chat to emotional support and role-playing scenarios.

Common Sense Media's risk assessment of popular AI companion platforms, including Character.AI, Nomi, and Replika, found that these systems pose "unacceptable risks" for users under 18, easily producing responses ranging from sexual material and offensive stereotypes to dangerous "advice" that, if followed, could have life-threatening or deadly real-world impacts. In one case, an AI companion shared a recipe for napalm (Common Sense Media, 2025). Based on that review's findings, Common Sense Media recommends that no one under 18 use AI companions.

This report examines how U.S. teens age 13 to 17 currently use AI companions, drawing from a nationally representative survey of 1,060 teens conducted in April and May 2025. Our analysis explores usage patterns across 10 key areas within the context of growing concerns about AI companion safety, the lack of guardrails in place, and the need for evidence-based policy responses.
As these technologies become increasingly sophisticated and accessible, it is crucial that parents, teachers, and policymakers understand how and why teens interact with AI companions. Teens average eight hours and 39 minutes of screen time for entertainment daily, making AI companions a new and potentially important part of their day-to-day lives (Rideout et al., 2022). Adolescence is a critical time for developing identity, social skills, and independence in relationship building. As AI companions become part of this stage of life, important questions arise about their impact on social development, emotional well-being, and digital literacy (Common Sense Media, Hopelab, Center for Digital Thriving, 2024).

Note: The following definition was presented to survey respondents:

"AI companions" are like digital friends or characters you can text or talk with whenever you want. Unlike regular AI assistants that mainly answer questions or do tasks, these companions are designed to have conversations that feel personal and meaningful.

Despite the relative novelty of AI companions in the digital landscape, their dangers to young users are real, serious, and well documented. For example, the suicide of 14-year-old Sewell Setzer III, who had developed an emotional attachment to an AI companion, brought national attention to the potential dangers these platforms pose to vulnerable teens (Roose, 2024). Additional examples include a 19-year-old who was encouraged by an AI companion to kill the late Queen Elizabeth, and a 17-year-old who became socially isolated and had violent meltdowns after interactions with AI companions. Both demonstrate how these risks can extend beyond individual mental health to broader family and social dynamics (Barry, 2025; Duffy, 2024).
For example, with AI companions, you can:

• Chat about your day, interests, or anything on your mind
• Talk through feelings or get a different perspective when you're dealing with something tough
• Create or customize a digital companion with specific traits, interests, or personalities
• Role-play conversations with fictional characters from your favorite shows, games, or books

Some examples include Character.AI or Replika. It could also include using sites like ChatGPT or Claude as companions, even though these tools may not have been designed to be companions.

This survey is NOT about AI tools like homework helpers, image generators, or voice assistants that just answer questions.

Current research indicates that AI companions are designed to be particularly engaging through "sycophancy," meaning a tendency to agree with users and provide validation rather than challenging their thinking (Duane, 2025). This design feature, combined with the lack of safeguards and meaningful age assurance, creates a concerning environment for adolescent users, who are still developing critical thinking skills and emotional regulation (eSafety Commissioner, 2025).

Key Findings

1. Seventy-two percent of teens have used AI companions.

Seventy-two percent