
Survey: Most Americans Say Tech Companies Should Be Allowed to Set AI Limits

Information Technology | 2026-02-26 | ITIF

As the Department of War levels threats and ultimatums against Anthropic, Morning Consult conducted a nationally representative survey of 1,976 U.S. adults to better understand attitudes around the use of artificial intelligence (AI) in military actions and whether technology companies have a responsibility to set limits on how their products are used.

Americans Want Humans in Control of AI

The Big Picture: Americans are deeply skeptical of AI in military operations. Nearly 8 in 10 (79%) say a human being should always make the final decision before any use of lethal force, a view held equally by Democrats (81%) and Republicans (81%). Three-quarters (75%) say AI technology is not yet reliable enough to be trusted with life-or-death military decisions without human oversight. These concerns are intense and bipartisan.

Autonomous Weapons: Research, Don't Deploy

The public draws a clear line between understanding the technology and putting it on the battlefield. Only 21% support developing and deploying AI-controlled weapons (Dem: 16%, Rep: 35%). The plurality position (49%) is to research these weapons but not deploy them. 71% agree the U.S. should still research and develop AI-controlled weapons to understand the technology and defend against enemies who might use them against us, even if we choose not to deploy them (Dem: 72%, Rep: 79%). Republicans are notably divided: 48% say the U.S. must develop these weapons to stay ahead of adversaries.

Surveillance: Americans Want Legal Process, Not Blank Checks

A majority (54%) say AI-powered mass surveillance is too dangerous and violates privacy and civil liberties (Dem: 63%, Rep: 45%), versus 30% who see it as necessary for safety. Even Republicans are more likely to say mass surveillance is too dangerous (45%) than to call it necessary (40%). But the public isn't reflexively anti-security: 46% say the government should only be able to use AI surveillance on specific targets with a court-issued warrant (Dem: 45%, Rep: 51%). The constitutional principle is clear: 70% agree that using AI to monitor Americans without a court-issued warrant violates the Fourth Amendment's protection against unreasonable searches (Dem: 74%, Rep: 71%).
Americans Back Companies Setting Limits

Two-thirds (67%) believe private technology companies have a responsibility to set limits on how their products can be used, even if the government wants to use them differently (Dem: 73%, Rep: 65%). When the trade-off is explicit, 53% say private AI companies should be allowed to restrict how their technology is used, including banning its use for domestic surveillance or autonomous weapons (Dem: 58%, Rep: 43%), versus just 29% who say companies must give the military full access.

On the Anthropic dispute specifically, half (50%) of those who are aware of the dispute view penalizing the company as government overreach that sets a dangerous precedent (Dem: 57%, Rep: 39%), while 35% call it necessary for national security. Among Republicans who are aware of the dispute, opinion is closely split.

Important Context: A Public Still Forming Its Views

Most Americans haven't engaged deeply with these issues yet. 56% have heard "not much" or "nothing at all" about the Anthropic–Department of War dispute; only 12% have heard "a lot." Opinion on specific policy tools remains unsettled: 30% are unsure about supply chain risk designations, and 20% are unsure about using emergency laws to force AI company compliance.

The trust landscape is fragmented. No institution commands majority confidence on AI decisions. The most trusted entity is an independent scientific or ethics review board (22%), followed by the military and AI companies (14% each).
A quarter of Americans (25%) say they're simply not sure who to trust. Notably, 45% oppose using emergency laws to force AI company compliance (Dem: 57%, Rep: 29%), versus 35% who support it.

Key Stats

Top-line findings for quick reference:

• 79% say a human should always make the final decision before any use of lethal force
• 75% say AI is not yet reliable enough to be trusted with life-or-death military decisions without human oversight
• 54% say AI-powered mass surveillance is too dangerous and violates privacy and civil liberties
• 70% agree that using AI to monitor Americans without a court-issued warrant violates the Fourth Amendment
• 67% say private tech companies have a responsibility to set limits on how their products can be used, even if the government disagrees
• 53% say AI companies should be allowed to restrict their technology from uses like domestic surveillance or autonomous weapons, vs. just 29% who say companies must give the military full access
• 49% support researching AI-controlled weapons but not deploying them; only 21% favor deployment
• 47% say penalizing Anthropic is government overreach, vs. 29% who say it's necessary for national security
• 45% oppose using emergency laws to force AI company compliance, vs. 35% who support it

Methodology: This poll was conducted February 25th, 2026, among a nationally representative sample of 1,976 U.S. adults. The interviews were conducted online, and the data were weighted to approximate a target sample of U.S. adults based on gender, age, race, educational attainment, region, gender by age, and race by educational attainment. Results from the full survey have a margin of error of approximately plus or minus 2 percentage points.
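The sample size quoted in the methodology determines the survey's precision. As a quick sanity check (a minimal sketch, assuming a simple random sample at 95% confidence; weighted surveys such as this one typically report a slightly larger "effective" margin), the margin of error for n = 1,976 works out as follows:

```python
import math

def margin_of_error(n: int, p: float = 0.5, z: float = 1.96) -> float:
    """95% margin of error for a proportion p from a simple random sample of size n.

    Uses the worst case p = 0.5 by default. Weighting (as described in the
    methodology) inflates this somewhat, so treat the result as a lower-bound
    sanity check, not the pollster's exact published figure.
    """
    return z * math.sqrt(p * (1 - p) / n)

moe = margin_of_error(1976)
print(f"{moe * 100:.1f} percentage points")  # -> 2.2 percentage points
```

This is why national polls cluster around samples of roughly 1,000 to 2,000 respondents: precision improves only with the square root of n, so doubling the sample shrinks the margin by a factor of about 1.4.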