
When Machine Precision Meets Human Intuition: A New Era of Human–Machine Understanding (2025)

Capgemini · September 22, 2025

Editorial

The emergence of the human–machine understanding (HMU) domain marks a significant step forward in the evolution of artificial intelligence. It shifts the focus from generating outputs to interpreting tone and context, enabling AI systems to comprehend human behavior, mental states, and intent in real time. As interfaces grow more intelligent and humanlike, both in the digital and physical space, it is key to design systems that understand, adapt, anticipate and connect with users.

This point of view introduces a practical framework to operationalize that shift across three areas: Sense, Understand and Support. It outlines how organizations can design AI systems that adapt not only to what people say or do, but also to how they feel, what they mean and what they need in the moment. The result is more intelligent, empathetic, and responsive interactions between AI and humans in any role – be they leaders, workers or consumers.

The AI revolution that has been gaining momentum is just the start. Human-machine understanding will deliver deeper insights, establishing more trusted relationships between people and technology.

The launch of the Capgemini AI Robotics & Experiences Lab, alongside our established capability at Cambridge Consultants, reinforces this ambition. It provides a global platform to design, build and scale embodied AI systems, ranging from humanoid and polyfunctional robots to digital humans, that augment people through real-world and humanized collaboration.

Human-machine understanding capabilities will define the next wave of digital experience. Whether easing frustration, shifting pace, or responding with empathy, these systems will support users in ways that feel natural and intuitive. For businesses, it represents both a competitive advantage and a foundation for responsible, human-centered innovation.
Kary Bheemaiah, CTIO, Capgemini Invent
Alexandre Embry, Head of the Capgemini AI Robotics and Experiences Lab
Tim Ensor, Head of Intelligent Services, Cambridge Consultants

Table of Contents

- The context
- HMU redefines value in three key areas
- Delivering real-world value
- Vision scenarios for an HMU-enabled future
- From vision scenarios to HMU-enabled realities
- HMU and data security: Risks and mitigations
- Seven things to do now: a structured approach to HMU

The context

The once clearly delineated boundaries between the physical and digital worlds are disappearing. Emerging technologies are converging in exciting combinations that create new ways for machines and humans to collaborate.

[…] patterns, and consumer services struggle to maintain meaningful engagement without understanding customer preferences and contexts.

However, these are early days. The generative AI models that create text, images, and video on demand are the first stage of a move towards deeper human-machine collaboration. We are on the cusp of a major transition where AI-enabled systems will take a proactive rather than a reactive approach.

We're seeing virtual assistants evolve into embodied agents, and AI that was once confined to the cloud is now powering robots on the factory floor. We talk to AI assistants, rely on algorithms for recommendations, and use agentic AI systems in the digital aspects of our lives. Recent advances, like OpenAI's ChatGPT agent, are enabling machines to act autonomously, enhancing their ability to collaborate with humans in dynamic, real-world environments.

This challenge of bridging capability and experience is precisely what prompted the recent collaboration between OpenAI and legendary designer Jony Ive. As noted when the partnership was announced: "Computers are now seeing, thinking and understanding.
Despite this unprecedented capability, our experience remains shaped by traditional products and interfaces." This partnership signals a recognition that the next frontier isn't just smarter AI, but AI that truly understands and adapts to human needs through thoughtful design and interfaces.

As AI spreads across software and hardware, from conversational copilots to autonomous machines, one question becomes central: how well do these systems understand our behaviors, context, and goals?

Defining human-machine understanding

So far, the answer to the above is 'not very well'. The AI-enabled revolution in working styles has been confined to a one-way relationship. Current AI models provide limited analysis and interaction, such as requesting clarifications or flagging unhelpful responses. These reactive systems operate with little awareness of human behavior. They analyze input, perform tasks, and provide answers, without grasping the context and the person behind the prompt.

The next stage, via human-machine understanding (HMU), will deliver smarter, more intuitive AI. By combining sensor data, behavioral cues, mental states, and contextual information, HMU-aligned systems will interpret what we mean, not just what we say. These machines will adapt their responses in real time and build a trusted relationship wi