INSIGHT SPOTLIGHT

AI inference in practice: time is money

Inferencing is the real-time decision-making of AI in practice. In the telecoms industry, this could apply to the network, services, customer care or other corporate workloads. As AI scales across these domains, the objectives include saving money, making money, reducing risk and supporting revenue-generating services.

To illustrate the impact of AI inference in practice, three Spotlights will focus on use cases, with each featuring an example provider from the telecoms AI ecosystem. The reports will be complemented by a plug-and-play calculator developed for network operators and their partners.

Analysis

Recap: defining the edge

As outlined in "Distributed inference: how AI can turbocharge the edge", published for GTC 2025, several factors support the case for AI at the edge, including the growing use of agentic AI.

The telco edge is a continuum defined by how decentralised compute is relative to a public cloud deployment. Moving away from the public cloud, compute can sit at:

• user edge – device edge (processing on a device such as a smartphone or vehicle) or enterprise edge (on-premise)
• network edge – far edge (on or near telco RAN sites), near edge or telco private cloud (operators installing a cloud facility within their own network, for example at regional data centres)

AI inference can run at any of these locations depending on the nature of the application. Objectives tend to be based on saving time or money, supporting revenue-generating services, or both.

In action: time is money

AI inference in regional data centres is now being incorporated for functions including network anomaly detection, pre-emptive maintenance, traffic and usage patterns, and smarter customer care. Most use cases drive efficiency and cost savings by reducing manual effort. Kinetica is an AI specialist on this portion of the edge, supporting more efficient network operations and customer care.

Network anomaly and pre-emptive maintenance

Network performance monitoring remains a manually intensive function for operators, requiring teams of engineers to accurately review, digest and action high volumes of data using toolkits from vendors or open-source resources such as Wireshark, among others. Kinetica has developed an inference solution that automates and scales these processes to drive productivity improvements. Its AI stack can ingest millions of data points related to network metrics (e.g. voice calls and data sessions).

Geospatial analysis

The goal is to use AI to analyse changing customer-usage patterns across an operator's national, regional (state or provincial) and local footprints. The patterns can be compared with site locations and available spectral capacity to anticipate congestion risks and better understand performance variations. This may, for example, involve redistributing spectrum capacity within a city.

The benefits of processing inference at the telco regional data centre versus the cloud become clear when scale is considered:

• Cost – Kinetica's AI solution uses a multi-agent design to review approximately 3 million lines of text per second, whether logs, traces or any other records format. This could apply, for instance, to issues affecting unexpected dropped calls for large enterprise customers. Modelling of the cost impact is based on early implementations (an illustrative sketch of this kind of log triage follows this list).
• Data sovereignty – Dealing with sensitive customer information such as call logs may bring the requirement for data to remain in a country. This reinforces the value of deploying AI at telco data centres.
• Resilience – Data from Kinetica and other AI solution providers indicates productivity gains in network fault resolution for operators of 70–85% versus the status quo. This has obvious efficiency benefits, but resilience is an additional advantage.
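To make the log-analysis use case concrete, the minimal Python sketch below shows a heavily simplified, single-process version of the kind of automated log triage described under "Network anomaly and pre-emptive maintenance" and in the Cost bullet above. It is not Kinetica's multi-agent solution: the pipe-delimited record format, the CALL_DROP event label, the site identifiers and the alerting threshold are all assumptions chosen for illustration.

```python
# Minimal sketch, assuming a pipe-delimited log format of
# timestamp|site_id|event_type. Not Kinetica's implementation.
from collections import Counter
from typing import Dict, Iterable


def flag_drop_call_anomalies(log_lines: Iterable[str],
                             threshold: int = 50) -> Dict[str, int]:
    """Count dropped-call events per site and flag sites above a threshold."""
    drops_per_site = Counter()
    for line in log_lines:
        try:
            _, site_id, event_type = line.strip().split("|")
        except ValueError:
            continue  # skip malformed records rather than failing the batch
        if event_type == "CALL_DROP":
            drops_per_site[site_id] += 1
    # Only sites whose drop count exceeds the alerting threshold are returned.
    return {site: n for site, n in drops_per_site.items() if n > threshold}


if __name__ == "__main__":
    sample = [
        "2025-03-18T10:00:01|site-042|CALL_DROP",
        "2025-03-18T10:00:02|site-042|CALL_DROP",
        "2025-03-18T10:00:03|site-007|CALL_SETUP",
    ]
    print(flag_drop_call_anomalies(sample, threshold=1))  # {'site-042': 2}
```

A production system of the kind described above would distribute this work across many agents and record formats to reach throughput in the millions of lines per second.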
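The following sketch illustrates the logic of the geospatial analysis described earlier: observed busy-hour demand per site is compared with a simple capacity proxy derived from assigned spectrum, and sites at risk of congestion are flagged as candidates for spectrum redistribution. The site names, the GB-per-MHz capacity proxy and the 80% utilisation threshold are placeholders, not figures from this report.

```python
# Minimal sketch of a congestion-risk screen; all parameters are illustrative.
from dataclasses import dataclass
from typing import List, Tuple


@dataclass
class Site:
    site_id: str
    region: str
    spectrum_mhz: float          # spectrum currently assigned to the site
    busy_hour_demand_gb: float   # observed busy-hour traffic

GB_PER_MHZ = 1.5         # placeholder proxy: GB of busy-hour traffic per MHz
UTILISATION_ALERT = 0.8  # flag sites above 80% estimated utilisation


def congestion_risk(sites: List[Site]) -> List[Tuple[str, float]]:
    """Return (site_id, utilisation) for sites above the alert threshold."""
    at_risk = []
    for s in sites:
        capacity_gb = s.spectrum_mhz * GB_PER_MHZ
        utilisation = s.busy_hour_demand_gb / capacity_gb
        if utilisation >= UTILISATION_ALERT:
            at_risk.append((s.site_id, round(utilisation, 2)))
    # Highest-utilisation sites are the strongest candidates for
    # spectrum redistribution or capacity upgrades.
    return sorted(at_risk, key=lambda x: x[1], reverse=True)


if __name__ == "__main__":
    demo = [
        Site("city-centre-01", "metro", spectrum_mhz=100, busy_hour_demand_gb=140),
        Site("suburb-17", "metro", spectrum_mhz=100, busy_hour_demand_gb=60),
    ]
    print(congestion_risk(demo))  # [('city-centre-01', 0.93)]
```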
Implications for mobile operators, network vendors and enterprise IT

• Know your objective – The rationale for the edge is based on being able to manage the compute latency requirements of AI applications and agents, alongside network resilience, security and data sovereignty requirements. These ultimately wrap up into a P&L case driven by three factors: TCO savings versus processing in the cloud; TCO savings from lower opex by reducing manual labour; and support for revenue generation from 5G upsell or complementary offerings such as GPU-as-a-Service (a simple worked example follows this list).
• Demystify the stack – Stripping out the tech side of AI and its potential use across an organisation, it is important to ask: what are we trying to achieve? This often plays a secondary role to the inverse question: what can we do with AI? The key is having an outcome-based approach. Kinetica's solution leverages genAI to run most cost-effectively at the edge.
• Common threads – AI applications serve a range of often disparate use cases, even within a single telecoms operating company or a group-wide operation. There is therefore value in articulating the common threads of different AI agents or engines provided by a supplier. Several areas are key to democratising AI's use across a wider employee base, including compatibility across a range of data protocols (e.g. proprietary vendor toolkits and open-source options).
• Systemic change – The productivity implications of AI are ...
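As a rough illustration of the three-factor P&L case set out under "Know your objective", the sketch below simply adds the TCO saving versus cloud processing, the opex saving from reducing manual fault resolution and any incremental revenue. All input figures are hypothetical placeholders; only the 70–85% productivity-gain range comes from the analysis above, and this is not the plug-and-play calculator referenced at the start of the report.

```python
# Back-of-envelope sketch only; input values below are hypothetical.
def edge_inference_business_case(cloud_inference_cost: float,
                                 edge_inference_cost: float,
                                 fault_resolution_opex: float,
                                 productivity_gain: float,
                                 incremental_revenue: float) -> float:
    """Annual net benefit of running inference at the telco edge.

    Three factors, per the 'Know your objective' bullet:
      1. TCO saving versus processing in the public cloud
      2. Opex saving from reducing manual labour in fault resolution
      3. Revenue from 5G upsell or complementary offerings (e.g. GPU-as-a-Service)
    """
    tco_saving = cloud_inference_cost - edge_inference_cost
    opex_saving = fault_resolution_opex * productivity_gain
    return tco_saving + opex_saving + incremental_revenue


if __name__ == "__main__":
    # Hypothetical operator; annual figures in USD millions. The 0.70 and 0.85
    # values reflect the fault-resolution productivity range cited above.
    low = edge_inference_business_case(12.0, 9.0, 20.0, 0.70, 5.0)
    high = edge_inference_business_case(12.0, 9.0, 20.0, 0.85, 5.0)
    print(f"Net annual benefit: ${low:.1f}m to ${high:.1f}m")  # $22.0m to $25.0m
```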