PDx™ BLUEPRINT: OPTIMIZE DATA EXCHANGE

INTRODUCTION
Traditional IT architectures are not designed to effectively leverage data. Data is often stored throughout the enterprise in silos, with some elements on premises and some in the cloud. This unplanned distribution leads to performance issues, as well as added cost and complexity. Successful digital business requires a new data-centric infrastructure architecture that localizes data aggregation, staging, analytics, streaming and management in centers of data exchange at global points of business presence.

CURRENT STATE
1. Fragmented architectures burdened by technical debt and driven by point solutions lack the capabilities and performance required for hybrid IT workflows
2. Cloud connectivity and the network are not optimized, causing poor application performance when the cloud is used to access local data
3. Inconsistent data storage and access methods lead to storage sprawl, cost overruns and compliance issues
4. Siloed data prevents analytics and new business models centered on data

SOLUTION
1. Implement distributed data staging/aggregation to optimize data exchange between users, things, networks and clouds
2. Deploy regional data lakes/distributed data warehouses to maintain data performance, compliance and sovereignty
3. Integrate public/private data sources to enable real-time intelligence across distributed workflows
4. Distribute business intelligence capabilities to create new secure B2B data exchanges and unlock new opportunities

[Diagram: a Data Hub combining (1) distributed data staging/aggregation, (2) regionalized data storage for compliance, (3) integrated public and private data sources and (4) new business opportunities unlocked]

STEP 1: IMPLEMENT DATA STAGING/AGGREGATION
ACTION: Implement a cohesive data storage strategy at centers of data exchange
1. Deploy centers of data staging in key locations
2. Data lakes store raw data to be analyzed and curated by data scientists
3. Refined data sits in the data warehouse for business professionals to use
4. Because of the value and sensitivity of enterprise data, access needs to be strictly controlled and logged
+ Deploy regional data lakes and distributed data warehouses at centers of data exchange
+ Solve global coverage and capacity needs
OUTCOME
+ Localized data improves application performance and user experience
+ Maintain compliance and data sovereignty
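One way to picture Step 1 in code is sketched below: raw records land in a regional data lake, curated records are promoted to a data warehouse, and every access is checked and logged. This is a minimal illustration only; the DataHub class, the region and role names, and the in-memory "lake" and "warehouse" are assumptions made for the example, not part of the blueprint.

```python
"""Minimal sketch of Step 1: regional data staging with controlled, logged access.

All names here (DataHub, regions, roles) are illustrative assumptions,
not part of the PDx(TM) blueprint itself.
"""
import logging
from dataclasses import dataclass, field
from datetime import datetime, timezone

logging.basicConfig(level=logging.INFO, format="%(asctime)s %(message)s")
audit = logging.getLogger("data-hub-audit")


@dataclass
class DataHub:
    """One center of data exchange: a raw data lake plus a curated warehouse."""
    region: str
    lake: dict = field(default_factory=dict)        # raw data, for data scientists
    warehouse: dict = field(default_factory=dict)   # refined data, for business users

    def ingest_raw(self, dataset: str, records: list) -> None:
        """Land raw records in the regional lake (Step 1, items 1-2)."""
        self.lake.setdefault(dataset, []).extend(records)
        audit.info("INGEST region=%s dataset=%s records=%d",
                   self.region, dataset, len(records))

    def promote(self, dataset: str, curate) -> None:
        """Curate raw data and publish the refined result to the warehouse (item 3)."""
        refined = [curate(r) for r in self.lake.get(dataset, [])]
        self.warehouse[dataset] = refined
        audit.info("PROMOTE region=%s dataset=%s refined=%d",
                   self.region, dataset, len(refined))

    def read(self, dataset: str, user: str, role: str) -> list:
        """Strictly controlled, logged access to refined data (item 4)."""
        allowed = {"analyst", "data_scientist"}
        if role not in allowed:
            audit.warning("DENY region=%s dataset=%s user=%s role=%s",
                          self.region, dataset, user, role)
            raise PermissionError(f"{user} ({role}) may not read {dataset}")
        audit.info("READ region=%s dataset=%s user=%s at=%s",
                   self.region, dataset, user, datetime.now(timezone.utc).isoformat())
        return self.warehouse.get(dataset, [])


if __name__ == "__main__":
    hub = DataHub(region="eu-west")  # one regional center of data exchange
    hub.ingest_raw("sensor_readings", [{"id": 1, "c": 21.4}, {"id": 2, "c": None}])
    hub.promote("sensor_readings",
                curate=lambda r: r if r["c"] is not None else {**r, "c": 0.0})
    print(hub.read("sensor_readings", user="maria", role="analyst"))
```

Keeping ingestion, curation and access in one regional hub is what lets the audit trail and access rules stay consistent per jurisdiction; in practice the lake, warehouse and audit log would be real platform services rather than in-memory structures.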
STEP 2: INTEGRATE PUBLIC/PRIVATE DATA SOURCES
ACTION: Directly interconnect cloud on-ramps to centers of data storage
1. The Core Switching Infrastructure terminates connectivity into the Data Hub and enables access to the cloud and other data sources through direct, high-performance interconnection
2. Additional connectivity is provided by software-defined on-ramps such as Service Exchange™
3. Other data sources can be cloud storage, IaaS environments, SaaS environments or other remote Data Hubs
+ Enable performant data exchange between sources and destinations
+ Operate deployments as a seamless extension of global infrastructure with consistent experience, security and resiliency
OUTCOME
+ Optimize data exchange between users, things, networks and clouds

STEP 3: HOST DATA AND ANALYTICS ADJACENT TO NETWORK INGRESS/EGRESS
ACTION: Distribute business intelligence and connect global data ecosystems
1. A GPU farm sits directly adjacent to the data stores, providing the direct access needed for AI development and AI workloads
2. A bulk compute farm handles media content creation, complex modeling and simulations
+ Add processing, analytics and streaming capability at global points of business presence
+ Host a B2B meeting place for companies to collaborate and connect their business platforms
OUTCOME
+ Enable real-time intelligence across distributed workflows locally and globally
(See the placement sketch at the end of this blueprint for an illustration of this data-adjacent approach.)

TARGET STATE ARCHITECTURE
[Diagram: target state architecture of the Data Hub]

SUMMARY
A purpose-built architecture that optimizes data exchange improves performance and provides data compliance and control. This is necessary to support the exploding volume, variability and velocity of data creation, as well as the processing and storage required to accommodate digital business. The strategy brings users, networks, systems and controls to the data, which removes the barriers of data gravity and creates centers of data exchange to scale digital business.

The Optimize Data Exchange Blueprint is part of a library of blueprints and repeatable implementation patterns that make up the Pervasive Datacenter Architecture (PDx™). By practitioners, for practitioners, PDx™ was created by c
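The sketch referenced in Step 3 follows. It illustrates the data-adjacent placement idea: each workload is scheduled onto a compute pool (GPU or bulk) in the same metro as the data hub that holds its data, so processing runs next to the data instead of pulling the data across the network. The metro names, pool kinds and placement rule are assumptions made for the example, not part of the blueprint.

```python
"""Minimal sketch of Step 3: keep compute adjacent to the data it processes.

Metro names, pool types and the placement rule are illustrative assumptions;
they are not defined by the PDx(TM) blueprint.
"""
from dataclasses import dataclass


@dataclass(frozen=True)
class ComputePool:
    metro: str       # where the pool is deployed, e.g. "frankfurt"
    kind: str        # "gpu" for AI workloads, "bulk" for modeling/rendering


@dataclass(frozen=True)
class Workload:
    name: str
    needs: str       # "gpu" or "bulk"
    data_metro: str  # metro of the data hub holding the workload's data


def place(workload: Workload, pools: list[ComputePool]) -> ComputePool:
    """Prefer a pool of the right kind in the same metro as the data, so
    processing stays adjacent to the data store; otherwise fall back to any
    pool of the right kind (the costly case, since data would have to move)."""
    same_metro = [p for p in pools
                  if p.kind == workload.needs and p.metro == workload.data_metro]
    if same_metro:
        return same_metro[0]
    fallback = [p for p in pools if p.kind == workload.needs]
    if not fallback:
        raise LookupError(f"no {workload.needs} capacity for {workload.name}")
    return fallback[0]


if __name__ == "__main__":
    pools = [
        ComputePool("frankfurt", "gpu"),   # GPU farm next to the Frankfurt data hub
        ComputePool("singapore", "bulk"),  # bulk compute farm next to the Singapore hub
    ]
    training = Workload("model-training", needs="gpu", data_metro="frankfurt")
    render = Workload("media-render", needs="bulk", data_metro="singapore")
    for w in (training, render):
        pool = place(w, pools)
        print(f"{w.name} -> {pool.kind} pool in {pool.metro}")
```

The design choice being illustrated is simply that locality wins: a scheduler that knows where the data hub sits can keep AI and bulk-compute jobs at the network ingress/egress point where the data already lives.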