What’s Next in AI? GPT-5, MOAR GPUs, Test-Time-Compute, and Avoiding the AI Winter

U.S. Internet | POSITIVE

The investment community has soured a bit on AI based on the slow end-user adoption, high cost to serve, and capex vs. revenue mismatch. While this kind of time lag is normal for big technology transitions, in this report we take a stab at “where could we go from here?” and what could turn sentiment back up.

U.S. Internet

Ross Sandler
+1 415 263 4470
ross.sandler@barclays.com
BCI, US

Trevor Young, CFA
+1 212 526 3098
trevor.young@barclays.com
BCI, US

The Key Take-Away: The market cap exposed to “the AI trade” is enormous, with five of the largest companies in the world duking it out for pole position. The fate of the NASDAQ (and to a lesser degree the S&P 500) depends on which way “the AI trade” goes over coming quarters and years. We have already seen a quick sentiment shift following the publishing of our first in this series of AI reports (see “FOMO or Field-Of-Dreams,” from June). Some in the investment community (including us) are currently asking: Are we entering a trough of disillusionment or just in an air pocket? What could bring back the animal spirits witnessed in 2023 and early 2024? This report attempts to predict what could play out next in AI. We unpack a bunch of new important things that help explain where we are on the AI capacity vs. uptake debate, key things that could impact future compute needs, and clear up some of the confusion given the limited disclosures from the biggest players in the space.
Alex Hughes
+1 212 526 3069
alexander.hughes@barclays.com
BCI, US

Joseph Petroline
+1 212 526 6382
joseph.petroline@barclays.com
BCI, US

Michael DiSanto
+1 212 526 1054
michael.disanto@barclays.com
BCI, US

We were somewhat taken aback when Character.ai more or less threw in the towel and its senior staff joined Google. After all, its founder was a lead inventor of the transformer back in the day at Google (the “T” in GPT), and Character.ai was generally perceived to be the #2 company in the consumer AI space behind ChatGPT, measured by users and engagement. After similar high-profile flame-outs at Inflection and others, it begs the question of whether the cost of compute in AI is just too great, and more broadly whether we are headed into an “AI Winter” or if these are simply normal bumps along the road for any new space riddled with start-ups. We think the latter. We see more catalysts, like the GPT-o1 model release from OpenAI this past week, coming over the next few months that could re-invigorate bullish investor sentiment.

U.S. Semiconductors & Semiconductor Capital Equipment

Tom O'Malley
+1 212 526 0692
thomas.o'malley@barclays.com
BCI, US

Scott Fessler
+1 212 526 2604
scott.fessler@barclays.com
BCI, US

Kyle Bleustein
+1 212 526 7618
kyle.bleustein@barclays.com
BCI, US

Barclays Capital Inc. and/or one of its affiliates does and seeks to do business with companies covered in its research reports. As a result, investors should be aware that the firm may have a conflict of interest that could affect the objectivity of this report. Investors should consider this report as only a single factor in making their investment decision. Please see analyst certifications and important disclosures beginning on page 22.

Completed: 17-Sep-24, 01:06 GMT
Released: 17-Sep-24, 04:10 GMT
Restricted - External

The bull case around AI boils down to a simple concept: if scaling laws hold and frontier model performance continues to improve at great leaps over coming generations, then end-user products in consumer and enterprise will similarly improve greatly in terms of functionality and accuracy, and adoption will increase – and hence AI stocks likely rally as this becomes more clear to public market investors. We’d point to GPT-5’s upcoming release as a potential catalyst to get things going again (also possibly Orion and the recent GPT-o1 release). If successful, sentiment could shift greatly, AI hyperscaler capex would need to ramp further from here, and all this investment will likely end up being another expensive moat for mega cap tech. In this bull scenario, shares of GOOGL, AMZN, META and MSFT (covered by Raimo Lenschow) eventually get rewarded for the build-out with massive returns, ushering in a new wave of AI applications and an enormous toll-road for the industry given the vast majority of it could run on mega-clusters in four clouds. In order for this to play out, we think synthetic data techniques likely need to work, as frontier models have largely run out of human-generated data to train on (the “data wall” discussed below). Our new cut of the AI training compute required suggests the industry needs more GPU capex starting in 2025 (above what consensus currently has modeled in). As we move from GPT-5 to GPT-6, this is likely the moment (for the series of frontier models) that synthetic data techniques need to work, otherwise scaling laws could break down, hindering model performance improvements. This is also the moment when the industry could shift away from training an