Many market commentators believe inflation is this year's most important story for equity markets.

By contrast, we believe the rising influence of powerful growth themes is a far more compelling development.

Witness, for example, the extraordinary performance of industrial companies exposed to electrification and data center cooling. Similarly, we’re seeing a premiumization trend in the consumer and financial sectors. There’s also been a marked jump in security and defense spending across sectors. And, of course, the most eye-catching development has been the significant impact of artificial intelligence (AI).

Thematic investing is increasingly important across global equity markets – a trend we believe will gain pace. This is particularly true against a backdrop of slowing global growth, driven by fiscal constraints, demographic shifts, and a new era of globalization.

We believe AI will be one of the most potent and important themes over the next decade.

The evolution of AI will be central to this dynamic. AI is no longer the preserve of a few significant players; it is broadening across sectors, geographies, and companies. For this reason, we believe AI will be one of the most potent and important themes over the next decade.

The great reveal

Last year marked AI’s great reveal. The emergence of ChatGPT was a wake-up call to markets. The message? We’re likely on the cusp of a multi-year, meaningful change in growth dynamics within specific parts of the economy.

Generally, new computing cycles emerge every ten years, heralding a tenfold increase in access to computing. These cycles follow a similar pattern. Initially, semiconductor manufacturers benefit, followed by a shift in demand towards infrastructure, and finally, software and services reap the rewards. The 2010 internet boom was a case in point.

We believe AI follows a similar pattern but with important caveats. The key reason lies in the technology itself. For instance, GPT-2 was trained with 1.5 billion parameters (the adjustable values, such as weights and biases, that a model learns from its training data). By contrast, GPT-4 is reported to require as many as 100 trillion – comparable to the number of synapses in the human brain. This rapid growth in complexity requires increasingly sophisticated semiconductor chips.
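To give a rough sense of why this growth in parameter counts translates into demand for more powerful chips, the sketch below estimates the memory needed simply to hold model weights at the scales cited above, assuming 2 bytes per parameter (16-bit precision). The figures are illustrative approximations, not vendor specifications.

```python
# Illustrative only: memory needed just to store model weights, assuming
# 2 bytes per parameter (16-bit precision). Parameter counts are the
# figures cited in the text; outputs are rough approximations.

BYTES_PER_PARAM = 2  # assumed 16-bit weights

models = {
    "GPT-2": 1.5e9,               # 1.5 billion parameters
    "GPT-4 (reported)": 100e12,   # reported figure of 100 trillion parameters
}

for name, params in models.items():
    gigabytes = params * BYTES_PER_PARAM / 1e9
    print(f"{name}: ~{params:,.0f} parameters -> ~{gigabytes:,.0f} GB of weights")

# GPT-2 fits comfortably on a single accelerator (~3 GB of weights); the
# reported GPT-4 figure implies roughly 200,000 GB, far beyond any single
# chip - hence the need for ever more sophisticated hardware.
```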

This has implications not just for data centers (buildings that house computer systems) but also for factors outside the traditional technology space. This includes the type and volume of power consumption, the power grid configuration, power management, and numerous specialized areas within the industrial economy.

Economic implications

The other significant difference lies in how AI diffuses through the economy. In past cycles, consumers adopted new technologies far more quickly than they have AI. Now, companies are testing and trialing AI applications at a pace that far outstrips consumer engagement. Indeed, despite the initial excitement around ChatGPT, we’ve seen scant evidence of meaningful monetization of the technology.

As a result, we continue to focus on direct AI exposure, while increasingly expanding our thematic focus into three core areas:

  • The energy and power infrastructure underpinning AI.
  • Opportunities within data centers.
  • Generative AI (Gen AI) combined with digital twin technologies.

AI + power infrastructure

Electrification and reindustrialization are potentially the most exciting themes as capital expenditure (CapEx) increases and permeates the supply chain. Electricity consumption in Western countries is expected to surge by 40%, propelled by the energy demands of AI-driven data centers and green policies.

In a recent earnings call, Emerson CEO Lal Karsanbhai said a ChatGPT search consumes six to ten times as much power as a traditional Google search (Chart 1). The estimated CapEx required to develop AI infrastructure continues to rise. This increase in demand is real and is happening today.

Chart 1. ChatGPT queries are 6x–10x as power intensive as traditional Google searches

Source: Google, SemiAnalysis, as per Goldman Sachs report “Generational Growth,” April 2024. Note: “Wh” refers to watt-hour, a unit of energy equivalent to one watt of power sustained for one hour.
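To make the chart’s ratio concrete, the back-of-the-envelope sketch below converts per-query energy into annual consumption for a hypothetical query volume. The 0.3 Wh figure for a traditional search is a commonly cited estimate, and the one-billion-queries-per-day volume is purely an assumption for illustration; neither should be read as a forecast.

```python
# Back-of-the-envelope: annual energy use of AI-style queries vs. traditional
# search. The per-search figure is a commonly cited estimate and the query
# volume is a purely illustrative assumption.

WH_PER_SEARCH = 0.3        # assumed Wh per traditional search
AI_MULTIPLE = (6, 10)      # ChatGPT queries: 6x-10x as power intensive (Chart 1)
QUERIES_PER_DAY = 1e9      # hypothetical volume, for illustration only

def annual_twh(wh_per_query: float, queries_per_day: float) -> float:
    """Convert per-query watt-hours into terawatt-hours per year."""
    return wh_per_query * queries_per_day * 365 / 1e12

baseline = annual_twh(WH_PER_SEARCH, QUERIES_PER_DAY)
low, high = (annual_twh(WH_PER_SEARCH * m, QUERIES_PER_DAY) for m in AI_MULTIPLE)

print(f"Traditional search: ~{baseline:.2f} TWh per year")
print(f"AI-style queries:   ~{low:.2f}-{high:.2f} TWh per year")
```

Even at the same query volume, the 6x–10x multiple lifts annual consumption in this stylized example from roughly a tenth of a terawatt hour to around one terawatt hour.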

Of course, we’ve been here before. In 1999, a widely cited forecast suggested that 30–50% of the US electricity supply would be needed to power the internet. Instead, growth in US electricity demand flattened over the following two decades, as Moore’s Law1 delivered efficiency gains in performance per watt that offset the significant growth in internet traffic. Today’s situation is different: the impact on electricity demand is expected to be significantly greater, compounded by a deceleration in the gains Moore’s Law predicts.
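The argument is essentially a race between two compounding rates: workload growth and efficiency gains. The sketch below, using purely illustrative growth rates, shows how demand stays flat when efficiency keeps pace and climbs quickly when it does not.

```python
# Illustrative only: net electricity demand growth as a race between
# compounding workload growth and compounding performance-per-watt gains.
# The rates below are stylized assumptions, not forecasts.

def net_demand_growth(workload_growth: float, efficiency_gain: float) -> float:
    """Annual change in electricity demand implied by the two growth rates."""
    return (1 + workload_growth) / (1 + efficiency_gain) - 1

# Internet era (stylized): efficiency gains keep pace with workload growth
print(f"{net_demand_growth(0.40, 0.40):+.1%}")  # +0.0% -> demand stays flat

# Slowing Moore's Law (stylized): efficiency gains fall behind
print(f"{net_demand_growth(0.40, 0.15):+.1%}")  # roughly +21.7% per year
```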

What happens when the unstoppable force of AI meets the immovable object of the electricity grid?

So, what happens when the unstoppable force of AI meets the immovable object of the electricity grid? In the short term, electrical equipment manufacturers are poised to gain pricing power. Lead times for transformers are expected to more than double, stretching from around nine months to as long as two years. In the long term, however, electricity grids must transform to keep pace with the estimated increase in demand. This includes addressing the unpredictability of supply, especially as renewables account for a growing share of the energy mix.

AI only one leg of the infrastructure stool

The other two legs of the stool are nations seeking energy independence and widespread electrification in the drive to net zero. This has resulted in a step-change in electric utilities’ CapEx plans across Europe and North America. Data centers alone are expected to require an additional 250 terawatt hours of electricity generation in the US over the next five years – roughly equivalent to Spain’s annual power consumption. This represents the fastest growth in US electricity demand for over 20 years.
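As a rough sense-check on that 250 terawatt-hour figure, the sketch below compares it with total annual US electricity consumption. The ~4,000 TWh US total is an approximate, commonly cited figure used purely for illustration.

```python
# Rough sense-check: incremental data center demand vs. total US consumption.
# The US total is an approximate, commonly cited figure, used for illustration.

DATA_CENTER_TWH = 250    # incremental need over the next five years (cited above)
US_ANNUAL_TWH = 4_000    # approximate total US electricity consumption per year

share = DATA_CENTER_TWH / US_ANNUAL_TWH
print(f"Data centers' incremental need: ~{share:.0%} of current US annual consumption")
# -> roughly 6% of current US annual consumption
```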

This surge in demand will also be geographically concentrated in areas where the grid infrastructure might not cope. Data centers are predominantly located in Northern Virginia and Texas, while in Europe they are heavily concentrated in Frankfurt, Amsterdam, and Dublin.

Furthermore, BloombergNEF estimates that annual grid investment needed to decarbonize global electricity by 2050 will double from around $300 billion in 2022 to $600 billion by 2030.2 We believe the companies best placed to capitalize on this transformation are electrical firms and utility-focused equipment manufacturers.

Innovation is also needed to enhance grid operations. Here, AI can help design more efficient systems and infrastructure.

Opportunities within data centers

Another area poised for structural growth lies with the winners in the data center sector. These facilities require advanced cooling systems and resilient power supplies to operate efficiently and reliably.

Previously, air-cooling technologies met the needs of common, lower-power-density servers. However, while AI servers remain a small part of the overall server market, they represent some 40% of the current market when adjusted for power density. As AI server adoption grows and power density increases, air-cooling technologies will no longer be up to the task. Liquid-cooling technologies are therefore gaining momentum and market share (Chart 2).

Chart 2. Increasing thermal design power of server processors

Source: Taiwan Technology, Goldman Sachs, May 2024.
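To illustrate why rising power density pushes operators towards liquid cooling, the sketch below compares rack-level heat loads for a conventional server rack and an AI accelerator rack. The wattages and the air-cooling ceiling are assumed, rounded figures for illustration only, not equipment specifications.

```python
# Illustrative rack heat loads. Wattages and the air-cooling ceiling are
# assumed, rounded figures, not equipment specifications.

AIR_COOLING_LIMIT_KW = 20  # assumed practical ceiling for air cooling per rack

racks = {
    "Conventional CPU rack": 40 * 0.4,  # e.g. 40 servers at ~0.4 kW each
    "AI accelerator rack":   8 * 10.0,  # e.g. 8 GPU nodes at ~10 kW each
}

for name, load_kw in racks.items():
    method = ("liquid cooling likely required"
              if load_kw > AIR_COOLING_LIMIT_KW
              else "air cooling sufficient")
    print(f"{name}: ~{load_kw:.0f} kW per rack -> {method}")
```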

We believe many of the future leaders in cooling technologies will come from Asia and emerging markets.

Meanwhile, the demand for resilient power supplies and the slow-moving nature of grid upgrades create potential opportunities for companies specializing in backup power generation and uninterruptible power-supply equipment. The rebuilding of industrial capacity further contributes to this trend.

A killer combination

A final area that merits attention is the concept of digital twins.3 Organizations use digital twins to enhance data-driven decision-making. For example, a heavy-industry manufacturing plant might use a digital model to monitor on-the-ground processes and predict machinery wear and tear.

A successful digital twin requires accurate, abundant, real-time data. Previously, managing this volume of data was challenging. However, advancements in AI large language models have revolutionized this process.4 These models can distill crucial information, facilitating efficient data transfer and processing for digital twins, which enhances their utility and effectiveness.
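As a minimal, conceptual sketch of the idea described above, the code below models a digital twin that ingests sensor readings from a plant asset and distills them into a short, decision-ready summary. The asset name, threshold, and readings are hypothetical; in practice the summarization step is where a large language model could work over far richer telemetry.

```python
# Conceptual sketch of a digital twin for a single machine. The asset name,
# threshold, and readings are hypothetical; in practice an LLM might generate
# the summary from far richer telemetry.

from dataclasses import dataclass, field

@dataclass
class MachineTwin:
    name: str
    vibration_limit_mm_s: float              # assumed wear threshold
    readings: list[float] = field(default_factory=list)

    def ingest(self, vibration_mm_s: float) -> None:
        """Record a real-time sensor reading from the physical machine."""
        self.readings.append(vibration_mm_s)

    def summarize(self) -> str:
        """Distill recent readings into a short, decision-ready summary."""
        if not self.readings:
            return f"{self.name}: no data yet"
        latest = self.readings[-1]
        status = ("inspect for wear" if latest > self.vibration_limit_mm_s
                  else "within normal range")
        return f"{self.name}: latest vibration {latest:.1f} mm/s -> {status}"

# Hypothetical usage
press = MachineTwin(name="Stamping press 3", vibration_limit_mm_s=7.0)
for reading in (4.2, 5.1, 7.8):
    press.ingest(reading)
print(press.summarize())  # Stamping press 3: latest vibration 7.8 mm/s -> inspect for wear
```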

As consulting firm McKinsey highlighted, “The symbiotic relationship between digital twins and Gen AI increases their combined scalability, accessibility, and affordability. This new frontier will allow innovative and dynamic organizations to improve their advantage.”5

Final thoughts

The key to successful thematic investing is understanding where value will materialize within the most potent global growth themes. It’s crucial to distinguish between growth and value creation. We focus on the latter, especially as CapEx spreads throughout the AI value chain.

AI will not be the preserve of large caps forever, with smaller players coming to the fore.

In our view, AI will not be the preserve of large caps forever, with smaller players coming to the fore. This is an opportunity for those with the capabilities to invest across the market-cap spectrum and various geographies.

1 Moore’s Law is named after Gordon Moore, the co-founder of Fairchild Semiconductor and Intel (and former CEO of the latter), who in 1965 observed that the number of transistors on computer chips doubles approximately every two years.
2 "BNEF: Global Net Zero Will Require $21 Trillion Investment In Power Grids." Smart Grid Observer, March 2023. https://www.energy-xprt.com/news/bnef-global-net-zero-will-require-21-trillion-investment-in-power-grids-1104837.
3 Digital twins refer to virtual representations of assets, people, or processes and their environments, designed to simulate strategies and optimize behaviors.
4 A large language model (LLM) is a computational model notable for its ability to achieve general-purpose language generation and other natural language processing tasks such as classification.
5 "Digital twins and generative AI: A powerful pairing." McKinsey, April 2024. https://www.mckinsey.com/capabilities/mckinsey-digital/our-insights/tech-forward/digital-twins-and-generative-ai-a-powerful-pairing.

Important information

Companies mentioned for illustrative purposes only and should not be taken as a recommendation to buy or sell any security.

Projections are offered as opinion and are not reflective of potential performance. Projections are not guaranteed and actual events or results may differ materially.

Equity stocks of small and mid-cap companies carry greater risk, and more volatility than equity stocks of larger, more established companies.

AA-130824-181810-1