Tech & Sourcing @ Morgan Lewis


Worldwide IT spending is forecast to total more than $5 trillion in 2024, with 10% year-on-year growth in spending on data center systems, according to recent analysis from Gartner. The increasing adoption of artificial intelligence (AI) solutions is driving demand for technology infrastructure to meet greater data storage and network requirements and more compute-intensive workloads.

As companies ramp up their investments in an AI-powered future, some commentators note parallels to the success of chipmakers, network providers, and software providers arising from spending patterns during the growth of the internet in the 1990s. Today, hyperscale cloud computing is concentrated in a few providers that are spending big to meet future infrastructure demands, including, according to Gartner, planning for the eventual execution of generative AI in 2025.

Research from Synergy Research Group indicates that the number of large data centers operated by hyperscale providers surpassed 1,000 in early 2024 and is forecast to double in the next four years, while the three largest hyperscalers reportedly account for 60% of the aggregate global hyperscale data center capacity. Global spending on the construction of data centers is forecast to reach at least $49 billion by 2030. The United States reportedly accounts for 51% of worldwide data center capacity, although significant investment in new data center capacity is anticipated over the coming years in the Asia-Pacific and Middle East regions.

Other factors driving increased spending include investment in AI-specific computer chips, an area of intense competition in pursuit of, among other goals, better performance and efficiency and greater control over costs. Network infrastructure for securely connecting AI clusters and moving data between cloud systems is also attracting increased spending.

On the flip side, rising infrastructure costs are raising barriers for AI startups that, for some, may prove insurmountable against the competition despite large fundraising rounds. This could lead to further parallels being drawn to the "dot-com boom."

The various elements of infrastructure required by AI systems are summarized well by Computer Weekly: “AI workloads are not only more compute intensive—they also place huge demands on data, storage and network infrastructure when training AI models, performing inferencing tasks [i.e., processing data through an AI system to generate conclusions], as well as securing and governing the use of data to comply with data residency and data sovereignty requirements.”

Each of these elements is driving increased spending within an intensely competitive environment as providers race to stay at the forefront of building cutting-edge AI models and to meet future user demand.