Investment in artificial intelligence (AI) infrastructure continues its upward trend, despite growing concerns about the sustainability of data centers. According to the latest Dell'Oro Group report, total worldwide capital expenditure (CAPEX) on data centers is expected to exceed $1 trillion by 2029.

This growth is driven by the fact that large cloud providers (hyperscalers) operate on multi-year investment cycles, ensuring a continuous increase in spending. At the same time, initiatives such as the Stargate Project, with a budget of $500 billion, are broadening large-scale investment in technology infrastructure, further accelerating market growth.

Worldwide data center CAPEX is expected to exceed $1 trillion by 2029

Although AI infrastructure investments have not yet delivered the expected gains in performance and efficiency, the upward trend is a given. According to Dell'Oro Group, the multi-year investment cycles of hyperscalers, combined with ongoing technological evolution, ensure that the trend will continue.

The Dell'Oro Group report predicts that AI-driven computing will remain a key driver of data center investment, with a few trends dominating in the coming years. Accelerated servers for AI training and specialized workloads will account for almost 50% of total data center infrastructure spending, while the top cloud providers (Amazon, Google, Meta, Microsoft) will continue to dominate, with almost 50% of total spending in 2025. Finally, Tier 2 cloud providers are expected to increase their investment significantly, competing more dynamically with the larger players.

What is pushing the market higher

In detail, technological progress focuses on three main areas:

– Accelerated computing through GPUs and custom accelerators: Modern AI applications rely increasingly on specialized hardware, such as GPUs and custom AI accelerators, which improve performance and reduce energy consumption. Research and development in this area aims to create more efficient solutions, making the large-scale training and operation of large language models (LLMs) more viable (a brief mixed-precision sketch follows this list).

– Large language model optimizations (LLM optimizations): Recent innovations in AI models, such as those presented by DeepSeek, offer significant performance improvements while reducing cost and resource consumption. These improvements are at the heart of the next generation of data centers, as they allow higher efficiency without an exponential increase in energy requirements (see the quantization sketch after this list).

– Development of new rack-scale and networking infrastructure: Innovations in rack-scale architectures and data center networking are essential for improving connectivity and overall efficiency. The next generation of data centers will depend on more efficient networks that reduce latency and increase overall data processing speed (a back-of-the-envelope all-reduce sketch is included below).
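
To make the first point more concrete, here is a minimal sketch in Python, assuming the PyTorch library is installed, of how a workload is offloaded to a GPU accelerator and run in mixed precision, one of the common techniques for improving throughput per unit of energy. The model and tensor sizes are purely illustrative, not taken from the report.

```python
import torch
import torch.nn as nn

# Use the accelerator if one is present, otherwise fall back to CPU.
device = "cuda" if torch.cuda.is_available() else "cpu"

# A purely illustrative model and batch; real LLM workloads are far larger.
model = nn.Sequential(
    nn.Linear(1024, 4096),
    nn.GELU(),
    nn.Linear(4096, 1024),
).to(device)
batch = torch.randn(32, 1024, device=device)

# Mixed precision (float16 on the GPU) cuts memory traffic and arithmetic cost,
# which is one reason specialized hardware improves performance per watt.
with torch.autocast(device_type=device, dtype=torch.float16, enabled=(device == "cuda")):
    out = model(batch)

print(out.shape, out.dtype)
```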
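As a rough illustration of why LLM optimizations such as quantization reduce resource consumption, the sketch below (plain Python with NumPy; the matrix size is hypothetical) converts float32 weights to int8, shrinking the memory footprint by roughly a factor of four at the cost of a small approximation error.

```python
import numpy as np

# Hypothetical float32 weight matrix standing in for one LLM layer.
weights = np.random.randn(4096, 4096).astype(np.float32)

# Symmetric int8 quantization: store one scale factor plus 8-bit integers.
scale = np.abs(weights).max() / 127.0
q_weights = np.round(weights / scale).astype(np.int8)

# Dequantize on the fly when the layer is actually used.
dequantized = q_weights.astype(np.float32) * scale

print(f"float32 size:  {weights.nbytes / 1e6:.1f} MB")
print(f"int8 size:     {q_weights.nbytes / 1e6:.1f} MB")
print(f"max abs error: {np.abs(weights - dequantized).max():.4f}")
```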
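Finally, to show why network efficiency matters so much at rack scale, here is a back-of-the-envelope sketch of the communication time for a bandwidth-bound ring all-reduce across accelerators. The gradient size and per-link bandwidths are hypothetical values chosen only to illustrate how faster interconnects shrink synchronization delays.

```python
def ring_allreduce_seconds(num_gpus: int, payload_gb: float, link_gb_per_s: float) -> float:
    """Approximate communication time for a bandwidth-bound ring all-reduce.

    Each participant sends and receives roughly 2 * (N - 1) / N times the payload.
    """
    traffic_gb = 2 * (num_gpus - 1) / num_gpus * payload_gb
    return traffic_gb / link_gb_per_s

# Hypothetical numbers: 8 accelerators in a rack, 10 GB of gradients,
# and two candidate per-link bandwidths in GB/s.
for bandwidth in (50, 400):
    t = ring_allreduce_seconds(num_gpus=8, payload_gb=10, link_gb_per_s=bandwidth)
    print(f"{bandwidth} GB/s link -> ~{t:.2f} s per synchronization step")
```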