What's Behind the Rise of Data Centers in North America?
Data centers are the backbone of our interconnected world, powering everything from streaming services to network infrastructure. As more of our lives move online, the demand for data storage, processing, and rapid access is skyrocketing, and a wave of data center construction is spreading across North America. This surge isn’t just a trend. It’s a critical evolution, driven in part by advancements in artificial intelligence (AI), shifting real estate dynamics, and the need for innovative cooling strategies.
The AI Revolution
AI has transitioned from a futuristic concept to a transformative force in today’s tech landscape. AI applications are everywhere: from chatbots and virtual assistants to autonomous vehicles and complex machine learning algorithms, AI is reshaping the digital landscape and expanding its footprint. The AI frenzy has brought with it an insatiable appetite for computing power, which, in turn, is fueling the expansion of data centers.
As AI technologies evolve, they require ever more powerful hardware, including advanced Graphics Processing Units (GPUs) and specialized processors. These components generate significant amounts of data that need to be stored, processed, and accessed at lightning speed, driving the need to build the infrastructure necessary to train and deploy generative AI (GenAI) models. Investment in these data centers is predicted to double to $2 trillion over the next five years.
With AI’s increasing need for power comes a higher demand for electricity. One study estimates that generating a single image with a GenAI model can consume as much energy as half a smartphone charge. A study released by EPRI projects that data centers could consume up to 9% of U.S. electricity generation by 2030, more than double the share they use today.
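To put those figures in perspective, here is a rough, back-of-the-envelope sketch in Python. The battery capacity, per-image energy, and daily image volume are illustrative assumptions, not numbers taken from the studies cited above.

```python
# Rough back-of-the-envelope estimate of GenAI image-generation energy.
# All figures below are illustrative assumptions, not measured values.

SMARTPHONE_BATTERY_WH = 15.0                       # assume a ~15 Wh smartphone battery
ENERGY_PER_IMAGE_WH = SMARTPHONE_BATTERY_WH / 2    # "half a smartphone charge" per image

IMAGES_PER_DAY = 1_000_000                         # hypothetical daily volume for one service

daily_kwh = ENERGY_PER_IMAGE_WH * IMAGES_PER_DAY / 1000.0
annual_mwh = daily_kwh * 365 / 1000.0

print(f"Energy per image:   {ENERGY_PER_IMAGE_WH:.1f} Wh")
print(f"Daily consumption:  {daily_kwh:,.0f} kWh for {IMAGES_PER_DAY:,} images")
print(f"Annual consumption: {annual_mwh:,.0f} MWh")
```

Even with these modest assumptions, a single image-generation service lands in the thousands of megawatt-hours per year, which helps explain why electricity supply has become a gating factor for new builds.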
Moreover, AI-driven workloads are unique because they often involve processing massive datasets in parallel, which can strain traditional data center infrastructures. To accommodate this, data centers are increasingly adopting high-density configurations, where servers are packed more tightly together. This shift towards high-density setups is a key reason behind the surge in data center construction and expansion, as older facilities are either upgraded or replaced by new, purpose-built centers designed to handle the increasing demands of AI.
Shifting to Smaller Markets
As the demand for data centers grows, so too does the competition for suitable locations. Traditionally, data centers have been clustered in major markets like Silicon Valley, Northern Virginia, and Dallas. However, these areas are now facing challenges, particularly regarding power availability. Data centers are incredibly power-hungry, and in some of the most established markets, the available power capacity is becoming a limiting factor.
Global data center trends in 2024 indicate that sourcing power is a top priority for operators across all regions (North America, Europe, Latin America, and Asia-Pacific). The pressure of a worldwide power shortage is constraining the global data center market’s growth. This has led to a shift in focus toward smaller, emerging markets that can offer the power needed to support large-scale data center operations. Areas like Phoenix, Atlanta, and Salt Lake City are seeing a boom in data center construction, driven by their ability to provide reliable power and their relative proximity to major tech hubs.
Secondary markets with ample power will likely continue to attract data center investment, especially since proximity to an AI application’s servers matters for responsiveness. New studies are examining the benefits of co-locating data centers at nuclear power plants in competitive electricity markets. In this scenario, the data center draws power directly from the plant rather than through the grid, which arguably avoids taking capacity away from the grid or shifting costs from the data center owner to other power users.
But location and proximity are not the only emerging factors. Investors and developers should be mindful of how the environment affects data center operations, specifically cooling. Extreme temperatures, for example, can make it more difficult and costly to cool a data center, which is critical to maintaining the performance and longevity of the hardware. The growth of AI will only increase those energy demands.
And for businesses that rely on those centers to be up and running 24/7, any downtime caused by a heat-related outage can be costly and damaging to their reputation. While some companies have gone so far as to explore putting data centers in outer space, a more practical approach might be co-location or a similar cloud-based model that provides scalability, advanced security, and adequate disaster recovery.
The Cooling Challenge
As data centers grow in number and size, effective cooling strategies are critical. The heat generated by modern data centers, especially those housing high-density AI workloads, creates a considerable challenge. Traditional cooling methods, which may still rely on air conditioning systems, are becoming obsolete as they struggle to keep pace with the increased heat output.
A significant development in accelerated AI computing is the arrival of NVIDIA’s Blackwell architecture, which boasts hefty performance capacity and improved efficiency. However, this evolution comes with major infrastructure requirements. NVIDIA Blackwell configurations are anticipated to require between 60 kW and 120 kW per rack, a substantial increase over traditional power densities, which typically range between 7 kW and 15 kW per rack. The growing demand for high-density servers could mean redesigning systems (or even rebuilding) to accommodate AI-ready racks.
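To see why that jump matters, consider a short sketch of how rack density translates into rack count and heat load for a fixed facility power budget. Only the per-rack ranges come from the paragraph above; the 10 MW facility budget and the two representative densities are assumed for illustration.

```python
# Illustrative comparison of rack counts and heat load at different power
# densities. The 10 MW facility budget is an assumed example figure.

FACILITY_IT_POWER_KW = 10_000          # assume 10 MW of IT power available

def racks_supported(kw_per_rack: float) -> int:
    """Number of racks a fixed IT power budget can feed at a given density."""
    return int(FACILITY_IT_POWER_KW // kw_per_rack)

for label, kw_per_rack in [("Traditional rack (~10 kW)", 10.0),
                           ("High-density AI rack (~90 kW)", 90.0)]:
    racks = racks_supported(kw_per_rack)
    # Essentially all IT power ends up as heat the cooling plant must remove;
    # 1 kW is roughly 3,412 BTU/hr.
    btu_per_hr = kw_per_rack * 3412
    print(f"{label}: {racks} racks, ~{btu_per_hr:,.0f} BTU/hr of heat per rack")
```

The same power budget feeds far fewer racks at AI densities, and each of those racks rejects several times more heat, which is why air cooling alone often stops being viable.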
For data center operators, utilizing emerging liquid cooling techniques is one viable way to beat the heat. With this solution, coolant is circulated through servers to directly absorb heat, which is a more efficient process than traditional air cooling. Liquid cooling is becoming an increasingly popular choice, particularly in data centers that support AI processing.
Another trend is the use of ambient cooling techniques, which leverage the natural climate of a location to cool data centers. This is particularly effective in regions with cooler climates, where outside air can be used to reduce the need for energy-intensive air conditioning. This approach not only reduces energy consumption but also lowers operating costs, making it an attractive option for data center operators.
Moreover, innovations in heat reuse are gaining traction. In some cases, the excess heat generated by data centers is being repurposed for other uses, such as heating nearby buildings or powering industrial processes. This not only improves the overall energy efficiency of the data center but also contributes to sustainability efforts, a growing concern as the world becomes more aware of the environmental impact of digital infrastructure.
The rise of data centers has also led to increased interest in hybrid cooling systems that combine multiple cooling methods to optimize performance. For example, a data center might use liquid cooling for the most heat-intensive components, while employing ambient cooling for the rest of the facility. This flexible approach allows data centers to adapt to varying workloads and environmental conditions, ensuring that they can operate efficiently under a wide range of scenarios.
The rise in power density also heightens the importance of advanced monitoring and control systems. These systems provide real-time data on temperature, humidity, and airflow within the data center, allowing operators to fine-tune their cooling strategies and respond quickly to any issues that arise. As the demands on data centers continue to grow, these advanced systems will play a crucial role in maintaining the reliability and efficiency of digital infrastructure.
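As a minimal illustration of the kind of logic such a monitoring system runs, here is a short Python sketch that checks simulated sensor readings against alert thresholds. The sensor names, thresholds, and readings are hypothetical; a production system would pull live telemetry from the facility’s monitoring or building-management platform.

```python
# Minimal sketch of threshold-based monitoring for a data center hall.
# Sensor readings and thresholds below are hypothetical examples.

from dataclasses import dataclass

@dataclass
class Reading:
    sensor: str
    temperature_c: float     # inlet air temperature
    humidity_pct: float      # relative humidity
    airflow_cfm: float       # airflow at the rack face

# Example alert thresholds (assumed for illustration)
MAX_TEMP_C = 27.0
HUMIDITY_RANGE = (20.0, 80.0)
MIN_AIRFLOW_CFM = 500.0

def check(reading: Reading) -> list[str]:
    """Return a list of human-readable alerts for one sensor reading."""
    alerts = []
    if reading.temperature_c > MAX_TEMP_C:
        alerts.append(f"{reading.sensor}: inlet temp {reading.temperature_c} C exceeds {MAX_TEMP_C} C")
    if not (HUMIDITY_RANGE[0] <= reading.humidity_pct <= HUMIDITY_RANGE[1]):
        alerts.append(f"{reading.sensor}: humidity {reading.humidity_pct}% outside {HUMIDITY_RANGE}")
    if reading.airflow_cfm < MIN_AIRFLOW_CFM:
        alerts.append(f"{reading.sensor}: airflow {reading.airflow_cfm} CFM below {MIN_AIRFLOW_CFM} CFM")
    return alerts

# Simulated readings for two rack-inlet sensors
for r in [Reading("rack-A12", 24.5, 45.0, 620.0),
          Reading("rack-B07", 29.1, 38.0, 430.0)]:
    for alert in check(r) or [f"{r.sensor}: all readings nominal"]:
        print(alert)
```

In practice this kind of rule-based check is the simplest tier; operators typically layer trend analysis and automated control of cooling equipment on top of it.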
A Future Defined by Growth
The rise of data centers in North America is a testament to the ever-growing role that digital technologies play in our lives. AI advancements are projected to significantly drive future data center demand. High-performance computing will require rapid expansion and innovation in data center design and technology to manage rising power density needs. This growth is reshaping the landscape of the tech industry, creating new opportunities and challenges alike.
As we look to the future, it’s clear that data centers will continue to be a critical component of our digital infrastructure. Whether it’s powering the next generation of AI applications, supporting the growth of emerging tech markets, or pioneering new cooling technologies, data centers are at the heart of our increasingly connected world. As demand grows, so too will innovation in how data centers are built, maintained, and cooled, across the nation and around the world.