Building a 1GW AI Data Center Costs $38 Billion; Server Hardware Accounts for 60% of Annual Costs


According to research by Epoch AI (reported by Beating), a typical 1GW-capacity AI data center requires $38 billion in initial capital expenditure plus $900 million in annual operating costs. Annualized across asset lifespans, total annual costs reach $8.5 billion, with server depreciation (assuming the facility is equipped entirely with NVIDIA GB200 NVL72 systems) accounting for about $5 billion annually, or roughly 60% of total annual expenditure.
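A rough back-of-envelope check shows how the reported annual figure could be assembled. The component split and lifespans below are illustrative assumptions, not values stated in the report; only the $38 billion capex, $900 million opex, roughly $5 billion server depreciation, and $8.5 billion total come from the article.

```python
# Back-of-envelope annualization of the reported figures.
# ASSUMPTIONS (not from the article): servers are straight-line
# depreciated over 5 years, and the remaining (non-server) capex is
# annualized over 5 years as well; the article only states the totals.

capex_total = 38e9         # initial capital expenditure (from article)
opex_annual = 0.9e9        # annual operating costs (from article)
server_depreciation = 5e9  # annual server depreciation (from article)

server_lifespan_years = 5                                    # assumed IT lifespan
server_capex = server_depreciation * server_lifespan_years   # ~$25B implied
other_capex = capex_total - server_capex                     # ~$13B implied

# Assumed annualization horizon for non-server capex (facility, power,
# cooling), chosen so the pieces roughly add up to the reported total.
other_lifespan_years = 5
other_annualized = other_capex / other_lifespan_years

total_annual = server_depreciation + other_annualized + opex_annual
print(f"Total annual cost: ${total_annual / 1e9:.1f}B")            # ~$8.5B
print(f"Server share: {server_depreciation / total_annual:.0%}")   # ~60%
```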

Energy, the largest operational expense, accounts for only about $600 million per year. Sensitivity analysis shows the model depends heavily on hardware lifespan assumptions: shortening the IT equipment lifespan from 5 years to 3 years would raise total annual costs to about $12 billion, while extending it to 7 years would lower them to about $7 billion.
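Under the same illustrative straight-line model used in the sketch above, varying only the IT hardware lifespan roughly reproduces the quoted sensitivity figures. The ~$25 billion server / ~$13 billion facility split remains an assumption, not a figure from the report.

```python
# Sensitivity of total annual cost to IT hardware lifespan, using the
# same illustrative split as above (assumed, not from the report):
# ~$25B of server hardware, ~$13B of other capex annualized over 5 years,
# plus $0.9B/year of operating costs.

SERVER_CAPEX = 25e9
OTHER_ANNUALIZED = 13e9 / 5
OPEX_ANNUAL = 0.9e9

def total_annual_cost(it_lifespan_years: float) -> float:
    """Annual cost with straight-line server depreciation over the given lifespan."""
    return SERVER_CAPEX / it_lifespan_years + OTHER_ANNUALIZED + OPEX_ANNUAL

for years in (3, 5, 7):
    print(f"{years}-year IT lifespan: ${total_annual_cost(years) / 1e9:.1f}B per year")
# 3 years: ~$12B, 5 years: ~$8.5B, 7 years: ~$7B (matches the quoted sensitivity)
```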
