According to research by Epoch AI (reported by Beating), a typical 1 GW AI data center requires $38 billion in initial capital expenditure plus $900 million in annual operating costs. Annualized across asset lifespans, total annual costs reach $8.5 billion, with server depreciation—assuming all NVIDIA GB200 NVL72 systems—accounting for $5 billion annually, or roughly 60% of total expenditure.
Energy costs, the largest operational expense, represent only $600 million per year. Sensitivity analysis shows the model is highly sensitive to hardware-lifespan assumptions: shortening IT equipment lifespan from 5 years to 3 years would raise total annual costs to $12 billion, while extending it to 7 years would reduce them to $7 billion.
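The sensitivity figures above can be reproduced with a back-of-envelope straight-line depreciation model. Note the split between implied server capex (~$25 billion, from $5 billion/year over 5 years) and all other annualized costs (~$3.5 billion, the remainder of the $8.5 billion total) is inferred from the article's numbers, not stated by Epoch AI:

```python
# Back-of-envelope sketch of the annualized cost model described above.
# Both constants are inferences from the cited figures, not Epoch AI's inputs.
SERVER_CAPEX_B = 25.0   # implied: $5B/yr depreciation over a 5-year lifespan
OTHER_ANNUAL_B = 3.5    # implied: $8.5B total minus $5B server depreciation

def total_annual_cost(server_lifespan_years: float) -> float:
    """Total annualized cost in $B under straight-line server depreciation."""
    return SERVER_CAPEX_B / server_lifespan_years + OTHER_ANNUAL_B

for years in (3, 5, 7):
    print(f"{years}-year lifespan: ${total_annual_cost(years):.1f}B/yr")
```

Running this yields roughly $11.8 billion, $8.5 billion, and $7.1 billion per year for 3-, 5-, and 7-year lifespans, in line with the $12 billion / $8.5 billion / $7 billion figures reported.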