Fast, scalable, clean, and cheap enough

How off-grid solar microgrids can power the AI race
December 2024
By Kyle Baranko (Paces), Duncan Campbell (Scale Microgrids), Zeke Hausfather (Stripe),
James McWalter (Paces), Nan Ransohoff (Stripe)

Thank you to many for providing feedback on previous drafts: James Bradbury (Anthropic), David Robertson (Tesla), Austin Vernon (Orca), Alec Stapp (Institute for Progress), Casey Handmer (Terraform), Jesse Peltan (The Abundance Institute), Shayle Kann (EIP), Andy Lubershane (EIP), James McGinniss (David Energy), Allison Bates Wannop (DER Task Force), and Adam Forni.

Summary

Context

Key questions

What we did

Key findings

Speed
Typical time to operation
Cost
Off-Grid Data Center, LCOE vs Lifetime Renewable Percent
Scale
Feasible Land for 90/10 Renewables Scenario in Southwest US

What do hyperscalers care about?

What are the current options?

What are the current options for meeting AI energy needs?

  1. Expand the grid. The problem is that this takes a long time and is expensive.
    • Interconnecting large new sources of electricity is taking longer and longer: the median wait was 5 years for projects built in 2023.
    • The cost of delivering (not generating) power via the grid is growing meaningfully faster than inflation, rising about 65% over a ten-year period.
    • Siting large new electricity users (datacenters, reindustrialization, EV charging hubs, etc.) has also become very difficult, with operators reporting wait times of many years.²

    • This is not surprising, given that the institutions managing the power system have not had to serve new electricity demand for roughly two decades. In the face of rapidly growing demand from AI datacenters, industrial on-shoring, and electrification, the grid is short both generation and network capacity. This will likely remain the case for the foreseeable future.
  2. Restart mothballed facilities like Three Mile Island. The problem is there’s a limited number of these opportunities.
    • There are relatively few recently decommissioned nuclear power plants that could be cost-effectively and quickly repowered.
    • While repowering shuttered coal plants could be an option in some locations, both local air pollution and greenhouse gas emissions make this problematic for hyperscalers. It also likely is not the fastest option if the plant has been closed for more than a few years.
  3. Build off-grid, colocated clean-firm energy like geothermal or new nuclear facilities. These are unlikely to get built at the speed and scale necessary to meet near-term AI energy needs.
    • In the medium to long-run, clean firm power — possibly off-grid/colocated — may represent a low-cost zero-carbon approach for powering datacenters at huge scale.
    • Prior to 2030, enhanced geothermal could fulfill some of the scale needed. For example, Fervo Energy is building 400 MW of enhanced geothermal in Utah that is expected to be online by 2028.
    • Still, it is very unlikely that clean firm solutions will scale to 30 GW+ by 2030.
  4. Build new datacenters and new energy infrastructure next to existing utility-scale solar and wind. However, the opportunity here is limited, as few utilities have existing clean energy capacity that is not already spoken for.
    • There was ~250 GW of existing utility-scale solar and wind generation capacity in the US by the end of 2023, and recent announcements indicate big tech companies may be pursuing a strategy of locating new datacenters next to these assets.
    • Although exact numbers are difficult to estimate, the vast majority of these generation assets have existing offtake agreements, making financial restructuring difficult. The true quantity of viable sites is likely far lower than new datacenter capacity needs.
    • Furthermore, using energy originally intended for the general public to power AI datacenters may be met with community or even federal pushback.
  5. Power datacenters with rented, portable gas/diesel generators until permanent power can be secured. This is a reasonable stop-gap solution to get power fast, though the scalability of this strategy is currently limited.
    • This option can apply to any of the above options. For example, the xAI site in Memphis is running on rented portable gas turbines while they wait for the existing site grid infrastructure to be upgraded. There are a few challenges here:
    • (1) They have discovered that training loads require a battery buffer to maintain power quality, and few companies other than Elon Musk's have access to on-demand utility-scale battery storage equipment to pair with rental power.
    • (2) Relying on this approach for all planned US datacenters would quickly overwhelm the available fleet of large-scale rental generation.
    • (3) Most users of rental power plan to transition off it once possible, because this approach carries very high costs and reliability is generally lower than with permanent infrastructure.
    • In short, this is probably not a very generalizable approach, though for those who can do it, it can be a useful stop-gap for most of the solutions above, including off-grid solar microgrids (to be discussed shortly).
  6. Build off-grid, colocated natural gas. Many groups we spoke to consider this to be the most viable near-term option.
    • It’s often the fastest option to get a new datacenter built in areas lacking existing grid capacity.
    • However, there has been limited experience to date with large off-grid gas turbine plants, and rapidly increasing demand for gas generation may cause significant delays in acquiring some equipment; turbines in particular are reported to currently have 3+ year lead times.

Off-grid solar microgrids have been conspicuously absent from most hyperscalers’ plans, which is surprising given they are likely the only clean solution that could also achieve the scale and speed requirements described. We wanted to know whether off-grid solar microgrids could meet the needs of hyperscalers—and specifically if they could be a competitive alternative to just building more off-grid natural gas. That is the subject of the rest of the paper.

The case for off-grid solar microgrids

We’ll walk through our findings in four parts:

  1. Cost: How much would this cost?
  2. Scale: Is there enough accessible, buildable land to power the near-term AI race? Where?
  3. Speed: How fast could these be built?
  4. Climate: What would the emissions impact be?

Cost: How much would this cost?

We’ll first describe what we did, then what we found. What we did:

What we found:
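For readers who want to sanity-check cost comparisons like these, the standard LCOE arithmetic is simple: annualize capital costs with a capital recovery factor, add fixed O&M, and divide by annual energy output. The sketch below uses round-number placeholder inputs for illustration, not the paper's actual modeling assumptions.

```python
# Minimal LCOE sketch: annualized capex plus fixed O&M, divided by
# annual energy output. All inputs are illustrative placeholders.

def lcoe(capex_per_kw, fixed_om_per_kw_yr, capacity_factor,
         discount_rate=0.08, lifetime_yrs=30):
    # Capital recovery factor spreads upfront capex over the lifetime.
    crf = discount_rate / (1 - (1 + discount_rate) ** -lifetime_yrs)
    annual_cost = capex_per_kw * crf + fixed_om_per_kw_yr     # $/kW-yr
    annual_mwh_per_kw = capacity_factor * 8760 / 1000         # MWh/kW-yr
    return annual_cost / annual_mwh_per_kw                    # $/MWh

# Example: utility-scale solar at an assumed $1,100/kW capex,
# $15/kW-yr fixed O&M, and a 25% capacity factor.
print(f"solar-only LCOE: ${lcoe(1100, 15, 0.25):.0f}/MWh")
# → solar-only LCOE: $51/MWh
```

Storage and backup generation add their own annualized costs on top of this, which is why the blended system LCOE rises as the target renewable percentage increases.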

Scale: Is there enough land to power the near-term AI race? Where?

We’ll first describe what we did, then what we found. What we did:


What we found:
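To give a sense of the land areas involved, here is a rough sizing sketch for a single 100 MW off-grid datacenter at 90% renewables. The capacity factor, overbuild factor, and land-use intensity are our own round-number assumptions, not the paper's inputs.

```python
# Rough solar sizing for a constant 100 MW off-grid datacenter load.

load_mw = 100.0          # constant datacenter load
renewable_frac = 0.90    # share of energy served by solar + storage
capacity_factor = 0.25   # strong Southwest US single-axis-tracking site
overbuild = 1.4          # margin for storage losses and clipping (assumed)

# Nameplate needed so solar energy covers the renewable share of load.
solar_nameplate_mw = load_mw * renewable_frac / capacity_factor * overbuild

acres_per_mw = 5.0       # typical utility-scale solar land use (assumed)
land_acres = solar_nameplate_mw * acres_per_mw

print(f"solar nameplate: {solar_nameplate_mw:.0f} MW")
print(f"land required: {land_acres:,.0f} acres")
# → solar nameplate: 504 MW; land required: 2,520 acres
```

A few thousand acres per 100 MW of load is why the analysis focuses on the Southwest US, where large contiguous, flat, sunny parcels are most plentiful.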

Speed: How fast could these be built?

Climate: What would the emissions impact be?

If this is so great, why isn’t it happening?

There seem to be a few reasons:

  1. To some extent, cost. There is a cost premium for a 90% renewable system ($109/MWh vs. $86/MWh). While $23/MWh is a premium, the implied cost per ton of avoided emissions is around $50, within the range of what big tech companies pay for carbon mitigation today. Furthermore, the 44% renewable system is essentially at parity with gas-only and offers a valuable hedge against fuel price risk.
  2. Massive datacenters dedicated to training only are a recent phenomenon, and datacenter designers have historically been skeptical of off-grid solutions due to the perceived need to optimize for uptime reliability. As more training-only datacenters are built, hyperscalers may become more willing to explore off-grid solar microgrid solutions. Interestingly, neither the off-grid gas turbine reference case nor the Three Mile Island + delivery costs case would achieve five nines without additional redundancy.
  3. But the biggest reason may simply be inertia and the fact that this hasn’t been done before. While building off-grid solar microgrids of this magnitude would be a first, it’s very possible to do with technology that exists today, and to scale it quickly.
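The ~$50/ton figure in point 1 follows from simple arithmetic; a quick check, where the gas emissions intensity is our own round-number assumption rather than a figure from the paper:

```python
# Back-of-envelope check of the implied carbon abatement cost.
lcoe_90pct_renewable = 109.0  # $/MWh (from the paper)
lcoe_gas_only = 86.0          # $/MWh (from the paper)
premium = lcoe_90pct_renewable - lcoe_gas_only  # $/MWh

# Assumed lifecycle intensity of gas generation, including upstream
# methane (tCO2e/MWh) -- our assumption, not the paper's input.
gas_intensity = 0.51

# A 90% renewable system avoids roughly 90% of gas-only emissions.
avoided_per_mwh = 0.90 * gas_intensity  # tCO2e avoided per MWh delivered

implied_abatement_cost = premium / avoided_per_mwh
print(f"premium: ${premium:.0f}/MWh")
print(f"implied abatement cost: ${implied_abatement_cost:.0f}/tCO2e")
# → premium: $23/MWh; implied abatement cost: $50/tCO2e
```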

Off-grid solar microgrids offer a fast path to power AI datacenters at enormous scale. The tech is mature, the suitable parcels of land in the US Southwest are known, and this solution is likely faster than most, if not all, alternatives (at least at the time of writing this paper). The advantages to whoever moves on this quickly could be substantial, and the broader opportunity offers a compelling path to rapidly securing key inputs to US AI leadership.

We hope this initial open-source exploration helps drive deeper analysis, and we welcome feedback, questions, critique, and collaboration. You can reach us at feedback@offgridai.us.