Saturday, February 7, 2026

Space data centers: AI’s next frontier explained

Terrestrial data centers are so 2025. We’re taking our large-scale compute infrastructure into orbit, baby! Or at least, that’s what Big Tech is yelling from the rooftops at the moment. It’s quite a bonkers idea that’s hoovering up money and mindshare, so let’s unpack what it’s all about – and whether it’s even grounded in reality.

Let’s start with the basics. You might already know that a data center is essentially a large warehouse filled with thousands of servers that run 24/7.

AI companies like Anthropic, OpenAI, and Google use data centers in two main ways:

  • Training AI models – This is incredibly compute-intensive. Training a model like the ones powering OpenAI’s ChatGPT or Anthropic’s Claude requires running calculations across thousands of specialized chips (GPUs) simultaneously for weeks or months.
  • Running AI services – When you converse with those models’ chatbots, your messages go to a data center where servers process them and send back the model’s response. Multiply that by millions of users having conversations simultaneously, and you need enormous computing power ready on demand.

AI companies need data centers because they provide the coordinated power of thousands of machines working in tandem on these functions, plus the infrastructure to keep them running reliably around the clock.

To that end, these facilities are always online with ultra-fast internet connections, and they have vast cooling systems to keep those servers running at peak performance levels. All this requires a lot of power, which puts a strain on the grid and squeezes local resources.

So what’s this noise about data centers in space? The idea’s been bandied about for a while now as a vastly better alternative that can harness infinitely abundant solar energy and radiative cooling hundreds of miles above the ground in low Earth orbit.

Powerful GPU-equipped servers would be housed in satellites, which would fly together in constellations, beaming data back and forth as they circle the Earth from pole to pole in a sun-synchronous orbit.

The thinking behind space data centers is that they’d let operators scale up compute resources far more easily than on Earth, free from terrestrial constraints on readily available power, real estate, and the fresh water needed for cooling.
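For a rough sense of why orbit is so tempting on the power front, here’s a back-of-envelope sketch. The solar constant is a measured physical value; the ground-side average and the orbital duty cycle are illustrative assumptions, not figures from any of these companies.

```python
# Back-of-envelope comparison of solar energy available in orbit vs. on the
# ground. The solar constant is real physics; the other numbers are assumed.

SOLAR_CONSTANT = 1361.0          # W/m^2 above the atmosphere (measured)

# A dawn-dusk sun-synchronous orbit can keep panels in sunlight almost
# continuously, so assume ~99% illumination.
orbit_duty_cycle = 0.99

# A good ground site averages roughly 20-25% of peak irradiance once night,
# weather, and sun angle are factored in (assumed ballpark; varies by site).
ground_avg_irradiance = 250.0    # W/m^2, assumed

orbit_avg = SOLAR_CONSTANT * orbit_duty_cycle
print(f"orbit:  ~{orbit_avg:.0f} W/m^2 average")
print(f"ground: ~{ground_avg_irradiance:.0f} W/m^2 average")
print(f"ratio:  ~{orbit_avg / ground_avg_irradiance:.1f}x")
```

Under these assumptions, a square meter of panel in the right orbit collects around five times the energy of the same panel on the ground – which is the core of the pitch.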

Starcloud has already begun running and training a large language model in space, so it can speak Shakespearean English

There are a number of firms getting in on the action, including big familiar names and plucky upstarts. You’ve got Google partnering with Earth monitoring company Planet on Project Suncatcher to launch a couple of prototype satellites by next year. Aetherflux, a startup that was initially all about beaming down solar power from space, now intends to make a data center node in orbit available for commercial use early next year. Nvidia-backed Starcloud, which is focused exclusively on space-based data centers, sent a GPU payload into space last November, and trained and ran a large language model on it.

The latest to join the fold is SpaceX, which is set to merge with Elon Musk’s AI company xAI in a purported US$1.25-trillion deal with a view to ushering in the era of orbital data centers.

According to Musk’s calculations, both the launch cadence and the number of data center satellites each rocket can carry can be scaled up dramatically. “There is a path to launching 1 TW/year (1 terawatt of compute power per year) from Earth,” he noted in a memo, adding that AI compute resources will be cheaper to generate in space than on the ground within three years.
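That 1 TW/year figure can be sanity-checked with some rough arithmetic. Everything below is an assumption for illustration – neither SpaceX nor xAI has published per-satellite specs – but it gives a feel for the launch cadence such a target implies.

```python
# Rough sanity check of what "1 TW of compute launched per year" implies.
# All per-launch figures are assumptions for illustration only.

target_power_w = 1e12            # 1 TW per year (from Musk's memo)

payload_per_launch_kg = 100_000  # assumed Starship-class payload, ~100 t
watts_per_kg = 1_000             # assumed specific power of a compute
                                 # satellite (panels + radiators + GPUs)

power_per_launch_w = payload_per_launch_kg * watts_per_kg   # 100 MW
launches_per_year = target_power_w / power_per_launch_w

print(f"compute per launch: ~{power_per_launch_w / 1e6:.0f} MW")
print(f"launches needed:    ~{launches_per_year:.0f} per year")
print(f"cadence:            ~{launches_per_year / 365:.1f} per day")
```

Even granting a generous 100 MW of compute per launch, you’d need on the order of 10,000 launches a year – dozens per day – which shows why the whole plan hinges on a radically cheaper, faster launch pipeline.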

As Elon Musk’s SpaceX and xAI are set to merge, he’s keen on increasing the number of satellite-carrying rocket launches per year to serve the need for space data centers

In an excellent article in The Verge from last December, Elissa Welle laid out the numerous challenges these orbital data centers will have to overcome in order to operate as advertised. For starters, they’d have to safely navigate the estimated 6,600 tons of space debris floating around in orbit, as well as the 14,000-plus active satellites already up there. Dodging all of that will require fuel.

You’ve also got to dissipate heat from the space-based data centers, and send astronauts up to maintain them periodically. And that’s to say nothing of how these satellites could affect the work of astronomers or add to light pollution.
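The heat problem deserves a number. In a vacuum there’s no air or water to carry heat away – the only option is radiating it, governed by the Stefan-Boltzmann law. The law itself is standard physics; the 1 MW load and radiator temperature below are assumed for illustration.

```python
# How much radiator area does an orbital data center need? In vacuum the
# only way to shed heat is radiation, per the Stefan-Boltzmann law:
#   P = emissivity * sigma * area * T^4
# The 1 MW load and 300 K radiator temperature are assumptions.

SIGMA = 5.67e-8        # Stefan-Boltzmann constant, W/(m^2 K^4)

heat_load_w = 1e6      # assume a 1 MW compute payload (all power ends as heat)
emissivity = 0.9       # typical value for radiator coatings
radiator_temp_k = 300  # assumed radiator surface temperature (~27 C)

flux = emissivity * SIGMA * radiator_temp_k**4   # W/m^2 radiated
area_m2 = heat_load_w / flux

print(f"radiated flux: ~{flux:.0f} W/m^2")
print(f"radiator area: ~{area_m2:.0f} m^2 for 1 MW")
```

Under these assumptions a single megawatt of compute needs roughly 2,400 square meters of radiator – about half a football field – and that simple sketch ignores sunlight absorbed by the radiators, which makes the real problem harder still.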

Ultimately, there’s a lot of experimentation and learning to be gleaned from these early efforts to build out compute resources in space before any company or national agency can realistically scale them up.

And while it might eventually become possible to do so despite substantial difficulties, it’s worth asking ourselves whether AI is actually on track to benefit humanity in all the ways we’ve been promised, and whether we need to continually build out infrastructure for it – whether on the ground or way up beyond the atmosphere.
