According to ExtremeTech, Google researchers have proposed building a machine-learning system in space using networks of satellites in low Earth orbit. The system would use Tensor Processing Units (TPUs) powered by solar arrays and could see testing as early as 2027. Google estimates launch costs could drop to around $200 per kilogram by the mid-2030s, making the project more feasible. The plan involves an 81-satellite cluster flying in a formation with a 1 km radius, with TPUs radiation-tested to survive about five years in space. This comes as Google’s upcoming Ironwood TPU can scale to pods of 9,216 chips, positioning the company as serious competition to Nvidia in AI hardware.
Why even consider space for AI?
Here’s the thing – Google’s Earth-based data centers are massive energy hogs and take up enormous physical space. By moving processing to orbit, the company addresses both problems at once: near-continuous solar power and no real estate footprint. But is it really worth launching expensive hardware into a radiation-filled environment where it might only last five years?
Basically, Google’s betting that the math works out long-term. If launch costs keep dropping and it can treat these satellites as disposable computing units, it becomes a numbers game. The company has already proven its TPUs can compete – remember when Apple revealed it had trained its AI models on Google’s chips instead of Nvidia’s? That’s a pretty big endorsement.
The radiation problem isn’t small
Space is brutal on electronics. The researchers acknowledge their Trillium TPUs will experience “bit-flip errors” from radiation – cosmic rays and charged particles flipping stored bits (a 1 to a 0, or vice versa) in memory. They claim the chips can handle five years of this, but that’s still a relatively short lifespan for hardware that costs millions to launch.
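To make the bit-flip problem concrete, here’s a minimal Python sketch (not Google’s actual error-handling scheme) that simulates a single-event upset and shows why even a simple parity bit is enough to detect – though not correct – a single flipped bit:

```python
import random

def flip_random_bit(word: int, width: int = 16) -> int:
    """Simulate a single-event upset: XOR one randomly chosen bit of a word."""
    return word ^ (1 << random.randrange(width))

def parity(word: int) -> int:
    """Even-parity bit: 1 if the word has an odd number of 1-bits."""
    return bin(word).count("1") % 2

# Store a value alongside its parity bit as minimal error-detection metadata.
value = 0b1011_0010_1100_0001
stored_parity = parity(value)

# A cosmic-ray strike flips one bit in memory.
corrupted = flip_random_bit(value)

# Any single-bit flip changes the 1-bit count by exactly one, so the parity
# always changes -- the error is detectable. A bare parity bit cannot say
# *which* bit flipped; real ECC memory uses Hamming-style SECDED codes to
# correct single-bit errors and detect double-bit errors.
print(f"original:  {value:016b}")
print(f"corrupted: {corrupted:016b}")
print(f"detected:  {parity(corrupted) != stored_parity}")
```

Radiation-hardened systems layer stronger versions of this idea (ECC, scrubbing, checkpoint/retry), which is presumably what lets Google claim a five-year useful life despite constant low-level corruption.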
And let’s talk about that $200 per kilogram launch cost target. We’re talking mid-2030s for that price point, which means this is very much a long-term play. Current costs are significantly higher, making today’s launches prohibitively expensive for disposable computing hardware.
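A back-of-envelope amortization shows why the cost curve matters. In the sketch below, only the $200/kg target and the roughly five-year hardware lifespan come from the article; the current launch price and satellite mass are illustrative assumptions:

```python
# Illustrative assumptions -- not figures from the article:
COST_TODAY_PER_KG = 2_000    # rough order of magnitude for current LEO launches
SATELLITE_MASS_KG = 500      # assumed mass of one TPU satellite

# From the article:
COST_TARGET_PER_KG = 200     # Google's mid-2030s launch-cost target
LIFESPAN_YEARS = 5           # radiation-tested TPU lifetime

def launch_cost_per_year(cost_per_kg: float, mass_kg: float, years: float) -> float:
    """Launch cost for one satellite, amortized over its working life."""
    return cost_per_kg * mass_kg / years

today = launch_cost_per_year(COST_TODAY_PER_KG, SATELLITE_MASS_KG, LIFESPAN_YEARS)
target = launch_cost_per_year(COST_TARGET_PER_KG, SATELLITE_MASS_KG, LIFESPAN_YEARS)

print(f"amortized launch cost per satellite-year today:  ${today:,.0f}")
print(f"amortized launch cost per satellite-year target: ${target:,.0f}")
```

Under these assumed numbers the target price cuts the amortized launch cost tenfold, which is the difference between a science project and a business case for “disposable” orbital compute.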
What this means for everyone else
If Google pulls this off, it could completely change how we think about scaling compute-intensive applications. Enterprises running massive AI workloads might eventually rent space on orbital compute clusters rather than building their own data centers. The environmental angle is compelling too – solar-powered AI has a much smaller carbon footprint than grid-powered alternatives.
For hardware suppliers, this represents another frontier. Companies that build ruggedized computing equipment for extreme environments would be natural partners here.
Is this actually going to happen?
Project Suncatcher lives in Google’s X division, known as the Moonshot Factory. That’s where the company parks its craziest ideas – self-driving cars, internet-beaming lasers, and now space-based AI. The name fits perfectly.
Realistically? 2027 testing seems aggressive given the technical and regulatory hurdles. But Google has the resources to push boundaries, and the potential payoff is enormous. If they can make orbital AI processing work, they’ll have essentially unlimited, clean-powered compute capacity without the NIMBY problems of building more Earth-based data centers. That’s worth chasing, even if it sounds like science fiction today.
