As someone with a similar background to the writer of this post (I did avionics work for NASA before moving into more “traditional” software engineering), this post does a great job of summing up my thoughts on why space-based data centers won’t work. The SEU issues were my first thought, followed by the thermal concerns, and both are addressed here fantastically.
On the SEU issue I’ll add in that even in LEO you can still get SEUs - the ISS is in LEO and gets SEUs on occasion. There’s also the South Atlantic Anomaly where spacecraft in LEO see a higher number of SEUs.
> On the SEU issue I’ll add in that even in LEO you can still get SEUs
As a sibling post noted, SEUs are possible all the way down to sea level. The recent Airbus mass intervention was essentially a fix for a badly handled SEU in a corner case.
Single event upsets are already commonplace at sea level, even at scales well below a data center.
The section of the article that talks about them isn’t great. At least for FPGAs, the state of the art is to run 2-3 copies of the logic, and detect output discrepancies before they can create side effects.
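For the curious, the idea is just a majority vote across redundant copies. A minimal software sketch in Python (compute() is a hypothetical stand-in for the replicated logic; real TMR is built into the FPGA fabric itself):

    def compute(x):
        # Hypothetical stand-in for the replicated logic; on an FPGA this
        # would be three physical copies of the same circuit.
        return x * x

    def tmr_vote(a, b, c):
        # Majority vote: two agreeing copies mask a single upset copy.
        if a == b or a == c:
            return a
        if b == c:
            return b
        # All three disagree: more than one upset, or a common-mode fault.
        raise RuntimeError("no majority; re-run the computation")

    result = tmr_vote(compute(3), compute(3), compute(3))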
I guess you could build a GPU that way, but it’d have 1/3 the parallelism of a normal one for the same die size and power budget. The article says it’d be a 2-3 order-of-magnitude loss.
It strikes me that neural network inference loads are probably pretty resilient to these kinds of problems (as we see the bits per activation steadily decreasing), and where they aren't, you can inject these faults as augmentations at training time, where they essentially act as regularization.
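A rough sketch of what that training-time augmentation could look like, assuming int8 activations; the flip probability here is invented for illustration:

    import numpy as np

    rng = np.random.default_rng(0)

    def inject_bit_flips(acts_int8, flip_prob=1e-4):
        # Simulate SEUs by flipping one random bit in a random subset of
        # int8 activations; applied during training, this acts like
        # injected noise, i.e. a regularizer.
        flat = acts_int8.copy().reshape(-1).view(np.uint8)
        hits = np.flatnonzero(rng.random(flat.size) < flip_prob)
        masks = (1 << rng.integers(0, 8, size=hits.size)).astype(np.uint8)
        flat[hits] ^= masks
        return flat.view(np.int8).reshape(acts_int8.shape)

    acts = rng.integers(-128, 128, size=(4, 256), dtype=np.int8)
    noisy = inject_bit_flips(acts)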
The only advantage I can come up with is the background temperature being much colder than Earth's surface. If you ignored the capex to get this launched and running in orbit, could the cooling cost be smaller? Maybe that's the gimmick being used to sell the idea. "Yes it costs more upfront but then the 40% cooling bill goes away... breakeven in X years"
Strictly speaking, the thermosphere is actually much warmer than the atmosphere we experience--on the order of hundreds or even a thousand degrees Celsius, if you're measuring by temperature (the average kinetic energy of molecules). However, particle density is so low that the total heat content of the thermosphere is tiny. And because there are so few particles, conduction and convection are essentially nonexistent, which means cooling has to rely entirely on radiation, which is much less efficient at shedding heat than the other modes.
In other words, a) background temperature (to the extent it's even meaningful) is much warmer than Earth's surface and b) cooling is much, much more difficult than on Earth.
Technically radiation cooling is 100% efficient. And remarkably effective: you can cool an inert object to the temperature of the CMB (~2.7 K) without doing anything at all. However, it is rather slow, and it works best if there are no nearby planets or stars.
Fun fact though: make your radiator hotter and you can dump just as much if not more energy than you would typically via convective cooling. At 1400°C (just below the melting point of steel) you can shed 450 kW of heat per square meter; all you need is a really fancy heat pump!
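That figure checks out against the Stefan-Boltzmann law; a quick sanity check, assuming an ideal emissivity of 1 and a ~0 K background:

    # Stefan-Boltzmann: P/A = emissivity * sigma * T^4
    sigma = 5.670e-8          # W / (m^2 * K^4)
    T = 1400 + 273.15         # radiator temperature, kelvin
    print(sigma * T**4)       # ~4.4e5 W/m^2, i.e. ~440 kW per square meter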
I don't have firm numbers for you since it would depend on environmental conditions. As an educated guess though, I would say a fucking shit ton. You wouldn't want to be anywhere near the damn thing.
A car's "radiator" doesn't actually lose heat by radiation though. It conducts heat to the air rushing through it. That's absolutely nothing like a radiator in a vacuum.
Is it an advantage, though? One of the main objections in the article is exactly that.
There's no atmosphere that helps with heat loss through convection, and there's nowhere to shed heat through conduction; all you have is radiation. It is a serious engineering challenge for spacecraft to get rid of the little heat they generate and to avoid being overheated by the sun.
A typical CPU heatsink dissipates 10-30% of its heat through radiation, and the rest through convection. In space you're in a vacuum, so you can't dissipate heat through convection.
You need to rework your physical equipment quite substantially to make up for the fact that you can't shed 70-90% of the heat in the same manner as you can down here on Earth.
But the cooling cost wouldn’t be smaller. There’s no good way to dump the waste heat into space. It’s actually far, far harder to radiate the waste heat into space directly than it would be to get rid of it on Earth.
I don't know about that. Look at where the power goes in a typical data center: for a 10 MW DC you might spend 2 MW just to blow air around. A radiating cooler in space would almost eliminate that. The problem is the initial investment is probably impractical.
More than 99.999% of the power put into compute turns into heat, so you're going to need to reject 8 MW of power into space with pure radiation. The ISS EATCS radiators reject 0.07 MW of power across 85 sq. m, so you're talking about 9700 sq. m of radiators, or bigger than a football field/pitch.
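The arithmetic, scaling the ISS numbers linearly (figures as above):

    # Scale ISS EATCS performance (~70 kW rejected over ~85 m^2)
    # up to an 8 MW heat load.
    iss_kw, iss_m2 = 70.0, 85.0
    load_kw = 8000.0
    print(load_kw / (iss_kw / iss_m2))   # ~9700 m^2 of radiator area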
Pardon, but the question of "could the operational cost be smaller in space" is barely touched in the article. The article mostly argues that designing thermal management systems for space applications is hard, and that the radiators required would be big, which speaks to the upfront investment cost, not ongoing opex.
Ok, sure, technically. To be fair you can't really assess the opex of technology that doesn't exist yet, but I find it hard to believe that operating brand new, huge machines that have to move fluid around (and not nice fluids either) will ever be less than it is on the surface. Better hope you never get a coolant leak. Heck, it might even be that opex=0 still isn't enough to offset the "capex". Space is already hard when you're not trying to launch record-breaking structures.
Even optimistically, capex goes up by a lot to reduce opex, which means you need a really really long breakeven time, which means a long time where nothing breaks. How many months of reduced electricity costs is wiped out if you have to send a tech to orbit?
Oh, and don't forget the radiation slowly destroying all your transistors. Does that count as opex? Can you break even before your customers start complaining about corruption?
Maintenance will be impossible or at least prohibitively expensive. Which means your only opex is ground support. But it also means your capex depreciates over whatever lifetime these things will have with zero repairs or preventive maintenance.
But ground support will not be cheap. You need to transfer a huge amount of data, which means you need to run and maintain a network of ground stations. And satellite operations are not as cheap as people like to think either.
Things on Earth also have access to that coldness for about half of each day. How many data centers use radiative cooling into the night sky to supplement their regular cooling? The fact that the answer is “zero” should tell you all you need to know about how useful this is.
The atmosphere is in the way even at night, and it re-radiates the energy: the effective background temperature is the temperature of the air, not to mention it would only work at night. I think there would need to be something like 50 acres of radiators for a 50 MW datacenter radiating at 60°C against a 30°C effective sky. This would be a lot smaller in space due to the bigger temperature delta. Either way, opex would be much, much less than an average Earth DC (PUE of almost 1 instead of a run-of-the-mill 1.5, or as low as 1.1 for hyperscalers). But yeah, the upfront cost would be immense.
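Rough numbers behind that estimate, assuming an ideal emissivity of 1 and treating the night sky as a 30°C black body:

    # Net radiative flux from a 60°C radiator against a 30°C effective sky
    sigma = 5.670e-8                      # W / (m^2 * K^4)
    t_rad, t_sky = 60 + 273.15, 30 + 273.15
    q = sigma * (t_rad**4 - t_sky**4)     # ~220 W/m^2 net
    print((50e6 / q) / 4047)              # 50 MW load -> ~56 acres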
I think you’re ignoring a huge factor in how radiative cooling actually works. I thought the initial question was fine if you hadn’t read the article, but I understand the downvotes due to the doubling down. Think of it this way: why do thermoses have a vacuum-sealed chamber between two walls in order to insulate the contents of the bottle? Because a vacuum is a fucking terrible heat conductor. Putting your data center into space in order to cool it is like putting a computer inside of a thermos to cool it. It makes zero fucking sense. Radiation alone can’t carry the heat away anywhere near as fast as convection would, so it mostly stays inside.
> A 1 m^2 radiator in space can eliminate almost a kilowatt of heat.
Assuming that this is the right order of magnitude, the 8 MW datacenter discussed upthread would require ~8000 m^2, plus a fancy way of getting the heat there.
A kilowatt is nothing. The workstation on my desk can sustain 1 kW.
Look up Tech Ingredients episode on Radiative Paint.
The fact that people aren’t using something isn’t evidence that it’s not possible or even a great idea; it could be that a practical application didn’t exist before, or that someone enterprising enough hasn’t come along yet.
When something has been known for millennia and hasn’t been put to a particular use even after decades where it could have been used, that is pretty good evidence that this use isn’t a good idea. Especially when it’s something really simple.
Radiative cooling is great for achieving temperatures a bit below ambient at night when you don’t have any modern refrigeration equipment. That’s about all. It’s used in space applications because it’s literally the only option.
I think by far the most mass in this kind of setup would go into the heat management, which could probably last a long time and could be amortized separately from the electronics.
How would the radiators be useful if the electronics no longer are? Unless you can repurpose the radiators once the electronics are useless, which you can't in space, then the radiators' useful lifetime is hard limited by the electronics' lifetime.
I got hired at my current company back in 2022 through a Who’s Hiring post here. I likely never would have heard of them otherwise, but I’m glad I did!
Thanks for sharing this. I had the pleasure of working with Don Eyles in a previous job; he is a brilliant man, and I didn’t realize he had written a book. I actually didn’t even know who he was when I first met him, until another co-worker told me about Eyles’ history.
Fun fact: when the Quindar tones were no longer needed, NASA initially removed them. However, it caused confusion in mission control since everyone was used to them, so NASA added them back.
Your argument assumes a just state that will not outlaw things like dissent against the government, being LGBTQ, etc. Plus, having something to hide does not automatically mean you are doing something criminal. I do nothing criminal, but I still have things to hide.
>Your argument assumes a just state that will not outlaw things...
No, once things are outlawed you should stop doing them. Any prior messages don't matter since the law can't apply retroactively. I am assuming ex post facto laws are prohibited.
>Plus, having something to hide does not automatically mean you are doing something criminal.
If the reason for hiding something is that you don't want the government to see it, then it is likely criminal.
I'd also add Flying Blind by Peter Robison, a book written specifically about the demise of Boeing's culture and how it translated into the 737 MAX disaster.
Out-Sourced Profits: The Cornerstone of Successful Subcontracting by L. J. Hart-Smith, an engineer at Boeing's Phantom Works unit during that transition period, was an internal paper that walks through one of the many disputes between the Boeing and Douglas/GE cultures and derives why the Boeing way was correct.
Figure 2 is great, but I would agree the entire paper is needed if you are implementing Raft. There are a few specifics in the paper that you need when implementing it.
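For example, one rule from Figure 2 that's easy to miss: a server must adopt any higher term it sees in any RPC and step down to follower. A minimal Python sketch (ServerState is a hypothetical stand-in, not from the paper):

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class ServerState:
        current_term: int = 0
        voted_for: Optional[int] = None
        role: str = "follower"

    def observe_term(state: ServerState, rpc_term: int) -> None:
        # Figure 2, "Rules for Servers" (all servers): if an RPC request
        # or response contains term T > currentTerm, set currentTerm = T
        # and convert to follower.
        if rpc_term > state.current_term:
            state.current_term = rpc_term
            state.voted_for = None   # votedFor is per-term, so reset it
            state.role = "follower"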
> Figure 2 is great, but I would agree the entire paper is needed if you are implementing Raft. There are a few specifics in the paper that you need when implementing it.
It was more than that. I'm blanking on what it was, but there were parts where I really couldn't find anything about the intended behavior in the paper (let alone in Figure 2) except in Diego's thesis or in the TLA+ spec.
Though maybe I was just not reading the paper correctly.
I love this site. When I was learning & implementing Raft in my distributed systems course, this page was invaluable. Plus, the paper itself is pretty easy to read.