Asteroid cores are not cold

It's not that the atmosphere is very fine that's the issue (I agree it won't take away much heat); it's that it's there at all. In a complete vacuum, the ejecta from the asteroid, including any gases, would keep travelling in the direction of the impulse from the explosion and dissipate into the emptiness of space very quickly. Of course, physics in ED doesn't actually model this behaviour, but that's beside the point. The cold gases produced by the explosion aren't expanding into a vacuum but into the very thin atmosphere of a gas giant's rings (and larger rings will have a denser atmosphere). That acts as a brake on the spread of the gaseous cloud created by the explosion, leaving a much denser and longer-lasting patch of very cold fog from the very cold asteroid.
 
So you think you could use electromagnetic energy transfer to cool your home, while basically living in a vacuum chamber where no thermal conduction is happening? Just to be clear, you're talking about EM, not convection-based energy transfer, right?

I am talking about EM radiation, yes. Mostly visible light in this case, or whatever is emitted at the highest temperature I can heat the radiator in question to without destroying it.

You said "provided I had enough energy" when energy is what you're trying to get rid of.

No heat pump is 100% efficient and with potentially several stages needed, efficiency would be horrible with anything I could affordably build now. I'd assume they are more efficient in the Elite setting, but there would be some losses.

I had a chilled-water-cooled TEC setup (a phase-change condenser coil cooling water, which cooled the hot side of a pile of Peltiers, which in turn cooled a mix of ethanol, glycol, and water running to my CPU and video card) on one of my computers for a while. It kept parts that needed to dissipate about 600 W of heat quite frosty, but I needed about 2000 W of power to move that 600 W of heat with the temperature delta I was looking for.
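
For a sense of how bad that is in heat-pump terms, here's a quick back-of-the-envelope script. The 600 W and 2000 W figures are the ones from my setup above; the rest is just arithmetic, not a measured spec:

```python
# Rough cooling-performance numbers for the chilled-water TEC setup above.
# 600 W heat load and 2000 W input power are the figures from my build.

heat_moved_w = 600.0        # heat pulled off the CPU/GPU loop (W)
power_input_w = 2000.0      # electrical power fed to the Peltiers + compressor (W)

cop = heat_moved_w / power_input_w            # coefficient of performance
heat_dumped_w = heat_moved_w + power_input_w  # total heat the hot side must reject

print(f"COP: {cop:.2f}")                                        # ~0.30
print(f"Heat rejected at the hot side: {heat_dumped_w:.0f} W")  # ~2600 W
```

In other words, to keep 600 W of components frosty, the room had to absorb roughly 2600 W.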
 
Everyone knows that if you crack a core asteroid and then pilot your ship into the debris, your ship's temperature drops until the screen ices up. I've seen it claimed in videos that this is because the core, or the result of the blast, makes that area of space very cold. That's wrong. That area of space is no colder than the rest of the environment.

Cold is, of course, the absence of heat. There are only three ways that heat moves: conduction, convection and radiation. In conduction, a cold body and a hot body in contact with each other share their heat until they are the same temperature. In convection, a hot body warms the air around it, which then rises and lets cooler air take its place. In radiation, a hot body emits infra-red light and in doing so loses energy, which cools it down.

In space there is nothing in contact with your ship, so you can't use conduction to cool down (heatsinks excepted). There is also no air and no "up" for it to move towards, so you can't use convection. The only way for your ship to lose heat is to radiate it. Radiation is very inefficient for cool bodies, and so this is a problem that real spacecraft designers battle with. See how they do it on the ISS.
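
To put rough numbers on that, here is a minimal sketch using the Stefan-Boltzmann law. The example temperatures and the emissivity of 1 are assumptions for illustration, not actual ISS radiator figures:

```python
# Why radiative cooling is so weak for "cool" bodies:
# ideal blackbody flux scales with T^4 (Stefan-Boltzmann law).

SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W / (m^2 K^4)

def radiated_flux(temp_k: float, emissivity: float = 1.0) -> float:
    """Radiated power per square metre of radiator surface, in W/m^2."""
    return emissivity * SIGMA * temp_k ** 4

for temp_k in (280, 300, 400, 1000, 2000):
    print(f"{temp_k:5d} K -> {radiated_flux(temp_k):10.0f} W/m^2")

# A radiator near 280 K only manages ~350 W/m^2 even as a perfect blackbody,
# which is why real spacecraft need large radiator panels.
```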

The rocks in icy rings are cold. They contain liquid oxygen, which puts their temperature between −218.79 °C and −182.96 °C. Every rock is that temperature; there is nothing special about cores. That has no effect on your ship, because a vacuum is a perfect insulator against conduction; the temperature of the rocks around you makes no difference to your ship. It is built to work in deep space, where there is even less heat coming from a sun, and anyway, with a spacecraft the problem is always cooling down, not warming up.

Except, that is, in the dense debris field immediately following the cracking of a core. The blast vapourises stuff and throws out rocks of all sizes, large and small. You hear it impact your hull when the rock cracks. The smallest rock shown when you fly into the debris is maybe a tonne, but there must be a whole host of smaller stuff too, down to grains of sand or less, and there will be a whole lot of it. Of course it will dissipate in time, drifting away and being attracted to other nearby bodies. But not quickly; the big rocks are only drifting slowly and the smaller stuff will not be moving any faster. For a few minutes following the blast, that space is not empty. One might say that it is not a vacuum, although the vast majority of it will be solid and liquid rather than a gas.

All that solid, liquid and gaseous matter makes contact with the hull of your ship. The heat of your ship is conducted into the cold particles and your ship cools down. Rather than being a strange property of cores or the explosion that cracks them, the cooling effect is a beautifully observed simulation of common physical law.
Cracked cores, BRRRR
Game says BRRR.
(y)

I'm freezing my Asp off. BBRRRRRRR!
 
Ooh, this was a fun nerdy thread to read through. Sorry I was mostly at work while it was happening!

If I can throw in my two cents (is there a UK equivalent to this expression?) -

Agreed, the cooling effect is clearly caused by contact with a cloud of tiny ice particles. The graphics clearly imply that core cracking leaves behind a substantial cloud of dust, presumably of the same composition as the cracked asteroid. Since the cooling effect only occurs in clouds in icy rings, we can infer that it is due to the volatile nature of the ices. Since the cloud is persistent rather than quickly dissipating, we can infer that it is composed of solid particles rather than gas. (Yes, gas giant rings technically have an atmosphere. But no, it would be too tenuous to affect anything.) That said, I'm skeptical that our ships' radiators would in reality contact a sufficient mass of dusty debris to provide such a strong cooling effect.

Morbad is correct: it is plausible for our ships to dump the claimed power levels using thermal radiation. Just eyeballing the color temperature, the radiators at full load look to be a couple of thousand Kelvin if they are nearly ideal blackbody radiators. Fun fact! According to the Stefan-Boltzmann law, 2050 Kelvin is the temperature at which an ideal blackbody emits one megawatt per square meter.
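
If anyone wants to check that figure, here's the quick inversion of the Stefan-Boltzmann law (assuming an ideal blackbody with emissivity 1; a real radiator would have to run a bit hotter):

```python
# Sanity check on the ~2050 K figure: invert the Stefan-Boltzmann law
# to find the temperature at which an ideal blackbody emits 1 MW/m^2.

SIGMA = 5.670e-8     # Stefan-Boltzmann constant, W / (m^2 K^4)
target_flux = 1.0e6  # desired emission, W per square metre

temp_k = (target_flux / SIGMA) ** 0.25
print(f"Blackbody temperature for 1 MW/m^2: {temp_k:.0f} K")  # ~2049 K
```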

For what it's worth, my theory is that our ships use a liquid metal for the high-temperature cooling loop, and firing a heat sink actually dumps the current supply and replenishes it with a fresh (cool) stock. That would explain why they can fire so fast, but have a really long reload time.
 
This is an interesting possibility I hadn't considered and would certainly simplify heatsink integration.
 
No heat pump is 100% efficient and with potentially several stages needed, efficiency would be horrible with anything I could affordably build now. I'd assume they are more efficient in the Elite setting, but there would be some losses.

Actually, I'd think just the opposite. The way ships overheat merely from using standard modules indicates they still don't have a firm grasp on efficiency and heat dissipation.
 