That's correct. You need either pretty big or extremely hot radiators to dissipate the kinds of heat ED power plants produce, and radiation is much less efficient than convection to begin with.
Let's look at the worst-case scenario, the E-rated PP, and take a ship with a straight 4E PP, no engineering. It has an eff rating of 1, which means 1MW of heat for every 1MW of usable power. (We'll ignore the heat of all other systems for now; most of it comes from the thrusters anyway, which should get rid of their own heat through their exhaust nozzles.) 10.6MW of usable power means 10.6MW of heat.
[For comparison, a 4A PP with Low Emissions 5/TS has 13.3MW at an eff rating of 0.13, so it produces only 1.72MW of heat]
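For what it's worth, here's the heat arithmetic as a quick Python sketch. The power outputs and eff ratings are the in-game values quoted above; treating "eff" as MW of waste heat per MW of usable power is my reading of the stat, and the function name is just illustrative:

```python
def waste_heat_mw(usable_power_mw: float, eff_rating: float) -> float:
    """Waste heat in MW, assuming eff = MW of heat per MW of usable power."""
    return usable_power_mw * eff_rating

# Stock 4E power plant: 10.6MW usable at eff 1.0
print(waste_heat_mw(10.6, 1.0))    # 10.6 MW of heat

# 4A with Low Emissions 5 / Thermal Spread: 13.3MW usable at eff 0.13
print(waste_heat_mw(13.3, 0.13))   # ~1.73 MW, i.e. the ~1.72MW quoted above
```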
Radiated power scales with the 4th power of the temperature (Stefan-Boltzmann law), so for a given amount of heat it's desirable to run the radiator as hot as possible. However, there are physical limits, for instance the boiling point of the coolant. The best theoretical coolant would be lithium, with a boiling point of roughly 1600K, but it has other issues, so in practice you'd probably want to use something like a molten salt at a considerably lower temperature.
The radiators of a Cobra Mk III seem to be roughly 50m² in area. Maybe 60 but certainly not much bigger.
To reject 10.6MW of heat from a 50m² radiator, it would have to run at about 1400K --> that's extremely hot, so lithium is basically the only acceptable coolant.
The same ship running that 4A/LE/TS PP would require a radiator temperature of only about 890K, still pretty hot but probably doable with a molten-salt coolant.
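If you want to check these numbers without the linked calculator, they fall straight out of the Stefan-Boltzmann law. A minimal sketch, assuming a black-body radiator (emissivity 1), a single-sided 50m² radiating area, and no heat absorbed from the environment (deep space at ~0K):

```python
SIGMA = 5.67e-8  # Stefan-Boltzmann constant, W / (m^2 K^4)

def radiator_temp_k(heat_w: float, area_m2: float, emissivity: float = 1.0) -> float:
    """Temperature needed to radiate heat_w watts from area_m2.

    P = emissivity * sigma * A * T^4  ->  T = (P / (emissivity * sigma * A)) ** 0.25
    """
    return (heat_w / (emissivity * SIGMA * area_m2)) ** 0.25

# Stock 4E: 10.6 MW over 50 m^2
print(radiator_temp_k(10.6e6, 50))  # ~1391 K -> the "1400K" figure above

# 4A / Low Emissions / Thermal Spread: 1.72 MW over 50 m^2
print(radiator_temp_k(1.72e6, 50))  # ~883 K -> the "890K" figure above
```

A lower emissivity or a smaller effective area pushes the required temperature up further, so these are best-case figures.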
Either way, the radiator glow we observe in the game is pretty consistent with temperatures around - broadly - 1000K.
(Note that present-day spacecraft radiators operate at only something like 360K, but they only have to dissipate a few kW and use organic fluids as coolant.)
Calculations done with https://space.geometrian.com/calcs/radiators.php
Edit: I mixed up some numbers between 4E and 4A PP, fixed