Here is the code:
// base temperature in Kelvin: T = 255 * L^0.25 / d^0.5
sb.BaseTemp = 255 / Math.Pow(sb.OrbitalDistance / Math.Pow(sb.ParentStar.Luminosity, 0.5), 0.5);
// clamp to a 4 Kelvin minimum
if (sb.BaseTemp < 4) sb.BaseTemp = 4;
The result is in Kelvin.
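Restated in Python purely for illustration (the function name, and the assumption that `orbital_distance` is in AU and `luminosity` in solar units, are mine, not from the original code), the formula gives an Earth-like body exactly 255 K:

```python
import math

def base_temp(orbital_distance, luminosity):
    """Black-body-style base temperature in Kelvin.

    Assumed units: orbital_distance in AU, luminosity in solar luminosities.
    Algebraically this is T = 255 * L**0.25 / d**0.5, floored at 4 K.
    """
    t = 255 / math.sqrt(orbital_distance / math.sqrt(luminosity))
    return max(t, 4.0)

print(base_temp(1.0, 1.0))    # Earth-like orbit around a Sol-like star: 255.0 K
print(base_temp(9.54, 1.0))   # Saturn-like distance: roughly 83 K
print(base_temp(10000, 1.0))  # far enough out that the 4 K floor applies
```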
I decided not to add the effects of other stars in the system. I have the necessary formulae, but the result would be hard for players to visualise.
Does the real Sol system use hard-coded base temperature values to better match reality, or is there something else afoot causing real Sol bodies to be ever-so-slightly warmer than they "should" be? When I plotted the hundreds of real Sol bodies, regression analysis spat out 255.3406815 with R²=1.000 (and it was the only system to do so), so I assume something exceptional is in play.
It's not out of the question that the 4 Kelvin minimum is playing games with the data, but Venus wouldn't be presenting the same problem if that were it (and other systems would have been showing something other than 255.000 as well).
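One quick way to check whether the 4 K floor could nudge a fitted coefficient at all (a sketch with made-up orbital distances, not the real Sol data; `fit_k` is a simple closed-form least-squares fit of T = k / sqrt(d), my construction rather than anything from the game):

```python
import math

def base_temp(d, lum=1.0):
    # same formula as the game snippet: 255 * L^0.25 / d^0.5, floored at 4 K
    return max(255 / math.sqrt(d / math.sqrt(lum)), 4.0)

def fit_k(distances, lum=1.0):
    # least-squares fit of T = k * x with x = d^-0.5, so k = sum(T*x) / sum(x*x)
    xs = [d ** -0.5 for d in distances]
    ts = [base_temp(d, lum) for d in distances]
    return sum(t * x for t, x in zip(ts, xs)) / sum(x * x for x in xs)

# All bodies above the floor: the fit recovers the coefficient exactly (255 for L=1)
inner = [0.4, 0.7, 1.0, 1.5, 5.2, 9.5, 19.2, 30.1]
print(fit_k(inner))

# Add one body distant enough to hit the 4 K floor: the fitted k drifts above 255,
# though only very slightly, since that point carries almost no weight in the fit
print(fit_k(inner + [20000.0]))
```

In this toy version the clamp does pull the coefficient upward, but by far less than the 0.34 K in question unless many bodies sit on the floor, which is consistent with ruling it out as the main cause.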
I had briefly considered whether other stars in a system would add a bit of temperature, but when I checked I found they did not; thank you for saving me that headache.