I'm fascinated by the mechanics that drive the "time to travel to objective" readout when supercruising - the lower figure in the readout.
Unless I'm mistaken, the program is trying to predict the time it will take to reach the objective, with constantly varying speed and acceleration. So it starts at maybe >1 year; as speed increases, the time ticks down at an ever-increasing rate, until, on approach, speed begins to drop and the rate slows again.
Problem is, it seems to exhibit some pretty unusual behaviour. I've not actually recorded readings and done the sums, but the rates of increase and decrease don't always seem to tally with the changing velocity. I've even seen the time start to increase as the distance decreases!
Surely all the sums for the speed changes are already in the software, so why can't this figure do the one thing it never actually does at the moment: tell you how long it will be before you arrive?
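For what it's worth, the "ETA rising while distance falls" effect is exactly what you'd expect if the readout is a naive distance-divided-by-current-speed estimate rather than a real prediction. Here's a small sketch (purely illustrative, with made-up numbers, not the game's actual code) showing the effect during a deceleration phase:

```python
# Illustrative only: why a naive "distance / current speed" ETA
# can climb even as the distance shrinks. All figures are invented.

def naive_eta(distance, speed):
    """ETA assuming the current speed is held forever."""
    return distance / speed

# Stand-in for final approach: each second the ship covers some
# distance, then the approach logic cuts its speed in half.
distance, speed = 1000.0, 100.0
etas = []
for _ in range(5):
    etas.append(naive_eta(distance, speed))
    distance -= speed   # travel for one second at current speed
    speed /= 2          # deceleration outpaces the closing distance

print(etas)  # each ETA is larger than the last, despite distance falling
```

Because speed halves each step while distance only shrinks a little, the quotient grows every tick, so the displayed "time to objective" goes up even though you're getting closer. A true prediction would have to integrate the planned acceleration/deceleration profile instead of extrapolating the current speed.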