ED's Black Holes are close...

This is all material that was released during the Interstellar release. Wonder why it took 4 months before io9 finally picked it up?
 
Fair enough... point is: once measured, many times cut. All you need is a couple of models and some simple ways of varying them and there's your uncle Robert...
Hm, not necessarily in this case... I guess you only know the pictures of the Interstellar black hole from that edge-on perspective, but if you saw the point of view changing you would see there's some really weird sh*t going on. (In the Wired video about the effects there's a very short scene of that.) That's very hard to recreate. Sure, you could come up with a very crude approximation that would at least look more or less similar from selected points of view, but move too much and it comes apart.


This is all material that was released during the Interstellar release. Wonder why it took 4 months before io9 finally picked it up?
Because they are referring to a scientific paper by Kip Thorne and the VFX people that only came out a few days before.
 
I understand what you're saying, but do you understand me? If light speed were the limiting speed then it couldn't fall into anything; it would always escape. Ergo it's falling in faster than it can escape, which must by definition be faster than light.
Well, first off, most light would just bend round and fall back in. It can do this while travelling at the speed of light the whole time. The only case where you'd still have a problem is a ray leaving the black hole directly away from the centre.

The problem then is that light speed is a local thing. You're talking about measuring something more global - the time for a photon in the vicinity of a black hole to leave that area and get to where you are. And then you can have time delays for light travelling through a gravitational field, and for a black hole those time delays just keep getting bigger near the horizon. Google 'Shapiro delay'.
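To put a rough number on that delay: here's a minimal sketch of the standard weak-field Shapiro delay, Δt ≈ (2GM/c³)·ln(4·r₁·r₂/b²), where b is the impact parameter and r₁, r₂ are the endpoint distances from the mass. The black-hole parameters below are purely illustrative, and the approximation assumes b is much smaller than r₁ and r₂, breaking down as b approaches the horizon - which is exactly where the delay blows up.

```python
# Rough one-way Shapiro delay for a light ray passing a mass with impact
# parameter b, between endpoints at distances r1 and r2 from it.
# Weak-field approximation (b << r1, r2); the numbers are illustrative only.
import math

G = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8     # speed of light, m/s

def shapiro_delay(mass_kg, r1_m, r2_m, b_m):
    """Extra light travel time (seconds) compared with flat spacetime."""
    return (2 * G * mass_kg / c**3) * math.log(4 * r1_m * r2_m / b_m**2)

# Example: a ray grazing a ~10 solar-mass black hole, endpoints 1 AU away,
# passing at ten Schwarzschild radii.
M_bh = 10 * 1.989e30            # kg
au = 1.496e11                   # m
b = 10 * 2 * G * M_bh / c**2    # ten Schwarzschild radii
print(shapiro_delay(M_bh, au, au, b))   # a few milliseconds; grows as b shrinks
```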
 
The latest edition of 'Click' on BBC iPlayer has some coverage of the black hole generation for Interstellar. Apparently it took a 20,000-node farm 100 hours to render each frame. That would not make for a smooth gaming experience...

Seriously though, fascinating stuff :)

EDIT: Must read the rest of the thread before opening my mouth - sorry for the duplication!
 
Black holes prove that light speed cannot be limited, because if light cannot escape, something must be pulling the photons in faster than they can escape - ergo, faster than light.
The problem is not the force - which would be an exchange of particles. Since no particle can go faster than light, gravity cannot act on the photon by exchanging particles with it (no graviton could ever catch up with it to interact with it).
However, gravity affects spacetime by warping it. Photons only ever go straight within spacetime (along geodesics). A photon's path is never bent.
The 'bent' path we perceive is actually the straight path (by the definition of 'straight' - the least-time path a photon can take between two points). The warping we see is just because we, as humans, cannot perceive spacetime with our eyes. We just see the '3D space' part of it (projected onto a 2D retina, to boot).

The reason why no light can escape a black hole is that, from a point inside the event horizon, there are no paths which lead outside it. Once inside, there is literally no direction you could point in that leads to the outside.

*Note also that while particles are limited to less than c (and anything going at c must be massless), this does not apply to spacetime curving/expansion. Spacetime curvature changes/expansion/contraction are allowed to happen faster than c (e.g. there are galaxies over our observability horizon which are 'travelling' away from us faster than c due to expansion of space between us and them).
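To make the "no direction points outside" statement a bit more concrete, here is the textbook Schwarzschild line element (a generic sketch, nothing specific to ED or the film):

```latex
ds^2 = -\left(1 - \frac{r_s}{r}\right) c^2\, dt^2
       + \left(1 - \frac{r_s}{r}\right)^{-1} dr^2
       + r^2\, d\Omega^2,
\qquad r_s = \frac{2GM}{c^2}.
```

For r < r_s the factor (1 - r_s/r) changes sign, so the radial coordinate takes over the "time" role: every future-directed path, light rays included, has decreasing r. That sign flip is the formal version of "there is no direction that points outside".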
 
Last edited:
At the moment, as far as I can tell, they are merely rendered as spheres with an environment map of the background, which does not capture any "foreground" objects such as nearby stars - nor does looking through the lensing effect bend those around.

Doing that right would effectively boil down to raytracing - a.k.a. "you and what render farm?", as was already alluded to. Even a close-enough heuristic approach involving environment maps would require quite a few of those maps (technically I guess it'd be an infinite number of renders, at which point raytracing starts to look like a really attractive proposition!) to properly capture the decidedly non-affine projections taking place close to the body.

Modern 3D APIs struggle with simple, affine, thin-lens refraction, or even with seemingly simple transformation effects like that good old Quake underwater wobbly-screen. Advanced physics is very much out of the question and requires smoke and mirrors :(
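For what the "crude approximation" route might look like in practice, here's a toy sketch (purely illustrative - not how ED or Double Negative do it, and the names below are made up): per view ray, bend the direction by the weak-field deflection angle α ≈ 2·r_s/b before doing the environment-map lookup.

```python
# Toy backward "lensing" of view rays around a point mass: bend each ray by the
# weak-field deflection angle alpha = 2 * r_s / b (b = impact parameter), then
# use the bent direction for the environment-map lookup. Purely illustrative;
# it is wrong near the photon sphere, which is exactly the interesting region.
import numpy as np

def lensed_direction(view_dir, to_hole, r_s):
    """Return the (approximately) bent view direction for one camera ray.

    view_dir: unit vector of the camera ray.
    to_hole:  vector from the camera to the black hole (same units as r_s).
    r_s:      Schwarzschild radius.
    """
    view_dir = view_dir / np.linalg.norm(view_dir)
    along = np.dot(to_hole, view_dir)
    perp = to_hole - along * view_dir          # from the ray's closest point to the hole
    b = np.linalg.norm(perp)                   # impact parameter
    if b <= 1.5 * r_s:                         # at/inside the photon sphere: call it captured
        return None                            # shade this pixel black
    alpha = 2.0 * r_s / b                      # weak-field deflection angle (radians)
    towards_hole = perp / b
    bent = np.cos(alpha) * view_dir + np.sin(alpha) * towards_hole
    return bent / np.linalg.norm(bent)         # feed this into the env-map sample

# Example: a ray passing at ten Schwarzschild radii is bent by ~0.2 rad (~11 degrees).
print(lensed_direction(np.array([1.0, 0.0, 0.0]), np.array([5.0, 10.0, 0.0]), 1.0))
```

Even this falls apart exactly where the movie shots get interesting (rays that loop around the hole), which is the "move too much and it comes apart" problem mentioned earlier in the thread.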
 
Spacetime curvature changes/expansion/contraction are allowed to happen faster than c (e.g. there are galaxies over our observability horizon which are 'travelling' away from us faster than c due to expansion of space between us and them)
No, this isn't the case. A change in curvature propagates at c as well (at least as far as we know) - gravitational radiation for example. The Hulse-Taylor binary pulsar indicates this is the case to quite a good degree of precision.

The fact that, globally, things can look like they're moving 'faster than light' doesn't mean changes to spacetime can propagate faster than light.
 
No, this isn't the case. A change in curvature propagates at c as well (at least as far as we know) - gravitational radiation for example.
You're right, I didn't put it as precisely as I wanted: it's the rate of change that can be faster than c - not the propagation of the swell out from the source, but the change of the swelling itself (i.e. the expansion/contraction of space as measured between two points).
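For the expansion case specifically, the standard illustration is just the Hubble law (using a present-day H₀ of roughly 70 km/s/Mpc - the exact value doesn't matter for the point):

```latex
v_{\mathrm{rec}} = H_0\, d, \qquad
v_{\mathrm{rec}} > c \quad \text{for} \quad d > \frac{c}{H_0} \approx 14\ \text{billion light-years}.
```

Locally nothing overtakes a light ray; it's only the expansion summed over a huge distance that produces a recession speed above c.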
 
When do we get to warp between black holes instead of these jump journeys that go on for hours? That's some spacetime I'd like to deal with.

I would also say, I would not be surprised if ED black holes were EXACTLY like the theory.
If you build a theory, and build a simulation to support that theory, then surely a game (or movie) can be built as well to be just like the theory, in real life. :p

And if not, well, you only need to build another theory to provide the missing link to go to infinity, and beyond.

When does the wormhole open in ED? :)
 
Very big rep, Titus. I find one of the interesting aspects of the pieces is that, after a year's work by Double Negative and so much research by Thorne, Nolan decided to go with something less "real" because the vast majority of the audience would not be able to appreciate it, much less understand it. I must congratulate Frontier on not giving in to the whiz-bang mentality that leads most game designers to make black-hole representations spectacular but stupid.

As for truly modelling the physics, forget it - that's beyond the capability of an i7 4790K. A good game is all about immersion while trying to stay as true to reality as possible, which normally involves mapping graphics and dynamics onto a system so that it looks the way the physics would actually behave - and that makes a GOOD game, not most games. That is one of the things I like about SC: they are going the extra step to do things like calculating the actual signal-to-noise ratio and the effect of chaff, as opposed to just auto-buffing the missile strike percentage like other games. Even that results in next-gen hardware requirements and a 20GB file size for pre-alphas.
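To illustrate the difference between those two approaches, here's a toy sketch - all names and numbers are made up, and this is not how SC (or ED) actually implements it: one function applies a flat debuff, the other derives a lock probability from a modelled signal-to-noise ratio that chaff degrades.

```python
# Toy contrast between two ways a game might handle chaff vs. missile locks.
# Purely illustrative - the names, numbers, and curve are invented here.
import math

def hit_chance_flat_buff(base_chance=0.8, chaff_active=False):
    """Approach 1: chaff just applies a flat multiplier to the strike chance."""
    return base_chance * (0.4 if chaff_active else 1.0)

def hit_chance_snr(signal_power, noise_floor, chaff_noise=0.0, lock_threshold_db=6.0):
    """Approach 2: model the seeker's signal-to-noise ratio; chaff raises the
    noise floor, and the lock probability falls off as the SNR drops below a
    threshold (smoothed with a logistic curve)."""
    snr_db = 10.0 * math.log10(signal_power / (noise_floor + chaff_noise))
    return 1.0 / (1.0 + math.exp(-(snr_db - lock_threshold_db)))

print(hit_chance_flat_buff(chaff_active=True))                                # always 0.32
print(hit_chance_snr(signal_power=50.0, noise_floor=1.0))                     # strong return: ~1.0
print(hit_chance_snr(signal_power=50.0, noise_floor=1.0, chaff_noise=25.0))   # chaffed: ~0.04
```

The second approach costs more to compute and to tune, which is part of the "extra step" (and the hardware bill) being described above.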
 
Doing that right would effectively boil down to raytracing - a.k.a. "you and what render farm?", as was already alluded to. Even a close-enough heuristic approach involving environment maps would require quite a few of those maps (technically I guess it'd be an infinite number of renders, at which point raytracing starts to look like a really attractive proposition!) to properly capture the decidedly non-affine projections taking place close to the body.

Modern 3D APIs struggle with simple, affine, thin-lens refraction, or even with seemingly simple transformation effects like that good old Quake underwater wobbly-screen. Advanced physics is very much out of the question and requires smoke and mirrors :(
For a full recreation of the weird light-bending effects, that's certainly true. But if you just want to warp objects in the system the same way the skybox is warped, you could do this by rendering to a texture. That's (most likely) the way other distortion effects in ED are already done (e.g. the heat haze when you look closely at other ships' main thrusters, or the distortion when entering or leaving frameshift travel). In many other games you can also find glass objects that refract the objects behind them - also render-to-texture.
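Here's a rough CPU-side sketch of that render-to-texture idea (purely illustrative - in a real engine this would be a fragment shader sampling the scene render target, and every name and number below is made up): render the frame once, then for pixels covered by the "lens" re-sample the frame at offset coordinates.

```python
# Minimal CPU-side sketch of render-to-texture distortion: take an already
# rendered frame and, inside a circular "lens" region, re-sample it at
# coordinates pulled towards the centre. In a real engine the same lookup would
# live in a fragment shader reading the scene render target.
import numpy as np

def distort(frame, center, radius, strength=0.4):
    """frame: HxWx3 array. Pixels within `radius` of `center` sample the frame
    at coordinates pulled towards the centre, mimicking a crude lensing warp."""
    h, w, _ = frame.shape
    out = frame.copy()
    ys, xs = np.mgrid[0:h, 0:w]
    dx, dy = xs - center[0], ys - center[1]
    r = np.sqrt(dx * dx + dy * dy)
    mask = r < radius
    pull = strength * (1.0 - r[mask] / radius)       # stronger pull near the middle
    sx = np.clip(xs[mask] - dx[mask] * pull, 0, w - 1).astype(int)
    sy = np.clip(ys[mask] - dy[mask] * pull, 0, h - 1).astype(int)
    out[ys[mask], xs[mask]] = frame[sy, sx]
    return out

# Example: warp a 64x64 test frame around its centre.
frame = np.random.rand(64, 64, 3)
print(distort(frame, center=(32, 32), radius=20).shape)
```

It only warps what's already in the frame (or in a separately captured texture), which is why it can handle heat haze and glass convincingly but still can't show light looping around from behind the hole.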
 
I still don't agree that the black hole itself would look black. Between you and the event horizon there would be material falling towards the singularity, heating up in the accretion disk, and while gravity would be stretching its light into the red it would still be visible, even if only in false-colour shading.
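For reference, in the idealised non-rotating (Schwarzschild) case, and ignoring the disc's own orbital Doppler shifts, the gravitational redshift of light emitted at radius r and seen from far away is:

```latex
1 + z = \left(1 - \frac{r_s}{r}\right)^{-1/2}, \qquad r_s = \frac{2GM}{c^2},
```

which stays finite outside the horizon - at the innermost stable orbit, r = 3·r_s, it's only 1 + z ≈ 1.22 - and diverges only at r = r_s itself. So the disc material does stay visible, just shifted and dimmed.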
 