ED & 3D TVs

We need frame-packed 1080p at 60 fps per eye.

Many good 3D TVs have a minimum refresh rate of 120 Hz.

I haven't got a 3D TV myself; my other half does, but I think she'd notice if I swapped them :( Although, after trying various TVs out in the shops, I haven't found one that works for me. I always seem to get, erm... crosstalk? Ghosting? Not sure what the technical term is.
 
Hi,

I saw the dev diary and it says support for 3D TVs :D I'm thinking of getting a Panasonic TX-P50VT30 or TX-P55VT30 plasma. They are compatible with the 3D shutter glasses I already have. A huge 50/55 inch screen. I'm wondering: will it be proper 1080p60 like Blu-ray 3D, or the 1080 lines divided to display two frames like Sky 3D?


Thanks
Haider

Consider getting an Optoma 3D projector and a 100" screen....I can't wait until ED is released and I get to play it on that. I'm just going to sit back gaping at the screen so much I'll probably get blasted to pieces by pirates over and over...
 
I have an Optoma :) Does a great job, if only in 720p. If you have an ATI card you can pick up TriDef software and some shutter glasses fairly cheaply too. I can say Skyrim in 3D is amazing on an 80+ inch display with the lights out, sitting on the sofa. I will no doubt try Elite out on the big screen if it works well in 3D. The small caveat being, if you need the keyboard close to hand (which I am expecting), I will give it a miss on the big screen. It is great for gamepad games though.
 
HDMI 3D and other terms that should be banned

Would love some feedback from people on how 3D looks/works.
I'm planning to buy a new TV for ED in Jan anyway, so if 3D comes well recommended then it'd be rude not to go the whole hog.

This game is getting more expensive by the day :D

It's late, and I'm bored, so forgive me if I ramble on. Much of this will be review for most people so I'll try to divide it into sections or you can just skip the whole post. But he asked.... :)

How 3D Works

First, bear in mind that all 3D is a deception. You're tricking your eyes into perceiving a third dimension that doesn't actually exist. Your eyes see depth because you're getting two views of an object (one from each eye) from slightly different points in space. Your brain translates the difference between those two images into a single image with depth, and experience 'trains' your brain to judge that depth accurately. 3D displays simulate this by showing each eye a different image, fooling your brain into perceiving depth that actually isn't there. Technically this should be called stereoscopy, as it really isn't 3D, but buzzwords will be buzzwords.
There are many methods of doing this, from crossing your eyes (remember stereograms?) to elaborate periscope systems using prisms (these started out as View-Masters). Currently, the popular methods can be divided into three groups: anaglyph/passive, active, and integral imaging.
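
To put rough numbers on that eye-fooling trick, here's a small illustrative sketch (entirely my own, assuming a ~6.3 cm eye separation, a 2 m viewing distance and a roughly 50-inch 1080p panel) of how much on-screen parallax it takes to place a point at a chosen apparent depth:

```python
# Illustrative sketch only: how much on-screen parallax (horizontal offset
# between the left-eye and right-eye image of a point) is needed to make
# that point appear at a chosen depth. All values are assumptions.

EYE_SEPARATION_M = 0.063   # typical interpupillary distance, ~6.3 cm

def screen_parallax(viewing_distance_m, apparent_depth_m,
                    eye_sep_m=EYE_SEPARATION_M):
    """Horizontal offset on the screen plane, in meters.

    Positive = the point appears behind the screen, negative = in front,
    zero = on the screen plane. From similar triangles:
    parallax / eye_sep = (depth - distance) / depth.
    """
    return eye_sep_m * (apparent_depth_m - viewing_distance_m) / apparent_depth_m

def parallax_in_pixels(parallax_m, screen_width_m=1.1, horizontal_pixels=1920):
    """Convert the offset to pixels for an assumed ~50-inch 1080p panel."""
    return parallax_m / screen_width_m * horizontal_pixels

if __name__ == "__main__":
    for depth_m in (1.0, 2.0, 4.0, 100.0):      # apparent distance from viewer
        p = screen_parallax(viewing_distance_m=2.0, apparent_depth_m=depth_m)
        print(f"depth {depth_m:6.1f} m -> parallax {p * 100:+6.2f} cm "
              f"({parallax_in_pixels(p):+7.1f} px)")
```

Note that the offset never exceeds your eye separation however distant the point is supposed to be, which is why even an 'infinitely far' starfield only needs a few centimeters of positive parallax.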

Anaglyph / Passive
This group uses some sort of passive filtering system to present different images to each eye. In the "old days" it was a color system, usually red and green filters, that allowed an image to reach only the eye it was intended for. Obviously this limited the colors that could be shown, so it was much more a novelty than a usable technology. Today, polarized glasses are used. This is the method most commonly used for "cinema 3D", such as you see in the theater and on most televisions.

Active
Active systems use a timer to open and close a shutter alternately in front of each eye. The display is coordinated to present the correct image to the eye that is not being covered at that instant. This method is used by Nvidia's 3D system.
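
As a rough sketch of that timing (not any vendor's actual protocol, just the alternation idea, with an assumed 120 Hz panel):

```python
# Conceptual sketch of frame-sequential (active shutter) 3D on a 120 Hz
# panel. Not a real driver: it just prints the schedule that the display
# and the glasses' sync emitter have to agree on.

REFRESH_HZ = 120                       # assumed panel refresh rate
FRAME_TIME_MS = 1000 / REFRESH_HZ      # ~8.3 ms per refresh

def shutter_schedule(num_refreshes=8):
    for n in range(num_refreshes):
        eye = "LEFT " if n % 2 == 0 else "RIGHT"
        # While one eye's frame is on screen, the other eye's shutter is
        # closed, so each eye effectively sees REFRESH_HZ / 2 frames/s.
        print(f"t = {n * FRAME_TIME_MS:5.1f} ms  show {eye} frame, open {eye} shutter")

if __name__ == "__main__":
    shutter_schedule()
    print(f"per-eye rate: {REFRESH_HZ // 2} fps")
```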

Integral Imaging
This method uses one display for each eye. It is usually a head-mounted device using small screens in close proximity to the eyes to avoid images bleeding over to the other eye; the Oculus Rift is an example. Nintendo's handheld 3D device takes a related approach, using a single screen with a parallax barrier of thin strips that blocks each eye from seeing the columns of pixels intended for the other eye.

Pros and Cons

Passive - inexpensive; the glasses are lightweight and reasonably comfortable, and it is currently the most popular method for cinema and home theater. While very acceptable, it is the least sharp and clear method.

Active - clear picture, excellent for gaming and a good 'middle of the road' solution. More expensive than passive, and the glasses tend to be heavier and a little awkward as they contain a battery and a receiver for the timing signal.

Integral Imaging - best picture but by far the most expensive; double screens, double cost. Head-mounted units tend to be a bit heavy, and this method is the most likely to induce motion sickness as most of them block all outside stimuli.

continued
 
HDMI 3D and other terms that should be banned (cont.)

HDMI and 3D

There's a lot of confusion about HDMI and 3D, primarily because of how it has been implemented. The specs get released, but there is no requirement to implement them fully. Each manufacturer can implement the whole spec or just meet the minimums and still say they meet the spec. The minimums are mostly bandwidth related and, again, are all that is required to claim compliance with a given version. For this reason, you should shop for HDMI equipment by features, not specs.

Synopsis of HDMI Specs (technical, ignore if you like)

1.0 - original spec, max throughput 4.95 Gbps (1080p @ 60 Hz)
1.1 - added support for DVD-Audio
1.2 - first big jump: added SACD (one-bit audio, 8 channels), a PC connector, and low-voltage support for PCIe sources
1.2a - Consumer Electronics Control (CEC) support. Was supposed to provide "smart interoperability", but wasn't very well implemented as there was no standardization and most equipment was only 'smart' if you matched brands
1.3 - bandwidth increase to 10.2 Gbps and deeper color support (up to 16 bits per channel, up from 8). Mostly a failure, as the cables of the day weren't capable of complying
1.3a - small changes to CEC and connector integration
1.3b - defined testing for the 1.3 spec
1.3b1 - added parameters for testing products
1.3c - added parameters for testing with active HDMI cables
1.4 - big one here... added Ethernet support, defined protocols for 3D video formats, a micro connector, and support for resolutions above 1080p
1.4a - added 3D formats for broadcast content. HBO jumped on this and made 3D on demand available. Currently the most used for home theater
1.4b - (trumpets blowing...) 3D 1080p @ 120 Hz support, thus allowing 3D at 60 Hz per eye (a rough bandwidth check follows this list)
2.0 - big update: bandwidth is 18 Gbps, resolutions up to 4K @ 60 Hz, wide-angle theatrical (21:9) support, updated CEC, and backwards compatible with High Speed HDMI cables. Not yet widely implemented
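
Here's the rough back-of-envelope check of those bandwidth figures I promised (my own arithmetic and assumptions, not a spec quote: standard 2200 x 1125 total clocks per 1080p frame including blanking, and three TMDS channels carrying 10 bits each per pixel):

```python
# Back-of-envelope HDMI bandwidth check (assumptions, not a spec quote):
# a 1080p frame is clocked out as 2200 x 1125 pixel clocks including
# blanking, and each pixel travels on 3 TMDS channels of 10 bits each
# (8-bit color, 8b/10b encoded).

TOTAL_CLOCKS_PER_FRAME = 2200 * 1125   # active 1920 x 1080 plus blanking
BITS_PER_CLOCK_ON_WIRE = 3 * 10        # 3 TMDS channels x 10 bits

def tmds_gbps(frames_per_second):
    return TOTAL_CLOCKS_PER_FRAME * BITS_PER_CLOCK_ON_WIRE * frames_per_second / 1e9

if __name__ == "__main__":
    print(f"1080p @  60 Hz: ~{tmds_gbps(60):.2f} Gbps (fits HDMI 1.0's 4.95 Gbps)")
    print(f"1080p @ 120 Hz: ~{tmds_gbps(120):.2f} Gbps (needs the 10.2 Gbps of 1.3+,")
    print("                plus the 1.4b signaling to carry it as 3D)")
```

Rough as it is, it shows why 1080p 3D at 60 FPS/eye needs the extra headroom of the later versions plus the 1.4b signaling that defines how to carry it as 3D.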

Back to the topic...
Currently there are really only four types of HDMI cable: Standard speed with and without Ethernet support, and High Speed with and without Ethernet support. With current prices not being much different, most recommendations are to go with an inexpensive High Speed cable. The connectors vary between manufacturers, some offering signal boost for longer runs, some protection for automotive use, and so on, but the fact is it's mostly marketing hype. HDMI is a digital signal, which means it works or it doesn't. It's not like an analog signal that degrades over distance or with interference: either there's enough data to make a picture or it fails completely.

As no HDMI version had enough bandwidth available to offer 3D at 60 FPS/eye until 1.4b, computer implementations of 3D (Nvidia and ATI) required DVI-D connectors. 25-30 FPS/eye was adequate for television but not well received by gamers. If you glanced through the list above, you'll see that HDMI wasn't capable of 3D support at 60 FPS/eye until 1.4b became available. The problem now is that 3D is a chain: to achieve acceptable frame rates for games, the source, the cable and the display device must ALL be fully 1.4b compliant. Any weak link in the chain will limit the whole path to the spec of that weak link. Those of you who were talking about new 3D TVs should bear this in mind when you go shopping.
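
To make that 'weak link' point concrete, here's a minimal sketch (hypothetical component names and versions, nothing vendor-specific):

```python
# Sketch of the "3D is a chain" point: the whole path is limited to the
# lowest HDMI level of any component in it. Names and levels below are
# made up for illustration.

HDMI_LEVELS = ["1.3", "1.4", "1.4a", "1.4b", "2.0"]   # oldest -> newest

def effective_level(chain):
    """Return (component, level) for the weakest link in source -> cable -> display."""
    return min(chain.items(), key=lambda kv: HDMI_LEVELS.index(kv[1]))

if __name__ == "__main__":
    setup = {                       # hypothetical example setup
        "graphics card": "1.4b",
        "HDMI cable":    "1.4b",    # a High Speed cable
        "television":    "1.4a",    # slightly older 3D set
    }
    weakest, level = effective_level(setup)
    print(f"Chain is limited by the {weakest} at HDMI {level}")
    print("1080p 3D at 60 FPS/eye possible:", level in ("1.4b", "2.0"))
```

Swap any entry for a lower level and the whole chain drops with it.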

continued
 
My TV is an LG 42", model number 42LW450U. I can't find whether it supports HDMI 1.4b, but the manual says it supports 3D at 60p?? :S
 
My TV is an LG 42", model number 42LW450U. I can't find whether it supports HDMI 1.4b, but the manual says it supports 3D at 60p?? :S

I looked up your TV. It will do 1080/60p. Translated loosely, 1080 is the resolution, 60 is the frame rate, and the p means it uses progressive scanning as opposed to interlacing. Progressive scanning produces a much smoother, more 'film-like' display.

The frame rate of a display has to be 'split' to produce an image for each eye. To achieve 60 frames per eye (FPE), the display's refresh rate has to be at least 120 Hz.
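
As a quick illustration of that arithmetic (my own sketch, using the 1080p figures discussed here), covering both the active case, which splits the refresh rate, and the passive case, which splits the lines:

```python
# Quick arithmetic for per-eye figures on a 1080p set (illustrative sketch).
# Active (frame-sequential) splits the refresh rate between the eyes;
# passive (line-interleaved) splits the vertical resolution instead.

def active_frames_per_eye(panel_refresh_hz):
    """Each eye only sees every other refresh."""
    return panel_refresh_hz // 2

def passive_lines_per_eye(panel_lines=1080):
    """Odd lines go to one eye, even lines to the other."""
    return panel_lines // 2

if __name__ == "__main__":
    for refresh_hz in (60, 120, 240):
        print(f"{refresh_hz:3d} Hz active panel -> {active_frames_per_eye(refresh_hz)} frames per eye")
    print(f"1080-line passive panel -> {passive_lines_per_eye()} lines per eye")
```

So a 120 Hz active set gives you the 60 FPE figure, while a 1080-line passive set gives each eye 540 lines, which is the resolution halving that passive sets live with.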

The human eye perceives 'smooth' motion at around 23-25 FPS, and generally 30 FPS is enough to avoid any flicker. Most hardcore gamers will tell you that you need at least 50 fps for a game to play smoothly, but the reality is that your card can produce a million frames per second and they will simply get discarded if the display can't keep up.
Am I the only one here that has the Nvidia kit?
I use Nvidia's stereoscopic 3D system on my computer, with a 120 Hz monitor. I've experimented with it, and I can't visually tell the difference between 60 fps and 100 fps or higher. I can tell the difference between 30 and 60 fps, though. It's not a quality difference; it's more of a fluidity difference, if that makes any sense. Motion seems smoother, but that's just my opinion. My television is also 120 Hz, as is my DVD player. I haven't found a way to scale it down to 30 to test it, but the difference in 3D quality between the active (Nvidia) system and the passive (TV) system is noticeable. Both are more than adequate, but the active system is sharper and motion is smoother. Again, an opinion.
 
I looked up your TV. It will do 1080/60p. Translated loosely, 1080 is the resolution, 60 is the frame rate, and the p means it uses progressive scanning as opposed to interlacing. Progressive scanning produces a much smoother, more 'film-like' display.

The frame rate of a display has to be 'split' to produce an image for each eye. To achieve 60 frames per eye (FPE), the display's refresh rate has to be at least 120 Hz.

The human eye perceives 'smooth' motion at around 23-25 FPS, and generally 30 FPS is enough to avoid any flicker. Most hardcore gamers will tell you that you need at least 50 fps for a game to play smoothly, but the reality is that your card can produce a million frames per second and they will simply get discarded if the display can't keep up.

I use Nvidia's stereoscopic 3D system on my computer, with a 120 Hz monitor. I've experimented with it, and I can't visually tell the difference between 60 fps and 100 fps or higher. I can tell the difference between 30 and 60 fps, though. It's not a quality difference; it's more of a fluidity difference, if that makes any sense. Motion seems smoother, but that's just my opinion. My television is also 120 Hz, as is my DVD player. I haven't found a way to scale it down to 30 to test it, but the difference in 3D quality between the active (Nvidia) system and the passive (TV) system is noticeable. Both are more than adequate, but the active system is sharper and motion is smoother. Again, an opinion.

I agree that at 1080p, active 3D is technically the better solution overall, because passive halves the resolution. If either of the current 3D formats is to survive long term in standard TVs, however, I think it will be passive, because once 4K TVs take off, which shouldn't be too far away, imo the halving of the resolution with passive 3D is not going to be a problem.

My TV is active, and to be fair 3D is not its strongest suit, but even so: expensive glasses which need charging, visible flicker if there are light sources in the wrong places, and slight crosstalk are all things which imo will stop active 3D from bettering passive once TVs have a higher resolution.

As for frame rate, I agree it is a law of diminishing returns the higher you go, but 30 fps? To be honest, 30 fps makes me slightly headachy after a while. Anyone who is used to playing at 60 fps would instantly be able to tell the difference between that and 30 when gaming.
As for TVs supporting HDMI 1.4b or HDMI 2.0, unless your set is new and top of the range, I think you will be out of luck. By the time this game launches, however, perhaps things will be better on that front.
 
I looked up your TV. It will do 1080/60p. Translated loosely, 1080 is the resolution, 60 is the frame rate, and the p means it uses progressive scanning as opposed to interlacing. Progressive scanning produces a much smoother, more 'film-like' display.

The frame rate of a display has to be 'split' to produce an image for each eye. To achieve 60 frames per eye (FPE), the display's refresh rate has to be at least 120 Hz.

Kawboy, thank you so much for helping me; you are commended for your 3D knowledge!! I guess my TV is fine for Elite Dangerous, judging by what you say.
I have an AMD Radeon R9 290 uber graphics card, so the TV will just about be able to keep up with the frame rates. If I ever see you in game, expect a few K credits or a big shield/gun coming your way! :D
 
If I ever see you in game, expect a few K credits or a big shield/gun coming your way! :D
Unnecessary... just don't shoot me! :D

3D has fascinated me since I saw my first stereogram and could see the picture in it. I've done a ton of research/reading on it and I like to babble on anyway so it's not a big deal to share.
 
Passive is definitely going to win. It already has much higher market penetration, it's significantly less expensive, and it's already available at 60 FPE.

As sad as it sounds, I knew passive was the way to go when the Adult film industry adopted it as the preferred method of delivering 3D. Historically, when new video technologies become available, the one that the Adult film industry backs will win.

Betamax vs. VHS
DVD vs. LaserDisc
Blu-ray vs. HD-DVD.

"The race is not always to the swift, nor the battle to the strong. but that's the way to bet! "
 
Passive is definitely going to win. It already has much higher market penetration, it's significantly less expensive, and it's already available at 60 FPE.

The good news is that, unlike Betamax or LaserDisc or HD-DVD, long term it doesn't really matter which one "wins". Even if every manufacturer ditched active tech tomorrow, my TV would continue to work in 3D (OK, getting glasses could become problematic).

So to anyone contemplating getting a 3D TV now: imo, don't let the format war put you off. Get the TV you like, as chances are that by the time you come to replace it, the next set will be a very different beast to anything on the market right now anyhow.
The only small print to this would be: IF you can find one which supports the newer HDMI protocol allowing 1080p 3D at 60 Hz per eye, it is worth considering that.
 
Passive is definitely going to win. It already has much higher market penetration, it's significantly less expensive, and it's already available at 60 FPE.

As sad as it sounds, I knew passive was the way to go when the Adult film industry adopted it as the preferred method of delivering 3D. Historically, when new video technologies become available, the one that the Adult film industry backs will win.

Betamax vs. VHS
DVD vs. LaserDisc
Blu-ray vs. HD-DVD.

"The race is not always to the swift, nor the battle to the strong... but that's the way to bet!"

OK... huh? Have you been drinking?
 
I have it, very good 3D indeed.

So how do we see the battle video in 3D on our Nvidia kit? I installed TriDef, but when you run that stuff the monitor doesn't go into 3D mode. Even if I set it to always be in 3D mode, the TriDef built-in demos don't do any 3D.

How do you set it up?
 
HDMI and 3D
Back to the topic...
Currently there are really only four types of HDMI cable: Standard speed with and without Ethernet support, and High Speed with and without Ethernet support. With current prices not being much different, most recommendations are to go with an inexpensive High Speed cable. The connectors vary between manufacturers, some offering signal boost for longer runs, some protection for automotive use, and so on, but the fact is it's mostly marketing hype. HDMI is a digital signal, which means it works or it doesn't. It's not like an analog signal that degrades over distance or with interference: either there's enough data to make a picture or it fails completely.

As no HDMI version had enough bandwidth available to offer 3D at 60 FPS/eye until 1.4b, computer implementations of 3D (Nvidia and ATI) required DVI-D connectors. 25-30 FPS/eye was adequate for television but not well received by gamers. If you glanced through the list above, you'll see that HDMI wasn't capable of 3D support at 60 FPS/eye until 1.4b became available. The problem now is that 3D is a chain: to achieve acceptable frame rates for games, the source, the cable and the display device must ALL be fully 1.4b compliant. Any weak link in the chain will limit the whole path to the spec of that weak link. Those of you who were talking about new 3D TVs should bear this in mind when you go shopping.

continued

For 120 Hz 3D or 1080p you need an HDMI High Speed cable, so most people already have the required cable. The Panasonic TX-P50VT20 can be pushed to 90 Hz in 2D. I haven't pushed it to 120 Hz as it's meant to generate more heat.


A simple rule of thumb if you're buying a TV: if it is LED/LCD, STEER CLEAR. If it is plasma, buy. :D
 