Question for FD

Just read an article about some software called Orion, which integrates Leap Motion into virtual reality environments. While the likes of Oculus and HTC's Vive are experimenting with complex, often clunky controllers for VR, Leap Motion's tech allows for an accurate reconstruction of your entire hand.
The software was released to developers today.
Have you any info on whether the team will be utilising this?
Cheers :)
Leap Motion chief executive Michael Buckwald said the company was working with all the major VR makers to embed the device within their headsets.
For now, however, the device rests within a little mount attached to the front of the headgear.
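For anyone curious what the tracking data actually looks like, here's a minimal sketch against the Leap Motion SDK v2 Python bindings (assuming the official Leap module is on your path) that just polls the controller and prints each hand's palm position:

    import Leap, time

    controller = Leap.Controller()
    time.sleep(1)  # give the Leap service a moment to connect

    while True:
        frame = controller.frame()  # latest tracking frame
        for hand in frame.hands:
            side = "left" if hand.is_left else "right"
            pos = hand.palm_position  # millimetres, device-centred coordinates
            print("%s hand at (%.0f, %.0f, %.0f)" % (side, pos.x, pos.y, pos.z))
        time.sleep(0.1)

Orion is, as I understand it, that same skeletal tracking reworked to be faster and more robust for head-mounted use.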
 
It's a tricky thing: while your hands might be there, moving non-physical controls like the throttle and joystick in-game doesn't really work well, since there's no resistance and such.
 
You'd still use your existing controls. The hand tracking would be for interacting with switches... of which there aren't any in ED cockpits. Maybe it would be nice for the galaxy map. It looks cool, but it's not very useful in Elite, where everything can be accessed from a HOTAS. I'd prefer they spend their time fixing the VR support for headsets before they consider bringing our hands into VR.
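If FD ever did wire it up, point-and-pinch selection in the galaxy map is the obvious fit. A purely hypothetical sketch using the SDK's pinch_strength value (the select_at callback is made up for illustration; Elite exposes no such hook):

    import Leap

    PINCH_THRESHOLD = 0.8  # pinch_strength runs from 0.0 (open) to 1.0 (full pinch)

    def poll_pinch(controller, select_at):
        # Call select_at(x, y, z) whenever a tracked hand pinches.
        frame = controller.frame()
        for hand in frame.hands:
            if hand.pinch_strength > PINCH_THRESHOLD:
                # stabilised tip position is pre-smoothed by the SDK for UI pointing
                tip = hand.fingers.frontmost.stabilized_tip_position
                select_at(tip.x, tip.y, tip.z)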
 
Totally agree with you! First things first!
 
Leap Motion has become a joke because it only tends to work accurately about 30% of the time (and even then it's highly dependent on ambient lighting, as their sensors were poor and picked up a lot of background noise; the sketch after this post shows the kind of filtering that forces on you). They've blown through their funds making tech demos instead of refining the code, and have laid off most of the developers who knew what they were doing.

Oculus purchased a company called Nimble several months ago. Their tech was fully functional when they were bought, and even before being integrated into the Oculus team they had already recreated your hands in the game environment.

I don't think Leap Motion are going to get anywhere with this, as both Oculus and Vive have developers already doing it. There isn't going to be any demand for a third-party application when the first-party teams already have it integrated into their headsets.
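To be fair, the v2 skeletal API does expose a per-hand confidence value, so you can at least throw away the worst frames yourself. A sketch of what I mean (the threshold and smoothing weight are arbitrary numbers I picked, not anything official):

    import Leap

    MIN_CONFIDENCE = 0.5   # arbitrary cut-off, not an official recommendation
    ALPHA = 0.3            # exponential-smoothing weight, also arbitrary

    smoothed = {}  # hand id -> last smoothed palm position

    def stable_palms(frame):
        # Return (id, position) for hands the SDK itself trusts, lightly smoothed.
        out = []
        for hand in frame.hands:
            if hand.confidence < MIN_CONFIDENCE:
                continue  # likely background noise / bad lighting
            pos = hand.palm_position
            prev = smoothed.get(hand.id, pos)
            pos = prev + (pos - prev) * ALPHA  # Leap.Vector supports +, - and scalar *
            smoothed[hand.id] = pos
            out.append((hand.id, pos))
        return out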
 
Yeah, let's get headsets up and running first. It is an interesting idea, though it will be tricky to get such things right.

Aww, that's a shame if they've gone downhill. It looked very nice in those tech demos, but that is, of course, just a tech demo.
 
Just remember, most of the tech demos they showed were pre-rendered, not live. They were more concepts to get funding for their Kickstarter campaigns at the time.

Nimble was showing off live at trade shows around the same time. There's a reason they were picked up and not Leap Motion.
 

Looks like Nimble went from Oculus to HTC.
 