Just to add some variety to the conversation, I found the following interesting tech. As well as your standard 6-axis head tracking, it also picks up what your eyes are getting up to. It might be tricky with the looking-right-when-you-turn-your-head-left issue of head trackers, but I'm sure it can be configured to take that into account. Has anyone tried this with ED?
I built one of these recently for a disabled friend:
http://www.instructables.com/id/The-EyeWriter-20/?ALLSTEPS
I immediately programmed it to work as a headlook axis in ED "for testing purposes". Of course, I quickly realized that you can't use your eyes for headlook. As soon as you look at something, it moves to the center of the screen, so you then have to look at the center of the screen, which moves the object back to where it was. Basically you can't look at anything without it running away from your gaze. It really should've been obvious.
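To show why the feedback loop happens, here's a toy simulation (my own sketch, nothing to do with the EyeWriter code or ED's internals) of mapping raw gaze position straight onto a headlook axis. The camera moves the thing you looked at, so your gaze chases it forever:

```python
# Toy model: one target at a fixed world angle, a camera whose yaw is
# naively set to wherever the gaze lands on screen each frame.

def simulate(frames=6, target_world=20.0):
    camera = 0.0          # camera yaw, degrees
    history = []
    for _ in range(frames):
        # Where the target appears on screen, relative to center.
        target_screen = target_world - camera
        # The user looks at the target, so gaze == target's screen offset.
        gaze = target_screen
        # Naive headlook: point the camera straight at the gaze position.
        camera = gaze
        history.append(round(camera, 1))
    return history

print(simulate())  # [20.0, 0.0, 20.0, 0.0, 20.0, 0.0]
```

The camera snaps to the target, the target jumps to center, the gaze follows it to center, the camera snaps back, and round and round it goes. Any absolute gaze-to-camera mapping oscillates like this.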
Foveated rendering is where eye tracking becomes genuinely useful. It would also be cool for activating HUD elements, but I can't visualize any useful way to use it to create or modify headlook behavior. Of course, I may be near-sighted in this belief. ;D
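The HUD-activation idea dodges the feedback loop above, because activating a panel doesn't move anything under your gaze. A minimal sketch of how that might work (entirely hypothetical, not any real ED or EyeWriter API) is dwell-based activation: an element only fires after the gaze rests on it for a while.

```python
# Hypothetical dwell-based gaze button: feed it one gaze sample per
# frame; it triggers only after the gaze stays inside its region for
# dwell_frames consecutive frames, then resets.

class DwellButton:
    def __init__(self, region, dwell_frames=30):
        self.region = region          # (x_min, x_max, y_min, y_max), screen coords
        self.dwell_frames = dwell_frames
        self.count = 0

    def update(self, gaze_x, gaze_y):
        """Returns True on the frame the dwell threshold is reached."""
        x0, x1, y0, y1 = self.region
        if x0 <= gaze_x <= x1 and y0 <= gaze_y <= y1:
            self.count += 1
        else:
            self.count = 0            # gaze left the region: restart the timer
        if self.count >= self.dwell_frames:
            self.count = 0
            return True
        return False

# e.g. a button covering screen x 100-200, y 50-100, half-second dwell at 60 fps:
button = DwellButton((100, 200, 50, 100), dwell_frames=30)
```

The dwell delay is the usual trade-off with gaze input: too short and you trigger things just by glancing at them (the "Midas touch" problem), too long and it feels sluggish.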
BTW, if anyone has any disabled friends who can't type and also can't use voice recognition, seriously, build them one of the EyeWriters. It's super cheap, uses most of the stuff we already use for optical head tracking, and gives people who can't afford a medical-grade eye tracker (minimum $3000) a way to communicate again.