Faceware in Star Citizen & the 'Commander Chronicles: Retaliation' Short

It was encouraging to see that the avatars are rigged for full face animation, even if it's not an in-game feature.

It'll be great for fan fiction if a face tracking solution is implemented down the line. Even just basic lip sync with some selectable expressions would be good.

Also, I want my holo-me to smile, the miserable so and so...
 
It would be really cool to have face tracking. It's not new; as others pointed out, Everquest 2 had it years ago.

This is important for roleplaying and machinima: players can show their emotions while interacting with each other, which makes for much better videos.
 
It would be really cool to have face tracking. It's not new; as others pointed out, Everquest 2 had it years ago.

This is important for roleplaying and machinima: players can show their emotions while interacting with each other, which makes for much better videos.

I think so too. I'd be more concerned about client performance today than about Frontier quietly developing it on the side. The cinematic shows that they have been, just as the (season one) launch trailer showed some kind of character generation in the engine back then... but it was a season and a half before we saw that in the 2.3 release.

I think faceware is the future, basically. It could also be a very interesting machine learning project when it comes to what you do when there's a VR unit on your face - self-teach a subroutine to extrapolate a position for your eyebrows (which the tracker can't see) from the shape of your lips (which it can). There's already faceware out there, but if Frontier were able to develop a more optimised / more intelligent version, then they might land themselves a nice piece of proprietary software that they can sell... so I don't think it's reasonable to count it out, even if it's not something people think especially fits with ED right now.
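
A rough sketch of what that self-teaching subroutine could look like, assuming a blendshape-based rig and scikit-learn - completely illustrative, with made-up coefficient names, and not anything Frontier have actually built:

```python
# Minimal sketch (not Frontier's code): learn to predict occluded brow
# blendshape weights from visible lip blendshape weights, so a VR user's
# upper face can still be animated even though the tracker can't see it.
# Coefficient names/counts are hypothetical; real trackers expose their own.
import numpy as np
from sklearn.linear_model import Ridge

# Training data would come from capture sessions *without* a headset,
# where both lips and brows are visible. Here we just fake some frames.
rng = np.random.default_rng(0)
lip_coeffs = rng.uniform(0.0, 1.0, size=(5000, 8))    # e.g. jawOpen, mouthSmile, ...
brow_coeffs = np.clip(lip_coeffs @ rng.normal(size=(8, 4)) * 0.2 + 0.3, 0.0, 1.0)

model = Ridge(alpha=1.0).fit(lip_coeffs, brow_coeffs)

# At runtime, only the lips are tracked; the brows are extrapolated.
tracked_lips = rng.uniform(0.0, 1.0, size=(1, 8))
predicted_brows = np.clip(model.predict(tracked_lips), 0.0, 1.0)
print(predicted_brows)  # feed these weights to the avatar rig
```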

I think it probably will, though, and that it'd be a very different technology track to game mission content, which I think most people want to see brought further to the front in ED as soon as possible. Plainly the avatar maquette can have animations loaded onto it - and a guy working on different inputs in a back room doesn't get in the way of that for me. It's also not 1980 any more... and Frontier need a game product they can sell for the NEXT 20 years, which means not being content with the industry standards of the previous decade.
 
It would be really cool to have face tracking. It's not new; as others pointed out, Everquest 2 had it years ago.

This is important for roleplaying and machinima: players can show their emotions while interacting with each other, which makes for much better videos.

Yeah, that's cool, but as an actual game feature, it isn't something I would ever bother with.
 
Yeah, that's cool, but as an actual game feature, it isn't something I would ever bother with.

Exactly. Really cool and has some definite uses, but there are a lot of other things that need looking at before it. Like exploration, a better way to handle crafting, ways to play the BGS that don't involve grinding the same kind of mission over and over because it's the only good way to increase your pet faction's influence. Heck, grinding in general is a problem.
 
Not sure if this is the best forum for this (mods, please move this thread if you wish).

Star Citizen recently announced a face tracking and lip syncing feature for in-game avatars when the player has a webcam:

https://youtu.be/REUAt0OO-2A?t=57s

I thought this was quite a cool feature, and I was wondering whether similar technology could be implemented in Elite for Holo-Me characters, and the Minor Faction representatives at stations.

And then today I see that the recent 'Commander Chronicles: Retaliation' short has characters with lips synced to speech:

https://www.youtube.com/watch?v=38bHNSB4cDM

So, I'm interested to know to what extent Elite already has support for voice and lip-syncing. Have Frontier developed their own 'Faceware' already? Is this something we could see in-game very soon?

Or did the video creators have to use some artistic license to get the Holo-Me characters to move their lips?

Just curious.

I think it would be cool if they added it, but not at the expense of QoL features many of us have been asking for.

And personally, I would prefer space legs first.

I would be OK with a system like Rainbow Six Vegas had 11 years ago, where your character's mouth moved when you talked. It wasn't synced using a camera, but it was better than being a ventriloquist ;-)

And it would be nice to see the holo-me avatars affected by g-forces like the first person perspective is - maybe even have them frown when their ship is being fired on.

Currently, the impressive looking holo-me avatars have the default unconcerned look even when in combat and on the verge of death :-(
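
Just to illustrate the idea (nothing like this exists in ED's code - the names and thresholds below are made up), a g-force-and-frown mapping could be as simple as:

```python
# Purely illustrative: one way the holo-me could react to flight state,
# leaning with acceleration and frowning while under fire.
def avatar_pose(accel_g: tuple, under_fire: bool) -> dict:
    """Map lateral/vertical g-load and combat state to simple avatar weights."""
    gx, gy, _gz = accel_g
    lean_x = max(-1.0, min(1.0, gx / 6.0))      # sway sideways with lateral g
    lean_y = max(-1.0, min(1.0, gy / 6.0))      # slump/rise with vertical g
    frown = 1.0 if under_fire else 0.0          # swap the default blank stare
    return {"lean_x": lean_x, "lean_y": lean_y, "browDown": frown}

print(avatar_pose((2.5, -1.0, 0.2), under_fire=True))
```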
 
I'd be happy with lip syncing to voice (Arma III does this), and the head tracking being visible to other multicrew CMDRs.

So if I look at them and speak, my CMDR will look that way too, and move his lips accordingly.

Works well enough in Arma 3.
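
For reference, the Rainbow Six Vegas / Arma 3 style voice lip-flap mentioned above is basically just loudness mapped to a mouth-open weight. A minimal sketch, with made-up function and parameter names rather than anything from a real game API:

```python
# Minimal sketch of the "ventriloquist fix": no camera, just drive a
# mouth-open blendshape from microphone loudness, frame by frame.
import numpy as np

def mouth_open_weight(samples: np.ndarray, prev_weight: float,
                      gain: float = 8.0, smoothing: float = 0.7) -> float:
    """Map one audio frame (floats in [-1, 1]) to a 0..1 mouth-open weight."""
    rms = float(np.sqrt(np.mean(samples ** 2)))          # frame loudness
    target = min(1.0, rms * gain)                        # louder -> wider mouth
    return smoothing * prev_weight + (1.0 - smoothing) * target  # avoid jitter

# Example: a fake 20 ms frame of speech-like audio at 48 kHz.
frame = 0.2 * np.sin(np.linspace(0, 60 * np.pi, 960))
print(mouth_open_weight(frame, prev_weight=0.0))
```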
 
Regardless of whether it's in-game or not, I just want to take a second to point out how good the animation and models were.
All that on a fraction of the budget of the other game, too ;)
 
Mark my words: "SC will release as a subscription-based game." Squadron 42 will be the base package, but Star Citizen will be a pay-monthly subscription.
 
Nice idea. I'm sure FDev will think about it when the Space Legs DLC comes. That'll be a year or two after Planet Atmospheres... so I guess we can wait a long time.

For now, the core game is more important.
 
I'm struggling to see the usefulness TBH.

In a game with face-to-face interaction I guess it would be novel, I suppose.

In ED, I don't really see any application currently.
About the best you could hope for is that, under certain circumstances, you'd be able to use the camera-suite to zoom in on another CMDR, see their "rage-quit" face and watch them mouth a word that starts with "F" as you turn their ship into debris.

Given the way an eye-tracker throws a fit when I drink coffee or scratch my nose, I'm not entirely sure this has any substantial merit.

Course, I'm sure the people who were so keen on long hair will see it as another opportunity to enhance their ED-based fan-fic videos.

As someone that often plays in VR... I'm not entirely convinced any other CMDR wants to see a 60fps animation of my "Oculus Face"...

As for uses... Content creators making ED based vids, perhaps? Not really sure how the Devs currently deal with facial animations, but I'm guessing it's hard coded mostly. Honestly, I think finger and hand tracking to allow you to flick switches would probably be far more amusing, though not really needed either, for those with a higher end HOTAS.

Z...
 
FD, just give our avatars the opportunity to smile ..... just a little ..... please :)

I want my CMDR to be able to do this:

[Image: Buddy_Christ_2.png]


Z...
 
Head-look attached to my Rift would be good enough for me :)

Or even just POV in general, for that matter. I noticed my Commander's collar moving as he was looking around even though I had moved my view to face somewhere else.
 
Honestly, I think finger and hand tracking to allow you to flick switches would probably be far more amusing, though not really needed either, for those with a higher end HOTAS.

Z...

See, that's something I'd love to see.

As a flight-sim fan, being able to move to VR would be a complete game-changer.
Trouble is, like most flight-sim fans, I also aspire to a proper sim-pit full of switches, knobs and levers... which you can't see when you're using VR.

If there was some way to accurately track hand position (some kind of funky gloves, perhaps?) then that'd really be a step forward.
Given that most sim-pits tend to either be generic or "in the style of" a particular aircraft, I guess that "calibrating" a VR hand-tracker would be a complete nightmare though.
Seems like the game-dev would have to make the in-game "cockpit" completely configurable or the player would have to build a sim-pit to be a fairly close approximation of the in-game cockpit.

Personally, I suspect a better real-world solution would be to pursue some kind of "Augmented Reality" system whereby the VR headset either has an HD camera built in or uses clever lenses, so that looking down allows you to see stuff in the real world and looking straight ahead fills your view with the in-game display.

Doing it using VR hand-tracking would be amazing, though, if it was 100% reliable because it'd mean you could create a sim-pit with no functional controls at all.
You'd be able to just cobble together a pit using all sorts of non-functional switches, knobs and levers and not have to worry about connecting them up to controller boards because the flight-sim, itself, would recognise what controls you were activating from the hand-tracking movements.
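
As a sketch of that last idea - the sim working out which non-functional switch you reached for purely from tracked hand position - the crude version is just a nearest-control lookup in cockpit space. Control names and positions below are invented for illustration, not from any real sim:

```python
# Minimal sketch: match the tracked fingertip position against the in-game
# cockpit layout to decide which (dummy) switch the player just flicked.
import math

COCKPIT_CONTROLS = {
    "landing_gear": (0.32, -0.18, 0.55),   # x, y, z in metres, cockpit space
    "hardpoints":   (0.35, -0.20, 0.48),
    "lights":       (0.30, -0.15, 0.60),
}

def control_under_finger(fingertip, max_reach=0.04):
    """Return the control within max_reach of the tracked fingertip, if any."""
    best_name, best_dist = None, max_reach
    for name, pos in COCKPIT_CONTROLS.items():
        dist = math.dist(fingertip, pos)
        if dist < best_dist:
            best_name, best_dist = name, dist
    return best_name

# A fingertip sample from the hand tracker (already in cockpit space).
print(control_under_finger((0.33, -0.19, 0.54)))   # -> "landing_gear"
```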
 