Faceware in Star Citizen & the 'Commander Chronicles: Retaliation' Short

Not sure if this is the best forum for this (mods, please move this thread if you wish).

Star Citizen recently announced a face-tracking and lip-syncing feature for in-game avatars when the player has a webcam:

[video=youtube_share;REUAt0OO-2A]https://youtu.be/REUAt0OO-2A?t=57s[/video]

I thought this was quite a cool feature, and I was wondering whether similar technology could be implemented in Elite for Holo-Me characters, and the Minor Faction representatives at stations.

And then today I see that the recent 'Commander Chronicles: Retaliation' short has characters with lips synced to speech:

[video=youtube;38bHNSB4cDM]https://www.youtube.com/watch?v=38bHNSB4cDM[/video]

So I'm interested to know: to what extent does Elite support voice and lip-syncing? Have Frontier developed their own 'Faceware' already? Is this something we could see in-game very soon?

Or did the video creators have to use some artistic license to get the Holo-Me characters to move their lips?

Just curious.
 
Strange priorities, the Star Citizen devs have.

I'm guessing what really happened there was that they came up with motion-capture software to create something for a cut-scene and then thought "Hey, we could stick that in the game too!"

Also, I can't wait to see how this works for somebody using an Oculus Rift.
 
Good observation +1 .... and it is relevant to 2.4 or beyond ;)

Ed said this was cinematic and therefore not in the game, but he was possibly covering for "not in the current game" :)

Strange priorities, the Star Citizen devs have.

I'm guessing what really happened there was that they came up with motion-capture software to create something for a cut-scene and then thought "Hey, we could stick that in the game too!"

Also, I can't wait to see how this works for somebody using an Oculus Rift.

Yes, something else to throw in for a project team to say "this is what we did this month, sir"
 
All well and good having this feature within SC, but they need to stop adding stuff and get the core game finished.
The game was kickstarted a month before Elite and was due for release by the end of 2014, yet Frontier has released a core game with plenty of updates.
 
All well and good having this feature within SC, but they need to stop adding stuff and get the core game finished.
The game was kickstarted a month before Elite and was due for release by the end of 2014, yet Frontier has released a core game with plenty of updates.

Not to mention releasing Planet Coaster and then Jurassic derp next year.
 
The facial animation software CIG is using is existing software developed by a third party.

LOL - so their real progress is ... zero? Implementing other people's libraries doesn't count.

I guess it will distract everyone from the massive failure that was CitizenCon (hint's in the name! ;) ).
 
I'll keep all SC talk in the 'at some point' box until they actually bring out a final game for release. It's very pie-in-the-sky stuff, most of what I've seen, which is just crazy given the amount of money they got!
 
I really like some design choices in SC (space legs right out of the box), but when SC is released, there will be Elite Feet™ in Elite already. This facial thing is just for showing off. This is what you get when you give money to Chris Roberts.
 
Meh - please don't let this thread descend into a Star Citizen bashing thread. I'm really only interested in Elite's implementation.

Edit: Ninja'd by Jenner.
 

rootsrat

Volunteer Moderator
Ed said this was cinematic and therefore not in the game,

This. I don't think we'll see anything like that in Elite anytime soon. Even if they have it planned, I can't imagine it being anywhere near the top of their priority list.
 

Avago Earo

Banned
From 2012. Jump to 01:08 for animation:

[video=youtube_share;dmeb1ntW8Dg]https://youtu.be/dmeb1ntW8Dg?t=1m2s[/video]
 
I'm struggling to see the usefulness TBH.

In a game with face-to-face interaction it would be novel, I suppose.

In ED, I don't really see any application currently.
About the best you could hope for is that, under certain circumstances, you'd be able to use the camera suite to zoom in on another CMDR, see their "rage-quit" face and watch them mouth a word that starts with "F" as you turn their ship into debris.

Given the way an eye-tracker throws a fit when I drink coffee or scratch my nose, I'm not entirely sure this has any substantial merit.

Course, I'm sure the people who were so keen on long hair will see it as another opportunity to enhance their ED-based fan-fic videos.
 
The best motion capture system implemented to date was actually a surprisingly simple solution developed by Team Bondi in their game L.A. Noire. For those of you who haven't played that game, they basically took video captures of the actors' lips while they recited the script, then replayed those videos where the mouth normally sits on the face. You can tell exactly what they're saying even if you turn off your audio. It's quite amazing.
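
Very roughly, the trick is just "pick the captured frame that matches where you are in the dialogue audio and paste it where the face goes". A quick Python sketch of the idea (the frame rate, names and compositing call are all made up here - this is just to show the mechanism, not Team Bondi's actual code):

[code]
# Rough sketch of the "replay the recorded face" idea, not Team Bondi's code.
# Assumes the facial performance was filmed at a fixed frame rate alongside
# the dialogue audio, so picking a frame is just a lookup by playback time.

CAPTURE_FPS = 30  # assumed capture rate of the pre-recorded facial video

def face_frame_for(dialogue_time_s, total_frames):
    """Map the current dialogue playback time to a captured face-frame index."""
    frame = int(dialogue_time_s * CAPTURE_FPS)
    return min(frame, total_frames - 1)  # clamp at the last captured frame

def render_head(base_head_texture, face_frames, dialogue_time_s):
    """Paste the captured frame onto the head where the face normally sits.
    face_frames is the list of pre-recorded images of the actor's performance."""
    frame = face_frames[face_frame_for(dialogue_time_s, len(face_frames))]
    # In the real game this would be projected onto the head mesh; here it's
    # just "composite image A onto image B in a fixed region".
    return composite(base_head_texture, frame, region="face")

def composite(base, overlay, region):
    """Stand-in for whatever texture blit the engine actually does."""
    return (base, overlay, region)
[/code]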

As for ED, I suspect they hired a mo-cap studio and actors to produce the animations for them, which FDev's content team later imported into Maya or whatever authoring tool they use. There's a lot of proprietary mo-cap tech out there, but fundamentally they're just sophisticated trackers. It's the sort of thing where, if your company doesn't do a lot of mo-cap work, it's much more cost-effective to outsource.

As for in-game, I don't really think this is something to expect. Even if in some future content update we can populate our bridge with NPCs that randomly talk, those speeches and animations will likely be scripted, outsourced, and imported into the game. Bold artists may even try their hand at manually rigging and keyframing the faces, though that's more of a niche skill among artists these days and it's simply faster and cheaper to rent a mo-cap studio. It's unlikely a game studio would use this kind of real-time tech in-game, as it's not cost-effective and is prone to a tonne of issues and edge cases.

Still, it's interesting tech. Such facial recognition and tracking algorithms are useful in the field of robotics.
 
The facial capture stuff makes sense in SC (sort of - we'll see how it works out when real people use it; I can't wait to see a 6'6" musclebound bouncer with the exaggerated facial expressions of a 10-year-old), because in that game players are expected to interact with each other face to face a lot. To get missions and trade you have to get out of your ship, some missions you have to do on foot, there's FPS combat, etc.

In ED, on the other hand, you don't see anyone's face unless they're crewing with you, and even then you've got better things to pay attention to. So I wouldn't expect anything even remotely like facial capture in ED - it would be an even bigger waste of resources than it is in SC.

One thing that would be nice to see in ED, though, is the head tracking that the facial capture also provides - it's a good alternative to "real" VR in that it's not as immersive, but you can still see your real hands, so you can type, write stuff down, etc. That I would hope to see in ED some time.
 
I don't have a webcam, so it's of zero use to me. And besides, it will probably be years before we get space legs, so it's not like we see each other's characters much anyway.

I presume they used a standard flappy lips library or it was just animated by the animation team.
 
I don't have a webcam, so it's of zero use to me. And besides, it will probably be years before we get space legs, so it's not like we see each other's characters much anyway.

I presume they used a standard flappy lips library or it was just animated by the animation team.

That's basically it - where the guy talks about tier 1 and tier 3 characters, that refers to the level of detail on the motion capture files for the character. Some characters are important NPCs so they have a lot of detail on their mocap, others are player avatars so they get a lot less detail because all those animations use up memory and you're going to have a lot more Crewman Lennys running around than you're going to have Captain Kirks. The facial capture API examines video of someone's face and interprets facial expressions out of it, and those expressions are then fed to the animation system so it can play back the corresponding mocap animations for the character.
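
In really rough terms that loop looks something like the sketch below (Python, entirely made up - the landmark names, the rest-pose constants and the set_blendshape call are my own assumptions to show the data flow, not Faceware's actual API):

[code]
# Very rough sketch of a webcam -> expression weights -> animation pipeline.
# None of this is Faceware's real API; it just shows the shape of the data flow.

def landmarks_from_frame(frame):
    """Stand-in for the face tracker: should return named 2D landmark
    positions (normalised 0..1) found in the webcam frame."""
    raise NotImplementedError("plug a real face tracker in here")

def clamp01(x):
    return max(0.0, min(1.0, x))

def expression_weights(lm):
    """Turn raw landmark positions into a handful of blendshape weights.
    The landmark names and rest-pose constants are invented for the example."""
    mouth_open = abs(lm["lower_lip"][1] - lm["upper_lip"][1])
    mouth_wide = abs(lm["mouth_right"][0] - lm["mouth_left"][0])
    brow_raise = abs(lm["brow_mid"][1] - lm["eye_mid"][1])
    return {
        "jaw_open":   clamp01(mouth_open / 0.08),
        "mouth_wide": clamp01((mouth_wide - 0.20) / 0.10),
        "brow_up":    clamp01((brow_raise - 0.05) / 0.05),
    }

def drive_avatar(frame, avatar):
    """One tick: track the face, convert it to weights, feed the rig."""
    weights = expression_weights(landmarks_from_frame(frame))
    for shape, value in weights.items():
        avatar.set_blendshape(shape, value)  # assumed engine-side call
[/code]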

Also, if you haven't tried head tracking yet (TrackIR or EDTracker), you're missing out. Trust me :)
 
I'd rather Frontier not waste the money they've made from me so far on this sort of niche tech-demo vaporware feature, to be honest.

Thankfully, to my knowledge, no, they aren't working on this.
 