Star Citizen Thread v6

Status
Thread Closed: Not open for further replies.
Snarfbuckle that's a great and useful list using average salaries.

You need to account for 100 days mocap costs at the most expensive mocap studio in the world, actor fees, third party studios (Ilfonic, the AI team, various art teams, external 3d modelers etc)

Also the money taken out in shares (listed at companieshouse)

Also the wages of the big 3

Also marketing costs, travel (the backers will never know), and hospitality.

See what you come up with.
 
Considering people have already got it working with a hack, it's hardly a giant leap. And if ED did it in 2014, it can be added in SC.

Sure it can, but they will also need a special vomit-bag extension for their VR... that's actually a good idea for CIG's marketing department: they can make these special new-tech sickness bags and earn a few more bucks.
 
The faceware feature won't work, because:

- a specific peripheral is needed, at least a webcam, maybe even a special one (and some people will have to buy it only for this specific purpose)
- users have to invest time in configuration
- the user has to create a specific light setting for good results
- it adds more data and costs performance
- it doesn't look good or realistic; it seems to offer mostly a comedy effect, which does not fit well into the serious, realistic world of Star Citizen
- it is not relevant to the core gameplay; it is a negligible feature in a space shooter, as close interaction between avatars won't play a big role in this type of game
 
Snarfbuckle that's a great and useful list using average salaries.

You need to account for 100 days mocap costs at the most expensive mocap studio in the world, actor fees, third party studios (Ilfonic, the AI team, various art teams, external 3d modelers etc)

Also the money taken out in shares (listed at companieshouse)

Also the wages of the big 3

Also marketing costs, travel (the backers will never know), and hospitality.

See what you come up with.

Yes, i know, there are added costs apart from wages.

Motion and Performance Capture
100 days of mocap and its costs might be difficult to calculate since they have their own mocap studio, so it's mostly hardware and salary cost. But it was a 10 million stretch goal, and one visit to an external studio runs 25-50K USD and yields about 200 "moves", which means that at 50K per visit the full budget would buy roughly 40,000 moves at an external studio, and most likely a LOT more with their own studio (which was set up in a rented apartment with their own tech). So motion capture would in this case be rather cheap.
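The external-studio figures above can be checked with a quick sketch; the cost per visit and moves per visit are the post's own assumptions, not confirmed CIG numbers.

```python
# Rough check of the external-studio mocap estimate (assumed figures).
STRETCH_GOAL_USD = 10_000_000  # the motion-capture stretch goal
COST_PER_VISIT_USD = 50_000    # upper end of the quoted 25-50K per studio visit
MOVES_PER_VISIT = 200          # rough "moves" captured per visit

visits = STRETCH_GOAL_USD // COST_PER_VISIT_USD
total_moves = visits * MOVES_PER_VISIT
print(visits, total_moves)  # 200 visits, 40000 moves at external-studio rates
```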

Performance capture is even more difficult to estimate since we are talking about hired actors AND Imaginarium Studios, and CIG had a partnership with them so they might have gotten it cheaper as well. But let's take the average salary of an American actor in 2017, about 26 USD/hour, quadruple that, and round down to an even 100 USD. And since this is not a movie, we are not talking about paying an actor a few hundred million; there is no Hollywood accounting to worry about, just a straight hourly contract.

With 20 actors according to IMDb, let's also say for simplicity that with retakes we have 100 straight hours per actor AND that they paid 50K per day for the studio.
That's 200K for the actors alone, and with 8-hour work days an additional 625K for the studio. Let's add a nice hotel for each actor at 200 USD/night for 2 weeks each. That's 56K more.

Total average performance capture: 881,000 USD. Let's round that up to a straight million in case some actors were more expensive to work with.

Considering that the stretch goal was 10 million, we can double the cost for the full performance capture, add the same again for motion capture, and still come in under half the stretch goal amount.
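The performance-capture tally above adds up as follows; the actor rate, studio day rate, and hotel cost are all assumptions from the post, not published figures.

```python
# Sanity check of the performance-capture estimate (assumed inputs).
ACTORS = 20
HOURS_PER_ACTOR = 100
ACTOR_RATE_USD = 100          # per hour, ~4x the 2017 US average, rounded
STUDIO_DAY_RATE_USD = 50_000  # assumed external-studio day rate
HOTEL_NIGHT_USD = 200
NIGHTS_PER_ACTOR = 14         # 2 weeks per actor

actor_cost = ACTORS * HOURS_PER_ACTOR * ACTOR_RATE_USD    # 200,000
studio_days = HOURS_PER_ACTOR / 8                         # 12.5 eight-hour days
studio_cost = studio_days * STUDIO_DAY_RATE_USD           # 625,000
hotel_cost = ACTORS * NIGHTS_PER_ACTOR * HOTEL_NIGHT_USD  # 56,000
total = actor_cost + studio_cost + hotel_cost
print(int(total))  # 881000, rounded up to ~1 million in the post
```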

Wages and other costs
Wages of the big three can easily be added; let's take the 2014 management salary at 150%, which becomes 12,500 USD/month, or 2,100,000 USD for 3 people from 2013 to today.

Considering that no backer money is earmarked for marketing, it would not be reasonable to add it here, and I'm pretty sure the subscriber money would cover it. Let's assume 1% of registered accounts are subscribers: that's 18K people at 10 USD a pop, or 180,000 USD per month and a yearly budget of 2.1 million for Around the Verse and other little marketing shows.

Travel and hospitality are not exactly fixed production costs, but I have a feeling they don't break the 2.1 million a year from subscribers, and I doubt the entire staff goes on an all-expenses-paid vacation to the Bahamas.
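The wage and subscriber figures above, spelled out; the subscriber count and salary level are the post's assumptions, not published numbers.

```python
# Big-three wages and subscriber income, as assumed in the post.
MGMT_MONTHLY_USD = 12_500  # 150% of the assumed 2014 management salary
PEOPLE = 3                 # the "big three"
MONTHS = 56                # roughly 2013 to mid-2017

big_three_wages = MGMT_MONTHLY_USD * PEOPLE * MONTHS
print(big_three_wages)     # 2100000 USD

SUBSCRIBERS = 18_000       # assumed 1% of registered accounts
SUB_FEE_USD = 10
monthly_income = SUBSCRIBERS * SUB_FEE_USD
yearly_income = monthly_income * 12
print(monthly_income, yearly_income)  # 180000 per month, 2160000 per year
```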

Wages+Stretch Goals

Just adding wages and stretch goals together, we can see how much has been used/earmarked so far.

Wages: 80 million
Goals: 65 million

Total: 145 million
 
The faceware feature won't work, because:

- a specific peripheral is needed, at least a webcam, maybe even a special one (and some people will have to buy it only for this specific purpose)
- users have to invest time in configuration
- the user has to create a specific light setting for good results
- it adds more data and costs performance
- it doesn't look good or realistic; it seems to offer mostly a comedy effect, which does not fit well into the serious, realistic world of Star Citizen
- it is not relevant to the core gameplay; it is a negligible feature in a space shooter, as close interaction between avatars won't play a big role in this type of game

I do have to disagree on some points

- Most games require some peripheral, and this one gives head tracking and facial motion capture without requiring a headset
- Investing time in configuration is required in ALL games unless we are talking about a Nintendo platformer: key bindings and graphics settings, to name a few. Not to mention we have no idea how well this works out of the box.
- We do not know IF the user has to create a specific light setting (but I don't dispute the possibility). I'm more wondering about WHERE the camera can be put, since I use a 42" computer screen and the camera will most likely see only the top of my nose and forehead...
- Added data and performance cost is a possibility, unless they fix the netcode, and we do not know how much data that would be.
- Their current version might look wonky, but it's not finished.
- What core gameplay? Squadron 42? A group of actual online roleplayers? Your way of playing? It's too early to tell.
 
I do have to disagree on some points

- Most games require some peripheral, and this one gives head tracking and facial motion capture without requiring a headset
- Investing time in configuration is required in ALL games unless we are talking about a Nintendo platformer: key bindings and graphics settings, to name a few. Not to mention we have no idea how well this works out of the box.
- We do not know IF the user has to create a specific light setting (but I don't dispute the possibility). I'm more wondering about WHERE the camera can be put, since I use a 42" computer screen and the camera will most likely see only the top of my nose and forehead...
- Added data and performance cost is a possibility, unless they fix the netcode, and we do not know how much data that would be.
- Their current version might look wonky, but it's not finished.
- What core gameplay? Squadron 42? A group of actual online roleplayers? Your way of playing? It's too early to tell.

In respect to headtracking, sounds similar to the Tobii 4c.

The 4c :

- Requires no special lighting conditions, but I believe this is only because it emits its own IR via two IR LEDs
- For me the 4c head tracking aspect is great, with the exception of looking up; it seems to lose the ability to detect when my head tilts back
- The 4c attaches underneath the monitor, presumably because this is generally in a consistent position between users; on top of the monitor would vary due to monitor sizes
- 4c processing is done in hardware; the previous iteration, the Tobii EyeX, did it in software. I can't remember the overhead, though clearly the EyeX was doing less work than the face tracking will be

Also the Sony one, SOEmote, just wanted a 30fps cam, so no "special" hardware required for that :

http://www.pcgamer.com/soemote-implants-your-facial-expressions-movement-and-voice-onto-your-eq2-avatar/ said:
Robert Gehorsam: Within EverQuest 2, the Live Driver software essentially performs detailed measurements of the player's facial expression on every image captured from their webcam. Today's webcams typically operate at 30 frames per second. Our software tracks 64 points and makes around 100 expression measurements per frame, for over 5,000 measurements per second.

Actually, it seems the SC version is the same software used in EQ2?

http://www.pcgamer.com/soemote-implants-your-facial-expressions-movement-and-voice-onto-your-eq2-avatar/ said:
RG: Prior to Live Driver—the technology used in EQ2—our professional facial animation technology, Faceware... was used in more than 40 AAA gaming titles, including Red Dead Redemption, GTA IV, Assassin's Creed, Max Payne and more.

Faceware?
 
In respect to headtracking, sounds similar to the Tobii 4c.

The 4c :

- Requires no special lighting conditions, but I believe this is only because it emits its own IR via two IR LEDs
- For me the 4c head tracking aspect is great, with the exception of looking up; it seems to lose the ability to detect when my head tilts back
- The 4c attaches underneath the monitor, presumably because this is generally in a consistent position between users; on top of the monitor would vary due to monitor sizes
- 4c processing is done in hardware; the previous iteration, the Tobii EyeX, did it in software. I can't remember the overhead, though clearly the EyeX was doing less work than the face tracking will be

Also the Sony one, SOEmote, just wanted a 30fps cam, so no "special" hardware required for that :

It's actually exactly like FaceTrackNoIR

http://facetracknoir.sourceforge.net/

It's just that it can also detect distortions of the human face and map that data onto a character model. And all one needs is a 60fps camera.

Tobii is different since it is EYE tracking, so the moment it loses "sight" of your eyes it loses track.
 
I do have to disagree on some points

- Most games require some peripheral, and this one gives head tracking and facial motion capture without requiring a headset
- Investing time in configuration is required in ALL games unless we are talking about a Nintendo platformer: key bindings and graphics settings, to name a few. Not to mention we have no idea how well this works out of the box.
- We do not know IF the user has to create a specific light setting (but I don't dispute the possibility). I'm more wondering about WHERE the camera can be put, since I use a 42" computer screen and the camera will most likely see only the top of my nose and forehead...
- Added data and performance cost is a possibility, unless they fix the netcode, and we do not know how much data that would be.
- Their current version might look wonky, but it's not finished.
- What core gameplay? Squadron 42? A group of actual online roleplayers? Your way of playing? It's too early to tell.

- Faceware adds additional configuration effort
- it was said in the Chris Roberts presentation that full performance (60fps tracking) is only possible when the environment is properly lit
- it will cost data and performance; the face tracking alone uses computing power (see the system requirements on the Faceware website) and needs bandwidth in an online environment
- nothing is finished in Star Citizen, but that doesn't mean everything is possible
- as I understand it, Squadron 42 is supposed to be the successor to Wing Commander and Star Citizen is supposed to be Privateer in an online environment. The core mechanic of both games is space flight and space combat
 
I don't get all the obsession with this Face over IP thing. To me, it's nothing more than an OH LOOK, SHINY! distraction from the state of the 'game'. It's wasted effort. There are far more important problems (15 fps! Flight model - need we go there? Jankiness, "blocker bugs", and Sq42...) than this useless feature.
 
- Faceware adds additional configuration effort
- it was said in the Chris Roberts presentation that full performance (60fps tracking) is only possible when the environment is properly lit
- it will cost data and performance; the face tracking alone uses computing power (see the system requirements on the Faceware website) and needs bandwidth in an online environment
- nothing is finished in Star Citizen, but that doesn't mean everything is possible
- as I understand it, Squadron 42 is supposed to be the successor to Wing Commander and Star Citizen is supposed to be Privateer in an online environment. The core mechanic of both games is space flight and space combat

The question is how MUCH data is transferred, but I can definitely see some silly situations:

- Does the avatar turn its head if the player does? That might cause some weird body movement.

While SQ42 is the spiritual successor to WC, that does not mean it cannot expand to do MORE. The FPS portion of the first-person universe will most likely have some cool boarding actions between ships.

I see this tool mainly for those who want to make machinima videos similar to Clear Skies, or for roleplayers.
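For a rough feel of "how much data", we can borrow the SOEmote figures quoted earlier (~100 expression measurements per frame at 30 fps) and assume each measurement travels as an uncompressed 4-byte float; both the encoding and the reuse of those numbers for SC are assumptions.

```python
# Back-of-envelope bandwidth estimate for per-player face tracking data.
MEASUREMENTS_PER_FRAME = 100  # from the quoted SOEmote interview
FPS = 30                      # typical webcam frame rate per the same quote
BYTES_PER_MEASUREMENT = 4     # assumed uncompressed 32-bit float

bytes_per_second = MEASUREMENTS_PER_FRAME * FPS * BYTES_PER_MEASUREMENT
print(bytes_per_second)  # 12000 bytes/s, i.e. under 12 KiB/s before compression
```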
 
I don't get all the obsession with this Face over IP thing. To me, it's nothing more than a OH LOOK SHINY! distraction from the state of the 'game'. It's wasted effort to me. There's a whole lot more important problems (15 fps! Flight model - need we go there? Jankiness... "blocker bugs", and Sq42...) than this useless feature.

It's the new rallying cry because it's the only truly new, unexpected thing to come out of the presentation that appeared, at face (AH Ha haha hahahaha) value, to actually work.
 
I don't get all the obsession with this Face over IP thing. To me, it's nothing more than a OH LOOK SHINY! distraction from the state of the 'game'. It's wasted effort to me. There's a whole lot more important problems (15 fps! Flight model - need we go there? Jankiness... "blocker bugs", and Sq42...) than this useless feature.

I can see it as a tool for them - the devs.

- They can more easily add an NPC and do quick voice acting on it with minimal effort; combine that with their motion capture studio and they have a tool to quickly create a talking NPC scene.
 
They can more easily add an NPC and do quick voice acting on it with minimal effort; combine that with their motion capture studio and they have a tool to quickly create a talking NPC scene.

That will then stand there aimlessly until a custom script is kicked off for it, or it falls through the floor, or explodes into a tentacle monster, or ceases to exist if no player has it in view :(
 
That will then stand there aimlessly until a custom script is kicked off for it, or it falls through the floor, or explodes into a tentacle monster, or ceases to exist if no player has it in view :(

Sounds like most MMO quest-giver NPCs... But the exploding tentacle monster is probably more limited to Japan.
 
I can see it as a tool for them - the devs.

- They can more easily add an NPC and do quick voice acting on it with minimal effort; combine that with their motion capture studio and they have a tool to quickly create a talking NPC scene.

Sure, it could be used like that. Without adding it to the game at all.
 
Sure, it could be used like that. Without adding it to the game at all.

Hah, they HAVE actually used it for mission editing.

The Derby Studio has been busy as ever. The Facial Team finished off all the animations needed for the 3.0 Mission Givers while continuing SQ42 work. Eckhart alone has over 47,000 frames (26 minutes) of bespoke facial animation and is one of over 13+ mission givers currently in production for the PU.
After the recent 3-day Audio/Headcam shoot in London, all data has been tracked in Faceware and retargeted onto our face rigs in Maya. This is a great achievement for the team as there were over 125,000 frames or almost 70 minutes of footage shot.

But I guess missions and content add nothing to the game...
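As a quick consistency check, the frame counts in the quoted report both work out to roughly 30 fps of facial animation data, which lines up with the webcam frame rates discussed earlier.

```python
# Frames-to-minutes check of the quoted CIG report figures.
eckhart_fps = 47_000 / (26 * 60)   # 47,000 frames over ~26 minutes
shoot_fps = 125_000 / (70 * 60)    # 125,000 frames over ~70 minutes
print(round(eckhart_fps, 1), round(shoot_fps, 1))  # 30.1 29.8
```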
 