Gunner = Arcade Action Cam for the 12 yr olds?

To all you immersion fans.

I am going to break this to you: this POV is here to stay. They've probably spent the past few years designing it, and they are not going to take it out to satisfy you guys. Tell yourselves whatever you need to tell yourselves, but the rest of us like it. I like immersion as much as the next person, but there comes a point where it needs to work for more than just a certain group. If they had done it any other way (I promise you they tried a lot of things before this was settled on) it would not have worked as well. How exactly would you aim 4+ turrets, keep lock on target, and not get sick while the pilot is flying the ship you're shooting from? It's just not possible.

Sincerely,
All of us who like it.

There are plenty of ways to get the same effect. One has already been given, and it would have looked and played far better than what we have now.
 

Goose4291

Banned
I hate to say this, but there is a precedent for this in prior Frontier games.

Remember the combat computer from First Encounters, which put you in third person behind your ship and allowed you to (badly) fight other ships? Admittedly this was just with your for'ard hardpoint, not turrets or stern ones, but the basic principle of 3rd person combat applies.

 
Why "100%" vomit cam?

It's been done in other games, and I don't remember those being particularly vomit inducing.

Maybe you're right, testing will tell. Though basically I think the view is incredibly well produced but (imo here) a shade undercooked. Could be more of a tactical view?

No matter how nice the exterior of the ship looks (and it does look nice, but 'could' it get old seeing it there? Is it helpful or distracting in a fight?), and no matter that you can give the third person cam a 'not really a third person cam' technology lore (which I'm down with, but I don't think is especially intuitive) .. motion sickness would be another reason (I think) maybe to look at further developing a more 'tactical' view from the one demonstrated. Not because I think Frontier have done a bad job, but because I think going a bit further could ice the cake. 2.3 is already a massive upgrade, I do recognise that, and I like the new GUI. Camera position .. not very convinced though.
 
Maybe you're right

Well I don't know if I'm right, but I see no reason why it would be. You are saying it will be "100% vomit cam", which is quite a claim. I mean there's plenty of precedent; Dreadnought recently, for example, isn't vomit inducing.

It's not about whether I am right or you are; I actually can't be sure. But you are making a claim, so I am just wondering where you get this 100% figure from?
 
Come on old chap, fair's fair. I've listed my two; please list yours and posit your own two games :)

Put it this way, a simple scenario: in ED, if I kill your engines, then unless you've got FA off at the time you stop dead in the (proverbial) water, whereas in Frontier, if an NPC killed your drive, you just kept going in your direction of travel.

Fair's fair.
https://www.youtube.com/watch?v=FwKY1XF6C0c <-- forgetting the odd name and the fact that it's a random iOS/Android game, it shows arcade behaviour, as in very little realism and mostly based on fun.

Of course my counter would be Elite/Star Citizen as examples of realistic flight models. Yes, I agree certain behaviour is not realistic, but those are gameplay choices. FA is an 'assist', basically fly-by-wire; many modern jets use this, and it does a heck of a lot of compensation, handling the input from the pilot before it is put out.
Now as for the whole thruster thing, I agree, but again one aspect of something doesn't mean the whole is not realistic. At best, Elite (and Star Citizen) in my book are realistic with gameplay-specific choices. So yeah, not 100% realistic, but still 90% realistic or so.
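To illustrate what I mean by the assist "handling the input": here's a rough Python sketch of a fly-by-wire style rate damper. The gains, limits and numbers are all made up for illustration; it's the general principle, not anything from ED's actual code.

[code]
# Minimal sketch of a flight-assist style rate damper (illustrative only;
# the gains and limits are made-up numbers, not anything from ED).

def flight_assist(pilot_input, current_rate, max_rate=2.0, gain=4.0, max_torque=1.5):
    """pilot_input in [-1, 1]; rates in rad/s; returns a clamped torque command."""
    desired_rate = pilot_input * max_rate              # stick deflection -> target turn rate
    error = desired_rate - current_rate                # rotation the assist must correct
    torque = gain * error                              # proportional correction
    return max(-max_torque, min(max_torque, torque))   # respect thruster limits

# With FA on, releasing the stick (pilot_input = 0) makes the assist actively
# null out any residual spin; with FA off you would just pass the raw stick
# through and keep rotating until you counter it yourself.
rate = 1.0                                             # ship currently spinning at 1 rad/s
for _ in range(200):                                   # 200 steps of 10 ms
    rate += flight_assist(0.0, rate) * 0.01            # unit moment of inertia for simplicity
print(round(rate, 4))                                  # ~0: the spin has been damped out
[/code]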
 
Well I don't know if I'm right, but I see no reason why it would be. You are saying it will be "100% vomit cam", which is quite a claim. I mean there's plenty of precedent; Dreadnought recently, for example, isn't vomit inducing.

It's not about whether I am right or you are; I actually can't be sure. But you are making a claim, so I am just wondering where you get this 100% figure from?

I stand corrected .. though there's certainly some concern in VR forums, and I woke up in a bit of a cold sweat about the general nature of the view?
 
Posting here because the other thread was locked.

https://forums.frontier.co.uk/showt...nner-First-Person-View-Arguments-(Discussion)

No? How do we do it in the real world?

I do believe most turrets on modern naval vessels by now are highly automated and remotely operated.

Said vessels also rarely move as fast, or engage enemy vessels at such short ranges as the ships do in Elite.

I'm not even myself a fan of the third person view - I think it will probably feel pants in VR just like the SRV turret does.

I just don't quite follow the "realism" argument. Did you think this was unrealistic in I-War too? If you remember, it had a gunnery view almost exactly like the one we're discussing here. You don't think technologies like augmented/virtual reality, remote controlled drones or telepresence are possible? Reconstructing a 360° view based on camera feeds?

All of this is more or less possible today, in 2017. The only thing about the update that is kind of sci-fi is the holographic player avatars, but they could be faked using augmented reality (which I seem to vaguely recall was the justification for holograms in the ED lore in the first place). Yes, it raises some questions about long-range communication in the ED galaxy, but that stuff was never consistent in the first place.

If the game was fully realistic, anyway, we wouldn't have completely nonsensical poppycock like human-aimed weapons or WW2 style glass cockpits in the first place. We'd only engage using automated and remote controlled drones and missiles from vast distances. The game would resemble Children of a Dead Earth more than Elite in its current form.

But I get that WW2 planes in space is more fun to most people! I too backed the game because I wanted that space opera fantasy I remembered the original Elite for. There's no need for simulationist purism here. It's not that sort of game. It's not exactly Orbiter.

I think the best way to make the gunner view work in VR would be to have it on a sort of holographic "screen" floating in front of the CMDR using it, showing everything in miniature form. That way you'd avert the negative feelings and nausea potentially created by being a disembodied head floating in space. I don't see the need for something like this for flat screen players though, you're already looking at a screen in the first place, and aren't experiencing immersion like a VR player would.

Debate: Who has the advantage with a solo commander combat Anaconda ship vs. a pirate multi-crew combat Anaconda ship?
I think the more pertinent question is: does being in a single multi-crew ship give the pirates some advantage over the same number of players flying in a wing of multiple Anacondas?
 
http://www.dailymail.co.uk/sciencet...s-appear-transparent-soldiers-stay-hatch.html

https://www.google.es/webhp?sourcei...al+reality+war+tactics+vehicles+noruegan+army




Nothing more to say here for me, but I think the gunner's cam could be a reality in the time of ED. Were you thinking of Luke Skywalker using the Millennium Falcon's turrets? lol

Now THIS would be 100% acceptable. But ED didn't go this route. We have glass canopies. Would be very jarring and nonsensical if they reversed course midstream.
 
I stand corrected .. though there's certainly some concern in VR forums, and I woke up in a bit of a cold sweat about the general nature of the view?

Oh VR yeah, maybe.

But saying that, the first person SRV view makes me want to retch in VR too, within minutes. :(

And oddly, perhaps a 3rd person gunner style SRV view with viewable HUD might actually help there! =p
 

Goose4291

Banned
I do believe most turrets on modern naval vessels by now are highly automated and remotely operated.

As someone who's stood freezing my posterior off next to a 20 mil on exercise for hours on end in the Channel, I can tell you that's not the case.
That said, CIWS weapon systems are remotely operated (and for the most part, automated).

You're right on the rest though :).
 
I like the GUI. Direction-to-target indicators, line-of-sight gun-to-target indicators ..

Would love to see powerplant and module health percentages added (warning, sensors at 1% .. you're about to lose your gunner!?). And if your hull is high but your powerplant is about to fold, that really would be eyes in the back of the Helmsman's head?

[up]
 
Just watched a few minutes of the footage. To me it seems that Elite is "evolving" from this:

https://img.grouponcdn.com/deal/5Pdys9wsPunF5Nnybu2Y/5P-440x267/v1/c700x420.jpg


To this:

http://www.motherjones.com/files/reaper-pilot425x320.jpg


That makes me think that maybe we shouldn't have canopies at all, or even fly our ships at all. I want to be the first guy.
 
Now THIS would be 100% acceptable. But ED didn't go this route. We have glass canopies. Would be very jarring and nonsensical if they reversed course midstream.

I'd super like it if stuff changed; it gives a sense of tech advancing. It'd fit with the whole WWII-in-space thing too, where you go from garbo caveman equipment to radios and jets in like six years.

- - - Updated - - -

Just watched a few minutes of the footage. To me it seems that Elite is "evolving" from this:

https://img.grouponcdn.com/deal/5Pdys9wsPunF5Nnybu2Y/5P-440x267/v1/c700x420.jpg

To this:

http://www.motherjones.com/files/reaper-pilot425x320.jpg

That makes me think that maybe we shouldn't have canopies at all, or even fly our ships at all. I want to be the first guy.

you want to be a giant logistical bottleneck that has to be replaced after two botched missions instead of an unkillable ghost that can jump from job to job in an instant???
 

Goose4291

Banned
I'd super like it if stuff changed; it gives a sense of tech advancing. It'd fit with the whole WWII-in-space thing too, where you go from garbo caveman equipment to radios and jets in like six years.

- - - Updated - - -



you want to be a giant logistical bottleneck that has to be replaced after two botched missions instead of an unkillable ghost that can jump from job to job in an instant???

It's because drone pilots aren't portrayed taking part in homoerotic beach volleyball scenes, which is the one true reason we need space legs.

 
So we need it to be believable, huh? Have you ever considered that what the gunner sees is a computer simulation of the ship in the environment it's in? If I can sit at my desk at home today and play a fairly realistic looking simulation of a ship flying around space, why can't this be possible in the future? When I switch to the outside 3rd person view, the player could just be looking at a computer rendering of the ship and its environment... much like you will be doing at your desk at home when 2.3 is released. Of course it is bonkers to think that when we switch to gunner view or selfie cam the viewer is magically given the sight from a magic floating camera outside the ship (it could be a drone, but that's beside the point), but consider that what is seen is actually a computer rendering.

When we use the galaxy map, nobody complains about immersion being broken there, because we simply accept that it is a computer schematic of the universe and not a direct, first-hand view seen through the eyes of God himself. I think people immediately assume that the 3rd person cam in ED is a first-hand view because that is what's accepted from all other games in the past, which may not be the case once the blinkers are taken off.
 
So we need it to be believable, huh? [...]

I'll cross post this from another thread. It's a bit of a long explanation of a likely way you would pull off, in the future, the type of telepresence that Elite has shown off. I think it's a good way to see how nothing about the gunner cam is immersion-breaking when the whole of multicrew is virtual in nature:





I don't know how much you've dealt with VR stuff in our real world, so I'll try to explain as best I can why you don't need a drone camera in any way.

Let's start with something simple: your character is not in your friend's ship, you are there virtually. Now how do you look around your friend's ship as if you were really there? In VR you'll be able to lean all over and walk around the cabin, as you can now in your own ship as well. How would you accomplish this? It's not something as simple as "there is a camera there, you're looking through the camera". There are a few ways this would work in real life and in the game.

One would be if the camera was some kind of drone with stereoscopic cameras that moved exactly 1:1 with your movements with zero latency. This certainly might be possible in the future, but it would be the most inefficient way to go about it. Another way would be some futuristic 360° stereoscopic lightfield camera, so that you could move in physical space and have the image look correct. If you want an idea of what I mean by this, look at a 360° video on YouTube: you can swivel all around and look wherever you want, but you can't lean to the side and see a different angle on the object in front of you in the video. Now, lightfield cameras for VR visualization are something being worked on in real life, and in fact Intel had a recent demo at CES where they had a video of a scene, not a 3D model, that you could physically move around in VR. Really cool stuff. The problem with way #2 is that unless the camera moves, there's no way you could look at stuff behind things blocking the camera, like behind or under your chair, which you can do in Elite in VR.

So really, physical direct-camera approaches are pretty much out as ways this would be done in Elite. The real way you would accomplish this, and another thing being played with in real life for real VR, is to have sensors (cameras, laser scanners, whatever they do in the future) around the area you want to virtually visit, and then do a real-time 3D reconstruction based on all that data, the same way something like the Microsoft HoloLens scans your environment in real time to create a 3D map, or the way photogrammetry is done. If you haven't used VR before, here's an example of someone's 3D reconstruction, created using only cameras and visualized in VR:
[video=youtube;ybdKP45IVDQ]https://www.youtube.com/watch?v=ybdKP45IVDQ[/video]

That is something that we can do today, in 2017. In the next decade, possibly more, we will be able to capture scenes in greater fidelity than that, in no time, possibly even in real time. At the very least it's not hard to imagine that in 1,300 years they can do what we do in 2017 in real time.
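To make the reconstruction idea a bit more concrete, here's a toy Python sketch of the basic fusion step: each sensor reports 3D points in its own local frame plus its known mounting pose, and everything gets transformed into one shared point cloud you can rebuild the scene from. The sensor layout, poses and numbers are all invented purely for illustration, not anything from ED or real photogrammetry pipelines.

[code]
# Toy sketch of the fusion step: each sensor reports 3D points in its own
# local frame plus its known mounting pose, and everything is transformed
# into one shared world-frame point cloud.  All poses/numbers are invented.
import numpy as np

def to_world(points_local, rotation, translation):
    """Transform an (N, 3) array of sensor-frame points into the shared frame."""
    return points_local @ rotation.T + translation

# Two pretend sensors mounted at opposite ends of the cabin, facing each other.
front_pts = np.array([[0.0, 0.0, 2.0], [0.5, 0.0, 2.5]])   # what the front sensor sees
rear_pts = np.array([[0.0, 0.0, 1.5]])                      # what the rear sensor sees

front_pose = (np.eye(3), np.array([0.0, 0.0, -3.0]))                   # looking forward
rear_pose = (np.diag([-1.0, 1.0, -1.0]), np.array([0.0, 0.0, 3.0]))    # looking back

cloud = np.vstack([
    to_world(front_pts, *front_pose),
    to_world(rear_pts, *rear_pose),
])
print(cloud)   # one merged cloud a reconstruction (and any virtual viewpoint) can use
[/code]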



Ok, so that's explaining how you're seeing your friend's cockpit, how you're visiting it and visualizing it. At least, that's the most likely explanation of how it would be done based on our current knowledge. How does this apply to the 3rd person camera? Well, simply put, you have all that sensor data from the ship, which knows exactly where all objects around it are, exactly what position they are in, and the orientation and look of your own ship. It would be fantastically trivial to churn together all that data and create the same kind of 3D reconstruction we can do in 2017, and once you have that 3D reconstruction, you can place the virtual camera that you view the scene from in literally any position, at any angle you want.
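And that last step is where the "no drone needed" bit comes from: the camera is nothing physical, just a view matrix built for whatever position and angle you fancy, applied to the reconstructed points. A rough Python sketch of that, assuming we already have some reconstructed world-space points (the ship_cloud values below are made up) and using a plain pinhole projection rather than anything Frontier actually use:

[code]
# Once you have the reconstructed cloud, the "camera" is just a matrix: pick
# any position and orientation and project the points.  Illustrative only.
import numpy as np

def look_at(eye, target, up=np.array([0.0, 1.0, 0.0])):
    """Build a world-to-camera rotation and translation for a free virtual camera."""
    fwd = target - eye
    fwd = fwd / np.linalg.norm(fwd)
    right = np.cross(fwd, up)
    right = right / np.linalg.norm(right)
    true_up = np.cross(right, fwd)
    rot = np.stack([right, true_up, -fwd])          # rows: camera axes in world space
    return rot, -rot @ eye

def project(points, rot, trans, focal=1.0):
    """Pinhole-project world-space points into normalised image coordinates."""
    cam = points @ rot.T + trans                    # world -> camera frame
    return focal * cam[:, :2] / -cam[:, 2:3]        # divide by depth (camera looks down -z)

ship_cloud = np.array([[0.0, 0.0, 15.0], [1.0, 0.0, 16.0], [0.0, 0.0, -13.0]])
rot, trans = look_at(eye=np.array([30.0, 20.0, 0.0]), target=np.zeros(3))
print(project(ship_cloud, rot, trans))              # the gunner's "drone shot", no drone needed
[/code]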

People aren't blowing smoke when they make the argument we can practically do all of that today, in 2017.

Now, whether you or anyone else likes the gunner cam is a completely different argument, but I hope I've been able to give you an idea of why no drone is needed for the kind of view you get in that gunner cam.
 
I'll cross post this from another thread. It's a bit of a long explanation of a likely way you would pull off, in the future, the type of telepresence that Elite has shown off. [...]

Yeah, I have been playing with every lightfield demo I can with my DK2 headset. They blow you away. When we have the capability to compress the data sufficiently to allow streaming of that tech, movies are going to take on a whole new era!
 
I'll cross post this from another thread. It's a bit of a long explanation of a likely way you would pull off, in the future, the type of telepresence that Elite has shown off. [...]

I don't have a major issue with the gunner role, just thought it could be done in a better way.

Now the problem I have is: why can't there be a 3rd person view to control the ship?

It's all the inconsistencies.

Unlimited-range telepresence opens up a whole can of worms for consistency, unless they are going to fundamentally change the way the whole game works.
 