Proposal: alternative to outright removal of ADS instascan & system map minigame for explorers

IIRC they said this is how it works.

"The system scan now returns an aggregated display of how energetic the electromagnetic emissions are in the system. Signals are sorted on a low to high scale by their apparent energy. For example, emissions from rocky clusters will appear at the lower end of the scale, hot gas giants at the upper range. This information requires some interpretation as signals can overlap."

That's pretty much how the scanner works in the SRV, in that there's a different signal for each type of source. The problem is that this works okay in one dimension (horizontal), but how well is that going to work in three dimensions when signals do, indeed, overlap?

That's easily solved in a buggy. You just move the buggy. Right? Shift a few meters. Resolves the overlap so you can crack on doing the do. So guess what we're all gonna be doing. Head down staring at our phones in the car as it's driving through the countryside because we gotta sort out the signals, bro. Only space is a heap bigger, so a lot more driving will be needed by comparison.

This idea that we'll jump in, stand stock still, look at a bunch of planets in the mini-game, find them all in seconds flat, say "wow" and moments later be off doing the thing, ignores that we will probably spend (some) time just trying to resolve an accurate picture. That could take several minutes or more, including driving about trying to get an idea of where stuff is, glued to the scanner, and all we'll have done is figure out where stuff is.

A few POIs and improvements in what can be detected will make or break that, I reckon, because it could become quite monotonous. If you're gonna make people look at maps all day, it helps to toss a little bit of treasure out there to find, no? The entire point of going out there is what's outside the canopy, not what's inside; at least it is for me. If I just wanna look inside the ship all day, I can do that in the bubble.

I am going to thrash this in beta, though. I want to find out whether this will be the thing that helps me embrace the game again and adds some well-built structure, or whether I'll put it aside for considerably longer than a small break because it's just endless busy-work where I stare at my phone all day and end up forgetting the entire point of exploration in the first place.
 
That's pretty much how the scanner works in the SRV, in that there's a different signal for each type of source. The problem is that this works okay in one dimension (horizontal), but how well is that going to work in three dimensions when signals do, indeed, overlap?

That's easily solved in a buggy. You just move the buggy. Right? Resolves the overlap so you can crack on doing the do. So guess what we're all gonna be doing. Head down staring at our phones in the car as it's driving through the countryside because we gotta sort out the signals, bro. Defo sounds like exploring to me! lol. I hope it's not quite as silly as that.

This idea that we'll jump in, stand stock still, look at a bunch of planets in the mini-game, find them all in seconds flat, say "wow" and moments later be off doing the thing, ignores that we will probably spend time just trying to resolve an accurate picture. That could take several minutes or more, including driving about trying to get an idea of where stuff is, glued to the scanner, and all we'll have done is figure out where stuff is. We haven't even done anything else at that point. Woo. Riveting stuff! ;)

A few POIs and improvements in what can be detected will make or break that, I reckon, to help throw in some flavour. If you're gonna make people look at maps all day, it helps to toss a little bit of treasure out there to find, no? Since the idea that we would actually be interested in what is outside the canopy is mostly off the table now.

I am going to thrash this in beta, though. I want to find out whether this will be the thing that helps me embrace the game again, or whether I'll put it aside for considerably longer than a small break because it's just endless busy-work.

The difference is that you can't change anything regarding the SRV scanner. In the ADS you can change zoom and scan range, tune the scale, detect gravitational disturbances, and you get HUD information. None of this exists for the SRV, and that makes the two impossible to compare.
We'll see how it turns out in beta.
 
The difference is that you can't change anything regarding the SRV scanner. In the ADS you can change zoom and scan range, tune the scale, detect gravitational disturbances, and you get HUD information. None of this exists for the SRV, and that makes the two impossible to compare.

If there is a brick in front of something else, no zoom, scan range or scale will make any difference unless you can move the point of view, in three dimensions, within the HUD, so you can offset that brick from what is behind it. The only other way to shift the point of view is to move the ship.

So I really, really hope Frontier have remembered to ensure we can move our POV in the HUD (much like we can in the galaxy map) and don't in fact require us to do exactly what we have to do in the buggy, which is physically relocate. Because this would not be the first time that something so obvious that they would surely do it turned out to be anything but obvious to the developer, and not done at all.

We'll see how it turns out in beta.

For sure. We don't have a date yet, though, right?
 
If there is a brick in front of something else, no zoom, scan range or scale will make any difference unless you can move the point of view, in three dimensions, within the HUD, so you can offset that brick from what is behind it. The only other way to shift the point of view is to move the ship.

So I really, really hope Frontier have remembered to ensure we can move our POV in the HUD (much like we can in the galaxy map) and don't in fact require us to do exactly what we have to do in the buggy, which is physically relocate. Because this would not be the first time that something so obvious that they would surely do it turned out to be anything but obvious to the developer, and not done at all.



For sure. We don't have a date yet, though, right?

The way I understand it, you don't need to move physically to look behind a brick. You will be able to look past objects by fine-tuning your sensor scale and range. If there are two planets right behind each other, each of them will have their own electromagnetic signature. Initially this signature will be kind of fuzzy until you set your scanner correctly, and this will also be shown on your HUD. That way you can easily look 'behind' objects.
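That "tuning" idea can be sketched as a simple distance window: two contacts on the same line of sight separate once you narrow the range the scanner listens to. The names and distances below are made up for illustration:

```python
# Hypothetical sketch of tuning a distance window to separate two bodies
# that sit on the same line of sight. Names and distances are invented.
contacts = [
    {"name": "planet A", "dist_ls": 520.0},
    {"name": "planet B", "dist_ls": 4100.0},  # directly "behind" planet A
]

def tune(contacts, min_ls, max_ls):
    """Return only contacts whose distance falls inside the tuned window."""
    return [c["name"] for c in contacts if min_ls <= c["dist_ls"] <= max_ls]

print(tune(contacts, 500, 600))    # isolates the near body
print(tune(contacts, 4000, 4200))  # isolates the far body, without moving the ship
```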
 
If there is a brick in front of something else, no zoom, scan range or scale will make any difference unless you can move the point of view, in three dimensions, within the HUD, so you can offset that brick from what is behind it. The only other way to shift the point of view is to move the ship.

We'll have to hope that the new system is smarter than the current system. I don't like that we can currently scan planets THROUGH the obscuring stars/planets either. Somehow the game knows that we can't high wake to a system blocked by a planet/star, but it still lets us scan obscured planets through them...

That being said, if the object isn't completely obscured (like taking a photo through partially closed blinds, or through a ring system), then you can focus past the intervening object and still see the object behind clearly, if more dimly.
 
The way I understand it, you don't need to move physically to look behind a brick. You will be able to look past objects by fine-tuning your sensor scale and range.

Yeah, that's sort of what seems to be the goal. But I have learned that Frontier doesn't instinctively know what we can fairly quickly and intuitively see. Yes, I'd like to think that of course we should be able to shift the POV. I'm not going to assume that, though, because that'd be foolish. ;)

Changing the range of a sensor to 'reveal' a planet behind another is no different to the "you can scan a planet through a star" situation we have now, really. I'd prefer the ability to move the POV (without throwing my ship into 'drive'), as that will be more effective at building a mental picture than frobbing knobs and hoping the wine glass suddenly shatters so you can see what's behind it.

We'll have to hope that the new system is smarter than the current system. I don't like that we can currently scan planets THROUGH the obscuring stars/planets either. Somehow the game knows that we can't high wake to a system blocked by a planet/star, but it still lets us scan obscured planets through them...

When you've played for even just a little while, a lot of things sort of become intuitive, you know? And we see these things and scratch our heads. There's a planet between me and a destination that I apparently can't hyperspace through, but I can scan through it. Yes, this is supposedly by design. Much like the massive neutron jump range ended up being (it started as a mistake, like so many things at this point).

This, really, is why I seem a bit cautious. Because what seems obvious and intuitive to us can be a revelation to them at times. So I hope it's a little like the galaxy map, in that there is some degree of control over filters and you can whip the POV about to get a good mental picture of where things are. Lord, I hope they let us move the POV, though. Friend, it would be a little bit funny if we can't.

I shouldn't giggle. But I am. Because it does occur to me, that they genuinely, might actually not have thought of that.
 
We'll have to hope that the new system is smarter than the current system. I don't like that we can currently scan planets THROUGH the obscuring stars/planets either. Somehow the game knows that we can't high wake to a system blocked by a planet/star, but it still lets us scan obscured planets through them...

That being said, if the object isn't completely obscured (like taking a photo through partially closed blinds, or through a ring system), then you can focus past the intervening object and still see the object behind clearly, if more dimly.
Just because a body is blocking direct line of sight does not mean that a scanner based on gravitational effects can't detect an object that is behind it. It's just a question of tuning the scanner, e.g. to focus on a 500-600 Ls range. Though it will be interesting to see how they handle planets orbiting secondary stars 200 kLs away at a 10 Ls orbit.
And then there is the issue of zooming in on the objects to the point where you get the visual representation of the body (second screenshot of the reveal) - that one can't be explained by a gravitational scan...
 
Just because a body is blocking direct line of sight does not mean that a scanner based on gravitational effects can't detect an object that is behind it.

Sort of. We can detect planetary objects in distant systems based on wobble and light shifts, but if there is a star between you and something it's obscuring, then that massive gravitational mass will cast a shadow on anything behind it. The closer it is to the body, the more of a shadow it casts, until all you can detect is the star's immense mass, and the mathematics suggests more mass is present, ergo there might be one or more bodies. A gas giant and a moon aren't going to be that much different. If a moon is close enough, then you can potentially know there's mass interacting, but how that mass is distributed is an entirely different question.

Or you can just move the POV, spot the extra body, and solve for x instead? Way more intuitive to do that than to potentially "stumble" on the right frequency.

Which, to me, seems more effective than trying to guess what the planet is that you cannot see or identify, in order to tune to its signal (if you have to zoom all the way in, then go "oh, it's an ice planet" and only then can set the scanner for that, then maybe there's some redundancy at play). Thus, if we can zoom, adding pan or tilt to the point of view (again, much like the galaxy map) strikes me as a no-brainer, really.

Thing is, is that a no-brainer to the developer? Mmm. We get to find out. Soon™.
 
Just because a body is blocking direct line of sight does not mean that a scanner based on gravitational effects can't detect an object that is behind it. It's just a question of tuning the scanner, e.g. to focus on a 500-600 Ls range. Though it will be interesting to see how they handle planets orbiting secondary stars 200 kLs away at a 10 Ls orbit.
And then there is the issue of zooming in on the objects to the point where you get the visual representation of the body (second screenshot of the reveal) - that one can't be explained by a gravitational scan...

Yes, sort of. For clarity, it should be noted that the current scanners we have now are space magic that can see through solid objects. The scanners they are proposing seem to be based more on the actual science of energy distributions. In astrophysics, these are described by the Planck (black-body) distribution, and the "energy" of a planet is measured by its peak or total energy output. This energy output is carried in photons of light: the higher the energy, the higher the frequency of the photons. This is the same way that the colours of stars are determined, by their peak energy (photon) output in the visible spectrum. Red is lower energy. Blue is higher energy. Etc.

I am writing the above for other people reading the thread, as I am sure you understand it already. As you pointed out, even if we have futuristic graviton detectors on the ship (which it seems we do!), that wouldn't allow us to focus photons through a solid object into a zoomable picture.
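The colour point can be made concrete with Wien's displacement law. The constant and temperatures below are standard textbook values added here purely as an illustration, not anything from the game:

```python
# Wien's displacement law: the peak emission wavelength of a black body
# shortens as temperature rises, which is why hotter stars look bluer.
WIEN_B = 2.8978e-3  # m*K, Wien's displacement constant

def peak_wavelength_nm(temp_k):
    """Peak black-body emission wavelength in nanometres at temp_k kelvin."""
    return WIEN_B / temp_k * 1e9

# Temperatures are typical textbook figures, not game data.
for name, t in [("red dwarf", 3200), ("Sun-like star", 5778), ("blue star", 15000)]:
    print(f"{name:13s} {t:5d} K -> peak ~{peak_wavelength_nm(t):5.0f} nm")
```

A Sun-like star peaks around 500 nm (green-blue), while a cool red dwarf peaks well into the near infrared, which is exactly the red-is-lower, blue-is-higher ordering described above.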
 
Or you can just move the POV. Which, to me, seems more effective than trying to guess what the planet is that you cannot see or identify, in order to tune to its signal. If we can zoom in, then okay, we know it's a dirt ball, or an ice planet, or something; so again, since we can zoom, adding pan or tilt to the point of view (again, much like the galaxy map) strikes me as a no-brainer, really.

The day they proposed the new mechanic it seemed pretty clear that the best spot to be in a system is sitting about 50 Ls above or below a star in relation to the ecliptic. This way you have enough parallax from the ecliptic plane to get a clear view of most objects without occultations or having to move.
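The parallax gain from sitting above the plane is easy to put numbers on. The distances below are invented for illustration; the geometry is just trigonometry:

```python
import math

# Back-of-envelope check of the "50 Ls above the plane" idea: two bodies that
# line up perfectly when viewed from within the ecliptic separate by a small
# angle when viewed from above it. All distances here are invented.
h_ls = 50.0                      # height above the ecliptic, in light-seconds
d_near, d_far = 500.0, 4000.0    # in-plane distances along the same direction

def depression_deg(d, h):
    """Angle below the horizontal at which a body at in-plane distance d appears."""
    return math.degrees(math.atan2(h, d))

separation = depression_deg(d_near, h_ls) - depression_deg(d_far, h_ls)
print(f"apparent separation: ~{separation:.1f} degrees")
```

With these (made-up) numbers the two bodies split by about five degrees, so a modest vertical offset does buy real separation for reasonably close bodies; the catch raised later in the thread is that very distant or tightly orbiting bodies gain far less.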
 
Yes, sort of. For clarity, it should be noted that the current scanners we have now are space magic that can see through solid objects. The scanners they are proposing seem to be based more on the actual science of energy distributions. In astrophysics, these are described by the Planck (black-body) distribution, and the "energy" of a planet is measured by its peak or total energy output. This energy output is carried in photons of light: the higher the energy, the higher the frequency of the photons. This is the same way that the colours of stars are determined, by their peak energy (photon) output in the visible spectrum. Red is lower energy. Blue is higher energy. Etc.

I am writing the above for other people reading the thread, as I am sure you understand it already. As you pointed out, even if we have futuristic graviton detectors on the ship (which it seems we do!), that wouldn't allow us to focus photons through a solid object into a zoomable picture.

You can explain all that away by saying the current ADS uses the exact same science of black-body energy distributions and peak or total energy output, AND is clever enough to distinguish planetary types for you, like any good and sufficiently advanced technology should be able to do.

You're just swapping one excuse for another, and replacing something automatic by downgrading it to a manual process!

Mein Gott! I just simply do not understand this utter insanity for calling for a downgrade!
 
The day they proposed the new mechanic it seemed pretty clear that the best spot to be in a system is sitting about 50 Ls above or below a star in relation to the ecliptic. This way you have enough parallax from the ecliptic plane to get a clear view of most objects without occultations or having to move.

Sure. Only not all systems have a single equator. Oh, if only it were that simple. And you're still gonna find bodies so close to their parents that being a few Ls above or below the plane won't make an appreciable difference. And, like I said, if we have to start driving around to rely on parallax again just to make a new mini-game function, then just issue everyone a BDS with a telescopic lens and move on.

Any suggestion that this is going to be more intuitive and faster than a honk and driving out to planets goes out the window as soon as commanders are driving in random directions to get the right 'line of sight'. For gravitational sensors, no less. Rather than just having control (in three dimensions) of the sensor POV, which would be intuitive and actually make the thing workable.

I'll wait for Beta to see. I don't even want to contemplate the hurdles that will exist until the developer has them ready for us to enjoy at this point.
 
+1 to OP. I'm with you 100%
-I agree that removing the system map reveal on the initial honk is harsh, and I'm not just thinking about explorers here: I'm talking about folks staying in the bubble that just want to map out where they're going, find stations without much hassle, and have minimal interest in exploration. Say, a trader doing missions. (And forcing those folks to go hit the nav beacon rather than do a quick honk? No, that's harsh...)
-I also realize that keeping the 'full color reveal' we have now is somewhat incompatible with the sciencey minigame the devs have in mind.

Your compromise is sound; the stations could appear with this quick honk and help bubble flyers, help explorers-at-a-glance, and such. It even gives the devs an out if they want to reveal asteroid belts and not fuss over putting them in the Discovery Scan... I am tempted to suggest adding a tiny bit of data to those black circles, something like an albedo value or energy level, as an added hint that something may be of interest while still hiding the facts behind some science.

Color me quirky, I would actually enjoy trying to tell apart ice worlds from strong-cloud-cover worlds by peering at masses and albedo or the like, fun sciencey puzzle. :)
 
Not sure if anyone has suggested this yet.
Why not implement the current system and the proposed system at the same time?
The current system will do what it always does: find planets and give us some info
The proposed system will provide information about signals and anomalous activity, i.e. USS/static POIs.

Commanders who just want to get a quick overview to see if a system is worth their time can honk and be on their way, while commanders who think they see something interesting can then engage in the new gameplay.

Seems like the best of both worlds without removing an already functioning layer.

I feel like, to keep continuity, it would make sense for the scanners to receive a "software update" adding the functionality discussed above, rather than abruptly changing their functionality entirely with no real explanation as to why they suddenly stop doing what they previously could.
 
I don't care if you consider it 'gameplay' or not.

I enjoy the existing process of using the ADS and DSS to uncover system information. I enjoy flying around in a spaceship looking at stuff. The new mechanics massively reduce the amount of flying I'll be doing, reducing the process to what amounts to a point-and-click adventure game. If I'd wanted to play Monkey Island I'd have bought Monkey Island.

You can play Monkey Island in the background while you wait in SC. Think about that. A game designed to have "dead time" to the point you have to find something else to do. Not good design.
 
There are over 550,000 objects in the solar system. Most of them were discovered by gravitational anomalies that perturbed orbits of other known objects, which is very similar to how the new mechanic points to unresolved objects.

This is a great point, and it would go a long way toward making the new mechanics make more sense if this sort of additional content were actually in the game. For just having a look around at some stars, worlds, and large moons, though, I'm not convinced.

Though it isn't exactly easy (it might take longer than a few seconds), you can spot the four large moons of Jupiter with a good pair of binoculars or a cheap starter telescope. Maybe that should be an equipment option when choosing an exploration loadout, since our ships don't seem all that capable of it with this update.
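For scale, a quick back-of-envelope check of that binoculars claim, using approximate real-world figures (nothing from the game):

```python
import math

# Rough check: angular separation of Ganymede from Jupiter as seen from
# Earth at opposition. Both values are approximate textbook figures.
ganymede_orbit_km = 1.07e6   # Ganymede's orbital radius
earth_jupiter_km = 6.3e8     # Earth-Jupiter distance at opposition (~4.2 AU)

sep_arcmin = math.degrees(ganymede_orbit_km / earth_jupiter_km) * 60
print(f"max separation: ~{sep_arcmin:.1f} arcminutes")
```

That works out to roughly six arcminutes at maximum elongation, several times wider than the ~1 arcminute the naked eye resolves, so even modest optics separate the moons comfortably; the moons are just too dim to pick out without them.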
 
This is a great point, and it would go a long way toward making the new mechanics make more sense if this sort of additional content were actually in the game. For just having a look around at some stars, worlds, and large moons, though, I'm not convinced.

One of the gas giants and several other large objects were found by their gravity. It's entirely realistic. I agree, though, that if you can see an object it should be easily resolved without having to hunt too much for it; but that would require using some sort of parallax to distinguish foreground planets from background stars (which is how the ancients found the "wandering" planets in the first place). I don't think most people want to use Stone Age tech to find stuff as a primary mechanic... that's been well established.
 