Release EDDI 3.3 - Bring your cockpit to life

Hi,

Amazing Program! I just have a couple of questions. Sorry if they've already been asked:

1. I assume EDDI knows where to get the info for ED since it is also in AppData regardless of where ED is actually installed. Is there a way to tell EDDI to look in a different location? Like a network share? I'd like to try running EDDI from my Windows tablet beside me. It has better voices, and I think it would be nice to have EDDI responses separate from the audio on my main computer. Sub-question, and more of a curiosity: What file(s) are EDDI monitoring? Does it recognize an event because of a change to the file or is there a new file with the new event?

2. EDDI has some incredibly useful responses and information, but the only way I know how to trigger some of them is with the Test button in the 'Speech Responder' window. An example would be the 'Damage check'. How else are those responses (user-initiated) triggered?

3. Some of the verbose information is overwhelming, but would be nice to see written instead. Is there a way to pipe the output into an overlay on the game? I'd even be happy with a window that displays the speechresponder.out file in realtime. I can open the file, but I don't know how to open it in a way that would show its updates. Is there a Windows utility that will show a file like that?

Thanks!
 

Hello CMDR,
Sorry, I did not understand everything; my English is just bad. But I might have some information for you.
Importantly, read the EDDI page on GitHub. There is already a great deal of information there.
EDDI gets its info from the journal log (and the other files that came later: Outfitting.json, Status.json, Market.json, NavRoute.json,
Cargo.json and ModulesInfo.json) and from the APIs (Frontier, EDSM); see GitHub. Speech output (the responder scripts)
is triggered by ED events, by the API, or by a call from another script.
I think Commanders T'kael, Darkcyde, VerticalBlank and SemlerPDX can give you more specific information on your questions.

Translated with www.DeepL.com/Translator (free version)
 
1) At the moment there is not an officially supported method for doing so, though you might be able to link up file system locations using symbolic links or similar. EDDI monitors the files in [your Saved Games folder]/Frontier Developments/Elite Dangerous/. A new journal file is created for each play session, and EDDI reads from that file and other supporting files in that location as the game writes lines.

2) You can invoke a script from another script using the F() function. You can also invoke a script from VoiceAttack using the speech plugin context. That said, some scripts will be able to provide better information if invoked at an appropriate time. Damage check is best invoked just after a Loadout event or (if connected to the Frontier API) just after an Undocked event (since these are the times when EDDI can obtain module health data).

3) There is no native overlay but the program Notepad++ can be configured in a simple overlay mode for this purpose if you'd like.

o7! :)
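If you want to experiment outside EDDI, finding the active journal in that folder is simple. Here is a minimal Python sketch; the function name and approach are my own illustration, not EDDI's actual code:

```python
import glob
import os

def newest_journal(save_dir):
    """Return the path of the most recently modified Journal*.log
    in the Elite Dangerous save folder, or None if there are none.
    The game starts a fresh journal file for each play session."""
    candidates = glob.glob(os.path.join(save_dir, "Journal*.log"))
    return max(candidates, key=os.path.getmtime) if candidates else None
```

You would pass it your [Saved Games folder]/Frontier Developments/Elite Dangerous/ path and then tail the returned file as the game appends lines to it.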
 
3) There is no native overlay but the program Notepad++ can be configured in a simple overlay mode for this purpose if you'd like.
Could you please expand on this answer? How is it done in Notepad++, and how do you send the speech you don't want to hear to a file? It seems a very interesting option!
 
Could you please expand on this answer? How is it done in Notepad++, and how do you send the speech you don't want to hear to a file? It seems a very interesting option!
If you enable the checkbox in the Speech Responder to write text to speechresponder.out, it will write all speech to that file. Writing selective speech is not currently possible, it's either all or nothing.

Notepad++ allows you to monitor the file, meaning that you can see changes to the file in real time without refreshing the file. It also allows you to enter Post-It mode, which hides everything except the file contents and keeps that window on top of other windows (at least when you are not in full screen mode).
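If you would rather script this than use Notepad++, the same idea (re-read the file from where you left off) fits in a few lines of Python. This is only an illustrative sketch, not part of EDDI, and the function name is made up:

```python
def read_new_lines(path, offset=0):
    """Return (new_lines, new_offset): the lines appended to the file
    since the byte offset returned by the previous call. Poll this in
    a loop (e.g. once a second) to follow speechresponder.out as it
    grows."""
    with open(path, "r", encoding="utf-8") as f:
        f.seek(offset)
        return f.readlines(), f.tell()
```

A small loop calling this with `time.sleep(1)` between polls gives you a live view of the file on any machine that can reach it, including over a network share.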
 
Great tip! (y) :cool:
 
Thank You! That's very helpful information. How would I configure Notepad++ to do that? I am familiar with the program, but I have never done that. An overlay, and a method for viewing the updates to a file as they happen would be very useful for many things.
 
Did you watch the videos linked in my post above? Those show how to do it.
Sorry, somehow the posts after your first response didn't appear when I posted again, but they do now.

You are awesome! Those tips are PHENOMENALLY useful! And with video links. Thank you for taking the time to do that. I am very interested to try the EDDI output as a running log. It would also be interesting to see the latency across a network share, as that log could very easily be displayed on another device. I'll just try it on my other monitor for now. I believe Notepad++ also can perform formatting based on the content of the lines. Maybe different colors for certain types of messages. There may be a lot of possibilities there.

Thank You

o7
 
It would also be interesting to see the latency across a network share, as that log could very easily be displayed on another device.
Please share a little "how to" when you set up your network share. I was thinking about using my old, unused laptop for EDDI and other E:D utilities, and I was wondering how to make it work.
 
I'm wondering about a "multi-output" feature for the scripts, so I'll explain what I have in mind and get some feedback.
I'm asking here before "cluttering" the GitHub site with new feature requests.

Is there a technical reason to have only ONE voice? I could use an additional English voice for "pure English" speech (like the system/local messages, which I can now listen to thanks to the EliteG19s app).
Obviously, this would require the ability to select which voice should say the message generated by each script.
Also, the idea of showing some written text over the game screen is fascinating, and again it would need a way to put some script output in a dedicated log to be shown over the game window.

So, my question is: would it be possible to have additional "channels" in EDDI? Say, one secondary voice and a secondary "log", plus a function that redirects the script result to one of them (something like "{SendOutput(2ndLog)}") at the end of the script?
 
Is there a technical reason to have only ONE voice? I could use an additional English voice for "pure English" speech (like the system/local messages, which I can now listen to thanks to the EliteG19s app).
Obviously, this would require the ability to select which voice should say the message generated by each script.
There is already a way to use different voices in EDDI: the Voice() function. This is from the help section:
Voice()
This function allows you to include a different voice in your script than the one currently selected. This function uses SSML tags.
Voice() takes two mandatory arguments: the text to speak and the voice to speak it (legal values for the voice should match one of the voices listed on EDDI's Text-to-Speech tab). For example:

Code:
{Voice("Now I can speak", "Microsoft Zira Desktop")}
{Voice("And I can listen", "Microsoft David Desktop")}
Although, I'm not sure this is exactly what you are thinking of, is it?
 
There is already a way to use different voices in EDDI: the Voice() function. This is from the help section:
THIS IS WONDERFUL! I keep missing info and making dumb impressions, though :)

Although, I'm not sure this is exactly what you are thinking of, is it?
Well, all the EDDI localized words will still be in the personality language, but I guess that with some work I can make some decent scripts for pure-English spoken messages.
So, on the "different voice" question: it's not exactly that, but pretty close, and I'm feeling happy.

And now, a {Write("what should be written", "OverlayTextFile.log")} function doesn't seem so out of place :)
 
And now, a {Write("what should be written", "OverlayTextFile.log")} function doesn't seem so out of place :)

Well, if you use VoiceAttack, you can output to a file with something like this:
Code:
Set text [missionsdata] to [EDDI state missionsdata]
Write (overwrite), '{TXT:missionsdata}' to file '%AppData%\VoiceAttack\EDDI_Missions_Data.txt'
Then you could use the Notepad++ overlay feature mentioned a few replies above to view it. :)

EDIT: The VA Write command has options to overwrite or append, depending on how you want the file to be updated.
 
Hi,
I was wondering if this brilliant program has an easy-ish way to use the Speech Responder when body scanning in the FSS shows biological or Thargoid POIs.
For example, I'd like to be notified that it found Thargoid POIs:
{ "timestamp":"2021-02-18T09:06:42Z", "event":"SAASignalsFound", "BodyName":"Evangelis A 4", "SystemAddress":1183967810114, "BodyID":11, "Signals":[ { "Type":"$SAA_SignalType_Thargoid;", "Type_Localised":"Thargoid", "Count":14 }, { "Type":"$SAA_SignalType_Biological;", "Type_Localised":"Biological", "Count":5 }, { "Type":"$SAA_SignalType_Human;", "Type_Localised":"Human", "Count":3 } ] }
{ "timestamp":"2021-02-18T09:06:43Z", "event":"Scan", "ScanType":"Detailed", "BodyName":"Evangelis A 4", "BodyID":11, "Parents":[ {"Star":1}, {"Null":0} ], "StarSystem":"Evangelis", "SystemAddress":1183967810114, "DistanceFromArrivalLS":78.400089, "TidalLock":true, "TerraformState":"", "PlanetClass":"High metal content body", "Atmosphere":"", "AtmosphereType":"None", "Volcanism":"", "MassEM":0.002800, "Radius":928006.312500, "SurfaceGravity":1.296070, "SurfaceTemperature":337.029785, "SurfacePressure":0.000000, "Landable":true, "Materials":[ { "Name":"iron", "Percent":21.352970 }, { "Name":"nickel", "Percent":16.150492 }, { "Name":"sulphur", "Percent":14.984314 }, { "Name":"carbon", "Percent":12.600257 }, { "Name":"chromium", "Percent":9.603140 }, { "Name":"manganese", "Percent":8.818559 }, { "Name":"phosphorus", "Percent":8.066899 }, { "Name":"germanium", "Percent":4.423873 }, { "Name":"niobium", "Percent":1.459362 }, { "Name":"molybdenum", "Percent":1.394336 }, { "Name":"tellurium", "Percent":1.145792 } ], "Composition":{ "Ice":0.000000, "Rock":0.666428, "Metal":0.333572 }, "SemiMajorAxis":23524144291.877747, "Eccentricity":0.003200, "OrbitalInclination":0.007422, "Periapsis":149.846352, "OrbitalPeriod":2923078.536987, "RotationPeriod":2923078.654812, "AxialTilt":-0.061101,

I know about the Surface signals detected event, but it only gets triggered when using the DSS.
Thanks.
 
Hello @yianniv ,

When you jump into a system, the game loads the necessary and important information for that system. This information can be found in the journal log and the other files. EDDI uses, among other things, ED events to trigger its own events (e.g. the Speech Responder). There is no EDDI event that reads all data for the system, and that would not make much sense: certain information only becomes accessible once certain actions (here, the object scan) have been performed. This is my view on your question.

Greetings from nepomuk
 
Unfortunately, information in the player journal is not always available at the same time and to the same degree that you see in the UI. The player journal only gets updated with surface signal information after you use the DSS. Until that occurs, we have no knowledge of those surface signals.

Here's a recent example from my logs:
JSON:
{ "timestamp":"2020-10-30T10:37:13Z", "event":"Scan", "ScanType":"AutoScan", "BodyName":"Musca Dark Region PJ-P b6-1 3", "BodyID":13, "Parents":[ {"Star":0} ], "StarSystem":"Musca Dark Region PJ-P b6-1", "SystemAddress":2875614897689, "DistanceFromArrivalLS":32.639658, "TidalLock":true, "TerraformState":"", "PlanetClass":"High metal content body", "Atmosphere":"", "AtmosphereType":"None", "Volcanism":"", "MassEM":0.086840, "Radius":2847893.000000, "SurfaceGravity":4.267596, "SurfaceTemperature":329.075714, "SurfacePressure":58.119381, "Landable":true, "Materials":[ { "Name":"iron", "Percent":22.524445 }, { "Name":"nickel", "Percent":17.036547 }, { "Name":"sulphur", "Percent":16.048769 }, { "Name":"carbon", "Percent":13.495353 }, { "Name":"manganese", "Percent":9.302366 }, { "Name":"phosphorus", "Percent":8.639957 }, { "Name":"zinc", "Percent":6.121301 }, { "Name":"zirconium", "Percent":2.615553 }, { "Name":"cadmium", "Percent":1.749125 }, { "Name":"tungsten", "Percent":1.236818 }, { "Name":"tellurium", "Percent":1.229764 } ], "Composition":{ "Ice":0.000000, "Rock":0.672921, "Metal":0.327079 }, "SemiMajorAxis":9785264730.453491, "Eccentricity":0.000019, "OrbitalInclination":0.000025, "Periapsis":227.365506, "OrbitalPeriod":921412.122250, "RotationPeriod":921548.097180, "AxialTilt":-0.451238, "WasDiscovered":true, "WasMapped":true }

{ "timestamp":"2020-10-30T10:38:57Z", "event":"SAAScanComplete", "BodyName":"Musca Dark Region PJ-P b6-1 3", "SystemAddress":2875614897689, "BodyID":13, "ProbesUsed":4, "EfficiencyTarget":6 }

{ "timestamp":"2020-10-30T10:38:57Z", "event":"SAASignalsFound", "BodyName":"Musca Dark Region PJ-P b6-1 3", "SystemAddress":2875614897689, "BodyID":13, "Signals":[ { "Type":"$SAA_SignalType_Thargoid;", "Type_Localised":"Thargoid", "Count":10 }, { "Type":"$SAA_SignalType_Biological;", "Type_Localised":"Biological", "Count":5 }, { "Type":"$SAA_SignalType_Human;", "Type_Localised":"Human", "Count":16 } ] }

In this example, we see that I scanned the body at "2020-10-30T10:37:13Z", mapped the body at "2020-10-30T10:38:57Z", then received the SAASignalsFound event (which triggers in EDDI as Surface signals detected) immediately after at "2020-10-30T10:38:57Z".

The event isn't being suppressed or slowed down in any way. We act on it as soon as we see it.
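For anyone who wants to act on those journal lines outside EDDI, pulling the per-type counts out of an SAASignalsFound line is straightforward. A minimal Python sketch (my own illustration, not EDDI's code):

```python
import json

def surface_signal_counts(journal_line):
    """Parse one journal line; if it is an SAASignalsFound event,
    return a {localised type: count} dict (e.g. {"Thargoid": 10}),
    otherwise None."""
    event = json.loads(journal_line)
    if event.get("event") != "SAASignalsFound":
        return None
    return {s["Type_Localised"]: s["Count"]
            for s in event.get("Signals", [])}
```

Fed the SAASignalsFound line from the example above, this returns counts for the Thargoid, Biological and Human signal types; for any other event it returns None.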
 
Was there ever an event in EDDI that let you know if the destination for an FSD jump was obscured?

Or am I just going more nuts than I thought?
 
I don't think so, I don't remember ever seeing one.

Maybe it's just my luck, but whenever I want to jump somewhere, the destination is always obscured. I'd just assumed it was the game's way of making you do something on your journey, and of making bots less effective. 🤷‍♂️
 