In-development Virtual Desktop program with an embedded physics engine at the press of a button. Coming in 2020.

Oculus Rift CV1 + Oculus Touch

When I buy one of the newer headsets like the Oculus Quest or the HTC Vive, and have time to write the extra lines of code needed to make this sccsVD4ED mod compatible with other headsets, I'll make it happen if I can. I don't play Elite Dangerous without VR, so for now this upcoming mod is only for VR players who own the Oculus Rift CV1 + Oculus Touch.


Mod description:

1. A Virtual Desktop for Elite Dangerous. The whole project architecture is based on what I mainly learned by reading the code of GitHub user Dan6040, a C# translation of the original C++ Rastertek tutorials. Dan6040's repository is here
In the SCCoreSystems version, I started using abstract classes, which I had never bothered with before because I had never found a use for them.
2. Recordable in-game sounds, though currently only the microphone capture works. Each recording produces a WAV file with an associated XML file. The XML file contains only minimal data about the WAV file it belongs to.
3. Speech Recognition
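The "WAV plus minimal XML sidecar" idea from point 2 can be sketched roughly like this. This is not the mod's actual recorder code; the method name, the XML element names, and the silent test signal are all illustrative:

```csharp
using System;
using System.IO;
using System.Xml.Linq;

// Sketch: write a 16-bit mono PCM WAV by hand, then an XML sidecar file
// holding only basic metadata about the WAV it belongs to.
class WavSidecarDemo
{
    static void WriteWavWithSidecar(string path, short[] samples, int sampleRate)
    {
        using (var bw = new BinaryWriter(File.Create(path)))
        {
            int dataBytes = samples.Length * 2;
            bw.Write(new[] { 'R', 'I', 'F', 'F' });
            bw.Write(36 + dataBytes);              // RIFF chunk size
            bw.Write(new[] { 'W', 'A', 'V', 'E' });
            bw.Write(new[] { 'f', 'm', 't', ' ' });
            bw.Write(16);                          // fmt chunk size
            bw.Write((short)1);                    // audio format: PCM
            bw.Write((short)1);                    // channels: mono
            bw.Write(sampleRate);
            bw.Write(sampleRate * 2);              // byte rate
            bw.Write((short)2);                    // block align
            bw.Write((short)16);                   // bits per sample
            bw.Write(new[] { 'd', 'a', 't', 'a' });
            bw.Write(dataBytes);
            foreach (short s in samples) bw.Write(s);
        }

        // Sidecar XML with minimal data about the WAV it is associated with.
        new XElement("wavInfo",
            new XElement("file", Path.GetFileName(path)),
            new XElement("sampleRate", sampleRate),
            new XElement("channels", 1),
            new XElement("seconds", (double)samples.Length / sampleRate))
            .Save(Path.ChangeExtension(path, ".xml"));
    }

    static void Main()
    {
        var samples = new short[44100];            // 1 second of silence
        WriteWavWithSidecar("capture.wav", samples, 44100);
        Console.WriteLine(File.Exists("capture.wav") && File.Exists("capture.xml"));
    }
}
```

Running it produces `capture.wav` and `capture.xml` side by side, which is the pairing each recording creates.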

This sccsVD4ED mod is not available yet, although it was temporarily. I thought the "Ab3D.DXEngine 60-day evaluation period" was going to pop up automatically for anyone who downloaded my program. I had assumed it would automatically associate the IP addresses of everyone who downloaded the program and activate the evaluation period on its own.

It turns out this should have been set up prior to release, and my program was also missing the DLLs' license files. I will share news regarding my mod as soon as I have some.

Please refer to the following thread for released, working alternative Virtual Desktop solutions:

Where you can find me:
Last edited:
Well, it mixes a Virtual Desktop with speech recognition, so that, very soon, once properly set up, you will be able to say things like "record sound" and it starts recording Thargoid sounds, or "play sound" and it plays back the Thargoid sounds you have just recorded. It currently incorporates a very basic sound visualiser, and later I will implement a speech-activated screen-recording option.
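Once a recognizer (e.g. System.Speech.Recognition on Windows) hands back a phrase, dispatching it to an action is a simple table lookup. A minimal sketch of that dispatch, with the recognizer itself stubbed out; only the two command phrases come from the post, everything else is illustrative:

```csharp
using System;
using System.Collections.Generic;

// Sketch: map recognized voice phrases to actions. In the real mod the
// phrase would come from a speech-recognition engine's "recognized" event;
// here we just call the handler directly.
class VoiceCommandDemo
{
    static readonly Dictionary<string, Action> Commands =
        new Dictionary<string, Action>(StringComparer.OrdinalIgnoreCase)
    {
        ["record sound"] = () => Console.WriteLine("recording started"),
        ["play sound"]   = () => Console.WriteLine("playback started"),
    };

    static void OnPhraseRecognized(string phrase)
    {
        if (Commands.TryGetValue(phrase, out var action)) action();
        else Console.WriteLine("unknown command: " + phrase);
    }

    static void Main()
    {
        OnPhraseRecognized("record sound");
        OnPhraseRecognized("play sound");
        OnPhraseRecognized("open hangar");   // not in the table
    }
}
```

New commands (like the later screen-recording option) would just be new entries in the table.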

I've found these Virtual Desktop apps on the Oculus store:

Virtual Desktop
CEEK Virtual Reality
Virtual Space
Bigscreen Beta
Weelco VR

There must be more out there, but the ever-growing list is still very small, and I wanted my spot. Now I'm not so sure anymore that I will release on the Oculus store. I'll think about it at a much later development stage.
The visual spectrum has been increased to 88200 cubes/faces instead of 44100, and the line bumps you see in the grid-like visual spectrum are me counting from zero to nine. I've still got some issues to fix before I can release SCCoreSystems-v1.0.
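The grid-like spectrum boils down to mapping audio samples to cube heights, one bin of samples per cube. A rough sketch of that mapping, using the 44100-samples-per-second figure from the post but an invented bin count and scale (the actual mod's binning may differ):

```csharp
using System;
using System.Linq;

// Sketch: group a second of 16-bit samples into bins and turn each bin's
// peak amplitude into a normalized 0..1 height for one cube in the grid.
class SpectrumDemo
{
    static float[] CubeHeights(short[] samples, int cubeCount)
    {
        int binSize = samples.Length / cubeCount;
        var heights = new float[cubeCount];
        for (int i = 0; i < cubeCount; i++)
        {
            int peak = 0;
            for (int j = i * binSize; j < (i + 1) * binSize; j++)
                peak = Math.Max(peak, Math.Abs((int)samples[j]));
            heights[i] = peak / 32768f;   // normalize to 0..1 cube scale
        }
        return heights;
    }

    static void Main()
    {
        var samples = new short[44100];   // 1 second of audio
        samples[10] = short.MaxValue;     // one loud click at the start
        var h = CubeHeights(samples, 100);
        // Only the first cube should rise; the rest stay flat.
        Console.WriteLine(h[0] > 0.99f && h.Skip(1).All(x => x == 0f));
    }
}
```

Each frame, those heights would then be written into the per-instance scale of the spectrum cubes.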


And here is a screenshot of the output file created when you record a sound of that length. It includes only the basic data of the wave itself.

Temporary halt to the advertisement of my program. I pulled back my now-unreleased sccsVD4ED, and I have to check what's going on with GitHub removing my repository.

Please refer to the following thread for alternative Virtual Desktop solutions:

thank you
News, 2020 June 11th: my Git repository is back. Phew, that scared the hell out of me. GitHub was kind enough to fix my bad tree hierarchy, and they also set it to private:



News, 2020 June 13th:
sccsv1.0

sccsv1.0 summary: This video is a showoff of the performance of multiple instances of Virtual Desktops, voxel planets (the voxel planets are based on Craig Perko's C# Minecraft tutorials found here Source: <== it's literally a copy paste), cubes, my IK voxel human rig (my IK is based on this), and my fake visual spectrum, all for my mod sccsVD4ED for Elite Dangerous. All of this was achieved in C# by me, Steve Chassé aka Ninekorn, by learning from the scripts of GitHub user Dan6040's C# translation of the original Rastertek tutorials written in C++. My sccsv1.0 solution was discarded a while ago, but it was the beginning of my sccsVD4ED Elite Dangerous mod. This is just a video I made today, the 13th of June 2020, to show the Elite Dangerous community how far sccsv1.0 can be pushed; it's still very rough, and without a working Virtual Desktop keyboard it's not very useful. sccsv1.0 was an incomplete version missing many things, but I go back to it from time to time. There are no network implementations in any of my solutions yet, as I don't know how to code that. My solutions sccsv1.0 and sccsv1.1 are practically the same, but I will review them later to see what class-hierarchy changes I made between the two versions, if any. An sccsv1.1 showoff is coming soon...

Number of objects spawned:
cube instances = 125000 - static
desktop virtual screen (cube rectangular form instances with the live Virtual Desktop texture) = 40000 - static
voxel cube instances = 1000
human rig instances = 100
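Counts like these are only feasible because a single mesh is drawn many times with per-instance transforms. A sketch of the CPU side of that, using System.Numerics as a stand-in for the SharpDX math types; the GPU buffer upload and the instanced draw call are omitted:

```csharp
using System;
using System.Numerics;

// Sketch: instead of one draw call per cube, an instancing renderer keeps
// one cube mesh and an array of per-instance world matrices, uploaded to
// the GPU in a single buffer and drawn in one instanced call.
class InstancingDemo
{
    static Matrix4x4[] GridInstances(int side, float spacing)
    {
        var transforms = new Matrix4x4[side * side * side];
        int i = 0;
        for (int x = 0; x < side; x++)
            for (int y = 0; y < side; y++)
                for (int z = 0; z < side; z++)
                    transforms[i++] =
                        Matrix4x4.CreateTranslation(x * spacing, y * spacing, z * spacing);
        return transforms;
    }

    static void Main()
    {
        // 50^3 = 125000 instances, the static cube count listed above.
        var cubes = GridInstances(50, 2f);
        Console.WriteLine(cubes.Length);
    }
}
```

The expensive part is building and uploading the array once; drawing 125000 copies afterwards is a single call.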

Here were the issues with sccsv1.0:
1. The Virtual Desktop keyboard is not working.
2. The Virtual Desktop Oculus Rift Pointer is not working.
3. There are no fake ED cockpits.
4. For the reasons above and more, I switched to sccsv1.1, which is almost a copy-paste of sccsv1.0, but I will have to check whether I made class-hierarchy changes anywhere. sccsv1.1 is just as boring.
5. You can't move your visual spectrum around.
6. I wanted to add a Virtual Reality Virtual Desktop smartwatch, where the watch becomes a virtual desktop screen when you click on it or something.

Rastertek sources:

I look at your description and the images and still have no idea what the purpose of this is.
Why am I building the mod sccsVD4ED, and what is it for?
1. If all Elite Dangerous players, at least those in Virtual Reality, could instantly share with each other what they have just "sound analyzed" or "sound recorded" in Elite Dangerous, at the press of a button, while standing in the same place in VR inside my sccsVD4ED, I thought the rate at which ED players share what we discover in-game could be increased/boosted. What it might mean is faster sharing of new "undiscovered content", like the probe signal so well explained by ObsidianAnt here
Source: ... But are there other kinds of "undiscovered content", like sound content or image content hidden inside sounds in Elite Dangerous :unsure:? That remains to be determined, I guess.
2. It would be much less cumbersome for me, since I always have to take my VR headset off or peel it back on my scalp to see my desktop screen. It's very annoying to set your VR headset down just to change a song on YouTube or to search for stations, etc.
3. Originally, even before I wanted to build the sccsVD4ED mod for Elite Dangerous, my whole idea was to build the perfect physics-engine tester program, to stress-test physics engines. After that, I wanted to build a cool Minecraft alternative, strictly in Virtual Reality. But when I started replaying Elite Dangerous, I fell in love with it again, and the idea of making a Virtual Desktop for Elite Dangerous flourished in my tiny brain. It was around the 17th of April 2020 (or whatever the date of the first post in this thread was) that my main goal completely shifted towards building my own Virtual Desktop for Elite Dangerous, instead of a Minecraft-alternative game incorporating a Virtual Desktop. It's all because of ObsidianAnt and his damn good presentations of Elite Dangerous. I got hooked instantly on wanting to see and hear more, solo or not, in Elite Dangerous itself.
4. My solutions don't have fake ED ship cockpits, and I wanted to be in one right after recording Thargoid sounds in Elite Dangerous. The goal was to minimize the time I spent outside my VR headset while recording in-game Elite Dangerous sounds, inside a totally familiar fake ED cockpit. I haven't even had the chance to record in-game Elite Dangerous sounds yet :cry: ... To understand why I wanted to do this in the first place, please watch this video here Source: and the link for the Spectrum Analyzer that ObsidianAnt shared on his YouTube page is , if you haven't checked those out already.

Please refer to the following thread for alternative Virtual Desktop solutions:

thank you for reading me


  • Attachment: githubnewreposetup.JPG (265.1 KB)

Inverse kinematics for the upper/lower legs and feet are not set up yet.
My Virtual Desktop screen is static in this image, because I haven't yet properly set it to lock in place at the press of a button (I removed that temporarily because it wasn't good enough).
All of my solutions inside sccsVD4ED still look like garbage to my eyes. There is no working Virtual Desktop keyboard in any of my solutions (although osk.exe was working temporarily before), so I'm going to keep working on it.

Edit 2020-July-02: I renamed my GitHub repo folders. The reason is that the virtual desktop was, in the first place, for the mods I am developing for the game Void Expanse, and for playing Void Expanse in Virtual Reality. But after leaving the AtomicTorch Void Expanse community behind for 7+ months just to develop the virtual desktop, and never playing or modding Void Expanse itself, I completely lost track of the fact that the virtual desktop was originally for Void Expanse. I forgot about Void Expanse's Virtual Desktop while I was building my new program architecture a couple of months back, and since I had restarted playing Elite Dangerous, my whole Virtual Desktop architecture had suddenly shifted to being entirely for Elite Dangerous, which wasn't supposed to happen. I will now use my new program template/architecture for Virtual Desktops for both ED and VE, and I will decide later whether to ship two different launchers of the same Virtual Desktop for those two games, or to just remove everything from the scene while keeping the physics engine live/loaded and only change the "atmosphere/loadout" of in-game objects.

My project is much bigger than I first imagined, although it is only for the Oculus Rift CV1 + Oculus Touch, and I am not leaving my Virtual Reality Desktop Screen project behind for Void Expanse. I renamed my repository to SCCoreSystems instead of sccsVD4ED, and a full copy of the progress I have made on the Virtual Desktop for Elite Dangerous will be used for Void Expanse too. The character I am using is ugly, and I will only use it as a reference for inverse kinematics and for "rectangle objects or voxel objects for the limbs". I simply miscalculated this whole virtual desktop project: my goal was never to stop working on the virtual desktop for playing Void Expanse, but since I had replayed Elite Dangerous, I had totally forgotten my first forum promise at the AtomicTorch community forums about building one to play Void Expanse. I wanted to bring out both the Void Expanse mods and the Virtual Desktops for Elite Dangerous and Void Expanse at the same time, but I never imagined for a second that both virtual desktops could be part of the same Visual Studio solution... It's a very annoying noObie mistake to make, and lame to realize at this point, but since I am much further ahead in development on the Elite Dangerous virtual desktop, in terms of a complete program architecture and a partly incorporated Jitter physics engine, I made that decision this week.

It's lame to have to apologize everywhere for my forum mistakes and to have to put together another proper announcement for this whole program/project to be released at some point. Give me a break, anyone who is ticked off about this, whoever you are: I realized I had built a really nice program architecture for the Elite Dangerous virtual desktop, much better than my first implementation for Void Expanse, so why in the heck wouldn't I make changes now, since this whole project is mine? There's nobody behind me telling me where to go and what to post, so I make mistakes and correct myself afterwards, but my goal of making a proper Virtual Desktop for Elite Dangerous hasn't changed.

There is still a lot of work left on my Void Expanse mods, so I am going to try to continue development there now that I am much further ahead on the virtual desktop part. Meanwhile, I am working on screen divisions, so that the screen Texture2D is fractured into (for the moment) 100 sub-screens, and on sending that Texture2D into an array of ShaderResourceView for proper display on the faces of the Virtual Desktop. To make this happen, I have to use, for the first time in this project, a jagged array for the instancing of the virtual desktop screen itself. By doing that, I will also be able to use a second, identically divided object to create the virtual desktop keyboards themselves, and to assign SharpDX.DirectInput keyboard strokes to each of those fractured screens (without the screen Texture2D), so that key presses happen when the virtual-reality "hand" (fingers, hopefully, at some point) presses the fractured screens. I will use bounding-box intersection and the Jitter physics raycast to detect the hand pressing on those fractured screens and trigger the keystrokes. I wasn't sure how to approach this, and that's also a reason why I was in a stupid coding lethargy for the past 1-3 weeks.
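The fractured-screen keyboard idea reduces to a grid lookup: once the raycast or bounding-box test yields where on the screen plane the hand pressed, that 0..1 position maps to one of the 100 sub-screens. A minimal sketch, where the 10x10 split matches the 100 screens mentioned above but everything else (names, the hard-coded inputs) is illustrative:

```csharp
using System;

// Sketch: the desktop texture is split into a 10x10 grid of sub-screens.
// A press position in the screen's 0..1 UV space is mapped to the index of
// the sub-screen that was touched. In the mod this position would come from
// a bounding-box intersection or a Jitter physics raycast.
class FracturedScreenDemo
{
    const int Cols = 10, Rows = 10;

    // u, v in [0,1]: which of the Cols*Rows sub-screens was pressed?
    static int SubScreenIndex(float u, float v)
    {
        int col = Math.Min((int)(u * Cols), Cols - 1);   // clamp u == 1.0
        int row = Math.Min((int)(v * Rows), Rows - 1);
        return row * Cols + col;                          // 0 .. 99
    }

    static void Main()
    {
        Console.WriteLine(SubScreenIndex(0f, 0f));        // top-left cell
        Console.WriteLine(SubScreenIndex(0.95f, 0.95f));  // bottom-right cell
        Console.WriteLine(SubScreenIndex(0.55f, 0.25f));  // somewhere inside
    }
}
```

For the keyboard variant, each index would then map to a DirectInput scan code instead of a texture tile.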

Please refer to the following thread for alternative Virtual Desktop solutions:

thank you for reading me
I realized I never updated this thread to say that I released my drafts of all of this on GitHub a while ago. About 1 to 1.5 months ago, my Oculus Rift CV1 headset broke. I tried to repair it, but it broke even more. I purchased electronics parts to build myself a VR headset... but they're not compatible straight out of the box, and since electronics is another pile of headaches, I put all VR development of sccsVD4ED on hold/hiatus. But I will order some Oculus Rift CV1 replacement parts very soon.

Released a while ago, as an open-source project. Not a single line of code was stolen from anyone: I am a free-code/free-license/MIT-scripts hunter, and whenever I learn from someone or use parts of others' tutorials, I reference the link where I got the free code or where I learned it. But without my Oculus Rift headset, I don't even want to take 15-30 minutes to make all the programs in this repo a tiny bit less laggy. This repo is for noObs, hence the easy-to-read variable names for the most part. No classes for instantiation, no comments for readability and easy comprehension... well, get it into your heads that it was a draft, and before I even clean up my programs, I want to make sure they work first. So these drafts are the working versions (the lag is fixable by changing barely a few lines of code) of massive use of instancing, which is easy once you know how to do it, even for anyone who has never coded before, and my program has an almost instant loadout. Even a programming noob like me knows you can copy-paste the object creation into its own classes, so the community shouldn't criticize me for not doing it; it's quite obvious I was preparing parts of my drafts to be moved into their own classes, especially where I cut my scripts into titled, lightly commented sections. I have made my programs easy for anyone to read: uncommented as they are, just reading the variable names lets you skim the code like an easy book. Imagine when all of it is cleaned, prepped, and commented. I will update the community as soon as I have a better working version of all of this, and I will hopefully finally get SteamVR working in sync with my program. But without an essential VR keyboard and a heightmap virtual desktop, it's not going to be cool enough.

Also, I have been without a job for over a year now. I don't think I could make a living teaching programming without a degree, so raising money on Patreon as a teacher is something I would like to do one day, provided I get a diploma first. I have just restarted studying at Team Treehouse, and I will continue. If one day I could make a living as a programmer, making mods or software for gamers, in drafts or demos or tutorials, cleaned and commented or not, I would start to feel useful for having learned as a hobbyist programmer who gives everything away free and open source. I don't know if I have the right to put my donation link here. But I can't ask people CAD$20 for a month's membership on Patreon for me to teach them my unorganized script hierarchies and classes, even though it would be cool, since my stuff works most of the time. Teaching how the virtual desktop works is something I would gladly do, instead of having to take a minimum-wage manual job somewhere close to where I live, which is all I can get without a university or high-school diploma.

It would allow me to provide the community with more free, open-source software, more often, at a faster pace. Thank you.

Any type of support would be greatly appreciated. It would let me continue doing what I love, programming, with no strings attached.
Ok, I'm going to be honest here. I'm all for programming for fun, to see what sorta things are possible. Can definitely imagine programming for VR is exciting.

That being said, I don't really see what would make me want to use this application. As mentioned, the description is very unclear, and the scope seems very unfocused. Why would I want a physics engine inside my virtual desktop environment? If I wanted to record game sounds, why would I want a virtual desktop application to do it? Why is it designed for Elite Dangerous? What makes it particularly suited to do so?

As you're obviously aware, a number of virtual desktop applications exist, so what do you aim to do differently with yours? I'm not even sure what I'm supposed to do with the virtual desktop that this app provides.
"Just for fun"? A virtual desktop with a physics engine, sound recording to XML, voice recognition, voxels, and a VR human IK rig that can interact with objects, all working in one scene without lag, to be used in Elite Dangerous as an overlay, with much better looks once I'm done. Just for fun? In the beginning, I wanted to be a software provider and sell my stuff on the Oculus store, not in its unfinished state of course. But then, never having the solid network security of the big gaming companies, and being a freelance hobbyist programmer, I thought: to hell with this. I've taken too many chances being connected to the web all the time, so someone will probably release something like this before me, and that will irritate me even more. So I released a couple of my programs, because there was no way my network setup, now or ever, was strong enough for me to release this program in its finished state and make a buck out of it anyway. So I really thought, to hell with this: I'm providing this as a draft/template of how I would try to code it, and let's see how it goes, even unfinished. So yes, it currently looks like a weekend fun project, and any scripter who reads code like a book would see how easy it is to get started with the Rastertek C# tutorials: one starts by learning instancing at tutorial 37 here, then learns how to do this here with xoofx's SharpDX libraries (the only ones I have tried), and then applies the Texture2D to the instanced objects. The VR part is where I had tons of difficulties years ago, but I finally found the Oculus Rift CV1 C# wrapper that suited me, in Ab3d.OculusWrap / Ab3d.DXEngine.OculusWrap.

After so many years working on so many different iterations of this program, I became quite used to doing the same thing slightly better every time, although I always aim for performance first and shiny stuff later when I script; that's why my stuff doesn't necessarily look the most attractive right now. And currently, although my draft template could be beaten by anyone, now or even before if they had something similar, I don't see why no one on YouTube has this type of multiple-VR-desktop-screens setup in their cockpits for giving verbal commands to their teammates in fights. That's why I forced myself to make my program as light as possible. Personally, I'm all for a gaming computer plus a home computer, and if I were a Carrier ship owner, I would prefer to see what my crew is doing, whether it's a small pirate fight or a big PvP fight, so that even I could learn to be better at PvP. Nowadays, with 1000 Mbps fibre-optic internet providers and today's CPU, RAM, and video-card power, having multiple desktop screens of your teammates inside a VR cockpit should be feasible. So if my program doesn't lag, I think it could be a very good virtual desktop alternative once it's finished, but I know nothing about networking, and I don't know how much it costs in network traffic to see each other's screens the way OBS, Discord, and YouTube do for streamers. I'm not sure whether some players already use this type of software, but I wanted to try to code it. The particle system is something I've never coded yet, so I first need to try it out with the C# Rastertek tutorials and then test whether I can instance (multiply) the particle system at a low performance cost. Right now I have a 5 Mbps download connection, and I can stream on Twitch with OBS without any sort of streaming problem, as far as I've tried.
My program is not network-ready at all yet, but even with OBS on and streaming, and my software running in the background while Elite Dangerous was running, the CPU wasn't getting that hot unless you boost the instance counts to a huge amount. The instance numbers in sccsv1.0 and sccsv1.1 can be reduced or removed altogether, and the physics engine can be turned off, by commenting out some lines in the scripts. I will loop back to this thread soon to at least post a tutorial on which lines to comment out to deactivate the physics engine altogether, and on which variables control which scene objects.

This means that, although LAN parties are pretty cool and never get old, where we see each other's screens and play as a team, once networking is activated in my software, if Carrier ship owners or any team leaders had a tool like this or better, the desktop screens of a team leader's crew would give vision to the Carrier captains or clan leaders in pirate or PvP fights, so that even the leader could give better commands to their noObies when they join the clan for serious gaming. I am also talking about multiple web pages displayed inside your cockpit, tons of YouTube videos or Netflix movies on your screen, with tons of alien and UFO references to get us all in the mood for some Elite Dangerous, and the ability to multiply the whole thing at the touch of a button, with the whole program loading within 5 seconds (actually 1 second or faster if you don't spawn too many objects currently). The more I develop this software, the more tools and functions it will have. I'm talking about snow falling inside your cockpit, in summer if you want, or flames burning your cockpit console without ever harming it, because it would be an independent, super-light particle system, without any sort or form of lag coming out of my program. Of course there is lag currently, and although I could fix it this week, I will have to work a few minutes to switch from VR to plain DirectX rendering, because my Oculus CV1 is broken. So no VR until it's repaired, and I really hate having to go back to DirectX rendering to fix this. Still, it takes seconds to recreate a DirectX environment from the tons of backups of my software iterations spread across my backup hard drives, and I could have it not lag within minutes.
But I have left the AtomicTorch Void Expanse community behind for months and months without ever updating my mods, and in the last two weeks I finally released my 3D object (.obj) generator for Void Expanse, released the start of the destructible mod I was talking about a while ago, and finally released the 9-tile chunk pathfinding I had mentioned in that community. Why such a huge story when I could answer you in one sentence? I am behind on my mod development and VR software, and my Oculus Rift CV1 broke in my face... so I'm like, whatever: I'm releasing everything, to at least show people that I am working on it, even though it's not there yet. But picture hundreds of web pages open in separate overlay windows inside your cockpit, with particle-system overlays burning inside it, when I go back and try PvP again. Losing a Type-10 Defender against a pirate Anaconda and its pirate goonies was a bad idea without ship insurance, because I lost that 120-million-credit ship. Then I got into some form of miscommunication with some PvP players in the Discord channels, we all got irritated, and I got my *** kicked by a supposedly super-good PvP player. When I deliberately let that Anaconda PvP player pull me out of warp so we could fight, I turned around and saw that huge vertical black obelisk, already with 75% of my hull HP gone, and I don't think I even had time to fire before I was dead. lol. Not enough insurance again, but it was a fair ship fight in terms of both of us wanting to slap each other's faces, and I have it recorded on video, still unreleased, of that vertical monolith of an Anaconda destroying me to bits.
I'm tempted never to release it because I never had a chance, lol, but the PvP player looks really good, raising his Anaconda up as if it were a standing black lozenge-polygon monolith, in all its grandeur in the darkness of space, destroying me to bits and ashes in seconds. There went my Chieftain too, since I didn't have enough money for insurance there either. They offered to do pirate missions with me and I declined; maybe I just needed a good blood rush from that Elite Dangerous PvP fight. It was a really cool initiation to PvP in Elite Dangerous.

I'd rather do a side-view combat project inside RPG Maker VX Ace just for fun, but I was serious even doing that; I can't help it. This project wasn't done as a 5-minute coding challenge. Only the release was rushed, but now that the blueprint is here, rebuilding this project from scratch can be a piece of cake for anyone.

Although the most important aspects, like the Steam overlay and the virtual keyboard, aren't incorporated yet, I spent hour after hour working on this program to make it work for gamers like you, because I used to be a gamer, and I still am. But after working so many hours, always connected to a public network, with no coding knowledge of how to protect my program, I feared all of my work and intellectual property could be stolen, so I decided to release everything in a rush, in its unfinished state, with nothing to gain. I don't think I had any choice but to release it in a rush like this, so I just decided to give everyone the blueprint of my program for free. And there was nothing like this on the internet before I posted it. Although I can see quite clearly how cool my VR desktop could be for Elite Dangerous in VR, I'm not done yet. The sound recorder currently works only with the microphone, but my goal was to have the record button ready in case I ever faced any Thargoids, so that I could study the differences between them, or at least see whether there were any differences in the Thargoids' hidden messages in sounds. Anyway, the Thargoids seem to be gone now; as far as I've read, there hadn't been many big encounters even before the Carriers came out. But why have a sound recorder inside a virtual desktop? Why not? I think it's just cooler.

On another note, my program is an instancing engine, able to spawn hundreds of objects without lag, and when the Steam overlay is working, I want to bring those objects inside my cockpit. I don't have an obj/fbx importer yet, but once it's there, it won't just be orange cubes/cylinders/spheres you see in your cockpit; it will be anything that can be made in 3D and imported into my engine, like a tree growing through your cockpit roof, birds flying in your cockpit, or sharks and fish swimming inside and outside of it...

My program is a blueprint people can use to learn to make their own virtual desktop environment if they wish, because it's now open source and on GitHub. It's fast, and it's made outside of Unity3D. How rare is that?

But you are right. Currently, as it stands, without a VR keyboard and the Steam overlay it's pretty useless, and even though I spent hundreds of hours, version after version, to finally get here, my Oculus Rift CV1 broke over a month ago. I'm ordering spare parts for the CV1 soon, and when I finally get the Steam overlay on track, I'll update this page. I posted elsewhere on the Elite Dangerous forums for help with the Steam overlay, but no replies yet, so I'll have to figure it out on my own.

It will be possible to disable the physics engine at any time. Adding the BEPU v1 and BEPU v2 physics engines to my project is actually a piece of cake for me, and then the program becomes a physics-engine stress tester. With instancing available inside the engine, I wanted to make tons and tons of things appear inside my cockpit as an overlay, without ever leaving the comfort of VR. Although you don't need a physics engine or a particle-system overlay in your Elite Dangerous cockpit, having a light physics engine, compared to having none, is not a disadvantage, especially when you can disable it by changing a couple of script variables (later to become visible UI buttons).

So currently, if you want a virtual desktop working out of the box, my program isn't for you because it isn't ready yet. I apologize for that and also apologize if you feel i wasted your time for checking my developer page. But thank you for your comments and critics, I really appreciate it. But right now, my software, as easy as it is to make it, it's useless in elite. it doesn't work yet as the overlay is intermittently working in vr. it worked great one day only to return to my backup project and see that it doesn't work anymore. i might try and fix some stuff in this virtual desktop but outside of vr, waiting for parts once they are delivered i can just plug that in and see if i can get all of it working for vr again. The sound recorder comes with a currently very basic and non intuitive at all visual spectrum made of instanced (multiplied and not instantiated objects, and in unity3d i tried to do instancing and i had some issues getting the instancing working for everything by using and learning from keijiro's kvant wall assets here so i decided to learn outside of unity3d on how to do it). so yeah it's free, it's open source and it doesn't work yet but if you would have time to try the program and swear a bit that it's lagging and completely useless for elite dangerous currently as it stands for the project SCCoresystems. The other project sccsv10 and sccsv11 currently work for the mouse cursor. if you want my current working virtual desktop solutions, you can look at this page. . i didn't activate my ab3d dxengine commercial version on this as it is looking more like a tutorial quality on how to make a virtual desktop work inside of virtual reality. i gave that project to andrej benedik at the ab4d studio although i am sure he already knew how to do it before i shared that with him, and i only use my github repository for my ab3d virtual desktop solution as a draft portfolio of things i have accomplished with the engine ab3d.dxengine. 
So I won't go back on my word and just start developing that project again; although, if you are interested in trying it, it is supposed to work, and I could contact Andrej Benedik at the AB4D studio to work out again how to activate my Ab3D.DXEngine commercial version, so that the community has access to the virtual desktop using Ab3D.DXEngine.OculusWrap instead of just Ab3D.OculusWrap. That would also make it much easier for me to provide updates to the editions I am working on. The graphics are better on the CV1 when using Ab3D.DXEngine.OculusWrap, so I always use both of his libraries in my projects, as the Ab3D.DXEngine.OculusWrap DLL depends on the Ab3D.OculusWrap DLL.

But why use my software when you already have other free virtual desktop solutions elsewhere? No reason other than that it's free and open source, I don't hide where I learned from, and I don't deliberately cause lag in my software just to irritate you when you're in a fight in Elite Dangerous. There is still work to be done here: the SCCoreSystems solution currently lags because I didn't have time to reorganize and clean my scripts before releasing those drafts. I do have three working versions of my virtual desktop, one of which I remember had a working keyboard at least, but I couldn't make multiple instances of the Jitter physics engine DLL work with them, so I shelved those annoying projects and cannot test them currently. I also have three versions without any of the sound recording code, the visual spectrum, or the physics engine, which I have also built myself. You don't need to reply to this thread to get them; I had intended to put them on my GitHub page as open source too, the moment I had reviewed them, as I thought they were quite slim and great as a working edition without much instancing or voxel things going on.
So my goal of a virtual reality overlay isn't reached yet, and a multi-display virtual desktop isn't reached yet either. But when you click your right Oculus Touch home button, you can still get my apps to work in the background as a virtual reality solution, at least the ones I posted there. None of those five projects have the extra tools (sound recording, objects, voxels), and I was able to get over 200 stable FPS on them for the screen capture. Something was wrong when running the game Void Expanse while my program wasn't the top window layer on Windows 10; I am not sure if I fixed that issue, and it might also happen with Elite Dangerous when it is not in the foreground on your desktop display while my program is loaded. I never tested those five projects with Elite Dangerous, though, so they might work great; personally I would be happy to know if they do. But consider them made by a game modder and hobbyist programmer, me, not a big game studio. Of course the solutions on the Oculus store are awesome, as are the others on GitHub who successfully made their VR desktop programs work as overlays in SteamVR. I couldn't make that work permanently, and I will continue working on it; I'd rather bang my head against a wall for hours, days or even weeks if needed to learn why I cannot make it work with the first few tutorials/Stack Overflow threads I tried, and I'd rather not read other GitHubers' code before trying the Stack Overflow forums, unless GitHub has tutorials to offer on the matter, which I didn't find last time. The overlay shows in SteamVR but never inside Elite Dangerous anymore. None of my programs currently uploaded to GitHub contain my SteamVR overlay attempts.
But to incorporate the SteamVR overlay faster, I could just use another MIT-licensed GitHuber's project and borrow their scripting techniques for getting the SteamVR overlay to work permanently in their C# VR desktop, and that might be just what I'll do. I will restart working on the overlay very soon.

EDIT 2021-april-21: Oops, I just realized my repo uses tons of Ab3D libraries that are unnecessary for the VR desktop to work. It shouldn't affect performance too much, and I might clean that repo sooner than expected, maybe this week. Those libraries are from the trial version, but if you load up the programs, the DLLs should prompt you that you just activated a trial license of the Ab3D.DXEngine and the other Ab3D trial libraries. If I activate my commercial version of the Ab3D.DXEngine, people will be able to use it beyond 60 days once I remove the DLLs I don't have a commercial license for. But I would have to set a price if I want to use the commercial version, and I'm not fixed on setting any prices on anything right now, so I can only offer demos, drafts and unfinished project startups at the moment. And I wouldn't set a price on an unfinished project; the ones I offer, I consider garbage drafts and shelved demos that aren't doing everything I want them to, until I put more work into them.
Ok, I'm going to be honest here. I'm all for programming for fun, to see what sorta things are possible. Can definitely imagine programming for VR is exciting.

That being said, I don't really see what would make me want to use this application. As mentioned, the description is very unclear, and the scope seems very unfocused. Why would I want a physics engine inside my virtual desktop environment? If I wanted to record game sounds, why would I want a virtual desktop application to do it? Why is it designed for Elite Dangerous? What makes it particularly suited to do so?

As you're obviously aware, a number of virtual desktop applications exist, so what do you aim to do differently with yours? I'm not even sure what I'm supposed to do with the virtual desktop that this app provides.
I 100% agree. I really don't think this belongs here. The poster should find a Reddit to join and post there. This seems like a "pet project" and not something that we need to know, especially in this kind of detail.
Edit-26August2021-10h37pm: The voxel physical keyboard mesh, using P/Invoke to press keyboard keys, is now working, but the keyboard-key voxel meshes don't have any spaces between them, so I will keep developing it. The voxel virtual desktop mesh is destructible just like the voxel keyboard, and the keyboard keys really are pressed when you hit the voxel mesh to destroy it, but the mouse cursor / Oculus Touch interactivity with the voxel virtual desktop mesh isn't developed yet. I will work on that soon too.

You can check the current status of my development here:

Edit-20august2021-01h19pm: Updated screenshot. To see more of this, please check my YouTube stream when I'm live; otherwise my past streams are available to watch as proof of work and proof of concept that I think I can still do something better with what I have already developed, and of where I am before I release my next version/revision. In my past streams I am developing the miniaturized instanced voxel chunk, developing/testing the Windows 10 virtual keyboard and mouse on the normal virtual desktop (I am not done with the mouse cursor limits on the virtual desktop screen), and developing/testing first/third person view for the VRIK human rig. When I stream, you cannot see my programming computer's screen recording, because OBS on that machine is used for the screen recording while my stream runs OBS only for my SJ4000 desktop camera; but I do record my screen on the programming computer. I am not finished developing: I will work on rotating the chunk object and destroying it while it is rotated. It is already destroyable with the pickaxe, and I developed a simili-inverse-kinematics byte-location system to have a lightweight byte-breaking location system on both the item that is held and the chunk that is being broken, but I am not done. I have been working on that and more since my last edit. I will see if I can also load up a voxel instanced WIP virtual desktop and a voxel instanced keyboard in the same scene, and I will update this post with a new edit when and if I am done working on that.


Edit-15august2021-03h28am: Here's a video of my current progress, shot from my camera phone: miniaturized voxels with some explanations. I didn't have time to do any editing yet, and it will finish uploading in about 1 hour 40 minutes at my current upload rate. It's a 3.5 GB, 36-minute video of nerd comments and miniaturized voxels in action, with brain lag left and right and a lack of verbal explanation of what I am coding. I'm probably removing that video.



Edit-15august2021-03h47pm: Tiny performance stress test. Lots of instanced voxels (for the WIP instanced voxel Virtual Desktop), but the inside faces are all drawn and it lags too much on my RX 570 video card. As long as I use "wrapped" voxel instancing, it will lag with that many voxels anyway, so I will need to work on "unwrapping" the whole thing. I can use a geometry shader for adding/removing vertices/triangles, but I am still unsure whether any vertex removal is possible on instanced meshes in low-level C# without changing the instance draw order. I did try in the past to remove vertices from the geometry shader only, without changing the instance draw order, and removing them on one instance removed the vertices completely on all instances and broke all instanced meshes at the same time. So I will have to use "unwrapping" (whether or not that's the right name for the technique) to not draw the inside faces of the instanced voxels and to use them somewhere else, because I have to deal with one mesh and multiple instances of it. I have some ideas in mind to move forward and try to unwrap the chunk, as if I were unwrapping a gift of tons and tons of extra vertices/triangles to display elsewhere (recycling in-scene vertices/triangles of instances and changing their positions in the scene is also a possible approach, I think).
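The "unwrapping" idea above boils down to classic hidden-face culling: emit a face only when the neighbouring cell in that direction is empty, so the inside faces are never sent to the GPU. A language-agnostic sketch (in Python; the mod itself is C#), with hypothetical names:

```python
# Sketch of interior-face culling for a voxel chunk: a face is kept only
# when the neighbouring cell in that direction is empty. Illustrative
# names; not the actual SCCoreSystems code.

FACE_DIRS = [(1, 0, 0), (-1, 0, 0), (0, 1, 0), (0, -1, 0), (0, 0, 1), (0, 0, -1)]

def visible_faces(solid):
    """Yield (voxel, direction) pairs for every face that borders empty space."""
    for (x, y, z) in solid:
        for (dx, dy, dz) in FACE_DIRS:
            if (x + dx, y + dy, z + dz) not in solid:
                yield (x, y, z), (dx, dy, dz)

if __name__ == "__main__":
    # A full 2x2x2 cube: 8 voxels * 6 faces = 48 faces if everything is drawn,
    # but only 24 outer faces actually border empty space.
    chunk = {(x, y, z) for x in range(2) for y in range(2) for z in range(2)}
    print(len(list(visible_faces(chunk))))  # 24
```

For a dense chunk the savings grow quickly, since interior faces dominate as the chunk gets larger.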


Edit-15august2021-03h24pm: Fixed the normals on the instanced Virtual Desktop voxels WIP. The lights still seem to not be working.


Edit-15august2021-03h24pm: The multicolor look is because I use the vertex binding InputClassification.PerVertexData input.color.xyzw for byte indexes instead of RGBA color values. With pixel shader manipulation, I don't need to send a color to the shader if I keep a color as a global variable inside the pixel shader itself. So instead of what I was using before, which took more space as an additional Vector4 vertex binding, I eliminated that extra Vector4 and reused the input.color.xyzw Vector4 to send the byte index to the shader, with the "w" component carrying the type of face at that index, from 0 to 5.

To send the bytes to the shader, I am now using 4 Vector4 instead of 8 integers. Each xyzw component is encoded with a padding digit of 5 in front instead of 1, like 51011.0f (where 1 means show face and 0 means don't show face for each byte). The x covers 4 bytes, the y covers 4 bytes, the z covers 4 bytes and the w covers 4 bytes, so I needed 4 Vector4 to cover 64 bytes. In the vertex shader I just move the vertex position somewhere it won't display the face when the flag is 0, so the instanced vertex stays in the scene. Right now I am limited by floating-point precision, and packing more bytes into a float might be problematic, so I decided to stick with my original maximum of 4 bytes of width, 4 of height and 4 of depth per instance. So instead of a set of 8 integers covering 8 bytes each for 64 bytes, I now use 4 Vector4 whose components each hold 4 bytes. Returning a new color in a pixel shader works.
Using input.color for something else wasn't really a choice: with the previous binding I had issues with the vertex binding padding, and destructed bytes wouldn't register any hits, because the padding in the vertex binding was wrong and the byte got lost in the buffer through some other vertex binding. So I decided to eliminate input.indexPos.xyzw and recycle input.color.xyzw for the byte map indexes and the face type. I also introduced a new vertex binding as InputClassification.PerInstanceData, called input.colorsNFaces.xyzw, in order to have different colors on each instance (although there are no faces inside it anymore, because I moved that into the w of input.color.xyzw, so I will rename it to something like input.instancecolor.xyzw). In input.colorsNFaces.xyzw, the xyz are the RGB colors, and I currently use the alpha as a CPU counter to change the color per frame. I am failing at applying per-frame Perlin noise to the xyz, which is why my demo trailer was showing instanced voxel faces emitting "light shocks/flashes" instead of a cool per-frame Perlin noise wave; I compute the Perlin on the CPU.
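The padded-float packing described above can be sketched as follows. The exact digit layout is my guess from the example value 51011.0f, so treat this as an illustration rather than the actual shader-side encoding. Note that a float32 represents integers exactly up to 2^24, so a 5-digit value like 51011 is safe, which matches the floating-point-precision limit mentioned above:

```python
# Sketch of the "4 flags per float, with a leading 5 as padding" packing:
# 64 visibility flags fit in 4 Vector4s (16 floats x 4 flags each).
# The digit layout is inferred from the example 51011.0f, not from the source.

def pack_flags(bits):
    """Pack four 0/1 face flags into one float like 5xxxx.0 (leading 5 = padding)."""
    assert len(bits) == 4 and all(b in (0, 1) for b in bits)
    value = 5
    for b in bits:
        value = value * 10 + b
    return float(value)

def unpack_flags(f):
    """Recover the four flags; the leading padding digit is discarded."""
    digits = str(int(f))          # e.g. "51011"
    return [int(d) for d in digits[1:]]

if __name__ == "__main__":
    packed = pack_flags([1, 0, 1, 1])
    print(packed)                  # 51011.0
    print(unpack_flags(packed))    # [1, 0, 1, 1]
```

The leading 5 keeps a leading 0 flag from being dropped when the float is read back as digits, which seems to be the role of the padding digit in the post.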


Edit-15august2021-02h03pm: Success at miniaturizing instanced voxels at units of 0.01f, destroying them, and correctly placing the screen capture on only one face. The fully destructible voxel Virtual Desktop is WIP, as no keyboard/mouse/Oculus Touch input works on it yet. Also, please note that the position of the instanced voxel virtual desktop cannot be changed for the moment and is only on positive x/y/z, and the rotation cannot be changed either. So it remains a working instanced breakable voxel example; as a static terrain solution (a level surrounded by walls where the player is unable to dig into negative x/y/z) it could still be doable, and I might try that later. I don't like it though, since I was able to develop a negative/positive destructible voxel planet inside the Unity engine whose position I can move without any issues. Outside of the Unity engine, in low-level C# instancing, I haven't been able to do that yet, and it is what I will be working on next. I will also need to do more testing: if the Virtual Reality IK rig has its vertices/triangles displayed, it is already a lot to render, and spawning a new mesh of voxels in addition to the VRIK rig starts getting heavy, but the two use different voxel instancing techniques. I can still spawn a lot of VRIK rigs, which aren't breakable yet; they could be, but breaking the instances of the VRIK rig would require more development time. Without having noted the vertex/triangle counts at every test, I cannot say which instancing technique is better, and I didn't test multiple VRIK rigs in addition to my new breakable instanced voxel technique, so I cannot say yet how many voxels I can spawn in the scene.


Edit-14august2021-11h46pm: Success at miniaturizing instanced voxels at units of 0.01f, destroying them, and correctly placing the screen capture on only one face. The fully destructible voxel Virtual Desktop is WIP, as no keyboard/mouse/Oculus Touch input works on it yet.


Miniaturized instanced breakable voxel, sized up with both hands to show how small. Size of 0.01f per byte/block, using LineList instead of TriangleList for a containment grid and with a different vertex/triangle setup.
Miniaturized instanced breakable voxels. Size of 0.1f per byte/block.


Instanced breakable voxels. Size of 1 per byte/block.

Using LineList:

Instanced breakable voxels. Size of 1 per byte/block, using LineList instead of TriangleList for a containment grid and with a different vertex/triangle setup.

Using point cloud:

Instanced breakable voxels. Size of 1 per byte, using PointList instead of TriangleList.

Instanced breakable voxels. Size of 1 per byte/block. Stress test; lag in scene.

Edit-10august2021: Here's a presentation video of my current progress, as explained below, and what to expect in the next release of my instancing engine SCCoreSystems.

Hi everyone, I am hard at work trying to deliver the next version of my virtual desktop solution. I developed an instanced breakable voxel terrain inside my engine SCCoreSystems. The voxel terrain can be reset instantly; there is currently no load time for the voxel destruction reset. Although there are not a lot of cubic voxels in the screenshot above, they are all instanced voxel meshes (not yet in my Unity engine portfolio, where they aren't instances yet). I am manipulating the bytes to render, or move away from rendering, the faces that are breaking, but all the faces are still drawn inside the chunk, so I would call this future release a "wrapped voxel chunk": I am unable to move the faces inside the chunk to display elsewhere for a bigger chunk to be destructed. When I work on the "unwrapped voxel chunk", I will cut a lot of vertices/triangles from being rendered in the scene, or use them to display other things. Currently, there are four types of voxel options. The first is a big destructible voxel chunk. The second is a voxel pathtracing ability, where the voxels are inverted and the user makes a path in the voxels: the voxels appear as the user moves inside the voxel bounding box. The third is a per-byte, per-user spatial location, so only the byte where the player is inside the voxel chunk will be visible.
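The instant reset described above (no load time for the voxel destruction reset) can be sketched as keeping a pristine copy of the chunk's byte map and restoring it in one bulk copy. All names here are illustrative, not from the SCCoreSystems source:

```python
# Sketch of an instant destruction reset: keep a pristine snapshot of the
# chunk's byte map and restore it in one copy, so there is no per-voxel
# rebuild time. Names are illustrative, not from the SCCoreSystems source.

class VoxelChunk:
    def __init__(self, byte_map):
        self._pristine = bytes(byte_map)       # immutable snapshot
        self.cells = bytearray(byte_map)       # live, destructible state

    def destroy(self, index):
        self.cells[index] = 0                  # 0 = voxel/face removed

    def reset(self):
        self.cells[:] = self._pristine         # one bulk copy, no load time

if __name__ == "__main__":
    chunk = VoxelChunk(bytes([1] * 64))
    chunk.destroy(10)
    chunk.destroy(11)
    print(sum(chunk.cells))   # 62
    chunk.reset()
    print(sum(chunk.cells))   # 64
```

Since the snapshot is taken once at construction time, the reset cost is a single memory copy regardless of how many voxels were destroyed.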

I put the Jitter physics engine inside a switch so that the user can decide whether to activate it; if you don't want it, like Lombra, it will be deactivated from the start, at least until I reincorporate the static/deactivated rigid-body disabling with a spread check on collision islands (which I developed). With that spread check, the Jitter physics engine seemed to show much fewer, or no, CPU performance spikes (judging by the Diagnostic Tools debug summary, without much further debugging) than with Jitter's normal rigid-body deactivation feature. I uploaded my project here, and it will soon find its way into the GitHub repo here when I move it, but that small improvement isn't enough yet. I would agree with Lombra on one point: a physics engine that isn't tweaked or tuned won't cut it for my solution, because the CPU usage of the out-of-the-box Jitter physics engine from the Google archive, with only its basic settings, didn't look good in the Visual Studio Diagnostic Window, and I didn't have much time to work on that. That's why I decided to put Jitter behind a switch. All the other tools inside my engine will stay, because I will have uses for them in the future, but I will include switches for the creation and rendering of tools like the audio spectrum, the voxel terrain, and maybe the IK rig, to reduce the number of voxel vertices/triangles, because memory usage for rendering climbs the moment I increase the vertices/triangles.
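The island-based deactivation idea (a "spread check on collision islands") can be sketched as a flood fill over the contact graph, where an island may sleep only when every body in it is slow. This is my guess at the general technique, in Python for illustration, not the actual Jitter code:

```python
# Sketch of island-based deactivation: bodies are grouped into collision
# islands by flood fill over contacts, and an island sleeps only when every
# body in it is slow. Illustrative guess at the technique, not Jitter's code.

from collections import defaultdict, deque

def islands(bodies, contacts):
    """Flood-fill connected components over the contact graph."""
    graph = defaultdict(set)
    for a, b in contacts:
        graph[a].add(b)
        graph[b].add(a)
    seen, result = set(), []
    for body in bodies:
        if body in seen:
            continue
        queue, island = deque([body]), set()
        while queue:
            cur = queue.popleft()
            if cur in seen:
                continue
            seen.add(cur)
            island.add(cur)
            queue.extend(graph[cur] - seen)
        result.append(island)
    return result

def sleeping_islands(speeds, island_list, threshold=0.05):
    """An island may deactivate only if all of its bodies are below threshold."""
    return [isl for isl in island_list if all(speeds[b] < threshold for b in isl)]

if __name__ == "__main__":
    speeds = {"a": 0.01, "b": 0.02, "c": 2.0, "d": 0.0}
    isl = islands(speeds, [("a", "b"), ("b", "c")])   # {a,b,c} and {d}
    print(len(isl))                                   # 2
    print(len(sleeping_islands(speeds, isl)))         # only {d} can sleep -> 1
```

Waking a single body then wakes its whole island, which is why checking the entire island before sleeping avoids the spikes that per-body deactivation can cause.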

I have also worked on bringing inverse kinematics to the legs of the in-game character controller. Although I had expected to be able to reuse the arms and multiply them into legs, it didn't quite go that way out of the box with what I had developed before. Making them collide with the floor is where I will need to put in more work. I've already developed some inverse kinematics legs in my Unity engine portfolio here, but there are no virtual desktops inside that repo yet.
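For the leg IK mentioned above, the standard building block is two-bone IK via the law of cosines. Here is a minimal 2D sketch (hip-knee-ankle), purely illustrative, since the actual rig is a 3D VRIK setup in C#:

```python
# Sketch of two-bone (hip-knee-ankle) inverse kinematics in 2D using the law
# of cosines: given a foot target distance, solve the knee angle.
# Illustrative only; the actual rig is a 3D VRIK setup in C#.

import math

def two_bone_ik(target_dist, thigh, shin):
    """Return (hip_angle, knee_angle) in radians for a reachable target."""
    d = min(target_dist, thigh + shin - 1e-9)   # clamp to reachable range
    # Law of cosines: d^2 = t^2 + s^2 - 2*t*s*cos(knee_interior)
    knee_interior = math.acos((thigh**2 + shin**2 - d**2) / (2 * thigh * shin))
    knee = math.pi - knee_interior              # 0 = fully straight leg
    # Hip angle measured from the hip-to-target line.
    hip = math.acos((thigh**2 + d**2 - shin**2) / (2 * thigh * d))
    return hip, knee

if __name__ == "__main__":
    hip, knee = two_bone_ik(target_dist=0.9, thigh=0.5, shin=0.5)
    print(round(math.degrees(knee), 1))
```

Clamping the target distance to the leg's reach is what prevents the "issues when the angles reach a certain area" kind of failure: `acos` is only defined on [-1, 1], so an unreachable target would otherwise raise a domain error.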

I have also re-added the option to "recall" the Virtual Desktop, just like in my solution sccsv11.

I have tweaked the inverse kinematics a bit, but not that much. I still have issues when the angles reach a certain area, but the limbs now look and rotate much better.

Most importantly, I was able to make the Windows 10 virtual keyboard work inside my solution, so I can say that my next released version will have both the mouse and keyboard working. I also have a prototype voxel mesh keyboard working. I will add some more screenshots to this post later today.

My current working virtual desktop solutions can also be found on my GitHub page. None of sccsv10, sccsv11 and sccoresystems-rerelease have a working keyboard, except some projects in my repo where I remember leaving at least one or two working virtual desktops with a working virtual keyboard and mouse that I had also developed; but there are no inverse kinematics in that repo yet. My virtual desktop solutions can be found here, where users will need to build the DLLs themselves for the moment:
Very soon here, under releases.

Testing my homemade Arduino hand controllers' circuitry (WIP: it's missing 2 triggers and all of the rest, rofl, but the 3 buttons and thumbsticks work for both controllers) with an older version of my engine SCCoreSystems:

Presentation of my solution pre-sccsv10:

Presentation of my solution sccsv10:

Presentation of my solution sccsv11 - soundtrack futuresequenceTumulus by Drombeg :

DawnTreader777, as far as I know, my virtual desktop solutions work inside Elite Dangerous (when using the Oculus app Home button, or alt-tabbing and having my app work in the background), for the most part. So even if it's a pet project, I see no reason to leave. Maybe I will see a reason to stop developing this for Elite Dangerous if Frontier develops its own virtual desktop solution.

CMDR Von Abrahart - I am trying to bring a lot of voxel flavors into my solution, and users/players/CMDRs will have a chance to break/destruct some voxels in my next release, hopefully without any sort or form of lag for the CMDRs and their computers.

Thank you for your continued patience,

N.B. Please note that I had started using "D" class names for things I was developing myself (like the DContainmentGrid class, and I will check which others; it's almost as simple as adding a third loop to cover the xyz axes instead of only two, but I still had to code it), as I was unsure whether I was going to do an unofficial Rastertek tutorial on instancing, virtual reality, inverse kinematics and voxels. I think I dropped that idea a while back, and only a couple of remnants of my old code still have "D" classes as they were named in the DRastertek C# tutorials, but my scripts and engine remain very compatible with where I learned from.

Also, please note that I would so love to restart playing Elite Dangerous and finally jump into Odyssey, but I haven't had the time yet, because I wanted CMDRs and myself to have my tool ready first: instant instanced heightmap mesh screenshots of the virtual desktop screen capture, which any player could take while playing, using a camera or anything else, to keep track of what you are doing and of the awesome environments you are in while playing the Elite Dangerous simulator. It is actually a very cool idea, and I am not promising anything, because I don't know if we can get heightmaps of arbitrary games showing in a Windows/Ubuntu desktop screen capture, but with what I have developed in code, I am right at the door of making instanced heightmap meshes from a Windows background of a planet Earth heightmap.
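The heightmap idea above can be sketched as mapping each captured pixel's brightness to a height value, which would then displace an instanced mesh. The function name and the nested-list pixel data here are stand-ins, not the real capture pipeline:

```python
# Sketch of turning a grayscale screen capture into a heightmap grid: each
# pixel's brightness becomes a height, ready to displace an instanced mesh.
# The nested list stands in for a real capture; names are hypothetical.

def heightmap_from_pixels(pixels, max_height=1.0):
    """Map 0-255 brightness values to 0..max_height floats."""
    return [[(v / 255.0) * max_height for v in row] for row in pixels]

if __name__ == "__main__":
    capture = [
        [0, 128, 255],
        [64, 192, 32],
    ]
    heights = heightmap_from_pixels(capture, max_height=2.0)
    print(heights[0][2])   # brightest pixel -> 2.0
    print(heights[0][0])   # black pixel -> 0.0
```

Whether this produces a useful mesh for an arbitrary game depends entirely on whether the game's brightness happens to correlate with depth, which matches the "I am not promising anything" caveat above; for an actual heightmap image like the planet Earth example, it works directly.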

In the same line of virtual reality voxel projects with inverse kinematics, but without a virtual desktop, I have developed a tiny portfolio of different versions of breakable voxels here:


Those Unity projects can be found on my GitHub page here.