Has FDev considered building an optical computer?

Since I have worked in the electronics industry for the last 25 years, at a company that develops some of the leading tech hardware, I can say that the hardware we have now is not the limiting factor in what computers can do for 99% of the applications most users would run. Additionally, much of the software out there now does not fully utilize the resources of most modern CPUs.
Most of the hardware now is System on a Chip (SoC), where an ASIC has most of its functions within the chip package. Some chips have a single silicon die, and others have multiple dies in one package. As a result, what a device can do is tied to the number of transistors that fit on the die. The scale at which they are making these chips now is absolutely astounding. They can make a transistor as small as 7 nm now, and the next node is 4 nm. That is obscenely small, and it allows more transistors to be packed into a device, reduces power consumption, and increases speed.
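To put a rough number on what a node shrink buys you, here is a toy calculation - a sketch only, since node names these days are marketing labels rather than literal feature sizes, so treat the result as illustrative:

```python
# Illustrative only: ideal transistor-density gain between process nodes,
# treating the node name as a linear feature size. Modern node names no
# longer map to a physical dimension, so this is a ballpark, not a measurement.
old_node_nm = 7
new_node_nm = 4
density_gain = (old_node_nm / new_node_nm) ** 2  # area scales with length squared
print(f"Ideal density gain: {density_gain:.1f}x")  # -> ~3.1x
```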
The method used for moving data between chips on a board is much different than it was 20 years ago. For the most part, parallel buses with single-ended termination are a thing of the past for high-speed applications. Nowadays they use Low-Voltage Differential Signaling (LVDS) with SerDes blocks in the chips to achieve high data rates between chips. And Multi-Gigabit Transceivers can achieve data rates that would blow your mind, not only within a circuit board but across cables that connect different pieces of hardware. This is what the PCIe bus uses to connect your video card to the CPU on your motherboard.
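For a sense of the data rates involved, here's a quick back-of-the-envelope sketch for a PCIe 3.0 x16 link (the per-lane rate and line code are published spec values; real-world throughput is lower still once protocol overhead is counted):

```python
# Back-of-the-envelope: effective throughput of a PCIe 3.0 x16 link.
# PCIe 3.0 runs 8 GT/s per lane with 128b/130b line encoding.
GT_PER_LANE = 8e9                 # raw transfers per second, per lane
LANES = 16                        # typical graphics-card slot
ENCODING_EFFICIENCY = 128 / 130   # 128b/130b line-code overhead

raw_gbytes = GT_PER_LANE * LANES / 8e9
effective_gbytes = raw_gbytes * ENCODING_EFFICIENCY
print(f"Raw: {raw_gbytes:.1f} GB/s, effective: {effective_gbytes:.2f} GB/s")
# -> Raw: 16.0 GB/s, effective: 15.75 GB/s (per direction)
```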

A purely optical computer is impeded by the fact that a computer functions on the principle of binary logic (0 or 1), and how would one go about interpreting light frequencies into logic states that could be calculated? At the end of the day, for the data to be displayed on a screen, it would have to be converted back into binary data at some point, which would negate the massive bandwidth advantage of an optical link.
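That conversion step is easy to picture. Here's a minimal sketch of the decision an optical receiver makes, assuming plain on-off keying; the sample values and the 0.5 threshold are made up for illustration, not taken from any real photodetector:

```python
# Minimal sketch of an optical receiver's decision step: on-off keying (OOK),
# where each sampled light intensity is thresholded back into a binary value.
# Sample values and threshold are illustrative assumptions.
samples = [0.91, 0.07, 0.12, 0.88, 0.95, 0.03]  # normalized photodiode readings
THRESHOLD = 0.5

bits = [1 if s > THRESHOLD else 0 for s in samples]
print(bits)  # -> [1, 0, 0, 1, 1, 0]
```

Every bit has to pass through a decision like this before conventional logic can touch it, which is the point above about losing the optical bandwidth advantage.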
Now with quantum computing, they are taking the binary machine-state principle and essentially multiplying what a single operation can work with. Instead of a single 0 or 1, a pair of qubits can sit in a superposition of 00, 01, 10, and 11 at once. This can make a single calculation cover far more than traditional transistor gate logic does. No doubt as the technology matures they can take this idea and scale to 4, 8, or 16 qubits, the state space doubling with each qubit added, thus exponentially increasing capacity without needing faster clock cycles. This would also reduce the number of gates that would need to be put into a device to achieve a given level of performance.
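To make the superposition point concrete, here is a tiny state-vector sketch in plain NumPy (no quantum library assumed): a Hadamard gate on each of two qubits leaves the pair in an equal superposition of all four basis states at once.

```python
# Two qubits, each put through a Hadamard gate, end up in an equal
# superposition of |00>, |01>, |10>, |11> - the 2^n state-space growth
# described above. Plain NumPy state-vector math, no quantum library.
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # single-qubit Hadamard gate
zero = np.array([1.0, 0.0])                   # the |0> state

state = np.kron(H @ zero, H @ zero)           # joint state of both qubits
for i, amp in enumerate(state):
    print(f"|{i:02b}>: amplitude {amp:.3f}, probability {abs(amp) ** 2:.2f}")
# each of the four basis states comes out with probability 0.25
```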

The idea would be to either quickly pull out the needed logic on one end, reusing the right hardware in a specialized way and minimizing the data, or something else. It would depend on how many things you have to use as outputs. If you treat the data directly as something the GPU already deals with, then it could simply spit some out to the monitor. The CPU could possibly use an encryption or other method to very quickly deal with some of the data to get it to its next location, if possible. I was thinking the end would be very fast passthroughs, minimized so as to still get effective speeds, and maybe some specialized software tricks to reduce the data handled so you still get the effective bandwidth. Some of this could be done on the optical side to get it directly to the component as needed, or sent through the bus, with software using some logic to pick out the correct data, basically on the software writer's side. If you use varied forms of data, like using the color codes of the data, can you quickly convert to a usable form directly, in a simplified manner? And use similar logic to get use out of all the data to increase overall effective bandwidth. Something I'm assuming everyone now is too lazy to bother with or doesn't care about for various reasons.

Then the optical part becomes a converter system for highly specialized software, depending on how much you can minimize the size and time needed for the data on the traditional end of it. If it's only 1,000 times the speed of a current machine, that's not bad, especially if you can use the optical side to convert data fast and efficiently. I would think the problem is getting it not to thrash the traditional equipment. It might not be maximum throughput, but it could be a hell of an upgrade. And if you could condense the data down with more optical bridges, combine streams, and still use the effective data, maybe you could keep increasing its effective performance to some degree, especially with the right software/hardware combos. I'm assuming it would turn out a bit like multiple-GPU setups that way, which I wonder if this could fully utilize while acting similarly in their own functioning.

If you need to, you can always put a loop or other manipulations into the optical side and just use the hardware as a distributor, and do it very efficiently. I'm assuming it depends on the hardware it's working with. I'm also assuming there are some ways to reduce the data when and if needed, and to use the traditional software sparingly, but I'm also hoping there is a way to simplify the optical side. There could be some interesting ways to use the read side to do manipulations or something. Can you use heat or other aspects to make the hardware pass on the data in an efficient form, and basically reuse things as much as possible? Obviously, the practical goal is to get as much out of it as you can. How far can the software side go? That is technically the more versatile side, depending on whether you can get the form of the data efficient enough. It also depends on how much hardware needs to be built. I think I was thinking some stuff could be pre-translated in some ways, using a very predictive form of software, maybe with pre-known efficiencies on the fly somehow.
 
Yeah, this thread has become something of an entertaining, nonsensical rabbit hole. At first I thought the OP was some kind of savant and was actually going to support him... that was until I saw the huge glaring omissions in his theoretical design, and the comment:

And I was like "computers without logic gates, hmmm..." Then the onionhead allegations came in and I was like, "very likely", especially when you take into consideration he appears to think he has single-handedly worked out the theory of optical computing that has seemingly eluded the "disingenuous researchers" he accuses of, or at least infers, "are essentially stealing research money". Messiah complex? So universities cannot crack this by throwing money at experts in the field, yet he fully expects a small software company to plough its resources into mastering it for the benefit of one of their games. OK. Or maybe he is the genius he thinks he is and we are all wrong, but on the balance of probabilities I don't see that happening.

Seriously @Noobilite - I'm dissing you here based on how your posts in this thread have portrayed you. I would like to be wrong on this, so please go and master your concepts, prove they work, get some publicity, and necro this thread to give us all a big and deservedly sanctimonious, bordering-on-smarmy "I TOLD YOU SO!!!!".
Pure optical computers wouldn't need logic gates in the traditional form. You could send the data to specialized equipment to parse it out as needed, at least potentially. You could deal with it in a lot more ways than with normal modern logic gates, as you could deal with much more data at once (and would need different manipulations). Not to mention you could hypothetically swap the hardware out more easily, depending on how it's constructed, and either change it or add on for new outputs. It would open up a bunch of new methods for computer design. Treating it purely as traditional hardware would be a bit restrictive, unless you can use/keep the part forever, and hope it's not overly expensive, doesn't become obsolete with an upgrade, and is reusable for other applications. A proper optical computer should be basically completely open-ended, adjustable, and upgradable.

I'm basically referring to using pure modern logic gates and low-bandwidth light to literally make a computer, to make up for the data difference. There are other ways to do it, unless of course you intend to stick to the same formats for other hardware in the long run, or are just working for the relative short term.
 
Hey, it's been dark for a while here and the show is still going? And I thought stand-up comedy was a 10 minutes affair...

Also, how would an optical computer work on a very foggy morning? Rather badly, I suppose - oh, what a misfortune that could be.
 
Something I'm assuming everyone now is too lazy to bother with or doesn't care about for various reasons.
And you're so capable, diligent and professional that you deem non-optical computer research laziness? Or are the rest of us just too stupid or lazy to understand your wonderful ideas, which:

Pure optical computers wouldn't need logic gates in the traditional form. You could send the data to specialized equipment to parse it out as needed.
Thus introducing more lag by having to translate from light to data and back again...

Also, how would an optical computer work on a very foggy morning? Rather badly, I suppose - oh, what a misfortune that could be.
Don't know - I'm completely mystified by it.
 
And you're so capable, diligent and professional that you deem non-optical computer research laziness? Or are the rest of us just too stupid or lazy to understand your wonderful ideas, which:


Thus introducing more lag by having to translate from light to data and back again...


Don't know - I'm completely mystified by it.

1. No, I'm referring to the fact that stuff like this is not done to peak performance for all applications, for practical reasons, potentially. There is always more room, and lots of it. Not to mention convenience for the people doing the work, and potentially dealing with pay vs. education levels. (So, basically yes.)
2. Not if it's all light-based, like a monitor knowing how to send the data to get the correct end result on the monitor purely from light, with the same light sent on directly and manipulated. You can duplicate and send the full data stream and have the hardware physically remove/manipulate it and send the correct light directly to the monitor, basically.
 
I didn't read all that either, but I am guessing if everyone starts spending $1000 on ARX then FDev might consider it.
 
Maybe the OP thinks FD can't introduce his game-breaking, stupendously fair, and above all logical massive ships (check out his 'builds' in his sig lol) unless the entire game is rewritten for a computer that hasn't been made yet.
 
@Noobilite All you have described is an optical-to-digital modem; you've not given any details on how you intend to PROCESS any information you encode into your optical computer. Also, how do you STORE data as light? Hell, how do you even generate light as an input or output - you said micro LEDs, what's driving them? Remember that LEDs are Light Emitting Diodes - ELECTRONIC devices - so your proposed optical computer needs electronics to control the first part of the process, and electronics to decode the output colour/wavelength/frequency* of light arising from the output. You say miniaturisation isn't needed; are you sure you want to go back to computers the size of houses?

In summary, everything you have said thus far about the optical computing model you propose Frontier develop in their basement for this game - because duh, it's so easy and computer scientists are lazily misappropriating research funds - amounts to this:
It cannot store data
It cannot process data
It uses different coloured light going through the optical pathways
It uses (micro)LEDs as its light source - thus requiring more electronics
It passes "data", the light generated from those micro LEDs, to electronic detectors and on to other "specialised equipment" in an open loop

Thus:
It has two transduction steps, input and output, each adding more lag to the "throughput" of data
It is essentially a communications protocol, not an actual computing platform
ERGO it must slow down computing rather than enhance it.

The more you type, the sillier you are making yourself look.
 
@Noobilite All you have described is an optical-to-digital modem; you've not given any details on how you intend to PROCESS any information you encode into your optical computer. Also, how do you STORE data as light? Hell, how do you even generate light as an input or output - you said micro LEDs, what's driving them? Remember that LEDs are Light Emitting Diodes - ELECTRONIC devices - so your proposed optical computer needs electronics to control the first part of the process, and electronics to decode the output colour/wavelength/frequency* of light arising from the output. You say miniaturisation isn't needed; are you sure you want to go back to computers the size of houses?

In summary, everything you have said thus far about the optical computing model you propose Frontier develop in their basement for this game - because duh, it's so easy and computer scientists are lazily misappropriating research funds - amounts to this:
It cannot store data
It cannot process data
It uses different coloured light going through the optical pathways
It uses (micro)LEDs as its light source - thus requiring more electronics
It passes "data", the light generated from those micro LEDs, to electronic detectors and on to other "specialised equipment" in an open loop

Thus:
It has two transduction steps, input and output, each adding more lag to the "throughput" of data
It is essentially a communications protocol, not an actual computing platform
ERGO it must slow down computing rather than enhance it.

The more you type, the sillier you are making yourself look.

Yes, I have said a bunch of that. It would be driven by a video card. Micro LED is because it has nanosecond response times - although probably not single-nanosecond response times.

You store data on a hard drive.

I wasn't thinking this, but you can potentially keep data in the light stream and pull data out as a form of calculation. If it's big enough, you can keep all the data going as a single light color reference. You could even use the speed of the light's response to changes to make the light be the correct color, so it fits the needed thing for the output instead of making the output calculate it. Then the input just needs to know what to send predictively. You can do that with any software. And you can hypothetically use much more simplified data than if the game calculated everything the traditional way, as you can potentially stick with simplified data. There are a lot of ways to do this on the software side, potentially, if you are only sending the data first to the optical side and then making it send the correct data to the output. At least as one example. Maybe combine the two.
 
1. No, I'm referring to the fact that stuff like this is not done to peak performance for all applications, for practical reasons, potentially. There is always more room, and lots of it. Not to mention convenience for the people doing the work, and potentially dealing with pay vs. education levels. (So, basically yes.)
2. Not if it's all light-based, like a monitor knowing how to send the data to get the correct end result on the monitor purely from light, with the same light sent on directly and manipulated. You can duplicate and send the full data stream and have the hardware physically remove/manipulate it and send the correct light directly to the monitor, basically.
1. = You saying "no, no, it's not that" but yes it is - you saying computer science as it is just now is a grand larcenous misappropriation of funds 😂
2. = You saying your optical computer is going to work like a computer monitor by manipulating light. Newsflash, sonny: monitors work off electricity until their final output of light; there is no manipulation of light in a monitor.

OR are you saying you will manipulate the light on its route to the monitor? Tell me how you are going to do this. Tell me how you are going to store light to be able to use its data later. Tell me how you are going to use light to drive the monitor - translating it to electrical signals for the journey from your optical computer to the monitor is not an option, as it would, as I've already stated, induce lag and erode performance.
 
1. = You saying "no, no, it's not that" but yes it is - you saying computer science as it is just now is a grand larcenous misappropriation of funds 😂
2. = You saying your optical computer is going to work like a computer monitor by manipulating light. Newsflash, sonny: monitors work off electricity until their final output of light; there is no manipulation of light in a monitor.

OR are you saying you will manipulate the light on its route to the monitor? Tell me how you are going to do this. Tell me how you are going to store light to be able to use its data later. Tell me how you are going to use light to drive the monitor - translating it to electrical signals for the journey from your optical computer to the monitor is not an option, as it would, as I've already stated, induce lag and erode performance.
I'm saying monitors could have it built into them as specialized hardware and be sent simple raw signals. It depends on the hardware design. Optical leaves a wider range of options at its base, as you can maintain speed and infinitely split the stream, potentially.

And, yes, hardware development is steadily turning into mass fraud. It will keep getting worse over the next several decades or more, and into the future. Everything goes through this process, and we are seeing increases in it.
 
1. = You saying "no, no, it's not that" but yes it is - you saying computer science as it is just now is a grand larcenous misappropriation of funds 😂
2. = You saying your optical computer is going to work like a computer monitor by manipulating light. Newsflash, sonny: monitors work off electricity until their final output of light; there is no manipulation of light in a monitor.

OR are you saying you will manipulate the light on its route to the monitor? Tell me how you are going to do this. Tell me how you are going to store light to be able to use its data later. Tell me how you are going to use light to drive the monitor - translating it to electrical signals for the journey from your optical computer to the monitor is not an option, as it would, as I've already stated, induce lag and erode performance.
People are working on this.
Top. People.
(Indiana Jones meme).
:D
 
Here’s something on the subject passed through half a dozen translations looping back to English.
It kind of feels in tune with the OP.

This calculates or calculates light photon particles with a laser diode. In the mid-sixties, light particles (fiber optics) are used in the calculation of electron velocity.

Digital computer systems, data, how many research projects, flow theory, most computer components are theories. This method is optimized for the computing company in the short term prospect for two-way optoelectronics sitting together, it seems. However, optoelectronic devices or light particles, electrons increase up to 30% of returned objects. When you send an exchange of messages as an invitation. Changes to eliminate the need for electricity (€) A call to the theory of computers to eliminate the theory of electricity. [1]

As a view in radar theory, this synthesis has been developed to use application-specific device (SAR) theory and optical correlators. For example, optical information, time interval identifiers, and communication stages can be divided into objects [2]. [3]
 