Has FDev considered building an optical computer?

Data transmission is not the same as data processing and bandwidth from component A to component B is rarely the primary limiting factor in performance.

Optical/photonic computing is a fascinating topic, but thinking it's currently some sort of drop-in replacement for electrical circuits, or that an interconnect constitutes a processor, is a grievous fallacy.

A short article on the general topic...pay particular attention to the 'challenges' section: https://www.findlight.net/blog/2019/02/01/optical-computing/



You've described the theoretical throughput of some sort of vague optical interconnect which does not remotely constitute a computer, let alone a product that Frontier could acquire.



Just because you have millions of LEDs and some fiber optic cable doesn't mean you could build a useful, much less super, computer with them.



The optical components you haven't described aren't doing anything except acting as a loopback.



OP is proposing a 1 GHz, 16.7-million-bit optical interconnect that would provide vastly more bandwidth than the hardware it's connected to could use, while harming performance by requiring latency-introducing conversion at each end.
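To put rough numbers on that, here's a back-of-envelope sketch. The two readings of "16.7 million" (24-bit colour symbols vs. 16.7 million parallel lanes) and the PCIe/DDR4 comparison figures are my assumptions for illustration, not anything from the thread:

```python
# Back-of-envelope bandwidth check (illustrative assumptions only).

SYMBOL_RATE_HZ = 1e9  # the proposed 1 GHz symbol rate

# Reading 1: "16.7 million" means ~2**24 distinguishable colour levels,
# i.e. 24 bits encoded per symbol.
rate_serial = SYMBOL_RATE_HZ * 24            # bits/s
print(f"24-bit symbols at 1 GHz: {rate_serial / 8e9:.0f} GB/s")

# Reading 2: 16.7 million parallel one-bit channels, one bit each per cycle.
rate_parallel = SYMBOL_RATE_HZ * 16.7e6      # bits/s
print(f"16.7M parallel lanes at 1 GHz: {rate_parallel / 8e15:.1f} PB/s")

# For scale: PCIe 4.0 x16 moves roughly 32 GB/s and dual-channel
# DDR4-3200 roughly 50 GB/s -- on the parallel reading the link
# outruns anything a consumer host could feed it by many orders
# of magnitude, so the conversion at each end is pure overhead.
```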



This is a bunch of treknobabble that makes vague references to an optical computer (the 'read device') doing some sort of processing, but no coherent mention of where this optical computer would come from.

The smallest LEDs are many orders of magnitude larger than modern transistors and you'd require far more than just an LED to make an optical transistor equivalent. You could use them as part of a set of logic gates that talked to each other with flashing light, but they'd be enormous, and enormously slow, compared to an equivalent integrated circuit.
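To illustrate the scale gap, a crude density comparison. The ~1 µm LED pitch and ~50 nm transistor pitch are round numbers assumed for illustration, not measured values:

```python
# Crude device-density comparison; both pitches are assumed round numbers.
led_pitch_m = 1e-6    # micro-LEDs are on the order of a micrometre across
fet_pitch_m = 50e-9   # modern transistor footprint, order of tens of nm

# Devices per unit area scale with the inverse square of the pitch, so the
# squared pitch ratio gives transistors per single-LED footprint.
density_ratio = (led_pitch_m / fet_pitch_m) ** 2
print(f"transistors per LED footprint: ~{density_ratio:.0f}x")
```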

Even real proposals for photonic transistors are likely to be limited to signal routing applications because they may never be competitive for processor logic. Faster theoretical switching speed, even if fully realized, likely wouldn't overcome the density penalty. If you can cram a hundred times as many electronic transistors in the same area, they are going to make a much faster processor, even if they cannot cycle as fast.
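The density-vs-clock trade can be made concrete with toy numbers. The 10x clock advantage and 100x density penalty below are illustrative assumptions, not measured figures:

```python
# Toy model: aggregate switching capacity ~ device count x clock rate.
# All numbers are illustrative assumptions.
electronic_devices, electronic_clock = 100.0, 1.0   # dense but slower
photonic_devices, photonic_clock = 1.0, 10.0        # faster but sparse

electronic_throughput = electronic_devices * electronic_clock
photonic_throughput = photonic_devices * photonic_clock
print(f"electronic: {electronic_throughput:.0f}, photonic: {photonic_throughput:.0f}")

# Even granting a 10x faster switch, a 100x density deficit leaves the
# photonic chip an order of magnitude behind in this crude model.
```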

You realize that half the people who keep trying to make transistors and logic gates to process optical stuff are con artists. You wouldn't use anything of the sort in an optical computer... You just need to understand combining streams of light into new streams of light, reading the color range as a data set, and then manipulating or segregating the stream. It's all pretty low tech unless you want to shrink it a lot.

It also doesn't need to be shrunk to get bandwidth, and it doesn't have to take up much room. Why does everyone think this? Quit applying electronics thinking to an optical computer. It would be made completely differently, except where it bridges to an electrical device.

You are thinking of another form of optical computer mimicking electrical components we have now. That is completely idiotic and I have no idea why any researchers are even bothering unless they are literally stealing research money. It's much slower too unless you really want to package a lot of smaller stuff into CPU dies...

You do not need to hypothetically shrink optical components as much though. Or at all. As the basic math shows, you can get a petabyte of throughput with a single stream. If you use two streams you can combine them with physical manipulation and send them to normal components, or develop pure optical versions, which are very simple by design. If using bridges to electronics, it's a matter of whether you can simplify the data into a form the normal computer can deal with. Probably by separating out the data needed in optical form and only making the normal PC deal with what else is needed. This gives the potential benefit of using the normal PC for tracking and checking for errors in the optical side, and for calibration and other things. You just re-utilize the other electrical parts for more than their original intent. And use them to get video etc. You can do it if you want to.
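For scale, here is what "a petabyte with a single stream" demands of each symbol, assuming the 1 GHz symbol rate discussed earlier in the thread (the arithmetic is mine, not a design from any post):

```python
# What a 1 PB/s single stream would require per symbol, assuming the
# 1 GHz symbol rate mentioned earlier in the thread.
target_bytes_per_s = 1e15          # one petabyte per second
bits_per_s = target_bytes_per_s * 8

symbol_rate_hz = 1e9
bits_per_symbol = bits_per_s / symbol_rate_hz
print(f"bits encoded per symbol: {bits_per_symbol:,.0f}")

# 8,000,000 bits per symbol means distinguishing 2**8000000 optical
# states per clock cycle -- the receiver, not the fibre, is the
# hard part of the claim.
```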

And read devices are much more powerful than write devices. So, it's just a matter of reading a single light stream, which can be shrunk or grown between any size of device with mirrors or lenses at the point of transition. It's a matter of how complex or simple you want the optical side. It can be done with one or two light streams and a mirror, depending on the setup. You can process predictively if you know what you are sending and what you should receive. It's not just an interconnect...

The reason for the patch is that read data could be used to feed a form of error correction back through the other side to make sure it works. It might not be totally live, unless it can be simplified enough for the other components, but it might not need to be. If you know what it should be, and you know what you are getting, you can adjust what you send to get what you should receive. If it's a static problem, you are set.
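Read charitably, "adjust what you send to get what you should receive" for a static problem is pre-compensation (pre-distortion against a known channel), which real links do use. A minimal sketch with a made-up linear channel; the gain and offset values are invented for illustration:

```python
def channel(x, gain=0.8, offset=0.05):
    """Hypothetical static distortion the link applies to each level."""
    return gain * x + offset

def precompensate(target, gain=0.8, offset=0.05):
    """Invert the known, static distortion before transmitting."""
    return (target - offset) / gain

targets = [0.0, 0.25, 0.5, 1.0]
received = [channel(precompensate(t)) for t in targets]
assert all(abs(r - t) < 1e-9 for r, t in zip(received, targets))

# This works only because the distortion is static and known in advance:
# it is calibration of a link, not computation.
```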

BTW, one of the main advantages of optical computers is getting away from the complexity of modern systems and making things much simpler again.
 
And if we all upgraded our PCs and consoles to use cheap quantum-comms entangled to FDev's servers there wouldn't even be any network lag.
I'll say there wouldn't! I probably shouldn't say, but I'm part of a small team alpha-testing this for FDev right now. It's very excellent. Next month I made 2 trillion credits off of next week's mission-board glitch.

o_O
 
You realize that half the people who keep trying to make transistors and logic gates to process optical stuff are con artists. You wouldn't use anything of the sort in an optical computer... You just need to understand combining streams of light into new streams of light, reading the color range as a data set, and then manipulating or segregating the stream. It's all pretty low tech unless you want to shrink it a lot.

Since it's so cheap and easy I look forward to your first product!
 

Stealthie
OP is proposing a 1 GHz, 16.7-million-bit optical interconnect that would provide vastly more bandwidth than the hardware it's connected to could use, while harming performance by requiring latency-introducing conversion at each end.

Reverse the polarity.

It'll be fine.
 
Optical computer you say...
 
Data transmission is not the same as data processing and bandwidth from component A to component B is rarely the primary limiting factor in performance.
[...]

One such example would be putting chips in/on the human brain. But that's a completely different topic.
 