Has Fdev considered building an optical computer?

Trust me, you don't want to.
The funny thing is I did as a kid in primary/elementary school, but my eyes naturally fixed themselves as part of the aging process. Kind of weird.

But yeah, you're probably right. I was sort of joking around a little bit with that post. The glasses I wore as a kid were those big, rounded rectangular, thick framed kind that were common in like the '70s and '80s. As an older teen and young adult though, I kind of missed out on getting something I might like more, like the vintage style Windsor glasses.

Not meaning to make light of people's vision difficulties.
 
The funny thing is I did as a kid in primary/elementary school, but my eyes naturally fixed themselves as part of the aging process.

Kind of weird.

My eyes did exactly the opposite. 🤓

Some years ago I tried contact lenses for some months and oh my god, that indescribable sense of freedom. I also had quite a few issues, since my prescription makes everything appear quite a bit smaller in both dimensions, so I had trouble estimating sizes and distances (everything appeared huge to me) and even recognizing myself in a mirror (the last time I saw my naked face I was 7 years old, and seeing the true size of my eyes relative to my face was kind of disorienting).

We are talking about optics stuff, so still on topic right?
 
I badly injured an eye last year, so for a few months I had an eye patch on, then for the last few months of recovery I was wearing special glasses with a blue-light suppression filter in them. Getting past them and going back to seeing the world in true colour rather than bronze-tinted was very disorienting. Although not disorienting enough to make me post a thread suggesting a small computer games studio develop optical computing and port their game(s) to the new optical computer system they knock up in their basement.
 
You do realize Frontier doesn’t own the game servers, right? They lease servers that reside in the Amazon Data Center, and they don’t even have physical access to them.
 
Sounds like a job for Geordi!!
Get busy with the photonic converters lad!! Snap to it.

 
Begrudgingly, I suppose I should post something more on topic in this thread as well...

For transferring data over some distance, fiber optics can be useful, but as far as I know for smaller scale, local computing like on a motherboard, keeping signals electricity based is more practical and efficient than having to convert the signals back and forth.

I haven't really looked into fiber optic networking for a server farm, distributed computing, or something similar, but I suppose it might make sense in that sort of scenario.
 
Since I have worked in the electronics industry for the last 25 years, and I work for a company that develops some of the leading tech hardware, I can say that the hardware we have now is not the limiting factor in what computers can do for 99% of the applications most users would use them for. Additionally, much of the software out there now does not fully utilize the resources of most modern CPUs.
Most of the hardware now is System on a Chip (SoC), where an ASIC has most of its functions within the chip package. Some chips have a single silicon die, and others have multiple dies in one package. As a result, the speed of operations is limited by the number of transistors on the die. The scale at which they are making these chips now is absolutely astounding. The current process node is marketed as 7 nm, and the next node is 4 nm (though these labels no longer correspond to a literal transistor dimension). That is obscenely small, and it allows more transistors to be packed into a device, reduces power consumption, and increases speed.
The method used for moving data between chips on a board is much different than it was 20 years ago. For the most part, parallel buses with single-ended termination are a thing of the past for high-speed applications. Nowadays they use low-voltage differential signaling (LVDS) with SerDes blocks in the chips to achieve high data rates between chips. And multi-gigabit transceivers can achieve data rates that would blow your mind, not only within a circuit board but across cables that connect different pieces of hardware. This is what the PCIe bus that connects your video card to the CPU on your motherboard uses.
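As a toy illustration of the differential-signaling idea (hypothetical code, not any real SerDes implementation): a parallel byte is serialized into a bit stream, and each bit travels as a complementary pair, with the receiver looking only at the difference between the two wires.

```python
# Toy sketch of serializing a parallel byte into a differential bit stream.
# Purely illustrative -- real SerDes hardware also handles line encoding
# (e.g. 8b/10b), clock recovery, and equalization, none of which is modeled.

def serialize_byte(value: int) -> list[tuple[int, int]]:
    """Turn one byte into 8 differential symbols (D+, D-), MSB first."""
    bits = [(value >> i) & 1 for i in range(7, -1, -1)]
    return [(b, 1 - b) for b in bits]  # D- is always the complement of D+

def deserialize(symbols: list[tuple[int, int]]) -> int:
    """Recover the byte: the receiver senses only the difference D+ - D-."""
    value = 0
    for d_plus, d_minus in symbols:
        bit = 1 if (d_plus - d_minus) > 0 else 0
        value = (value << 1) | bit
    return value

stream = serialize_byte(0xA5)
assert deserialize(stream) == 0xA5  # round-trips intact
```

The differential pair is why such links tolerate noise well: interference tends to hit both wires equally and cancels out of the difference.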

A purely optical computer is impeded by the fact that a computer functions on the principle of binary logic (0 or 1): how would one go about interpreting light intensities or frequencies as logic states that could be calculated with? At the end of the day, for the data to be displayed on a screen it would have to be converted back into electronic binary data at some point, which would negate the massive bandwidth advantage of an optical link.
Now with quantum computing they are extending the binary machine-state principle: instead of a register holding a single definite 0 or 1, a pair of qubits can exist in a superposition of 00, 01, 10, and 11 at once. This makes certain calculations much faster than traditional transistor gate logic. No doubt as the technology matures they can take this idea and build wider registers, exponentially increasing the state space without needing faster clock cycles. This would also reduce the number of gates that would need to be put into a device to achieve a given level of performance.
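For concreteness, here is what the two-qubit state space looks like under the standard textbook picture, sketched in plain Python with no quantum library assumed: a register of two qubits is described by four amplitudes, one per basis state, and a superposition carries weight on all of them at once.

```python
import math

# A 2-qubit register is described by four complex amplitudes, one per basis
# state |00>, |01>, |10>, |11>. Probabilities are the squared magnitudes.
inv_sqrt2 = 1 / math.sqrt(2)

# A single qubit after a Hadamard gate: equal superposition of |0> and |1>.
plus = [inv_sqrt2, inv_sqrt2]

# Tensor product of two such qubits: amplitudes over 00, 01, 10, 11.
state = [a * b for a in plus for b in plus]
probs = {label: abs(amp) ** 2
         for label, amp in zip(["00", "01", "10", "11"], state)}

print(probs)  # each basis state carries probability ~0.25
```

Note the contrast with a classical register: the four classical states are mutually exclusive, whereas here a single state vector holds non-zero weight on all four simultaneously.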
 
You are thinking of another form of optical computer mimicking electrical components we have now. That is completely idiotic and I have no idea why any researchers are even bothering unless they are literally stealing research money.

A computer has to compute...i.e. perform logic functions. So yes, in that sense it has to mimic the logic gates in more traditional computing.

Processing performance is largely dictated by the number of logic gates you can fit in a given area and how fast you can get them to switch. Even with switching frequencies an order of magnitude higher than are practical with electronic transistors, you couldn't fit enough photonic logic gates onto an optical integrated circuit to make competitive processors for most applications.
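A rough back-of-envelope version of that density argument (round illustrative numbers, not datasheet values): photonic components can't be shrunk much below the operating wavelength, while modern logic-cell pitches are tens of nanometres, so the area gap dwarfs any plausible speed advantage.

```python
# Back-of-envelope density comparison. The figures are round illustrative
# numbers chosen only to show the orders of magnitude involved.
wavelength_nm = 1550                 # typical telecom wavelength; a photonic
photonic_pitch_nm = wavelength_nm    # gate can't be much smaller than this
transistor_pitch_nm = 50             # rough order for a modern logic cell

area_ratio = (photonic_pitch_nm / transistor_pitch_nm) ** 2
print(f"one photonic gate occupies ~{area_ratio:.0f}x a transistor's area")

# Even a generous 10x clock-speed advantage for optics doesn't close a
# roughly thousand-fold density gap.
speedup = 10
print(f"net throughput-per-area disadvantage: ~{area_ratio / speedup:.0f}x")
```

Under these assumed numbers the gap is about three orders of magnitude in area against one order of magnitude in speed, which is the point being made above.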

Begrudgingly, I suppose I should post something more on topic in this thread as well...

For transferring data over some distance, fiber optics can be useful, but as far as I know for smaller scale, local computing like on a motherboard, keeping signals electricity based is more practical and efficient than having to convert the signals back and forth.

I haven't really looked into fiber optic networking for a server farm, distributed computing, or something similar, but I suppose it might make sense in that sort of scenario.

There are legitimate applications for all-optical interconnects, even across very short distances, as this would eliminate electricity conversion and transmission losses. There are even some compelling proposals for limited function optical processing filling targeted niches.

Some examples:

In any case, virtually no one other than Noobilite considers it practical to replace dense electronic transistor logic with optics for general purpose computing.
 
Yeah, but it would give so much leeway that they would only have to not screw up the networking. And they wouldn't have to worry about needing to use purely P2P, as they could use the current servers, or cheaper ones, as the backbone for the optical computer and even run that more cheaply. It probably adds to network cost more than anything.

And home computers could potentially be amplified with something similar too.

I imagine you could get optical connections on the PC or on an expansion card to hook something up to expand your computer.
The problem is FD set the bar so low that it lets you create an instance with a wonky ISP and a potato of a player computer without sufficient capability to host a proper instance that holds up when it gets populated by players from around the world.

AWS is an awesome service when its users all have 21st century providers, CLECs and good PCs. P2P enables us to play without a subscription. The majority of players want it this way.

Optical computers are a pipe dream. We don't have the money to do something like that yet nor the technology.

o7 Cmdr
 
Since I have worked in the electronics industry for the last 25 years, and I work for a company that develops some of the leading tech hardware, I can say that the hardware we have now is not the limiting factor in what computers can do for 99% of the applications most users would use them for. Additionally, much of the software out there now does not fully utilize the resources of most modern CPUs.
Most of the hardware now is System on a Chip (SoC), where an ASIC has most of its functions within the chip package. Some chips have a single silicon die, and others have multiple dies in one package. As a result, the speed of operations is limited by the number of transistors on the die. The scale at which they are making these chips now is absolutely astounding. The current process node is marketed as 7 nm, and the next node is 4 nm (though these labels no longer correspond to a literal transistor dimension). That is obscenely small, and it allows more transistors to be packed into a device, reduces power consumption, and increases speed.
The method used for moving data between chips on a board is much different than it was 20 years ago. For the most part, parallel buses with single-ended termination are a thing of the past for high-speed applications. Nowadays they use low-voltage differential signaling (LVDS) with SerDes blocks in the chips to achieve high data rates between chips. And multi-gigabit transceivers can achieve data rates that would blow your mind, not only within a circuit board but across cables that connect different pieces of hardware. This is what the PCIe bus that connects your video card to the CPU on your motherboard uses.

A purely optical computer is impeded by the fact that a computer functions on the principle of binary logic (0 or 1): how would one go about interpreting light intensities or frequencies as logic states that could be calculated with? At the end of the day, for the data to be displayed on a screen it would have to be converted back into electronic binary data at some point, which would negate the massive bandwidth advantage of an optical link.
Now with quantum computing they are extending the binary machine-state principle: instead of a register holding a single definite 0 or 1, a pair of qubits can exist in a superposition of 00, 01, 10, and 11 at once. This makes certain calculations much faster than traditional transistor gate logic. No doubt as the technology matures they can take this idea and build wider registers, exponentially increasing the state space without needing faster clock cycles. This would also reduce the number of gates that would need to be put into a device to achieve a given level of performance.
we have the best forum. o7 🤓
 
Yeah, this thread has become something of an entertaining, nonsensical rabbit hole. At first I thought the OP was some kind of savant, and I was actually going to support him... That was until I saw the huge glaring omissions in his theoretical design, and the comment:

You are thinking of another form of optical computer mimicking electrical components we have now. That is completely idiotic and I have no idea why any researchers are even bothering unless they are literally stealing research money. It's much slower too unless you really want to package a lot of smaller stuff into CPU dies...

And I was like "computers without logic gates, hmmm..." Then the onionhead allegations came in and I was like, "very likely", especially when you take into consideration that he appears to think he has single-handedly worked out the theory of optical computing that has seemingly eluded the "disingenuous researchers" whom he accuses of, or at least implies, "are essentially stealing research money". Messiah complex? So universities cannot crack this by throwing money at experts in the field, yet he fully expects a small software company to plough its resources into mastering it for the benefit of one of their games. OK. Or maybe he is the genius he thinks he is and we are all wrong, but on the balance of probabilities I don't see that happening.

Seriously @Noobilite - I'm dissing you here based on how your posts in this thread have portrayed you. I would like to be wrong on this, so please go and master your concepts, prove they work, get some publicity, and necro this thread to give us all a big and deservedly sanctimonious, borderline smarmy "I TOLD YOU SO!!!!".
 
Fun fact - The name ‘Frontier Developments’ is not a reference to game development and Frontier: Elite II, but is actually in reference to the company’s main focus of cutting edge technology development. 😀
 