Maybe one day when I'm really, really old I'll get to wear cool looking glasses.
Trust me, you don't want to.
The funny thing is I did as a kid in primary/elementary school, but my eyes naturally fixed themselves as I aged. Kind of weird.
But that's gonna change when they develop @Noobilite's optical computer concept ;-)

You do realize Frontier doesn't own the game servers, right? They lease servers that reside in an Amazon data center, and they don't even have physical access to them.
Damn, you know what? I bet that didn't even cross their minds... Huh.

Has FDev considered building an optical computer?
You are thinking of another form of optical computer mimicking electrical components we have now. That is completely idiotic and I have no idea why any researchers are even bothering unless they are literally stealing research money.
Begrudgingly, I suppose I should post something more on topic in this thread as well...
For transferring data over some distance, fiber optics can be useful, but as far as I know, for smaller-scale local computing like on a motherboard, keeping signals electricity-based is more practical and efficient than having to convert them back and forth.
I haven't really looked into fiber optic networking for a server farm, distributed computing, or something similar, but I suppose it might make sense in that sort of scenario.
The problem is FD set the bar low, which lets you create an instance with a wonky ISP and a potato player computer with insufficient capability to host a proper instance that holds up when it gets populated by players around the world.

Yeah, but it would give so much leeway that they'd only have to not screw up the networking. And they wouldn't have to worry about relying on pure P2P, as they could use the current servers, or cheaper ones, as the backbone for the optical computer and even run that cheaper. It probably adds to network cost more than anything.
And home computers could potentially be boosted with something similar, too.
I imagine you could get optical connections on the PC or on an expansion card to hook something up to expand your computer.
We have the best forum. o7

Since I have worked in the electronics industry for the last 25 years, and I work for a company that develops some of the leading tech hardware, I can say that the hardware we have now is not the limiting factor in what computers can do for 99% of the applications that most users would use them for. Additionally, much of the software out there now does not fully utilize the resources of most modern CPUs.
Most of the hardware now is System on a Chip (SoC), where an ASIC has most of its functions within the chip package. Some chips have a single silicon die, and others have multiple dies in one package. As a result, the speed of operations is limited by the number of transistors on the die. The scale at which they are making these chips now is absolutely astounding. Current processes are marketed as 7nm, and the next node is 4nm (the names are no longer literal transistor dimensions, but they track real shrinks). That is obscenely small, and it allows more transistors to be packed into a device, reduces power consumption, and increases speed.
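As a rough back-of-envelope illustration of why node shrinks matter: if feature pitch really scaled linearly with the node name (it doesn't exactly — "7nm" and "4nm" are process names, not measured transistor sizes), density would grow with the inverse square of the node. A minimal sketch under that idealized assumption:

```python
# Idealized density scaling: transistors per unit area goes with the
# inverse square of the (assumed-linear) feature size. Real node names
# like "7nm"/"4nm" are marketing labels, so treat this as a rough bound.

def relative_density(old_nm: float, new_nm: float) -> float:
    """Idealized transistor-density gain going from old_nm to new_nm."""
    return (old_nm / new_nm) ** 2

print(relative_density(7, 4))  # 3.0625 -> roughly 3x more transistors per area
```

In practice the real density gain per node is smaller than this ideal figure, but the square-law intuition is why even modest-looking shrinks pay off.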
The method used for moving data between chips on a board is much different than it was 20 years ago. For the most part, parallel buses with single-ended termination are a thing of the past for high-speed applications. Nowadays they use Low-Voltage Differential Signaling (LVDS) with SERDES gates in the chips to achieve high data rates between chips. And Multi-Gigabit Transceivers can achieve data rates that would blow your mind, not only within a circuit board but across cables that connect different pieces of hardware. This is what the PCIe bus that connects your video card to the CPU on your motherboard uses.
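To put numbers on those serial links: a PCIe 4.0 lane runs at 16 GT/s with 128b/130b line encoding, so usable throughput is the raw rate minus encoding overhead, multiplied by the lane count. A quick sketch of that arithmetic (the 16 GT/s and 128b/130b figures are the PCIe 4.0 values; adjust for other generations):

```python
# Goodput of a SERDES-based link like PCIe: raw transfer rate per lane,
# scaled by the line-encoding efficiency (128 payload bits per 130 line
# bits for PCIe 3.0/4.0/5.0), summed across lanes.

def pcie_goodput_gbps(gt_per_s: float, lanes: int,
                      enc_payload: int = 128, enc_total: int = 130) -> float:
    """Usable bandwidth in GB/s after line-encoding overhead."""
    usable_bits = gt_per_s * 1e9 * lanes * enc_payload / enc_total
    return usable_bits / 8 / 1e9  # bits -> bytes -> GB

print(round(pcie_goodput_gbps(16, 1), 2))   # ~1.97 GB/s per Gen4 lane
print(round(pcie_goodput_gbps(16, 16), 1))  # ~31.5 GB/s for a x16 Gen4 slot
```

That ~31.5 GB/s for a sixteen-lane Gen4 slot is the "blow your mind" number: it's carried over a few differential pairs rather than the wide parallel buses of the past.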
A purely optical computer is impeded by the fact that a computer functions on the principle of binary logic (0 or 1), and how would one go about interpreting light frequencies into logic states that could be calculated? At the end of the day, for the data to be displayed on a screen it would have to be converted back into binary data at some point, which would negate the massive bandwidth advantage of an optical link.
Now with quantum computing they are going beyond the binary machine-state principle. Instead of a single 0 or 1, a pair of qubits can hold a superposition spanning 00, 01, 10, and 11 at once. This makes certain calculations faster than traditional transistor gate logic. No doubt as the technology matures they can widen registers to 4, 8, or 16 qubits, with the state space growing exponentially with each added qubit, increasing performance without needing faster clock cycles. This would also reduce the number of gates that would need to be put into a device to achieve a given level of performance.
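The growth described above is easy to make concrete: an n-qubit register spans 2^n basis states, so each added qubit doubles the state space. A tiny sketch of that counting:

```python
# An n-qubit register has 2**n basis states available to a superposition
# (2 qubits -> 00, 01, 10, 11). Each extra qubit doubles the state space,
# which is the exponential scaling the post refers to.

def basis_states(n_qubits: int) -> int:
    """Number of computational basis states for an n-qubit register."""
    return 2 ** n_qubits

for n in (1, 2, 4, 8, 16):
    print(n, basis_states(n))  # 2, 4, 16, 256, 65536
```

The caveat (which the enthusiasm above glosses over) is that reading the register out still collapses it to one classical n-bit result, so the exponential state space only helps for algorithms designed to exploit it.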
I do what I want!

And don't pee into the wind!
It's much slower too, unless you really want to package a lot of smaller stuff into CPU dies...
Feel free to shed some light on the matter...

As someone working as an administrator for a university high-performance computing cluster, I just want to say that this thread has been absolutely hilarious. Thanks to everyone for participating.
Awesome! Btw, why are we discussing FDev hardware in an ED forum?