> Crossover, the tech could be used on the next generation of DC.

Awesome! Btw, why are we discussing FDEV hardware in an ED forum?
Optical DC FTW!
(Smashes into wall).
> Docking Computer?

Crossover, the tech could be used on the next generation of DC.
> Docking Computer?

Yes, only available in Wyrd in open.
Since I have worked in the electronics industry for the last 25 years, for a company that develops some of the leading tech hardware, I can say that the hardware we have now is not the limiting factor in what computers can do for 99% of the applications most users run. Additionally, much of the software out there does not fully utilize the resources of most modern CPUs.
Most of the hardware now is System on a Chip (SoC), where an ASIC has most of its functions within the chip package. Some chips have a single silicon die, and others have multiple dies in one package. As a result, the speed of operations is limited by the number of transistors on the die. The scale at which they are making these chips now is absolutely astounding: current processes are marketed as 7nm nodes, and the next node is 4nm. That is obscenely small, and it allows more transistors to be packed into a device, reduces power consumption, and increases speed.
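To put rough numbers on that shrink: if every dimension scaled linearly, transistor density would grow with the square of the feature-size ratio. This is only a back-of-envelope sketch; node names like "7nm" are marketing labels rather than literal transistor sizes, so treat the result as an idealized upper bound.

```python
# Back-of-envelope: ideal area scaling between process nodes.
# Node names are marketing labels, not literal dimensions, so this
# is an upper bound on the real density gain, not a measurement.

def ideal_density_gain(old_nm: float, new_nm: float) -> float:
    """If all dimensions shrank linearly, transistor density
    scales with the square of the feature-size ratio."""
    return (old_nm / new_nm) ** 2

gain = ideal_density_gain(7, 4)
print(f"{gain:.2f}x")  # ~3.06x more transistors in the same area
```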
The method used for moving data between chips on a board is much different than it was 20 years ago. For the most part, parallel buses with single-ended termination are a thing of the past for high-speed applications. Nowadays designs use Low Voltage Differential Signaling (LVDS) with SERDES (serializer/deserializer) blocks in the chips to achieve high data rates between chips. Multi-Gigabit Transceivers can achieve data rates that would blow your mind, not only within a circuit board but across cables that connect different pieces of hardware. This is what the PCIe bus that connects your video card to the CPU on your motherboard uses.
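The core SERDES idea, turning a parallel word into a serial bit stream and back, can be sketched in a few lines. The `serialize`/`deserialize` helpers below are hypothetical toys: real links such as PCIe add line coding (8b/10b or 128b/130b), embedded clocking, and differential electrical signalling, all of which this sketch omits.

```python
# Toy SERDES: parallel bytes -> serial bit stream -> parallel bytes.
# Real transceivers add line coding, clock recovery, and differential
# signalling; this only shows the serialization round trip itself.

def serialize(data: bytes) -> list[int]:
    """Emit bits MSB-first, one per symbol time."""
    bits = []
    for byte in data:
        for i in range(7, -1, -1):
            bits.append((byte >> i) & 1)
    return bits

def deserialize(bits: list[int]) -> bytes:
    """Reassemble 8-bit words from the serial stream."""
    out = bytearray()
    for i in range(0, len(bits), 8):
        byte = 0
        for b in bits[i:i + 8]:
            byte = (byte << 1) | b
        out.append(byte)
    return bytes(out)

payload = b"PCIe"
assert deserialize(serialize(payload)) == payload
```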
A purely optical computer is impeded by the fact that a computer functions on the principle of binary logic (0 or 1), and how would one go about interpreting light frequencies into logic states that could be calculated? At the end of the day, for the data to be displayed on a screen it would have to be converted back into binary electrical data at some point, which would negate the massive bandwidth advantage of an optical link.
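One hypothetical answer to "how do you turn light frequencies into logic states" is wavelength-division detection: each wavelength channel carries one bit, and a photodetector reading above a threshold counts as a 1. The threshold value and channel layout below are made up for illustration; this shows the conversion step the paragraph describes, not a real optical-computing design.

```python
# Illustrative only: map per-wavelength optical power readings to bits.
THRESHOLD_MW = 0.5  # assumed detector threshold, in milliwatts

def detect_bits(channel_power_mw: list[float]) -> list[int]:
    """Convert one power reading per wavelength channel into binary."""
    return [1 if p >= THRESHOLD_MW else 0 for p in channel_power_mw]

# Four channels, e.g. hypothetical 1530/1550/1570/1590 nm carriers:
readings = [0.9, 0.1, 0.7, 0.0]
print(detect_bits(readings))  # [1, 0, 1, 0]
```

Of course, this is exactly the optical-to-electrical conversion the post says would eat the bandwidth advantage.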
Now with quantum computing they are extending the binary machine-state principle. Instead of a single 0 or 1, a pair of qubits spans all four basis states 00, 01, 10, 11 at once, and n qubits span 2^n states. This can make certain calculations faster than traditional transistor gate logic. No doubt as the technology matures they can scale this idea up to 4-qubit, 8-qubit, 16-qubit registers, each added qubit doubling the state space and exponentially increasing the performance available for suitable problems without needing faster clock cycles. This would also reduce the number of gates that would need to be put into a device to achieve a given level of performance.
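The "00, 01, 10, 11 at once" point can be made concrete with a tiny state-vector sketch: a two-qubit state is a vector of 2^n = 4 amplitudes, and a Hadamard on each qubit spreads amplitude uniformly over all four basis states. This illustrates the exponential state space only, not a blanket speedup; `hadamard_all` is a made-up helper, not a real quantum-library API.

```python
# Minimal sketch: n qubits -> a vector of 2**n amplitudes.
import itertools
import math

def hadamard_all(n: int) -> list[float]:
    """State vector after applying H to each of n qubits starting
    from |00...0>: uniform amplitude 1/sqrt(2**n) on every basis state."""
    amp = 1 / math.sqrt(2 ** n)
    return [amp] * (2 ** n)

state = hadamard_all(2)
for bits, amp in zip(itertools.product("01", repeat=2), state):
    print("".join(bits), f"amplitude={amp:.4f}, probability={amp * amp:.2f}")
# each of 00, 01, 10, 11 carries probability 0.25
```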
Pure optical computers wouldn't need logic gates in the traditional form. You could send the light to specialized equipment to parse the data out as needed, at least potentially. You could deal with it in far more ways than with normal modern logic gates, since you could handle much more data at once (and it would need different manipulations). Not to mention you could hypothetically swap the hardware out more easily, depending on how it's constructed, and either change it or add on for new outputs. It would open up a bunch of new methods for computer design. Treating it purely as traditional hardware would be a bit restrictive, unless you can use/keep the part forever, and hope it's not overly expensive, doesn't become obsolete with an upgrade, and is reusable for other applications. A proper optical computer should be basically completely open-ended, adjustable and upgradable.

Yeah, this thread has become something of an entertaining nonsensical rabbithole. At first I thought the OP was some kind of savant and was actually going to support him... That was until I saw the huge glaring omissions in his theoretical design, and the comment:
And I was like, "computers without logic gates, hmmm..." Then the onionhead allegations came in and I was like, "very likely", especially when you take into consideration he appears to think he has singlehandedly worked out the theory of optical computing that has seemingly eluded the "disingenuous researchers" who he accuses of, or at least infers, "are essentially stealing research money". Messiah complex? So universities cannot crack this by throwing money at experts in the field, and yet he fully expects a small software company to plough its resources into mastering it for the benefit of one of their games, ok. Or maybe he is the genius he thinks he is and we are all wrong, but on the balance of probabilities I don't see that happening.
Seriously @Noobilite - I'm dissing you here based on how your posts in this thread have portrayed you, and I would like to be wrong on this. Please go and master your concepts, prove they work, get some publicity, and necro this thread to give us all a big, deservedly sanctimonious, borderline smarmy "I TOLD YOU SO!!!!".
> Something I'm assuming everyone now is too lazy to bother with or doesn't care about for various reasons.

And you're so capable, diligent and professional that you deem non-optical computer research laziness? Or are the rest of us just too stupid or lazy to understand your wonderful ideas, which:
> Pure optical computers wouldn't need logic gates in the traditional form. You can send it to specialized equipment to parse the data out as is needed.

Thus introducing more lag by having to translate from light to data and back again...
> Also, how would an optical computer work on a very foggy morning? Rather bad I suppose, oh what a misfortune that could be.

Don't know - I'm completely mystified by it.
So this is how it goes.
Innocent nonsense forum post -> technological breakthrough -> optical rogue AI -> Butlerian Jihad -> a docking computer flying me into a wall in 3306.
@Noobilite All you have described is an optical-to-digital modem; you've not given any details on how you intend to PROCESS any information you encode into your optical computer. Also, how do you STORE data as light? Hell, how do you even generate light as an input or output? You said micro LEDs; what's driving them? Remember that LEDs are Light Emitting Diodes, ELECTRONIC devices, so your proposed optical computer needs electronics to control the first part of its process, and electronics to decode the output colour/wavelength/frequency* of light arising from the output. You say miniaturisation isn't needed; are you sure you want to go back to computers the size of houses?
In summary, everything you have said thus far about your proposed optical computing model, which you propose Frontier develop in their basement for this game because duh, it's so easy and computer scientists are lazily misappropriating research funds, is that:
It cannot store data
It cannot process data
It uses different coloured light going through the optical pathways
It uses (micro)LEDs as its light source - thus requiring more electronics
It passes "data", the light generated from those micro LEDs, to electronic detectors and onto other "specialised equipment" in an open loop
Thus:
It has two transducers, input and output, each adding more lag to the "throughput" of data
It is essentially a communications protocol, not an actual computing platform
ERGO it must slow down computing rather than enhance it.
The more you type, the sillier you are making yourself look.
> 1. = You saying no no its not that but yes it is you saying computer science as it is just now is a grand larcenous misappropriation of funds

1. No, I'm referring to the fact that stuff like this is not done to peak performance for all applications, for practical reasons potentially. There is always more room, and lots of it. Not to mention convenience for the people doing the work, and potentially dealing with pay vs education levels. (So, basically yes.)
2. Not if it's all light-based, like a monitor knowing how to produce the correct end result purely from light, with the same light sent on and directly manipulated. You can duplicate and send the full data stream and have the hardware physically remove/manipulate it and send the correct light directly to the monitor, basically.
I'm saying monitors could have it built into them as specialized hardware and be sent simple raw signals. It depends on the hardware design. Optical leaves a wider range of options at its base, as you can maintain speed and infinitely split the stream, potentially.
2. = You saying your optical computer is going to work like a computer monitor by manipulating light. Newsflash, sonny: monitors work off electricity until their final output of light; there is no manipulation of light in a monitor.
OR are you saying you will manipulate the light on its route to the monitor? Tell me how you are going to do this. Tell me how you are going to store light to be able to use its data later. Tell me how you are going to use light to drive the monitor; translating it to electrical signals for the journey from your optical computer to the monitor is not an option, as it would, as I've already stated, induce lag and erode performance.
People are working on this.
Optical computing performs calculations using photons produced by lasers or laser diodes, rather than the electrons of a conventional machine. Light has been used to carry data (fiber optics) since the mid-sixties, and photons have long promised higher bandwidth than electrons.
Most research projects focus on replacing current computer components with optical equivalents, yielding a digital computer system that processes binary data. This approach appears to offer the best short-term commercial prospects, since optical components could sit alongside electronics in hybrid optoelectronic devices. However, optoelectronic devices lose roughly 30% of their energy converting electronic energy into photons and back, and the conversion also slows the transmission of messages. All-optical computers would eliminate the need for these optical-electrical-optical conversions. [1]
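To see why those conversions matter, assume (illustratively) that each electrical-optical-electrical round trip costs about 30% of the signal energy; losses then compound across hybrid stages. The per-stage figure and the helper below are assumptions for the sake of the arithmetic, not measured data.

```python
# Compounding conversion loss across hybrid optoelectronic stages,
# assuming an illustrative ~30% energy cost per round trip.

def energy_retained(stages: int, loss_per_stage: float = 0.30) -> float:
    """Fraction of signal energy left after the given number of
    electrical->optical->electrical conversion stages."""
    return (1 - loss_per_stage) ** stages

for k in (1, 2, 3):
    print(f"{k} stage(s): {energy_retained(k):.1%} retained")
# 1 stage: 70.0%, 2 stages: 49.0%, 3 stages: 34.3%
```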
Application-specific devices such as synthetic aperture radar (SAR) processors and optical correlators have been designed to use the principles of optical computing. For example, optical correlators can be used to detect and track objects [2] and to classify serial time-domain optical data. [3]