Has Fdev considered building an optical computer?

Talking of which, we all seem to agree the OP is not as clever as he pretends to be, so it's a fake intellect he's presenting here... MAYBE, he's the Guardian Artificial Intelligence?
 
More likely to be Guardian technology for optical computing....

 
But will it run Crysis?

Now please don't be silly. On to a more serious question on the subject of optics computing:

If a bonfire on top of a hill 10 km away is relaying optical data with a maximal theoretical throughput of 250 Gigaembers per second in the white/yellow spectrum, how long would it take to download the entire Red Dead Redemption 2 through a kaleidoscope with a 50 mm aperture? And how long would the kaleidoscope tube have to be to store the entire game?

Asking relevant questions here, for science.
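For science indeed, here's a back-of-envelope sketch. The install size (~120 GB) and the generous 1 ember = 1 bit conversion are my own assumptions, not canon:

```python
# Back-of-envelope: time to download a ~120 GB game over a
# 250 Gigaember/s bonfire link, assuming 1 ember == 1 bit.
GAME_SIZE_BYTES = 120 * 10**9    # assumed install size: 120 GB
LINK_RATE_EMBERS = 250 * 10**9   # 250 Gigaembers per second
BITS_PER_EMBER = 1               # generous assumption

game_size_bits = GAME_SIZE_BYTES * 8
download_seconds = game_size_bits / (LINK_RATE_EMBERS * BITS_PER_EMBER)
print(f"Download time: {download_seconds:.2f} s")

# To "store" the game in flight inside the kaleidoscope tube, the tube
# must hold the whole bitstream between its ends at light speed:
# length = c * download_time.
C = 299_792_458                  # speed of light, m/s
tube_length_m = C * download_seconds
print(f"Tube length: {tube_length_m / 1000:.0f} km")
```

So even at full ember rate you'd need a tube over a million kilometres long, which may be hard to fit on the hill.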
 

2.8%
 
I thought that "grainy potato" could have been a more accurate estimate, but yes, 2.8% seems plausible enough. You can't beat the scientific method.
 
Exaflops was from the article about the new frontier supercomputer. Those were their words.

And if you drop the standards, could this be used as a variable form of RAM to expand system RAM, or add another effective cache layer to parts of an existing system? Even if you don't get maximum throughput, couldn't you use it to do variable forms of calculation on the fly for different applications?
 
Because you could simplify the data down to image data and possibly simplify the method of manipulating it quite a bit.

so let's assume for a picosecond you build that computer. it would take decades of international space agency (or military) levels of investment and research, but you weren't going to be intimidated by that. you also would have had to fend off governments, shady agencies and corporations from all over the world who would want to get their hands on such a device, and would gladly nuke a country or a dozen for it if that's what it took (and with good reason, because among other things it would probably break much of their precious encryption), and you would have shipped that thing to cambridge regardless. in a pizza box. to host their game. like a boss.

I haven't heard you or anyone else go into a single specific on the software side

yeah, let's talk about that. because you haven't yet explained the bit "It seems all of their problems stems from servers and computer limitations.", which is clearly a false statement (simply because of the 'all' bit in it). so what specific problems are those? and what limitations do you think they are hitting?

because afaik most problems frontier has in elite:dangerous (cheating, network performance, low quality, lack of content in a lifestyle game, improvised design leading to irrational codebase (TM)(braben)) have been solved for years by several other game developers with commercial-grade hardware and industry-standard software development methods. i mean, doing their job.

so, unless i'm missing something here, i'm afraid all that effort to build your optoputer3000 would be for nothing. well, the loading screen would probably be a bit faster. maybe they could store the whole galaxy in a pizza box, but it would still have to get to you over a leased wire, to run on your conventional processors. i'm afraid you wouldn't see much difference. it will cost you years of gdp of several countries and probably a few wars, though. are you sure you want to go through with this?
 
While you're here - how did you go about licking Thargoids without electrocuting yourself? Just something that came to mind earlier in this thread...

Whilst I bow to RELSPI's expert knowledge on such matters, lore-wise I've never heard of electrocution being an issue; temperature may be more of a concern, as something like this may happen...

[image: 9SFEd8P.jpg]




I don't know but it's probably better than my internet connection at the moment.


In the dim, distant past, when I was a student, I vaguely recall talk of optical computers being the next big thing. That was a long time ago, back when blue LEDs were new, rare and much more expensive than regular red, green or yellow ones. It seems to be one of those technologies, like nuclear fusion, that always seems to be x years away from being commercially viable.

I'm no expert in the field, but my gut feeling is that a purely optical computer can't easily be made from off-the-shelf components. Even if it were possible and happened in Cambridge, it is vanishingly unlikely that FDev would be the company behind it. They simply don't have the expertise to do it. I'm not knocking FDev; any rational person can see that such things are not in their purview. They are a small/medium games software developer/publisher: they make theme park sims and ED, and may have some ability to make other genres of game. They employ coders and various creative types such as artists and animators; there is zero expectation that any of them need to be electronic engineers specialising in opto-electronics.

IIRC one of the big issues holding back optical computers is the need to ditch silicon, as it can't be made into an emitter, so it would be necessary to switch to slightly more exotic semiconductors like GaAs.
 
I’m still laughing at exaflops.

Oh, that's actually a real word.
FLOPs == Floating-point operations per second

'Exa-' is a number prefix -> Mega-, Giga-, Tera-, Peta-, Exa-, etc.
'Exascale' is the current target and buzzword in high performance computing; as a single multi-socket server can now do well over a TeraFLOP/s and most Clusters have a peak performance somewhere in the PetaFLOP range, exceeding one ExaFLOP/s is obviously the next step.

E.g.: The current leading system of the HPC top 500 is the 'Summit' cluster at Oak Ridge, with ~2.5 million CPU cores and a theoretical peak performance of ~200 PetaFLOP/s.
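For scale, here's a minimal sketch of those prefixes as powers of ten, using the ~200 PetaFLOP/s Summit figure quoted above (the prefix factors are standard SI; the comparison is just arithmetic):

```python
# SI prefixes used for FLOP/s ratings, as powers of 10.
PREFIX = {"Mega": 10**6, "Giga": 10**9, "Tera": 10**12,
          "Peta": 10**15, "Exa": 10**18}

# Summit's theoretical peak, ~200 PetaFLOP/s.
summit_flops = 200 * PREFIX["Peta"]
print(f"Summit peak: {summit_flops / PREFIX['Exa']:.1f} ExaFLOP/s")

# So "exascale" means roughly 5x Summit's theoretical peak.
print(f"Shortfall to exascale: {PREFIX['Exa'] / summit_flops:.0f}x")
```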
 
I know, but it still sounds ridiculous.
 
Wouldn't these sorts of things be useful for any software where you need a lot of computation and the end result is a relatively simple answer? Or where you can tolerate the lag because you don't need the result live? Not sure what that would be, though. Would it aid in graphics processing of very large projects, say, if you had really large separate storage for something like a movie? Or medical research?

I thought there were ways to, in essence, get complex results by simply applying them to a monitor output instead of to storage or a data value. If you don't need to do certain things with the end result, does it matter?
 