24 GiB RAM. Hathankyouverymuch

Well, the latest bottleneck in my PC has now shifted again thanks to buying 16 GiB of RAM from Scan. Installed and works as expected. Fear my 24 GiB.


That is all. :D :D
 
Like everything, it depends on where the bottleneck is. With hindsight, I kinda wish I hadn't got so many Skylake systems and had gone with Haswell-E on at least one of them. The application? Prime number finding. High-end consumer quads are more RAM-bandwidth limited than CPU limited.
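Not from the post itself, but a minimal sketch of why prime sieving tends to be memory-bound: a Sieve of Eratosthenes streams through an array far larger than the CPU caches, so the cores mostly wait on RAM rather than on arithmetic. The function name and structure here are illustrative.

```python
def sieve(limit):
    # One byte per candidate. Once `limit` dwarfs the cache size, the
    # marking passes below stream through main memory, so throughput is
    # set by RAM bandwidth rather than by how fast the ALUs can run.
    is_prime = bytearray([1]) * (limit + 1)
    is_prime[0:2] = b"\x00\x00"  # 0 and 1 are not prime
    for p in range(2, int(limit ** 0.5) + 1):
        if is_prime[p]:
            # Slice assignment zeroes every multiple of p in one
            # sequential sweep over the array.
            is_prime[p * p :: p] = bytes(len(range(p * p, limit + 1, p)))
    return [i for i in range(limit + 1) if is_prime[i]]
```

This is why a quad-channel platform like Haswell-E can beat a nominally faster dual-channel consumer quad on this kind of workload.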
 

With my Haswell and my RAM in dual channel, I am a happy man (for the moment).
 
For something like 99% of the cases, that is fine. I just happened to be interested in the 1% that it makes a big difference in. I need lower cost interests...
 
And now you can wait for the game that actually needs more than 16 GB of RAM... ;)
Even 8 GB is more than most games will ever need.
 
Indeed. My mate's first PC (the kind with the 'Turbo' button to boost the CPU) had 4 MB of RAM and a 120 MB HDD. We thought it was amazing.

Such young people around here. My first computer was an Ohio Superboard with 4 KByte of memory. Later I had a BBC, where we first met DB and Elite; being 8-bit, it could only directly address 64 KByte of RAM (256 × 256 addresses).
When we got IBM compatibles we wondered how we were ever going to use all of the 640 KByte available on 16-bit systems. We soon found out: programmers got lazy and business wanted instant results, so nobody optimised code anymore.
If you played games you soon learnt how to write AUTOEXEC.BAT and CONFIG.SYS files to put drivers etc. above the 640 KByte limit, which wasn't helped by the fact that there were two competing schemes, "Expanded" and "Extended" memory. So changing from one game to another usually meant re-booting with a different floppy.
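For anyone who never had to do this juggling: a CONFIG.SYS of the era typically looked something like the sketch below, loading HIMEM.SYS for extended memory, EMM386 for expanded memory and upper memory blocks, and pushing drivers above the 640 KByte line. The paths and the mouse driver name are illustrative, not from the post.

```
DEVICE=C:\DOS\HIMEM.SYS
DEVICE=C:\DOS\EMM386.EXE RAM
DOS=HIGH,UMB
DEVICEHIGH=C:\DOS\MOUSE.SYS
```

A game that wanted EMS needed the `RAM` switch on EMM386, while one that wanted maximum conventional memory often needed EMM386 gone entirely, hence the pile of boot floppies.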

Never believe those stories that say it was better in the good old days.
 

The good old days were, all the same, a delightful period in computing.

;)
 


Cheers! It's been a while since I was called young. I'll take it! :)

Yeah, my first computer was a ZX81, which was certainly entertaining if you're sadistic. Sadly the pattern continues to this day: the more resources we are given, the less 'streamlined' code seems to become. Or perhaps I'm being unkind; as the quality and quantity of improvements increase over the years, surely the difficulty of coding it all also increases at least at the same rate, if not exponentially?
 
In mainframe shops the answer is ALWAYS more CPU; optimising code is never an option (it must therefore be perfect all the time). PCs just follow the trend...

My brief period of technical masochism was writing RISC assembler...
 

Well, not always - most mainframe shops already have the CPU there; you just need the licence and an IBM site engineer to take out a screw :D

Entirely depends upon the workload though, and whether you lean zIIP or zAAP.
 