OK. So, many years back, when clock frequencies were approaching roughly 3 GHz, I assumed that the physical constraints of chip design and manufacturing (you can't etch a transistor narrower than a single atom, no matter how good your lithography is) meant it was all about to be over: computers weren't going to get any quicker, and that would be the end of that.
Of course, what happened next was that computer engineers simply added a second core to the CPU and dialled down the frequency of each core, so that the chip could still be a little bit faster than the previous year's chip, and that's essentially what has been happening ever since.
Extra cores keep getting added, and the overall performance of the CPU as a product has kept improving on roughly the same Moore's-law cadence as it had for decades before, with the incremental improvement every couple of years kept remarkably steady (for economic reasons, I expect, but that's not the discussion I want to have here).
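To make the core-count point a bit more concrete, here's a rough back-of-the-envelope sketch in Python (the generation numbers are made up purely for illustration, not taken from any real chip): adding cores keeps raising a chip's aggregate throughput even while per-core clocks sit still, but Amdahl's law caps how much of that a single program ever sees unless it parallelises well.

# Hypothetical CPU generations: (core count, per-core clock in GHz)
generations = [(1, 3.0), (2, 2.8), (4, 3.0), (8, 3.2), (16, 3.4)]

def amdahl_speedup(parallel_fraction: float, cores: int) -> float:
    """Upper bound on speedup when only part of the work parallelises."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

for cores, clock in generations:
    raw = cores * clock                   # naive "total GHz" across all cores
    bounded = amdahl_speedup(0.9, cores)  # a workload that is 90% parallel
    print(f"{cores:2d} cores @ {clock} GHz -> "
          f"raw throughput ~{raw:4.1f} GHz-equivalents, "
          f"Amdahl-limited speedup x{bounded:.2f}")

Run that and you'll see the naive total keeps climbing with core count, while the speedup for a partly serial program flattens out well before it, which is roughly why "more cores" isn't the same thing as "faster" for every piece of software.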
However, I read a headline the other month saying that a senior Nvidia engineer had recently stated that they had hit a 'brick wall' with regard to processing power.
So is it the case that chip manufacturers can no longer add more physical cores to our CPUs and GPUs? Has the industry reached, or will it soon reach, the point where we can no longer manufacture faster computers? And what do people think that would mean for the industry? Not just the hardware industry, but also the gaming industry, which I think we can safely assume has been one of its primary driving forces.
I'm hoping this will generate an interesting, thought-provoking and informative discussion. I know there are plenty of people on these forums who know far more about computer engineering than I do, and also plenty who know less than me and who would benefit just as much from reading such a discussion and hearing better-informed viewpoints on the matter.
Some of the questions I would pose: do we really need faster computers now anyway? Should we instead be trying to make computers that are more energy-efficient rather than faster? Or designed with recyclability in mind, rather than cheap, efficient disposability being the primary manufacturing goal?
Obviously, a business computer that only needs to run a web browser and some office software doesn't need the processing power of a gaming rig, so energy can be saved there by not fitting such fast clocks, memory and storage. The business computer I'm typing this on draws at most 30 W including its 26" monitor, and it's still far more powerful than I need, IMHO. But what about games? So many of them no longer demand high-powered graphics, focusing more on gameplay instead, and keeping development time manageable is an important factor too. Graphics have been improved upon year after year ever since the first computer game and have always been the primary selling point, so does this mean they are no longer as important as they were, and no longer able to drive the hardware industry into manufacturing ever more powerful and graphically capable computers anyway?
Please let's stick to conventional computing as we know it, and not deviate too far into quantum computing or other esoteric computer types that may be theoretical rather than practical.