In the rich EU, where everything used to be plentiful, we now face a winter in which people will find it difficult to heat their homes, and in which power providers warn there may be times with no electricity at all. It's not that the EU won't have energy; it's just that 15% of it comes from Russia or Ukraine. Still, it shows how dependent on energy we have become. There is no substitute for energy, so even relatively small changes in supply make the price skyrocket, because demand constantly increases.
It takes energy to flip a bit. There is even a theoretical minimum amount of energy needed, called the Landauer limit. Today we are nowhere near flipping bits that efficiently. Technology is constantly being developed that increases efficiency, but at the same time the number of bits flipped worldwide is also increasing, roughly following Moore's Law. Combined, the global energy use related to IT is growing exponentially, because the advances in technology don't make up for the increase in data.
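(For the curious: the Landauer limit is E = k_B · T · ln 2 per bit. Here's a little back-of-the-envelope calculation, assuming room temperature of about 300 K, just to show how tiny that number is:)

    # Back-of-the-envelope: the Landauer limit, E = k_B * T * ln(2), is the
    # theoretical minimum energy needed to erase/flip one bit of information.
    import math

    k_B = 1.380649e-23  # Boltzmann constant, in J/K
    T = 300.0           # assumed room temperature, in K

    energy_per_bit = k_B * T * math.log(2)
    print(f"Landauer limit at {T:.0f} K: {energy_per_bit:.2e} J per bit flip")
    # prints roughly 2.87e-21 J -- real hardware uses many orders of magnitude more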
Today, roughly 10% of all the electricity used in the world goes to IT. A third of that is used in data centres, a third for transferring data, and a third in personal devices like smartphones or PCs. Those 10% will become 20% in roughly three years' time, then 40%, 80%, 160%... I hope y'all get the picture.

Electricity is not the same as total energy use. Globally we use roughly five times as much energy as we use electricity, so that 10% of electricity "only" amounts to 2% of total energy, but those 2% become 4%, 8%, 16%, 32%, 64% and finally 128% after six doublings, or in less than 20 years! That surely won't happen, but it's remarkable how little we normally talk about it.
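To make the arithmetic explicit, here's the same naive projection in a few lines of Python (assuming a constant doubling time of three years and a starting share of 2%, both rough estimates):

    # Naive projection of IT's share of total global energy use,
    # starting at ~2% and doubling every three years.
    share = 2.0          # percent of total global energy use (rough estimate)
    doubling_years = 3   # assumed doubling time

    for n in range(1, 7):
        share *= 2
        print(f"after {n * doubling_years:2d} years: {share:6.1f}%")
    # ...ends at 128% after 18 years -- obviously impossible, which is the point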
This is an "issue". We simply can't "produce" energy enough to keep having faster and faster hardware. It needs to be efficient rather than fast.
Edit: Btw., that includes writing more efficient code. The days of writing cheap, inefficient code are soon to be over.