General / Off-Topic Video card prices? Monopoly? 2d/3d computer lego assembly!


The CEOs are relatives. Is it possible this is all a scam run by them to avoid monopoly status in places like the US? They could indefinitely create fake competition and then do God knows what. Especially with cryptocurrency being pushed on video cards for some time now.

I'm surprised they haven't been investigated. Especially with the hardware prices going up. Also related to cryptocurrency. It should be viewed as potential mass fraud and price fixing, among other things. And possibly sales blocked in countries with such laws, if not extradition and jail time if they enter the country.

I think we need a new hardware revolution. I've had this idea for ages. Make them like Lego parts and assemble them. Base them on solid high-speed connections (if possible), maybe light-based. Then assemble a PC. Make cavities in it, or submerge it, or put it in a casing to deal with heating and cooling. Then, instead of buying a new card, you upgrade the individual components. Add more VRAM. Customize the hardware.

Then we just need to kick software people in the butts and make them learn to write proper code again. If you logically break it down, it's not that complicated: write code for varied hardware and plan for endless expansion. Multiple CPUs and other things could utilize the hardware in a more diverse way. In fact, you could write any software and have people buy new parts or replace outdated ones. Software could be done in much more interesting ways too. If you know they could add any type of chip, the hardware is flexible. You could write code that predicts unpredictability; that could hypothetically be easier. And software could have long-term development, meaning it might not need to be changed later, just amended. All hardware would be viable again, as you could simply add it instead of being at the whim of a hardware/mobo manufacturer. Then more focus could be put on pure engineering, as everything has a potential market!!

I would love to assemble a Mobo.

Plus it could be 2D or 3D, as you could stack parts for better connections between them. I would love to simply buy more VRAM. They could also design hardware with different focuses, maybe unlimited ones. Then we just need driver pools and whatnot to keep it supported.

In fact I think you could get much more radical designs than what I am imagining. You could make whole new backbones for PCs, as you wouldn't have to base them on a PCB. You could make all sorts of new designs. The only potential limitation is ease of assembly, and that's an "if", as you could cheapen the equipment needed to do it more professionally, creating another new market. In fact, the approaches could work together, as different parts may have ideal methods for different applications. You would want openness, as people would need it for different circumstances.
 
Especially with the hardware prices going up. Also related to cryptocurrency. It should be viewed as potential mass fraud and price fixing, among other things. And possibly sales blocked in countries with such laws, if not extradition and jail time if they enter the country.

Prices aren't going up, they are going down, because there is more competition now, because AMD actually has competitive products and a revenue stream.

In the mainstream and lower high-end, AMD has been aggressively undercutting NVIDIA since the release of Navi and NVIDIA has responded with the release of a Turing refresh.

I think we need a new hardware revolution. I've had this idea for ages. Make them like Lego parts and assemble them. Base them on solid high-speed connections (if possible), maybe light-based. Then assemble a PC. Make cavities in it, or submerge it, or put it in a casing to deal with heating and cooling. Then, instead of buying a new card, you upgrade the individual components. Add more VRAM. Customize the hardware.

Beyond the package level the trend has been moving toward greater component integration, not greater modularity, and with good reason.

What's the incentive to make a video board with swappable components when the power delivery requirements, memory standard, memory channels, etc., can change every other generation? You'd radically increase board costs for little practical benefit.

I would love to simply buy more VRAM.

Socketed video memory used to be a thing, but it's neither technically nor economically practical any longer.

Very few cards would benefit from simply adding more VRAM, especially at the consumer level. Socketed BGA adapters would not be cheap and would not make for easy upgrades, while VRAM on separate modules would significantly increase overall trace lengths (causing signaling issues and probably limiting performance) as well as compromise the ability to cool higher performance ICs.

In fact I think you could get much more radical designs than what I am imagining.

Sure, but it would be more expensive and thus less profitable.
 
I was thinking more: scrap the boards altogether and just assemble it. Get rid of the PCIe stuff potentially and run things closer together. Or anything you want, as you could modularize the PCIe framework too. I guess it's likely someone would try to monopolize it, though, and keep prices high. But if you could assemble the board, you could do interesting things. And new designs could be used in general, hypothetically.

You could put the video card components next to the CPU with super high-speed connections and have RAM and VRAM appropriately placed in stages as needed by the architecture you assemble. The use of adapter slots could be optional. You could also utilize a 3D design to get extra parts close to each other instead of just a 2D design. You could try to mitigate cost by having smaller components and different designs to compete with 2D boards' cost-effectiveness if it's an issue. Hell, 2D boards could incorporate space for this to give more options. The CPU area could be modular, with extension to the rest of the board. This could hypothetically allow a 2D and 3D design together, or just 2D customization, including RAM on the board near the CPU. You could stick RAM chips right next to and surrounding the CPU. Or have new designs that make the CPU/GPU obsolete in concept and do something new or different as an option. CPU/GPU designers could try out whole new things. It could free up lots of options for all manufacturers.

Designers and manufacturers could also release things in a simpler manner, as anyone could upgrade at any point without as much trouble. If you don't have to buy a new mobo and everything else for a chip, and chips can run with old parts, then people are more free to buy, as it's not attached to other purchases. Manufacturers won't need the flashy presentations in the same manner, although they still can. They can just say: this will be released at such-and-such a time; buy one of our chips to upgrade your rig. And everyone can do that. Plus, outdated things will last longer and have more use, as people can do more with their computers and change them out on a whim, including keeping old parts to swap back in for compatibility and whatnot. It would be normal to keep a bin of old stuff.

Possibly stupid video:
Source: https://www.youtube.com/watch?v=dLkWB-Iw3-s
 
No, those were onboard. I mean make the individual components on little Lego-like pieces and put them together, instead of having a mainboard, potentially. You could literally build a PC: literally buy the individual components that are normally on a mobo and put them together in whatever way you wish.

Instead of a video card, you would buy the stuff on the video card and put it together with the mainboard stuff however you wanted.
 
No, those were onboard. I mean make the individual components on little Lego-like pieces and put them together, instead of having a mainboard, potentially.
That'd make systems very expensive and way less reliable than what we have now. Good interconnects are expensive and come with their own issues like being susceptible to contamination (especially in high-current paths), mechanical aging, often a low number of actuations, etc.; you usually want to have as few of those as possible, and PCIe is currently doing a good job at compromising between base-level integration and extensibility. It's not the fastest or lowest-latency interconnect, but it manages to transport a lot of data over relatively cheap, standardised, and rugged connectors because it takes that hit.
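To put a rough number on what that compromise buys, here's a back-of-the-envelope Python sketch of PCIe 3.0 x16 throughput. The 8 GT/s per-lane rate and 128b/130b encoding are the PCIe 3.0 spec values; the rest is just arithmetic:

```python
# Rough PCIe 3.0 x16 throughput estimate.
# 8 GT/s per lane and 128b/130b line encoding are PCIe 3.0 spec values.
GT_PER_SEC = 8.0        # giga-transfers per second, per lane (one bit per transfer)
ENCODING = 128 / 130    # fraction of each transfer that is payload
LANES = 16

gbytes_per_sec = GT_PER_SEC * ENCODING * LANES / 8  # bits -> bytes
print(f"PCIe 3.0 x16: ~{gbytes_per_sec:.2f} GB/s")  # ~15.75 GB/s
```

That's a lot of data over a cheap edge connector that survives years of insertions, which is exactly the trade-off being described.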

There's another issue in circuit design. Take a good look at the traces on your mainboard, particularly between CPU and RAM, but also in PCIe lanes, and you will see a lot of squiggly lines between them. That's because the various data and clock lines need to be length-matched so all data is in the correct place at the correct time. There are additional problems, like minimising capacitance and making the layout fit for the frequencies it's supposed to run at, so your beautiful data stream doesn't become a softly humming noise at its destination. To keep performance in a fully modular system, you'd end up having to put components in strictly defined orders and topologies, and still would probably have to reduce clock speeds drastically.
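To illustrate why those squiggles exist, here's a rough Python sketch of the mismatch budget. The ~0.15 mm/ps propagation speed in FR-4 is a common rule of thumb (about half the speed of light), and allowing 10% of a bit time for skew is my own assumption for illustration, not a spec figure:

```python
# How much trace-length mismatch a given data rate tolerates (rough sketch).
C_MM_PER_PS = 0.2998                 # speed of light, in mm per picosecond
PROP_MM_PER_PS = C_MM_PER_PS / 2     # ~0.15 mm/ps: typical speed in FR-4 PCB traces

data_rate_mtps = 3200                # e.g. DDR4-3200: 3200 mega-transfers per second
unit_interval_ps = 1e6 / data_rate_mtps   # one bit time: 312.5 ps
skew_budget_ps = 0.1 * unit_interval_ps   # assume 10% of a bit time for mismatch
max_mismatch_mm = skew_budget_ps * PROP_MM_PER_PS

print(f"bit time {unit_interval_ps:.1f} ps -> length mismatch budget ~{max_mismatch_mm:.1f} mm")
```

A budget of a few millimetres over traces that run tens of millimetres is why the meanders are tuned so carefully, and double the data rate halves the budget again.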
 
Could you simplify the systems a lot by keeping things very close and in distinct geometric designs? I wondered about light being used as a connector to help with other issues, if it doesn't use lenses but manages to send the data with a long-term clean connection somehow. Is there a type of material that doesn't have problems with connectors going bad, or that solves those problems? If it's small enough, maybe it can use more exotic materials, maybe mounted on cheaper ones for size and cost.

If you can get a whole system with, in essence, shorter traces, and 3D instead of 2D, how small could it be, and how much could it save in cost, if any? More exotic things could potentially replace the existing ones if needed. Maybe it would have use for some people even if not entirely cost-effective. You need very little material for a connection, potentially.

I know the new diodes for monitors will have nanosecond response times (not OLED, one of its competitors: MicroLED). And they could be replaceable in a small module without needing to solder if you really wanted. Or just pre-soldered and firmly aligned with fiber optics that are connected and sealed. You could sell replacements for the fiber optic wire and sell cleaners. You might lose something going from copper to light unless you can use all light, but maybe you could get similar performance, or go full light-based boards. If that works, you might just need to deal with the heat, which could be dealt with in the layout: the CPU/GPU could be on top/outside, or newer heat-efficient designs used.

Maybe the computer is a cube or pyramid or other shape no bigger than a few inches square. With things shrinking and becoming more electrically efficient, this may be an option one day. Maybe an alternative to laptops, if it can shed enough heat? If it uses light, could it communicate with your phone via light through the monitor (if designed to do so), or via a camera or similar? Or vice versa.

You could put some hobby aspects back into PCs with some of this! 8)

BTW, could fiber optics be programmed to deal with the timings, so distance isn't an issue? If so, it could open up some new computing designs.

If you look at MicroLED's supposed abilities, it could potentially be the basis of a light-based bridge system. I've always wondered if you can do a color-based high-bit system on top of a fast system to get data across quickly. If you can do the full color spectrum, that's 24-32 bit color depth. Can it be used to make a higher-bit spectrum bandwidth? Maybe combine multiple nodes, or enhance the color, like 24 or 32 bit up to a higher-bit spectrum like 64 bit. Could it be used as a start toward light-based RAM and HDDs, or even CPUs/GPUs? That would get rid of the heat issues.

Part of this is that it would be nice not to be confined and gouged by manufacturers who tell you what you need on your components. Not that they can't find new ways to do this. But if you could add components as needed in some way, it would be nice. And what types of new designs could be done if the mobo were interchangeable? It might bring out whole new ideas.

BTW, you wouldn't need just vertical and horizontal light paths. You could use many angles.


It also gets rid of the burn-in issues that OLED has. When will light-based computing be viable? I'm surprised it's not yet. Or I hope it will be soon.

MicroLED vs OLED – Advantages and Disadvantages

There are a lot of manufacturing hurdles that companies are going to encounter with microLED technology, but if it turns out to be a viable approach, then this display actually holds significant perks over OLED tech.


Improved brightness to power ratio (measured in lux/W): What this means is that microLED will be able to achieve the same level of brightness as OLED while consuming less power. In fact, the latest iteration can be up to 90 percent more efficient than LCD and up to 50 percent more efficient than OLED.


Longer lifespan: As you’ve all heard of the OLED burn-in issues taking place with smartphones, with Pixel 2 XL owners reporting this problem a lot, microLED will not exhibit the same issues. It is even possible for this technology to last longer than LCD panels before you start to witness shifts in color patterns.


Higher resolutions in smaller form factors: Being able to see a smartphone with a 4K panel is going to be a common sight and will not just be limited to the Sony Xperia XZ series.


Very high response time: Measured in nanoseconds (a thousand times faster than microseconds, the response time of OLED screens).


Color benefits: This will range from higher contrast ratio, wide color gamut, and higher brightness levels.

The perks that you're seeing are only half the story. Let us take a look at the disadvantages that come with using microLED displays.


Very costly: 3-4 times more expensive than OLED panels.


Though this is just a single disadvantage, it branches out to more problems over time. Companies will be reluctant to invest in the expensive facilities and machinery that are required to make these sorts of panels. According to a source close to this information, Apple nearly dropped out of pursuing microLED panel manufacturing because of the complications and ridiculous costs that accompany this sort of venture.

Hypothetically, having the ability to replace components might be a good design for light-based computing if the lights start to burn out. It might save money to modularize in the end, as you don't want to replace more than needed, potentially. If it can be done.

A lot of components are small. How much would a light-based version with something like MicroLED cost? Could a CPU be done for a reasonable cost, considering equivalent performance per unit of space? Or could something bigger be used to get the same performance for a good cost? Could you have RAM and HDD and CPU/GPU combined into one big thing, removing traces altogether? Just a monitor that displays and processes. Maybe in between display functions. Or on a separate part of the display that isn't visible, or both?! If the refresh rate is so fast the eye can't see it, you could use the unneeded refreshes for processing.

Could light be used to make reprogrammable gate logic to use the same hardware to run any effective design in hardware?!


And, as RCrown mentioned, I believe that optical waveguides will be cheaper to fabricate in the long run... you all know how expensive copper is these days. Imagine getting rid of most of the copper lines (you will still need some for power, etc.) as well as all those tiny little capacitors and so on needed to impedance-match each individual line. You can make optical waveguides out of plastics!


I think the real fun begins when we can manufacture PCs at home to our own designs someday.



This apparently never happened, but I wonder why. I think they are slow-walking us to some extent to make more money. Companies don't want to be replaced or have to adapt. If not, I wonder when this will start to happen.
 
You can't just go "lol 3D" to solve scaling issues. Pretty much every circuit board in computers (except power supplies) is already composed of several layers to make routing between all parts even possible, and at some point you need to solder on components. You can stack on mezzanine boards, but then you need to solve thermal issues, because you lose any airflow. You can't wave the magic wand of 3D printing either, since that's a really slow, expensive, and inefficient way to make things.

I won't go into that whole "light-based computers" thing because I haven't looked too much into it, but it seems to be stuck somewhere between small-scale research and science fiction. You can make fibre-based interconnects, but then you need to put transceivers for those into everything, and those appear to be either slow, big, and expensive (TOSLink) or fast, big, and really expensive (anything useful) compared to electrical interfaces that you can just integrate into your chips.
 
Source: https://www.youtube.com/watch?v=UWMEKex6nYA


Here's a basic video. It's nice someone actually realized you can add more bandwidth by changing color. It's one of its biggest potential advantages compared to current stuff.
I liked the mirror idea too. Splitting the light could also produce multiples of the same calculation, or separate calculations, at the same time. Could be interesting if it ever takes off. Totally different computers. A lot like stories/fiction about Atlantis!...
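The "more bandwidth via color" idea is essentially what fibre-optic links already do with wavelength-division multiplexing (WDM): each wavelength carries an independent data stream and the aggregate rate is just channels times per-channel rate. A toy Python sketch, where the channel count and per-channel rate are made-up example numbers rather than any real part's spec:

```python
# Aggregate throughput when each wavelength ("color") carries its own stream,
# as in wavelength-division multiplexing (WDM). Example numbers only.
per_channel_gbps = 25    # assumed data rate per wavelength
n_wavelengths = 8        # assumed number of colors multiplexed on one fibre

total_gbps = per_channel_gbps * n_wavelengths
print(f"{n_wavelengths} wavelengths x {per_channel_gbps} Gb/s = {total_gbps} Gb/s total")
```

Dense WDM systems in telecom multiplex dozens of wavelengths per fibre this way, which is why one strand can carry terabits.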
 

The CEOs are relatives. Is it possible this is all a scam run by them to avoid monopoly status in places like the US? They could indefinitely create fake competition and then do God knows what. Especially with cryptocurrency being pushed on video cards for some time now.

What we saw a few years back did indeed smell fishy. Then again, there were only so many GPUs around, and it had become 'fashionable' in the investment portfolios of the rich to have something to do with cryptocurrencies (and that is still the same now, I suspect, just that all the hardware bought up a few years back is probably still running fine, or being sold on the second-hand market with new, but fewer needed, models taking its place).

We are still very vulnerable (as gamers with limited budgets) to a repeat of a few years back. Price-wise, things are OK now, but we could see big spikes in prices again, especially around cryptocurrencies (I'd like to see legislation against them myself, from the environmental perspective).

As for the hardware side, I think the sheer complexity and the critical aspect of cooling (in particular) would hinder any 'Lego' style approach to most modern GPUs. They are just orders of magnitude more powerful and complicated versus stuff from the past, where we did have some experiments in modular aspects. Having said that, people do 'mod' their cards, so you could go that route if so inclined.

For my own preferences, I'd just like to see Nvidia and AMD (especially) keep working on power efficiency. 120 W is the max TDP that I am happy running, and of course I'd rather have that lower (along with good performance, of course), my 75 W TDP 1050 Ti being my current preferred option (with an eye on those 1660 ranges). I am not a typical gamer in this respect, however, but I think that, as part of all aspects of our societies needing to become more energy efficient, gaming should be part of that.
 
Only tangentially related, but still interesting: https://www.tomshardware.com/news/r...rboard-silicon-interconnect-fabric,40475.html

For my own preferences, I'd just like to see Nvidia and AMD (especially) keep working on power efficiency. 120 W is the max TDP that I am happy running, and of course I'd rather have that lower (along with good performance, of course), my 75 W TDP 1050 Ti being my current preferred option (with an eye on those 1660 ranges). I am not a typical gamer in this respect, however, but I think that, as part of all aspects of our societies needing to become more energy efficient, gaming should be part of that.

Power efficiency is continually improving. It's impossible for it to be otherwise, as power budgets are largely constrained by form factors and cooling, so the only way to increase performance is to increase efficiency.

Economies of scale also apply to data processing; the more powerful the GPU, the more efficient it can be, for a given generation. FLOP for FLOP, texel for texel, or hash for hash, my 1080 Tis can be tuned to be more efficient than a 1050 Ti...lower ratio of support hardware per execution unit.
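For a rough illustration of that scaling, here's a perf-per-watt comparison at stock specs (approximate published FP32 throughput and TDP figures; tuned efficiency will differ, but the ordering holds):

```python
# Rough perf-per-watt comparison at stock specs (approximate published numbers).
cards = {
    "GTX 1050 Ti": (2_138, 75),    # ~2.1 FP32 TFLOPS (in GFLOPS), 75 W TDP
    "GTX 1080 Ti": (11_340, 250),  # ~11.3 FP32 TFLOPS (in GFLOPS), 250 W TDP
}

for name, (gflops, watts) in cards.items():
    print(f"{name}: {gflops / watts:.1f} GFLOPS/W at stock")
```

Even before any undervolting, the bigger die comes out ahead per watt, which matches the point about a lower ratio of support hardware per execution unit.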
 
If you made the parts small enough, you could design chips differently. What if, instead of square, the CPU/GPU were on top of a pyramid-type shape? You could do traditional four-sided pyramids or other multi-faceted shapes. Then the tops and corners could be various higher-functioning chips. If small enough and power-efficient, it could be submerged in a smaller container, like a small glass, with small amounts of liquid and some water flow, and a much cheaper water-style cooling system. Maybe a smaller, cheaper, liquid-cooled surface mount. Non-conductive liquids could be used, maybe customized to help the liquid drip off the parts, or made so it doesn't matter if the parts drip-dry first.

Mineral oils could be a cheap start, maybe made to drop off the parts quicker, or parts made to have the mineral oil drip off them. Companies could custom-make a lot more individual parts, giving them more freedom to enter and exit markets, making their finances more sound, potentially. It could be much cheaper and safer to make much smaller parts.

Higher-end cooling systems would be much cheaper if you had a system the size of a baseball or smaller. Then you could improve or change the system designs as more things become available, and liquid and more exotic cooling systems could compete with air cooling. I've seen stuff where people say you can add Peltier elements to a water tank to cool the medium for a huge increase. This could be done on the outside of such a container to remove condensation. Or get rid of the problem by making the parts not sensitive to water. Or add an indirect, non-connected loop to transfer heat from a pad at the bottom of the small glass-sized container to a separate, higher-heat-transfer loop with better cooling qualities. Or just make the container itself the transfer medium and use the Peltier on the bottom to cool the inside liquid, like mineral oil.

If your case is the size of an 8 oz glass, you can potentially do a lot of things you can't now.

Source: https://www.youtube.com/watch?v=PvmMs6mU0NU

BTW, you don't have to have all parts individually mounted. You could have partials, with a bottom for a triangle and certain parts on it like a mobo. Then you could customize parts of the mobo design with many different designs. Graphics cards and other things could be all separated, or made into chunks for placement in a shape, if people don't want to fully customize things. Either way, important things could be upgraded.

And non-partial things, or certain designs, could help manufacturers make more things without the risks, getting technology out there faster.
 
What if, instead of square, the CPU/GPU were on top of a pyramid-type shape? You could do traditional four-sided pyramids or other multi-faceted shapes.

If you need more total die area in a similar volume, more traditional 3D die stacking would be superior in essentially every way.

You cannot economically apply lithography onto each facet of such a shape... it would be more costly than making an entire wafer of conventional parts and would have vastly lower yields.

If you are talking about assembling the shape after the fact, this would also not provide many benefits over conventional 3D stacking, as logic would be spread further apart and cooling would be unlikely to be superior than with embedded micro fluid channels that have already been demonstrated.
 
If you are talking about assembling the shape after the fact, this would also not provide many benefits over conventional 3D stacking, as logic would be spread further apart and cooling would be unlikely to be superior than with embedded micro fluid channels that have already been demonstrated.
That's assuming appropriate cooling systems could be made available at acceptable prices. Most idiots these days can cope with air cooling, but the amateur aquarium equipment that goes for liquid cooling in the consumer area is barely worth the while and wouldn't survive a day trying to feed fine structures.
 
That's assuming appropriate cooling systems could be made available at acceptable prices. Most idiots these days can cope with air cooling, but the amateur aquarium equipment that goes for liquid cooling in the consumer area is barely worth the while and wouldn't survive a day trying to feed fine structures.

We are still a long way from 3D stacked GPUs being worthwhile for most markets, but when they are, I'd expect the system directly cooling the die stack to be embedded and used as a heat exchanger coupled to a more conventional cooling system.
 