EVGA & Nvidia Breakup - Thoughts?

Just found out about this today. It seems there have been a couple of YouTube videos about it.

As someone who builds their own computers to play games on, I found this a rather odd situation. Not happy about it.

Any comments or thoughts?
 
I think a lot of people knew Nvidia was not a nice company for its partners, and sometimes its end users, to deal with, so I am not surprised. I hope EVGA survives this as a company, and I hope Nvidia learns a lesson, though I really doubt it will. My old PC is an Intel i5-2500K with an Nvidia GTX 1070; my new PC is an AMD 5900X with an AMD RX 6800. There's not much I can do except support companies that give better deals to their end users and don't seem to beat up their partners as much as Nvidia does.

So good for EVGA for taking a stand, and I hope it works out for them.
 
Totally agree.
Will add that my next purchase will be AMD.
Intel and Nvidia can do one.
 
I saw that the likes of EVGA don't get their cost prices confirmed until the day we get selling prices from Jensen.

Manufacturers can't design and produce products like that. Imagine the expected margins turning out to be lower; nightmare.

The likes of EVGA also aren't allowed to modify cards to allow higher power limits.
 
It's mostly an irrelevant situation for consumers who aren't blinded by brand loyalty. EVGA had some nice models and generally good warranty service, but they weren't remotely the only NVIDIA AIB.

When it comes to NVIDIA's business, EVGA was a drop in the bucket; they aren't going to sell any less product because of the loss of one AIB. It may be prudent for them to reevaluate how they treat partners, however, as they need some AIBs to have a healthy consumer marketplace.

EVGA is probably in trouble. GPUs were a substantial majority of their revenue, and even if it hadn't been profitable recently, the company that is left when that part of the business is excised doesn't have a whole lot to it.
 
But it does make people like me question them, and perhaps give AMD another look the next time a video card purchase comes up. After all, even a loyal car brand person starts to look around when their particular make and model of car is discontinued.
 
Remember last year when EVGA cards started cooking themselves when pushed to their limits by New World? Not exactly a great advertisement for NVIDIA (and New World copped a fair bit of blame too) even though the flaw was due to EVGA's design using parts that were unsuitable for the job...
 
  1. It was ONE card. The RTX 3090.
  2. The issue was found to be a faulty fan controller on that particular model that allowed the fans to go to runaway speeds and burn themselves out.
  3. EVGA replaced all of them for free.
It wasn't fan controllers, it was MOSFETs (voltage regulators) that couldn't handle the load required. Regardless of whether they replaced them or not, the damage to the brand(s) was already done.
 
It wasn't fan controllers, it was MOSFETs (voltage regulators) that couldn't handle the load required. Regardless of whether they replaced them or not, the damage to the brand(s) was already done.
Turns out, a runaway frame rate might not have been the issue at all. A new report from Igor'sLAB seems to have narrowed down the problem to EVGA's fan controller, which just runs buck-wild during New World game sessions.
Igor'sLAB ran New World on an EVGA RTX 3090 with several hardware monitoring programs going to see what might pop out. As you can see from the screenshot, the GPU fan went from a cool and quiet 1600 RPM to north of 200,000 RPM, which must have caused quite the racket.
At that fast a speed, the fan is likely to burn itself out, which would then lead to the GPU operating without enough cooling. If the player didn't notice the fan had died and kept playing, it could lead to a catastrophic failure of the graphics card.
EVGA has already agreed to help players with bricked RTX 3090s and replace them free of charge.

 
That was the initial conclusion reached by some amateurs and parroted by every tech site. Then a few months later EVGA confirmed the actual cause, which you can easily find if you use Google.

Thanks for demonstrating that incorrect claims, especially when they're widely reported by tech "news" sites, will still stick in people's minds though.
 
Not surprised, when you hear that manufacturers' prices aren't confirmed till the day of the series launch.
Manufacturers may end up with lower margins than expected; in addition, Nvidia seems to dictate what the likes of EVGA can do with a GPU.

I remember the days when manufacturers could enhance and change GPUs to make them better; now it's all controlled by Nvidia, even the power limits.

Shame. I bet there is more locked away, and the series could be opened up if manufacturers were allowed to, but that would make Nvidia's own models less appealing.

The 40 series looks good, but look at the chips: my 2080 Ti is a 102 die.
The 4080 is now rumoured to be two models on 103 and 104 chips, so we are getting lower-tier chips for more money. I'm hoping a 4080 Ti is released with a 102 chip.
The 4080 12GB version appears to have lower memory bandwidth than my few-years-old 2080 Ti, which doesn't feel right.
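
For what it's worth, a quick back-of-the-envelope check bears that out. This is just a sketch using the commonly reported specs (352-bit bus with 14 Gbps GDDR6 for the 2080 Ti, 192-bit bus with 21 Gbps GDDR6X for the rumoured 4080 12GB), which may not match the final retail cards:

# Peak memory bandwidth = bus width (bits) / 8 * effective data rate (Gbps)
def mem_bandwidth_gb_s(bus_width_bits, data_rate_gbps):
    return bus_width_bits / 8 * data_rate_gbps

print(mem_bandwidth_gb_s(352, 14))  # RTX 2080 Ti: 616.0 GB/s
print(mem_bandwidth_gb_s(192, 21))  # rumoured RTX 4080 12GB: 504.0 GB/s

So even with faster GDDR6X, the much narrower bus leaves the rumoured card well behind on raw bandwidth (a larger cache could offset some of that in practice).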

I suspect Nvidia are trying to push up prices on the 40 series while keeping mid-range 30 series prices inflated as they reduce stocks.

I hope AMD do well next gen and offer something at a more reasonable price.
 
I suspect Nvidia are trying to push up prices on the 40 series while keeping mid-range 30 series prices inflated as they reduce stocks.
That's about it right there.

Especially given that Nvidia are in so much trouble financially, with their stock having crashed and burned by 56% this year so far.
 
Or, and I'm just spit-balling here, you could post some links that back up your claims, especially the part about reputable sites just copying "some amateurs". I'd really like to see that.
You really don't know how to use Google?


All dated after the articles you posted.
 
You really don't know how to use Google?
Yes, I do. But I shouldn't have to be the one doing it when YOU are the one making the claims.

Notice how I did the work FOR YOU because it was ME who stated the claim? That's how it's supposed to be done. If you're too lazy to support your own statements, stop making them.

It wasn't fan controllers, it was MOSFETs (voltage regulators) that couldn't handle the load required.
That is incorrect.

From your own article:
Inspections found that soldering around the MOSFET circuit – a crucial part within the GPU – was done poorly, causing affected cards to break.

That was the initial conclusion reached by some amateurs and parroted by every tech site.
That is also factually incorrect. From your own article you posted:

It also pointed out that third-party tools such as GPU-Z and HWInfo were misreporting data from the fan controllers, which is where a lot of theories originated.

It wasn't "amatuers". It was faulty software.

Thanks for proving yourself wrong. It was entertaining, to say the least.
 