AMD Mantle Support?

Right so Nvidia couldn't just give Apple a sweet deal for the Mac Pro?

Who is pushing OpenCL? And which GPU vendor "better supports" OpenCL? And now, as a bonus: who is selling the cheaper GPU?

NV's Maxwell GPUs slaughter the AMD GPUs in a Mac Pro. Sometimes it's not all about technology.
 
Quite amusing that nVidia's R&D exceeds AMD's, given that AMD have an x86 and APU line to "invest" in as well.

But all that is irrelevant really. The Steam hardware survey shows AMD at 29% and slipping, the 970/980 was the most successful GPU launch in history in terms of volume, and AMD's only response has been to slash prices and margins on their expensive 512-bit-bus product to the bone.

The point being that implementing Mantle would be a complete waste of time and effort. You'd see a 10% boost in minimum frame rates for 3% of the playerbase.
 

What the hell is that supposed to be? Nvidia's discrete graphics share is largely made up of low-end GPUs, and this is a fact. AMD barely even registers there due to replacing the low end with APUs. To suggest that AMD's APUs shouldn't count as actual graphics while claiming crap like the GT 740 and below should count is a joke.


That's just pure nonsense. Let's at least keep to reality, OK?

[attached charts]

And let's look at revenue.
Computing and Graphics division:
Q4 2013: $888M revenue ($15M loss)
Q4 2014: $662M revenue ($56M loss)
This is how AMD looks without consoles.
In the same period nVidia grew 11% to $1,225M revenue for the quarter.

Which part of "AMD has different priorities" was so hard?

Which part of AMD ditching SOI R&D costs can't you grasp? Which part of AMD ditching TSMC R&D costs can't you grasp?

You seem to pay an awful lot of attention to AMD's financials, Shintai, which makes me wonder how you missed the important part. Let me help you out.

http://seekingalpha.com/article/283...-2014-results-earnings-call-transcript?page=5

Q)
John Pitzer - Credit Suisse

Are you under-investing in certain areas right now to try to maintain that non-GAAP breakeven?


A1)
Devinder Kumar - SVP and CFO

The question that always comes up internally, and Lisa obviously and Mark Papermaster watch that very carefully in terms of making sure that we continue to invest in the R&D area, especially for the products and the technology around the future stuff that we are doing. But John, you're right, we do want to manage from a viewpoint of overall cash and the P&L, that's important. The breakeven P&L has come down significantly as we have managed the OpEx. But the R&D investment, especially with the roadmap that we are projecting for 2015 and 2016 continue.

A2)
Lisa Su - President and CEO

We are actively designing a number of products in 14 -- the 14-nanometer technology. I think that will be very important for us in -- from a competitive standpoint. So it's an important technology for us.

AMD's R&D costs have dropped due to the decision to drop SOI, drop TSMC, and due to their embedded model. There is no loss of future products, so sorry, but either get up to speed on the reality of the situation or get over your dashed dream.


Quite amusing that nVidia's R&D exceeds AMD's, given that AMD have an x86 and APU line to "invest" in as well.

I guess you are blissfully unaware of Nvidia's "console", their phone and tablet CPUs (and automotive)? It's all making a horrible financial loss, funnily enough!

But all that is irrelevant really. The Steam hardware survey shows AMD at 29% and slipping, the 970/980 was the most successful GPU launch in history in terms of volume, and AMD's only response has been to slash prices and margins on their expensive 512-bit-bus product to the bone.

What's Nvidia's share like on the PlayStation and Xbox networks?
 
Sorry adoredtv, I just don't think you have any idea what you're talking about.

External payments would be shown as external in the financial reports, including those for GloFo. And AMD never paid TSMC for anything other than wafer allocation. That's the entire point of being fabless.

The reason R&D drops is that the Computing and Graphics division drops. It's that division's margins that pay for R&D.

And your quote from Kumar is funny, because what he says is simply that their R&D fits their roadmap, something they keep cutting into. Carrizo for desktops has also been dropped, and it's already delayed half a year.

It's not for fun that AMD had a 13% revenue drop in Q4, made 7% layoffs, and now expects another 15% revenue drop in Q1. It's not for fun that AMD is working as fast as it can to leave the PC segment in favour of semi-custom to save the company.

As I said, let's stick to reality rather than the five stages of grief. You are only fooling yourself if you think everything is fine at AMD.
 
I guess you are blissfully unaware of Nvidia's "console", their phone and tablet CPUs (and automotive)? It's all making a horrible financial loss, funnily enough!



What's Nvidia's share like on the PlayStation and Xbox networks?

Their other bits might be making a loss, and I'm not sure how that is relevant, but their R&D budget exceeding AMD's, without a CPU/APU business, says something.

Consoles are irrelevant. Neither uses "Mantle"; AMD tried to pretend they did, but both Sony and MS denied this. You can bet that the XB1 is going to be DX-based as it's an MS API. As the PS3 uses an nVidia GPU, and the PS3 still outnumbers the PS4, I'd say nVidia's share on PSN is quite high ;)

ED isn't even on console and hopefully never will be.

So all we should be caring about is PC market share.
 
Sorry adoredtv, I just don't think you have any idea what you're talking about.

External payments would be shown as external in the financial reports, including those for GloFo. And AMD never paid TSMC for anything other than wafer allocation. That's the entire point of being fabless.

Really? Let's see what TSMC says about it.

TSMC counts a fortune in R&D expenses from the companies it makes products for.

Morris mentioned TSMC's R&D expenses versus Intel's and Samsung's, the difference being that TSMC collaborates with customers/partners and leverages their R&D expenses. So the equation looks like this:

Top 10 TSMC customers' R&D expenses + TSMC R&D expenses > Intel + Samsung R&D expenses

You honestly believe that GloFo wasn't leveraging SOI R&D payments from AMD too? Let's see what AMD says about that, then.

LINK

Separately, AMD will move to standard 28nm process technology and significantly reduce reimbursements to GF for future research and development costs.

We anticipate these savings will be approximately ~$20M per quarter during the next several years which also helps achieve our OPEX target of $450M by Q3 2013

R&D savings help drive AMD to its Q3’13 $450M OPEX target through adoption of 28nm standard process technology.

That all seems rather straightforward, Shintai. It's funny how you seem to know an awful lot about AMD, yet don't seem to know this stuff. Almost like...you only know about the bad stuff? How did that happen, I wonder?

And your quote from Kumar is funny, because what he says is simply that their R&D fits their roadmap, something they keep cutting into. Carrizo for desktops has also been dropped, and it's already delayed half a year.

Tell me more about Nvidia's Volta: what happened to that? What about the K1, have they got it out yet? Wasn't Maxwell supposed to have launched in 2013? How late is Broadwell? I wonder if you actually pay attention to any other tech companies apart from AMD?
 
Their other bits might be making a loss, and I'm not sure how that is relevant, but their R&D budget exceeding AMD's, without a CPU/APU business, says something.

What do you think Tegra is?

Consoles are irrelevant. Neither uses "Mantle"; AMD tried to pretend they did, but both Sony and MS denied this. You can bet that the XB1 is going to be DX-based as it's an MS API. As the PS3 uses an nVidia GPU, and the PS3 still outnumbers the PS4, I'd say nVidia's share on PSN is quite high ;)

ED isn't even on console and hopefully never will be.

So all we should be caring about is PC market share.

That's not really true. The consoles' graphics APIs are very similar to Mantle (the PS4's in particular), so much so that porting between them is extremely easy.

What's more, DX12 is basically just a copy of Mantle, so learning Mantle now is the key to getting DX12 games out fast.
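
To make the resemblance concrete, here is a minimal C++ sketch of the explicit "record, then submit" command-list model that DX12 exposes, the same basic design Mantle introduced with its gr*-prefixed command-buffer calls. This is only an illustration against the public D3D12 headers, with error handling omitted, not anyone's actual engine code:

```cpp
#include <d3d12.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

// Illustrative function name; shows the app-driven submission model.
void recordAndSubmit()
{
    ComPtr<ID3D12Device> device;
    D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0, IID_PPV_ARGS(&device));

    D3D12_COMMAND_QUEUE_DESC queueDesc = {};
    queueDesc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;
    ComPtr<ID3D12CommandQueue> queue;
    device->CreateCommandQueue(&queueDesc, IID_PPV_ARGS(&queue));

    // The application allocates command memory itself...
    ComPtr<ID3D12CommandAllocator> allocator;
    device->CreateCommandAllocator(D3D12_COMMAND_LIST_TYPE_DIRECT,
                                   IID_PPV_ARGS(&allocator));

    // ...records commands into an explicit command list...
    ComPtr<ID3D12GraphicsCommandList> cmdList;
    device->CreateCommandList(0, D3D12_COMMAND_LIST_TYPE_DIRECT,
                              allocator.Get(), nullptr, IID_PPV_ARGS(&cmdList));
    // (pipeline state and draw calls would be recorded here)
    cmdList->Close();

    // ...and decides itself when the work is submitted to the GPU queue.
    ID3D12CommandList* lists[] = { cmdList.Get() };
    queue->ExecuteCommandLists(1, lists);
}
```

The shared point is that the application, not the driver, builds and submits command buffers, which is why porting between the console APIs, Mantle, and DX12 is comparatively easy.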
 
Quite amusing that nVidia's R&D exceeds AMD's, given that AMD have an x86 and APU line to "invest" in as well.

But all that is irrelevant really. The Steam hardware survey shows AMD at 29% and slipping, the 970/980 was the most successful GPU launch in history in terms of volume, and AMD's only response has been to slash prices and margins on their expensive 512-bit-bus product to the bone.

The point being that implementing Mantle would be a complete waste of time and effort. You'd see a 10% boost in minimum frame rates for 3% of the playerbase.

Actually it isn't, since Nvidia is pursuing their own architecture while AMD is using industry standards and can license patents from other innovators in the field. You can't beat everyone to the punch in general computing, but Nvidia is the only researcher in their category, so every dime spent on R&D for CUDA is spent by Nvidia, whereas x86 and ASICs have a huge number of companies investing in their technologies.
 

The link you posted simply shows TSMC's R&D into the foundry business and its top 10 customers' R&D into IC design. There is no money going from the customers to TSMC for R&D. Customers order wafer allocation at TSMC, TSMC takes a healthy margin (~45%), and that margin is used to pay for TSMC's R&D.


You honestly believe that GloFo wasn't leveraging SOI R&D payments from AMD too? Let's see what AMD says about that, then.

LINK

The WSA, which by the way runs for the next 20 years because AMD was in no position to negotiate, is about wafer allocation. AMD have committed themselves to a certain volume for 20 years ahead, something they pay a high price for; the FX CPUs, for example, wouldn't even exist anymore if it wasn't for this. But no, it's not about AMD's R&D.

That all seems rather straightforward, Shintai. It's funny how you seem to know an awful lot about AMD, yet don't seem to know this stuff. Almost like...you only know about the bad stuff? How did that happen, I wonder?

Again, for the third time in a row in this post, you confuse IC design R&D costs with foundry R&D.

Tell me more about Nvidia's Volta: what happened to that? What about the K1, have they got it out yet? Wasn't Maxwell supposed to have launched in 2013? I wonder if you actually pay attention to any other tech companies apart from AMD?

Volta has now turned into Pascal. The K1 is generating revenue and the X1 is close to launch.


Actually it isn't, since Nvidia is pursuing their own architecture while AMD is using industry standards and can license patents from other innovators in the field. You can't beat everyone to the punch in general computing, but Nvidia is the only researcher in their category, so every dime spent on R&D for CUDA is spent by Nvidia, whereas x86 and ASICs have a huge number of companies investing in their technologies.
Yes and no.

AMD actually runs something called Stream. You could say it's the equivalent of CUDA, but unlike CUDA it never got popular and lives a rather anonymous life. OpenCL runs on top of both of them.
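
To illustrate what "OpenCL runs on top of both of them" means in practice, here is a minimal C++ sketch using the standard OpenCL host API (assuming the Khronos headers are installed): the same vendor-neutral calls enumerate whatever platforms are present, whether the implementation underneath is AMD's Stream/APP stack or nVidia's CUDA stack. Error handling is trimmed for brevity:

```cpp
#include <CL/cl.h>
#include <cstdio>

int main()
{
    cl_uint count = 0;
    clGetPlatformIDs(0, nullptr, &count);           // how many platforms?

    cl_platform_id platforms[16];
    if (count > 16) count = 16;                     // keep the sketch simple
    clGetPlatformIDs(count, platforms, nullptr);    // fetch them

    for (cl_uint i = 0; i < count; ++i) {
        char name[256] = {};
        clGetPlatformInfo(platforms[i], CL_PLATFORM_NAME,
                          sizeof(name), name, nullptr);
        // prints e.g. "AMD Accelerated Parallel Processing" or "NVIDIA CUDA"
        printf("Platform %u: %s\n", i, name);
    }
    return 0;
}
```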
 
Again, for the third time in a row in this post, you confuse IC design R&D costs with foundry R&D.

Right, explain this to me.

We anticipate these savings will be approximately ~$20M per quarter during the next several years which also helps achieve our OPEX target of $450M by Q3 2013

R&D savings help drive AMD to its Q3’13 $450M OPEX target through adoption of 28nm standard process technology.

How, exactly, are AMD saving $20 million per quarter in R&D ("R&D savings" being key here) and achieving their $450 million OPEX target through ditching SOI? It's R&D, not cost of sales...


Volta has now turned into Pascal. The K1 is generating revenue and the X1 is close to launch.

Right...Volta "turning into Pascal" seemed to add another year or two to the release date, then. How can that be, with AMD seemingly being the only company releasing stuff slowly?
 
Almost like...you only know about the bad stuff?

Ignoring market share for a moment... let's pretend you are looking for a job. In no particular order :) - (1) Intel, (2) NV, (3) AMD
Intel is the "slow and steady wins the race" option: a safe bet.
NV: a bit more risky, in a more challenging economic position, but pushing the GPU-power-per-watt envelope.
AMD... hmm... just laid off a whole boatload of people, competing against Intel and ARM in the CPU market, and competing against Intel and NV in the GPU market...

Intel CPUs are just outright crushing AMD's offerings, leaving AMD in the low-end CPU/APU market. In the discrete GPU market it's tough for both NV and AMD: Intel is slowly strangling the PCIe links to them.

I don't work for any of the three, btw.
 
Right, explain this to me.

How, exactly, are AMD saving $20 million per quarter in R&D and achieving their $450 million OPEX target through ditching SOI? It's R&D, not cost of sales...

Bulk is cheaper to develop IC designs for due to broader usage and availability of tools. Still zero R&D payment to GloFo or TSMC. And TSMC was always bulk.
 
Bulk is cheaper to develop IC designs for due to broader usage and availability of tools. Still zero R&D payment to GloFo or TSMC. And TSMC was always bulk.

So you're just going to flat-out deny the words in the WSA? You're going to deny that Chang (TSMC President) said that TSMC gets paid R&D expenses by its customers? You're denying that AMD was normally first to a new node at TSMC and now will no longer need to pay those R&D costs?

Right, we're done, Shintai.


Ignoring market share for a moment... let's pretend you are looking for a job. In no particular order :) - (1) Intel, (2) NV, (3) AMD
Intel is the "slow and steady wins the race" option: a safe bet.
NV: a bit more risky, in a more challenging economic position, but pushing the GPU-power-per-watt envelope.
AMD... hmm... just laid off a whole boatload of people, competing against Intel and ARM in the CPU market, and competing against Intel and NV in the GPU market...

Intel CPUs are just outright crushing AMD's offerings, leaving AMD in the low-end CPU/APU market. In the discrete GPU market it's tough for both NV and AMD: Intel is slowly strangling the PCIe links to them.

I don't work for any of the three, btw.

AMD also hired Jim Keller and Raja Koduri from Apple, two of the best CPU and GPU designers around. People leave and join these companies all the time.
 

Yaffle

Volunteer Moderator
I have no idea how this turned into a rather embarrassing Team Red v Team Green thread, so let's please bring it back to being about Mantle support and lay off the fighting about whose graphics are best.
 
Actually it isn't, since Nvidia is pursuing their own architecture while AMD is using industry standards and can license patents from other innovators in the field. You can't beat everyone to the punch in general computing, but Nvidia is the only researcher in their category, so every dime spent on R&D for CUDA is spent by Nvidia, whereas x86 and ASICs have a huge number of companies investing in their technologies.

What?

For CPUs, nVidia only use ARM designs: very open, lots of licensees. They have used off-the-shelf designs (the 32-bit K1 uses the A15; the X1 uses the A57) and in-house modified designs in a similar manner to Qualcomm (the Denver K1).

For CPUs, AMD are fiddling with ARM but are also an x86 license holder. Given that Intel certainly gives as little help to AMD as possible in this regard (AMD only have an instruction set license, not a licensed design), and that VIA are the only other people with an x86 license, it's massively more expensive to develop an x86 CPU than an ARM one. And AMD are still doing ARM as well!

Both AMD and nVidia have their own architectures for GPUs.

nVidia's CUDA has turned out to be a massive winner: they sell Tesla cards for thousands of dollars a pop, and people build massive clusters out of them. GPGPU is one of nVidia's most profitable arms.
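
For contrast with the OpenCL sketch earlier in the thread, here is a minimal C++ sketch against the CUDA runtime API (assuming the CUDA toolkit headers are available): these calls only ever report nVidia hardware, which is both the lock-in and the business moat being described. Error handling omitted:

```cpp
#include <cuda_runtime.h>
#include <cstdio>

int main()
{
    int deviceCount = 0;
    cudaGetDeviceCount(&deviceCount);   // counts CUDA-capable (nVidia) GPUs only

    for (int i = 0; i < deviceCount; ++i) {
        cudaDeviceProp prop;
        cudaGetDeviceProperties(&prop, i);
        printf("Device %d: %s, %d multiprocessors\n",
               i, prop.name, prop.multiProcessorCount);
    }
    return 0;
}
```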

All GPUs are ASICs; I don't understand what you mean here.

I'd expect that, to make the same amount of progress, AMD, with a more diverse product offering, would have to spend WAY more than nVidia on R&D.
 
So you're just going to flat-out deny the words in the WSA? You're going to deny that Chang (TSMC President) said that TSMC gets paid R&D expenses by its customers? You're denying that AMD was normally first to a new node at TSMC and now will no longer need to pay those R&D costs?

Right, we're done, Shintai.

Again you mix apples and oranges. The "paid R&D expenses" being talked about come from the margins on wafers. The rest is the IC design cost that customers have to carry themselves, just as AMD takes a margin from you when you buy an AMD product. Yet you wouldn't claim you paid R&D money to AMD, would you?

The only R&D money AMD spends goes into their own software and IC design costs. The only transfer to GloFo or TSMC is the wafer cost, besides the mask for the design that AMD delivers.
Now, since neither TSMC nor GloFo works for free, they add a margin on top of their production cost, just as AMD does when they sell you a product. Part of that margin pays for GloFo's and TSMC's R&D.

The TSMC statement is nothing but PR: claiming they are bigger if you combine their top 10 customers' R&D with their own against Intel's or Samsung's. Of course that's not how it works, since TSMC's customers are usually competitors.
 
Right, sure, Shintai. AMD lays it out flat that they would save $20 million in **OPEX from R&D** due to switching to bulk, and you deny that's how it is?

[attached image]

What's that, a lie?

And we're supposed to believe that TSMC are lying as well? When TSMC were developing a technology like 40nm (when AMD was months ahead of Nvidia), AMD would have been spending a lot of R&D on that as well. That's what the word "collaborates" means. It's all R&D until it's a finished product. It's exactly the same with MS and Sony paying AMD's R&D upfront during the development phase of the consoles, meaning AMD saves *even more* on chip R&D that they were previously paying themselves (the consoles are based on Jaguar and GCN). They pointed this out in a financial call as well, yet I suppose you "missed" that one too.

As pointed out by the mod, however, this has precious little to do with Mantle.
 
The thread could pretty much end. The only thing left would be if a dev came in and laid out their future plans.


Whether it's Mantle, DX12, or simply continuing with DX11.x as it gets extended to 11.3.
 
I can't wait till DX12 goes mainstream, and I hope we can then be done with manufacturer-exclusive tech like Mantle.

I also hate other examples of this, such as AMD's TressFX and Nvidia's GameWorks technologies, which get implemented by whichever games company each manufacturer currently has in its pocket, yet just serve to further restrict and fragment gamers and generate bad blood. When buying a PC game I would like it to look identical on all similarly performing hardware. In this day and age we PC gamers need to stick together as much as possible to hold off the horde of console gamers and their microtransaction-filled, unfinished pre-orders of doom.
 
I can't wait till DX12 goes mainstream, and I hope we can then be done with manufacturer-exclusive tech like Mantle.

I also hate other examples of this, such as AMD's TressFX and Nvidia's GameWorks technologies, which get implemented by whichever games company each manufacturer currently has in its pocket, yet just serve to further restrict and fragment gamers and generate bad blood. When buying a PC game I would like it to look identical on all similarly performing hardware. In this day and age we PC gamers need to stick together as much as possible to hold off the horde of console gamers and their microtransaction-filled, unfinished pre-orders of doom.

Consider that without Mantle there probably would not have been a DX12, and absolutely not this fast. MS simply didn't care until they came under threat.
 