AMD’s Polaris debuts in Radeon RX480: I told you so

In a recent blogpost, after dealing with the nasty antics of a deluded AMD fanboy, I already discussed what we should and should not expect from AMD’s upcoming Radeon RX480.

Today the NDA was lifted, and reviews are appearing all over the internet. Cards are also becoming available in shops, and street prices are becoming known. I will keep this blogpost very short, because I really can’t be bothered:

I told you so. I told you:

  1. If AMD rates the cards at 150W TDP, they are not magically going to be significantly below that. They will be in the same range of power as the GTX970 and GTX1070.
  2. If AMD makes a comparison against the GTX970 and GTX980 in some slides, then that is apparently what they think they will be targeting.
  3. If AMD does not mention anything about DX12_1 or other fancy new features, it won’t have any such things.
  4. You only go for an aggressive pricing strategy if you don’t have anything else to offer as a unique selling point.

And indeed, all this rings true. Well, with 3. there is a tiny little surprise: AMD does actually make some vague claims about a ‘foveated rendering’ feature. But at this point it is not entirely clear what it does, how developers should use it, let alone how it performs.

So, all this shows just how good nVidia’s Maxwell really is. As I said, AMD is one step behind, because they missed the refresh-cycle that nVidia did with Maxwell. And this becomes painfully clear now: even though AMD moved to 14nm FinFET, their architecture is so much less efficient that they can only now match the performance-per-watt that Maxwell delivered at 28 nm. Pascal is on a completely different level. Aside from that, Maxwell already has the DX12_1 featureset.

All this adds up to Polaris being too-little-too-late, which has become a time-honoured AMD tradition by now. At first this was only true of the CPU department, but lately the GPU department appears to have been reduced to the same state.

So what do you do? You undercut the prices of the competition. Another time-honoured AMD tradition. This is all well-and-good for the short term. But nVidia is going to launch those GTX1050/1060 cards eventually (and rumour has it that it will be sooner rather than later), and then nVidia will have the full Pascal efficiency at its disposal to compete with AMD on price. This is a similar situation to the CPU department again, where Intel’s CPUs are considerably more efficient, so Intel can reach the same performance/price levels with much smaller CPUs, which are cheaper to produce. So AMD is always on the losing end of a price war.

Sadly, the street prices are currently considerably higher than what AMD promised us a few weeks ago. So even that is not really working out for them.

Right, I think that’s enough for today. We’ll probably pick this up again soon when the GTX1060 surfaces.


52 Responses to AMD’s Polaris debuts in Radeon RX480: I told you so

  1. qwerty says:

    lol, it’s good that we got all the gloating out of the way :p. Just messing with you mate. Anyhow, in all honesty, the benchmarks turned out better than what I expected after reading a particular HardOCP article. 🙂

    Incidentally, I’m a little surprised that none of the big tech sites bothered to hook up an HMD with their review samples, considering how Polaris is supposed to bring VR to the masses.

    • Scali says:

      Subtlety doesn’t really work with AMD fanboys. I figured we just have to smash the facts into their faces with a big sledgehammer. AMD is *not* ahead of nVidia technologically (lol console wins), not by a long shot. For some reason, I could easily tell that last generation, but many were in denial. Polaris should make it pretty obvious to anyone, I would say. Those 2-year old 28 nm GTX970 cards are still a hard act to follow for AMD with their latest tech. And we’re talking every single area here… performance-per-watt, performance-per-dollar, getting feature-parity…

      I wasn’t the one who was being biased or had an agenda. I just told things as they are, and it’s not that pretty for AMD. As I said, the world is not fair, AMD and nVidia are not equals.
      It’s the people who desperately try to make AMD look like nVidia’s equal who are biased. But they don’t seem to understand the difference, or don’t want to admit to it, because they have an agenda.

  2. Alexandar Ž says:

    AMD’s Radeon RX 480 draws an average of 164W, which exceeds the company’s target TDP. And it gets worse. The load distribution works out in a way that has the card draw 86W through the motherboard’s PCIe slot. Not only does this exceed the 75W ceiling we typically associate with a 16-lane slot, but that 75W limit covers several rails combined and not just this one interface.

    http://www.tomshardware.com/reviews/amd-radeon-rx-480-polaris-10,4616-9.html

    All that just so they can market it as only having a single 6pin connector.
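
    To put those numbers in perspective, here is a minimal C++ sketch of the kind of sanity check you could run against the commonly cited PCIe power ceilings (roughly 66 W on the slot’s 12 V rail, ~75 W for the slot in total, 75 W for a 6-pin connector). All wattages in the code are illustrative placeholders, not measurements.

```cpp
// Sanity-check per-rail power draw against assumed PCIe CEM limits.
// Figures below are illustrative placeholders, not measured data.
#include <cstdio>

struct RailDraw {
    const char* name;
    double watts;       // average draw on this rail (W)
    double limitWatts;  // assumed spec ceiling for this rail (W)
};

int main() {
    const RailDraw rails[] = {
        { "PCIe slot (12 V)", 86.0, 66.0 },  // placeholder echoing the figure quoted above
        { "6-pin connector",  78.0, 75.0 },  // placeholder
    };

    double total = 0.0;
    for (const RailDraw& r : rails) {
        total += r.watts;
        std::printf("%-18s %6.1f W (limit %5.1f W)%s\n",
                    r.name, r.watts, r.limitWatts,
                    r.watts > r.limitWatts ? "  <-- exceeds spec" : "");
    }
    std::printf("Total board power: %.1f W\n", total);
    return 0;
}
```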

    • Scali says:

      Ouch… someone is trying a bit TOO hard to stay in the game…

    • qwerty says:

      …and “the catch” emerges. They obviously did that to make it appear more efficient than it actually is. Good heavens, an overclocked Tri-SLI system would probably melt the soldering on the mobo. lol

      • qwerty says:

        *Tri-CF, I meant. 😦

      • Scali says:

        Well, let’s think of theories why this is…
        Let’s assume they had always planned the cards to have the 6-pin connector. Changing the PCB to fit an 8-pin connector at the last moment was not an option.
        That would imply that the new 14 nm FinFET dies didn’t perform as well as AMD had hoped, and they had to push the clockspeeds further than originally planned, in order to actually get that 970-level performance. I guess pushing for the ultimate bang-for-the-buck was more important to AMD than you know… staying within PCI-e power specs.
        Sadly that means that as usual, you get what you pay for… Yes, it’s cheap, and yes it performs… But it’s also going to push your motherboard/PSU beyond their intended design parameters.
        If you also go for bang-for-the-buck motherboard and PSU, they are not likely going to be very tolerant to this.
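
        For what it’s worth, here is a back-of-the-envelope C++ sketch of that theory, using the usual first-order model for dynamic power (P ≈ C·V²·f): a late bump in clock and voltage compounds quickly. Every number in the code is an assumption for illustration, not AMD’s actual design target.

```cpp
// Rough illustration of how a late clock/voltage bump inflates board power,
// using the first-order dynamic-power model P ~ C * V^2 * f.
// Every figure here is an assumption for illustration only.
#include <cstdio>

int main() {
    const double plannedPowerW = 125.0;  // hypothetical originally planned board power (W)
    const double plannedClock  = 1120.0; // hypothetical planned clock (MHz)
    const double plannedVolt   = 1.00;   // hypothetical planned core voltage (V)

    const double shippedClock  = 1266.0; // hypothetical shipping boost clock (MHz)
    const double shippedVolt   = 1.08;   // hypothetical voltage needed to reach it (V)

    // Dynamic power scales roughly linearly with frequency and
    // quadratically with voltage.
    const double scale = (shippedClock / plannedClock) *
                         (shippedVolt / plannedVolt) * (shippedVolt / plannedVolt);

    std::printf("Planned: %.0f W, after clock/voltage bump: %.0f W (%.2fx)\n",
                plannedPowerW, plannedPowerW * scale, scale);
    return 0;
}
```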

  3. Rebrandeon says:

    http://www.tomshardware.com/reviews/amd-radeon-rx-480-polaris-10,4616-9.html

    “We skipped long-term overclocking and overvolting tests, since the Radeon RX 480’s power consumption through the PCIe slot jumped to an average of 100W, peaking at 200W. We just didn’t want to do that to our test platform.”

    http://www.hardocp.com/article/2016/05/27/from_ati_to_amd_back_journey_in_futility/

    Kyle was right after all.

    • Scali says:

      Of course he was. He’s not an idiot. Nor am I. I guess Kyle and I are alike in some ways. If we post something, we stand behind that 100%, because we know it to be true.

  4. Scali says:

    Heh, AMD fanboi damage control is running rampant on forums now… Claiming that games such as Tomb Raider aren’t ‘true’ DX12 games because they don’t use async compute and whatever…
    Yea, I get it… in the AMD universe, async compute is the *only* DX12 feature you have. nVidia and Intel, however, also support other stuff, such as conservative rasterization and rasterizer ordered views. Tomb Raider actually makes use of such features. Well, it would, if the RX480 supported them… And people linking to that AdoredTV trash on YouTube, lol! Also claiming that nVidia is holding up DX12. Oh really? As some of you know, I have posted hard evidence of NV being far more involved with the development of DX12 than AMD was.
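
    As an aside for the devs reading along, this is roughly how a D3D12 renderer would check for those features at runtime before enabling the corresponding code paths. A hedged sketch only: it assumes a Windows 10 machine with the D3D12 SDK headers, links against d3d12.lib, and skips most error handling.

```cpp
// Query a D3D12 device for the DX12_1-level features mentioned above
// (conservative rasterization, rasterizer ordered views, tiled resources).
// Sketch only: minimal error handling, link with d3d12.lib.
#include <d3d12.h>
#include <cstdio>

static void ReportFeatures(ID3D12Device* device) {
    D3D12_FEATURE_DATA_D3D12_OPTIONS options = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS,
                                           &options, sizeof(options)))) {
        std::printf("CheckFeatureSupport failed\n");
        return;
    }
    std::printf("Conservative rasterization tier: %d\n",
                static_cast<int>(options.ConservativeRasterizationTier));
    std::printf("Rasterizer ordered views:        %s\n",
                options.ROVsSupported ? "yes" : "no");
    std::printf("Tiled resources tier:            %d\n",
                static_cast<int>(options.TiledResourcesTier));
}

int main() {
    ID3D12Device* device = nullptr;
    // Default adapter, lowest feature level D3D12 accepts for device creation.
    if (SUCCEEDED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                    __uuidof(ID3D12Device),
                                    reinterpret_cast<void**>(&device)))) {
        ReportFeatures(device);
        device->Release();
    } else {
        std::printf("No D3D12 device available\n");
    }
    return 0;
}
```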

  5. Dodecahedron says:

    Point 1 was not completely unreasonable as there have been some AMD cards that consumed less than their rated power in games. Unfortunately, now this makes one of AMD’s claims about the Polaris questionable. Namely, they claimed 2.8x energy efficiency when comparing RX 470 to R9 270X. It seems that they just used the cards’ rated power consumption figures in their calculations. However, the R9 270X consumes around 120-130 Watts in most games instead of the rated 180 W. If the RX 470 behaves like the RX 480, i.e. power consumption in games is close to the rated power, the difference in energy efficiency is much less than what was promised.
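
    A quick worked example of why the baseline matters here: plug in the rated board powers and you get a number close to the marketing claim, plug in measured in-game power and the gain shrinks considerably. The performance factor and wattages below are rough placeholders based on the comment above, not benchmark results.

```cpp
// Perf/W gain computed two ways: against rated board power vs. against
// typical measured in-game power. All inputs are rough placeholders.
#include <cstdio>

int main() {
    const double perfFactor   = 1.9;   // assumed RX 470 vs R9 270X performance ratio
    const double ratedOldW    = 180.0; // R9 270X rated board power
    const double measuredOldW = 125.0; // typical measured in-game draw (per the comment)
    const double ratedNewW    = 120.0; // assumed RX 470 rated board power
    const double measuredNewW = 120.0; // assumed measured draw close to the rating

    const double gainRated    = perfFactor * (ratedOldW / ratedNewW);
    const double gainMeasured = perfFactor * (measuredOldW / measuredNewW);

    std::printf("Efficiency gain vs. rated power:    %.2fx\n", gainRated);    // ~2.85x
    std::printf("Efficiency gain vs. measured power: %.2fx\n", gainMeasured); // ~1.98x
    return 0;
}
```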

    • Scali says:

      I think it was unreasonable though… I mean, there were people making claims that it was more like 100-120W TDP in practice. That certainly is not the case. If that were true, AMD would surely have advertised with a figure well below 150W TDP, given that both the 970 and the 1070 operate in that area.
      I guess AMD made those slides before they knew about 1070. Was probably a shock to them that it was only 150W TDP. They probably thought “Well, 970 is 145W TDP, we can advertise ours as 150W TDP, and throw in a bit more performance and the 8 GB, then we’ll look good”.
      But after 1070, everyone expects a 150W TDP card to perform WAY better than 970. You’d expect something like 120W max for a 970-ish card at 14/16nm. We’ll see what 1050/1060 do soon.

      And in fact, as we’ve seen, AMD doesn’t even manage to stay within their power envelope… breaking PCI-e spec. nVidia just put 8-pin or 2×6-pin connectors on the 970 and 1070, so they don’t have the problem in the first place, even if they do overshoot it from time to time.

      • Dodecahedron says:

        Given what had been told about 14 and/or 16 nm, one might have expected even a simple Hawaii die shrink to consume well below 150 W.

  6. qwerty says:

    “I guess pushing for the ultimate bang-for-the-buck was more important to AMD than you know… staying within PCI-e power specs.”

    Thanks to their R&D cut-downs they’re reduced to being a one-trick-pony – their Fury-X overshot the thermal envelope, with a similarly low OC headroom. GCN is turning into AMD’s Netburst. The inevitable AiB cards would alleviate most of these engineering faults, but even they can’t add new features or significantly improve power usage.

    • Scali says:

      Funny… Redneckerz was ‘educating’ us on GCN only a few weeks ago, saying it was such a ‘forward looking’ architecture. Where is Redneckerz anyway? He hasn’t posted here in a while… Wonder if he regrets the stupid things he posted earlier 🙂

      • qwerty says:

        🙂 You’ll probably see him again, spouting the same nonsense under a different ID. Right now he must be very busy with his cheerleading obligations, in maximum damage control mode, “give me an ‘A’, give me an ‘M’…” lol.

      • Scali says:

        Yea, I like how what they put on the internet stays on the internet. Just like with Randy Allen and John Fruehe for example. AMD fanboys could argue all they wanted, up to the product launch and reviews. But once those were out, boy did it become painfully obvious how moronic they are 🙂 It only helps to strengthen my position. This is another one that Redneckerz can add to his list of “Everybody said A and Scali said B, but Scali was right anyway”.

      • Redneckerz says:

        Not really. This comment of yours clearly has a provocative intent, as is the case with the first paragraph of this blog and your blog The damage that AMD marketing does. (And it worked, considering I made a response, congratulations.) I am all up for taking it up to arms, but in a 1v1 scenario. Not like this, where people go harass others who have nothing to do with it, and then draw conclusions from there (as you did). It just confirms the image that you always want to ”win” an argument no matter how silly it is. That is where I draw a line.

        Conclusion on the card remains the same – Good mid-range performance which is pretty much the majority of gamers out there. The 8 GB is a nice extra.

        @ Qwerty: You are right on one thing – He would see me again, with this comment. I don’t use alternative names or IDs though, unlike the person you seem to agree with 🙂

      • Scali says:

        I am all up for taking it up to arms, but in a 1v1 scenario.

        We tried that before, remember? A discussion over private messages is even more mind-numbing than your public posts.
        You didn’t understand the first thing I tried to explain, about how games with ‘GI’ solutions without conservative raster use a lot of static, pre-baked voxelmaps to get acceptable performance, as opposed to a solution like VXGI where all geometry can be fully dynamic in realtime.
        I don’t know why you’d even want to try talking to an experienced graphics dev 1:1; you clearly lack the knowledge to have any meaningful conversation on the topic. I do know that no experienced graphics dev wants to talk to you 1:1, because it’s just them trying to explain things that fall upon deaf ears, and you crying AMD fanboyisms and personal insults.

        It just confirms the image that you always want to ”win” an argument no matter how silly it is. That is where i draw a line.

        Says the guy who specifically comes to my own blog to post dozens and dozens of walls-of-text, addressed at me personally.
        No, I don’t need to “win” arguments. What I do want is to humiliate you and destroy your online persona forever (or well, you’re doing the destruction yourself, with all the walls-of-text full of retarded/paranoid statements, I’m just enabling you). You’ve earned it, with your behaviour. You need to be taught a lesson.

  7. meido says:

    AMD squad needs to apologize to Kyle, he was right on everything.

  8. Scali says:

    I also like how AMD fanboys try to argue about price… “There’s no way they can make the 1060 as cheap as the RX480, because the next step up, the 1070 costs “…
    Well, let’s point out the obvious here:
    1) The price of the 1070 is not a fixed number. nVidia determined those prices. They can easily lower the 1070 to make the gap with 1060 smaller, if they so choose (they probably had that planned all along, and launched the 1070 and 1080 relatively high-priced simply because they could, and will gradually reduce prices once production matures. That is generally the pattern… 970/980 were a bit of an exception since they were launched on a mature production process, and there was no reason to lower prices anyway).
    2) They could also introduce extra GPUs between a 970/RX480-level card and the 1070, because the 980 currently sits in between there, performance-wise. E.g., they could make a 1060Ti that sits between the two.
    3) They could also just have the 1060 at competitive prices with RX480, the 1070 where it is now, and nothing in between. Why the heck not?

    At any rate, you can be sure that nVidia will launch a card that will compete directly against the RX480 in terms of price and performance. My guess is that they will go for the same price, but better performance.

    • Alexandar Ž says:

      I’d wager it will be a bit more expensive than the RX 480, due to current circumstances they can IMHO get away with higher prices.

      • Scali says:

        Depends on whether nVidia wants to just make a lot of money, or whether they want to take the prestige of the bang-for-the-buck crown away from AMD.
        The latter would hurt AMD more. Bang-for-the-buck is everything AMD stands for. Intel did exactly that with the Core2Duo E6600. They went straight for AMD’s jugular, and priced it at about half of what AMD charged for a similarly performing Athlon64. AMD is still feeling that one.

    • qwerty says:

      “At any rate, you can be sure that nVidia will launch a card that will compete directly against the RX480 in terms of price and performance. My guess is that they will go for the same price, but better performance.”

      Knowing Nvidia’s MO, the 1060 would be Y% better than the 480 and priced X·Y% higher, where X ≥ 1.2 :p. Well, because they can. I wonder where we would be today in terms of graphics if Nvidia had to compete against a worthier adversary, like saaay Samsung owning ATI.

      • Scali says:

        It seems nVidia is still going all-out in terms of R&D (unless AMD and other GPU devs are really that weak). So I guess the main thing would be that nVidia would find itself in a pricewar, and would have to lower prices.
        Mind you, this implies that:
        1) nVidia would have less marketshare, so lower volumes.
        2) nVidia would have less profit margin per sold unit.

        Which means that they’d have a lot less money coming in for R&D. In which case, if I’m right about nVidia actually trying their best to develop the most awesome GPUs possible, we may actually have gotten less advanced GPUs by now. Because you’d have nVidia on a lower budget, and their competitor basically trying to re-invent the same wheel as nVidia, progress would be less effective than a single company having the total budget of both.

    • “1) The price of the 1070 is not a fixed number. nVidia determined those prices. They can easily lower the 1070 to make the gap with 1060 smaller, if they so choose [snip]”
      Very true. Also, comparing prices at launch is actually pointless, because you’re basically guaranteed *not* to get a freshly-released card at MSRP. I’ve browsed a few online outlets, and in all of them the RX 480 is way more expensive than most 970 models. If you want to get 970 levels of performance, why not get a cheaper 970 instead? Maybe some people are simply impulse buyers, or just fanboys.
      And if they concede you that it’s more expensive, they’ll point to VR. As if that’s a deal-breaker in the eyes of most people.
      With that said, I believe the 480 will eventually become a nice card to upgrade an aging rig, but nothing more.
      Let’s hope that Zen CPUs are at least passable, but seeing how it’s going so far, I doubt they will be.

  9. Alexandar Ž says:

    Seems like you were right about the DX12 feature level, wiki only mentions DX 12.0

    https://en.wikipedia.org/wiki/AMD_Radeon_RX_400_series

  10. qwerty says:

    @Redneckerz

    Well, well, speak of the devil… 😀 No pearls of wisdom about the power and overclocking concerns of that good mid-range performance, or the fact that it (Polaris) is still limited to DX12.0? Or do facts not go well with your perception of reality?

    • Scali says:

      I quote: “GTX 980 levels and beyond”…
      Also, the value-for-money angle seems to fall down a bit at the moment, with retail prices being much higher than AMD’s projected prices.
      Sure, it’s still a slightly better deal than some 2-year old 970OC cards (if you don’t mind missing out on DX12_1 of course)… but only slightly. Which is rather depressing for an all-new 14 nm FinFET architecture.
      Especially since Pascal was pretty much a landslide in the market segment they launched in. Both prices and power consumption were dropped immensely. Had RX480 done that in the 970/980-bracket, then we had something to be excited about. Now we’re just waiting for nVidia to finish the job with the 1050/1060.

    • Scali says:

      Here’s your power/overclocking drama: https://m.reddit.com/r/Amd/comments/4qfwd4/rx480_fails_pcie_specification/ 🙂
      Even one of AMD’s sockpuppets is in there for damage control… heh. Dudes, you should have done that during QA at the factory.

  11. qwerty says:

    “…with retail prices being much higher than AMD’s projected prices.”

    That’s probably a consequence of demand surpassing supply (in a bad way), which isn’t unusual for AMD products.

  12. Scali says:

    By the way, I read this beautiful exchange somewhere… Some AMD fanboy was ragging on a review, one of those walls-of-text that went something like: “Your benchmarks are unfair! This game is pro-nVidia because it uses GameWorks, and in that game you used this-and-that setting, which is unfair to AMD… and blahblah, and you didn’t use game X which is more fair for AMD etc etc”.
    Some guy responded with a beautiful one-liner:
    “Well, reading your post I’m thinking, why buy AMD at all, since it doesn’t seem to handle any games properly” 🙂
    I guess there’s truth in that 🙂

    • qwerty says:

      There are a bunch of iffy online sources that just feed the kind of half-baked misinformation that the fanboys thrive on. Case in point: “PC Gamer” (the site that works with AMD for E3) gave the reference 480 a score of 94%. For the average consumer, these sources are a fairly reliable “authority”.

    • Xander says:

      No surprise there.

      • Redneckerz says:

        Will be exciting to see if they can get that in the same price bracket as the RX480. That might be quite a challenge for Nvidia; a lot of ”advised” prices end up considerably higher. Another problem would be supply and demand. As awesome as the new GTX 1070/1080’s are, they aren’t available in big numbers, and you can debate about the price of the FE.

      • Scali says:

        And we’re back to price again…

  13. dealwithit says:

    Did AMD really make the reference design/cooler that bad? They say they did it because they wanted to make the custom board makers look better. What does that say about AMD? Is this how you market your cards to your fans?

  14. Rebrandeon says:

    Graphics programmers, if you want to do really interesting next-gen work with Conservative Rasterization, Rasterizer Ordered Views & Tiled Resources Tier 3, forget about it on inferior hardware from AMD, because it simply can’t support it.

  15. qwerty says:

    bwahaha AMD finally admits, in their typically obfuscated style:

    http://www.anandtech.com/show/10465/amd-releases-statement-on-radeon-rx-480-power-consumption
    “As you know, we continuously tune our GPUs in order to maximize their performance within their given power envelopes and the speed of the memory interface, which in this case is an unprecedented 8Gbps for GDDR5. Recently, we identified select scenarios where the tuning of some RX 480 boards was not optimal. Fortunately, we can adjust the GPU’s tuning via software in order to resolve this issue. We are already testing a driver that implements a fix, and we will provide an update to the community on our progress on Tuesday (July 5, 2016).”

    I love how the blame is subtly pinned on the VRAM, and its speed is called “unprecedented”.

  16. Alex says:

    How long before AMD goes bankrupt?

    • Scali says:

      No idea, I thought the same thing when Bulldozer launched… somehow they bleed cash, but never die. I wonder though… one of the things Kyle Bennett said in his editorial on AMD-to-ATi was that the RTG wanted to split off from AMD and be bought by another company, and rumour had it that Intel was the prime candidate. If that is true (which I doubt, since Intel’s GPU department is very much up-to-date with the latest technology), then after Polaris they would probably go… “Erm, no thanks, we can do that better ourselves” 🙂

  17. Scali, it seems you’re smart and deep into these things the blog is about, so I have one question for you: is it worth buying a GTX 1080 over a GTX 1070? I mean in terms of price/performance ratio, is it worth it? I sold my old card (GTX 780 Ti), as it was already too weak for modern games, and I’m thinking of buying one of those two. Here are my specs; I use my PC mainly for gaming and I want to maintain a constant 60 fps in modern games:

    CPU: Intel Core I7 3770k
    MOTHERBOARD: MSI Z77A-G45 Gaming
    GPU: Intel HD Graphics (temporary)
    RAM: 16GB G.Skill Trident X
    STORAGE: Crucial M500 SSD 120GB + 2TB HDD + 4TB HDD
    PSU: XFX PRO1050W Black Edition (80+ Gold)
    COOLER: Thermaltake water 2.0 (liquid cooling)
    CASE: Cooler Master HAF 912 Plus
    MONITOR: Philips 298P4 29″ UltraWide
    KEYBOARD: Corsair Vengeance K70
    MOUSE: A4Tech Bloody V7
    OS: Windows 7 64bit

    Thx in advance.
