Some of you may have seen the actions of a user that goes by the name of Redneckerz on a recent blogpost of mine. That guy posts one wall of text after the next, full of anti-nVidia rhetoric, shameless AMD-promotion, and an endless slew of personal attacks and fallacies.
He even tries to school me on what I may or may not post on my own blog, and how I should conduct myself. Which effectively comes down to me having to post *his* opinions. I mean, really? This is a *personal* blog. Which means that it is about the topics that *I* want to discuss, and I will give *my* opinion on them. You don’t have to agree with that, and that is fine. You don’t have to visit my blog if you don’t like to read what I have to say on a given topic. In fact, I even allow people to comment on my blogs, and they are free to express their disagreements.
But there are limits. You can express your disagreements once, twice, perhaps even three times. But at some point, when I’ve already given several warnings that we are not going to ‘discuss’ this further, and to keep things on-topic, you just have to stop. If not, I will just make you stop by removing (parts of) your comments that are off-limits. After all, nobody is waiting for people to endlessly spew the same insults, and keep making the same demands. It’s just a lot of noise that prevents other people from having a pleasant discussion (and before you call me a hypocrite, I may delete the umpteenth repeat of a given post, but I left the earlier ones alone, so it’s not like I don’t allow you to express your views at all).
In fact, I think even without the insults, the endless walls of text that Redneckerz produces are annoying enough. He keeps repeating himself everywhere. And that is not just my opinion. Literally all other commenters on that item have expressed their disapproval of Redneckerz’ posting style (which is more than a little ironic, given the fact that at least part of Redneckerz’ agenda is to try and paint my posting style as annoying and unwanted).
Speaking of the feedback from other users, they also called him out on having an agenda, namely promoting AMD. Which seems highly likely, given the sheer amount of posts he fires off, and the fact that their content is solely about promoting AMD and discrediting nVidia.
The main question that arose was whether he was just a brainwashed victim of AMD’s marketing, or whether AMD was actually compensating him for the work he puts in. Now, as you can tell from the start of the ‘conversation’, this was not my first brush with Redneckerz. I had encountered him on another forum some time ago, and things went mostly the same. He attacked me in various topics where I contributed, in very much the same way as here: an endless stream of replies with walls-of-text, and poorly conceived ideas. At some point he would even respond to other people, mentioning my name and speculating what my reply would have been. However, I had not had contact with him since, and Redneckerz just came to my blog out of the blue, and started posting like a maniac here. One can only speculate what triggered him to do that at this moment (is it a coincidence that both nVidia and AMD are in the process of launching their new 16nm GPU lineups?)
Now, if Redneckerz was just a random forum user, we could leave it at that. But in fact, he is an editor for a Dutch gaming website, Gamed.nl: http://www.gamed.nl/editors/215202
That makes him a member of the press, so the plot thickens… I contacted that website, to inform them that one of their editors had run rampant on my blog and other forums, and that they might want to take action, because it’s not exactly good publicity for their site either. I got some nonsensical response about how they were not responsible for what their editors post on other sites. So I replied that this isn’t about who is responsible, but what they could do is talk some sense into him, for the benefit of us all.
Again, they were hiding behind the ‘no responsibility’-guise. So basically they support his conduct. Perhaps they are in on the same pro-AMD thing that he is, whatever that is exactly.
I’ve already talked about that before, in general, in my blog related to the release of DirectX 12. About how the general public is being played by AMD, developers and journalists. Things like Mantle, async compute, HBM, how AMD allegedly has an advantage in games because they supply console APUs and whatnot. This nonsense has become so omnipresent that people think this is actually the reality. Even though benchmarks and sales figures prove the opposite (eg, nVidia’s GTX960 and GTX970 are the most popular cards among Steam users by a margin: http://store.steampowered.com/hwsurvey/videocard/).
Just like we have to listen to people claiming Polaris is going to save AMD. Really? The writing is already on the wall: AMD’s promotional material showed us slides with two all-important bits of information:
First, we see them compare against the GeForce GTX970/980. Secondly, we see them stating a TDP of 150W. So, the performance-target will probably be between GTX970 and GTX980 (and the TFLOPS rating also indicates that ballpark). And the power envelope will be around 150W. They didn’t just put these numbers on there at random. The low-balling pricetag is also a tell-tale sign. AMD is not a charitable organization. They’re in this business to make money. They don’t sell their cards at $199 to make us happy. They sell them at $199 because they’ve done the maths and $199 will be their sweet-spot for regaining marketshare and getting enough profit. Desperately trying to keep people from buying more of those GTX960/970/980 cards until AMD gets their new cards on the market. If they had a killer architecture, they’d charge a premium because they could get away with it. nVidia should have little trouble matching that price/performance-target with their upcoming 1050/1060.
Which matches exactly with how I described the situation AMD is in: they are one ‘refresh’ behind on nVidia, architecture-wise, since they ‘skipped’ Maxwell, where nVidia concentrated on maximizing performance/watt, since they were still stuck at 28 nm. I said that it would be too risky for AMD to do the shrink to 16 nm and at the same time, also do a major architectural overhaul. So it would be unlikely for AMD to completely close the gap that nVidia had opened with Maxwell. And that appears to be what we see with Polaris. When I said it, I was accused of being overly negative towards AMD. In fact, Kyle Bennett of HardOCP said basically the same thing. And he was also met by a lot of pro-AMD people who attacked him. After AMD released their information on Polaris however, things went a bit quiet on that side. We’ll have to wait for the actual release and reviews at the end of this month, but the first signs don’t point to AMD having an answer to match Pascal.
The sad part is that it always has to go this way. You can’t say anything about AMD without tons of people attacking you. Even if it’s the truth. Remember John Fruehe? Really guys, I’m trying to do everyone a favour by giving reliable technical info, instead of marketing BS. I can do that, because I actually have a professional background in the field, and have a good hands-on understanding of CPU internals, GPU internals, rendering algorithms and APIs. Not because I’m being paid to peddle someone’s products, no matter how good or bad they are.
In fact, a lot of the comments I make aren’t so much about AMD’s products themselves, but rather about their inflated and skewed representation in the media.
You can bet that more than half of these people online that go around defending AMD to ridiculous extents have Intel/Nvidia systems themselves. These folks make asses of themselves online, hoping that the rubes will fall for their pseudo-technical babble full of tales of “draw calls” and async-compute and promises of bang-for-buck and future-proofing (whatever the hell that means), and buy AMD, so AMD can keep on providing competition for THEIR favored brands, so THEY can get better deals. Think about it. How can Nvidia have a >80% share in the dGPU market, yet on the forums, AMD fans equal or outnumber Intel/Nvidia fans in almost every single place?
On a side-note, these days the most trending topic with AMD fans seems to be “the evils of monopoly”. lol
Well, I’m not sure how those fanboy mechanics work. All I can say is that there’s also quite a vocal group of linux-users out there. Now, linux has even less marketshare than AMD, and I don’t think that driving down Microsoft’s prices is what they are going for.
By the way, even AMD uses Intel systems. They would get too CPU-limited if they stuck to their own CPUs when showing off their GPUs 🙂
The sad thing is, once upon a time, AMD actually and genuinely was the enthusiast’s choice, and for very good reasons – their hardware was awesome. Nowadays it seems to me that a certain type of person is attracted to AMD. Let’s call them a “power user” – they have just about enough knowledge for it to be a dangerous thing, but if you scratch the surface even a little it quickly becomes obvious that they have no idea what they’re talking about. So armed with this mighty lack of knowledge they read some enthusiast websites and/or magazines, see that AMD hardware tends to be recommended with typically the reason being it’s claimed to have better price/performance ratio, then get religiously attached to what should have been a pragmatic decision. Other common symptoms include thinking that clock speed is a major factor in performance, cries of “but it works with everything else!”, thinking that a PC is just a more powerful console, and accusations of being somehow anti-AMD when one just points out facts.
I’m not enough of a psychologist to analyze this further. What I do wonder however is if AMD picked up these people from ATI. I recall shenanigans from ATI fanboys in the past similar to what AMD fanboys get up to today.
Yup, and I’ve always been an enthusiast. As demoscener/graphics coder, I’ve always wanted to seek out the most awesome hardware to develop on.
My list of AMD/ATi hardware speaks volumes to the initiated, eg:
Athlon XP 1800+
This hardware was either the most awesome (as in fastest, most feature-rich etc) stuff available at the time, or else it would give me pretty much the same as the competitor, at a considerably lower price.
And I would buy AMD again, if they started making awesome hardware again.
It’s crazy though, just today I read some comment like how “AMD is refined like a Ferrari, where nVidia is raw muscle like a Hemi V8”.
And all this based on async compute of course. For which we only have one benchmark, which is optimized specifically for AMD.
Even if we were to give this advantage to AMD (which I think is too early to call, as I explained in my previous blog), they still lose on pretty much every other aspect. Certainly on things like ‘refinement’, since nVidia has much better performance/watt, and more efficient handling of complex things such as tessellation. And of course they support the DX12_1 featureset.
If anything, it’s AMD that tries to keep an outdated, unrefined architecture going on the raw power of a single feature (okay, two, if we count HBM). It nets them a few percent more performance, but still not enough to outperform nVidia, so how deluded do you have to be to make such statements? Nobody even corrects them anymore.
What happened to AMD? Their CPUs went from excellent to garbage and their GPUs are no longer competitive. Did they run out of cash?
Well said, I’m a former AMD ‘processor’ fan from <10 years ago.
AMD 64 X2 4400+ (S939) was my last AMD.
Here comes the latest AMD FUD & Lies campaign, guess they didn’t learn from Bulldozer.
They’re playing the emo-card?
So first reviews are coming in.
Didn’t you read the comments? AMD loses in the benchmarks, so it must be biased! Must be NV!
Mind you, the results are very consistent with the other leak from that Polish mag, 970OC outperforming the RX480 in many cases: https://imgur.com/a/zgHr7
Yeah I got a chuckle out of them. It’s still just one review, we’ll have a clearer picture once other venues publish theirs.
But it’s honestly rather underwhelming. I was hoping for performance on the level of a 390x/980. Matching or slightly exceeding the performance of cards one tier higher and a generation older seemed like a reasonable goal to me.
It seems to roughly match a 390, but that’s a cut-down part. At least the efficiency seems to have improved by a nice margin, although they still only managed to match Maxwell while Nvidia moved on again.
Well, it’s two reviews now. The video and the Polish magazine. We’ll see in about an hour how accurate they were. I think nVidia could do a lot of damage if they introduce their 1060 today/this week.
By the way since when is HBAO+ considered evil and a ploy to kneecap AMD?
As far as I know it performs equally on both vendors while looking a good deal better than SSAO.
Anything is evil (probably arranged by NV) when AMD loses. When AMD wins, everything is always fair, so it’s still NV’s fault.
So yep, seems to be between a 390 and 390x, a couple percent faster than a reference 970.
Funnily enough it’s currently only about 40€ cheaper than mid-range OCd 970’s where I live. Really was expecting more.
Gotta hand it to AMD, they are very consistent in not living up to the hype. Oh how different the 1070/1080 were, where nobody really knew what to expect, and nVidia pretty much blew them away *by introducing the product*. AMD instead talks up their product weeks in advance, then doesn’t actually demo it in any meaningful way, and eventually, it doesn’t even live up to the claims, and struggles to compete with two-year old hardware on way outdated manufacturing technology. I’d love to see the face of Redneckerz and other AMD cheerleaders now… who thought AMD was the technology leader, and nVidia was in trouble in the ‘new age of graphics’ that AMD had ushered in.
Oh, and once again it’s just a warmed-up GCN chip, still no sign of DX12_1.
The performance-per-watt is a complete joke, barely keeping up with Maxwell… even worse than I predicted. Pascal is completely cleaning up in that area.
So, AMD did the only thing they could do: sell this underwhelming card as cheaply as possible. Just as I said: take away price, and there’s nothing left. At the same price, a GTX970OC is a better deal (yes, the whole ‘3.5G’ thing does not appear to make a difference in performance according to these benchmarks, seems like nVidia handles this well enough in their drivers): similar performance-per-watt, similar performance-per-dollar, but with DX12_1 support, and you can enjoy things like GameWorks/G-Sync/PhysX if you like.
Do you have a source? I can’t find it in the TPU review anywhere.
It’s kinda sad when the card draws as much power as a 1080 which is twice as fast.
I haven’t found any review mentioning DX12_1, while many point out that it’s basically the same ISA as GCN1.2. If it had DX12_1, AMD would have made sure that reviewers point that out (it’s a shame you have to spoonfeed reviewers, can’t any of them just think of running DXCapsViewer themselves? It’s rather important for any new architecture).
Therefore I think it’s safe to conclude there is no support.
They do appear to have some kind of new multi-projection feature (something for foveated rendering), but no info on how it works or performs at this point.
Agreed, AMD is a step back compared to both Nvidia & Intel.
So the new 3DMark Time Spy is out.
And barely a day went past before the accusations of bias and bribery came. Even though it shows GCN deriving a bigger benefit from the holy grail of AS, it’s apparently a grave sin that Pascal gets a perf boost too with it on, albeit not as big.
Yup, the overall cluelessness combined with the misplaced arrogance and know-it-all attitude of these AMD fanboys is disgusting.
They have absolutely no idea what DX12 even is, let alone how NVidia’s and AMD’s hardware work internally, or how to write DX12 code to take advantage of this… yet they make these wild claims about how AMD can do this, and NVidia can’t do that, and blahblah.
The logic is as pathetic as: “NV gets gains from AS, must be cheating!”.
Nobody talks about FL12_1, which AMD cannot do, however (as in: they actually cannot do this, that’s a fact, see DXCapsViewer. Not just something deluded fanboys think/hope that AMD cannot do).
Statements like these also don’t help:
“which again indicates that AMD has engineered GCN around asynchronous compute more than NVIDIA has with Pascal.”
You can’t engineer a GPU around async compute. It’s the workloads that dictate which units will be used, and which units will be idle, and what kind of workloads you can run in parallel efficiently.
The fact that Pascal gets less gains in the same benchmark could be explained in a number of ways. For example, if the compute shaders run faster on Pascal than on GCN (relative to the rendering), you can run them parallel to the graphics tasks, but you’ll be gaining less, because they took less time to begin with when running sequentially.
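That reasoning can be illustrated with some simple arithmetic. A minimal sketch, using made-up timings rather than any measured data: with async compute, the compute work can hide behind the graphics work, so the frame time goes from the sum of the two to (ideally) the maximum of the two.

```python
# Toy model of async compute overlap. The millisecond figures below are
# purely illustrative, not measurements of any real GPU.

def async_gain(graphics_ms: float, compute_ms: float) -> float:
    """Relative speedup from overlapping compute with graphics.

    Sequential frame time = graphics + compute.
    With ideal overlap, frame time = max(graphics, compute).
    """
    sequential = graphics_ms + compute_ms
    overlapped = max(graphics_ms, compute_ms)
    return sequential / overlapped

# Hypothetical GPU where the compute shaders are relatively slow:
# overlapping hides a lot of work, so the relative gain is big.
slow_compute = async_gain(graphics_ms=10.0, compute_ms=5.0)   # 15/10 = 1.5x

# Hypothetical GPU that runs the same compute shaders faster: the
# sequential time was already shorter, so there is less to hide and the
# *relative* gain is smaller, even though the frame time is better overall.
fast_compute = async_gain(graphics_ms=10.0, compute_ms=2.0)   # 12/10 = 1.2x
```

So a smaller async gain on Pascal does not by itself prove anything about the hardware’s ability to overlap work; it can simply mean the compute portion was cheaper to begin with.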
They are correct about this:
“To me, and this is just a guess based on history and my talks with NVIDIA, I think there is some ability to run work asynchronously in Maxwell but it will likely never see the light of day.”
It *was* enabled… but because of AoTS running code that was (deliberately?) suboptimal for Maxwell, they took a performance hit. So first NVidia asked the devs to disable it. Then they figured: “We’ll just block it in our drivers to prevent AMD from doing this again.”
We see the same on Pascal: DOOM doesn’t run async code there either. It seems that NVidia’s strategy for async compute is to ‘whitelist’ your application. They can enable async compute, but they will only do so for applications where they know there is a performance boost.
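NVidia’s actual driver logic is of course not public, but the whitelisting idea itself is simple. A hedged sketch of how such a per-application policy could look (the application names and function are hypothetical, invented purely for illustration):

```python
# Hypothetical sketch of per-application async-compute whitelisting,
# as a driver might implement it: async is only enabled for titles
# known to benefit. These entries are placeholders, not any vendor's
# actual whitelist.

ASYNC_WHITELIST = {
    "knowngoodtitle.exe",   # placeholder: a title profiled to gain from async
}

def async_compute_enabled(app_name: str) -> bool:
    """Enable async compute only for applications on the whitelist."""
    return app_name.lower() in ASYNC_WHITELIST
```

Any title not on the list falls back to serialized execution, which is the safe default: no gain, but also no risk of a performance hit from pathological workloads.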
AMD will probably start doing the same if an ‘AoTS’ targeted at sabotaging AMD’s performance surfaces at some point.
(People think async compute is just something you ‘turn on’, and performance gains will occur magically… Not true at all, even on AMD you can get severe performance hits if you do it wrong).