That’s what I’ve been wondering… Sometimes you hear a rumour and think: “Okay, that’s obviously not very plausible, so why do people repeat it blindly?” Where do those rumours come from anyway? For example, here’s one I’ve been hearing quite often:
DirectX 10.1 is what DirectX 10 should have been. Because of limitations in nVidia’s architecture, the standard was ‘dumbed down’.
Okay, let’s go through that one slowly… DirectX 10.1 is what DirectX 10 should have been? Well, that much is possible; clearly each version of DirectX is better than the one before it, often changing and improving things based on new insights. That’s pretty obvious.
Limitations in nVidia’s architecture would have caused DirectX 10.1 to be ‘dumbed down’ to DirectX 10? Well, there’s a problem, see. nVidia’s architecture may not support the full DirectX 10.1 feature set, but it does support many of the features in it (e.g. it has no problem reaching the minimum requirement of 4xAA, and nVidia offers an extension for the main feature of DirectX 10.1, namely multisample readback, which various games actually use). So that doesn’t make sense. If nVidia’s hardware is the reason why the standard was ‘dumbed down’, then why was it dumbed down BEYOND the limitations of nVidia’s hardware?
In fact, why didn’t any OTHER hardware support the full DirectX 10.1 feature set either? AMD’s Radeon HD2900 series wasn’t a full DirectX 10.1 part either, despite being released much later than nVidia’s first DirectX 10 part. It doesn’t add up.
So where does such a rumour come from? There could be SOME truth to the statement that the standard was ‘dumbed down’… but blaming it on nVidia would be totally off the mark, as I’ve established above. If you want to make a plausible argument, you should put the blame on Intel. Their IGPs do support DirectX 10, but unlike nVidia’s parts, they support only the absolute minimum. They don’t support any AA at all, so obviously no multisample readback is possible either. Both AMD and nVidia supported at least some of the DirectX 10.1 features/requirements in their first DirectX 10 parts, so it is quite possible that the original DirectX 10 spec was closer to DirectX 10.1 than to what we know as DirectX 10 today.

Besides, it’s widely known that Microsoft also lowered the requirements for Vista because of other limited Intel hardware, in the whole “Vista Capable” debacle. So it wouldn’t be the first time that Microsoft adjusted a spec in favour of Intel’s limited hardware capabilities. Then again, Intel is the largest supplier of graphics chips. So it could be that this is the basis for the rumour, but somehow ‘Intel’ got replaced by ‘nVidia’ somewhere along the way.
Why would such a thing happen? Well, it seems AMD just has a large fanbase: people who think they’re in the know, and just love to kick nVidia for not supporting DirectX 10.1 for so long. While I agree that it wasn’t very nice of nVidia to hold back on DirectX 10.1 support, I won’t let that stop me from thinking critically about rumours instead of blindly repeating things which are obviously nonsense. But well, I’ve discussed ‘tech-savvy’ people before, haven’t I?