Richard Huddy responds again

Richard Huddy decided to send me a real reply via email again, rather than just threats. I’m not quite sure why, but it seems Andrew Copland’s reply on the previous blog post had something to do with it. I will quote it in full, and comment on some of the sections.

You just don’t seem to get it.

Indeed, which is why we’re waiting for a technical explanation of how your suggestions are going to work.

My article on ‘Why DX needs to go away’ was fuelled by _some_ developers who argue that it’s time for the API and the driver to go away.
In that sense I did not express my own opinion, I expressed a view that I’ve heard from an increasing number (but still only a small minority) of developers.  That’s why quoting Johan is relevant. Your criticism was essentially that no sane person could possibly support my words.  Because that is your stated position it seems to me that giving a single public example is therefore enough to prove you wrong.
[However, without justification beyond your own prejudice, you then go on to interpret Johan’s words to conclude that he did not mean what he said.]

Yes, my criticism is indeed that no sane person could support your words, and I pointed out various specific problems with what you are suggesting. I assume that Johan Andersson is a sane person, therefore he will not support you. Speaking of prejudice… My justification goes well beyond that. I have spoken to various developers on such issues over the years, and none of them support what you’re saying. We have Andrew Copland commenting here on this blog directly. We had Michael Glueck stating the same even in the original interview piece. And we have the various developers commenting at Beyond3D. They all bring up the same concerns.

Now, those are all direct quotes. You however, have no direct quotes whatsoever. Johan Andersson refuses to comment, and the post he made on Beyond3D is circumstantial evidence at best. Firstly, he does not literally say that we should abandon the API. That is just your interpretation. Secondly, he is speaking of a process of many years. In short, his statements are very different from yours, and certainly do not back you up all the way. Since Andersson refuses to make any direct comments to back you up, you’re basically on your own here.

Aside from that, as mentioned earlier: finding developers who back you up isn’t the real problem here. Unlike you, I’m not a talking head, so for me this is not a he-said, she-said situation. I’m a very experienced developer who lived through the days of API-less and vendor-specific API programming, and I have laid a number of technical issues at your feet. You need to present solutions first, before we sane developers can back you up. Without solutions to these obvious problems, your suggestions remain nonsensical.

You and I both know that _most_ developers benefit from an API and driver. I’m happy to accept that Andrew Copland falls into that group.  There is room for diversity of opinion here…  But I think you’d be surprised at how competent the vocal minority is that wants the API to go away.  They may be few, but they are justifiably influential.
So…
Finding individual developers who want to retain APIs is irrelevant to all this.  Of course there are some, indeed there are many.  No surprise there. Remember that I only claim that ‘some important developers want DX and drivers to go away’.  I made no claim to be speaking on behalf of everyone.
One related thing you might want to consider is “What was the appeal of Larabee [sic] to developers?”.
For many it was that the new h/w would be able to escape the limitations of the API with Intel writing the drivers.  That would mean they might get access to a ray-tracing renderer etc.  But there are other developers too who saw other benefits.
Clearly for some of them it would be the fact that they can code in C, C++, or even assembler and write their own rasterizers etc.  These are the kind of people who want the API to go away.

Certainly there are developers who could do without an API. These are the same developers who would use Cuda, for example, despite it locking out all non-nVidia hardware. The point is that these people can afford to do so, because they develop software that only needs to run on their own machines. Do they develop for Cuda? No problem, they’ll just buy only nVidia cards. Do they develop for Larrabee? Fine, just get Larrabee systems for everyone. This is also why consoles work: you know everyone bought the exact same hardware from the same vendor. There is no need to abstract away differences, because the differences aren’t there in the first place.

Game developers such as Johan Andersson do not have that luxury. They cannot release games that only work on one type of hardware (with no guaranteed backward or forward compatibility either). This is also the reason why OpenGL dropped from mainstream usage years ago: OpenGL games had serious compatibility issues, and developers often had to put in separate rendering paths built on vendor-specific extensions. Some would argue that OpenGL’s power lay in those vendor-specific extensions, giving developers more direct access to the hardware, but at the same time they were its Achilles’ heel. Now, I notice that Johan Andersson does not use OpenGL for the Frostbite engine. That is interesting. Why doesn’t he use OpenGL when it already offers at least some of the benefits that your suggestion of API-less programming offers? That is highly suspect…
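To make that concrete, here is a minimal sketch of the kind of vendor-specific forking an OpenGL renderer of that era typically had to do. It assumes a current OpenGL context; the extension names are real historical ones, but the render-path setup functions are hypothetical placeholders standing in for whatever the engine would actually do.

```cpp
// Sketch only: vendor-specific render path selection in classic OpenGL.
// Assumes an OpenGL context is already current on this thread.
#include <GL/gl.h>
#include <cstring>

static bool HasExtension(const char* name)
{
    // The classic (pre-GL 3.0) way of querying the extension string.
    const char* ext = reinterpret_cast<const char*>(glGetString(GL_EXTENSIONS));
    return ext != nullptr && std::strstr(ext, name) != nullptr;
}

// Hypothetical placeholders for the engine's actual setup code.
static void SetupNvidiaPath()        { /* per-pixel shading via the NV extension */ }
static void SetupAtiPath()           { /* the ATI-only equivalent */ }
static void SetupFixedFunctionPath() { /* lowest-common-denominator fallback */ }

void SelectRenderPath()
{
    if (HasExtension("GL_NV_register_combiners"))
        SetupNvidiaPath();           // nVidia-specific path
    else if (HasExtension("GL_ATI_fragment_shader"))
        SetupAtiPath();              // ATI-specific path
    else
        SetupFixedFunctionPath();    // works everywhere, does the least
}
```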

I think it has a lot to do with the fact that Direct3D gives you shorter development times while at the same time supporting a larger range of hardware. This means that the games developed with Andersson’s engine will actually work on a large range of gamer machines, and gamers will actually buy the games. I don’t think the market will accept games that only run on a small subset of hardware for no good reason (the recent Rage release was already pathetic in that respect, and that’s a game that still uses a hardware abstraction layer. It required a slew of driver updates to start working on AMD hardware. What if there was no driver? Just buy a new videocard? If even a major, nay, legendary developer like that can’t pull it off, who can?). In the not-so-distant past, people weren’t very happy that they had to throw out all their Glide games after 3DFX went belly-up, and none of the available videocard upgrades could run their Glide games anymore.

Likewise, look at PhysX… Not many people buy an nVidia card specifically to enjoy the accelerated PhysX add-ons in a game. And no developer dares to make a game that requires PhysX acceleration, because it would limit their market too much. So they at least put in a fallback for non-accelerated PhysX. People don’t buy hardware to suit their games, nor do they buy games to suit their hardware. They just buy the games they like, and thanks to hardware abstraction and fallback paths, things will just work.
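For completeness, here is a hypothetical sketch of that fallback pattern: the game checks at startup whether accelerated physics is available and silently drops down to a software path otherwise. All of the names are invented for illustration; this is not the actual PhysX API.

```cpp
// Sketch only: choosing between accelerated and software physics at startup.
#include <memory>

struct PhysicsBackend {
    virtual ~PhysicsBackend() = default;
    virtual void Simulate(float dt) = 0;
};

struct GpuPhysics : PhysicsBackend {
    void Simulate(float /*dt*/) override { /* hardware-accelerated path */ }
};

struct CpuPhysics : PhysicsBackend {
    void Simulate(float /*dt*/) override { /* software fallback path */ }
};

// Stand-in for a real capability query ("is suitable acceleration present?").
static bool GpuPhysicsAvailable() { return false; }

std::unique_ptr<PhysicsBackend> CreatePhysics()
{
    // The player never has to care which path was picked; the game just works.
    if (GpuPhysicsAvailable())
        return std::make_unique<GpuPhysics>();
    return std::make_unique<CpuPhysics>();
}

int main()
{
    auto physics = CreatePhysics();
    physics->Simulate(0.016f);   // one frame's worth of simulation
}
```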

So I don’t think Andersson wants to go to an API-less world. He doesn’t even want to go to a vendor-specific extension world. He just wants a thin, efficient hardware abstraction layer.

So, can *some* developers live without hardware abstraction? Sure, I never denied that. Can PC game developers live without hardware abstraction? No. That is a more suicidal move than choosing OpenGL over Direct3D. I don’t see any sane developer making that choice.

You are invited to post this entire unedited email as a comment upon your blog.  I imagine you will choose not to do so.  Note also that I am sending this on my own behalf, not on behalf of my employer.

Don Ricardo

Capo di Tutti Capi

Of course I will choose to do so! I’ve made a new blog post out of it though, for better readability.

So, once again, you have not given us any answers. How are you going to make our software work on hardware from multiple vendors, past, present and future?

You see, I think I speak for all developers when I say that we would LOVE to support what you’re saying. We all want faster software, more powerful hardware, and all that. The thing is… unless we have overlooked something here, your suggestions are just not going to work in practice.

One big part is settling on a single instruction set. As I replied to Andrew Copland on the previous blog post, even nVidia uses an intermediate language (PTX) for Cuda, and Cuda is only meant to run on their own GPUs. That would indicate that nVidia is not planning on settling on a single instruction set even for their own products, let alone that we could get nVidia and AMD to agree on using the same instruction set. The instruction set is almost like the goose that lays the golden eggs for a GPU vendor. Why give that away to the competitor? Look at the recent developments where AMD designed their Graphics Core Next architecture to work much like nVidia’s, abandoning their old VLIW-based instruction set completely. It doesn’t look like either of the two major GPU vendors has any plans of settling on a common instruction set anytime soon.
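To illustrate why an intermediate language is so useful to a vendor, here is a toy sketch (not PTX or any real ISA; all names are invented) of how a stable intermediate representation lets different back ends, or different hardware generations, consume the same program without the application ever changing.

```cpp
// Toy illustration: a stable intermediate representation (IR) decouples the
// program from whatever native instruction set today's chip happens to use.
#include <cstdio>
#include <vector>

enum class IrOp { Load, Add, Mul, Store };   // stable, abstract opcodes

void CompileForVendorA(const std::vector<IrOp>& ir)
{
    // Vendor A's back end maps the IR onto its current native encoding.
    for (IrOp op : ir)
        std::printf("A-native op %d\n", static_cast<int>(op));
}

void CompileForVendorB(const std::vector<IrOp>& ir)
{
    // Vendor B (or next year's chip from vendor A) uses a completely
    // different native encoding, but consumes the exact same IR.
    for (IrOp op : ir)
        std::printf("B-native op %d\n", static_cast<int>(op));
}

int main()
{
    const std::vector<IrOp> shader = { IrOp::Load, IrOp::Mul, IrOp::Add, IrOp::Store };
    CompileForVendorA(shader);   // same program...
    CompileForVendorB(shader);   // ...two very different targets
}
```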
