Over the years, I’ve seen a pattern emerge, and I have documented it on various occasions. Such as the time when Jed Smith thought he knew about Windows security features such as ACLs. Or the time when Linus thought he had to give nVidia the finger, while in reality it was the linux kernel and Xorg that lacked the functionality required to implement a technology such as Optimus.
You see, Linus does not know what he is talking about. And he does not know that he doesn’t know what he’s talking about. That’s the classic Dunning-Kruger effect right there (just like in Jed Smith’s case). Namely, if he had just read the Optimus whitepaper from nVidia, he would understand that it requires quite a bit of support from the OS and the graphics APIs. It’s not just some feature that nVidia can switch on. It does not work on all versions of Windows either: you need Windows 7 or higher, because Windows 7 had an overhaul of the display driver model, which allows drivers from multiple vendors to be active at the same time, and allows DXGI surfaces to be shared between these drivers efficiently (the key point of Optimus is that running applications can be moved from one GPU to the other on-the-fly).
Linux/Xorg had no such interface at the time. So while there were some hacks that claimed to give you Optimus support in linux, such as Bumblebee, they really didn’t. Namely, what Bumblebee does is a really ugly brute-force solution: it starts up two X servers, one for the internal GPU, and one for the nVidia GPU. It then copies the window contents from one X server to the other. The obvious flaw here is that it’s not dynamic: an application needs to be started on the proper X server from the outset, and once started, it cannot be moved to the other. This means it cannot dynamically respond to changes in GPU load the way Optimus does, and therefore, once the X server for the nVidia GPU is started, the nVidia GPU will remain active until all programs on that X server, and the X server itself, are closed.
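To make the difference concrete, here’s a toy Python sketch of my own (obviously not actual driver code; the app attributes are invented for illustration). It models why Optimus-style dynamic assignment can power the discrete GPU down while Bumblebee’s static per-X-server assignment cannot:

```python
# Toy model: when can the discrete (nVidia) GPU be powered down?

def dgpu_active_dynamic(apps):
    # Optimus-style: apps can migrate between GPUs on-the-fly, so the
    # dGPU only needs power while some app currently demands it.
    return any(app["needs_dgpu_now"] for app in apps)

def dgpu_active_static(apps):
    # Bumblebee-style: an app is pinned to the dGPU's X server at launch;
    # the dGPU stays on as long as any such app runs, even when idle.
    return any(app["started_on_dgpu"] for app in apps)

apps = [
    # a game that was launched on the dGPU but is now idle/minimized
    {"name": "game",    "started_on_dgpu": True,  "needs_dgpu_now": False},
    {"name": "browser", "started_on_dgpu": False, "needs_dgpu_now": False},
]
print(dgpu_active_dynamic(apps))  # False: dGPU can power down
print(dgpu_active_static(apps))   # True: dGPU stays on until the game exits
```

The point of the sketch: under the static scheme, the power-saving decision is locked in at application launch, which is exactly the shortcoming described above.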
Anyway, long story short: Linus has absolutely no idea about any of this. Which is probably mostly a result of the fact that he does not know anything about Windows 7. He probably assumed that linux/Xorg could do what Windows 7 does, and therefore that Bumblebee already did what Optimus does. So he saw no need to read the Optimus whitepaper either, to find out what Optimus REALLY does, why it specifically needs Windows 7 or higher to do it, and why linux/Xorg would need to be modified to support a similar generic interface for running multiple drivers simultaneously under a single X server, sharing data efficiently and dynamically.
Jed Smith’s comments here were a similar story: clearly he only knew what ACLs looked like on the linux side, and assumed that Windows would work the same way, since he didn’t really know anything about Windows.
Anyway, the other day I had some similar encounters, when I engaged in some discussion following CryTek’s posting of a job opening for a linux developer for CryENGINE. You’d get people claiming that linux is a superior desktop/gaming OS to Windows, and the only reason why there aren’t any good games on linux is because game developers aren’t developing good games and GPU vendors aren’t developing good drivers.
Well… allow me to dip my little fly in the ointment… These people pretended to be knowledgeable on all things OSes and kernels and whatnot, so I decided to engage in a technical discussion, and point out a few things. They were quick to refer to Valve’s blog where they claim the linux/OpenGL version is faster than Windows/Direct3D. Thing is, has anyone ever verified these claims? I have never actually seen any review site testing this hypothesis, yet it is assumed to be true.
Update: I have come across two sites that tested it now, the linux gaming site Rootgamer and the Dutch Tweakers.net. Both conclude that Windows runs the Valve games faster than linux. So there you have it.
I have only ever seen evidence to the contrary. Take for example Unigine’s multi-platform benchmark Heaven. It shows that Direct3D 11 is clearly faster than OpenGL, for both nVidia and AMD hardware. It’s more difficult to find reliable sources for Windows vs linux comparisons, but everyone is free to download and try. In my experience, with an nVidia card, the linux performance gets reasonably close to the performance of OpenGL under Windows, but not close enough to match it, let alone approach the Direct3D figures. AMD fares slightly worse under linux.
The same goes for my own code. Anyone is free to download the BHM 3D skinning sample code and compile it for their machines/OSes. I’ve tried to compile and optimize for Windows, OS X, linux and FreeBSD, but Windows is by far the fastest platform for OpenGL here. The fun thing about OS X is that you can run Windows on the exact same machine with Boot Camp, and the code runs significantly faster there. A guy by the name of SpooK helped me to port/optimize this code to native OS X (rather than using X11), and he posted his results on the asmcommunity forum:
Test Box Specs:
- Dual Core Xeon @ 3GHz
- 8GB DDR3 1066MHz RAM
- GeForce GTX 460 w/ 2GB RAM
- Windows 7 Home Premium 64-bit using 32-bit Example: ~6150 FPS
- Windows 7 Home Premium 64-bit using 64-bit Example: ~6400 FPS
- Mac OS X 10.6.8 under X11/64-bit: ~3000 FPS
- Mac OS X 10.6.8 Native/64-bit: ~5400 FPS
So Windows 7 wins out over OS X on Apple hardware, using OpenGL, which is Apple’s native 3D API. I also have a Direct3D version of this claw demo, which runs even faster. Sadly I don’t have linux results for the same machine, but as I said, you’re free to compile and run the example on your machine. You’ll see that linux performs in the same ballpark as OS X, but slightly short of Windows. Drivers aren’t really the problem though, especially not on nVidia hardware. nVidia keeps their linux and FreeBSD drivers very much up-to-date with the Windows ones. The OpenGL portion is mostly a shared codebase across all OSes, which means that new features and optimizations become available on all OSes at about the same time (the version numbers are also directly comparable). The difference is mostly in the OS-specific portions (things like how granular the locking is in kernel mode, how low-latency the scheduling can be for interrupt handlers, and so on).
Anyway, it’s rather annoying that this myth of “OpenGL is faster than Direct3D” is being perpetuated without any evidence, even by Valve… but it’s getting us sidetracked somewhat. So let’s get back to this discussion I was having…
I pointed out that Windows is a more efficient desktop OS than linux because the system knows more about the desktop, and uses dynamic scheduling, temporarily boosting the priority of threads when input is received, for example. I also referenced some work that is being done on linux to try and improve the desktop responsiveness, to illustrate that it is indeed considered a problem in the linux world.
But, these people just didn’t seem to have any clue what I was talking about at all. Dynamic scheduling!? What? Linux has nice/renice! Uhhh… how is that dynamic? That’s just setting a static priority. No, what I’m talking about is documented nicely in MSDN:
The system boosts the dynamic priority of a thread to enhance its responsiveness as follows.
- When a process that uses NORMAL_PRIORITY_CLASS is brought to the foreground, the scheduler boosts the priority class of the process associated with the foreground window, so that it is greater than or equal to the priority class of any background processes. The priority class returns to its original setting when the process is no longer in the foreground.
- When a window receives input, such as timer messages, mouse messages, or keyboard input, the scheduler boosts the priority of the thread that owns the window.
- When the wait conditions for a blocked thread are satisfied, the scheduler boosts the priority of the thread. For example, when a wait operation associated with disk or keyboard I/O finishes, the thread receives a priority boost.

You can disable the priority-boosting feature by calling the SetProcessPriorityBoost or SetThreadPriorityBoost function. To determine whether this feature has been disabled, call the GetProcessPriorityBoost or GetThreadPriorityBoost function.
After raising a thread’s dynamic priority, the scheduler reduces that priority by one level each time the thread completes a time slice, until the thread drops back to its base priority. A thread’s dynamic priority is never less than its base priority.
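That boost-and-decay rule is easy to model. Here’s a toy Python sketch of my own (an illustration of the rule quoted above, not Windows’ actual scheduler code; the class and method names are invented):

```python
# Toy model of Windows' dynamic priority boosting: a boosted thread loses
# one priority level per completed time slice, until it is back at its
# base priority -- and it never drops below that base.

class Thread:
    def __init__(self, base_priority):
        self.base = base_priority
        self.dynamic = base_priority  # dynamic priority starts at base

    def boost(self, amount):
        # e.g. the thread's window just received keyboard/mouse input
        self.dynamic = self.base + amount

    def complete_time_slice(self):
        # decay one level per quantum, but never below the base priority
        if self.dynamic > self.base:
            self.dynamic -= 1

t = Thread(base_priority=8)
t.boost(4)                    # input arrives: dynamic priority jumps to 12
levels = []
for _ in range(6):
    t.complete_time_slice()
    levels.append(t.dynamic)
print(levels)                 # [11, 10, 9, 8, 8, 8]
```

Note how the priority decays back to the base level and then stays there: a responsive foreground thread gets a temporary edge without permanently starving background work. That is what makes it *dynamic*, as opposed to the static base priority that nice/renice sets.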
This gave me a distinct feeling of déjà vu, recalling the earlier Jed Smith and Linus Torvalds episodes; hence this blog post…
People think they are knowledgeable about OSes, yet they don’t even know that Windows does this?! Really? What did you think the ‘Adjust for best performance of: Programs / Background services’ radio button in Windows’ Performance Options dialog was for?
Which is quite ironic: linux people pride themselves on knowing about OSes, and they generally champion openness of technology. Well, both Optimus and the Windows priority boosting are openly documented. So why exactly do you not know of these things? And why are you so arrogant that you assume you know how things work, and give your opinion about things you clearly have no knowledge of?
In fact, many years ago, at my university, we had someone from Microsoft giving a talk on the then-new Visual Studio and Windows, and this priority boost thing was also part of his talk. And of course there were a few linux fanboys who attended the meeting and tried to ask ‘smart’ questions (trolling, basically). And of course they got put in their place easily by the Microsoft guy, because these linux guys just *thought* that they knew something. The Microsoft guy actually understood the internals of both Windows and its competitors, including linux, so you would not trip him up on a technicality. He came well-prepared. It was painful to watch, in a way.
But well, if even the main linux kernel developer is like that, things look bleak, very bleak. I’ve had this thought for a while, but now I’ll just flat-out say it: Linux is the Dunning-Kruger OS. It’s the OS of choice for people who have *just* enough knowledge about technology to become dangerous. If they REALLY knew about OSes, I don’t think they would be so quick to pick linux as their OS of choice. Namely, a lot of OSes have a lot going for them. If it is indeed a UNIX-like OS that you are after, then OS X and FreeBSD are very interesting alternatives to linux, definitely worth checking out. Chances are they are better at some of the very things you picked linux for, *thinking* it was better at them than Windows, which to you meant it was the best, since you were completely oblivious to any alternatives.
As for Windows vs linux… The above has shown that Windows is still the OS to beat when it comes to graphics/gaming. And Direct3D is still outperforming OpenGL. And if you’ve been using Windows as a serious desktop system, you were probably already aware of its rather aggressive scheduling of applications, which makes the system very responsive. I have recently gotten some Mac Mini G4 machines, at 1.42 GHz, with OS X 10.5.8 on them. And I quickly got annoyed by how unresponsive they became when you tried to do some serious stuff. Clearly OS X is not as good as Windows when it comes to priority boosts and such. I’ve used Windows on much slower single-core systems than a G4 1.42 GHz, but the responsiveness was certainly better than on the G4s. My experiences with desktop linux and FreeBSD were much the same as OS X: the OS does not seem to put a lot of effort into making the GUI responsive. As they say: you don’t know what you’ve got until it’s gone.
Having said that, you may actually have valid reasons for using linux. It’s possible. All OSes have their weaknesses, but they also have their strengths. The point here is just: Do you actually have these valid reasons? Or did you just assume that you did? If you can argue why linux is the best choice for you, and you can prove that with logic, technical facts and empirical evidence, then yes, more power to you. But if you just use linux because you heard that it’s better than Windows… Then you’re just fooling yourself. In which case, it’s high time that you start studying the technology, and learn some critical thinking. Then make your own decision of which OS you should or shouldn’t use.
And until then, don’t bother others with your ‘advice’ on OSes either. I get linux and OpenGL fanboys telling me the same nonsense on a daily basis. Really… stop wasting my time. I mean, if I say something about Direct3D, don’t assume that I’ve never used OpenGL, because I have. I use both. I just pick Direct3D on Windows because I have the choice, and I have investigated both options. So I have my reasons. Likewise, I pick Windows as a main platform for various desktop tasks because I have investigated the alternatives, and I have my reasons. But I do still develop for OS X, FreeBSD and linux as well. Just because they are not my primary choice doesn’t mean I ignore them completely, let alone that I am completely ignorant about them. Which you might well be. Think about it. And don’t wear your OS like a crown. Too many linux users feel the need to mention that they use linux at every possible occasion, no matter how irrelevant. Using linux does not make you cool, smart, or whatever else you may think. Nobody cares.
Also, there’s a difference between “Windows is faster at graphics/games” and “you can’t do games on OS X/linux”. Clearly you can do games on those platforms; they just run with somewhat less performance.