The story of multi-monitor rendering

When I wrote the story of microsleep, it was only part of a larger story, namely that of rendering to multiple monitors. The microsleep issue was just one of the problems I ran into. Aside from that problem with the multithreading itself, I also hit two others. It truly gives you that “off the beaten path” feeling when you run into bugs one after the other. Although the OS and the APIs are designed to handle multithreaded, multi-monitor applications, it seems like the standard single-monitor setup is far more mature.

Vista is better than Windows 7

The first problem I ran into was when I wanted to have multiple windows fullscreen at the same time. I was using two separate video cards for this. First I made it work in Direct3D 9, and that seemed to go okay in Windows 7. Then I worked on the DXGI version of the code, for D3D10 and above… And no matter what I tried, I could not get both windows to go fullscreen at the same time. As soon as I set the second window to fullscreen, the first window popped back to windowed mode for no apparent reason.

I looked over my code dozens of times, re-read the DXGI documentation again and again, tried various variations… but nothing I did made it work. I looked at the MultiMon example from the DirectX SDK, but even that didn’t work. Eventually I found this thread discussing the same issue.

So apparently it is a known issue at Microsoft. In fact, I even found the bug report on their Connect website at the time, and added a comment myself. I can no longer find it though; I think they closed off that part of Connect now that they are focusing on Windows 8 and Visual Studio 11 (which make the DirectX SDK obsolete).

However, as Charybdis is very specific about DXGI 1.1, I was wondering: does that mean it works on Vista, with only DXGI 1.0? So I tested my code, and indeed: in Vista it works fine! My code was good all along. In fact, it even seems to work with the Platform Update installed (which adds DXGI 1.1 and DirectX 11 support to Vista). So by the looks of it, it is a Windows 7-specific bug. However, Microsoft never bothered to fix it, and I doubt they ever will. I can only hope that the bug is not present in Windows 8.

Why fullscreen?

For the time being, I will just need a workaround for Windows 7, much like how the microsleep is really just a workaround for the Aero problem. I have two options here:

  1. Use Direct3D 9 only
  2. Use a ‘fake’ fullscreen mode, by using a borderless window that is topmost, and maximized to fill the entire screen.

It’s obvious why option 1 is not very ideal: I would not be able to use any new functionality. For option 2, I may need to give a little more background information. What is the difference between windowed mode and fullscreen mode?

For starters, when you switch to fullscreen mode you can also control the resolution and refresh rate. So you can switch to a different screen mode than the regular desktop configuration. When your program exits (even if it crashes), the desktop settings are restored. You can implement the switching yourself, using ChangeDisplaySettingsEx(), but if your program crashes, the original settings won’t be restored, which can be rather annoying (an issue that is present in many OpenGL applications as well).

There are also various other advantages to real fullscreen. For example, you can specify the actual pixel format you want to use. Also, in fullscreen mode the front buffer and back buffer can simply be flipped, whereas in windowed mode a blt is required instead. Another advantage is that fullscreen allows you to run in ‘exclusive mode’, which means that the desktop manager is shut off altogether for your screen, removing some CPU overhead and also freeing up video memory.

So there are various reasons why you would want to use fullscreen mode rather than just faking it via windowed mode. But sadly, that is not an option in Windows 7.
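For reference, option 2 (‘fake’ fullscreen) can be sketched roughly as follows in Win32 code. This is just my illustration of the idea, not code from the actual renderer; the window handle is assumed to exist already, and error handling is omitted:

```cpp
#include <windows.h>

// Sketch of 'fake' fullscreen: strip the window border, make the window
// topmost, and size it to cover the monitor it currently occupies.
void MakeFakeFullscreen(HWND hWnd)
{
    // Find the monitor the window is on
    HMONITOR hMon = MonitorFromWindow(hWnd, MONITOR_DEFAULTTONEAREST);
    MONITORINFO mi = { sizeof(mi) };
    GetMonitorInfo(hMon, &mi);

    // Remove the border and caption
    SetWindowLongPtr(hWnd, GWL_STYLE, WS_POPUP | WS_VISIBLE);

    // Make it topmost and stretch it over the full monitor rectangle
    SetWindowPos(hWnd, HWND_TOPMOST,
                 mi.rcMonitor.left, mi.rcMonitor.top,
                 mi.rcMonitor.right - mi.rcMonitor.left,
                 mi.rcMonitor.bottom - mi.rcMonitor.top,
                 SWP_FRAMECHANGED | SWP_SHOWWINDOW);
}
```

Since this is just a regular window as far as the OS is concerned, you get none of the exclusive-mode benefits described above, but it also cannot be kicked out of ‘fullscreen’ by another window going fullscreen.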

Direct3D vs OpenGL

The other problem I ran into was when I ported the multithreaded windowing code to my OpenGL renderer. I set up a simple test case where I could instantiate as many threads/windows/renderers as I liked, each playing back the claw animation that is used in the BHM sample. In Direct3D everything ran fine with 10 windows side-by-side. Each window had its own framerate counter, so I could see that the load was balanced quite nicely by the thread scheduler:

In OpenGL however, things weren’t going that well. Not only did the framerate counters jump up and down erratically for each window, but the animations themselves were actually looking rather jerky:

This was on my machine with my GeForce GTX460. I first tested it under Windows 7. I then decided to also try Vista and XP, but they all showed the same problem. So I decided to also test it on some other machines. One had a Radeon HD5770, the other a Radeon X1900XTX. Both machines ran the application smoothly. The average framerates were lower than on my GTX460, but the animations were smoother nonetheless. So, since D3D worked fine, and the Radeons worked fine as well, I figured I could rule out the Windows scheduler as the main cause of the load balancing problems. It seemed to be specific to the nVidia OpenGL driver.

So I decided to report it to nVidia as a driver bug. I sent them a link to the program as well, which should make it easier for them to reproduce and analyze the problem. And indeed, nVidia responded that they had been able to reproduce the problem. They said they had found the cause in their drivers, and that they hoped to have a fix in a future driver.

Now that’s nice. At least one of my problems is actually going to get fixed! If only Microsoft were that committed. The other day I installed the new 301.42 drivers, and when I ran my test program again, it seemed to balance quite a bit better. I’m not sure if this driver already includes the fix for this problem, or if it’s just a coincidence that it runs better now (the changelog makes no mention of the issue). It’s still not perfect, but it’s acceptable:

This entry was posted in Direct3D, OpenGL, Software development.

18 Responses to The story of multi-monitor rendering

  1. Klimax says:

    Just a question: Did you try to reproduce Win7 bug on two Radeons?
    (There’s not much detail in the linked thread and no time to test it myself)

    • Scali says:

      No… I think I either had two GeForces (GTX460 and 7600GT) or a Radeon and a GeForce (HD5770 and 7600GT) when I tested it.
However, since it worked in Vista on the same machine, and Vista and Windows 7 use the same drivers… and the fact that this is DXGI-functionality… for the driver it’s probably no different from a single-monitor application, I think it’s pretty safe to assume it’s a Windows 7 bug, and two Radeons would suffer as well (unless you run them in CrossFire perhaps, as DXGI will see them as a single adapter).
      The guy said it’s in DXGI 1.1. I think that’s correct.

      • Klimax says:

Ok. Although I should note that the driver model is newer in W7 (WDDM 1.0 vs 1.1), so it might still be that. But how one would differentiate a bug in WDDM from one in the driver, I don’t know.
        (BTW: I can’t test it at all as I don’t have any modern Radeons. Switched it halfway through my comment in my head… )

      • Scali says:

        I don’t see how the minor differences in WDDM would have anything to do with this.
In fact, I don’t even see why multimonitor would have an effect on it. For the driver it should make no difference in the code that switches it to fullscreen. It’s mostly the stuff in the D3D/DXGI runtime and the window manager that’s different (as you should know, unlike OpenGL, the display driver does not implement the entire API, only the bare low-level functionality. The common D3D/DXGI runtimes do the rest). MS probably just handles the fullscreen status wrong, and somehow forces all other drivers out of fullscreen mode when one adapter/driver is going fullscreen. When you have only one adapter with multiple displays, it does work.
        But we want to support multiple GPUs at a time, and not necessarily just via SLI/CrossFire.

      • Klimax says:

        I did search on changes between WDDM 1.0 and 1.1 and found summary in presentation: (Direct download link)
There are some interesting slides, numbers 35-37, detailing support for multicard setups. Looks like it could still be driver-related. (Per #37: DX9 working as a fallback if at least one driver is WDDM 1.0.)

        I wonder if same thing would be observed with 560-560 or other WDDM 1.1/DX11 GPUs. (And how multiple different drivers like nVidia/AMD would affect things)

      • Scali says:

        I highly doubt it. I suppose the 7600GT could have run with the driver in WDDM 1.0-mode, since although it used the same driver as the GTX460, it is a DX9-class card. Then again, in Vista they would both be running in WDDM 1.0 mode, and it worked there. And via the D3D9 API, they also worked on Windows 7. Just not DXGI.
        As I already said, I doubt there’s much difference between WDDM 1.0 and 1.1 for resolution changes in the first place at the driver end. So it’s extremely unlikely that nVidia’s drivers work in WDDM 1.0, but not if (at least?) one of them is running in WDDM 1.1 mode.

        Not sure why you are trying so hard to pin it on the display driver. For me it’s pretty obvious that it’s DXGI-related. Microsoft said as much.
        In fact, I believe we even tried it on another system, which has two Radeon HD4850 cards.

        The slide only talks about whether the DWM runs in D3D10 or D3D9 (10level9) mode. Which I don’t think has much to do with applications. The desktop worked fine.

  2. Thanks Scali! This post and the one on usleep have been extremely helpful to me as I’m also looking to get multiple high-framerate displays rendering concurrently.

    I haven’t tried it myself yet, but I found a note on MS’s site regarding what appears to be the issue that you were having on Win7 where you couldn’t get two full screen windows at the same time:

    “The first rule applies to the creation of two or more full-screen swap chains on multiple monitors. When creating such swap chains, it is best to create all swap chains as windowed, and then to set them to full-screen. If swap chains are created in full-screen mode, the creation of a second swap chain causes a mode change to be sent to the first swap chain, which could cause termination of full-screen mode.”
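In code, the ordering that documentation describes would look something like this sketch. The helper name and the arrays of devices, outputs, and window handles are my own assumptions for illustration; error handling is omitted:

```cpp
#include <d3d11.h>
#include <dxgi.h>

// Hypothetical helper: the devices, outputs, and windows (one per
// monitor/adapter) are assumed to have been set up already.
void GoFullscreenOnAllOutputs(IDXGIFactory* pFactory,
                              IUnknown** pDevices,
                              IDXGIOutput** pOutputs,
                              HWND* hWnds,
                              IDXGISwapChain** pSwapChains,
                              int count)
{
    // First create ALL swap chains as windowed, per the documentation
    for (int i = 0; i < count; i++)
    {
        DXGI_SWAP_CHAIN_DESC desc = {};
        desc.BufferCount = 1;
        desc.BufferDesc.Format = DXGI_FORMAT_R8G8B8A8_UNORM;
        desc.BufferUsage = DXGI_USAGE_RENDER_TARGET_OUTPUT;
        desc.OutputWindow = hWnds[i];
        desc.SampleDesc.Count = 1;
        desc.Windowed = TRUE;   // do NOT create in fullscreen mode
        pFactory->CreateSwapChain(pDevices[i], &desc, &pSwapChains[i]);
    }

    // Only after every swap chain exists, switch each one to
    // fullscreen on its intended output.
    for (int i = 0; i < count; i++)
        pSwapChains[i]->SetFullscreenState(TRUE, pOutputs[i]);
}
```

The key point is that no mode change is in flight while another swap chain is still being created.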

    Please let me know if this technique solves the issue you were having. I’d also be interested to know if you found any other workarounds for Win7.

    Thanks again!

    • Scali says:

      Yes, the fullscreen thing is one issue. But that applies only to DXGI. In D3D9 it worked correctly on Windows 7, but D3D10/11 use DXGI to manage the screen buffers, and that’s where the bug seems to be. The bug is not present on Vista. I have not tested on Windows 8 yet, but I assume it has been fixed.

      And yes, faking fullscreen circumvents this issue, but it’s not ideal, so I wouldn’t quite call it a solution. But that’s what we’re doing now.

      • Yes, I see that it’s only an issue with DXGI — the article that I quoted was specific to DXGI.

But yes, now that I’ve got the code set up, it seems like I’m having a similar issue on Win7 with DXGI… My issue is that the other screen goes black rather than staying with the full screen image I’m expecting. It does not appear to be going back to windowed mode like you described, so I plan to look into it more. Win8 does appear to be working fine, though, so it looks like the bug(s) relating to full screen DXGI on multiple monitors are fixed in Win8.

        Thanks again!

  3. allenpestaluky says:

    After looking into it some more, I seem to be having a problem that is most similar to this issue: …Except when switching an output to full screen mode the opposite output, if it was already in full screen, /always/ goes black. For my issue, disabling DWM did not resolve the problem. Again, this happened on Win7 but not Win8.

Right now my best guess is that my problem occurred because I created two separate ID3D11Device instances for the same adapter (this was a mistake on my part — again, the goal of this article was to have multiple adapters and therefore multiple ID3D11Device instances).

    After thinking about it, I have realized that the goals for my project are far less complex than the goals of this article, so I will be simplifying my application to run on a single thread. I have successfully got two full screen DXGI outputs on Windows 7 using DX10 and this technique:

    Thanks again for getting me started with my prototype! Your insights were definitely helpful!

    • Scali says:

      Well… you can go either way, really.
      I create multiple devices because they can each run in their own thread simultaneously. You don’t necessarily need multiple GPUs to use this technique (although they won’t actually run in parallel when you have a single GPU I suppose).
      You can also create a single device with multiple swap chains, one for each monitor. Then you have to render from a single thread (aside from the DX11 contexts that have limited functionality for multiple threads), but you can still render to any display you want (Windows will automatically take care of copying the backbuffer to the right place btw, so things work even if you render on GPU 1, but the actual display is connected to GPU 2, at the cost of some performance of course).

      My code is the best of both worlds in this sense: I can create a thread/adapter for each GPU in the system, and attach as many swapchains as I like to each of these adapters.
      This way I can open extra windows with D3D acceleration (either windowed or fullscreen), or attach a swap chain to user controls inside a window.

      Anyway, good luck! This is a bit more of a shady side of D3D, as most people just concentrate on single windows, usually a single fullscreen window (games), and as I’ve found, in Windows 7 it is quite bugged.

  4. Pingback: Latency and stuttering with 3D acceleration | Scali's OpenBlog™

  5. jornskaa says:

    I am developing an application which needs to render in fullscreen on multiple devices using DirectX 11, and I seem unable to get multiple devices to go fullscreen at the same time (tried Windows 7 and 8).

    Did anyone test multi device fullscreen rendering using Windows 8?

    When having multiple swap chains on one device, my program works perfectly, but as soon as I add another device, the application does not remain in fullscreen on all swap chains. This is really yanking my chain!

    Any input is very welcome.

    • I believe it might have something to do with initializing your swap chains in full screen mode — you must first initialize all of your swap chains in non-full screen mode and then switch them to full screen afterwards to prevent them from switching existing swap chains out of full screen mode:

      I have multiscreen rendering working in my input lag timer project. Try this with multiple devices hooked up and see if it does what you want. Let me know if it doesn’t work, but assuming it does, you can grab the source and see what you’re doing differently.


      • jornskaa says:

        Hi Allen. Thanks for the info!

I am already waiting to go to fullscreen until after all swap chains have been created for all devices. However, the problem still remains. As soon as I go to using multiple devices (GPUs), swap chains tend to exit fullscreen mode for no apparent reason.

I tried your InputLagTimer executable, but the same issue appeared there as well. Using multiple GPUs did not work in fullscreen mode. Have you tried your software with monitors on multiple GPUs? It would be encouraging to hear if you got this working.


  6. Pingback: OpenGL Multi-Context ‘Fact’ Sheet | gooning

  7. squall86r says:

Hey Scali, sorry to necro an old blog post, but do you know of any reason that borderless fullscreen WITH vsync will result in 30fps if 2 monitors are enabled on separate cards?

I have a 1080 and 1060 driving 2 DVI panels and indeed, this occurs. I reproduced it in Guild Wars 2 (Windowed Fullscreen + Vsync) and Tomb Raider 2013 (double- and triple-buffered vsync + exclusive fullscreen ‘OFF’).

    I also see it with Planet Coaster.

Now it’s not so much a problem in Guild Wars, as I can set an fps cap in game, but Planet Coaster and Tomb Raider do not have this functionality.

    I already confirmed that going exclusive fullscreen puts fps up to 60, as does disabling the second screen on the 1060.

    • Scali says:

      I suppose it depends on what ‘vsync’ means in this context: it might try to wait for ALL devices to have reached the next vsync point, in which case if they are out-of-sync enough, you could get 30 Hz effectively, rather than 60 Hz.
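To illustrate why the rate would snap to exactly 30 rather than degrade gradually (my own simplified model, not a description of what the driver actually does): with vsync, a frame can only become visible on a vblank, so the effective rate is the refresh rate divided by the number of whole refresh intervals a frame occupies. If the extra wait for the second display pushes the effective frame time just past one interval, the rate halves:

```cpp
#include <cmath>

// With vsync, a frame becomes visible only on a vblank, so the
// effective frame rate snaps to refresh / ceil(frameTime / interval).
double effectiveFps(double frameTimeMs, double refreshHz)
{
    double intervalMs = 1000.0 / refreshHz;            // 16.67 ms at 60 Hz
    double intervalsPerFrame = std::ceil(frameTimeMs / intervalMs);
    return refreshHz / intervalsPerFrame;
}
// effectiveFps(10.0, 60.0) -> 60: frame fits in one interval
// effectiveFps(17.0, 60.0) -> 30: just over one interval, rate halves
```

In this model, the time spent blocked on the second display’s vblank would be part of the frame time, which would explain why exclusive fullscreen (or disabling the second screen) restores 60 fps.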
