Updated: GeForce cards mysteriously appear to play nice with TR's FreeSync monitors
Update 9/30/18 3:22 AM: After further research and the collection of more high-speed camera footage from our G-Sync displays, I’m confident the tear-free gameplay we’re experiencing on our FreeSync displays in combination with GeForces is a consequence of Windows 10’s Desktop Window Manager adding its own form of Vsync to the proceedings when games are in borderless windowed mode, rather than any form of VESA Adaptive-Sync being engaged with our GeForce cards. Pending a response from Nvidia as to just what we’re experiencing, I’d warn against drawing any conclusions from our observations at this time, and I sincerely apologize for the misleading statements in our original article. The original piece continues below for posterity.
It all started with a red light. You see, the primary FreeSync display in the TR labs, an Eizo Foris FS2735, has a handy multi-color power LED that flips over to red when a FreeSync-compatible graphics card is connected. I was setting up a test rig today for reasons unrelated to graphics-card testing, and in the process, I grabbed our GeForce RTX 2080 Ti Founders Edition without a second thought, dropped it into a PCIe slot, and hooked it up to that monitor.
The red light came on.
Some things are just not supposed to happen in life, like the sun circling the earth, people calling espresso “expresso,” and FreeSync monitors working in concert with Nvidia graphics cards. I’ve used GeForce cards with that Eizo display in the past as the occasion demanded, but I can’t recall ever seeing the monitor showing anything other than its white default indicator with the green team’s cards pushing pixels.
At that point, I got real curious. I fired up Rise of the Tomb Raider and found myself walking through the game’s Geothermal Valley level with nary a tear to be seen. After I recovered from my shock at that sight, I started poking and prodding at the game’s settings menu to see whether anything in there had any effect on what I was seeing.
Somewhere along the way, I discovered that toggling the game between exclusive fullscreen and non-exclusive fullscreen modes (or borderless window mode, as some games call it) occasionally caused the display to fall back into its non-variable-refresh-rate (VRR) default state, as indicated by the LED’s transition from red to white. That color change didn’t always happen, but I always noticed tearing with exclusive fullscreen mode enabled in the games I tried, while non-exclusive fullscreen mode seemed to reliably enable whatever VRR mojo I had uncovered. That trick seemed to work in other games, too.
Next, I pulled up my iPhone’s 240-FPS slow-mo mode and grabbed some footage of Deus Ex: Mankind Divided running on the RTX 2080 Ti while it was connected to the Eizo monitor. You can sort of see from the borderless windowed mode video that frames are arriving at different times, but that motion is advancing an entire frame at a time, while the exclusive-fullscreen mode shows the tearing and uneven advancement that we expect from a game running with any kind of Vsync off.
Now that we had a little bit of control over the behavior of our Nvidia cards with our Eizo display, I set about trying to figure out just what variable or variables were allowing us to break through the walls of Nvidia’s VRR garden beyond our choice of fullscreen modes.
Was it our choice of monitor? I have an LG 27MU67-B in the TR labs for 4K testing, and that monitor supports FreeSync, as well. Shockingly enough, so long as I was able to keep the RTX 2080 Ti within its 40-Hz-to-60-Hz FreeSync range, the LG display did the VRR dance just as well as the Eizo. You can see the evidence in the slow-motion videos above, much more clearly than with the Eizo display. While those videos only capture a portion of the screen, they accurately convey the frame-delivery experience I saw. I carefully confirmed that there wasn’t a visible tear line elsewhere on the screen, too.
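To put numbers on that constraint: a 40-Hz-to-60-Hz FreeSync range means every frame has to arrive within a corresponding frame-time window, which is just the reciprocal of the refresh bounds. A quick sketch (the function name is ours, purely illustrative):

```python
def frametime_bounds_ms(min_hz, max_hz):
    """Frame-time window (in ms) that a VRR display can track.

    The fastest allowed frame corresponds to the display's maximum
    refresh rate, the slowest to its minimum.
    """
    return 1000.0 / max_hz, 1000.0 / min_hz

lo, hi = frametime_bounds_ms(40, 60)
print(f"Frames must arrive every {lo:.1f} to {hi:.1f} ms")
# For the LG 27MU67-B's 40-60 Hz range: roughly 16.7 to 25.0 ms
```

Drop below 40 fps and the display falls out of its VRR range, which is why keeping the RTX 2080 Ti inside that band mattered for the test.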
Was it a Turing-specific oversight? The same trick worked with the RTX 2080, too, so it wasn’t just an RTX 2080 Ti thing. I pulled out one of our GTX 1080 Ti Founders Editions and hooked it up to the Eizo display. The red light flipped on, and I was able to enjoy the same tear-free, variable-refresh-rate experience I had been surprised to see from our Turing cards. Another jaw-dropping revelation on its own, but one that didn’t get me any closer to understanding what was happening. That card worked fine with the LG display, too.
Was it a matter of Founders Editions versus partner cards? I have a Gigabyte RTX 2080 Gaming OC 8G in the labs for testing, and I hooked it up to the Eizo display. On came the red light.
Was it something about our test motherboard? I pulled our RTX 2080 Ti out of the first motherboard I chose and put it to work on the Z370 test rig we just finished using for our Turing reviews. The card happily fed frames to the Eizo display as they percolated through the pipeline. Another strike.
Was Windows forcing Vsync on thanks to our choice of non-exclusive fullscreen mode? I pulled out my frame-time-gathering tools and collected some data with DXMD running free and in its double- and triple-buffered modes to find out. If Windows was somehow forcing Vsync, I would have seen frame times cluster around the 16.7-ms and 33.3-ms marks, rather than falling wherever.
Our graphs tell the opposite tale, though. Frame delivery was apparently happening normally while Vsync was off, and our Vsync graphs show the expected groupings of frame times around the 16.7-ms and 33.3-ms marks (along with a few more troublesome outliers). Doesn’t seem like forced Vsync is the reason for the tear-free frame delivery we were seeing.
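The test described above boils down to a simple classification: if Vsync were being forced, frame times would snap to multiples of the 60-Hz refresh interval. A minimal sketch of that check (function name and tolerance are our own choices, not part of any frame-time tool):

```python
def vsync_fraction(frame_times_ms, refresh_ms=1000 / 60, tol=1.0):
    """Fraction of frame times landing within tol ms of a multiple
    of the refresh interval (16.7 ms, 33.3 ms, ...)."""
    hits = sum(
        1 for t in frame_times_ms
        if abs(t - refresh_ms * round(t / refresh_ms)) <= tol
    )
    return hits / len(frame_times_ms)

# Vsync off: frame times fall wherever the GPU finishes them.
free_running = [12.4, 19.8, 14.1, 22.6, 18.3]
# Vsync on: frame times cluster at 16.7 ms and 33.3 ms.
synced = [16.7, 16.6, 33.4, 16.8, 33.3]
print(vsync_fraction(free_running), vsync_fraction(synced))
```

A free-running trace scores near zero; a Vsync-quantized trace scores near one, which is the pattern our graphs did and didn't show, respectively.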
Update: Some reasoning about what we’re seeing underlines why the above line of thought was incorrect. If the Desktop Window Manager itself is performing a form of Vsync, as Microsoft says it does, we probably wouldn’t see the results of that quantization in our application-specific frame-time graphs for games running in borderless windowed mode. The DWM compositor itself would be the place to look, and we don’t generally set up our tools to catch that data. The application can presumably render as fast as it wants behind the scenes (which is why frame rates don’t appear to be capped in borderless windowed mode, another source of confusion as we were putting together this article), while the compositor does the job of selecting which frames are displayed and when.
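A toy model makes the distinction concrete. This is emphatically not how DWM is implemented, just a sketch of the idea: the game finishes frames on its own schedule, while the compositor only flips the newest finished frame at each fixed refresh tick. The app-side frame times look uncapped even though the output is quantized.

```python
def compositor_present(render_done_ms, refresh_ms=1000 / 60, n_ticks=6):
    """Toy compositor: at each fixed refresh tick, show the most
    recently completed frame. App render cadence is untouched."""
    presented = []
    for i in range(n_ticks):
        tick = i * refresh_ms
        ready = [r for r in render_done_ms if r <= tick]
        if ready:
            presented.append((round(tick, 1), max(ready)))
    return presented

# App renders a frame every 7 ms (~143 fps): 14 frames in ~100 ms.
renders = [7 * i for i in range(1, 15)]
shown = compositor_present(renders)
# Only one frame per 16.7-ms tick reaches the screen; the rest are dropped.
```

In this model the game produced 14 frames but only 5 were displayed, so an application-side frame-time graph would show no 16.7-ms clustering at all, exactly the confound described above.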
We didn’t try to isolate drivers in our excitement at this apparent discovery, but our test systems were using the latest 411.70 release direct from Nvidia’s website. I can’t guarantee that this trick will work with older versions of Nvidia’s drivers, with every FreeSync display, with every game, or that it will work for you at all. We did install GeForce Experience and leave all other settings at their defaults, including those for Nvidia’s in-game overlay, which was enabled. The other constants in our setup were DisplayPort cables and the use of exclusive versus non-exclusive (or borderless windowed) modes in-game. Our test systems’ versions of Windows 10 were fully updated as of this afternoon, too.
Ultimately, I have no idea what’s going on here, but I’m pretty sure it’s not supposed to happen. I fully expect that its root cause will be found and patched out shortly, presuming this is a thing to begin with and we’re not totally barking up a nonexistent tree (ed. – as we did). My best guess is that involving Windows 10’s Desktop Window Manager by using non-exclusive fullscreen mode in the games we tested is somehow triggering VRR (ed. – actually, a form of regular Vsync) with our FreeSync monitors. We’ve asked Nvidia for comment on this story and we’ll update it if we hear back.