Billiam29
Senior Member
I just joined the last decade and got my first variable refresh rate monitor for eventual gaming purposes. I have a stupidly simple question that I'm having trouble finding an answer to.
The advanced display settings page in Windows of course has a refresh rate setting. It's a list of discrete values: 60 Hz, 120 Hz, 144 Hz, etc. Yet if your monitor supports adaptive refresh rates, you enable that feature in the GPU driver settings instead. In my case (NVIDIA), it's just a setting called "adaptive".
So which setting “wins” here? Does one override the other? Do the two settings apply in different circumstances?
My hunch is that the GPU "adaptive" setting only kicks in when the GPU is rendering in 3D mode, while the Windows display setting governs everything else in the normal 2D flatland of the OS and desktop apps. I haven't been able to find anything that confirms this, though.
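In case it helps anyone checking this empirically: here's a minimal sketch (assuming a Win32/C++ environment and the primary display) of how to read back the refresh rate Windows currently has applied to the desktop. Note this only reports the fixed mode from the Windows setting; whether VRR is actually engaged is driver-controlled and isn't exposed by this API, so that part would have to be verified another way, e.g. a monitor's on-screen refresh readout.

[code]
// Minimal sketch: query the display mode Windows currently has active
// for the primary monitor. This reports the fixed desktop mode only --
// it says nothing about whether adaptive/VRR is engaged.
#include <windows.h>
#include <cstdio>

int main() {
    DEVMODEW mode = {};
    mode.dmSize = sizeof(mode);

    // ENUM_CURRENT_SETTINGS asks for the mode in effect right now,
    // i.e. whatever the Windows advanced display settings applied.
    // NULL device name means the primary display.
    if (EnumDisplaySettingsW(nullptr, ENUM_CURRENT_SETTINGS, &mode)) {
        std::printf("Desktop mode: %lux%lu @ %lu Hz\n",
                    mode.dmPelsWidth, mode.dmPelsHeight,
                    mode.dmDisplayFrequency);
    } else {
        std::fprintf(stderr, "EnumDisplaySettingsW failed\n");
        return 1;
    }
    return 0;
}
[/code]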