Does a Higher Refresh Rate Use More GPU?


In recent years, the gaming industry has undergone rapid technological innovation that has significantly improved the gaming experience. In my opinion, one of the most significant changes is the rise of higher monitor refresh rates, which have become a must-have feature for gamers.

However, faster refresh rates raise the question of whether they consume more GPU power. In this post, I will discuss whether higher refresh rates affect GPU usage and whether investing in a high-refresh-rate gaming monitor is worthwhile.

How Is Refresh Rate Related to GPU?

Monitor refresh rate is not directly related to anything a GPU does.

Out of curiosity, I checked what people are saying about this on the forums, Reddit, and Quora, and honestly, I was shocked at how many people don’t understand refresh rate versus fps.

A monitor is just a screen like any other. It displays whatever image data arrives over the cable and doesn’t change that data.

In other words, the GPU and the monitor don’t interact in any way except to communicate what can and can’t be displayed.

Refresh Rate Vs. Frames Per Second

The monitor’s refresh rate is the number of times per second that the image on the screen can be updated. It is an inherent feature of the monitor and is commonly measured in Hertz (Hz). For example, a display with a refresh rate of 60Hz can update its image 60 times per second.

When watching moving objects on the screen, a higher refresh rate means less blur during fast movements, which results in a more fluid, more pleasing image.

On the other hand, the fps a GPU can produce is the number of frames the graphics card can render and deliver to the monitor in one second. It is determined by the GPU’s performance and the complexity of the graphics being rendered.

For example, a GPU may be capable of producing 100 frames per second in a less demanding game but only 60 frames per second in a more demanding one. Higher fps in 3D-rendered graphics results in pleasing fluidity of the moving objects on screen. Low fps, on the other hand, results in stuttering, unlike the blur we see with a low refresh rate.

A monitor’s refresh rate could be a million Hz, and it would mean nothing if your hardware only sends 60 frames per second (fps); you might as well have a 60Hz monitor.

But as you will see next, the answer to our main question isn’t as simple as saying that the refresh rate doesn’t use more GPU while fps does.

Does Higher Refresh Rate Use More GPU in Any Way?

First, to expand on the earlier point, the monitor and the GPU communicate only to tell each other what can and can’t be displayed.

Operating systems like Windows 10 and 11 (and even older ones) automatically detect the monitor’s refresh rate and render the desktop at that rate.

That does put a little more strain on the GPU by using slightly more of its performance. However, the difference is so marginal that it isn’t worth talking about: even old GPUs can easily render 2D desktop graphics at several hundred frames per second, and most of that content is static images anyway.

This discussion only becomes relevant with 3D-rendered content, such as video games, where producing more fps requires more GPU performance.

A higher refresh rate doesn’t use more GPU directly. However, it can require more GPU performance indirectly, because you need more fps to match the refresh rate. Otherwise, the monitor can’t benefit from its high refresh rate, and the mismatch between fps and refresh rate can cause several visual problems.

In other words, if you have a 120Hz refresh rate and play a video game at only 60 fps, every other refresh simply redraws the same frame, effectively doing nothing new.

The same logic works the other way around. For example, when running a game at 120 fps on a 60Hz monitor, half of those frames are never displayed, effectively showing only 60 fps.
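
To put rough numbers on this, here is a small illustrative Python sketch (my own back-of-the-envelope example, not tied to any real game or driver) that counts how many refreshes and frames go to waste in both mismatched scenarios, assuming perfectly even frame pacing:

```python
# Illustrative only: how many refresh cycles show new information when
# fps and refresh rate don't match (assuming perfectly even pacing).

def effective_new_images_per_second(fps: int, refresh_hz: int) -> int:
    # The screen can only show as many unique frames per second as the
    # slower of the two rates allows.
    return min(fps, refresh_hz)

for fps, hz in [(60, 120), (120, 60), (144, 144)]:
    shown = effective_new_images_per_second(fps, hz)
    redundant_refreshes = max(hz - fps, 0)  # refreshes that just redraw an old frame
    dropped_frames = max(fps - hz, 0)       # rendered frames that are never shown
    print(f"{fps} fps on a {hz}Hz monitor -> {shown} new images/s, "
          f"{redundant_refreshes} redundant refreshes/s, {dropped_frames} dropped frames/s")
```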

Considering all that, a higher refresh rate uses more GPU, but only if fps matches the refresh rate. And it should match because, as we will discuss next, there are a few annoying display issues when fps and refresh rates don’t match.

Is It Bad to Have a Higher Refresh Rate Than Fps?

In my experience, nothing bad happens if the refresh rate is a multiple of the frame rate. For example, if your fps is 60 and your refresh rate is 120, each frame is simply shown for two refresh cycles in a row, which looks the same as watching 60 fps on a 60Hz display.

Assume you have a video file encoded at 30 frames per second. Re-encoding it at 60 fps makes no difference, and re-encoding it at 120 fps makes no difference either; you simply end up with several copies of each frame.

If the refresh rate isn’t a multiple of the frame rate, you’ll notice uneven, slightly stuttery motion because frames stay on screen for varying lengths of time. This is often called judder, though people sometimes lump it in with “input lag.”
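
Here is a quick illustrative sketch of that uneven pacing, assuming a hypothetical 45 fps game on a 60Hz monitor with perfectly regular frame delivery; the numbers are mine, not from any specific title:

```python
# Illustrative only: how long each frame stays on screen when the refresh
# rate (60Hz) is not a multiple of the frame rate (45 fps in this example).
from fractions import Fraction
from math import ceil

REFRESH_HZ = 60
FPS = 45

refresh = Fraction(1, REFRESH_HZ)   # ~16.7 ms per refresh
frame = Fraction(1, FPS)            # ~22.2 ms per rendered frame

# Each rendered frame becomes visible at the first refresh after it is ready.
appear = [ceil((i * frame) / refresh) * refresh for i in range(10)]

# How long each frame is held on screen before the next one replaces it.
holds_ms = [float((b - a) * 1000) for a, b in zip(appear, appear[1:])]
print([round(h, 1) for h in holds_ms])
# -> alternates between ~33.3 ms and ~16.7 ms holds: the uneven pacing you
#    perceive as judder.
```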

You can reduce this visual lag if the fps is at least stable (or locked). However, there are technologies that remove it entirely: G-Sync and FreeSync.

G-Sync (Nvidia) and FreeSync (AMD) are similar technologies that, among other things, actively adjust the monitor’s refresh rate to keep it in sync with the fps. They are built into newer gaming monitors; some monitors support both, and some support only one of them.

If you are buying a high-refresh-rate monitor, I recommend one that supports both G-Sync and FreeSync, as these technologies aren’t perfected yet. I primarily use G-Sync because I found it better optimized for my setup, but in some cases FreeSync works better, and I can switch between the two in a matter of seconds.

Is It Bad to Have Higher Fps Than Refresh Rate?

Having a higher fps than refresh rate often results in screen tearing, which looks as if someone split the image horizontally.

Tearing occurs when the GPU sends frames faster than the monitor can display them. Monitors refresh at a steady rate, whereas graphics cards do not deliver frames at a steady rate.

A frame doesn’t appear on the monitor all at once; the image is drawn progressively. If the GPU delivers a new frame while the monitor is still drawing the previous one, the monitor finishes the refresh using the most recent data from the GPU.

The result is a single refresh that combines data from two different frames, and since the screen is drawn from top to bottom, the tear line is always horizontal.
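
As a rough mental model (purely illustrative; real display hardware and drivers are far more involved), you can picture the monitor drawing rows from top to bottom while the GPU swaps in a new frame partway through the refresh:

```python
# Illustrative mental model of tearing: the monitor draws rows top to bottom,
# and the GPU swaps in a new frame partway through the refresh.

SCREEN_ROWS = 10          # pretend the screen is only 10 rows tall
SWAP_AT_ROW = 4           # GPU delivers frame "B" while row 4 is being drawn

def draw_refresh(old_frame: str, new_frame: str, swap_row: int) -> list[str]:
    rows = []
    for row in range(SCREEN_ROWS):
        # Rows drawn before the swap come from the old frame,
        # rows drawn after it come from the new frame.
        rows.append(old_frame if row < swap_row else new_frame)
    return rows

for row, source in enumerate(draw_refresh("frame A", "frame B", SWAP_AT_ROW)):
    print(f"row {row}: {source}")
# The boundary between the "frame A" rows and the "frame B" rows is the
# horizontal tear you see on screen.
```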

Screen tearing isn’t always present when you have more fps than refresh rate; it depends on the number of frames and the timing. For example, tearing might not appear if your frame rate is a multiple of your monitor’s refresh rate and the timing lines up.

V-Sync technology was traditionally used to control the framerate and the timing, but it locks the framerate to the monitor’s refresh rate (classically 60 fps on a 60Hz display) and introduces noticeable input lag, which is unacceptable by today’s gaming standards. Hence, V-Sync is largely outdated and unused by most of the gaming population.

Instead, the already mentioned G-Sync and FreeSync technologies control the timing of the images the monitor displays without locking the number of frames.

That said, this can cause input lag as well, similar to what was described for having a higher refresh rate than fps. Here it only happens when the fps is much higher than the refresh rate, especially if the fps is unstable. A solution is to cap the fps at a value equal to, or close to, your monitor’s refresh rate.
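
Most games and GPU control panels expose such a frame-rate cap; conceptually it boils down to something like this minimal sketch, where render_frame() is just a stand-in for the game’s real rendering work:

```python
# Minimal sketch of a frame limiter: render, then wait out the rest of the
# frame budget so output never exceeds the target (e.g. a 144Hz monitor).
import time

TARGET_FPS = 144
FRAME_BUDGET = 1.0 / TARGET_FPS   # ~6.9 ms per frame

def render_frame() -> None:
    # Placeholder for the game's actual rendering work.
    time.sleep(0.002)

def run(frames: int = 300) -> None:
    for _ in range(frames):
        start = time.perf_counter()
        render_frame()
        # Sleep away whatever is left of this frame's time budget.
        leftover = FRAME_BUDGET - (time.perf_counter() - start)
        if leftover > 0:
            time.sleep(leftover)

run()
```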

Final Thoughts

We can see that the refresh rate doesn’t directly draw more rendering power from the GPU. However, monitors and GPUs are highly sensitive to mismatches between fps and refresh rate, as the problems described in the previous two sections show.

The technology almost requires your GPU and monitor to perform at matching levels for everything to work smoothly.

Therefore, the statement that a higher refresh rate indirectly uses more GPU is valid, in my opinion.
