What's The Difference Between Refresh Rate and Frame Rate?

Written by Jacob Tuwiner

When you look at buying or building a new gaming computer, more often than not you’ll conduct a little research to figure out what kind of performance you can expect from game to game.

You might’ve also noticed a variable shown on monitors’ boxes known as refresh rate, or heard people talking about the refresh rate of their monitor.

So what the hell is this thing, what’s the difference between refresh rate and frame rate, and why should it matter to you at all?

Frame Rate

Video, in any form, is at its simplest level just a ton of still pictures shown in quick succession to simulate movement to the human eye. That’s how your GPU displays the YouTube video you may be watching or the Call of Duty game you might be playing (yeah, right).


You’ll see the words “frame rate” thrown around a lot when graphics card manufacturers reveal their latest products at events and compare them to the competition.

Frame rate, most often referred to as FPS (frames per second), is simply a measure of how many pictures, or frames, your graphics card can render in one second.

So naturally, the higher your computer’s frame rate, the smoother the image will look. After a certain point, though, you may stop seeing any difference between, say, 60fps and 120fps – and this is where refresh rate comes in.
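To make this concrete, here’s a minimal Python sketch of how a game loop might count its own frames per second; `render_frame()` is a hypothetical stand-in for whatever your engine actually draws each frame:

```python
import time

def render_frame():
    # Hypothetical stand-in for the real rendering work your GPU does.
    pass

frames = 0
start = time.perf_counter()

while True:  # the game loop
    render_frame()
    frames += 1
    elapsed = time.perf_counter() - start
    if elapsed >= 1.0:
        # Frames rendered over the last second = frames per second (FPS).
        print(f"FPS: {frames / elapsed:.1f}")
        frames = 0
        start = time.perf_counter()
```

This is roughly what in-game FPS counters do: count completed frames and divide by elapsed time.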

Refresh Rate

Refresh rate is very similar to frame rate.

There’s a slight difference between them, though. Basically, refresh rate is a measure of how many frames per second your monitor can handle. In other words, how many times the monitor can refresh its screen in one second.


Unlike frame rate (measured in frames per second), refresh rate is measured in Hertz (Hz). Hertz is just a unit of frequency – one cycle per second – and since one cycle here is one screen refresh, a 60Hz monitor redraws its screen 60 times every second.
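If you want to see what that means per frame, here’s a quick back-of-the-envelope calculation in Python (the values are just common monitor speeds):

```python
# How long the monitor spends on each refresh, in milliseconds.
for hz in (60, 75, 144, 240):
    print(f"{hz} Hz -> {1000 / hz:.2f} ms per refresh")

# Output:
# 60 Hz -> 16.67 ms per refresh
# 75 Hz -> 13.33 ms per refresh
# 144 Hz -> 6.94 ms per refresh
# 240 Hz -> 4.17 ms per refresh
```

So a 144Hz monitor gives your GPU roughly 7 milliseconds to finish each frame if you want to feed it at full speed.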

How They Function Together

Frame rate and refresh rate do actually work together, and both have a hand in delivering the gaming performance that you’re expecting from your rig.

Just think of it like this:

You have a graphics card that can run Minecraft at 60fps and your monitor runs at a refresh rate of 60Hz, so you’ll get 60 frames per second.

Now, let’s say your graphics card runs Minecraft at 120fps, but your monitor is only rated for a 60Hz refresh rate.


In this case, you won’t actually see that full 120fps on screen, since your monitor can only display 60 of those frames each second – the rest never make it to your eyes.
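Ignoring tearing and sync tech for a moment, the relationship is as simple as it sounds; here’s a toy Python sketch of the cap (`displayed_fps` is a made-up helper, not a real API):

```python
def displayed_fps(gpu_fps: float, refresh_hz: float) -> float:
    # The monitor can't show more frames than it refreshes,
    # so the frame rate you see is capped by the refresh rate.
    return min(gpu_fps, refresh_hz)

print(displayed_fps(120, 60))   # 60 – half your rendered frames go unseen
print(displayed_fps(60, 144))   # 60 – now the GPU is the bottleneck
```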

So, generally, if you want to run higher and higher frame rates, you need to purchase a high refresh rate monitor to keep up with your beastly performance.

We should also note that pushing frame rates well past a low refresh rate monitor’s limit can cause graphical glitches like screen tearing, where the GPU delivers a new frame mid-refresh and the screen briefly shows slices of two different frames at once. Thankfully, there’s a remedy, covered in the next section.

Syncing Refresh Rate and Frame Rate

GPU manufacturers have wanted a way to sync together the frame rate of graphics cards and the refresh rate of monitors to give users the smoothest possible performance while leaving no frame behind.

The solution from Nvidia was G-Sync: G-Sync-enabled monitors work alongside compatible Nvidia graphics cards, matching the monitor’s refresh rate to the GPU’s frame rate on the fly.
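G-Sync and FreeSync do their syncing in hardware, but their older, software-side cousin – plain V-Sync – illustrates the same basic idea: the GPU waits for the monitor’s next refresh before swapping in a new frame. Here’s a minimal sketch using the `glfw` Python bindings (`pip install glfw`; assumes a working OpenGL driver). To be clear, this demonstrates V-Sync, not G-Sync itself:

```python
import glfw

# Minimal window setup with the glfw Python bindings.
if not glfw.init():
    raise RuntimeError("GLFW failed to initialize")

window = glfw.create_window(640, 480, "V-Sync demo", None, None)
if not window:
    glfw.terminate()
    raise RuntimeError("GLFW window creation failed")
glfw.make_context_current(window)

# swap_interval(1) turns V-Sync on: buffer swaps wait for the monitor's
# vertical refresh, locking frame rate to refresh rate and preventing
# tearing. swap_interval(0) would uncap the frame rate instead.
glfw.swap_interval(1)

while not glfw.window_should_close(window):
    # ...render your frame here...
    glfw.swap_buffers(window)  # blocks until the next refresh with V-Sync on
    glfw.poll_events()

glfw.terminate()
```

Adaptive sync technologies like G-Sync flip this around: instead of the GPU waiting on the monitor, the monitor times its refreshes to match the GPU.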


GPUs like the GTX 1080 and 1080 Ti are great examples of where G-Sync comes in handy, since both deliver very high frame rates.

However, using G-Sync effectively caps the GPU at the monitor’s refresh rate rather than letting it run flat out, which can be an issue with lower-end graphics cards.

Oh, and G-Sync only works with Nvidia cards – bummer.

AMD also has its own version, called FreeSync. FreeSync builds on the same syncing idea as an open standard, which is why both AMD and (more recently) Nvidia cards can take advantage of it.

If you’d like to learn more about the differences between G-Sync and Free Sync, we explained it here in our article on the best monitors for the Nvidia GTX 1080 Ti.

Does Refresh Rate Matter for Gaming?

Well, the answer to that really depends on what kind of gamer you are.

For example, say you’re just trying to have fun playing a new adventure game or something of the sort.

All you really do is wander around, fight things, collect loot, and stare into the beautiful scenery.

Think about it: do you actually benefit from a slightly smoother experience?


60 frames per second is a generally accepted baseline for smooth gameplay across the board, and all you really need to support your 60fps gameplay is a generic 60Hz monitor.

Though it may also be slightly beneficial to have a bit of headroom with a 75Hz monitor, so the occasional climb above 60fps doesn’t go to waste.

On the other end of the scale, we have professional and online competitive gamers.

In competitive and professional play, every single extra frame can be the difference between virtual life and death. Misplace a shot by just a millimeter, or spot the enemy a split second too late, and you’ve lost the tournament.

In this case, having a low refresh rate can be detrimental.

But don’t go thinking you just need a quick monitor to perform better in competitive play. Factors like your internet speed and the DPI (dots per inch) of your mouse can be just as important.

Especially that mouse thing. Imagine trying to aim with low sensitivity – sheesh, try picking off some poor sap in CS:GO when your mouse moves like it’s submerged in molasses.