So I’ve been tweaking my Steamdeck settings and whilst I don’t consider myself a total noob to this…I’m legitimately not sure what the difference is between the two, or which is more important to the overall smoothness of the game I’m playing.
The refresh rate is the number of times your display can show a new frame per second. The unit for this is Hertz (Hz); on the Steam Deck it's 60 Hz. This is a hardware property of the screen, and there isn't much you can do to change it.
The framerate is the number of frames your graphics card produces per second. The "unit" here is usually fps (frames per second).
You cannot exceed the 60 Hz limit of the Steam Deck's screen since it is a hard limit; you would need to build a new screen into the Deck. So optimally you want your GPU to produce 60 fps or more to use the display to its full extent.
Smoothness is a little harder. You can have a game running at 60 fps on a 60 Hz screen that still feels choppy because the timings are misaligned. Imagine your GPU produces 59 frames in the first half of a second and only 1 in the second half. Your screen would freeze for almost half a second because no new frame arrives at the display during that time. This is where frame times come in: they should be as consistent as possible.
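You can see that 59-then-1 example in numbers with a quick Python sketch (the timestamps are made up to match the example, not real measurements from a Deck): both traces below average 60 fps over one second, but their frame times look completely different.

```python
# Sketch with hypothetical frame timestamps (in seconds) -- not real data.

def frame_times_ms(timestamps_s):
    """Time between consecutive frames, in milliseconds."""
    return [(b - a) * 1000 for a, b in zip(timestamps_s, timestamps_s[1:])]

# Ideal 60 fps: a new frame every ~16.7 ms.
smooth = [i / 60 for i in range(61)]

# Pathological "60 fps": 59 frames in the first half second,
# then the last frame only arrives at the full second.
choppy = [i * (0.5 / 59) for i in range(60)] + [1.0]

for name, ts in [("smooth", smooth), ("choppy", choppy)]:
    ft = frame_times_ms(ts)
    fps = len(ft) / (ts[-1] - ts[0])
    print(f"{name}: {fps:.0f} fps, frame times {min(ft):.1f} to {max(ft):.1f} ms")
```

Both report 60 fps, but the "choppy" trace has one 500 ms frame time, which is the half-second freeze you'd feel. That's why tools show a frame-time graph and not just an fps counter.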
So to sum up: refresh rate = times per second your monitor can show something new (hard limit)
fps = frames your GPU can produce per second (you can change that via a game's settings)
frame times = the time between consecutive frames arriving at your screen (the shorter and the more consistent, the better)
Sometimes lower fps seems more fluid than higher fps because the fewer frames arrive more "punctually".
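That "punctuality" point can be illustrated with an idealized sketch (vsync assumed, numbers are hypothetical): on a 60 Hz screen, 30 fps divides evenly into the refresh rate, so every frame is held for exactly 2 refreshes, while 45 fps forces a mix of 1-refresh and 2-refresh holds, which reads as judder.

```python
# Idealized model of a 60 Hz display with vsync: each refresh shows the
# newest frame that has finished rendering by that point.

def refreshes_per_frame(fps, hz=60, seconds=1):
    """Return the distinct 'hold' durations (in refreshes) across all frames."""
    frame_done = [i / fps for i in range(int(fps * seconds) + 1)]
    shown = []
    for r in range(hz * seconds):
        t = r / hz
        # index of the newest frame completed at or before this refresh
        newest = max(i for i, done in enumerate(frame_done) if done <= t)
        shown.append(newest)
    counts = {}
    for f in shown:
        counts[f] = counts.get(f, 0) + 1
    return sorted(set(counts.values()))

print(refreshes_per_frame(30))  # every frame held the same number of refreshes
print(refreshes_per_frame(45))  # uneven holds: some frames 1 refresh, some 2
```

So a capped, even 30 fps can feel smoother than an uneven 45 fps, even though the latter pushes more frames.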
With this in mind, I’ve read anecdotes that say you should have a monitor that ideally has double the refresh rate as the FPS of the game you’re playing. The thought being, I suppose, that as the monitor is refreshing more frequently, it will more likely catch a frame.
I can’t find where I read/watched this, but if anyone has any input or tests to this effect I’d be interested to see it again.
That’s the Nyquist-Shannon theorem applied to framerates. You need a sampling rate at least twice the highest frequency of the Fourier transform of a signal to reproduce it without aliasing. For frames, that aliasing shows up as tearing or stuttering; it’s temporal aliasing, not the spatial aliasing that the various AA settings combat.
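A classic demonstration of temporal aliasing is the wagon-wheel effect, which can be sketched in a few lines of Python (an idealized model, not a claim about any specific game): a wheel spinning fast enough that its per-refresh rotation exceeds the Nyquist limit of the display appears to stand still or spin backwards.

```python
# Wagon-wheel effect: a wheel at rot_per_s rotations/second, sampled by a
# display refreshing hz times/second. The perceived rotation per refresh
# wraps around, so motion past the Nyquist limit (hz/2 rotations/s) aliases.

def apparent_step_deg(rot_per_s, hz=60):
    """Perceived rotation per refresh, folded into the range [-180, 180)."""
    true_step = 360 * rot_per_s / hz        # actual degrees per refresh
    return (true_step + 180) % 360 - 180    # aliased (perceived) step

print(apparent_step_deg(10))  # well below Nyquist: looks correct
print(apparent_step_deg(30))  # exactly at Nyquist: direction is ambiguous
print(apparent_step_deg(50))  # above Nyquist: negative, appears to spin backwards
```

The same folding is what turns fast motion into stutter or seemingly reversed movement when the framerate undersamples it.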