
Monitor Refresh Rates

132
0
76
#1
I have a 19" LCD running at its normal resolution and I do some gaming. You can choose to show refresh rates that the monitor "doesn't support". I set it to 85Hz, a mode it doesn't support, and there doesn't seem to be a problem. It says it can cause damage, but what are the real dangers? Is it really a big deal?
 

Jung

???
Premium
13,998
2,267
487
#2
Refresh rates don't really matter on LCDs, since they don't actually refresh vertically. CRT monitors are refreshed from top to bottom by an electron beam. LCDs use a TFT to control each individual pixel, which determines response time.
 

Descent

Hella Constipated
7,686
165
157
#3
junglizm said:
Refresh rates don't really matter on LCDs, since they don't actually refresh vertically. CRT monitors are refreshed from top to bottom by an electron beam. LCDs use a TFT to control each individual pixel, which determines response time.
They do when it comes to vsync, so you can get 85fps without tearing.
 

LiberatioN

Trance Addict
1,432
0
100
#4
Yeah, but he's running an LCD monitor, so vsync doesn't matter. Also, some video drivers on certain video cards prohibit certain refresh rates to prevent overheating.
 

Jung

???
Premium
13,998
2,267
487
#5
There is no need to get over 60FPS in any game right now, and setting higher refresh rates will only degrade performance. Leave it alone.
LiberatioN said:
Also, some video drivers on certain video cards prohibit certain refresh times for overheating prevention.
The monitor is what determines refresh rate, not the drivers. His drivers should be able to support all the way up to 125Hz.
Descent said:
They do when it comes to vsync, so you can get 85fps without tearing.
You obviously have no understanding of v-sync. If your video card can't put out the FPS to match your refresh, you will get horrible FPS.

I don't feel like typing, but here is a very good explanation.
Let's say you're playing the sequel to your favorite game, which has better graphics. You're still at a 75Hz refresh rate, but now you're only getting 50FPS, 33% slower than the refresh rate. That means every time the monitor updates the screen, the video card draws 2/3 of the next frame.

So let's track how this works. The monitor just refreshed, and frame 1 is copied into the frame buffer. 2/3 of frame 2 gets drawn in the back buffer, and the monitor refreshes again, grabbing frame 1 for the first time. Now the video card finishes the last third of frame 2, but it has to wait, because it can't swap buffers until right after a refresh. The monitor refreshes, grabbing frame 1 a second time, and frame 2 is put in the frame buffer. The video card draws 2/3 of frame 3 in the back buffer, and a refresh happens, grabbing frame 2 for the first time. The last third of frame 3 is drawn, and again we must wait for the refresh; when it happens, frame 2 is grabbed a second time and frame 3 is copied in.

We went through 4 refresh cycles but only 2 frames were drawn. At a refresh rate of 75Hz, that means we'll see 37.5FPS. That's noticeably less than the 50FPS the video card is capable of. This happens because the video card is forced to waste time after finishing a frame in the back buffer: it can't copy it out, and it has nowhere else to draw frames.
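The walkthrough above can be sketched as a tiny simulation. To be clear, the numbers (75Hz, 2/3 of a frame rendered per refresh interval) are just this example's figures, and the variable names are mine, not anything from a real driver:

```python
REFRESH_HZ = 75.0
RENDER_PER_REFRESH = 2.0 / 3.0   # GPU draws 2/3 of a frame per refresh interval (50FPS raw)

progress = 0.0        # fraction of the current frame drawn in the back buffer
pending = False       # True when a finished frame sits in the back buffer awaiting a swap
frames_shown = 0
refreshes = 300

for _ in range(refreshes):
    if not pending:
        progress += RENDER_PER_REFRESH
        if progress >= 1.0:
            # Frame finished mid-interval; the rest of the interval is wasted,
            # because the full back buffer can't be swapped out until a refresh.
            progress = 0.0
            pending = True
    # At the refresh, a completed back buffer is swapped to the front.
    if pending:
        frames_shown += 1
        pending = False

print(frames_shown / refreshes * REFRESH_HZ)  # effective framerate: 37.5
```

Running it, only 150 frames get shown across 300 refreshes, which is exactly the 37.5FPS the walkthrough arrives at.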

Essentially this means that with double-buffered VSync, the framerate can only be one of a discrete set of values equal to Refresh / N, where N is some positive integer. So at a 60Hz refresh rate, the only framerates you can get are 60, 30, 20, 15, 12, 10, and so on. You can see the big gap between 60 and 30 there. Any framerate between 60 and 30 your video card would normally put out gets dropped to 30.
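That Refresh / N rule is easy to compute directly. A minimal sketch (the helper name `vsync_fps` is made up for illustration, not a real API):

```python
import math

def vsync_fps(refresh_hz, raw_fps):
    # Double-buffered vsync rounds you down to refresh / N, where N is the
    # smallest positive integer such that refresh / N <= raw_fps.
    n = math.ceil(refresh_hz / raw_fps)
    return refresh_hz / n

print(vsync_fps(60, 45))    # 45FPS raw collapses to 30.0
print(vsync_fps(75, 50))    # the 50FPS example above lands on 37.5
print(vsync_fps(75, 100))   # above the refresh rate, you're capped at 75.0
```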

Now maybe you can see why people loathe it. Let's go back to the original example. You're playing your favorite game at 75Hz refresh and 100FPS. You turn VSync on, and the game limits you to 75FPS. No problem, right? Fixed the tearing issue, it looks better. Then you get to an area that's particularly graphically intensive, one that would drop your FPS down to about 60 without VSync. Now your card can't do the 75FPS it was doing before, and since VSync is on, it has to drop to the next value on the list, which is 37.5FPS. So your game, which was running at 75FPS, just halved its framerate to 37.5 instantly. Whether or not you find 37.5FPS smooth doesn't change the fact that the framerate was suddenly cut in half, which you would notice. This is what people hate about it.
 

Descent

Hella Constipated
7,686
165
157
#6
junglizm said:
You obviously have no understanding of v-sync. If your video card can't put out the FPS to match your refresh, you will get horrible FPS.

I don't feel like typing, but here is a very good explanation.
I already know how it works - and my 9700 Pro can do that fine. If it can't, I'll just turn it down.
 
132
0
76
#7
Thanks for the replies. I also had one other question: whenever the refresh rate is at 60Hz (the recommended refresh rate), text in the middle of the screen blocks up and looks fuzzy. When I put it at 75Hz, the problem vanishes. What's up with that?