Originally by: Kyra Felann
Lower framerates like in movies look smooth to the eye because of motion blur, which most games don't have.
I feel I need to qft this fact again.
The human eye LOVES motion blur. It evolved in (or was constructed for, if you like) an environment where motion blur was always present, and avoiding it would have been a major investment. So the brain learned to use motion blur to its advantage instead.
If we see something that moves, we expect motion blur. If there is none, we will notice that.
So, we have a computer that produces frames with no motion blur. What can we do to keep the eye from noticing? There are three possibilities:
(1) Add motion blur. For this the computer needs to compute additional frames and blend them together (see the sketch after this list). To reduce the number of extra frames needed, rendering can be restricted to the parts of the frame that have changed.
(2) Add blur around moving objects. Cheap and simple: the application knows which objects are moving and can add simple blur around them. Or, if the camera is moving, it can just add blur over the whole screen.
(3) More fps to the eye. If more frames are delivered to the eye, there will be some kind of saturation, meaning that the frames blend together. And blended frames are blurry wherever they differ.
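To make possibility #1 concrete, here is a minimal C++ sketch of accumulation-based blur: render several sub-frames per displayed frame and average them. The renderSubFrame() routine below is a toy stand-in I made up (it just draws a bright bar sweeping across a tiny grayscale buffer), not a real renderer; the only assumption is that the game can re-render the scene at intermediate times within one displayed frame.

// Accumulation motion blur sketch: average several sub-frames per displayed frame.
#include <cstddef>
#include <cstdio>
#include <vector>

using Frame = std::vector<float>;  // grayscale pixels, row-major

constexpr std::size_t kWidth = 16;
constexpr std::size_t kHeight = 4;

// Hypothetical stand-in for the game's renderer: the scene at time t in [0, 1)
// within the current displayed frame's interval (here, a moving bright bar).
Frame renderSubFrame(float t)
{
    Frame f(kWidth * kHeight, 0.0f);
    std::size_t barX = static_cast<std::size_t>(t * (kWidth - 1));
    for (std::size_t y = 0; y < kHeight; ++y)
        f[y * kWidth + barX] = 1.0f;
    return f;
}

// Blend subFrameCount sub-frames into one displayed frame.
// Anything that moved during the interval ends up smeared over its path.
Frame renderMotionBlurredFrame(std::size_t subFrameCount)
{
    Frame accum(kWidth * kHeight, 0.0f);
    for (std::size_t i = 0; i < subFrameCount; ++i) {
        float t = (i + 0.5f) / subFrameCount;    // evenly spaced sample times
        Frame sub = renderSubFrame(t);
        for (std::size_t p = 0; p < accum.size(); ++p)
            accum[p] += sub[p] / subFrameCount;  // simple box-filter average
    }
    return accum;
}

int main()
{
    Frame blurred = renderMotionBlurredFrame(8);
    for (std::size_t y = 0; y < kHeight; ++y) {
        for (std::size_t x = 0; x < kWidth; ++x)
            std::printf("%c", blurred[y * kWidth + x] > 0.0f ? '#' : '.');
        std::printf("\n");
    }
    return 0;
}

Running it prints the bar smeared across every column it crossed during the frame, each at reduced intensity, which is exactly the "blended frames are blurry where they differ" effect.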
Frames with motion blur look fine at 24fps (movies, either with real blur or CGI/SFX using solution #1), whereas computer games without blur need more, e.g. 60fps (the max a normal LCD will display), to make use of solution #3.
People state that games feel more responsive without vsync. Read my long three-post writeup and you'll know why. In short: normal vsync introduces additional lag in the chain from game to eye (actually it's not lag, but older frames reaching the eye).
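A quick back-of-the-envelope sketch of that "older frames" effect: every frame sitting in the swap queue ahead of the newest one reaches the eye one refresh interval later. The queue depths below are illustrative assumptions, not measurements of any particular driver or game.

// How much older the displayed frame gets per queued frame at 60 Hz.
#include <cstdio>

int main()
{
    const double refreshHz = 60.0;              // typical LCD refresh rate
    const double frameMs = 1000.0 / refreshHz;  // ~16.7 ms per refresh

    // 0 = no extra buffering (vsync off / immediate present),
    // 1-2 = typical double/triple buffering with vsync on (assumed values).
    for (int queuedFrames = 0; queuedFrames <= 2; ++queuedFrames)
        std::printf("%d queued frame(s) -> about %.1f ms of added frame age\n",
                    queuedFrames, queuedFrames * frameMs);
    return 0;
}

So with two frames queued you are looking at a picture that is roughly 33 ms older than it needs to be, which is what people perceive as the game feeling less responsive with vsync on.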
BTW: Thanks to the one who posted the info about DirectX. The last time I worked with the software side of this, "VESA" was something some graphics cards supported <g>