Heh, that figures.. Prove it to yourself and conduct some experiments. Let me lay the groundwork for you. This is so simple, try this: read the number on a taxi-cab as it zooms by in a car chase on the movie screen. If you can eye-track it you can see the number, but it's all smeared out. You won't be able to read it until the fps of the capturing camera and presenting projector is increased. PC gaming is no different.
If you don't eye-track it, then it naturally appears blurry, and this is totally acceptable. It's when you try to follow an object that is being presented to you via too few fps that the problems of image quality come into question.
Since I am in the process of publishing a paper on this phenomenon, I thought that, in the interim, I could refer you to the FPS wiki article. I had high hopes it would cover the basics, but they left out any "framerate-fusion" section. Nor did they even mention spatial fusion.
The closest thing the article describes is judder - more specifically, 2:3 pulldown telecine judder. For the gamer, to eliminate all tearing and judder and jerkiness and stutter, you MUST update the entire image for every horizontal dot change. Might as well throw in vertical movements and dot changes as well. And this assumes you stay with a raster-scan type of display.
The ideal - though not cost-effective right now - would be to have each pixel on the LCD directly connected to its own memory location. A massively parallel connection, if you will. Taking that further, each pixel should have its own shader unit: one per pixel. That will eventually come to pass, but not today.
Furthermore, only pixels that change should be updated. With today's graphics technology we update the entire frame, even when only a few pixels have changed in the frame buffer.
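To make that concrete, here's a minimal sketch in Python of the changed-pixels-only idea, with the framebuffer as a plain 2D list (a hypothetical interface; no real display link exposes per-pixel writes like this today):

```python
# Hypothetical sketch: update only the pixels that changed between frames,
# instead of re-sending the entire framebuffer. Framebuffers are plain
# 2D lists of integers here; a real display link has no such interface.

def diff_update(old_frame, new_frame):
    """Return the list of (x, y, value) writes needed to go old -> new."""
    writes = []
    for y, (old_row, new_row) in enumerate(zip(old_frame, new_frame)):
        for x, (old_px, new_px) in enumerate(zip(old_row, new_row)):
            if old_px != new_px:
                writes.append((x, y, new_px))
    return writes

old = [[0] * 8 for _ in range(2)]
new = [row[:] for row in old]
new[1][3] = 255                      # one pixel changed
print(diff_update(old, new))         # -> [(3, 1, 255)]: one write, not 16
```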
OK, fantasy hardware and theorizing aside, let us take a simple approach and cut through the marketing fluff. Throw away the conceptual and nebulous in-depth explanations. We will establish, right here and now and forever, that 60 fps is horribly insufficient for correct motion depiction. Forget everything else, let us take it from the top.. You will forgive any condescending language, but it must be used.
As long as 60 fps is the accepted standard, then in order to achieve a truly fluid motion appearance, you are limited to moving an on-screen object (a missile, perhaps) a distance of only 60 pixels within that one-second time frame. For the sake of this argument we will assume you are using a standard resolution of 1024x768 and sitting about 1 meter from the display. You should not be able to discern any of the individual pixels on the monitor by themselves, and you should not see any jaggies or stair-step artifacts. If you can, give me your eyes! I want them!
Now - if you try to move the missile across a high-resolution screen in one second, then to achieve perfect fluid motion you will need to update the image 1024 times in that one second - or 512 times per second if you give it two seconds. One update per pixel.
Doing that ensures the missile is represented in the highest detail possible as it traverses your screen. There is no spatial gap with that timing. No loss of detail, no skipped frames (read: no skipped pixels, as "frame" is a holdover from the days of film projection). Each pixel change is a frame; anything less and you are being cheated. Each pixel in turn switches on and then off in its allotted time, staying lit for 1/60th of a second. There is no gap; the missile is always rendered in the best possible detail at all points during its one-second traverse across a landscape of 60 pixels.
For the missile to cross from left to right - a 1024x768 landscape of pixels - staying perfectly visible with all detail present, no jumpiness, no pulldown artifacts, no edge fuzzing, going pixel to pixel, it would take 17.067 seconds to make the trip at 60 fps.
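Don't trust me, check the arithmetic yourself:

```python
# The numbers above, spelled out. For one-pixel-per-frame ("pixel-perfect")
# motion, the object's speed in pixels/second can never exceed the update rate.
width_px = 1024      # horizontal resolution
fps      = 60        # update rate

max_speed = fps                       # px/s at one pixel per update
traverse_time = width_px / fps        # seconds to cross pixel by pixel
print(max_speed)       # 60 px/s
print(traverse_time)   # 17.066... seconds, the figure quoted above
```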
Well I'll be damned!! It's gonna take me 17 seconds to blow up the guy on the right side of the screen?? Good god!! I can't get behind that. I gotta blow him up
NOW.
I have two choices: reduce the spatial-temporal detail, or get more GPU horsepower.
Let us look at option #1 - reducing spatial-temporal detail - as it is the most commonly used method and by far the cheapest. It is a natural side effect anyway, so let us use it. By the way, this doesn't mean making a simpler texture, or a bigger or smaller object, or anything like using a different mip-map. Or texture compression, or T-buffer tricks with motion blur. No. No. No.
For me to effectively play the game (whatever it may be), I'm going to take the liberty of assuming my missile will impact the target on the right side of the screen one second after being launched from the left side. (I think we're playing a side-scrolling shoot-em-up.) I've got a decent video card that can jam at 60 fps too, and my resolution is set at 1024x768.
To show me the missile flying that left-to-right distance, in the allotted one second, my spiffy spank'n GPU
**AND** LCD will give me 60 pictures a second. Not too shabby, eh? This looks awesome - on paper, and on the gold-foil-lined box my card came in. But consider this: you must divide those 60 pictures up into 60 evenly spaced positions on the screen. The missile will, over the course of one second, occupy 60 discrete positions as it makes its way across the playfield. Folks, that is only
**60**. My graphics card is positioning the missile only 60 times - drawing it 60 times, updating it 60 times on the way to the target. That's it!
Ladies and gentlemen - this missile is
*MAGIC*: it can jump 17 pixels in 1/60th of a second. Imagine that! 17 pixels! Or perhaps a few millimeters, depending on the specifications of your LCD panel, if you want to visualize it that way. My graphics card is big, it's strong, it's powerful, and it blasts this missile from left to right in 60 huge leaps of 17 pixels at a time.
*INCREDIBLE*.. What is going on between those 17-pixel jumps? Where's the missile? It must teleport its way from jump to jump. It appears at the first 17-pixel marker, disappears, then reappears like magic 17 pixels further to the right. This has to happen 60 times until it reaches the target!
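You can enumerate those 60 discrete positions yourself and see the gaps:

```python
# The 60 discrete positions the missile actually occupies during its
# one-second flight across 1024 pixels at 60 fps. Every gap between
# consecutive positions is the ~17-pixel "teleport" described above.
width_px, fps = 1024, 60

positions = [round(frame * width_px / fps) for frame in range(fps)]
jumps = [b - a for a, b in zip(positions, positions[1:])]
print(positions[:5])   # [0, 17, 34, 51, 68]
print(set(jumps))      # {17, 18} -- never the pixels in between
```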
But if I follow the missile very carefully with my eye - focusing on it, tracking it - it will *appear* to jitter: parts of it may overlap, parts may fade in and out, perhaps smear, stretch a little (don't bring relativity into this; it doesn't practically apply here), disappear and reappear. The left and right edges of the nosecone and flaming nozzle may widen and blur. Definitely not a smooth, real-life presentation by any means. All those "artifacts" are being generated by you; the missile is only occupying 60 discrete positions across the screen.
However, if I pay attention to the target and focus on that, the missile will appear to fly into view and cross the screen somewhat smoothly. But there will be an apparent motion blur, generated by your optic nerve and the visual perception center in your head!
***YOU*** are filling in that gap. There is no real motion blur on the screen; you are filling it in! It is often this that causes headaches among some gamers and TV watchers.
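Here's a rough back-of-the-envelope model of that smear, assuming a sample-and-hold LCD: while your eye glides smoothly after the missile, each frozen frame slides across your retina by the full per-frame jump, so the smear width works out to roughly speed divided by refresh rate.

```python
# Rough model of eye-tracking blur on a sample-and-hold display: the eye
# moves continuously while each frame is frozen for 1/refresh seconds, so
# the image smears across the retina by (speed / refresh) pixels per frame.
def smear_px(speed_px_per_s, refresh_hz):
    return speed_px_per_s / refresh_hz

print(smear_px(1024, 60))    # ~17 px of smear tracking a 1-second traverse
print(smear_px(1024, 240))   # ~4 px: higher refresh shrinks the smear
print(smear_px(60, 60))      # 1 px: the pixel-perfect case, no visible smear
```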
The "more-in" your peripheral vision an object is, the more you fill-in the gaps.
(3D only makes this worse, because you have to do the same work in trying to figure out what is going on in yet another direction - as if two weren't enough! That's something the MPAA and the gaming industry do not explain to you. They will make a big stink of it later on, though, and if I have time I'll tell you why.)
Mmm, I don't like the first option. Let us examine #2 and see if we have a better choice.
This is where it gets more realistic and more interesting, and it all falls into place here. Once again we're gonna blow up a target - same launch position, same target area. We launch our missile. It's an old-time one, from the '50s. It doesn't have all the whizbang teleportation and pixel-jumping capabilities. No. This is the real deal, folks. As real as it gets! This missile obeys all the known (and unknown) laws of physics. This missile transitions smoothly from
pixel to pixel. At every one of the 1024 steps along the flight from left to right, our missile stays right here in the known universe; it does not enter hyperspace or drop through subspace. No sir. It does not distort its appearance or change its looks when you look away. No funny stuff. During the one-second flight our missile stays visible on the playfield. At each of the 1024 steps along the way we can see all the detail. You can track it with your eye 100 percent. It doesn't hide, it doesn't jump 17 pixels at a time. Ahhh,
**REALISM**!! Just smooth pixel-to-pixel velocity. We draw one image, move it one pixel, draw it again - all the tedious way from left to right, never missing a dot-clock. Imagine that!
We updated our moving object 1024 times - yes folks, over a thousand times. If we crank up the turbopump, the missile moves faster and hits the target in 0.5 seconds. For it to be represented fully, completely, accurately - no hyperspace pixel-jumping - we must
**ABSOLUTELY** update the screen 2048 times per second. This ensures the missile is presented to you fully, at every step of the way. If we do not, then the data is misrepresented; something is now missing. I don't know of a graphics card or monitor that refreshes at 2 kHz yet, but you can be assured the industry is hard at work trying to get there!
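The rule, spelled out as arithmetic - one update per pixel of travel:

```python
# Required update rate for pixel-perfect motion: one update per pixel of
# travel, so rate = distance / time. The 0.5-second traverse quoted above:
def required_hz(distance_px, seconds):
    return distance_px / seconds

print(required_hz(1024, 1.0))   # 1024 Hz for the one-second flight
print(required_hz(1024, 0.5))   # 2048 Hz, as stated above
```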
Please don't complain that a single pixel-to-pixel transition is itself a jump. If you do, then I'm going to make you factor in a 256*2*1024 refresh rate! Because you could vary the brightness of each pixel, starting at 256 "brightness" levels and ramping it down to 0, while taking the next adjacent pixel and ramping it up in the opposite direction.
INSANE!
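For the masochists, a hypothetical sketch of that brightness crossfade, assuming 256 levels per pixel (the extra factor of 2 for the paired up/down ramps is left to your conscience):

```python
# Hypothetical sketch of the brightness crossfade described above, assuming
# 256 levels: the trailing pixel ramps 255 -> 0 while its right-hand
# neighbour ramps 0 -> 255, one distinct screen state per brightness step.
def crossfade_states(levels=256):
    for step in range(levels):
        yield levels - 1 - step, step   # (fading-out pixel, fading-in pixel)

states = list(crossfade_states())
print(states[0], states[-1])   # (255, 0) ... (0, 255)
print(len(states) * 1024)      # 262144 sub-steps for a 1024-pixel traverse
```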
And there you have it!! It can't get any simpler than that. Believe it or not, your gaming experience is
ULTIMATELY limited by your refresh rate, not by how fast your graphics card is. In the interim we can live with some artifacting and jumping jitters. But let's now look at the big-picture (no pun intended) relationship between the monitor's refresh rate and the graphics card's framerate. I won't go into a discussion of CRTs, as no one seems to use them anymore except the die-hard classic gamers into systems from the Atari 2600, Intellivision, and Commodore 64 era.
First, let us establish that the "perfect GPU" would update each pixel on the display device instantly as it changes. Alas, this is not the case.
The absolute best we can hope for today is about a 100 Hz refresh rate. That means for your missile to be perfectly represented on a 1024x768 monitor, it must cover the left-to-right distance no faster than about 10 seconds. Anything faster and you will get the artifacting I've described above: tearing, smearing, flickering, and especially micro-stuttering.
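Same arithmetic, run from the other direction:

```python
# The slowest-case arithmetic for today's panels: at ~100 Hz, pixel-perfect
# motion across 1024 pixels cannot take less than distance / refresh seconds.
def min_traverse_s(distance_px, refresh_hz):
    return distance_px / refresh_hz

print(min_traverse_s(1024, 100))   # 10.24 s, the ~10 seconds quoted above
```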
In the end, you can easily see imperfect renderings of fast-moving objects by tracking them as they move across the screen. Once an object gets going fast enough, its leading and trailing edges get a little fuzzy, and the whole missile seems to "buzz" and shimmer as it covers 2, then 3, pixels per refresh cycle. Only when the image is updated and refreshed for every pixel do we avoid this effect.
Gaming consoles have an apparent advantage: they work at much lower resolutions - 320x200, 640x480, with the newer ones going to the standard HD resolutions. If you look at the lowest-resolution ones and consider updating the screen image at 60 Hz, interlaced NTSC (now introducing a sliding, swimming artifact), you'll find you can get away with the missile jumping only about 5 pixels per frame on a one-second traverse. Granted, you'd need to step back a little to alleviate the pixelation, but you get the idea.
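Run the per-frame jump across a few resolutions (one-second traverse at 60 Hz assumed, interlacing ignored) and you can see why the consoles get off easy:

```python
# Per-frame jump for a one-second traverse at 60 Hz across common
# resolutions: lower-resolution consoles get away with much smaller jumps.
for width in (320, 640, 1024, 1920):
    print(width, round(width / 60, 1), "px per frame")
# 320  ->  5.3 px per frame (the ~5-pixel jump mentioned above)
# 1920 -> 32.0 px per frame
```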
What is even worse is an uneven presentation of frames. Say you get 100 fps, then 30 fps, then back to 63, now down to 40, then up to 75. Sadly, that is the current state of affairs with video cards: there is no forced, steady frame output. Sure, you can lock the output to the refresh rate, but you still can't force the GPU to have the image ready to display in 16 ms if it is not ready! You've just dumped an incompletely updated frame buffer, and now you get tearing.
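Trace the missile's per-frame jump through that erratic sequence and the micro-stutter falls right out of the numbers:

```python
# Why uneven frame delivery reads as stutter: the missile's on-screen step
# size swings with the instantaneous frame time, even at the same average
# speed. The frame-rate sequence is the illustrative one quoted above.
speed = 1024.0                          # px/s, the one-second traverse
rates = [100, 30, 63, 40, 75]           # instantaneous fps, per the text

pos = 0.0
for fps in rates:
    step = speed / fps                  # pixels moved during this frame
    pos += step
    print(f"{fps:3d} fps -> jump of {step:5.1f} px (now at {pos:6.1f})")
# Steps swing between ~10 px and ~34 px: that swing is the micro-stutter.
```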
Until we can update on a per-pixel basis (perhaps 10 years from now), we need to at least keep a steady framerate - let's settle on 240 Hz or thereabouts.
Until more powerful GPUs come to pass, we're going to have to live with imperfections like this! We will need to live with micro-stuttering and blurry objects. If James Cameron is dissatisfied with current framerates and wants to shoot the next Avatar sequel on a new system, then that's good enough for me! If you want to continue your own research, go right on ahead - I invite you to prove me wrong. In the meantime, immerse yourself here. The last article - HDTV blur - just barely mentions the eye-tracking issue, which
*IS* the root cause of today's
poor display technology. All these newfangled features are patchwork attempts to hide the real problem: slow refresh rate.
You may continue your research here, and you will come to the conclusion that the slow refresh rate is the major hangup.
http://www.pcgameshardware.com/aid,...from-current-multi-GPU-technologies/Practice/
http://www.overclockers.com/micro-stutter-the-dark-secret-of-sli-and-crossfire/
http://hardforum.com/showpost.php?p=1032646751&postcount=1
http://www.100fps.com/how_many_frames_can_humans_see.htm
http://msdn.microsoft.com/en-us/windows/hardware/gg463407.aspx
http://www.pcgameshardware.com/aid,...force-GTX-285-SLI-Multi-GPU-Shootout/Reviews/
[ame="http://en.wikipedia.org/wiki/Graphic_display_resolutions"]Graphic display resolutions - Wikipedia, the free encyclopedia[/ame]
[ame="http://en.wikipedia.org/wiki/Frame_rate"]Frame rate - Wikipedia, the free encyclopedia[/ame]
[ame="http://en.wikipedia.org/wiki/Motion_compensation"]Motion compensation - Wikipedia, the free encyclopedia[/ame]
http://en.wikipedia.org/wiki/Refresh_rate#Computer_displays
[ame="http://en.wikipedia.org/wiki/Micro_stuttering"]Micro stuttering - Wikipedia, the free encyclopedia[/ame]
[ame="http://en.wikipedia.org/wiki/HDTV_blur"]HDTV blur - Wikipedia, the free encyclopedia[/ame]