Why do modern screens still have a "refresh rate"?

Oct 17, 2014 10:01

It seems remarkably old-fashioned to me that monitors are still locked to a set refresh rate - or, at least, a refresh rate dictated by the device that's driving them.

Why have we not got a system by which the computer (or console, or whatever) prepares new frames and hands them off as they're finished, for the screen to repaint itself with?

At the moment, the only thing changing on my screen is the cursor blinking about twice a second, and yet my graphics card is still sending the same full image sixty times a second.
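
To put a rough number on how much identical data gets pushed around, here's a quick back-of-the-envelope sketch. The 1920x1080 resolution and 24-bit colour depth are assumptions for illustration, and blanking/protocol overhead is ignored.

```c
/* Back-of-the-envelope sketch: raw pixel traffic for a display that is
 * refreshed 60 times a second even when nothing on it changes.
 * The 1920x1080 resolution and 24-bit colour depth are assumptions
 * for illustration; signalling overhead is ignored. */
#include <stdio.h>

int main(void)
{
    const long long width = 1920, height = 1080;
    const long long bits_per_pixel = 24;
    const long long refreshes_per_second = 60;

    long long bits_per_second =
        width * height * bits_per_pixel * refreshes_per_second;

    printf("Raw pixel data resent every second: %.2f Gbit/s\n",
           bits_per_second / 1e9);
    return 0;
}
```

That works out to roughly 3 Gbit/s of pixels being re-sent every second, almost all of it unchanged.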

Similarly, games end up using things like VSYNC and multiple levels of buffering to transition smoothly from one frame to the next, because missing that 1/60th-of-a-second mark leads to unpleasant consequences - tearing, or a sudden drop to half the frame rate. Getting rid of that mark would remove all sorts of complications from the pipeline, replaced by a signal that says "Here's the next frame, go ahead and repaint the screen".
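
As a rough illustration of why that mark matters, here's a small, self-contained timing sketch (not real graphics code): it compares when each frame could reach the screen if it has to wait for the next fixed 60 Hz refresh tick, versus a hypothetical display that repaints as soon as a frame is handed over. The per-frame render times are invented, and buffering effects are ignored.

```c
/* Minimal simulation comparing when a finished frame reaches the screen
 * under a fixed 60 Hz vsync versus a hypothetical "present when ready"
 * display. Frame render times are made up for illustration, and
 * double/triple buffering effects are ignored. */
#include <stdio.h>
#include <math.h>

#define REFRESH_INTERVAL_MS (1000.0 / 60.0)   /* one 60 Hz refresh: ~16.67 ms */

int main(void)
{
    /* Hypothetical per-frame render times in milliseconds: some frames
     * just miss the 16.67 ms budget. */
    double render_ms[] = { 14.0, 15.5, 17.2, 16.9, 15.0, 18.4 };
    int n = sizeof(render_ms) / sizeof(render_ms[0]);
    double finish = 0.0;   /* cumulative time at which each frame is ready */

    printf("%-6s %-10s %-18s %-18s\n",
           "frame", "render ms", "vsync display at", "ready display at");

    for (int i = 0; i < n; i++) {
        finish += render_ms[i];

        /* With vsync, the finished frame must wait for the next refresh
         * tick, so a frame that misses the budget slips a whole interval. */
        double vsync_display =
            ceil(finish / REFRESH_INTERVAL_MS) * REFRESH_INTERVAL_MS;

        /* With "present when ready", the screen repaints as soon as the
         * frame is handed over. */
        double ready_display = finish;

        printf("%-6d %-10.1f %-18.2f %-18.2f\n",
               i + 1, render_ms[i], vsync_display, ready_display);
    }
    return 0;
}
```

The point is simply that a frame finishing half a millisecond late has to sit out nearly a whole extra refresh before anyone sees it under vsync, while the "present when ready" column just tracks the render time directly.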

Am I missing something obvious here?
