Andrew Ducker (andrewducker) wrote,

Why do modern screens still have a "refresh rate"?

It seems remarkably old-fashioned to me that monitors are still locked to a set refresh rate - or, at least, to a rate dictated by the device that's driving them.

Why have we not got a system by which the computer (or console, or whatever) prepares new frames and hands them off as they're finished, with the screen repainting itself whenever one arrives?

At the moment, my screen is updating about twice a second (as the cursor blinks), and yet my graphics card is still sending the same image repeatedly.
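To put a rough number on that waste, here's a back-of-the-envelope sketch. It assumes a 1920x1080 panel at 60 Hz with 24-bit colour (plausible figures, not ones from the post) and ignores blanking intervals and link encoding overhead:

```python
# Rough bandwidth for repeatedly scanning out an unchanged 1080p frame.
# Assumed figures: 1920x1080, 24 bits per pixel, 60 Hz refresh.
width, height = 1920, 1080
bits_per_pixel = 24
refresh_hz = 60

bits_per_frame = width * height * bits_per_pixel
gbits_per_second = bits_per_frame * refresh_hz / 1e9

print(f"{gbits_per_second:.2f} Gbit/s of mostly identical pixels")
```

Nearly three gigabits per second over the cable, almost all of it the same image being resent because the panel expects a frame every tick regardless.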

Similarly, games end up using things like vsync, and multiple levels of buffering, to transition smoothly from one frame to the next, because missing that 1/60th-of-a-second mark leads to unpleasant consequences. Getting rid of that mark would remove all sorts of complications from the pipeline, replaced by a signal that says "Here's the next frame, go ahead and repaint the screen".
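The cost of missing that mark can be sketched with a toy timing model (a simulation, not real graphics code): under vsync, a finished frame waits for the next vblank tick, so a frame that takes even slightly longer than 16.7 ms doesn't appear until 33.3 ms; with the "repaint when a frame arrives" scheme, it would appear the moment it's ready.

```python
import math

REFRESH_INTERVAL = 1 / 60  # fixed 60 Hz vblank cadence, in seconds

def present_time_fixed(render_time):
    """With vsync, a frame appears at the first vblank after it finishes."""
    return math.ceil(render_time / REFRESH_INTERVAL) * REFRESH_INTERVAL

def present_time_variable(render_time):
    """With variable refresh, the panel repaints as soon as the frame arrives."""
    return render_time

for ms in (15.0, 17.0, 33.0):
    t = ms / 1000
    print(f"render {ms:5.1f} ms -> "
          f"fixed {present_time_fixed(t) * 1000:5.1f} ms, "
          f"variable {present_time_variable(t) * 1000:5.1f} ms")
```

The 17 ms frame is the interesting case: it misses the deadline by a fraction of a millisecond but is delayed a full extra refresh interval, which is exactly the stutter that buffering schemes exist to paper over.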

Am I missing something obvious here?

Original post on Dreamwidth - comments are there.

