Andrew Ducker (andrewducker) wrote,

Why do modern screens still have a "refresh rate"?

It seems remarkably old-fashioned to me that monitors are still locked to a fixed refresh rate - or, at least, to a rate dictated by the device that's driving them.

Why have we not got a system by which the computer (or console, or whatever) prepares new screens and then hands them off as they're finished, for the screen to update itself to?

At the moment, the only thing changing on my screen is the blinking cursor (about twice a second), and yet my graphics card is still sending the same full image over and over.

Similarly, games end up using things like VSYNC and multiple levels of buffering in order to transition smoothly from one frame to the next, because not hitting that 1/60th of a second mark leads to unpleasant consequences. Getting rid of that mark would remove all sorts of complications from the pipeline, replacing them with a signal that says "Here's the next frame, go ahead and repaint the screen".
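The cost of missing that deadline can be sketched with a toy calculation (the function and numbers here are purely illustrative, not any real graphics API):

```python
# A toy sketch of the fixed-refresh problem described above: with a 60 Hz
# screen, a finished frame can only be shown at the next vsync "tick", so a
# render that takes just over the 1/60 s budget waits a whole extra tick,
# and the effective frame rate halves to 30 fps.
import math

REFRESH_HZ = 60
TICK = 1.0 / REFRESH_HZ  # the screen only repaints at these moments

def presentation_time(render_finish):
    """Earliest vsync tick at or after the moment the frame finished rendering."""
    return math.ceil(render_finish / TICK) * TICK

fast = presentation_time(0.015)  # 15 ms render: under the 16.7 ms budget
slow = presentation_time(0.018)  # 18 ms render: just over the budget

print(round(fast * 1000, 2))  # 16.67 -> shown at the very next tick (60 fps pacing)
print(round(slow * 1000, 2))  # 33.33 -> waits a whole extra tick (30 fps pacing)
```

With the "present when ready" signal suggested above, the slow frame could go to the screen at 18 ms instead of waiting until 33 ms.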

Am I missing something obvious here?

Original post on Dreamwidth - comments are there.
