• tacosplease@lemmy.world · 1 year ago

    And old video games. They were designed for CRTs and look better there than on a new TV. Plus a CRT has basically no latency, while new TVs add input lag because they have to process the picture before displaying it. That lag makes many old games unplayable, or very hard to play, unless you have a very expensive screen made for gaming.

    • frezik@midwest.social · 1 year ago

      If you’re measuring latency using the same methods as everything else, CRT has latency, and more of it than you might think.

      The standard is to measure at the point where the picture is drawn halfway down the screen. On NTSC at ~30fps, that's about 17ms of latency ( ((1 / 30) / 2) * 1000 ). If you hit the button just before the frame starts drawing, and the game processes the input immediately and draws that frame accounting for it, it still takes about 17ms before we stop the clock under the standard method of measuring latency.
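      A quick sketch of that arithmetic, as shown below (nominal frame rate; real NTSC is ~29.97fps):

      ```python
      # Latency measured at the point where the picture is drawn halfway down
      # the screen: half of one frame period, converted to milliseconds.
      ntsc_fps = 30  # nominal; real NTSC is ~29.97fps
      half_frame_ms = (1 / ntsc_fps) / 2 * 1000
      print(f"NTSC CRT half-screen latency: {half_frame_ms:.1f} ms")  # ~16.7 ms
      ```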

      “But”, you might say, “the flatpanel can’t go any faster than it’s fed that NTSC signal, so its latency will be at least that much plus the upscaler plus its pixel response time”.

      Fair. A good gaming panel has around 2ms pixel response time. Upscalers can never be zero lag, but good ones like the OSSC and RetroTink are pretty damn close these days.

      That's already below the threshold where humans can even notice the difference, but consider doing the same equation for PAL signals at 25fps. It comes out to 20ms, about 3ms slower than NTSC. In other words, the latency difference between NTSC and PAL CRTs is about the same as the difference between feeding NTSC to a CRT or to a low-latency flatpanel. It's possible for flatpanels to come in under even PAL CRTs, and we'll probably get there at some point.
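      Putting the numbers from this thread side by side (treating a good upscaler as near-zero lag, per the OSSC/RetroTink point above, is an assumption):

      ```python
      def half_frame_ms(fps):
          """Time for the scan to reach the middle of the screen: half a frame period."""
          return (1 / fps) / 2 * 1000

      ntsc_crt = half_frame_ms(30)        # ~16.7 ms
      pal_crt = half_frame_ms(25)         # 20.0 ms
      flatpanel = half_frame_ms(30) + 2   # NTSC signal + ~2 ms pixel response,
                                          # upscaler treated as ~0 ms

      print(f"NTSC CRT:          {ntsc_crt:.1f} ms")
      print(f"PAL CRT:           {pal_crt:.1f} ms")
      print(f"NTSC -> flatpanel: {flatpanel:.1f} ms")
      # PAL vs NTSC CRT: ~3.3 ms; flatpanel vs NTSC CRT: ~2 ms
      ```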

      • faintbeep@lemm.ee · 1 year ago

        “But”, you might say, “the flatpanel can’t go any faster than it’s fed that NTSC signal, so its latency will be at least that much plus the upscaler plus its pixel response time”.

        Won’t it be at least double, because it reads the whole frame into memory before displaying it?
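        For what that scenario would look like numerically (assuming the panel really does buffer one full frame before it starts scanning out, which is the premise of the question):

        ```python
        # Premise of the question: the panel buffers a whole frame before display.
        frame_ms = (1 / 30) * 1000            # one NTSC frame: ~33.3 ms
        buffered = frame_ms + frame_ms / 2    # full-frame buffer + half-frame scanout
        print(f"Fully buffered panel: {buffered:.1f} ms")  # ~50 ms vs ~16.7 ms on a CRT
        ```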