• serenissi@lemmy.world · ↑11 · 21 hours ago

    What is the actual use case of this interpolation feature? It must require capable hardware, so it doesn’t exist for nothing.

    • BorgDrone@feddit.nl · ↑14 · 21 hours ago

      I think the idea is to increase motion resolution.

      On a sample-and-hold display, like an LCD or OLED, the image on the screen stays the same for the entire frame. When the image suddenly changes because the TV displays a new frame, our eyes need a bit of time to adjust. The result is that when there is a lot of motion on screen, the image starts to look blurry.

      This was not an issue on older CRT displays because they used a beam that scanned the picture. Each ‘pixel’ (CRTs didn’t have pixels, they had lines, but you get the idea) would only light up for a small amount of time. Since our eyes are relatively slow we didn’t notice the flickering much, and because each spot wasn’t lit all the time the receptors in our eyes didn’t get saturated and could quickly adjust to the new frame.

      By adding interpolated frames the image changes more often and this allows our eyes to keep up with the action. Another solution to the problem is black frame insertion, where the TV shows a black image between each frame. Again we don’t perceive this as flickering as our eyes are too slow for this, but the disadvantage is that the picture brightness seems to halve.

      How much blurriness you get in motion is a function of both how fast the movement on screen is and the frame rate. Fast movement and low frame rates cause more blurriness than slow movement and high frame rates.

      The use case for this feature is mainly fast sporting events on broadcast TV, where fast movement (e.g. a soccer ball) is combined with the low frame rate of broadcast TV (30 or 25 fps depending on where you are).
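
      In its naive form, interpolation just blends adjacent frames. Real TVs do motion-compensated interpolation instead, but a plain blend (a hypothetical sketch, not any vendor’s actual algorithm) shows the basic idea of inserting an in-between frame:

```python
import numpy as np

def interpolate(frame_a: np.ndarray, frame_b: np.ndarray, t: float) -> np.ndarray:
    """Naive linear blend between two frames at time t in [0, 1].

    Real TVs estimate per-block motion vectors and warp pixels along them;
    a plain blend like this produces ghosting on fast motion, which is
    exactly why motion-compensated interpolation exists.
    """
    return (1.0 - t) * frame_a + t * frame_b

# Doubling 25 fps to 50 fps means inserting one such frame between each pair.
a = np.zeros((4, 4))   # dark frame
b = np.ones((4, 4))    # bright frame
mid = interpolate(a, b, 0.5)
```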

      • dual_sport_dork 🐧🗡️@lemmy.world · ↑2 · 17 hours ago

        I suspect it’s also meant to mitigate the modern fascination with buying TVs that are too big, and sitting far too close to them all the time. If your soccer ball example involves the ball being in one position on this frame, and literally six inches away in real distance on the surface of the screen on the next, and this is happening all the time, people will get fatigued and cranky watching it for extended periods.

        • BorgDrone@feddit.nl · ↑2 · 17 hours ago

          Unless you’re Elon Musk rich, it’s pretty much impossible to buy a TV that is too big.

          I own a 77” TV and the optimal viewing distance for that is 2.7 meters for a THX recommended viewing angle of 36°. The size goes up quickly the farther you sit from the screen. If your couch is 4 meters from the screen you’re already looking at a 114” screen to get the same 36° angle.
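
          The geometry here is just arctangent math; a quick sketch (assuming a 16:9 panel and measuring the angle subtended by the screen width) roughly reproduces those numbers:

```python
import math

def viewing_angle_deg(diagonal_m: float, distance_m: float,
                      aspect: float = 16 / 9) -> float:
    """Horizontal angle a 16:9 screen subtends at the given viewing distance."""
    width = diagonal_m * aspect / math.hypot(aspect, 1)  # diagonal -> width
    return math.degrees(2 * math.atan(width / (2 * distance_m)))

# A 77-inch TV (0.0254 m per inch) viewed from 2.7 m subtends roughly
# 35 degrees, close to the 36-degree THX figure quoted above.
angle = viewing_angle_deg(77 * 0.0254, 2.7)
```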

    • tehmics@lemmy.world · ↑32 · 21 hours ago

      Motion smoothing: the frame interpolation feature in TVs. It’s what makes movement look unnatural, and it’s on by default. Old people can’t tell / don’t understand, so it’s customary to sneakily disable it for them when visiting.

    • Blackmist@feddit.uk · ↑14 · 21 hours ago

      Yes. Motion smoothing. It’s like kerning or the Wilhelm scream. Once you notice it, you’ll hate it.

      It makes the slow panning forests and splashy paint videos in Currys look nice, but it makes movies and TV shows look terrible.

      • varyingExpertise@feddit.org · ↑3 · 17 hours ago

        Eh, I’ve been adding it on purpose to technical and astronomy documentaries during transcoding for my library. 23.whatever fps NTSC pulldown is just choppy.

        • Blackmist@feddit.uk · ↑1 · 17 hours ago

          Yeah, the way I see it, if what you’re looking at is real it might look alright. Sport, etc.

          For anything with special effects, it looks like unfinished behind the scenes footage. I saw The Hobbit in high frame rate and 3D, and let me tell you, it just looked like Martin Freeman in rubber feet. Although in fairness the whole film was gash even in standard 24 fps.

      • Katana314@lemmy.world · ↑44 ↓2 · 1 day ago

        It’s a terrible effect, and people who don’t spend much time in their TV’s setup may not know or think to turn it off - or they delude themselves into thinking they like the effect.

      • krooklochurm@lemmy.ca · ↑16 ↓1 · 23 hours ago

        If you’re watching a TV and the frame rate hitches all over the place every few seconds, then one of these stupid fucking settings is on.

        The shit wouldn’t be so fucking awful if it could actually maintain a stable frame rate but it can’t. None of them can.

    • sonofearth@lemmy.world · ↑21 · 1 day ago

      Wait, DLSS is about upscaling, right? The “features” mentioned in OP’s post are about motion interpolation, which makes video seem to play at a higher fps than the standard 24 fps used in movies and shows.

      • vithigar@lemmy.ca · ↑13 · 22 hours ago

        Because names mean nothing, Nvidia has also labeled their frame generation as “DLSS”.

      • lemming741@lemmy.world · ↑13 ↓2 · 1 day ago

        It allows more resolution by cutting the fps. Fake frames are inserted into the gaps to get the fps back.

          • NekuSoul@lemmy.nekusoul.de · ↑12 · edited · 23 hours ago

            It’s both. Nvidia just started calling everything DLSS, no matter how accurately it matches the actual term.

            Image upscaling? DLSS. Frame generation? DLSS. Ray reconstruction? DLSS. Image downscaling? Surprisingly, also DLSS.

            • AdrianTheFrog@lemmy.world · ↑1 · 18 hours ago

              Frame generation is the only real odd-one-out here; the rest use basically the same technique under the hood. I guess we don’t know exactly what ray reconstruction is doing since they’ve never released a paper or anything, but I think it basically combines DLSS upscaling with denoising in the same pass.

          • dual_sport_dork 🐧🗡️@lemmy.world · ↑3 · 17 hours ago

            What you’re thinking of is “DLSS Super Resolution.” The other commenters are right, nVidia insists on calling all of their various upscaling schemes “DLSS” regardless of whether they’re image resolution interpolation or frame interpolation. Apparently just to be annoying.

            There is a marginally handy chart on their website.

            All of it is annoying and terrible regardless of what it’s called, though.

        • Yggstyle@lemmy.world · ↑2 · edited · 22 hours ago

          It’s simply “visual noise” that tricks the viewer into thinking they’re getting more of something than they are. It’s cheap, inconsistent filler. It’s Nvidia not admitting they hit a technical wall and needing a way to force new, inferior products onto the market to satisfy sales.

    • AdrianTheFrog@lemmy.world · ↑2 · 18 hours ago

      DLSS Frame Generation actually uses the game’s analytic motion vectors though instead of trying to estimate them (well, really it does both) so it is a whole lot more accurate. It’s also using a fairly large AI model for the estimation, in comparison to TVs probably just doing basic optical flow or something.

      Whether it’s actually good, though, depends on whether you care about latency and whether you notice the visual artifacts in the game you’re using it for.
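
      The gap between the two approaches can be sketched. A game engine can hand the interpolator exact per-pixel motion vectors, while a TV has to estimate them from the pixels alone. A toy forward warp using known vectors (hypothetical and heavily simplified: nearest-neighbour, no occlusion handling or subpixel filtering) looks like this:

```python
import numpy as np

def warp_with_motion_vectors(frame: np.ndarray, mv: np.ndarray,
                             t: float) -> np.ndarray:
    """Shift each pixel along its known motion vector, scaled by t in [0, 1].

    mv[y, x] = (dy, dx), the per-pixel motion over one frame interval, as a
    game engine could provide analytically; a TV must estimate this from the
    video itself, which is where most interpolation artifacts come from.
    """
    h, w = frame.shape
    out = np.zeros_like(frame)
    for y in range(h):
        for x in range(w):
            ny = int(round(y + t * mv[y, x, 0]))
            nx = int(round(x + t * mv[y, x, 1]))
            if 0 <= ny < h and 0 <= nx < w:
                out[ny, nx] = frame[y, x]
    return out

# A single bright pixel moving 2 px right per frame lands 1 px over at t=0.5.
f = np.zeros((4, 4))
f[1, 1] = 1.0
vectors = np.zeros((4, 4, 2))
vectors[..., 1] = 2.0  # dx = 2 everywhere
half = warp_with_motion_vectors(f, vectors, 0.5)
```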

      • Yggstyle@lemmy.world · ↑2 · 22 hours ago

        Motion blur is consistent and reproducible using math. The other isn’t. Something that cannot produce consistent results and is sold as a solution does have a name though: snake oil.

        • AdrianTheFrog@lemmy.world · ↑1 · 17 hours ago

          Motion blur in video games is usually a whole lot less accurate at what it’s trying to approximate than averaging 4 frame generation frames would be. Although 4 frame generation frames would be a lot slower to compute than the approximations people normally make for motion blur.

          Yes, motion blur in video games is just an approximation, and it usually has a lot of visible failure cases (disocclusion, blurred shadows, sometimes rotary blur). It obviously can’t recreate the effect of a fast-blinking light moving across the screen during a frame.

          It can be a pretty good approximation in the better implementations, but the only real way to ‘do it properly’ is to render multiple frames per shown frame, or to render stochastically (not really possible with rasterization, and it obviously introduces noise). Perfect motion blur would be the average of an infinite number of frames over the period between the current frame and the last one.

          With path tracing you can do the rendering stochastically, and you need a denoiser anyway, so you can actually get very accurate motion blur. As the number of samples approaches infinity, the image approaches the correct one.
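
          That ‘average of many frames’ definition can be sketched directly: render the scene at several sub-frame times within the shutter interval and average the results. A toy 1-D example, where `render` is a hypothetical stand-in for a real renderer:

```python
import numpy as np

def motion_blurred(render, t0: float, t1: float, samples: int) -> np.ndarray:
    """Approximate ideal motion blur by averaging renders at sub-frame times.

    As `samples` grows this converges to the time-average over the shutter
    interval [t0, t1], i.e. the 'infinite number of frames' limit.
    """
    times = np.linspace(t0, t1, samples)
    return np.mean([render(t) for t in times], axis=0)

def render(t: float) -> np.ndarray:
    """Toy renderer: one lit pixel sliding across an 8-pixel scanline."""
    frame = np.zeros(8)
    frame[int(t * 8) % 8] = 1.0
    return frame

# Over half a frame interval the point smears across pixels 0..4; total
# light energy is preserved by the averaging.
blurred = motion_blurred(render, 0.0, 0.5, samples=64)
```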

          Some academics and Nvidia researchers recently coauthored a paper about optimizing path tracing, applying ReSTIR (a technique for reusing information across pixels and across time) to scenes with motion blur, and the results look very good (obviously still very noisy; I guess Nvidia would want to train another ray reconstruction model for it). Apparently it’s also better than normal ReSTIR or Area ReSTIR when there isn’t motion blur. It relies on a lot of approximations too, so probably not quite unbiased path-tracing quality if allowed to converge, but I don’t really know.

          https://research.nvidia.com/labs/rtr/publication/liu2025splatting/

          But that probably won’t be coming to games for a while, so we’re stuck with either increasing framerates to produce blur naturally (through real or ‘fake’ frames), or approximating blur in a more fake way.

  • nullptr@lemmy.world · ↑19 · 2 days ago

    I’ve read the post, I’ve read the comments, and I still have no idea what we’re talking about.

    • edinbruh@feddit.it · ↑39 ↓1 · 1 day ago

      The post is about when you are a tech savvy person, and go to a relative’s house for the holiday and see some piece of tech with default configuration. Often tech companies (especially TV companies) enable buzzword technology to trick non tech savvy people into believing there was an improvement where there actually wasn’t. Often, inspection with a more educated eye reveals that the result actually looks bad and ruins the original media (unless it was already terrible).

      In this case the gripe is with frame smoothing technologies, which look smeared and ruin details and timing of movies. But to someone who doesn’t know better it looks like “whoa, it really is smoother, I’m gonna smooth all the smoothing with my new extra smooth smoother; the smoothness salesman sold me real smooth on this” (I’m calling out the dishonest seller, not the consumer with this).

      So when the tech-savvy person sees the swindled relative, they try to fix the situation by disabling the bullshit, but every brand gives it a different patented bullshit name.

      It’s worth noting that inevitably, as soon as you leave the house the relatives will:

      • Not notice a thing
      • Call you because the TV “doesn’t do the thing it did before anymore” and you have to explain that you did it and why it’s better until they ask you to put it back
      • Spend too much time trying to put the thing back on their own, making even worse choices along the way

      To actually help them you should have been involved in the choice of device, but if you ever got involved in a choice you would automatically become the designated tech purchase advisor forever and ever.

    • korazail@lemmy.myserv.one · ↑11 · 1 day ago

      Go watch any old Disney movie (think Cinderella, Bambi) on Disney+. Notice how it’s nice and sharp: it’s been upscaled. Notice how the frame rate is fast: it’s been interpolated.

      Now, closely watch the edges of the lines. They are inconsistent, smeared and now you can’t not see it… Sorry

      Many modern TVs are now doing this by default and it’s rarely a better experience.

    • weariedfae@sh.itjust.works · ↑4 · 20 hours ago

      You know how TV shows all look like soap operas nowadays? It’s because of goddamn motion smoothing and it is on by default on most TVs.

      People who can’t tell are monsters. Blind monsters.

      This post is for those of us willing to put in the work to turn it off and restore balance to the universe.

    • bookmeat@lemmynsfw.com · ↑4 ↓1 · 21 hours ago

      You’re the one whose tv will be fixed when family comes to visit over the holidays.

    • ninjabard@lemmy.world · ↑9 · 1 day ago

      It’s a setting in newer TVs that “smooths” frames for lower quality media to maximize the capabilities of modern TV hardware. It very rarely looks good. This post lists what the major manufacturers call the technology.

    • brillotti@lemmy.world · ↑31 · edited · 2 days ago

      It’s the setting to disable on smart TVs for a better image. The option can do one or more of the following: add in-between frames, reduce noise, and upscale video. Sounds good, but the implementation is always terrible.

      • Asafum@feddit.nl · ↑27 · 2 days ago

        Is this what that uncanny “too smooth” look is on my parents TV? Whenever I’d go to visit them whatever they had on always looked like the camera motion or character movement was way too smooth to the point it was kind of unsettling.

          • GalacticGrapefruit@lemmy.world · ↑12 · 1 day ago

            Animators too. FFS, when you deliberately position a character to look fluid, life-like, and emphatic at a given frame rate, you have to respect it or you lose it! Adding frames willy-nilly ruins movies and animation. Don’t like it? Wanna be a gamer? Well, maybe just sit tight and accept that the artist, idfk, knew how to do their fucking job.

            Personal rant here. I hate automated interpolation. I would literally prefer it if you deep-fried my work by overcompressing it over and over to ‘save space.’

      • Victor@lemmy.world · ↑1 ↓1 · 2 days ago

        I’m enjoying LG’s implementation. 🤷‍♂️ I have that stuff enabled, but not on the outputs where I play games.

      • kadu@scribe.disroot.org · ↑1 ↓1 · 2 days ago

        I usually hate those too…

        But they’re not universally bad. OLED screens have almost instantaneous response times, which, paired with lateral camera movement in content shot at 24 fps, can produce a stuttery mess instead of a smooth pan. In some movies it’s enough to give me a headache.

        In those scenarios, one of the interpolation settings available on my LG C1 instantly fixes the issue without adding significant artifacts. The goal isn’t simulating 120 fps on a TV show, but working around content filmed at an abysmally low frame rate (a sensible choice when film was expensive and TVs were blurry, less so in 2025).

    • bamboo@lemmy.blahaj.zone · ↑3 · 2 days ago

      This is about motion smoothing, which is enabled by default on modern TVs and gives video a soap-opera look. People think it makes the video look better, but it’s just adding fake frames to display at a higher frame rate. Not a lot of people like this: https://variety.com/2022/film/news/motion-smoothing-how-to-shut-off-1235176633/

      To make matters worse, all TV brands have their own name for this feature. This post is saying that when you go home for the holidays, these are the names of the motion smoothing setting to turn off for a better viewing experience, the way the filmmaker intended.

      • Dozzi92@lemmy.world · ↑2 · 2 days ago

        I’ve been turning off smoothing for at least 15 years at this point. Not every terrible tech is AI.

  • otacon239@lemmy.world · ↑11 · 2 days ago

    I went to a friend’s house recently where this was enabled. I couldn’t bite my tongue for more than a few minutes before I had to bring it up. They were instantly impressed with how much better it looked lol.

  • Corkyskog@sh.itjust.works · ↑1 · 1 day ago

    How new does your TV have to be to have this? I just scrolled through settings and couldn’t find it. Is it normally in screen settings?

    • GlendatheGayWitch@lemmy.world · ↑6 · 1 day ago

      If it’s a smart TV, it’s probably there, maybe under a different name. I think my six-year-old Samsung calls it motion smoothing.

    • Final Remix@lemmy.world · ↑2 · 1 day ago

      It could be in “input settings” too, since it may be device-specific (HDMI 1 on Game mode, etc.)

      My “120 Hz” true-motion-plus enabled Panasonic plasma had that shit, and it was a decade old.

  • brap@lemmy.world · ↑2 · 2 days ago

    I think I’m the only person I know who doesn’t mind it.

    Shit, maybe I’m old and that’s what everyone else is going to look for and turn off.

    • jpeps@lemmy.world · ↑1 · 17 hours ago

      Unpopular opinion but generally I agree. Used it for a long time despite being someone that ‘always notices’ etc etc and honestly preferred it. Eventually turned it off though because every now and then I would be distracted by artifacts during fast movement.

        • flubba86@lemmy.world · ↑3 · 2 days ago

          Well, it uses AI to insert extra frames in between the real frames. So it doesn’t just look like AI content, it is AI content, spliced between every frame of your anime.

          • Barbecue Cowboy@lemmy.dbzer0.com · ↑1 · 1 day ago

            I’m hesitant to call it AI for the stuff available on TVs. It’s similar at some level but drastically simpler and the tech most TVs will use has been around way longer than what we’d typically call AI. Motion smoothing is usually math and algorithms we mostly could understand.

            Now, if we’re talking DLSS/etc on a PC, yeah, that probably qualifies as AI but there’s a lot more that goes into that.

          • Cort@lemmy.world · ↑2 · 2 days ago

            Or, if you’re at 120 Hz, it’s 3 frames of AI for each frame of 30 fps video.

  • TriangleSpecialist@lemmy.world · ↑1 · 2 days ago

    “Photography is truth. The cinema is truth 24 times per second, plus some neural frame interpolation spliced in between.”

    • Jensen Godard, probably