This is for multiplying your FPS by 3x or 4x, but the input lag, ghosting, stuttering, and other issues make everything worse. Overall I'd recommend sticking to Lossless Scaling at 2x or not using frame gen at all.

This doesn't surprise me. By the raw math, frame gen makes no sense to me unless you're already hitting 120 FPS natively, which means you need at minimum a 240Hz display to make use of it.

Basic math: to generate an in-between frame, you must already have the next real frame ready. That means displaying each frame is delayed by a frame, so your input lag is equivalent to natively running at half the rate you're actually rendering at. And that's assuming flawless, instant frame generation. All for "motion smoothness", a vague and, IMO, not all that important element of game feel.
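To make that concrete, here's the napkin math as code. This is a minimal sketch of the hold-one-frame assumption above (the `framegen_feel` helper is my own illustrative name, and it assumes a flawless, instant generator):

```python
# Napkin model: interpolation has to hold the newest real frame until the
# NEXT real frame exists, so render-to-display latency roughly doubles.
def framegen_feel(native_fps: float, multiplier: int) -> dict:
    frame_time_ms = 1000 / native_fps       # time to render one real frame
    held_latency_ms = 2 * frame_time_ms     # render time + one held frame
    return {
        "displayed_fps": native_fps * multiplier,   # the "motion smoothness"
        "input_lag_ms": held_latency_ms,
        "feels_like_fps": 1000 / held_latency_ms,   # native rate with the same lag
    }
```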

So, crunch some numbers. Natively running at 60? Neat, you can have the "motion smoothness" of 120 for the input lag of 30. Not worth it IMO; 30 feels pretty rough when you're used to 60.

Native 120? Alright, the difference in input lag compared to 60 is way smaller. ~8ms of added lag is tolerable, and with 4x frame gen you can drive a 480Hz monitor. Pretty good, and the time gap between real frames is small enough that you'll have minimal visible errors in the generated frames. The question, of course, being... do you own a 480Hz monitor? Not to mention 120 already has solid motion smoothness, so it's still kind of a questionable trade. I'd still personally prefer native 120, but it's at least reasonable.

A debatable sweet spot might be 80-100 FPS: the resulting 40-50 FPS input lag is at least halfway from 30 back toward 60 (measured in milliseconds of frame time), and you can multiply into more realistic monitors than 480Hz. A 360Hz display, enough to fully leverage 4x frame gen from a 90 FPS base, is something you're more likely to actually own.
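Plugging the scenarios above into that same hold-one-frame model (illustrative numbers only; the cost of generating the frames themselves is ignored):

```python
# (native FPS, frame gen multiplier) pairs from the comments above
for native, mult in [(60, 2), (120, 4), (90, 4)]:
    lag_ms = 2 * 1000 / native  # hold-one-frame latency model
    print(f"{native} native x{mult}: {native * mult}Hz motion, "
          f"{lag_ms:.1f}ms lag (feels like {1000 / lag_ms:.0f} FPS)")

# 60 native x2: 120Hz motion, 33.3ms lag (feels like 30 FPS)
# 120 native x4: 480Hz motion, 16.7ms lag (feels like 60 FPS)
# 90 native x4: 360Hz motion, 22.2ms lag (feels like 45 FPS)
```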

End of the day though, my core takeaway is that frame gen is incredibly niche. You either need to be obsessive about motion smoothness without caring about input lag, have a hella fast monitor and great performance, or uh... most likely, not understand any of this and want framerate go bigger.

Your concept of halving the framerate is wrong though... If you're natively at 100fps, each frame takes 10ms to render. Enable frame gen and you get 10ms of additional latency (plus the latency from generating the frame itself, which is often 1 or 2ms). That's a lot less than what you claim, which would be 20+ms from feeling like 50fps.

It's more comparable to the latency from VSync if anything, which people have been using for decades, even on 30fps content.

There are plenty of tests online that show this; you don't even need to napkin-math it out like this.

What? Your numbers are right: if you were running the game at 100FPS, it would take 10ms to render a frame, plus your 10ms of additional latency from holding the frame. 10ms + 10ms is 20ms.

If you were running the game natively at 50FPS, it would take 20ms to render a frame. That's the same number. The total input lag from rendering is identical. Add in the slowdown from your GPU rendering the in-betweens and it's even worse.
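Spelled out with the same napkin math (again assuming flawless, instant generation):

```python
render_ms_at_100fps = 1000 / 100    # 10ms to render each real frame
hold_ms = 1000 / 100                # 10ms waiting for the next real frame
framegen_total_ms = render_ms_at_100fps + hold_ms    # 20ms with frame gen on

render_ms_at_50fps = 1000 / 50      # 20ms: identical latency, no frame gen needed
```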

VSync may complicate this though, depending on the method, since you may already be holding a frame for some amount of time; I hadn't considered that. I personally use VRR, so it isn't a factor on my setup.

A lot depends on the game too. Some games naturally have slower movement, slower weapon swings, etc. In those slower-paced games, some added input lag can be unnoticeable, while it would feel like a major issue in a twitchier game.

It's also worth mentioning that the popular LSFG (Lossless Scaling Frame Gen) option doesn't work this way. Unlike baked-in frame gen, the game engine's ability to accept input isn't delayed at all, since the additional frames are added after the fact. That means the generated frame quality is lower, but input lag is much lower in most games.

It depends entirely on the genre. Turn-based games are EXCELLENT candidates for frame generation. Who cares about sub-second input lag when you have unlimited time to make your selections? It's well worth a little bit of lag to run a game on much higher quality settings and still maintain 60+ FPS.

Frame Gen is by its very design a really stupid technology

Upscaling tech (DLSS/FSR/etc) is nice as a way to help older/weaker hardware play newer games, and I've really appreciated it on the Deck. I really don't like it when devs use it as a crutch to avoid optimizing their games to an acceptable level.

Frame gen is in a worse spot because it usually only works well on hardware that can already hit 60fps. I've never found a built-in frame gen option that was actually usable on the Deck without horrendous input lag and/or graphical issues.

Lossless Scaling's frame gen is a sometimes-exception; I've found a few Deck games it works really well with. There are still occasional graphical issues/ghosting, but it can help out quite a bit. It's weird to me that third-party software from a small dev works better than integrated FG from the game devs/GPU makers, but it is what it is.

It's pretty cool even if your hardware isn't struggling; FSR's Native AA at 1440p helps a bunch, especially if a game has fucked-up TAA/sharpening.

Antialiasing and upscaling are fundamentally different tech than frame generation

I know? I'm replying to a comment talking about FSR

...but they were talking about it in response to a comment about frame-gen

I'm just calling out the drift in the meaning of the comment chain. The entropic drift, if you will.

They mentioned FSR as being a cool thing for struggling hardware. I expanded on it. That's how threads work usually lmao

If DLSS reaches its fifth generation, AI can destroy the component supply for a second year.

Can’t wait!

But they make the numbers bigger! Biggest number is best number!

Pfft, you get a chart with no legend, we don't need no stinking numbers

Nvidia slop on an AMD GPU?

do unsupported thing on unsupported hardware

it's bad

*shocked Pikachu*

Last time I tried one of these filters, the game looked terrible (it was Death Stranding). It applied a weird squiggly, distorted look that wasn't even remotely an improvement. And for games where you need precision and timing, I can see this making multiplayer hell.

When I had an RTX 2060 laptop, I had the most jank setup for Cyberpunk 2077.

I’d plug it into an older Sony OLED, which only supported low res HDMI input (can’t remember which res, I think 1080p?)

The RTX 2060 would run DLSS Quality (for antialiasing) and output 2077 at low-res 60fps, and the TV would use its big ASIC to interpolate it to 120Hz and upscale it to 4K.

And actually, it looked good! It felt smooth! Input lag wasn’t great, but absolutely playable.

I don’t have a PC that can do framegen (3090 now), but ironically, the DLSS framegen demos I’ve seen didn’t have interpolation as good as the Sony. And I believe Sony/Samsung support “no next frame” interpolation, so they don’t blow up input lag.

I’m not saying it’s a great idea, but there are ways of doing this that aren’t terrible.

This is anecdotal, but I don't get all the frame gen hate. I've had to tweak it a little toward the quality setting, but everything looks normal and it's totally smooth in maxed-out Cyberpunk at 165 fps.

If the rendered framerate is over 60fps (which I'd wager is the case for you), it probably looks great.

Interpolation isn't psychic; of course it's going to look like jello "guessing" what's between frames at a slideshow pace, especially under the constraint of low latency (without any future frames to use).

But I do have an issue with some devs (and some of Nvidia's marketing) treating it as a crutch. It's not going to fix 15fps, but it's a sane way to get from 60 to 165 smoothly. TBH it's less wasteful than trying to hit a native 165.

Yeah, with frame gen off it's still like 110-120, with dips into the high double digits.

Exactly. Frame gen is trash if you can only hit like 28 FPS before turning it on. But if you are already over 60, it can be fine.

But there is always a latency penalty, and that's why, if you can't make framerate without it, you're just digging a deeper hole latency-wise.
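The same napkin math from upthread shows how deep that hole gets at a low base framerate (illustrative numbers only, ignoring the cost of generating the frames themselves):

```python
base_fps = 28                 # struggling before frame gen is turned on
frame_ms = 1000 / base_fps    # ~35.7ms to render each real frame
total_ms = 2 * frame_ms       # ~71.4ms once a frame is held for interpolation
# ...which is the input feel of roughly 14 FPS, however smooth the output looks.
```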

Ah, right.

That’s sad. AFAIK Oculus developed more integrated interpolation techniques (and, separately, warping) to reduce perceived latency:

https://www.uploadvr.com/reprojection-explained/

But that’s like black magic these days. Studios can’t even get basic DLSS integration right.

The steam deck just isn't powerful enough for the games I play now. It's a shame because I love the device.

I think a lot of us are eventually going to upgrade to stronger systems (steam machine or something else), but it's going to be a long freaking time in that scenario before I stop streaming to my Deck when I'm home. It's just the king of comfy, kicked back on the couch or bed gaming.
