For me it'd be a big deal, as FSR1 looks awful and FSR2 looks bad - the whole "decoupling render resolution from output resolution" idea is a delusion born of trying to push current GPUs to do things they aren't really capable of. In 10 years we'll look back and make fun of how smeary, ghosty and blurry everything was.
> trying to push current GPUs to do things they are not really capable of doing
Sure, but until we get better GPUs, it's still better than not being able to play at all. Especially with the Steam Deck being battery powered, you can't just brute force it with higher clocks or more hardware. Even this new Steam Deck doesn't include a better GPU.
No, the better solution (IMO at least) is to write code that can run on current GPUs without relying on upscaling tech. Upscaling tech should be something to use once your GPU is old so you can run new games, not something that new games should rely on to run on current GPUs.
As for the Steam Deck specifically, its current display resolution is perfect for its size; a higher resolution would give marginal gains - and actually worse results if you need to rely on upscaling to reach it (at which point you might as well stick with the current resolution and target it natively, so you won't even pay the upscaler's overhead).
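To put some rough numbers on that: the per-axis scale factors below are the ones AMD publishes for FSR 2's quality presets, and the screen size is just a hypothetical 1080p handheld for illustration.

```python
# Per-axis scale factors for FSR 2's quality presets (as documented by AMD).
PRESETS = {
    "Quality": 1.5,
    "Balanced": 1.7,
    "Performance": 2.0,
}

def internal_resolution(out_w, out_h, preset):
    """Internal render resolution the GPU actually shades for a given output."""
    scale = PRESETS[preset]
    return round(out_w / scale), round(out_h / scale)

# A hypothetical 1080p handheld screen running FSR 2 "Quality":
w, h = internal_resolution(1920, 1080, "Quality")
print(w, h)  # 1280 720 - the same pixel count as a native 720p panel,
             # except you still pay for the upscaling pass on top.
```

In other words, driving a 1080p panel at the "Quality" preset shades exactly as many pixels as a native 720p panel would, which is the overhead argument in a nutshell.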
Not just old GPUs, but also low-end GPUs - which the Deck definitely is; that's why its performance target is 720p. I shouldn't expect to plug it into a 4K screen and get playable performance without an upscaler. I also disagree with 720p being perfect for the size: I also have an Asus Ally, which is 1080p, and it's a noticeable difference.
> actually worse results if you need to rely on upscaling to reach it
Not worse than just outputting the lower resolution. Otherwise there'd be no point to any of these upscalers.
I don't think comparing the overhead against the current resolution is apples-to-apples, because modern upscalers also double as AA. Some modern games even force TAA, which upscalers replace. At lower resolutions especially, insufficient AA is more noticeable. And TAA kinda sucks.
Alan Wake 2 only offers FSR2 or DLSS instead of any other AA option, even when running at native resolution. Other games, like Diablo IV, give you the choice of either TAA or an upscaler.
> I also disagree with 720p being perfect for the size. I also have an Asus Ally which is 1080p, and it is a noticeable difference.
You can notice it, but IMO the drawbacks are not worth it. 720p will both perform faster and use less battery. The additional fidelity isn't worth the cost.
> Not worse than just outputting the lower resolution. Otherwise there'd be no point of any of these upscaler.
IMO it is actually worse - even on my main PC, in any game that offers a choice between "smart" upscaling and plain old bilinear, I always choose the latter (assuming I can't run the game at native resolution at 60fps - which sadly seems to be the case with UE5 games and my RX 5700 XT), because the other options are both (slightly) slower and look worse. They look fine in static images or when the camera/objects don't move much, but fast changes create noticeable artifacts - especially in third-person games, where the character model often has a very visible "pixelly" outline. It looks like those 2D pixel art games that arbitrarily mix resolutions and end up looking garish. I'd rather have the consistent quality of bilinear upscaling.
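For reference, "plain old bilinear" is simple enough to sketch in a few lines. This is a toy Python version, not what any GPU actually runs (hardware does this in the texture units): each output pixel is a weighted average of the four nearest source pixels, with no history buffer and no motion vectors - which is exactly why it's uniformly soft but never ghosts or breaks up in motion.

```python
def bilinear_upscale(src, out_w, out_h):
    """Upscale a grayscale image (list of rows) to out_w x out_h."""
    in_h, in_w = len(src), len(src[0])
    out = []
    for y in range(out_h):
        # Map the output coordinate back into source space.
        fy = y * (in_h - 1) / (out_h - 1) if out_h > 1 else 0.0
        y0 = int(fy)
        y1 = min(y0 + 1, in_h - 1)
        wy = fy - y0
        row = []
        for x in range(out_w):
            fx = x * (in_w - 1) / (out_w - 1) if out_w > 1 else 0.0
            x0 = int(fx)
            x1 = min(x0 + 1, in_w - 1)
            wx = fx - x0
            # Blend horizontally on the two nearest rows, then vertically.
            top = src[y0][x0] * (1 - wx) + src[y0][x1] * wx
            bot = src[y1][x0] * (1 - wx) + src[y1][x1] * wx
            row.append(top * (1 - wy) + bot * wy)
        out.append(row)
    return out

# 2x2 gradient upscaled to 3x3: corners match the source pixels exactly,
# everything in between is a smooth average.
print(bilinear_upscale([[0, 100], [100, 200]], 3, 3))
```

The whole thing is a single pure function of the current frame, which is why its failure mode (softness) is consistent, unlike temporal upscalers whose failure modes depend on how the scene is moving.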
Or, to be on topic, I'd rather have no upscaling at all and render at the native resolution of the monitor, with that resolution chosen sensibly for the screen size, so the underlying hardware can actually reach it at playable framerates.