
FSR 2.0 - Let’s try this again


crispy4000


 

Good riddance to 1.0, which is essentially a sharpening filter.

 

I really do hope they can get it right this time.  Not expecting it to go tit for tat with DLSS, but I hope it’s something I’ll actually want to use with my aging GPU, and will benefit the consoles.

In other FSR news, Switch Sports supposedly uses the older implementation.


DLSS gets all the rage right now, but wasn't it the case that DLSS 1.0 was rather dogshit, and it wasn't until 2.0 where DLSS finally hit its stride? Not saying the same will apply with FSR, but I'm cautiously optimistic this'll fit the bill for many. Even with FSR 1.0, a buddy of mine told me that Ultra Quality worked very well in a game like RE Village, and even the Quality setting held up.

 

The tech can only get better at this point, and this is good news for all. :D


2 hours ago, imthesoldier said:

DLSS gets all the rage right now, but wasn't it the case that DLSS 1.0 was rather dogshit, and it wasn't until 2.0 where DLSS finally hit its stride? Not saying the same will apply with FSR, but I'm cautiously optimistic this'll fit the bill for many. Even with FSR 1.0, a buddy of mine told me that Ultra Quality worked very well in a game like RE Village, and even the Quality setting held up.

 

The tech can only get better at this point, and this is good news for all. :D


DLSS 1 was pretty good, but it required game-specific optimization, which limited its applicability. DLSS 2 was certainly better, but I think the main benefit is that they got it to work generally, without per-game optimization.


22 hours ago, BlueAngel said:

1.0 is pretty good as is if you need it, curious to see how this turns out. 

 

10 hours ago, imthesoldier said:

Even with FSR 1.0, a buddy of mine told me that Ultra Quality worked very well in a game like RE Village, and even the Quality setting held up.

 

FSR 1.0 works okay in dark games with heavy post-processing effects.  With almost everything else, you're getting a poor visual trade-off for the performance gains.  That is, unless you're nearly hitting 4K already and can run the highest FSR setting to get there.  Lower native resolutions look better in most other circumstances.  The same is true of the FSR knock-off Nvidia slid out recently for older GPUs.


There have been reports of FSR helping excessively blurry Deck or Switch games look more legible.  That might be its most practical use as of yet.
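For a rough sense of the numbers involved, here's a quick sketch (in Python) of the internal resolutions FSR 1.0's quality modes imply at a 4K output. The per-axis scale factors below are the ones AMD published for FSR 1.0; actual per-game behavior can vary, so treat it as illustrative:

# Rough sketch: internal render resolutions implied by FSR 1.0's documented
# per-axis scale factors (Ultra Quality 1.3x, Quality 1.5x, Balanced 1.7x,
# Performance 2.0x). Per-game values can differ; this is only illustrative.
FSR_SCALE_FACTORS = {
    "Ultra Quality": 1.3,
    "Quality": 1.5,
    "Balanced": 1.7,
    "Performance": 2.0,
}

def render_resolution(target_w, target_h, scale):
    # FSR renders internally at target / scale, then upscales back to target.
    return round(target_w / scale), round(target_h / scale)

if __name__ == "__main__":
    target = (3840, 2160)  # 4K output
    for mode, scale in FSR_SCALE_FACTORS.items():
        w, h = render_resolution(*target, scale)
        print(f"{mode:>13}: {w}x{h} -> {target[0]}x{target[1]}")

At Ultra Quality you're already rendering close to 3K internally, which is why the trade-off only really holds up when you're near a 4K output to begin with.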


5 hours ago, imthesoldier said:

DLSS gets all the rage right now, but wasn't it the case that DLSS 1.0 was rather dogshit, and it wasn't until 2.0 where DLSS finally hit its stride? Not saying the same will apply with FSR

I have zero expectations that the same will apply with FSR. AMD has done nothing but let me down over and over and over. Weak GPUs, stuttery, underpowered CPUs, tech that arrives late and is inferior to the competition, and every time they "finally do it," it's always some raving review by someone who apparently only uses AMD. I fully expect this "way better FSR" to have massive issues that early adopters ignore.


I'd love to be wrong because a more universal competitor to DLSS would be great, but with AMD behind the wheel all I can hope is that when the car impacts the tree I'll be ejected through the windshield and into the passing nvidia truck.


On 3/12/2022 at 1:41 PM, Xbob42 said:

I have zero expectations that the same will apply with FSR. AMD has done nothing but let me down over and over and over. Weak GPUs, stuttery, underpowered CPUs, tech that arrives late and is inferior to the competition, and every time they "finally do it," it's always some raving review by someone who apparently only uses AMD. I fully expect this "way better FSR" to have massive issues that early adopters ignore.


I'd love to be wrong because a more universal competitor to DLSS would be great, but with AMD behind the wheel all I can hope is that when the car impacts the tree I'll be ejected through the windshield and into the passing nvidia truck.

 

I have a love-hate relationship with AMD, and formerly ATI Radeon. Some of my first PC gaming experiences were on ATI GPUs, but in the mid-2000s ATI was lacking in the shader department, so I dumped my X850XT (which had Shader Model 2.0) and went with an nVidia 7800GS (which had Shader Model 3.0). On paper they were about the same in performance, but crucially the nVidia card had shader features the ATI card did not, which many games at the time took advantage of.

 

After that, I still have yet to go back to AMD when it comes to GPUs. I do think AMD has something good going in the console space with Microsoft and Sony, and while nVidia is only in that pie via Nintendo, it's still a profitable endeavor for them. Call the Switch underpowered all you want, but the Tegra X1 has shown nVidia's potential in the mobile chipset realm and where they're headed. While things such as ray-tracing are highly unlikely to be used in the next Switch, I firmly believe Tensor cores for DLSS will be.


11 hours ago, imthesoldier said:

While things such as ray-tracing are highly unlikely to be used in the next Switch, I firmly believe Tensor cores for DLSS will be.


Any modern chip design should support raytracing in some fashion.  Like the Series S, we might not see it used much due to the performance hit.  But Tensor cores are used for both DLSS and RT acceleration on NVIDIA's side.


11 hours ago, crispy4000 said:


Any modern chip design should support raytracing in some fashion.  Like the Series S, we might not see it used much due to the performance hit.  But Tensor cores are used for both DLSS and RT acceleration on NVIDIA's side.

 

Correct me if I'm wrong, but don't you still need the RT cores to render the image before it's processed through the Tensor cores? Otherwise, wouldn't that put the burden on the CUDA cores, since I don't recall any of the Tegra chips having RT cores in them? Obviously it's possible to do ray-tracing in software, but that's very intensive, and running it on the CUDA cores would mean fewer resources dedicated to other work.


2 hours ago, imthesoldier said:

 

Correct me if I'm wrong, but don't you still need the RT cores to render the image before it's processed through the Tensor cores? Otherwise, wouldn't that put the burden on the CUDA cores, since I don't recall any of the Tegra chips having RT cores in them? Obviously it's possible to do ray-tracing in software, but that's very intensive, and running it on the CUDA cores would mean fewer resources dedicated to other work.

 

Yeah, I think you're right.  Don't see why they'd leave them out though, when they've been paired with CUDA cores since the start.

 

 


55 minutes ago, crispy4000 said:

 

Yeah, I think you're right.  Don't see why they'd leave them out though, when they've been paired with CUDA cores since the start.

 

 

 

I think it's simply down to die size, plus power usage. Add in the fact that you're dealing with a mobile chipset, and it probably makes less sense to include them to begin with. That'll probably change in the coming years for sure, but I don't see it happening this go around. Even if Tegra did have RT cores, Nintendo would likely just disable them, or outright have them removed in order to reduce costs. Based on recent patents, plus Nintendo's EAD division, Nintendo themselves are quite interested in machine learning and upscaling techniques (hence why something such as DLSS makes sense), but they don't appear to have the same feelings towards ray-tracing right now. This is the same company, after all, that opted against including the Super FX chip in the SNES for the NA market due to costs. You can definitely imagine what would've been possible if it had been baked into every system and more developers had utilized it, though.

 

But I'm sidetracking this whole thread at this point, so apologies for that.


15 minutes ago, imthesoldier said:

 

I think it's simply down to die size, plus power usage. Add in the fact that you're dealing with a mobile chipset, and it probably makes less sense to include them to begin with. That'll probably change in the coming years for sure, but I don't see it happening this go around. Even if Tegra did have RT cores, Nintendo would likely just disable them, or outright have them removed in order to reduce costs. Based on recent patents, plus Nintendo's EAD division, Nintendo themselves are quite interested in machine learning and upscaling techniques (hence why something such as DLSS makes sense), but they don't appear to have the same feelings towards ray-tracing right now. This is the same company, after all, that opted against including the Super FX chip in the SNES for the NA market due to costs. You can definitely imagine what would've been possible if it had been baked into every system and more developers had utilized it, though.

 

But I'm sidetracking this whole thread at this point, so apologies for that.

 

The most recent rumors suggest that whatever Switch comes next will have both.  Check the Nvidia leaks.

Nintendo haven't said anything about raytracing, afaik.  We wouldn't know they're not interested until the hardware shows they weren't.


2 minutes ago, crispy4000 said:

 

The most recent rumors suggest that whatever Switch comes next will have both.  Check the Nvidia leaks.

Nintendo haven't said anything about ray-tracing, upscaling or machine learning themselves, afaik.

 

Looking at the sites that talked about the leaks, it looked to be some Twitter user who mentioned "ray-tracing" support (as well as DLSS) because it uses the Ampere architecture, which normally would support it. Tegra Orin, though, normally does not have RT cores in it, so it would have to be pretty custom to include them. I do recall the leaks suggested it would use 8 ARM Cortex-A78 cores vs. the 12 that are in the standard T234 version. Interestingly, nVidia does have a version of the Tegra Orin chip, named the Jetson Orin NX, listed on their site:

 

DEVELOPER.NVIDIA.COM

Jetson Orin NX NVIDIA® Jetson Orin™ NX brings the most powerful AI computer for energy-efficient autonomous machines to the smallest Jetson form-factor. It accelerates the NVIDIA AI software stack with more than double the CUDA cores and up to 5X the performance of NVIDIA® Jetson Xavier™ NX, and supports multiple sensors with the latest high speed interfaces. The Jetson Orin NX module is coming...

 

If you ask me, this is the more realistic version of what would power the Switch 2. The memory bandwidth is cut in half, it has 12GB instead of 16GB, the CUDA cores are halved (1024 vs. 2048), it uses only 8 ARM cores instead of 12, and it can come in a 15W variant, the same as the Switch in docked mode. It's almost a perfect upgrade over the Tegra X1.

 

If the Switch 2 does truly have RT cores, and of course full ray-tracing support, I'll be incredibly surprised.


RDNA 3 is where you will finally see AMD seriously compete with nvidia in the gpu space for the first time in years. It will have been developed with a much higher r&d budget compared to previous gens.

I’m loving Intel coming back out swinging on the cpu front to keep prices in check. I hope it helps with gpu prices too. 


17 minutes ago, Massdriver said:

RDNA 3 is where you will finally see AMD seriously compete with nvidia in the gpu space for the first time in years. It will have been developed with a much higher r&d budget compared to previous gens.

I’m loving Intel coming back out swinging on the cpu front to keep prices in check. I hope it helps with gpu prices too. 

I'll be glad if they're finally able to deliver. One company dominating in the GPU or CPU space is fucking awful. Always leads to slow power increases and ridiculous pricing, even outside of scalpers.


20 minutes ago, Xbob42 said:

I'll be glad if they're finally able to deliver. One company dominating in the GPU or CPU space is fucking awful. Always leads to slow power increases and ridiculous pricing, even outside of scalpers.

AMD is very competitive in the CPU space, so much so that they were overcharging for cpus up until very recently when Intel struck back. I was a bit confused with your comment about stuttering cpus. They seem to work really well from my experience and a lot of gamers agree with me. Maybe I’m just missing something. 


3 minutes ago, Massdriver said:

AMD is very competitive in the CPU space, so much so that they were overcharging for cpus up until very recently when Intel struck back. I was a bit confused with your comment about stuttering cpus. They seem to work really well from my experience and a lot of gamers agree with me. Maybe I’m just missing something. 

AMD is not hyper-competitive when it comes to gaming CPUs, more so with general PC use or workstation use. Intel continues to dominate gaming with, on average, superior performance. Not that AMD's are bad or anything, my 3900X does just fine, but seeing DF comparison after DF comparison with high-end Intel CPUs leaves me wanting more.

 

As for stuttering, I was specifically talking about this:

 

WWW.PCMAG.COM

Customers began experiencing the problems after enabling the fTPM module on the Ryzen CPUs.

 

Easily solved by disabling their sad fTPM (if fTPM is enabled on Windows 10 OR 11, you can get this issue) but they've been hearing complaints for months and won't have a fix for months more. I was getting this exact stutter on a few games and it was driving me fucking insane because I had no idea it was related to their busted ass fTPM. Considering how much I cannot stand stutter, this is something I won't soon be forgetting when deciding on a new CPU in the future.


26 minutes ago, Xbob42 said:

AMD is not hyper-competitive when it comes to gaming CPUs, more so with general PC use or workstation use. Intel continues to dominate gaming with, on average, superior performance. Not that AMD's are bad or anything, my 3900X does just fine, but seeing DF comparison after DF comparison with high-end Intel CPUs leaves me wanting more.

 

As for stuttering, I was specifically talking about this:

 

WWW.PCMAG.COM

Customers began experiencing the problems after enabling the fTPM module on the Ryzen CPUs.

 

Easily solved by disabling their sad fTPM (if fTPM is enabled on Windows 10 OR 11, you can get this issue) but they've been hearing complaints for months and won't have a fix for months more. I was getting this exact stutter on a few games and it was driving me fucking insane because I had no idea it was related to their busted ass fTPM. Considering how much I cannot stand stutter, this is something I won't soon be forgetting when deciding on a new CPU in the future.

AMD needs to release zen 4 and 5 a lot quicker. The gap between 3 and 4 has been far too long. 
 

I’m running a 3800X on Win 10 and haven’t noticed. Maybe I’m not sensitive to it, but stutter usually bothers me. Where would I disable this security feature in Windows 10? Most articles I just searched for were applicable to Win 11.


5 minutes ago, Massdriver said:

AMD needs to release zen 4 and 5 a lot quicker. The gap between 3 and 4 has been far too long. 
 

I’m running a 3800X on Win 10 and haven’t noticed. Maybe I’m not sensitive to it, but stutter usually bothers me. Where would I disable this security feature in Windows 10? Most articles I just searched for were applicable to Win 11.

TPM is required to install Windows 11 (but not run it... funnily enough) so most people probably have it off by default. But if you have it on, you may experience the issue even on Windows 10.
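For what it's worth, if you want to confirm whether fTPM is actually active before digging through the BIOS, a quick sketch like this works on either Windows 10 or 11; it just shells out to the standard Get-Tpm PowerShell cmdlet and needs to be run from an elevated prompt. The toggle itself lives in the UEFI/BIOS, usually somewhere under the advanced AMD fTPM or security settings, though the exact menu name varies by board.

# Quick check of the current TPM/fTPM state on Windows by calling the
# built-in Get-Tpm PowerShell cmdlet. Run from an elevated prompt;
# disabling fTPM itself is done in the UEFI/BIOS, not from Windows.
import subprocess

def tpm_status() -> str:
    result = subprocess.run(
        ["powershell", "-NoProfile", "-Command", "Get-Tpm"],
        capture_output=True,
        text=True,
    )
    # Get-Tpm reports fields like TpmPresent and TpmReady; errors land on stderr.
    return result.stdout or result.stderr

if __name__ == "__main__":
    print(tpm_status())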


11 hours ago, Massdriver said:

RDNA 3 is where you will finally see AMD seriously compete with nvidia in the gpu space for the first time in years. It will have been developed with a much higher r&d budget compared to previous gens.

 

These days, I'd sooner buy a weaker card with less oomph on paper if it was better at upscaling and resolving detail.  On-paper specs have never mattered less.  Hopefully FSR 2.0 can change the narrative a little, and benefit the consoles too.

I'm interested to see what Intel is doing with XeSS and Arc for the same reasons.


9 hours ago, Xbob42 said:

AMD is not hyper-competitive when it comes to gaming CPUs, more so with general PC use or workstation use. Intel continues to dominate gaming with, on average, superior performance. Not that AMD's are bad or anything, my 3900X does just fine, but seeing DF comparison after DF comparison with high-end Intel CPUs leaves me wanting more.

 

As for stuttering, I was specifically talking about this:

 

WWW.PCMAG.COM

Customers began experiencing the problems after enabling the fTPM module on the Ryzen CPUs.

 

Easily solved by disabling their sad fTPM (if fTPM is enabled on Windows 10 OR 11, you can get this issue) but they've been hearing complaints for months and won't have a fix for months more. I was getting this exact stutter on a few games and it was driving me fucking insane because I had no idea it was related to their busted ass fTPM. Considering how much I cannot stand stutter, this is something I won't soon be forgetting when deciding on a new CPU in the future.

 

IMO, AMD's competitive edge tends to come in the form of power efficiency, plus their multi-threaded performance in general. I always recall Intel being dominant in single-threaded workloads and tending to win on raw horsepower and brute force. AMD's Ryzen, though, has been the necessary push to make Intel more competitive, but games, and still many applications, don't tend to use more than a few cores. I would've figured that, given the PS4 and X1's tenure, 8 cores would be the norm by now, but 6 cores is about the sweet spot, and I believe it'll remain that way for some time.

 

AMD's FSR won't be superior to nVidia's DLSS, but like FreeSync before it, its open-source nature will likely drive more adoption.

