crispy4000 Posted March 11, 2022 Posted March 11, 2022 Good riddance to 1.0, which is essentially a sharpening filter. I really do hope they can get it right this time. Not expecting it to go tit for tat with DLSS, but I hope it’s something I’ll actually want to use with my aging GPU, and will benefit the consoles. In other FSR news, Switch Sports supposedly uses the older implementation. 1 Quote
Mr.Vic20 Posted March 11, 2022 Posted March 11, 2022 Excellent! All of this tech needs to keep advancing...because we are NEVER getting new GPUs. 3 Quote
Xbob42 Posted March 11, 2022 Posted March 11, 2022 Sounds good if true, but AMD continually disappoints, so... 1 1 Quote
stepee Posted March 11, 2022 Posted March 11, 2022 Temporal upscaling this time! They should really name it something else. Quote
Keyser_Soze Posted March 12, 2022 Posted March 12, 2022 3 hours ago, Mr.Vic20 said: because we are NEVER getting new GPUs. Except you. 1 Quote
silentbob Posted March 12, 2022 Posted March 12, 2022 So could this be coming to the Steam Deck too? Quote
stepee Posted March 12, 2022 Posted March 12, 2022 14 minutes ago, silentbob said: So could this be coming to the Steam Deck too? It will 100% work on Steam Deck. 1 Quote
imthesoldier Posted March 12, 2022 Posted March 12, 2022 DLSS gets all the rage right now, but wasn't it the case that DLSS 1.0 was rather dogshit, and it wasn't until 2.0 where DLSS finally hit its stride? Not saying the same will apply with FSR, but I'm cautiously optimistic this'll fit the bill for many. Even with FSR 1.0, I've been told by a buddy of mine that the Ultra Quality setting worked very well in a game such as RE Village, and even the Quality setting held up. The tech can only get better at this point, and this is good news for all. Quote
Dre801 Posted March 12, 2022 Posted March 12, 2022 Good to see they're improving it. DLSS 1.0 wasn't that great either. 2.x is way better. Quote
legend Posted March 12, 2022 Posted March 12, 2022 2 hours ago, imthesoldier said: DLSS gets all the rage right now, but wasn't it the case that DLSS 1.0 was rather dogshit, and it wasn't until 2.0 where DLSS finally hit its stride? Not saying the same will apply with FSR, but I'm cautiously optimistic this'll fit the bill for many. Even with FSR 1.0, I've been told by a buddy of mine that the Ultra Quality setting worked very well in a game such as RE Village, and even the Quality setting held up. The tech can only get better at this point, and this is good news for all. DLSS 1 was pretty good but required game-specific optimization, which limited its applicability. DLSS 2 was certainly better, but I think the main benefit is that they got it to work generally without special optimization. Quote
DPCyric Posted March 12, 2022 Posted March 12, 2022 13 hours ago, Keyser_Soze said: Except you. Us as well, since he eventually hands them down to us. Quote
crispy4000 Posted March 12, 2022 Author Posted March 12, 2022 22 hours ago, BlueAngel said: 1.0 is pretty good as is if you need it, curious to see how this turns out. 10 hours ago, imthesoldier said: Even with FSR 1.0, I've been told by a buddy of mine that the Ultra Quality setting worked very well in a game such as RE Village, and even the Quality setting held up. FSR 1.0 works okay in dark games with heavy post-processing effects. With most everything else, you're getting a poor visual trade-off for the performance gains. That is unless you're nearly hitting 4K and could run the highest FSR setting to get there. Lower native resolutions look better in most other circumstances. The same is true of the FSR knock-off Nvidia slid out recently for older GPUs. There have been reports of FSR helping excessively blurry Deck or Switch games look more legible. That might be its most practical use yet. Quote
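For anyone wondering why FSR 1.0 keeps getting written off as "essentially a sharpening filter": it's a purely spatial technique, an edge-aware single-frame upscale (EASU) followed by a contrast-adaptive sharpen (RCAS). Here's a rough numpy sketch of that idea, not AMD's actual shaders, just an illustration of why it can't recover detail that was never rendered:

```python
import numpy as np

def bilinear_upscale(img, scale):
    """Resample an (H, W, C) image to `scale`x the resolution with bilinear filtering."""
    h, w = img.shape[:2]
    out_h, out_w = int(h * scale), int(w * scale)
    ys = np.arange(out_h) / scale                   # source row for each output row
    xs = np.arange(out_w) / scale                   # source column for each output column
    y0 = np.clip(np.floor(ys).astype(int), 0, h - 2)
    x0 = np.clip(np.floor(xs).astype(int), 0, w - 2)
    fy = np.clip(ys - y0, 0.0, 1.0)[:, None, None]  # fractional offsets for blending
    fx = np.clip(xs - x0, 0.0, 1.0)[None, :, None]
    tl = img[y0][:, x0]                             # four nearest source pixels
    tr = img[y0][:, x0 + 1]
    bl = img[y0 + 1][:, x0]
    br = img[y0 + 1][:, x0 + 1]
    return (tl * (1 - fy) * (1 - fx) + tr * (1 - fy) * fx
            + bl * fy * (1 - fx) + br * fy * fx)

def adaptive_sharpen(img, strength=0.5):
    """Boost each pixel against its 4 neighbours, backing off where local contrast
    is already high so edges don't halo (the rough idea behind a contrast-adaptive sharpen)."""
    p = np.pad(img, ((1, 1), (1, 1), (0, 0)), mode='edge')
    neighbours = (p[:-2, 1:-1] + p[2:, 1:-1] + p[1:-1, :-2] + p[1:-1, 2:]) / 4.0
    contrast = np.abs(img - neighbours)
    weight = strength * (1.0 - np.clip(contrast * 4.0, 0.0, 1.0))
    return np.clip(img + weight * (img - neighbours), 0.0, 1.0)

# "Upscale" a 540p frame to 1080p, then sharpen. Every output pixel is still a
# guess made from a single input frame, so fine detail can't actually be recovered.
low_res = np.random.rand(540, 960, 3).astype(np.float32)
high_res = adaptive_sharpen(bilinear_upscale(low_res, 2.0))
print(high_res.shape)  # (1080, 1920, 3)
```

That single-frame limitation is why lower native resolutions can end up looking better than FSR 1.0's lower quality modes: the sharpen makes edges pop, but the underlying information just isn't there.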
Xbob42 Posted March 12, 2022 Posted March 12, 2022 5 hours ago, imthesoldier said: DLSS gets all the rage right now, but wasn't it the case that DLSS 1.0 was rather dogshit, and it wasn't until 2.0 where DLSS finally hit its stride? Not saying the same will apply with FSR I have zero expectations that the same will apply with FSR. AMD has done nothing but let me down over and over and over. Weak GPUs, stuttery, underpowered CPUs, tech that arrives late and is inferior to the competition, and every time they "finally do it," it's always some raving review by someone who apparently only uses AMD. I fully expect this "way better FSR" to have massive issues that early adopters ignore. I'd love to be wrong because a more universal competitor to DLSS would be great, but with AMD behind the wheel all I can hope is that when the car impacts the tree I'll be ejected through the windshield and into the passing nvidia truck. 1 Quote
imthesoldier Posted March 14, 2022 Posted March 14, 2022 On 3/12/2022 at 1:41 PM, Xbob42 said: I have zero expectations that the same will apply with FSR. AMD has done nothing but let me down over and over and over. Weak GPUs, stuttery, underpowered CPUs, tech that arrives late and is inferior to the competition, and every time they "finally do it," it's always some raving review by someone who apparently only uses AMD. I fully expect this "way better FSR" to have massive issues that early adopters ignore. I'd love to be wrong because a more universal competitor to DLSS would be great, but with AMD behind the wheel all I can hope is that when the car impacts the tree I'll be ejected through the windshield and into the passing nvidia truck. I have a love-hate relationship with AMD, and formerly ATI Radeon. Some of my first PC gaming experiences were on ATI GPUs, but in the mid-2000s AMD was lacking in the shader department, so I dumped my X850XT (which had Shader Model 2.0) and went with an nVidia 7800GS (which had Shader Model 3.0). On paper they were about the same in performance, but crucially, the nVidia card had shader features the AMD card did not, which many games at the time took advantage of. After that, I still have yet to go back to AMD when it comes to GPUs. I think AMD has something good going in the console space with Microsoft and Sony, and while nVidia is only in that pie via Nintendo, it's still a profitable endeavor for them. Call the Switch underpowered all you want, the Tegra X1 has shown the potential of nVidia in the mobile chipset realm, and their future there. While things such as ray-tracing are highly unlikely to be used in the next Switch, I have a firm belief Tensor cores for DLSS will be. Quote
Zaku3 Posted March 14, 2022 Posted March 14, 2022 On 3/11/2022 at 5:51 PM, Mr.Vic20 said: Excellent! All of this tech needs to keep advancing...because we are NEVER getting new GPUs. Gonna need to use my Dell financial account to buy an Alienware to steal the GPU. Quote
crispy4000 Posted March 15, 2022 Author Posted March 15, 2022 11 hours ago, imthesoldier said: While things such as ray-tracing are highly unlikely to be used in the next Switch, I have a firm belief Tensor cores for DLSS will be. Any modern chip design should support raytracing in some fashion. Like Series S, we might not see it much due to the performance hit. But Tensor cores are used for both DLSS and RT acceleration on NVIDIA's side. Quote
imthesoldier Posted March 15, 2022 Posted March 15, 2022 11 hours ago, crispy4000 said: Any modern chip design should support raytracing in some fashion. Like Series S, we might not see it much due to the performance hit. But Tensor cores are used for both DLSS and RT acceleration on NVIDIA's side. Correct me if I'm wrong, but don't you still need the RT cores to render the image before it's processed through the Tensor cores? Otherwise, wouldn't that put the burden on the CUDA cores, because I don't recall any of the Tegra chips having RT cores in them? Obviously, it's possible to do ray-tracing in software, but that's very intensive, and running it on the CUDA cores would mean fewer resources being dedicated to other work. Quote
crispy4000 Posted March 15, 2022 Author Posted March 15, 2022 2 hours ago, imthesoldier said: Correct me if I'm wrong, but don't you still need the RT cores to render the image before it's processed through the Tensor cores? Otherwise, wouldn't that put the burden on the CUDA cores, because I don't recall any of the Tegra chips having RT cores in them? Obviously, it's possible to do ray-tracing in software, but that's very intensive, and running it on the CUDA cores would mean fewer resources being dedicated to other work. Yeah, I think you're right. Don't see why they'd leave them out though, when they've been paired with CUDA cores since the start. Quote
imthesoldier Posted March 15, 2022 Posted March 15, 2022 55 minutes ago, crispy4000 said: Yeah, I think you're right. Don't see why they'd leave them out though, when they've been paired with CUDA cores since the start. I think it's simply down to die size, plus power usage. Add in the fact that you're dealing with a mobile chipset, and it probably makes less sense to utilize to begin with. That'll probably change in the coming years for sure, but I don't see it happening this go-around. Even if Tegra did have RT cores, Nintendo would likely just disable them, or outright have them removed in order to reduce costs. Based on recent patents, plus Nintendo's EAD division, Nintendo themselves are quite interested in machine learning and upscaling techniques (hence why something such as DLSS makes sense), but don't appear to have the same feelings towards ray-tracing right now. This is the same company, after all, that opted against including the Super FX chip in the SNES for the NA market due to costs. You could definitely imagine what would've been possible if more developers had utilized it and it were baked into each system. But I'm sidetracking this whole thread at this point, so apologies for that. Quote
crispy4000 Posted March 15, 2022 Author Posted March 15, 2022 15 minutes ago, imthesoldier said: I think it's simply down to die size, plus power usage. Add in the fact that you're dealing with a mobile chipset, and it probably makes less sense to utilize to begin with. That'll probably change in the coming years for sure, but I don't see it happening this go-around. Even if Tegra did have RT cores, Nintendo would likely just disable them, or outright have them removed in order to reduce costs. Based on recent patents, plus Nintendo's EAD division, Nintendo themselves are quite interested in machine learning and upscaling techniques (hence why something such as DLSS makes sense), but don't appear to have the same feelings towards ray-tracing right now. This is the same company, after all, that opted against including the Super FX chip in the SNES for the NA market due to costs. You could definitely imagine what would've been possible if more developers had utilized it and it were baked into each system. But I'm sidetracking this whole thread at this point, so apologies for that. The most recent rumors suggest that whatever Switch comes next will have both. Check the Nvidia leaks. Nintendo haven't said anything about raytracing, afaik. We won't know they're not interested until we see what actually ships. Quote
imthesoldier Posted March 15, 2022 Posted March 15, 2022 2 minutes ago, crispy4000 said: The most recent rumors suggest that whatever Switch comes next will have both. Check the Nvidia leaks. Nintendo haven't said anything about ray-tracing, upscaling or machine learning themselves, afaik. Looking at the sites that talked about the leaks, it looked to be some Twitter user who mentioned "ray-tracing" support (as well as DLSS) because it uses the Ampere architecture, which it normally would. Tegra Orin, though, normally does not have RT cores in it, so it would have to be pretty custom to include them. I do recall the leaks suggested it would use 8 ARM Cortex A78 cores vs. the 12 that are in the standard T234 version. Interestingly, nVidia does have a version of the Tegra Orin chip, named the Jetson Orin NX, listed on their site (developer.nvidia.com): "NVIDIA® Jetson Orin™ NX brings the most powerful AI computer for energy-efficient autonomous machines to the smallest Jetson form-factor. It accelerates the NVIDIA AI software stack with more than double the CUDA cores and up to 5X the performance of NVIDIA® Jetson Xavier™ NX." If you ask me, this is the more realistic version of what would power the Switch 2. The memory bandwidth is cut in half (and it has 12GB instead of 16), same with the CUDA cores (1024 vs. 2048); it uses only 8 ARM cores instead of 12, and can come in a 15W variant, same as the Switch in docked mode. It's almost a perfect upgrade over the Tegra X1. If the Switch 2 does truly have RT cores, and of course full ray-tracing support, I'll be incredibly surprised. Quote
crispy4000 Posted March 17, 2022 Author Posted March 17, 2022 RSR (AMD's driver-level Radeon Super Resolution) supposedly only works with the 1.0 FSR algorithm. Quote
imthesoldier Posted March 17, 2022 Posted March 17, 2022 4 hours ago, crispy4000 said: If those results are accurate, FSR 2.0 looks to be a nice step up over 1.0. The use of a temporal upscaling method vs. a spatial one looks to be the biggest reason for the upgrade. Quote
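Rough illustration of what that spatial-to-temporal jump buys you: instead of guessing every output pixel from a single frame, a temporal upscaler reprojects an accumulated history buffer along motion vectors and blends it with the current frame (which is rendered with a slightly different sub-pixel jitter each time), so real detail builds up over multiple frames. Below is a toy numpy sketch of one accumulation step, not FSR 2.0's actual algorithm; the function and parameter names are just illustrative.

```python
import numpy as np

def temporal_upscale_step(current, history, motion_px, alpha=0.1):
    """One frame of a toy temporal accumulator (illustrative only).

    current:   the new low-res frame already resampled to output size, shape (H, W, C)
    history:   accumulated high-res result from previous frames, shape (H, W, C)
    motion_px: per-pixel motion in output pixels, shape (H, W, 2) as (dy, dx)
    alpha:     how much of the new frame to blend in each step
    """
    h, w, _ = history.shape
    yy, xx = np.meshgrid(np.arange(h), np.arange(w), indexing='ij')

    # Reproject: fetch where each pixel was last frame (nearest-neighbour for brevity).
    src_y = np.clip(np.round(yy - motion_px[..., 0]).astype(int), 0, h - 1)
    src_x = np.clip(np.round(xx - motion_px[..., 1]).astype(int), 0, w - 1)
    reprojected = history[src_y, src_x]

    # History rejection: clamp the reprojected colour to the local min/max of the
    # current frame, so stale history (disocclusions, moving objects) doesn't ghost.
    p = np.pad(current, ((1, 1), (1, 1), (0, 0)), mode='edge')
    stack = np.stack([p[1:-1, 1:-1], p[:-2, 1:-1], p[2:, 1:-1], p[1:-1, :-2], p[1:-1, 2:]])
    lo, hi = stack.min(axis=0), stack.max(axis=0)
    reprojected = np.clip(reprojected, lo, hi)

    # Exponential blend: over many jittered frames this accumulates sub-pixel
    # information that a single-frame (spatial) upscaler never sees.
    return (1.0 - alpha) * reprojected + alpha * current
```

The hard part in practice is that history rejection step: clamp too little and you get ghosting and smearing, clamp too much and you throw the history away and end up with a glorified spatial upscaler. That balancing act is a big part of where the various temporal upscalers differ.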
Massdriver Posted March 18, 2022 Posted March 18, 2022 RDNA 3 is where you will finally see AMD seriously compete with nvidia in the gpu space for the first time in years. It will have been developed with a much higher r&d budget compared to previous gens. I’m loving Intel coming back out swinging on the cpu front to keep prices in check. I hope it helps with gpu prices too. Quote
Xbob42 Posted March 18, 2022 Posted March 18, 2022 17 minutes ago, Massdriver said: RDNA 3 is where you will finally see AMD seriously compete with nvidia in the gpu space for the first time in years. It will have been developed with a much higher r&d budget compared to previous gens. I’m loving Intel coming back out swinging on the cpu front to keep prices in check. I hope it helps with gpu prices too. I'll be glad if they're finally able to deliver. One company dominating in the GPU or CPU space is fucking awful. Always leads to slow power increases and ridiculous pricing, even outside of scalpers. Quote
Massdriver Posted March 18, 2022 Posted March 18, 2022 20 minutes ago, Xbob42 said: I'll be glad if they're finally able to deliver. One company dominating in the GPU or CPU space is fucking awful. Always leads to slow power increases and ridiculous pricing, even outside of scalpers. AMD is very competitive in the CPU space, so much so that they were overcharging for CPUs up until very recently when Intel struck back. I was a bit confused by your comment about stuttering CPUs. They seem to work really well from my experience and a lot of gamers agree with me. Maybe I'm just missing something. Quote
Xbob42 Posted March 18, 2022 Posted March 18, 2022 3 minutes ago, Massdriver said: AMD is very competitive in the CPU space, so much so that they were overcharging for CPUs up until very recently when Intel struck back. I was a bit confused by your comment about stuttering CPUs. They seem to work really well from my experience and a lot of gamers agree with me. Maybe I'm just missing something. AMD is not hyper-competitive when it comes to gaming CPUs, moreso with general PC use or workstation use. Intel continues to dominate that area with on average superior performance. Not that AMD's is bad or anything, my 3900X does just fine, but seeing DF comparison after DF comparison with high-end Intel CPUs leaves me wanting more. As for stuttering, I was specifically talking about this: AMD to Fix 'fTPM' Stuttering Issues on Ryzen CPUs, But Patch Is 2 Months Out (pcmag.com): "Customers began experiencing the problems after enabling the fTPM module on the Ryzen CPUs." Easily solved by disabling their sad fTPM (if fTPM is enabled on Windows 10 OR 11, you can get this issue), but they've been hearing complaints for months and won't have a fix for months more. I was getting this exact stutter on a few games and it was driving me fucking insane because I had no idea it was related to their busted ass fTPM. Considering how much I cannot stand stutter, this is something I won't soon be forgetting when deciding on a new CPU in the future. 1 Quote
Massdriver Posted March 18, 2022 Posted March 18, 2022 26 minutes ago, Xbob42 said: AMD is not hyper-competitive when it comes to gaming CPUs, moreso with general PC use or workstation use. Intel continues to dominate that area with on average superior performance. Not that AMD's is bad or anything, my 3900X does just fine, but seeing DF comparison after DF comparison with high-end Intel CPUs leaves me wanting more. As for stuttering, I was specifically talking about this: AMD to Fix 'fTPM' Stuttering Issues on Ryzen CPUs, But Patch Is 2 Months Out (pcmag.com): "Customers began experiencing the problems after enabling the fTPM module on the Ryzen CPUs." Easily solved by disabling their sad fTPM (if fTPM is enabled on Windows 10 OR 11, you can get this issue), but they've been hearing complaints for months and won't have a fix for months more. I was getting this exact stutter on a few games and it was driving me fucking insane because I had no idea it was related to their busted ass fTPM. Considering how much I cannot stand stutter, this is something I won't soon be forgetting when deciding on a new CPU in the future. AMD needs to release Zen 4 and 5 a lot quicker. The gap between 3 and 4 has been far too long. I'm running a 3800X on Win 10 and haven't noticed. Maybe I'm not sensitive to it, but stutter usually bothers me. Where would I disable this security feature in Windows 10? Most articles I just searched for were applicable to Win 11. Quote
Xbob42 Posted March 18, 2022 Posted March 18, 2022 5 minutes ago, Massdriver said: AMD needs to release Zen 4 and 5 a lot quicker. The gap between 3 and 4 has been far too long. I'm running a 3800X on Win 10 and haven't noticed. Maybe I'm not sensitive to it, but stutter usually bothers me. Where would I disable this security feature in Windows 10? Most articles I just searched for were applicable to Win 11. TPM is required to install Windows 11 (but not run it... funnily enough) so most people probably have it off by default. But if you have it on, you may experience the issue even on Windows 10. 1 Quote
Massdriver Posted March 18, 2022 Posted March 18, 2022 How do I turn it on in windows 10? Just curious. Quote
Xbob42 Posted March 18, 2022 Posted March 18, 2022 30 minutes ago, Massdriver said: How do I turn it on in windows 10? Just curious. It's a BIOS option. Quote
crispy4000 Posted March 18, 2022 Author Posted March 18, 2022 11 hours ago, Massdriver said: RDNA 3 is where you will finally see AMD seriously compete with nvidia in the gpu space for the first time in years. It will have been developed with a much higher r&d budget compared to previous gens. These days, I'd sooner buy a weaker card with less oomph on paper if it was better at upscaling and resolving detail. On-paper specs have never mattered less. Hopefully FSR 2.0 can change the narrative a little, and benefit the consoles too. I'm interested to see what Intel is doing with XeSS and Arc for the same reasons. Quote
imthesoldier Posted March 18, 2022 Posted March 18, 2022 9 hours ago, Xbob42 said: AMD is not hyper-competitive when it comes to gaming CPUs, moreso with general PC use or workstation use. Intel continues to dominate that area with on average superior performance. Not that AMD's is bad or anything, my 3900X does just fine, but seeing DF comparison after DF comparison with high-end Intel CPUs leaves me wanting more. As for stuttering, I was specifically talking about this: AMD to Fix 'fTPM' Stuttering Issues on Ryzen CPUs, But Patch Is 2 Months Out (pcmag.com): "Customers began experiencing the problems after enabling the fTPM module on the Ryzen CPUs." Easily solved by disabling their sad fTPM (if fTPM is enabled on Windows 10 OR 11, you can get this issue), but they've been hearing complaints for months and won't have a fix for months more. I was getting this exact stutter on a few games and it was driving me fucking insane because I had no idea it was related to their busted ass fTPM. Considering how much I cannot stand stutter, this is something I won't soon be forgetting when deciding on a new CPU in the future. IMO, AMD's competitive edge tends to come in the form of power efficiency, plus their multi-threaded performance in general. Intel I always recall being dominant in single-threaded workloads, and it tends to win on raw horsepower and brute force. AMD's Ryzen, though, has been the necessary push to make Intel more competitive, but games, and still many applications, don't tend to use more than a few cores. I would've figured, given the PS4 and X1's tenure, that 8 cores would've been the norm by now, but 6 cores is about the sweet spot, and I believe it'll remain that way for some time. AMD's FSR won't be superior to nVidia's DLSS, but like FreeSync before it, its open-source nature will likely drive more adoption. Quote
imthesoldier Posted March 23, 2022 Posted March 23, 2022 AMD says FSR 2.0 will run on Xbox and these Nvidia graphics cards (theverge.com): "AMD's new answer to DLSS might be 'demanding,' though." R.I.P. RX 580 owners. They get no love anymore. Quote