
PC Community Thread


stepee


45 minutes ago, Spawn_of_Apathy said:


I'm not quite sure what I’m supposed to be seeing. There’s less blur on the green orbs with DLSS 3 + Frame Gen. Is that it?

 

Yes, less artifacting. Though it could be that the DLSS screen is a rendered frame and the FSR3 one is a generated frame

 

29 minutes ago, crispy4000 said:

Where is that comparison from?

 

Hardware Unboxed, but I don't recommend watching their comparison videos tbh; there's a heavy bias from one of the "journalists" there, IMO, that makes them hard to watch vs someone like Gamers Nexus or DF.


23 minutes ago, Spork3245 said:

 

Yes, less artifacting. Though it could be that the DLSS screen is a rendered frame and the FSR3 one is a generated frame


Yeah, I could see that being a possibility unless you’re able to compare both frame by frame.
 

I think my feed would be full of more videos being very disappointed with FSR frame gen if it looked like that all the time. So I do wonder what’s going on there.


20 minutes ago, Spawn_of_Apathy said:


Yeah, I could see that being a possibility unless you’re able to compare both frame by frame.
 

I think my feed would be full of more videos being very disappointed with FSR frame gen if it looked like that all the time. So I do wonder what’s going on there.

 

TBF it's only in two games and there haven't been a ton of comparisons yet. The biggest thing is that, from what I read, it looks like HDR can't be used at the same time as FSR3 for some reason (I assume it'll be fixed by EOY).


1 minute ago, Spork3245 said:

 

TBF it's only in two games and there haven't been a ton of comparisons yet. The biggest thing is that, from what I read, it looks like HDR can't be used at the same time as FSR3 for some reason (I assume it'll be fixed by EOY).


Yikes. That would be a straight deal-breaker for me. Even Auto HDR can add so much to the image.


That FSR3 Immortals image actually kinda matches how Forspoken looked when I tested it on the ROG Ally. I kinda hoped that was because of the low resolution, or maybe that comparison was at a lower res too?

 

I’ve never liked how FSR2 looks either, though. Sometimes, especially handheld, it’s better than nothing, but they need to invest in AI. I think that’s what makes the big difference between DLSS looking like native or better and FSR looking like…something different.


The 7800x3D has dropped all the way down to $350. After being on a 5900x I really wanted to go back to Intel for my next upgrade, but that's a ridiculous price. The 13700k really needs to drop to $300 or lower. If X670/B650E mobos could run 4 sticks of DDR5 without issues I'd probably have a new CPU/Mobo/RAM in the mail right now :p 


13 minutes ago, Spork3245 said:

The 7800x3D has dropped all the way down to $350. After being on a 5900x I really wanted to go back to Intel for my next upgrade, but that's a ridiculous price. The 13700k really needs to drop to $300 or lower. If X670/B650E mobos could run 4 sticks of DDR5 without issues I'd probably have a new CPU/Mobo/RAM in the mail right now :p 

 

Go for it. Is there a special need for 4 sticks of memory? Just buy 2 huge sticks. DDR5, compared to DDR4, inherently has twice the ranks per stick, which is why both Intel and AMD have to run much lower speeds when all 4 slots are populated - albeit Intel is still able to run a bit faster. AMD has also recently been focusing on improving memory performance, and has broken above the 6000 MT/s barrier with the latest BIOS. Good chance they'll keep improving.
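Back-of-the-envelope on what that downclock costs (a rough sketch; the DDR5-6000 two-DIMM and 5200 MT/s four-DIMM figures are just illustrative assumptions, since actual speeds depend on the board, the kit, and the memory controller):

```python
# Peak DDR5 bandwidth: transfers/sec x 8 bytes per channel x 2 channels.
def peak_bandwidth_gbs(mts: int, channels: int = 2, bytes_per_channel: int = 8) -> float:
    return mts * 1e6 * bytes_per_channel * channels / 1e9

two_dimms = peak_bandwidth_gbs(6000)   # ~96 GB/s at DDR5-6000
four_dimms = peak_bandwidth_gbs(5200)  # ~83 GB/s if four DIMMs force a downclock (illustrative)
print(f"{two_dimms:.0f} vs {four_dimms:.0f} GB/s, a {1 - four_dimms / two_dimms:.0%} loss")
```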

 

 

I'd be happy to see the 5800x3D drop to $200. I'm partially CPU limited when playing Forza at 1440p and that's quite annoying.


16 minutes ago, cusideabelincoln said:

 

Go for it. Is there a special need for 4 sticks of memory? Just buy 2 huge sticks. DDR5, compared to DDR4, inherently has twice the ranks per stick, which is why both Intel and AMD have to run much lower speeds when all 4 slots are populated - albeit Intel is still able to run a bit faster. AMD has also recently been focusing on improving memory performance, and has broken above the 6000 MT/s barrier with the latest BIOS. Good chance they'll keep improving.

 

The aesthetics of two sticks are gross :p . Most Zx90 motherboards currently have no issue with 4x DDR5 at 6000-6400 MT/s with the latest BIOS; some even seem to be able to run 4 sticks at 7000+ MT/s, but that varies user to user.

 

16 minutes ago, cusideabelincoln said:

 

I'd be happy to see the 5800x3D drop to $200. I'm partially CPU limited when playing Forza at 1440p and that's quite annoying.

 

What do you currently have?

 


3 minutes ago, Spork3245 said:

 

The aesthetics of two sticks are gross :p . Most Zx90 motherboards currently have no issue with 4x DDR5 at 6000-6400 MT/s with the latest BIOS; some even seem to be able to run 4 sticks at 7000+ MT/s, but that varies user to user.

 

 

What do you currently have?

 

There seems to be more silicon lottery jackpot potential with Intel CPUs and fast RAM. Or you can just get a motherboard with only 2 RAM slots :p

 

I've got a 5900X and 3090. The game definitely needs some patches to smooth out performance and bugs, but right now I'm hovering around that 60 fps mark with any RT turned on, and the benchmark tells me I'm only using 75% of my GPU. Turning down the render resolution barely made a difference, as there were still small stutters. With RT off, performance was fine and I was getting higher GPU utilization.


7 minutes ago, cusideabelincoln said:

There seems to be more silicon lottery jackpot potential with Intel CPUs and fast RAM. Or you can just get a motherboard with only 2 RAM slots :p

 

ew GIF

 

7 minutes ago, cusideabelincoln said:

I've got a 5900X and 3090. The game definitely needs some patches to smooth out performance and bugs, but right now I'm hovering around that 60 fps mark with any RT turned on, and the benchmark tells me I'm only using 75% of my GPU. Turning down the render resolution barely made a difference, as there were still small stutters. With RT off, performance was fine and I was getting higher GPU utilization.

 

I don't think a 5800x3D is worth it, even for $200. Per https://www.techpowerup.com/review/amd-ryzen-7-7800x3d/20.html , you're looking at about an 8-12% gaming boost at 1440p unless the game is super optimized for 3D cache. For us 5900x users, it's 13600k+ or 7700x+ for a minimum 20%+ boost; otherwise it's kind of a waste, even considering needing a new mobo + RAM. I think it's often forgotten how well the 5900x performs :p

 


3 minutes ago, Spork3245 said:

 

ew GIF

 

 

I don't think a 5800x3D is worth it, even for $200. Per https://www.techpowerup.com/review/amd-ryzen-7-7800x3d/20.html , you're looking at about an 8-12% gaming boost at 1440p unless the game is super optimized for 3D cache. For us 5900x users, it's 13600k+ or 7700x+ for a minimum 20%+ boost; otherwise it's kind of a waste, even considering needing a new mobo + RAM. I think it's often forgotten how well the 5900x performs :p

 

 

A complete overhaul would be the way to go, but the simplicity of just dropping in an upgrade without doing an entire build is also compelling. I just need that little boost to stay above 60 minimum.

 

A full upgrade would be nasty. In some scenarios I could double my performance, and that is extremely tempting. With a 240 Hz monitor I lower settings for any online multiplayer game, and getting better performance in emulation would be nice too.


3 minutes ago, cusideabelincoln said:

 

A complete overhaul would be the way to go, but the simplicity of just dropping in an upgrade without doing an entire build is also compelling. I just need that little boost to stay above 60 minimum.

 

A full upgrade would be nasty. In some scenarios I could double my performance, and that is extremely tempting. With a 240 Hz monitor I lower settings for any online multiplayer game, and getting better performance in emulation would be nice too.

 

I dunno, you can get that 12900k Newegg mobo/CPU/RAM combo for $400, so waiting for a $200 5800x3D seems wrong to me :nottalking:


10 minutes ago, cusideabelincoln said:

 

Dang that's cheap, and would also help heat my house through winter :thinking:

 

It goes in and out of stock multiple times a day: https://www.reddit.com/r/buildapcsales/comments/171mw9i/bundle_i912900k_asus_tuf_gaming_z690plus_wifi_arx/

Keep checking that Reddit page to try to catch it. Otherwise, Microcenter has a similar bundle if you have one near you (they also have a 13700k + mobo + RAM bundle for $500)

 

I would've grabbed one of those bundles a while ago if it came with Corsair Vengeance RGB RAM, or if I could opt out of the RAM it comes with and save $50 or something :p


12 minutes ago, Spork3245 said:

 

It goes in and out of stock multiple times a day: https://www.reddit.com/r/buildapcsales/comments/171mw9i/bundle_i912900k_asus_tuf_gaming_z690plus_wifi_arx/

Keep checking that Reddit page to try to catch it. Otherwise, Microcenter has a similar bundle if you have one near you (they also have a 13700k + mobo + RAM bundle for $500)

 

I would've grabbed one of those bundles a while ago if it came with Corsair Vengeance RGB RAM, or if I could opt out of the RAM it comes with and save $50 or something :p

 

Yeah, those are good deals, but I also have to tell myself that the next time I upgrade I want the latest PCIe, USB, and whatever specs I can get. On my current build, I bought a cheap bundle of older parts right when PCIe 4.0 had come out, so now I'm stuck with 3.0 speeds, which is costing me a little bit of performance as well.


2 hours ago, cusideabelincoln said:

There seems to be more silicon lottery jackpot potential with Intel CPUs and fast RAM. Or you can just get a motherboard with only 2 RAM slots :p

 

I've got a 5900X and 3090. The game definitely needs some patches to smooth out performance and bugs, but right now I'm hovering around that 60 fps mark with any RT turned on, and the benchmark tells me I'm only using 75% of my GPU. Turning down the render resolution barely made a difference, as there were still small stutters. With RT off, performance was fine and I was getting higher GPU utilization.

 

Forza has serious, serious issues with CPU performance on PC right now. I’d just wait a few weeks; the game will probably patch that performance in. I get 80 fps on a 7950x3D set up as a 7800x3D and my GPU usage is underutilized…but if I drop my native resolution to something ridiculous like 540p, then it jumps up. Adjusting DLSS or render resolution doesn’t do that, though. It’s all bonked in a way that doesn’t make sense. I wouldn’t base any purchasing decisions on it.
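(For what it’s worth, here’s the rough heuristic I use to read numbers like that - the 95% cutoff is an arbitrary guess on my part, not from any tool:)

```python
# Crude bottleneck read from a benchmark readout; the 95% cutoff is an arbitrary guess.
def likely_bottleneck(gpu_util_pct: float, res_drop_helped: bool) -> str:
    if gpu_util_pct >= 95:
        return "GPU-bound: lowering render resolution / DLSS mode should raise fps"
    if not res_drop_helped:
        return "CPU- or engine-bound: resolution barely matters; a faster CPU or a patch does"
    return "mixed: partially GPU-bound, worth testing both knobs"

# The Forza readouts described above: ~75% GPU use, render-resolution changes did nothing.
print(likely_bottleneck(75, res_drop_helped=False))
```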


20 minutes ago, stepee said:

 

Forza has serious, serious issues with CPU performance on PC right now. I’d just wait a few weeks; the game will probably patch that performance in. I get 80 fps on a 7950x3D set up as a 7800x3D and my GPU usage is underutilized…but if I drop my native resolution to something ridiculous like 540p, then it jumps up. Adjusting DLSS or render resolution doesn’t do that, though. It’s all bonked in a way that doesn’t make sense. I wouldn’t base any purchasing decisions on it.

 

Oh, I'm not buying anything too soon, and the game is obviously borked. It doesn't remember graphics settings when I change them, sometimes the DLSS option disappears completely and gets replaced by FSR2, and there are weird stutters.


3 hours ago, Spork3245 said:

The 7800x3D has dropped all the way down to $350. After being on a 5900x I really wanted to go back to Intel for my next upgrade, but that's a ridiculous price. The 13700k really needs to drop to $300 or lower. If X670/B650E mobos could run 4 sticks of DDR5 without issues I'd probably have a new CPU/Mobo/RAM in the mail right now :p 


Is there even a reason to get a 13700k over a 7800x3D? Developers don’t seem to care at all about the extra cores and threads on 12th and 13th gen Intel CPUs. The 3D V-Cache seems to get used much more often.


13 minutes ago, Spawn_of_Apathy said:


Is there even a reason to get a 13700k over a 7800x3D? Developers don’t seem to care at all about the extra cores and threads on 12th and 13th gen Intel CPUs. The 3D V-Cache seems to get used much more often.

 

Just the aesthetic appearance of the extra two sticks; you’d lose gaming performance.


3 hours ago, Spawn_of_Apathy said:


Is there even a reason to get a 13700k over a 7800x3D? Developers don’t seem to care at all about the extra cores and threads on 12th and 13th gen Intel CPUs. The 3D V-Cache seems to get used much more often.

 

2 hours ago, stepee said:

 

Just the aesthetic appearance of the extra two sticks; you’d lose gaming performance.


There’s only a 4-5% performance difference between them at 1440p ( https://www.techpowerup.com/review/amd-ryzen-7-7800x3d/20.html ), I’ll be able to use 4 RAM sticks, and I won’t have to deal with AMD’s shit BIOSes anymore, plus Intel is typically better for emulation.


2 minutes ago, Spork3245 said:

 


There’s only a 4-5% performance difference between them at 1440p ( https://www.techpowerup.com/review/amd-ryzen-7-7800x3d/20.html ), I’ll be able to use 4 RAM sticks, and I won’t have to deal with AMD’s shit BIOSes anymore, plus Intel is typically better for emulation.

 

That shows 11% at 1080p though (think DLSS performance mode), and that kind of benchmark never shows the full truth when it’s just a handful of games that really have issues with the CPU, so averages might not reflect your actual “can I max everything out at 120fps” experience. Also it looks like the 13700 is more like 6-8% vs a 13700k there at 1440p. That said, the 13700 is indeed close enough and should be more than fine for anything not broken anyway.


9 minutes ago, stepee said:

That shows 11% at 1080p though


I’m seeing 8%. 
 

11 minutes ago, stepee said:

Also it looks like the 13700 is more like 6-8% vs a 13700k there at 1440p. That said, the 13700 is indeed close enough and should be more than fine for anything not broken anyway.


I don’t see a non-k on the benchmark? :confused:

I wouldn’t get a non-k in case I overclock anyway


21 minutes ago, Spork3245 said:


I’m seeing 8%. 
 


I don’t see a non-k on the benchmark? :confused:

I wouldn’t get a non-k in case I overclock anyway

 

I meant 13700k, I was just being lazy lol

 

For 1080p average performance I see the 13700k at 91.2% and the 7800x3d at 100% (or 102.8% with PBO) - even using the non-PBO number, I’d still round 91.2 down instead of up and call the gap 9%. It’s starting to kinda matter at that point either way.

 

It just depends how you look at it though. I always look at CPU scores at the lower resolutions because I want to know what’s going to perform the best once that load starts getting pushed harder. But you can also look at averages at the resolution you actually play at and figure that it’s more than enough for now.


26 minutes ago, stepee said:

For 1080p average performance I see the 13700k at 91.2% and the 7800x3d at 100% (or 102.8% with PBO)


Ah, yeah, you need to go by the actual framerate on the next chart. The 7800x3d (1080p) avg is 261.1, the 13700k is 241.7, which is an 8% difference. I’m not using the PBO max thing, as that seems bin-dependent and not guaranteed.
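(Worked out, since the percent chart and the raw fps give slightly different numbers depending on which CPU you treat as the 100% baseline:)

```python
# Gap from the review's 1080p average-fps numbers quoted above.
x3d, k = 261.1, 241.7
print(f"7800X3D is {x3d / k - 1:.1%} faster than the 13700K")  # ~8.0%
print(f"13700K is {1 - k / x3d:.1%} slower than the 7800X3D")  # ~7.4%
# The relative chart's 91.2% vs 100% works out to 100 / 91.2 - 1 = ~9.6% if you
# flip the baseline, which is roughly where the 8% vs 9% split comes from.
```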

 

26 minutes ago, stepee said:

I always look at CPU scores at the lower resolutions because I want to know what’s going to perform the best once that load starts getting pushed harder. But you can also look at averages at the resolution you actually play at and figure that it’s more than enough for now.


That’s good for seeing overall raw power in gaming, I agree; however, if I’m unlikely to play at those resolutions, then it’s basically a “look at my e-peen” score, like synthetic SSD tests, IMO :p


1 minute ago, Spork3245 said:


Ah, yeah, you need to go by the actual framerate on the next chart. The 7800x3d (1080p) avg is 261.1, the 13700k is 241.7, which is an 8% difference. I’m not using the PBO max thing, as that seems bin-dependent and not guaranteed.

 


That’s good for seeing overall raw power in gaming, I agree; however, if I’m unlikely to play at those resolutions, then it’s basically a “look at my e-peen” score, like synthetic SSD tests, IMO :p

 

I would at least consider whatever DLSS internal resolution you’re comfortable with these days though!


1 minute ago, stepee said:

 

I would at least consider whatever DLSS internal resolution you’re comfortable with these days though!


It’s not 1:1 with DLSS scaling (there’s a difference between DLSS internal rendering vs that same resolution at native), plus I’m unlikely to drop below Balanced these days :p. That’s why I typically look at 1440p+.


1 minute ago, Spork3245 said:


It’s not 1:1 with DLSS scaling (there’s a difference between DLSS internal rendering vs that same resolution at native), plus I’m unlikely to drop below Balanced these days :p

 

This all gets confusing because I also usually raise the DLSS quality level when I’m CPU-limited!

 

Edit: And yeah, that’s mostly true, I think. I want benches to be redesigned for 2023.


Just now, stepee said:

 

This all gets confusing because I also usually raise the DLSS quality level when I’m CPU-limited!


To my understanding (after hours on r/nvidia out of boredom), 4K performance DLSS (1080p render) vs 1080p native doesn’t have 1:1 performance. There’s anywhere from a 5-20% difference depending on the game, and the DLSS one will allegedly have less load on the CPU in some (but not all) cases.
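(For reference, these are the commonly cited per-axis scale factors behind the DLSS modes - games can override them, so treat the outputs as approximate:)

```python
# Internal render resolution per DLSS mode (commonly cited defaults; games can override).
DLSS_SCALE = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5, "Ultra Performance": 1 / 3}

def internal_res(width: int, height: int, mode: str) -> tuple[int, int]:
    scale = DLSS_SCALE[mode]
    return round(width * scale), round(height * scale)

print(internal_res(3840, 2160, "Performance"))  # (1920, 1080): 4K Performance renders at 1080p
print(internal_res(2560, 1440, "Balanced"))     # (1485, 835)
```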


Just now, Spork3245 said:


To my understanding (after hours on r/nvidia out of boredom), 4K performance DLSS (1080p render) vs 1080p native doesn’t have 1:1 performance. There’s anywhere from a 5-20% difference depending on the game, and the DLSS one will allegedly have less load on the CPU in some (but not all) cases.

 

Oh no, I’m aware of that, I was just making a joke lol. There isn’t a good chart to refer to because the current methods don’t really reflect how gamers play right now. I look to 1080p as I find it more representative for a demanding game than, say, 4K. The problem is there are different ways to look at it because we don’t have good bench methodology.


Just now, stepee said:

 

Oh no, I’m aware of that, I was just making a joke lol. There isn’t a good chart to refer to because the current methods don’t really reflect how gamers play right now. I look to 1080p as I find it more representative for a demanding game than, say, 4K. The problem is there are different ways to look at it because we don’t have good bench methodology.


It’d be nice to see more DLSS benchmarks included.

