
The official RTX 3000 series launch event thread & subsequent quest to strive in vain to find one!


Mr.Vic20


3 hours ago, cusideabelincoln said:

 

I think if you already have a 4-core i7 processor it's fine to hold onto it, but I don't think it's worth upgrading an old platform at this point. Better off saving your money for a rebuild. You can only get so far on a DDR3 platform; the memory speed and latency will also hold the system back.

 

The 3600x is a waste of money compared to the 3600, which comes and goes for around $200 and is worth it at that price.

 

I haven't looked into DDR5, but I don't see DDR4 going anywhere anytime soon. DDR4 speeds have also increased drastically in the last year, at good prices. It may become outdated when DDR5 releases, but it will still be fast, and much faster than any DDR3 platform. Not to mention DDR5 will be more expensive. And there's a good chance upcoming games will start to tax 4c/8t processors even more, making the 6-core an even better investment.

 


22 hours ago, cusideabelincoln said:

You can only get so far on a DDR3 platform; the memory speed and latency will also hold the system back.

 

In gaming, on an Intel platform? I'm not so sure of that.

 

Quote

 

The 3600x is a waste of money compared to the 3600, which comes and goes for around $200 and is worth it at that price.

 

I wasn't even aware of a difference. :p 

 

Quote

 

I haven't looked into DDR5, but I don't see DDR4 going anywhere anytime soon. DDR4 speeds have also increased drastically in the last year, at good prices. It may become outdated when DDR5 releases, but it will still be fast, and much faster than any DDR3 platform. Not to mention DDR5 will be more expensive. And there's a good chance upcoming games will start to tax 4c/8t processors even more, making the 6-core an even better investment.

 

Of course DDR4 will still be around: DDR3 is still around too :p. The point is, if you're someone who doesn't like to upgrade often, DDR5 will likely be the standard for 4-5 years; going with DDR4 now is potentially like buying a PS4 Pro to replace your PS4 this time last year. :) 


Okay, I think I fixed the issue - the default fan curve was keeping the fans at 0 rpm until the card reached 80C, so it must've been throttling, which also explains the micro-stutters/hitches :rolleyes:

I changed the fan curve to 25% below 40C, gradually increasing from there, and framerates and frametimes seem better. I was getting 50-ish fps in Shadow of the Tomb Raider at 4K, DLSS, max settings + RT; now...
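For anyone wanting to sanity-check a custom curve before punching it into Precision X1, the mapping is just linear interpolation between (temperature, fan %) points. A minimal sketch in Python - the point list below is illustrative, based on the curve described above, not anything the tool itself exports:

```python
def fan_speed(temp_c, curve=((40, 25), (60, 50), (80, 80))):
    """Linearly interpolate fan % from (temp C, fan %) points.

    Below the first point the fan floor applies; above the last point
    it's pinned to the max - i.e. no more 0 rpm zone up to 80C.
    """
    points = sorted(curve)
    if temp_c <= points[0][0]:
        return points[0][1]
    if temp_c >= points[-1][0]:
        return points[-1][1]
    # Walk consecutive pairs of points and interpolate within the segment
    for (t0, f0), (t1, f1) in zip(points, points[1:]):
        if t0 <= temp_c <= t1:
            return f0 + (temp_c - t0) * (f1 - f0) / (t1 - t0)
```

With these points, `fan_speed(30)` gives the 25% floor, `fan_speed(50)` lands halfway between the first two points at 37.5%, and anything past 80C stays at 80%.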

[Screenshot: Shadow of the Tomb Raider]


2 hours ago, BlueAngel said:

Do you have a switch on your card? I know my Asus 2060 had a performance switch that would change the fan settings and voltage. Default was quiet and the other was performance; default would keep the fans off until it got to like 60C and would barely ramp up from there.


No switch, just a very bad default fan curve. Sometimes these variants with giant coolers made for OCing default to a practically non-existent fan curve, as they expect buyers to immediately start tweaking everything.


[Photo: the new card installed]
 

(I haven’t had a chance to run/hide the third 8-pin connector yet.) The RGB on this card is gorgeous. Four zones, all separate from each other: the large visible strip, a separate zone underneath it, the spot on top of the card to the right, and the right side adjacent to my HDDs/SSDs (you can see its glow illuminating one of my HDDs in teal/blue). I do hope EVGA updates their cards to work with iCUE so it can sync with the rest of my lighting (ASUS recently made some of their cards iCUE compatible).


17 hours ago, Spork3245 said:

Okay, I think I fixed the issue - the default fan curve was keeping the fans at 0 rpm until the card reached 80C, so it must've been throttling, which also explains the micro-stutters/hitches :rolleyes:

I changed the fan curve to 25% below 40C, gradually increasing from there, and framerates and frametimes seem better. I was getting 50-ish fps in Shadow of the Tomb Raider at 4K, DLSS, max settings + RT; now...

[Screenshot: Shadow of the Tomb Raider]

 

Yah, the fan curve out of the box was not great (I'm not sure it was no fans until 80, but definitely no fans until it was warmer). Once I got Precision X1 installed and adjusted it, I still barely hear the fans, and even under load in games like 2077 I haven't seen it go above like 60-65.


58 minutes ago, cusideabelincoln said:

3 mechanical hard drives in your system?  That's a lot of porn.


3x 7200 rpm HDDs (750gb + 500gb are both legacy drives that I plan to replace and consolidate eventually - filled with documents and video editing stuff; the 2tb 7200rpm is for VR games and emu) 

1x 5400rpm 3tb for UHD + BD backups and MP3

3x sata SSD (500gb bx500, 1tb ADATA SU800, 2tb ADATA SU800 - all for gaming)

1x m.2 Samsung Evo 970 Pro 500gb for boot/OS

 :p 

 

8 minutes ago, Firewithin said:

 

Yah, the fan curve out of the box was not great (I'm not sure it was no fans until 80, but definitely no fans until it was warmer). Once I got Precision X1 installed and adjusted it, I still barely hear the fans, and even under load in games like 2077 I haven't seen it go above like 60-65.


I did some googling, and it looks like the default is no fans until it goes above 60C... which is super dumb and adds a ton of heat inside the case for no reason. It was also causing tons of throttling. But, whatevs, it’s fixed now :p 


Got my new rig set up now and I'm really happy with the upgrade so far over the old system (kept the 3080, PSU, SSDs). The DDR3 RAM and the 4790K stuck at stock really bottlenecked the heck out of me.

 

Games seem to be about 20 fps better on average with the upgrade; some things like Watch Dogs Legion almost doubled with RTX ultra - that’s actually easily playable at 60 fps now!

 

The big difference in playing, I think, is the minimum framerate improvement though; it’s just so much smoother to play, loving it.

 

Now to redownload MS Flight Simulator and see what’s up.

 

 


2 hours ago, BlueAngel said:

Having a 4790K at stock is like buying a Ferrari to commute to work in; it makes no sense.

 

Yeah, it’s just how the computer was when I got it, remember - it had problems.

 

Even with an OC though, it’s hard to imagine that a CPU still on the DDR3 platform is going to last too far into the gen, especially with ray tracing, which seems to like fast RAM.

 

Good news is, though, all the people who have been holding off will have a great opportunity to upgrade late this year/early next year when DDR5 hits.


36 minutes ago, stepee said:

 

Yeah, it’s just how the computer was when I got it, remember - it had problems.

 

Even with an OC though, it’s hard to imagine that a CPU still on the DDR3 platform is going to last too far into the gen, especially with ray tracing, which seems to like fast RAM.

 

Good news is, though, all the people who have been holding off will have a great opportunity to upgrade late this year/early next year when DDR5 hits.


I’m not sure why you and others keep bringing up DDR3 vs DDR4 for gaming:

 

For gaming there’s almost no difference, and the small amount present in one game is likely from the 5960X vs a 4770K.

 

Here we have Tom's Hardware using a mobo that supports both DDR3 and DDR4:

WWW.TOMSHARDWARE.COM

Does DDR4 really offer better program performance than DDR3? We compare ASRock's Fatal1ty Z170 Gaming K4/D3 to its previously-reviewed DDR4 sibling.


No real difference in gaming on, at least, Intel platforms.

 

Even DDR4 2400 vs 4000 doesn’t make a difference for gaming on Intel platforms. RAM speed only seems to matter for AMD. :p 


1 minute ago, Spork3245 said:


I’m not sure why you and others keep bringing up DDR3 vs DDR4 for gaming:

 

For gaming there’s almost no difference, and the small amount present in one game is likely from the 5960X vs a 4770K.

 

Here we have Tom's Hardware using a mobo that supports both DDR3 and DDR4:

WWW.TOMSHARDWARE.COM

Does DDR4 really offer better program performance than DDR3? We compare ASRock's Fatal1ty Z170 Gaming K4/D3 to its previously-reviewed DDR4 sibling.


No real difference in gaming on, at least, Intel platforms.

 

Even DDR4 2400 vs 4000 doesn’t make a difference on Intel platforms. :p 

 

I mean, that was like half a decade ago and a lot of things have changed since then, and that’s certainly faster DDR3 RAM than I had, so it could depend on the type of RAM your motherboard supports. 

 

But I’m not really talking overall right now anyway, just certain specific things I noticed in certain games, where I think as DDR5 comes in, DDR3 is going to start to not really work so well with games and all their ray tracing shenanigans. I think DDR4 platforms will be fine for a while; they’ll just actually be used more.


1 hour ago, stepee said:

I don’t think we are arguing anything here really, I’m not saying anyone should do anything specific right now, just thinking more about the future and bandwidth and stuff out loud.


By “arguing” I’m referencing the citing of an idea, not a debate between two parties. :p 

 

1 hour ago, stepee said:

 

I mean, that was like half a decade ago and a lot of things have changed since then, and that’s certainly faster DDR3 RAM than I had, so it could depend on the type of RAM your motherboard supports. 

 

But I’m not really talking overall right now anyway, just certain specific things I noticed in certain games, where I think as DDR5 comes in, DDR3 is going to start to not really work so well with games and all their ray tracing shenanigans. I think DDR4 platforms will be fine for a while; they’ll just actually be used more.


System RAM doesn’t really affect gaming performance unless some of the CPU bandwidth is tied to it, like with AMD’s Ryzen chips. That’s typically what VRAM is for, and that also applies to the RT statement.

For non-gaming related things, yes, there’s a difference.

The Tom’s article used DDR3 1600 and DDR3 2400 vs DDR4. There are also plenty of benches showing only a 1-2% difference between DDR4 2400 and DDR4 4000 on Intel platforms for gaming. If you want better framerates in games, faster RAM is not the answer. :p 


Faster CPUs require faster memory, just like faster GPUs require more VRAM bandwidth. I don't think those 5-year-old articles are representative of today's CPUs. Even still, the Tom's article clearly shows how DDR3-1600 can hold back performance (using a 6700K and GTX 970!), and more than likely 1600 MHz is the speed most people are using in these old systems.

 

I'd like to see how there's only a 1-2% difference between 2400 and 4000 with the latest games on the latest GPUs and CPUs. The last two generations of Intel processors do actually see benefit from faster RAM, and there's certainly a huge difference between 2400 and 4000. Sometimes even going from a single-rank configuration to a dual-rank one can improve performance.

 

WWW.TECHSPOT.COM

In this article we'll be searching for Zen 3's memory sweet spot and looking at DDR4 memory performance with the new Ryzen 5000 CPU series, and a...

 

WWW.TECHSPOT.COM

When we reviewed Ryzen's latest iteration we briefly checked out different DDR4 memory speeds but now that things have settled we were put on a mission to...

[Embedded benchmark videos]

4000 MHz memory is not currently necessary or cost-effective, but clearly 2666 MHz and slower is holding back performance on new CPUs.
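For context on what those speed grades mean in raw numbers: peak theoretical bandwidth for DDR3/DDR4 is just the data rate (MT/s) times 8 bytes per 64-bit channel, times the channel count. A back-of-the-envelope sketch - theoretical peaks only, real-world throughput is always lower:

```python
def peak_bandwidth_gbs(data_rate_mts, channels=2):
    """Theoretical peak bandwidth in GB/s for 64-bit (8-byte) DDR channels."""
    return data_rate_mts * 8 * channels / 1000

# Dual-channel theoretical peaks:
#   DDR3-1600 -> 25.6 GB/s
#   DDR4-2666 -> ~42.7 GB/s
#   DDR4-3200 -> 51.2 GB/s
#   DDR4-4000 -> 64.0 GB/s
```

So the jump from the DDR3-1600 in these old systems to even midrange DDR4 is roughly a doubling of peak bandwidth per channel count.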

 

The general rules as of today are:

At 4K, CPU/RAM still make little difference, although there might be a few rare exceptions to this rule. GPUs aren't quite fast enough yet, but they're getting close.

At 1440p, CPU/RAM impact depends on the game and what you're going for.  

At 1080p, there are bottlenecks in most games.

 

In some extreme examples of slowness, a bad CPU/RAM combo can have a game dipping near or below 60 fps, which is my measurement for "time to upgrade". Also, depending on your monitor, there might actually be an ideal framerate you'd want to achieve. Some of these G-Sync/V-Sync monitors on the market actually have different levels of ghosting/blurring depending on the refresh rate or framerate you're sending to the display. Currently I have two 1440p/144+ Hz monitors. If I'm playing a game at 60 fps on monitor 1, it looks like shit with tons of blurring and ghosting and, for some reason, stuttering. When playing a game at 60 fps on monitor 2, it looks fine.


1 hour ago, cusideabelincoln said:

Faster CPUs require faster memory, just like faster GPUs require more VRAM bandwidth. I don't think those 5-year-old articles are representative of today's CPUs. Even still, the Tom's article clearly shows how DDR3-1600 can hold back performance (using a 6700K and GTX 970!), and more than likely 1600 MHz is the speed most people are using in these old systems.

 

I'd like to see how there's only a 1-2% difference between 2400 and 4000 with the latest games on the latest GPUs and CPUs. The last two generations of Intel processors do actually see benefit from faster RAM, and there's certainly a huge difference between 2400 and 4000. Sometimes even going from a single-rank configuration to a dual-rank one can improve performance.

 

WWW.TECHSPOT.COM

In this article we'll be searching for Zen 3's memory sweet spot and looking at DDR4 memory performance with the new Ryzen 5000...

 

WWW.TECHSPOT.COM

When we reviewed Ryzen's latest iteration we briefly checked out different DDR4 memory speeds but now that things have settled we were...

 

[Embedded benchmark videos]

4000 MHz memory is not currently necessary or cost-effective, but clearly 2666 MHz and slower is holding back performance on new CPUs.

 

The general rules as of today are:

At 4K, CPU/RAM still make little difference, although there might be a few rare exceptions to this rule. GPUs aren't quite fast enough yet, but they're getting close.

At 1440p, CPU/RAM impact depends on the game and what you're going for.  

At 1080p, there are bottlenecks in most games.

 

In some extreme examples of slowness, a bad CPU/RAM combo can have a game dipping near or below 60 fps, which is my measurement for "time to upgrade". Also, depending on your monitor, there might actually be an ideal framerate you'd want to achieve. Some of these G-Sync/V-Sync monitors on the market actually have different levels of ghosting/blurring depending on the refresh rate or framerate you're sending to the display. Currently I have two 1440p/144+ Hz monitors. If I'm playing a game at 60 fps on monitor 1, it looks like shit with tons of blurring and ghosting and, for some reason, stuttering. When playing a game at 60 fps on monitor 2, it looks fine.


Yes, I stated multiple times that for AMD Ryzen, RAM speed is important due to its architecture. However, for Intel, no (unless the memory controller changed for LGA1200/the 10xxx series):

 

WWW.PCGAMER.COM

Revisiting the question that every gamer considers when building a new gaming system: does RAM speed matter?


[Benchmark screenshots]

Outside of gaming, does it matter? Yes.


32 minutes ago, cusideabelincoln said:

But I just posted a bunch of tests, with different games than PCGamer, that show Intel scaling with memory speed.


You posted YouTube videos that I’m not dedicating an hour of my life to watch :p 

Can you provide non-video based benchmarks?

I also provided both AnandTech and Tom’s Hardware showing the exact opposite.

Also:

1 hour ago, Spork3245 said:


Yes, I stated multiple times that for AMD Ryzen, RAM speed is important due to its architecture. However, for Intel, no (unless the memory controller changed for LGA1200/the 10xxx series):

WWW.PCGAMER.COM

Revisiting the question that every gamer considers when building a new gaming system: does RAM speed matter?

 


Just now, Spork3245 said:


You posted YouTube videos that I’m not dedicating an hour of my life to watch :p 

Can you provide non-video based benchmarks?

I also provided both anandtech and Tom’s Hardware showing the exact opposite.

Also:

 

 

 

Unfortunately all the good reviews are video-based now, and more often than not they don't post a corresponding article.  All they show are the exact same bar graphs you see in an article though so I usually just skip around.

 

I'm not sure how the memory controller changed on recent Intel processors, but the faster memory more likely matters because there are more cores and faster cores now than there were 5 years ago, and the DDR4 controller can hit higher speeds.


2 minutes ago, cusideabelincoln said:

 

 

Unfortunately all the good reviews are video-based now, and more often than not they don't post a corresponding article.  All they show are the exact same bar graphs you see in an article though so I usually just skip around.

 

I'm not sure how the memory controller changed on recent Intel processors, but the faster memory more likely matters because there are more cores and faster cores now than there were 5 years ago, and the DDR4 controller can hit higher speeds.


The Intel 9xxx series from ~1 year ago shows very little difference between RAM speeds in gaming, so we really can’t hold the 5-year thing up as a marker. With Ryzen, RAM speed was certainly always a factor, however: if you tried to run 2400 MHz DDR4 on a 2600X, you’d be killing your performance potential compared to running 3600 MHz RAM. I remember when the Ryzen 2xxx series first launched, this surprised most people, since Intel was largely unaffected in gaming by system RAM MHz (for rendering and/or data-specific system tasks - i.e., compression, decompression - there was a significant difference).
Back in the day, CAS latency was more important than MHz; now latency is through the roof comparatively :daydream: 
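The "through the roof" part is mostly about the CAS cycle count, not actual time: true first-word latency in nanoseconds works out to 2000 × CL ÷ data rate, since DDR transfers twice per clock. A quick sketch - the example kits below are illustrative, not anyone's actual RAM:

```python
def true_latency_ns(data_rate_mts, cas_latency):
    """First-word latency in ns for DDR memory.

    The real clock is data_rate/2 MHz, and CL is counted in those clocks,
    so latency = cas_latency / (data_rate/2 MHz) = 2000 * CL / data rate.
    """
    return 2000 * cas_latency / data_rate_mts

# DDR3-1600 CL9  -> 11.25 ns
# DDR4-3200 CL16 -> 10.0 ns  (much higher CL number, *lower* real latency)
# DDR4-4000 CL18 -> 9.0 ns
```

So a DDR4 kit with a scary-looking CL16 or CL18 can still have lower actual latency than old CL9 DDR3, because the clock is so much faster.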


So, nVidia dropped Watch Dogs Legion as the 3xxx series pack-in game, and instead I received the new CoD Cold War game and The Falconeer. Looking around, WDL promo codes for sale have seemingly dried up and the best deal I can find is $30 (meanwhile, CoD promo codes are going for like $15-20) - is it worth $30, or should I wait for the inevitable spring sales in 2-3 months?


Just have to watch the videos man, there are Intel CPUs in them :p

 

But like I said, at

 

1080p: Lots of bottlenecks, showing what we can expect when more powerful GPUs come out in the future. EA, Ubisoft, and Square Enix games seem to be the most CPU/memory intensive. And obviously eSports titles at eSports settings show a benefit from faster memory.

 

1440p: A few bottlenecks, depending on the game and what settings you use.

 

4k: If a bottleneck shows up, it's very minimal.

 

I also wonder how developers are going to tackle games now that the next-gen consoles don't have a laptop CPU in them.


19 minutes ago, cusideabelincoln said:

Just have to watch the videos man, there are Intel CPUs in them :p


 

 

10xxx series though going by the thumbnails :p 

 

Quote

But like I said, at

 

1080p: Lots of bottlenecks, showing what we can expect when more powerful GPUs come out in the future. EA, Ubisoft, and Square Enix games seem to be the most CPU/memory intensive. And obviously eSports titles at eSports settings show a benefit from faster memory.

 

1440p: A few bottlenecks, depending on the game and what settings you use.

 

4k: If a bottleneck shows up, it's very minimal.


 

 

This goes back to our previous discussion regarding CPU bottlenecks: they may exist at low resolutions, but they're largely unimportant, and the GPU is still the main factor most of the time regardless of any potential gains at low res.

 

Quote

I also wonder how developers are going to tackle games now that the next-gen consoles don't have a laptop CPU in them.

 

Considering Sony and MS are basically forcing most devs into cross-gen development for at least 1-2 years... we have a while before we’ll know :lol: 


33 minutes ago, cusideabelincoln said:

CPUs do matter when you start needing to push a minimum of 100 fps.

 

 

46 minutes ago, Spork3245 said:

This goes back to our previous discussion regarding CPU bottlenecks: they may exist at low resolutions, but they're largely unimportant, and the GPU is still the main factor most of the time regardless of any potential gains at low res.


 :p 


On 1/13/2021 at 2:43 AM, Fizzzzle said:

That won't do; at least let me buy you something in return. But yes, I would take it.

 

Though, now that I know I could probably use a CPU upgrade, I'm already getting new RAM, and I could definitely use an SSD upgrade... it looks like we're getting into full rebuild territory. It's exciting. Though, thankfully, as opposed to the last 2 PC upgrades I did, I can leave Windows on the tiny SSD and just add another for games. I think. Probably. I haven't looked at my motherboard in a long time.

 

I don't really need anything but just send me ur address.

