
Nvidia 4xxx Series - The Official Perennially Disappointing Hunt for More Power!


Mr.Vic20


In terms of 4K, it's pretty wild! Even more wild is that this will scale nicely to 8K. The market is still not ready for 8K adoption, but it's nice to know that a solution will be well matured in time to meet 8K's eventual arrival. That said, I'm not betting on a good gamer 8K TV until 2024.


https://videocardz.com/newz/nvidia-confirms-5-games-will-support-dlss-3-0-within-a-week

 

Gentlemen, our mission has multiple objectives now:

 

1.) Acquire card by whatever means!

2.) Agonize over prospective ship dates and check shipment status 30+ times a day as if waiting for a text from a person you know has definitely ghosted you...

3.) Install and play the initial meager offering of games that we would largely not play if they didn't support DLSS3*

 

 

And here are your immediate choices:

SUPER PEOPLE – now available

Justice ‘Fuyun Court’ – October 12

Loopmancer – October 12

Microsoft Flight Simulator – October 17

A Plague Tale: Requiem – October 18

F1 22 – update soon

 


10 minutes ago, Mr.Vic20 said:

Did Spider-Man and Cyberpunk get their DLSS 3 patches?


Some DLSS 3 latency tests (and detailed power testing)

 

WWW.IGORSLAB.DE
The time has come and after an almost unbearable period of waiting and speculation, I can finally present the GeForce RTX 4090 Founders Edition (FE) today…

 

 

TLDR:

Adds a little more latency compared to DLSS 2, but improves latency over native-resolution rendering.

Uses less power than a 3090 Ti; the pre-launch power fear-mongering was not really warranted.


21 minutes ago, cusideabelincoln said:

 

Is it that it adds more latency per frame, but the higher framerates achievable with it end up negating that and reducing latency overall?

 

Meaning, if a game is running at 4K 60 fps native and then 4K 60 fps with DLSS 3, is there a latency improvement?

 

Or is it that with DLSS 3 it's running at 90 fps vs. 60 fps native, which provides a latency improvement, but not as good an improvement as if it were running at 90 fps native?


Quote

During testing, we were also given a counterintuitive suggestion for Nvidia systems: reverse the steps normally used to get Nvidia's G-Sync technology working on compatible screens and monitors. G-Sync users are usually told to enable V-Sync on the Nvidia Control Panel level, but doing this increases DLSS 3 FG's latency an additional 60–100 ms. Nvidia says it is working on a streamlined solution to let G-Sync and DLSS 3 FG play nice together.

 

ARSTECHNICA.COM
Our review also dives deeply into DLSS 3—and how it may affect future Nvidia GPUs.

 


28 minutes ago, stepee said:

 

As of right now, it should be mostly impossible to have a game run at 60 fps 4K native and also then run at 60 fps 4K with DLSS 3. That scenario can only happen if there is a very hard CPU/system bottleneck that's capping the framerate at 60. But, and I haven't seen this tested so I'm guessing, if this situation were ever to happen then DLSS 3 would add latency.

 

But if you have a game that runs 60 fps at native 4K, it will run faster with DLSS enabled. For an arbitrary example, if you enable DLSS 2 then the game will run at 90 fps, and you will see a reduction in latency purely from the reduction in frame render time, which hypothetically goes from about 16.7 ms per frame to about 11.1 ms per frame. If you were to then run that game with DLSS 3, you could hypothetically see framerates up to 180 fps, but the latency will be similar to DLSS 2's 90 fps performance.
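
To put numbers on that, here's a quick Python sketch; the fps figures are just the hypothetical ones from my example above, not measurements:

```python
# Frame time in milliseconds for a given framerate.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

# Hypothetical figures from the example above, not measurements.
for label, fps in [("native 4K", 60), ("DLSS 2", 90), ("DLSS 3 frame gen (displayed)", 180)]:
    print(f"{label}: {fps} fps -> {frame_time_ms(fps):.1f} ms per frame")

# native 4K: 60 fps -> 16.7 ms per frame
# DLSS 2: 90 fps -> 11.1 ms per frame
# DLSS 3 frame gen (displayed): 180 fps -> 5.6 ms per frame
#
# The catch: with frame generation only the *displayed* frame time halves;
# input latency still tracks the ~90 fps rendered rate, because the
# generated frames never sample new input.
```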

 

So, yes to the last question. 90 fps native should provide the same or better latency than 90 fps using either version of DLSS.

 

This was in Cyberpunk, so in a totally GPU-limited situation I would expect latency to behave similarly in any game that is GPU bound. But if a game is CPU limited, the situation is going to change. And as Crispy linked to above, capping the framerate, whether in-engine, in-driver, or through V-Sync, can hurt latency. Right now you can't use G-Sync and DLSS 3; the driver doesn't allow it. I suspect it's because the driver isn't smart enough, yet, to determine which frame it should keep or throw away if you're producing way more frames than your display can handle.

 

The solution that Nvidia will probably come up with in time, if you want to use a framerate cap or some type of V-Sync, is dynamic DLSS 3. Basically, if the GPU is powerful enough to render the scene as fast as or faster than your framerate cap, then you don't need to AI-generate any frames at all. So, like dynamic resolution scaling in a lot of (mostly console) games, the GPU would only AI-generate a frame when the render time is longer than the frame time the cap allows.
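
Something like this toy sketch; Python pseudologic, and the function name and numbers are my illustration, not an actual Nvidia API:

```python
# Hypothetical "dynamic DLSS 3" decision, mirroring how dynamic resolution
# scaling decides per frame. Names and numbers are illustrative only.
def should_generate_frame(render_time_ms: float, cap_fps: float) -> bool:
    budget_ms = 1000.0 / cap_fps       # frame time the cap allows
    return render_time_ms > budget_ms  # GPU can't keep up: insert an AI frame

print(should_generate_frame(render_time_ms=20.0, cap_fps=120))  # True  (20 ms > ~8.3 ms budget)
print(should_generate_frame(render_time_ms=6.0,  cap_fps=120))  # False (GPU meets the cap on its own)
```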


23 minutes ago, cusideabelincoln said:

 

Ok, thanks, that's what I thought it was, but I just wanted to be clear what the latency comparisons are based on.

 

Only thing I'm unclear on, I guess, is exactly what you are supposed to do with DLSS 3 and capping the frame rate. If it doesn't support G-Sync, then you don't want a variable frame rate, so you want to cap the frame rate, which adds latency, so they say not to... so what exactly do you do?

 

It sounds like the actual results, at least with a 4090, are as good as you could hope for at launch, and it's in better shape than, say, DLSS 1.0 was, but I just don't get how you use it then.


25 minutes ago, stepee said:

 

Well, all of this is going to be tricky and take some work. Latency is impacted by so many different variables that there is no simple answer when all of those variables can be used in different combinations. V-Sync, double buffering, triple buffering, G-Sync, in-engine framerate caps, driver framerate caps, in-engine render queues, driver render queues, refresh rate, power target, and boost frequency behavior all affect input lag to various degrees. Even the resolution you run impacts latency; I've seen tests that capped everything at the same framerate, yet the lower resolution still yielded (slightly) better latencies than the higher resolution. The same goes for DLSS and G-Sync: input lag is slightly increased when using them even if the framerate is the same as with them off.


Just now, cusideabelincoln said:

 

So basically just cap the frame rate and yolo.


1 minute ago, stepee said:

 

Oh how I wish it were that simple. But, and this is my experience in a single game, I was noticing something weird when trying to cap the framerate in Overwatch 1.

 

Whether capping the framerate below my monitor's refresh rate in-game or through Nvidia's drivers, I noticed an ever-so-slight choppiness. This happened with and without G-Sync.

 

When I uncapped the framerate, the smoothness improved a little bit simply because of the massive increase in fps. Uncapping the framerate also gave me the lowest input lag. I really had to look for it, but it was ever so slightly noticeable.

 

But when I turned Gsync off and just used Vsync, with a framerate that never dipped below my refresh rate, the game definitely looked noticeably smoother. No tearing, no visual jitter. But the input lag was definitely noticeably worse compared to either Gsync or completely uncapped.

 

It's really difficult to get both a completely smooth visual experience and extremely low latencies.  The closest way to achieve this is to simply have way more computational power than you need and just brute force deliver a higher framerate.


13 minutes ago, cusideabelincoln said:

 

Well, I'm definitely going to be capping it at 60 or 120 still if I'm not using VRR, so for me it's really just going to be a matter of finding which way of capping the frame rate introduces the least latency. I do play with a wired gamepad, and don't ever use m/k, so I have that going for me at least.


@stepee More latency tests in MS Flight Sim, F1 2022, and Cyberpunk, showing what I was trying to say.

 

 

In MS Flight Sim, there's a hard CPU bottleneck, so native and DLSS provide the same framerate and thus similar latency. But frame generation adds latency over both, albeit not that much. You won't notice a 6 ms increase with a controller.

 

In F1 and Cyberpunk, there's no CPU bottleneck at native and mostly no bottleneck with DLSS. DLSS is thus able to increase the framerate and lower the latency. Frame generation adds latency over DLSS, but it still provides better latency than native.

 

I suspect the reason we don't see a doubling in performance in F1 and Cyberpunk like we do in Flight Sim is GPU load. There is not as much GPU headroom for the AI processing cores to do their thing when the rest of the GPU is already fully utilized, compared to a CPU-bottlenecked game like Flight Sim where the GPU is stuck at half utilization. Now perhaps this can be improved with driver and programming updates, or perhaps there is just some bottleneck in the chip's architecture; most testing shows that DLSS + frame generation actually reduces power consumption, so power can't be the problem.
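
Here's a rough way I'd model the scaling difference, assuming (my simplification, not review data) that frame generation shows at most 2x the rendered framerate and the rendered rate is set by whichever of CPU or GPU is slower:

```python
# Toy scaling model, not review data: rendered fps is capped by whichever
# of CPU or GPU is slower; frame generation can show at most 2x that.
def displayed_fps(cpu_limit_fps: float, gpu_limit_fps: float, frame_gen: bool) -> float:
    rendered = min(cpu_limit_fps, gpu_limit_fps)
    return rendered * 2 if frame_gen else rendered

# CPU-bound like Flight Sim: GPU half idle, so frame gen really doubles fps.
print(displayed_fps(cpu_limit_fps=60, gpu_limit_fps=120, frame_gen=True))   # 120

# GPU-bound like Cyberpunk: frame gen shares the GPU with rendering, so the
# rendered rate itself dips and you get less than this clean 2x in practice.
print(displayed_fps(cpu_limit_fps=200, gpu_limit_fps=80, frame_gen=True))   # 160 (an upper bound)
```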


6 minutes ago, Mr.Vic20 said:

 

I don't see a top banner or German. You crazy.


2 minutes ago, Keyser_Soze said:

Don't you have to put the tetrahedron in the metronome today or something? :p

Actually, dear boy, I have the week off! After having successfully covered the Synchrotron ring by Oct 5th, it seemed like the only shot I would get at a vacation, so I took it!

 

