
Nvidia - Geforce Beyond (EVGA) Event



Just now, Mr.Vic20 said:

Hmmm, I've seen a variety of benchmarks that differ from these, but I also don't own a 3080, so what do I know?


At 1440p the difference is closer to 20% most of the time. The only benches I recall where it was around 20% at 4K were the likes of Doom Eternal, where it was 215fps vs 245fps… and unless you're into screen tearing you're capping that at 120-144fps :p 


23 minutes ago, best3444 said:

 

I did a quick Google search and read this 😆 

 

"B&O's 8K OLED TV starts at $18,125 for 65 inches"...


Spend the bigger bucks and get the JVC NZ9 4K/8K laser projector for a few thousand dollars more at $26k. It doesn't do Dolby Vision, but it has HDR10+ and VRR, and it can do ssssooooooooo much better than a 65" TV. 


The good news is that by the time 8K OLED is a sane purchase, there should be GPUs that can actually run modern games at 8K/60, unlike when 4K first hit!

 

Also, lol, man, ResetEra is still having quite the meltdown over the prices. I hope they're legit about skipping it and keep shaming people for buying one; might make it easier to grab.


5 minutes ago, stepee said:

The good news is that by the time 8K OLED is a sane purchase, there should be GPUs that can actually run modern games at 8K/60, unlike when 4K first hit!

 

Also, lol, man, ResetEra is still having quite the meltdown over the prices. I hope they're legit about skipping it and keep shaming people for buying one; might make it easier to grab.

 

I think some of them are trolling. Imagine dropping PC gaming over Nvidia pricing when you could just, ya know, buy a cheap prebuilt, or, gasp, maybe buy a cheaper AMD card.


15 minutes ago, silentbob said:


Spend the bigger bucks and get the JVC NZ9 4K/8K laser projector for a few thousand dollars more at $26k. It doesn't do Dolby Vision, but it has HDR10+ and VRR, and it can do ssssooooooooo much better than a 65" TV. 


HDR on projectors is kinda weak sauce compared to TVs, though, right? Or has that changed?

 

6 minutes ago, stepee said:

The good news is that by the time 8K OLED is a sane purchase, there should be GPUs that can actually run modern games at 8K/60, unlike when 4K first hit!

 

Also, lol, man, ResetEra is still having quite the meltdown over the prices. I hope they're legit about skipping it and keep shaming people for buying one; might make it easier to grab.

 

Mini LED 8k is kinda reasonably priced, like $5k for 85”. I haven’t kept up on how they compare to OLED, though :( 


Just now, Zaku3 said:

 

I think some of them are trolling. Imagine dropping PC gaming over Nvidia pricing when you could just, ya know, buy a cheap prebuilt, or, gasp, maybe buy a cheaper AMD card.

 

Haha, yeah, they had a poll where like 40% of them were going to just straight-up quit PC gaming over it; the outrage is amazing. 


3 minutes ago, stepee said:

 

Haha, yeah, they had a poll where like 40% of them were going to just straight-up quit PC gaming over it; the outrage is amazing. 

 

The bad thing is they could be serious. I legit sit here after having terrible luck with my old faithful Gigabyte mobos. Had so many of them die on me after a while, but I'm just gonna, gasp, buy another mobo for the 7xxx 3D chips. 

 

If I ever felt that AMD was ripping me off on a CPU/GPU, I'd either buy an Intel CPU, wait it out (since I'm a gamer, any non-gaming boost is just a bonus to me), get an Nvidia card, or wait for a price drop. 

 

They're literally in a console echo chamber. I need to break it to them that papa PC gaming is more popular than console gaming worldwide, because you probably need a PC for schoolwork or your job/career anyway, and the cost of adding a GPU, or just using the iGPU it already has, is cheaper than buying a console. 


1 hour ago, Spork3245 said:


HDR on projectors is kinda weak sauce compared to TVs, though, right? Or has that changed?


You won't get the brightness of a TV, but I think the HDR looks good even on my current JVC NX7 4K projector on the low bulb setting. The newer one is laser and has 20k+ hours of life on high laser usage, or 40k+ on the low setting. This shot was taken with my iPhone 12 on my 106" screen, from the Obi-Wan series running at 4K/HDR. I'd love a newer screen with newer material that would take even better advantage of the HDR on my projector.

 

[photo: Obi-Wan series in 4K/HDR on the 106" screen]

 

I forgot I had a Gears 5 pic as well. Took this almost an hour after I got my NX7; it got better with the latest firmware updates and after the projector was worked in a bit.

 

[photo: Gears 5 on the NX7]


Until light steering makes it into projectors, I'm not remotely considering one over an OLED for regular home use. Next year will mark the first commercial delivery, when Cinionic/Barco puts their unit into a handful of cinemas, so we are probably 6-8 years away from light steering as a common feature of home projector tech.


18 hours ago, Spork3245 said:


There’s only a handful of games where this is achieved at 4k+. It’s typically closer to around 10%.

 

[attached 4K benchmark charts: 4K_HZD, 4K_DS, 4K]


The 4080 16GB is being stated as having the same "gain" as the 4090, but over the 3080 Ti rather than the 3090 Ti, and those two are barely apart performance-wise (it's a smaller gap than the 3080 vs 3090… IIRC, it's around 7-8%).

 

Being stated by Nvidia :p  And some of their charts clearly showed that, without DLSS 3, the 4080 12GB is slower than a 3090 Ti, putting it around a 3080 Ti, with the 16GB being about 15% faster. 

 

[Nvidia relative-performance chart]

 

Resident Evil, Assassin's Creed, and The Division don't support DLSS 3, while the other games do, and that's where we're seeing the 2x+ performance jumps.

 

DLSS 3.0 and ray tracing are where the real performance gains will come from.

 

[chart: DLSS 3 performance comparison]
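To make the baseline point above concrete, here's a quick back-of-the-envelope sketch in Python. The ~7-8% 3080 Ti vs 3090 Ti gap is the figure quoted above; the 2x multiplier is just an example input, not a measured result.

```python
# Rough illustration of how the choice of baseline changes a headline "2x" claim.
# The ~7.5% gap between the 3080 Ti and 3090 Ti is the figure quoted above;
# the 2.0x gain is an illustrative example, not a benchmark number.

gap_3090ti_over_3080ti = 1.075      # assume the 3090 Ti is ~7.5% faster than the 3080 Ti
claimed_gain_over_3080ti = 2.0      # "2x over the 3080 Ti"

equivalent_gain_over_3090ti = claimed_gain_over_3080ti / gap_3090ti_over_3080ti
print(f"2.00x over a 3080 Ti ≈ {equivalent_gain_over_3090ti:.2f}x over a 3090 Ti")
# -> roughly 1.86x: the same headline number shrinks against the faster baseline.
```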

 


I think I'm taking the wrong thing from the chart; I'm just focused on the fact that the worst-case example for the 4090 (AC, of course!) is still 1.5x a 3090 Ti, and yeahhhhhhh

 

But yeah, the 4080 12GB being under the 3090 Ti, that branding was a mistake; I think they'll get more backlash than they will trick people with it.


12 minutes ago, cusideabelincoln said:

 

Being stated by Nvidia :p  And some of their charts clearly showed that, without DLSS 3, the 4080 12GB is slower than a 3090 Ti, putting it around a 3080 Ti, with the 16GB being about 15% faster. 

 

[Nvidia relative-performance chart]

 

Resident Evil, Assassin's Creed, and The Division don't support DLSS 3, while the other games do, and that's where we're seeing the 2x+ performance jumps.

 

DLSS 3.0 and ray tracing are where the real performance gains will come from.

 

[chart: DLSS 3 performance comparison]

 


Saying 2-4x for any of the cards is ridiculous since the 4090 is nowhere near 2x without DLSS. The 4080 12GB is likely on par with a non-Ti 3090. Based on that chart, without DLSS, I'd argue that none of these is an actual generational leap; they're closer to the normal jumps we've been seeing since the 1000 series. Though I think we'll have to wait for actual benchmarks, not bull charts.


20 minutes ago, stepee said:

I think I'm taking the wrong thing from the chart; I'm just focused on the fact that the worst-case example for the 4090 (AC, of course!) is still 1.5x a 3090 Ti, and yeahhhhhhh

 

But yeah, the 4080 12GB being under the 3090 Ti, that branding was a mistake; I think they'll get more backlash than they will trick people with it.

 

That's what I'm trying to say: in a worst-case scenario the 4090 is still the typical generational leap we've seen lately. AC is also a very CPU- and memory-intensive game, so the gains in that chart could be limited by that.

 

Nvidia's charts don't technically lie; they've been pretty accurate. But there are a million different ways to configure a test, so how they get their numbers is as important as what the numbers actually are.


Now that I'm seeing more about how their numbers are put together and have more details sorted, my initial excitement about this series being kinda incredible has definitely changed to only thinking the 4090 is a great card. I was missing that huge difference between the 4080 16GB and the 4090 because of their 2-4x marketing.

 

I hope AMD has something to make those 4080s look extra bad, because it's some BS. The 4090 should most likely be the 4080 at $1200. 

 

That 4090 still gets me so damn hard though so I’ma be bad.


16 minutes ago, stepee said:

Now that I'm seeing more about how their numbers are put together and have more details sorted, my initial excitement about this series being kinda incredible has definitely changed to only thinking the 4090 is a great card. I was missing that huge difference between the 4080 16GB and the 4090 because of their 2-4x marketing.

 

I hope AMD has something to make those 4080s look extra bad, because it's some BS. The 4090 should most likely be the 4080 at $1200. 

 

That 4090 still gets me so damn hard though so I’ma be bad.

 

I think Nvidia is pivoting to data center customers. Someone posted a breakdown of their revenue, I think, and data centers exceeded gaming. It was close, but that "gaming" revenue was probably being boosted by mining. Without mining, their growth segment seems to be data centers. 

 

It kinda sucks that PC gaming's primary hardware companies don't seem to care too much about gaming. Nvidia cares more about AI and other tasks where its GPUs can do the heavy lifting, processing-wise, while AMD is more focused on servers, laptops, and desktops. 


1 minute ago, Zaku3 said:

 

I think Nvidia is pivoting to data center customers. Someone posted a breakdown of their revenue, I think, and data centers exceeded gaming. It was close, but that "gaming" revenue was probably being boosted by mining. Without mining, their growth segment seems to be data centers. 

 

It kinda sucks that PC gaming's primary hardware companies don't seem to care too much about gaming. Nvidia cares more about AI and other tasks where its GPUs can do the heavy lifting, processing-wise, while AMD is more focused on servers, laptops, and desktops. 

 

Tbf, I feel like that stuff is the only reason we get what we do now! I think gaming tech at this point is just the residuals from R&D targeted at enterprise. I feel like gains would have really slowed down if they were developing with a gaming-only focus.

 

 


5 minutes ago, stepee said:

 

Tbf, I feel like that stuff is the only reason we get what we do now! I think gaming tech at this point is just the residuals from R&D targeted at enterprise. I feel like gains would have really slowed down if they were developing with a gaming-only focus.

 

 

 

Ya, man. That presentation was gaming for, what, 20 minutes, then off to some metaverse (I don't want to live on this planet anymore) stuff.

 

 


44 minutes ago, stepee said:

 

Someone summarize this for me, does it run the video games well?

 

In Cyberpunk, the 4090 with DLSS 3 frame generation is 2.5x faster than a 3090 Ti using DLSS 2 performance mode.

 

In Spider-Man, the 4090 provides twice the performance using DLSS 3 frame generation compared to native resolution.

 

In Portal, the 4090 gives 5.5x performance using DLSS 3 frame generation, while DLSS 2 alone is 3.33x faster than native rendering.

 


 

Latency using DLSS 3 frame gen is similar to using DLSS 2. Worst case, it's still better than the latency without using DLSS at all.

 

So this does mean the higher frame rate from frame generation does not give you the latency benefit you'd get if that frame rate were rendered without frame gen.

 

The good news about frame generation is that the image quality is top-notch and it will make the game look smoother.
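A minimal sketch (plain Python, illustrative numbers only, not measurements) of why the interpolated frames improve smoothness but not responsiveness, plus the arithmetic behind the Portal split above:

```python
# Toy model of DLSS 3 frame generation: one generated frame is inserted per
# rendered frame, so displayed fps roughly doubles, but input is still only
# sampled on rendered frames. The fps values below are placeholders.

def frame_gen_summary(rendered_fps: float) -> None:
    displayed_fps = rendered_fps * 2           # one interpolated frame per rendered frame
    input_interval_ms = 1000 / rendered_fps    # responsiveness is still tied to this
    display_interval_ms = 1000 / displayed_fps
    print(f"rendered {rendered_fps:.0f} fps -> displayed ~{displayed_fps:.0f} fps; "
          f"new image every {display_interval_ms:.1f} ms, "
          f"input sampled every {input_interval_ms:.1f} ms")

frame_gen_summary(60)    # looks ~120 fps smooth, still responds like ~60 fps
frame_gen_summary(100)

# The Portal figures above imply the same split: 3.33x from DLSS 2 upscaling,
# and roughly 5.5 / 3.33 ≈ 1.65x on top of that from frame generation.
print(f"frame-gen share of the Portal uplift ≈ {5.5 / 3.33:.2f}x")
```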


I was most interested to see the artifacting concerns with DLSS 3. DF seems a little unsure how to treat them, since when they show up it's only for a single frame, which is pretty much imperceptible at high framerates.

 

I'd really like them to go in depth with lower-FPS games, though. For example, a game that runs at 30-40fps with DLSS 2, then gets upscaled (and vsynced) to ~60fps with DLSS 3. How obvious are the artifacts then? This has everything to do with the longevity of these cards, to me.

 

Latency seems alright. DLSS 3 4K performance mode has better latency than native 4K. The bigger question now is where the quality modes stand.
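On the 30-40fps question above, a tiny arithmetic sketch (not data) of why a one-frame artifact should be easier to spot at lower output frame rates:

```python
# How long a single frame (artifact included) stays on screen at different
# output frame rates: at 120 fps a bad interpolated frame is gone in ~8 ms,
# while at 60 fps it lingers twice as long, which is the worry with lower-FPS games.

for output_fps in (60, 120, 144):
    persistence_ms = 1000 / output_fps
    print(f"{output_fps} fps output -> each frame is visible for ~{persistence_ms:.1f} ms")
```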

