
Nvidia - Geforce Beyond (EVGA) Event


Mr.Vic20


3 minutes ago, Spork3245 said:

 

If you only want 60fps, I get why you're so adamant about AMD

mickey mouse troll GIF

dave chappelle what GIF

 

I was thinking about that yesterday and came to the conclusion that if I didn't switch due to the 7970 being the top card for, what, until RDNA2, I don't think I'd ever switch.

 

High edit: Ya, I remember being mad excited that RDNA1 delivered the CrossFire performance of the 7970 at all times.


And here are our first 35 GAMES* that use DLSS 3:

 

  • A Plague Tale: Requiem
  • Atomic Heart
  • Black Myth: Wukong
  • Bright Memory: Infinite
  • Chernobylite
  • Conqueror's Blade
  • Cyberpunk 2077
  • Dakar Rally
  • Deliver Us Mars
  • Destroy All Humans! 2 - Reprobed
  • Dying Light 2 Stay Human
  • F1 22
  • F.I.S.T.: Forged In Shadow Torch
  • Frostbite Engine
  • HITMAN 3
  • Hogwarts Legacy
  • ICARUS
  • Jurassic World Evolution 2
  • Justice
  • Loopmancer
  • Marauders
  • Microsoft Flight Simulator
  • Midnight Ghost Hunt
  • Mount & Blade II: Bannerlord
  • Naraka: Bladepoint
  • NVIDIA Omniverse
  • NVIDIA Racer RTX
  • PERISH
  • Portal with RTX
  • Ripout
  • S.T.A.L.K.E.R. 2: Heart of Chornobyl
  • Scathe
  • Sword and Fairy 7
  • SYNCED
  • The Lord of the Rings: Gollum
  • The Witcher 3: Wild Hunt
  • THRONE AND LIBERTY
  • Tower of Fantasy
  • Unity
  • Unreal Engine 4 & 5
  • Warhammer 40,000: Darktide

 

 

NOTE THE WEIRD PADDING OUT OF THE LIST WITH ENTRIES LIKE "NVIDIA Omniverse" and of course that classic game "Unreal Engine 4 & 5" :p

  • Haha 2

39 minutes ago, Zaku3 said:

 

Does AMD invest in AI? I don't remember.

 

Let's put it this way: the M1/M2 Mac silicon has more support in the AI library community than AMD hardware does. I don't know anyone who uses AMD GPUs for AI. At most, they use AMD CPUs for lightweight inference.
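
To make "support" concrete, PyTorch is a decent proxy: Apple Silicon gets a first-party MPS backend, while AMD GPUs go through ROCm builds that expose themselves through the CUDA API surface. A minimal device-picking sketch, assuming a recent PyTorch install; the fallback order and prints are just illustrative:

```python
import torch

def pick_device() -> torch.device:
    """Pick the best available accelerator, falling back to CPU."""
    if torch.cuda.is_available():
        # True for NVIDIA CUDA builds and for AMD ROCm builds,
        # which present themselves through the same CUDA API.
        backend = "ROCm" if torch.version.hip else "CUDA"
        print(f"Using GPU via {backend}")
        return torch.device("cuda")
    if torch.backends.mps.is_available():
        # Apple Silicon (M1/M2) Metal Performance Shaders backend.
        print("Using Apple MPS")
        return torch.device("mps")
    print("Falling back to CPU")
    return torch.device("cpu")

device = pick_device()
x = torch.randn(4, 4, device=device)  # small smoke test on the chosen device
print(x.sum().item())
```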


Yeah, looking at the specs, the 12GB model is really a 4080 Lite: fewer cores, less bandwidth, and lower power draw. The real deal is then a $500 price increase over the 3080. Damn. Or I guess you could be charitable and pretend the 16GB is the Ti version, but ehhh.


Just now, Paperclyp said:

My computer is due for a refresh. I bought a pretty meh card in the heat of the card shortages because my other one (which was a bit better) died. I don't know when I'll be ready to actually pull the trigger on an expensive card again though.

 

I wonder at what point it's gonna be better to just buy a prebuilt PC.


Just now, stepee said:

Yeah, looking at the specs, the 12GB model is really a 4080 Lite: fewer cores, less bandwidth, and lower power draw. The real deal is then a $500 price increase over the 3080. Damn. Or I guess you could be charitable and pretend the 16GB is the Ti version, but ehhh.

Or we can just look in the mirror now and realize: look man, we both knew we intended to get a 4090 all along, we just told friends we were considering a 4080 to seem reasonable. Problem solved! :sun:

 

Archer (2009) "Come on, buddy." GIF

  • Haha 1

1 hour ago, Mr.Vic20 said:

 

 

Absolutely stoked for this; hopefully the frame interpolation still yields good-quality results at frame rates closer to 60. I don't care about VRS and won't need all that extra headroom.

Hopefully a 4070 strikes a good balance between this and power draw.  I could see myself building a new rig with this once prices cool.


20 minutes ago, Mr.Vic20 said:

Hmm, the kids on Resetera are claiming that DLSS 3 will be exclusive to the 4xxx series. Not sure where it says that; it doesn't quite say either way.

 

 

 

That is correct. It's exclusive to the 40xx series. And just like that, I'm not interested in 30 series cards anymore.

  • True 1
  • Halal 1

3 minutes ago, Mr.Vic20 said:

Ack, it's all terribly true:

 

WWW.NVIDIA.COM

30 Series vs 20 Series? RTX 20 vs GTX 10 Series? 10 vs 900? All the features and more.

 

 

 

That cheeky bastard definitely avoided the word "exclusive" when telling us all about how awesome 3.0 is. I guess they still need to get rid of the 3000 inventory sitting in warehouses lol.

  • Haha 1

11 minutes ago, Mr.Vic20 said:

Ack, it's all terribly true:

 

WWW.NVIDIA.COM

30 Series vs 20 Series? RTX 20 vs GTX 10 Series? 10 vs 900? All the features and more.

 

 

 

Disappointing I suppose, but also I will father a child simply to sacrifice it in return if needed to get one of these asap, so Ima have me some dlss3.

  • Sicko 1

The dirty truth behind the 2-4x performance claim is definitely the DLSS technology, along with one other trick:

 


 

That other trick is how they can claim the 4090 is 2-4x faster than a 3090 Ti: they limit the 3090 Ti to 350 watts while their performance numbers have the 4090 running at 450 watts or more.

 

Likewise, the "imaginary numbers" they are using for the 4080 are probably from it running at 450+ watts, while they are definitely limiting their "3080 numbers" to 350 watts.

 

Not really a completely apples-to-apples comparison. While the 3080 and 3090 launched with a 350W TDP, many cards were able to push up to 450W. And the 3080, 3090, and their Ti variants are all power constrained: they'll hit their power limits before their clock limits, and since they're all so close in shader count, they all basically perform the same when limited to the same TDP.
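
To put rough numbers on the apples-to-apples point, here's a quick perf-per-watt sketch. The frame rates are made up purely for illustration; only the 350W cap and 450W figures come from the discussion above.

```python
# Hypothetical perf-per-watt comparison; fps values are placeholders,
# only the power limits (350 W cap vs 450 W) come from the post above.
cards = {
    "3090 Ti capped at 350 W": {"fps": 100.0, "watts": 350.0},
    "4090 at 450 W":           {"fps": 200.0, "watts": 450.0},
}

for name, c in cards.items():
    print(f"{name}: {c['fps']:.0f} fps, {c['fps'] / c['watts']:.3f} fps/W")

# Raw fps reads as "2x", but the perf/W ratio here is only ~1.56x, and letting
# the 3090 Ti draw the ~450 W many AIB cards allow would shrink the fps gap too.
```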

  • Halal 1

7 minutes ago, Spork3245 said:

 

EVGA FTW3 and FTW3 Ultra cards still under warranty seem to be selling for a premium at the moment.

 

5 minutes ago, Mr.Vic20 said:

I'm sure for a select few people those are now "collector's items" as it's the last of the EVGA GPU line.

 

This seems weird to me, like just get a box or something lol, but I’ll take a couple hundred extra!

  • Halal 1

1 hour ago, stepee said:

Man, though, with the RT cost reduction, DLSS 3.0, and just the extra power combined, PC is sitting on a ridiculous leap over consoles again now.


FSR 2.0 was going to put in a lot of work on traditional rasterized performance, at least. But yeah, DLSS 3.0 just created a new paradigm shift.


Think about all those UE5 games that will be CPU limited; frame generation adds frames on the GPU side, so it can raise the presented frame rate even past a CPU bottleneck.
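
Back-of-the-envelope version of why that matters in a CPU-bound game; the 60 fps cap and the overhead factor below are assumptions, not measured numbers:

```python
# Hypothetical: the CPU caps rendered frames at 60 fps.
cpu_capped_fps = 60

# DLSS 3 frame generation inserts a generated frame between each pair of
# rendered frames on the GPU, so presented fps roughly doubles regardless
# of the CPU cap, minus some overhead (the factor below is an assumption).
generation_efficiency = 0.95
presented_fps = cpu_capped_fps * 2 * generation_efficiency
print(f"~{presented_fps:.0f} fps presented from a {cpu_capped_fps} fps CPU-bound base")
```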

 

 

