
Nvidia - GeForce Beyond (EVGA) Event


Mr.Vic20

Recommended Posts

Seems like a lot of people are understandably upset at the prices, but they're about what I would expect. If the last couple years of shortages showed one thing, it's that GPU prices were less than the market would bear. Nvidia would be foolish to look at a market where their products are consistently being sold over MSRP and not try to capture some of that themselves.

Link to comment
Share on other sites

1 hour ago, TwinIon said:

Seems like a lot of people are understandably upset at the prices, but they're about what I would expect. If the last couple years of shortages showed one thing, it's that GPU prices were less than the market would bear. Nvidia would be foolish to look at a market where their products are consistently being sold over MSRP and not try to capture some of that themselves.


Crypto and the chip shortage were a large part of that.

 

I'd love to see these cards drop in price if demand cools. But even if they don't for a while, surely they will within 2-3 years.

Link to comment
Share on other sites

39 minutes ago, Spork3245 said:

 

Disclaimer: I haven't watched this whole video yet

I watched part of it. Basically, while DLSS 3 may appear smoother on screen, gamers may not feel the smoothness because input latency has not improved vs. DLSS 2.0 with Reflex. This may be a problem for certain types of competitive games, but probably not a problem for others.

Link to comment
Share on other sites

43 minutes ago, Massdriver said:

I watched part of it. Basically, while DLSS 3 may appear smoother on screen, gamers may not feel the smoothness because input latency has not improved vs. DLSS 2.0 with Reflex. This may be a problem for certain types of competitive games, but probably not a problem for others.

 

Sounds like the word “problem” is doing a lot of heavy lifting here :P

 

 

Link to comment
Share on other sites

29 minutes ago, stepee said:

 

Sounds like the word “problem” is doing a lot of heavy lifting here :P

 

 


My concern is the game looking smoother than the response feels. Maybe it won’t even really be noticeable, like when I sold my Sony GDM-FW900 CRT and went to a 4K HDR LED, but in my head I have it feeling “off”. This is mainly because it’s adding SO many frames, about doubling the framerate over DLSS 2.0 in some cases.

Link to comment
Share on other sites

4 minutes ago, Spork3245 said:


My concern is the game looking smoother than the response feels. Maybe it won’t even really be noticeable, like when I sold my Sony GDM-FW900 CRT and went to a 4K HDR LED, but in my head I have it feeling “off”. This is mainly because it’s adding SO many frames, about doubling the framerate over DLSS 2.0 in some cases.

 

So do you mean more like the latency might not be bad per se, but since the game is running at 120fps with DLSS 3 while the latency is more like, let’s say, playing at 60fps native, it still feels off?

 

I suppose I could see that being a bother, though it seems like something your brain might easily adjust to, sort of like adjusting to the extra latency with Moonlight, but even easier. Only one way to find out! $$$$$$$$$

Link to comment
Share on other sites

9 minutes ago, Spork3245 said:


My concern is the game looking smoother than the response feels. Maybe it won’t even really be noticeable, like when I sold my Sony GDM-FW900 CRT and went to a 4K HDR LED, but in my head I have it feeling “off”. This is mainly because it’s adding SO many frames, about doubling the framerate over DLSS 2.0 in some cases.

I doubt Nvidia would go all in with it if it felt off. My guess is it will only bother people that are pro gamers, but perhaps I'm too optimistic. 

Link to comment
Share on other sites

14 minutes ago, Massdriver said:

I doubt Nvidia would go all in with it if it felt off. My guess is it will only bother people that are pro gamers, but perhaps I'm too optimistic. 


They went all in on PhysX and even encouraged people to buy a second video card to use as a dedicated PhysX card. I’m just saying. :p 

18 minutes ago, stepee said:

 

So do you mean more like the latency might not be bad per se, but since the game is running at 120fps with DLSS 3 while the latency is more like, let’s say, playing at 60fps native, it still feels off?

 

I suppose I could see that being a bother, though it seems like something your brain might easily adjust to, sort of like adjusting to the extra latency with Moonlight, but even easier. Only one way to find out! $$$$$$$$$

 

The concern would be rendering at, say, 40fps but displaying 90-120. A lot of the frames would be “guesses” that the AI is making and not actually happening in-game. Getting the actual latency of, say, 90+fps while displaying 160-200 wouldn’t be as noticeable, since latency at 90fps is already good. It’s possible I’m greatly overthinking it all, but it makes me want to wait for solid reviews and not previews that are limited in what they can and cannot say. At the end of the day, you can just run 2.0 on these cards if you don’t like 3.0, and it should still be significantly faster than the 3000 series, though :p
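Rough napkin math on what I mean below; the numbers are purely illustrative and assume frame generation boosts the displayed frame rate while input latency keeps tracking the underlying render rate:

```python
# Illustrative only: assumes generated frames add visual smoothness while
# input latency stays tied to the underlying "real" render rate.

def frame_time_ms(fps):
    """Frame time in milliseconds for a given frame rate."""
    return 1000.0 / fps

scenarios = [
    # (rendered fps, displayed fps with frame generation)
    (40, 120),
    (90, 180),
]

for rendered, displayed in scenarios:
    print(
        f"Renders {rendered} fps (~{frame_time_ms(rendered):.1f} ms/frame), "
        f"displays {displayed} fps (~{frame_time_ms(displayed):.1f} ms/frame on screen), "
        f"but responsiveness still feels like ~{rendered} fps."
    )
```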

  • Halal 1
Link to comment
Share on other sites

56 minutes ago, Spork3245 said:


They went all in on PhysX and even encouraged people to buy a second video card to use as a dedicated PhysX card. I’m just saying. :p 

 

The concern would be rendering at, say, 40fps but displaying 90-120. A lot of the frames would be “guesses” that the AI is making and not actually happening in-game. Getting the actual latency of, say, 90+fps while displaying 160-200 wouldn’t be as noticeable, since latency at 90fps is already good. It’s possible I’m greatly overthinking it all, but it makes me want to wait for solid reviews and not previews that are limited in what they can and cannot say. At the end of the day, you can just run 2.0 on these cards if you don’t like 3.0, and it should still be significantly faster than the 3000 series, though :p

 

Yeah, I see what you mean. We will have to see exactly how magic this magic is, but the worst-case scenario here is that it’s still a damn banger of a card :P

Link to comment
Share on other sites

 

1 hour ago, Spork3245 said:

The concern would be rendering at, say, 40fps but displaying 90-120. A lot of the frames would be “guesses” that the AI is making and not actually happening in-game. Getting the actual latency of, say, 90+fps while displaying 160-200 wouldn’t be as noticeable, since latency at 90fps is already good. It’s possible I’m greatly overthinking it all, but it makes me want to wait for solid reviews and not previews that are limited in what they can and cannot say. At the end of the day, you can just run 2.0 on these cards if you don’t like 3.0, and it should still be significantly faster than the 3000 series, though :p


Don’t think we can use a native presentation’s input lag as a benchmark here.  There’s still the standard DLSS super resolution pass as part of the process.

 

Not that I’m expecting anything crazy.  Reflex has to compensate for a lot.

 

Link to comment
Share on other sites

30 minutes ago, CastlevaniaNut18 said:

All of this is tempting, but I think my 3080 still has a good bit of life in it. 
 

Do need to upgrade my CPU to one of the new Ryzens soon, though. 

I’m thinking of picking up a 5800X3D early next year, hopefully at a discount, and popping it into my motherboard. I would be upgrading from a 3800X.

Link to comment
Share on other sites

"NVLink is dead to GeForce

Nvidia’s SLI technology for multi-GPU setups has been dead for a while now, but the “NVLink” tech that supplanted it remained in last-generation’s RTX 3090 GPUs. No more. The spec pages for the GeForce RTX 4090 and 4080s explicitly say that NVLink is not supported this generation. Pour one out."

 

-PC World

 

"Good"

 

-Vic20

  • Halal 1
  • Hugs 1
Link to comment
Share on other sites

57 minutes ago, Dre801 said:

Why wouldn't the 12GB 4080, which clearly has cut-down specs compared to the 16GB model, just be the 4070? I understand an official 4070 is coming, but it seems they already have one.


Nvidia wants to purposely make this confusing for some reason. My guess is that the actual 4070 will be 8-10GB.

 

 

Link to comment
Share on other sites

59 minutes ago, Dre801 said:

Why wouldn't the 12GB 4080, which clearly has cut-down specs compared to the 16GB model, just be the 4070? I understand an official 4070 is coming, but it seems they already have one.

 

Because people would be even more upset if they charged $900 for a 4070.

 

Honestly, to me it looks like the announced 4080 specs are what would historically be an xx70-class card, as they are so much further below the 4090 than the 3080 was below the 3090.

Link to comment
Share on other sites

Also, the more models you make, the more you can pump the public for cash. This is a console trick coming to GPUs. Did you want the $899 card, designed to make this product look like it's reasonably priced (in a modern context)? Well sorry, we have none of those, but check out the $1,200 unit, we have some of those! You still want the latest hotness, right?

Link to comment
Share on other sites

1 hour ago, cusideabelincoln said:

 

Because people would be even more upset if they charged $900 for a 4070.

 

Honestly, to me it looks like the announced 4080 specs are what would historically be an xx70-class card, as they are so much further below the 4090 than the 3080 was below the 3090.

 

That, and this price is technically a lie. The only cards at MSRP are Founders Edition cards, and this card is AIB only.

 

Between the large amount of existing stock, Nvidia looking to just slot the 4xxx in above the 3xxx, and leaving AIBs out to dry on MSRP, I can see why EVGA bailed.

Link to comment
Share on other sites

1 hour ago, cusideabelincoln said:

Because people would be even more upset if they charged $900 for a 4070.

 

Honestly, to me it looks like the announced 4080 specs are what would historically be an xx70-class card, as they are so much further below the 4090 than the 3080 was below the 3090.

 

Hey, I'm not the only one who thought this. It really felt like they unveiled a 4090, 4070, and 4060, but gave the latter two 4080 names just to "justify" the price tag.

Link to comment
Share on other sites

3 hours ago, Ghost_MH said:

 

Hey, I'm not the only one who thought this. It really felt like they unveiled a 4090, 4070, and 4060, but gave the latter two 4080 names just to "justify" the price tag.

 

Someone did mention that. Isn't the card with the 192-bit bus and the difference in shaders normally like an xx60 Ti?

Link to comment
Share on other sites

10 minutes ago, Zaku3 said:

Someone did mention that. Isn't the card with the 192-bit bus and the difference in shaders normally like an xx60 Ti?

 

Yes, that matches the 3060, NOT the 3060 Ti. The 3060 Ti is 256-bit. The 2060 was also 192-bit, with the 2060 Super being 256-bit.
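Quick napkin math on why the bus width matters; the 21 Gbps memory speed below is just an assumed figure for illustration, not a confirmed spec for any of these cards:

```python
# Rough illustration of how bus width drives memory bandwidth. The 21 Gbps
# data rate is an assumption for the example, not an official spec.

def bandwidth_gb_s(bus_width_bits, data_rate_gbps):
    """Peak memory bandwidth in GB/s: (bus width in bytes) x per-pin data rate."""
    return (bus_width_bits / 8) * data_rate_gbps

for bus_width in (192, 256, 384):
    print(f"{bus_width}-bit bus @ 21 Gbps ~= {bandwidth_gb_s(bus_width, 21):.0f} GB/s")
```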

 

This is what I'm saying. This is a 4090, 4070, and 4060 priced like a 4090, 4080 Ti, and 4080. Now I want to know what the real 4070 and 4060 are going to look like.

 

I currently have a 2080 Ti and was planning to skip the 30 series and pick up a 40 series card. Between everything else that's been going on aside from the prices here, I think Team Red might have won me over. This would literally be my first not-Nvidia card since my old Radeon 9700 Pro so many years ago.

 

Well, let's see if they have anything that'll interest me enough to move on from my 2080 Ti in November. Otherwise, I'll keep this thing running for another year.

Link to comment
Share on other sites

13 minutes ago, Ghost_MH said:

 

Yes, that matches the 3060, NOT the 3060 Ti. The 3060 Ti is 256-bit. The 2060 was also 192-bit, with the 2060 Super being 256-bit.

 

This is what I'm saying. This is a 4090, 4070, and 4060 priced like a 4090, 4080 Ti, and 4080. Now I want to know what the real 4070 and 4060 are going to look like.

 

I currently have a 2080 Ti and was planning to skip the 30 series and pick up a 40 series card. Between everything else that's been going on aside from the prices here, I think Team Red might have won me over. This would literally be my first not-Nvidia card since my old Radeon 9700 Pro so many years ago.

 

Well, let's see if they have anything that'll interest me enough to move on from my 2080 Ti in November. Otherwise, I'll keep this thing running for another year.

 

The thing that's worth noting, however, is that the 16GB 4080 is being compared with the 3080 Ti and advertised as 2-4x faster, while the 4090 is being advertised as 2-4x faster than the 3090 Ti. So, regardless of what's on paper, if those comparisons are indeed true, then the 16GB is indeed a 4080, as the 3080 Ti and 3090 Ti are barely different in terms of in-game performance (the average difference at 4K is around 5% IIRC).
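Just to put a number on that, taking the advertised low-end multiplier and the ~5% gap at face value (both big assumptions):

```python
# Taking the marketing multipliers at face value, which is an assumption.
advertised_4080_vs_3080ti = 2.0  # low end of the "2-4x" claim
ti_gap_at_4k = 1.05              # 3090 Ti roughly 5% ahead of the 3080 Ti at 4K

implied_4080_vs_3090ti = advertised_4080_vs_3080ti / ti_gap_at_4k
print(f"Implied 16GB 4080 vs 3090 Ti: ~{implied_4080_vs_3090ti:.2f}x")
```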

  • True 1
Link to comment
Share on other sites

11 minutes ago, Ghost_MH said:

 

Yes, that matches the 3060, NOT the 3060 Ti. The 3060 Ti is 256-bit. The 2060 was also 192-bit, with the 2060 Super being 256-bit.

 

This is what I'm saying. This is a 4090, 4070, and 4060 priced like a 4090, 4080 Ti, and 4080. Now I want to know what the real 4070 and 4060 are going to look like.

 

I currently have a 2080 Ti and was planning to skip the 30 series and pick up a 40 series card. Between everything else that's been going on aside from the prices here, I think Team Red might have won me over. This would literally be my first not-Nvidia card since my old Radeon 9700 Pro so many years ago.

 

Well, let's see if they have anything that'll interest me enough to move on from my 2080 Ti in November. Otherwise, I'll keep this thing running for another year.

 

The cards should be good; the question is how much performance we're going to get. The non-DLSS games show the usual generational leap. If RDNA3 offers the same or more at lower wattage, it'd be a W. Though I'd expect a 7800 XT to be $799.99 or something because of Nvidia.

Link to comment
Share on other sites

13 minutes ago, Spork3245 said:

 

The thing that's worth noting, however, is that the 16GB 4080 is being compared with the 3080 Ti and advertised as 2-4x faster, while the 4090 is being advertised as 2-4x faster than the 3090 Ti. So, regardless of what's on paper, if those comparisons are indeed true, then the 16GB is indeed a 4080, as the 3080 Ti and 3090 Ti are barely different in terms of in-game performance.

 

This is where I’m at. Comparisons to previous spec branding aside, if the 16GB 4080 performs around 2x a 3080 Ti as standard and up to 4x with DLSS/RT, then I have very little problem considering it a proper successor to the 3080.

 

It sounds like people are expecting AMD to have a card more than twice as fast as a 3080 for $800. I’m not sure that will happen. But without being able to compete with DLSS 3, I still wouldn’t be able to spend $800 on that.
 

What I’ve realized is that while not every game has DLSS, the games that actually need DLSS very often do have it. The other games usually just run great anyway. And RT is also what you normally need the most power for. Basically, Nvidia is best for the games that actually use the power. AMD might be a more cost-efficient way of running the games that don’t at super high frame rates, though.

Link to comment
Share on other sites

One thing I’m noticing is that even the 4090 looks like it only has a single 8-pin connector in pics from both ASUS and MSI… this can’t be correct, can it?


EDIT: Wait, this may be a new/different power connector? Looks like some supply an adapter that takes four 8-pin PCIe connectors… (yeah, it’s the new ATX 3.0 power connector)

 

EDIT2: BTW, I really like the RGB on my EVGA FTW3; it looks like Galax currently has the best design to replace that.

Link to comment
Share on other sites

1 hour ago, Spork3245 said:

One thing I’m noticing is that even the 4090 looks like it only has a single 8-pin connector in pics from both ASUS and MSI… this can’t be correct, can it?


EDIT: Wait, this may be a new/different power connector? Looks like some supply an adapter that takes four 8-pin PCIe connectors… (yeah, it’s the new ATX 3.0 power connector)

 

EDIT2: BTW, I really like the RGB on my EVGA FTW3; it looks like Galax currently has the best design to replace that.

 

All 4090s will use this new 12-pin PCIe 5.0 connector, so every card will come with an adapter. It's highly recommended to have a PSU well above what you need when using this adapter with a non-ATX 3.0 power supply. 1000W should be safe, and 850W is perhaps the bare minimum you can safely use. A new ATX 3.0 power supply has much stricter requirements to meet, so you don't have to go quite as far above and beyond on wattage.
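For anyone doing napkin math on an older, non-ATX 3.0 PSU, here's a rough sketch; the 450W card draw, 250W rest-of-system estimate, and 1.5x transient factor are assumed numbers for illustration only, not official requirements:

```python
# Very rough headroom check. The card, system, and transient numbers below are
# illustrative assumptions, not official specs or requirements.

def psu_headroom_w(psu_watts, gpu_watts, rest_of_system_watts, transient_factor=1.5):
    """Remaining wattage after allowing for GPU transient spikes."""
    worst_case = gpu_watts * transient_factor + rest_of_system_watts
    return psu_watts - worst_case

for psu in (750, 850, 1000):
    headroom = psu_headroom_w(psu, gpu_watts=450, rest_of_system_watts=250)
    if headroom >= 0:
        verdict = "should be safe"
    elif headroom >= -100:
        verdict = "probably the bare minimum"
    else:
        verdict = "too tight for comfort"
    print(f"{psu}W PSU: {headroom:+.0f}W of headroom -> {verdict}")
```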

Link to comment
Share on other sites

15 minutes ago, cusideabelincoln said:

 

All 4090s will use this new 12-pin PCIe 5.0 connector, so every card will come with an adapter. It's highly recommended to have a PSU well above what you need when using this adapter with a non-ATX 3.0 power supply. 1000W should be safe, and 850W is perhaps the bare minimum you can safely use. A new ATX 3.0 power supply has much stricter requirements to meet, so you don't have to go quite as far above and beyond on wattage.

 

I have a 1300W EVGA G2. I should be fine. Should be :isee:

Link to comment
Share on other sites
