Should we actually be excited about the new Nvidia cards?



I don't know.  Should I be excited?  Ray tracing looks like the new hotness...  However, I'm still going to play a lot of "traditionally rendered" games.

 

GTX 1070 -- launched for $379 in 2016 w/ 5.8 TFLOPS; RTX 2070 -- launched for $499 in 2018 w/ 6.5 TFLOPS

GTX 1080 -- launched for $599 in 2016 w/ 8.2 TFLOPS; RTX 2080 -- launched for $699 in 2018 w/ 8.9 TFLOPS

GTX 1080 Ti -- launched for $699 in 2017 w/ 10.6 TFLOPS; RTX 2080 Ti -- launched for $999 in 2018 w/ 11.8 TFLOPS
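
For anyone who wants to sanity-check those throughput numbers: peak FP32 is just CUDA cores × clock × 2 (one fused multiply-add per core per clock). A quick sketch using the published core counts and base clocks -- spec-sheet figures, so treat the output as approximate:

```python
# Peak FP32 throughput = cores * clock * 2 FLOPs (one FMA) per clock.
# Core counts / base clocks from Nvidia's public spec sheets.
cards = {
    # name: (CUDA cores, base clock in MHz, launch price in USD)
    "GTX 1070":    (1920, 1506, 379),
    "RTX 2070":    (2304, 1410, 499),
    "GTX 1080":    (2560, 1607, 599),
    "RTX 2080":    (2944, 1515, 699),
    "GTX 1080 Ti": (3584, 1480, 699),
    "RTX 2080 Ti": (4352, 1350, 999),
}

for name, (cores, mhz, price) in cards.items():
    tflops = cores * mhz * 1e6 * 2 / 1e12
    print(f"{name:12s} {tflops:4.1f} TFLOPS  ${price:4d}  ${price / tflops:5.2f}/TFLOP")
```

Run that and dollars-per-TFLOP actually goes *up* generation over generation, which is exactly the problem.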

 

It will be interesting to see how these cards perform in games -- but they cost a LOT more, are coming out ~2 years later, and are only marginally faster on paper. In fact, a GTX 1080 might actually be faster than an RTX 2070.

 

Now for ray tracing: there are some big games that will support it (Shadow of the Tomb Raider, Battlefield V), but support is far from ubiquitous. We could be a year away from it showing up in most games, which means we may really be buying these cards for the Xmas 2019 lineup.

 

Now, here is where I get into full speculation -- not based on fact, just a theory that may make sense only to me: this Nvidia generation could be short-cycled and last only about a year.

1)  Last gen was extended as Nvidia (and AMD) rode the cryptocurrency boom

2)  GDDR6 manufacturing may have ramped later than anticipated, delaying the launch

3)  The XX80 Ti is launching at the same time as the XX80 (it had been launching later and later each gen for the last 3 gens, and was 10 months later for the 1080 Ti)

4)  The 10XX generation lasted over 2 years (instead of the ~1.5-year gens before it)

So could a 2180 be out by Xmas 2019?

 

I was uber excited about these cards earlier today, but after some sober second thought, I think I may save up and wait for a 2180 Ti.

 

(I could easily flip-flop after reading reviews though...)


4 minutes ago, Keyser_Soze said:

Ray tracing is the new hotness but I feel like it's been around for a while. At least the term has.

 

It’s what they use in movies, even in OG Toy Story, so... yea, it’s “been around for a while”. :p

It’s generally very taxing for real-time rendering; that’s what’s exciting about the “RTX” tech: it makes ray tracing feasible on a greater scale than past implementations (some older games ray-traced single objects, like clouds, as a gimmick to say “hey, look!”).

 

Anyway, yea, the prices on the new cards are insane and removed all excitement for me.


9 hours ago, Keyser_Soze said:

Ray tracing is the new hotness but I feel like it's been around for a while. At least the term has.

Real-time ray tracing is the new hotness.  I remember people doing ray tracing on the Amiga back in the 80s -- but they were very simple images that took an hour to render.
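
If you never played with one of those old renderers, the core of a ray tracer is genuinely tiny: fire one ray per pixel, intersect it against the scene, shade the nearest hit. A throwaway sphere-only toy (my own sketch; nothing to do with how RTX does it under the hood):

```python
import math

def normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def ray_sphere(origin, d, center, radius):
    """Distance t along unit ray d to the sphere, or None on a miss."""
    oc = tuple(o - c for o, c in zip(origin, center))
    b = 2.0 * sum(dc * occ for dc, occ in zip(d, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * c              # a == 1 because d is unit length
    if disc < 0.0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0.0 else None

W, H = 64, 32
CENTER, RADIUS = (0.0, 0.0, -2.0), 1.0
LIGHT = normalize((1.0, 1.0, 1.0))      # directional light, up-right-behind us
SHADES = " .:-=+*#%@"

# One primary ray per pixel. Real renderers add shadow rays plus recursive
# reflection/refraction rays per hit -- that multiplication is the costly part.
for y in range(H):
    row = []
    for x in range(W):
        d = normalize(((x / W - 0.5) * 2.0, 0.5 - y / H, -1.0))
        t = ray_sphere((0.0, 0.0, 0.0), d, CENTER, RADIUS)
        if t is None:
            row.append(" ")
        else:
            hit = tuple(t * dc for dc in d)
            n = normalize(tuple(h - c for h, c in zip(hit, CENTER)))
            lam = max(0.0, sum(nc * lc for nc, lc in zip(n, LIGHT)))
            row.append(SHADES[int(lam * (len(SHADES) - 1))])
    print("".join(row))
```

Even that toy does W x H intersection tests for one object and one bounce; scale up to real scene complexity, shadow rays, and recursive reflections and you can see why an Amiga needed an hour per frame -- and why doing it 60 times a second is a big deal.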


The tech is indeed exciting.  It's probably the most exciting tech in over a decade.  

 

But the cards are not exciting -- for the simple fact that we are old hags who have seen it all. The first iteration of any new tech has never delivered on its promises. It takes two or three generations (or it never happens) for performance to catch up to the tech; only once the performance is there can developers actually implement it widely. Just look at tessellation, PhysX, etc. The first games to use them did so in less than stellar fashion, and it took a few generations before cards were fast enough to use those features properly.

 

And in addition, for those of you running at 4K, these cards aren't really going to move the needle much. So I'm sure those of you hoping for some headroom in future 4K titles are even more disappointed. It seems you won't be running ray tracing at 4K anyway. Sacrifices, more sacrifices!


Yes, absolutely. We've been on a run of diminishing returns for graphics for some time now. The only way out is to break free of old standards. Nvidia's complete embracing of a combination of raster + ray trace + deep learning is a very promising direction to take.

 

No, it won't be perfect out of the gate, but you're still getting top of the line cards and as a community we ought to be encouraging this direction for our hobby.
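
To make "raster + ray trace + deep learning" concrete, the hybrid frame Nvidia is pitching looks roughly like the outline below. This is a schematic with toy stand-ins (random data for the raster pass, a box blur standing in for the trained denoiser), not any real engine's API:

```python
import numpy as np

H, W = 1080, 1920
RT_SCALE = 2  # trace reflections at half resolution in each axis

def rasterize_gbuffer(h, w):
    """Raster pass: cheap and well-understood. Stand-in: random albedo."""
    rng = np.random.default_rng(0)
    return {"albedo": rng.random((h, w, 3))}

def trace_rays(gbuffer):
    """Ray-traced pass (reflections/shadows/GI) at reduced resolution with
    few samples per pixel -- so the raw output is sparse and noisy."""
    rng = np.random.default_rng(1)
    signal = gbuffer["albedo"][::RT_SCALE, ::RT_SCALE] * 0.5
    return signal + rng.normal(0.0, 0.1, signal.shape)  # low-spp noise

def denoise_and_upscale(noisy, h, w):
    """Deep-learning denoiser/upscaler stand-in: box blur + nearest upsample."""
    k = 3
    padded = np.pad(noisy, ((k, k), (k, k), (0, 0)), mode="edge")
    blurred = sum(padded[i:i + noisy.shape[0], j:j + noisy.shape[1]]
                  for i in range(2 * k + 1)
                  for j in range(2 * k + 1)) / (2 * k + 1) ** 2
    return np.repeat(np.repeat(blurred, RT_SCALE, 0), RT_SCALE, 1)[:h, :w]

# One hybrid frame: raster most of the image, ray trace only what raster is
# bad at, then let a trained network clean up the sparse, noisy result.
g = rasterize_gbuffer(H, W)
rt = trace_rays(g)
frame = g["albedo"] * 0.7 + denoise_and_upscale(rt, H, W) * 0.3
print(frame.shape)  # (1080, 1920, 3)
```

The interesting bit is the last stage: tracing few, noisy rays and letting a trained denoiser clean them up is what makes the whole pipeline fit in a real-time budget at all.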


Hands on: Nvidia GeForce RTX 2080 Ti review

"Early verdict

Whether you’re a PC gamer who was waiting in the wings for a more powerful graphics card or you truly believe in Nvidia’s vision of a ray traced future, the GeForce RTX 2080 Ti is already looking like the most powerful graphics card in the world, regardless.

That said, you shouldn’t automatically jump on the pre-order button.

We still have plenty of unanswered questions about power consumption, when we’ll actually see the lower-priced versions and their shortcomings, new multi-card SLI compatibility and, of course, benchmarks. Be sure to keep your eyes locked to TechRadar, as it won’t be long before we fully review this new heir to the kingdom of graphics cards.

Nvidia Turing could really shake things up in the world of the best graphics cards"


NVIDIA RTX 2080 Ti & 2080 Benchmarks Coming Soon, Cards Already in The Hands of a Lucky Few

"Word has reached us that reviewers returning from Gamescom are going to be able to start testing the new toys straight away. NVIDIA’s reviewer’s briefing is taking place tomorrow where the company is expected to hand over a working driver to reviewers, so it’s likely they could begin running benchmarks no later than this weekend.

It’s still not clear when exactly the NDA will lift and reviews are allowed to go live, but with a working driver, benchmarks are bound to start leaking online. Even though full reviews may be a week or two off, you can expect to get a very good idea of how the RTX 2080 Ti and RTX 2080 perform much sooner. As always, stay tuned for more."


Why You Shouldn’t Buy Nvidia’s RTX 20-Series Graphics Cards (Yet)

"If you already have a 10-series card that's adequate, it may make sense to skip this card generation entirely and see what kind of ray tracing and Tensor Core support is like come time for the 30-series line. By that time, it will also be clear what kind of a response AMD is going to muster to battle Nvidia's Turing juggernaut. Intel might have something serious to say about high-end gaming by that time as well. "

 


18 hours ago, legend said:

Yes, absolutely. We've been on a run of diminishing returns for graphics for some time now. The only way out is to break free of old standards. Nvidia's complete embracing of a combination of raster + ray trace + deep learning is a very promising direction to take.

 

No, it won't be perfect out of the gate, but you're still getting top of the line cards and as a community we ought to be encouraging this direction for our hobby.

I mostly agree with this sentiment, but purely as a consumer and early adopter of tech, I'd caution against buying into the first generation, which this effectively is. I want to see it succeed, and I want to see ray tracing become the norm, but I think there is enough industry incentive to make that happen. It's a differentiating feature unlike anything we've seen in graphics tech in recent memory, making it an obvious profit incentive for Nvidia. It should also simplify lighting models for game creators, giving them an unusual incentive for adoption (at least compared to other high-end features), especially once it's incorporated into the standard game engines out there.

 

The only issue facing its eventual adoption is how far behind AMD is. If AMD can compete on this in relatively short order (a couple of years at most), I can't think of a reason it won't become standard.


2 hours ago, TwinIon said:

I mostly agree with this sentiment, but purely as a consumer and early adopter of tech, I'd caution against buying into the first generation, which this effectively is. I want to see it succeed, and I want to see ray tracing become the norm, but I think there is enough industry incentive to make that happen. It's a differentiating feature unlike anything we've seen in graphics tech in recent memory, making it an obvious profit incentive for Nvidia. It should also simplify lighting models for game creators, giving them an unusual incentive for adoption (at least compared to other high-end features), especially once it's incorporated into the standard game engines out there.

 

The only issue facing its eventual adoption is how far behind AMD is. If AMD can compete on this in relatively short order (a couple of years at most), I can't think of a reason it won't become standard.

 

I'm not especially keen on hoping it progresses without consumer support, but it is possible developers and Nvidia will try to force it through regardless. I would certainly be more comfortable if that ends up being the case!

 

I'm also still confused that more people aren't excited about this, yet were excited by the boring gains the 10 series brought.

 

 


23 minutes ago, Dre801 said:

I max out at 1920 x 1200 screens; hell, I'm looking at just getting a 1060 or 1070 for that. Not really excited.

 

You won't be maxing it out if devs start embracing the hardware of the 20x line. A 'low' resolution like 1080P won't save you.

 

If devs don't then yeah, you'll be fine.


5 hours ago, Reputator said:

 

LOL what developer would be that dumb?

 

? It's not a controversial statement that a developer's max settings would strive to take advantage of the latest cards, with the consequence that max settings are only really achievable on the best cards. That's not dumb; that's exactly what we want devs to be doing. Having max settings ceilinged by older cards sucks.

 

Having demanding max settings doesn't mean the game won't run on older systems on lower than max settings.


On 8/23/2018 at 12:47 PM, legend said:

 

You won't be maxing it out if devs start embracing the hardware of the 20x line. A 'low' resolution like 1080P won't save you.

 

If devs don't then yeah, you'll be fine.

I don't think you need to worry about that.  Ray Tracing will be bolted on for a few years at least and unless we see PS5/Xbox 4 adoption, it will matter less and will continue to be an "nVidia Feature" until other people catch up.

 

Normal rasterization isn't going anywhere for the next 5 years at least IMO.  

 

 


14 minutes ago, Mithan said:

I don't think you need to worry about that.  Ray Tracing will be bolted on for a few years at least and unless we see PS5/Xbox 4 adoption, it will matter less and will continue to be an "nVidia Feature" until other people catch up.

 

Normal rasterization isn't going anywhere for the next 5 years at least IMO.  

 

 

 

I don't think that's true, unless you count anything short of a completely ray-traced pipeline (which won't happen for many graphics generations) as "bolted on".

 

The stuff we're seeing out of the gate, with almost no development time, is already quite promising, and it's clear there's a huge range of software-side improvement still to come that developers are excited about. It also has support from DirectX and from Epic in UE4.

 

The only way I see us not getting substantial growth in this tech is if the cards are a major commercial failure.


On 8/31/2018 at 8:09 AM, legend said:

 

I don't think that's true, unless you count anything short of a completely ray-traced pipeline (which won't happen for many graphics generations) as "bolted on".

 

The stuff we're seeing out of the gate, with almost no development time, is already quite promising, and it's clear there's a huge range of software-side improvement still to come that developers are excited about. It also has support from DirectX and from Epic in UE4.

 

The only way I see us not getting substantial growth in this tech is if the cards are a major commercial failure.

 

I think in time it will go that route, but I don't see this as something people "must have" today or in the very near future (the next 12 months). I also think the performance hit is more than the vast majority of people will tolerate.

 

You might disagree, but I am one of those people who loves graphical fidelity and will drop $1000 on a new video card -- yet I still want a decent frame rate. I don't see the point in dropping $1600 Canadian on the 2080 Ti when it will give me great graphics at about 30-40 FPS, not after playing almost every game this past year maxed out at 120+ FPS. I just don't see the trade-off being worth it, so I will skip it until it IS worth it, which I suspect will be a generation away -- late 2019 or 2020.

 

I like the tech, and it is the future, but at this point the prices are extremely high.


1 hour ago, Mithan said:

 

I think in time it will go that route, but I don't see this as something people "must have" today or in the very near future (the next 12 months). I also think the performance hit is more than the vast majority of people will tolerate.

 

You might disagree, but I am one of those people who loves graphical fidelity and will drop $1000 on a new video card -- yet I still want a decent frame rate. I don't see the point in dropping $1600 Canadian on the 2080 Ti when it will give me great graphics at about 30-40 FPS, not after playing almost every game this past year maxed out at 120+ FPS. I just don't see the trade-off being worth it, so I will skip it until it IS worth it, which I suspect will be a generation away -- late 2019 or 2020.

 

I like the tech, and it is the future, but at this point the prices are extremely high.

 

I'm also a framerate whore about a steady 60 fps, and I loathe micro-stutter and tearing, so I always optimize for that. But every indication is that 1080p (the original context of my comment to Dre, who said he wanted to stay at 1080p) with ray tracing features at 60 FPS is absolutely going to be possible quite frequently, and it will only get more possible as developers start optimizing for the hardware. The DF DICE videos do a good job of making that clear.

 

Now, if you're speaking outside the original context of the 1080p discussion, want 4K at 60 fps, and would rather have that than ray tracing, then this is *still* a great card to get, because this is the first line of cards that seems to make 4K at conventional ultra raster settings extremely reliable. And because the ray-tracing resolution can be lower than the screen resolution, as time goes on you'd probably still be able to do 4K ultra at 60 with some degree of ray tracing on as well.
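
That "ray-tracing resolution can be lower than the screen resolution" point is easy to put numbers on. Taking Nvidia's ~10 gigarays/s headline figure for the 2080 Ti at face value (it's a marketing number, so very loosely):

```python
BUDGET = 10e9  # Nvidia's ~10 gigarays/s headline figure for the RTX 2080 Ti

def rays_per_second(w, h, fps, spp, bounces):
    """Ray cost of tracing every pixel at the given rate."""
    return w * h * fps * spp * bounces

full_4k  = rays_per_second(3840, 2160, 60, 4, 4)  # trace at display res
half_res = rays_per_second(1920, 1080, 60, 4, 4)  # trace at half res, upscale

print(f"4K traced, 4K shown:       {full_4k / 1e9:.1f} Grays/s "
      f"({100 * full_4k / BUDGET:.0f}% of budget)")
print(f"half-res traced, 4K shown: {half_res / 1e9:.1f} Grays/s "
      f"({100 * half_res / BUDGET:.0f}% of budget)")
```

Quadrupling the displayed pixels quadruples the ray bill (roughly 80% of that budget versus 20% in this toy accounting), so decoupling the traced resolution from the displayed resolution buys back a huge chunk of headroom.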

 

Really, no matter how much you care about each kind of graphical feature, this is a winning card.

 

The prices are high, there is no doubt about that. But if you're the kind of person who is willing to spend 1K on gaming hardware, there hasn't been a card that gives a better reason to do that in a very long time.
