
Nvidia Gamescom Conference Today - 2080 series to be announced


Mr.Vic20

Recommended Posts

1 hour ago, legend said:

 

Okay, let's review. I was legitimately trying to have a discussion with you. Then you accuse me of putting words in your mouth when I've put in an effort to carefully look at what you said and respond to it.  So I give you back a little bit of sass in kind and now you're upset with that? Really?

 

But yes, you are engaging in special pleading. You took a hardline stance that because a card won't always max out a game at 60fps, it couldn't be called a 4K/60 card. I showed how this led to the conclusion that basically no card is ever safe in that sense, and suddenly you're willing to grant exceptions. There's no more reason to disqualify ray tracing as a critical aspect of maxing out a game than there is literally any other graphical feature. In fact, because ray tracing can have impacts on how you play the game, it's arguably a more worthy feature to be concerned with than a higher-res texture map, or foliage draw distance, or hairworks, etc. When you take a hardline position in opposition to someone's claim, and then add ad hoc exceptions whenever it suits you, that's special pleading.

 

Of course I'm with you that disqualifying the 1080Ti as a 1080/60 card because it won't be able to do ray tracing--or any number of potentially really cool graphical features these new cards will afford--is silly.  But that's the point: an absolutist, simple criterion for this term is not a useful criterion. Unless you want to also deny the 1080Ti being a 1080/60 card, you're in no position to reject Crispy taking a slightly more nuanced definition.

 

That's all I'll further say on the matter.


Hit the nail on the head.

I'll reiterate one thing this doesn't touch on.  He said "if 4k 30fps is acceptable, might as well save money and get an Xbone X."  Technically, it's still hit or miss for reaching 4k on average (particularly in 3rd party titles).  No biggie in itself, and typically something I'd brush past since it's still ballpark to 4k most often.  But then he insisted that the RTX 2080ti isn't a 4k60 piece of kit unless it can do both 4k and 60fps consistently at PC's ultra settings.

 

That double standard is another big reason why his 'firm' stance makes little sense.  He hasn't applied it consistently.
Not for new rendering effects.  And not to last year's new console either.

Link to comment
Share on other sites

1 hour ago, legend said:

 

Okay, let's review. I was legitimately trying to have a discussion with you. Then you accuse me of putting words in your mouth when I've put in an effort to carefully look at what you said and respond to it.  So I give you back a little bit of sass in kind and now you're upset with that? Really?

 

...because you did put words in my mouth and insinuated something regarding RT effects on older gen cards that I never stated. I’m, uhh... sorry (?) that you somehow found this to be “sassy” and took it as a reason to begin insulting me...?

 

Quote

But yes, you are engaging in special pleading.

 

I’m not, though; my stance on these things has been consistent for years.

 

Quote

 

You took a hardline stance that because a card won't always max out a game at 60fps, it couldn't be called a 4K/60 card. I showed how this led to the conclusion that basically no card is ever safe in that sense, and suddenly you're willing to grant exceptions.

 

I always granted these exceptions and stated as much, using AMD and hairworks as an example.

 

Quote

 

There's no more reason to disqualify ray tracing as a critical aspect of maxing out a game than there is literally any other graphical feature.

 

*On cards that support it

 

Quote

 

In fact, because ray tracing can have impacts on how you play the game, it's arguably a more worthy feature to be concerned with than a higher res texture map, or foliage draw distance, or hairworks, etc.

 

Which is why I wouldn’t recommend purchasing a 10xx series card at this point in time. Just like I wouldn’t have recommended a DX7 GeForce 2 after the DX8 GeForce 3 released.

I do hold RT in its own performance spectrum, however. As I would likely state (again) once RT is available for use in games, the 2080 Ti is practically/almost/close to a 4k/60 card in non-RT games and a 1080p/60 card for RT-capable games. I really fail to see what’s outlandish or a stretch in that statement.

 

Quote

 

When you take a hardline position in opposition to someone's claim, and then add ad hoc exceptions whenever it suits you, that's special pleading.

 

I’m not “adding exceptions”: a card cannot render hardware-based effects that it doesn’t support, so how can I hold that against its capable resolution/fps? If the 1080 Ti were a new card that had just released alongside the 20xx series and was from the same manufacturer (as it is), I would rip it apart as if it were the GeForce4 MX reborn, though.

 

Quote

Of course I'm with you that disqualifying the 1080Ti as a 1080/60 card because it won't be able to do ray tracing--or any number of potentially really cool graphical features these new cards will afford--is silly.  But that's the point: an absolutist, simple criterion for this term is not a useful criterion. Unless you want to also deny the 1080Ti being a 1080/60 card, you're in no position to reject Crispy taking a slightly more nuanced definition.

 

I never stated my opinion or stance on the matter was “absolutist” or that it did not have exceptions for certain effects. That’s something you assumed and then willfully ignored my clarifications on, with your insistence on your 1080 Ti example. Once I clarified, you accused me of “pleading” and now of “ad hoc exceptions”, based on your own assumptions about what you think my stance is.

Simply put, I do not hold the performance of unavailable features against what a card is actually capable of, especially if it’s a card made before said effect existed; however, I would also be unlikely to recommend purchasing a card that is missing said features. If the 1080 Ti could do RT in RTX games, then I would hold it against it, but, again, in regards to RT, I place it in its own category of performance simply due to its demanding nature and limited availability in games.

 

Quote

That's all I'll further say on the matter.

 

I kinda-sorta doubt this. :p 

 

Link to comment
Share on other sites

1 hour ago, crispy4000 said:


Hit the nail on the head.

I'll reiterate one thing this doesn't touch on.  He said "if 4k 30fps is acceptable, might as well save money and get an Xbone X."  Technically, it's still hit or miss for reaching 4k on average (particularly in 3rd party titles).  No biggie in itself, and typically something I'd brush past since it's still ballpark to 4k most often.  But then he insisted that the RTX 2080ti isn't a 4k60 piece of kit unless it can do both 4k and 60fps consistently at PC's max settings.

 

That double standard is another big reason why his 'firm' stance makes little sense.  He hasn't applied it consistently.
Not for new rendering effects.  And not to last year's new console either.

 

Shhhhh, adults are speaking.

Speaking of firm stances, you’re still okay with “sacrifices” for the 2080 Ti but disqualify the same sacrifices for the Xbone. Okay. I’m glad you believe in your own argument. :thumbup:

Link to comment
Share on other sites

2 hours ago, Brick said:

 

Yeah but if the 2080Ti only has HDMI ports on it, I'll need one of those converters, yes? It's only a 60Hz monitor, but it is 1440p. 

 

Wait, what? Why in the world would the 2080 Ti only be HDMI? Furthermore, a digital-to-digital port conversion is simple and just a pass-through (i.e., HDMI to DisplayPort, or HDMI to DVI-D, or any combo of the three should be just a simple connection adapter). Digital-to-analog requires an actual hardware/software-based conversion of the signal, which is where those expensive and lag-inducing converters come into play. I have a friggen CRT Sony GDM-FW900, bro. :p

If you have a 16:9 monitor, it’s HIGHLY unlikely that you have an analog based LCD monitor.

Link to comment
Share on other sites

17 minutes ago, Spork3245 said:

 

Shhhhh, adults are speaking.

Speaking of firm stances, you’re still okay with “sacrifices” but disqualify the same sacrifices for the Xbone. Okay. :thumbup:


So you're resorting to ad hominems now.  Great.

And no, the sacrifices 2080ti owners may make to get a consistent 4k60 in some titles will, typically, not be the same as those X1X owners often see to hit 30fps at 4k-ish.

Link to comment
Share on other sites

9 minutes ago, crispy4000 said:


So you're resorting to ad hominems now.  Great.

 

I mean, you’re resorting to piggybacking on an actual response, so why not?

 

Quote

And no, the sacrifices 2080TI owners make to get a consistent 4k60 will, typically, not be the same as those X1X owners often see to hit 30fps at 4k-ish.

 

Lowering settings and using dynamic resolution?

 

BTW, please show me where I actually stated that the 2080 Ti was a “piece of kit” or even a bad card, since you put it in quotation marks? My posts have been in regards to my disappointment that it cannot do 4k/60 ultra in games that it is releasing alongside, and how I believe a $1k card should be giving me all the eye candy in current games, not that the card is a POS.

Link to comment
Share on other sites

2 hours ago, Spork3245 said:

 

BTW, please show me where I actually stated that the 2080 Ti was a “piece of kit” or even a bad card, since you put it in quotation marks?

 

lol, the only person who put piece of kit in quotations is you, in this reply.  Do a page search.

I also never said you explicitly thought the 2080 Ti was a bad card.  Just that you've contradicted yourself.
 

2 hours ago, Spork3245 said:

Lowering settings and using dynamic resolution?

 

To X1X levels for intensive games, no.  Not likely to be the same sacrifices.
 

2 hours ago, Spork3245 said:

 

I mean, you’re resorting to piggybacking on an actual response, so why not?

 

I was mentioned in the response too, and wanted to clarify there was more to our disagreement than a semantics debate.

 

Besides, this is a message board.  People jump in and out of others' conversations and take sides all the time.

Link to comment
Share on other sites

4 hours ago, Spork3245 said:

 

Wait, what? Why in the world would the 2080 Ti only be HDMI? Furthermore, a digital-to-digital port conversion is simple and just a pass-through (i.e., HDMI to DisplayPort, or HDMI to DVI-D, or any combo of the three should be just a simple connection adapter). Digital-to-analog requires an actual hardware/software-based conversion of the signal, which is where those expensive and lag-inducing converters come into play. I have a friggen CRT Sony GDM-FW900, bro. :p

If you have a 16:9 monitor, it’s HIGHLY unlikely that you have an analog based LCD monitor.

 

Didn't the 1080Ti not have DVI ports? Either way, yeah I was talking about an adapter to connect DVI to HDMI. I think we confused ourselves here.

Link to comment
Share on other sites

9 hours ago, crispy4000 said:

Just that you've contradicted yourself.

 

Except that I haven’t; my stance on this has been consistent since the GeForce 256 era. You just think I have because you disagree with my position and would rather argue. :peace:

If you think I’m contradicting myself because of the XboneX posts... jfc, the suggestion is and always has been sarcasm and is being used to show the folly of your diminished settings/resolution argument. I stated as such and said the argument for the XboneX being a 4k/30 machine sucks pages ago.

 

9 hours ago, crispy4000 said:

To X1X levels for intensive games, no.  Not likely to be the same sacrifices.

 

So, it’s not lowering settings and using dynamic resolution to achieve a stable framerate? What’s it doing that’s different than your suggestion of lowering settings and using dynamic resolution to get a stable 4k/60 on everything currently out with a 2080 Ti?

Gee, yet I’m the one contradicting myself. :thinking:

 

5 hours ago, Brick said:

 

Didn't the 1080Ti not have DVI ports? Either way, yeah I was talking about an adapter to connect DVI to HDMI. I think we confused ourselves here.

 

It had DisplayPort and HDMI, but came with a DP-to-DVI adapter.

The adapter you were initially talking about, in regards to my own monitor and what we once spoke with Xbob about, is not a digital-to-digital adapter; it’s an actual converter for changing digital signals to analog.

Link to comment
Share on other sites

1 hour ago, Spork3245 said:

 

So, it’s not lowering settings and using dynamic resolution to achieve a stable framerate? 

 


Like I've already said, not to the same levels as the X1X resorts to.  Most of the games that don't average to 60fps at max settings on the 2080ti are only a few frames off the mark. (ex: MH World @ 59fps, Shadow of the Tomb Raider @ 56fps : source)  They'll still look much better than X1X quality if you have to drop a setting or two, marginally, to reach 4k60. 

The X1X just isn't running those same games at a native 4k on average (at 30fps), even with settings dialed back from PC ultra.  MHW, for example, sits at 1728p and reduces shadow detail, ambient occlusion quality, etc.  If you're going to call that a 4k30 machine, as you did, this damn well should be considered a 4k60 card.  All I'm asking is for you to be consistent.



Ghost Recon: Wildlands comes closest to being an exception, afaik.  At 4k ultra, it only manages 47fps on average on the 2080ti.  You'd have to dial back settings to very high, which reaches 65fps.  That's not insubstantial.  But neither is playing it at 1800p30 on the X1X with similar, if not more extensive, cutbacks.

It's probably just a case of Ubisoft making PC ultra settings more intensive than most.
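
(For context on what "dynamic resolution" is actually doing under the hood: it's essentially a feedback loop on GPU frame time. Here's a rough Python sketch of the general idea, with made-up numbers and a simple proportional step rather than any shipping engine's actual heuristic.)

```python
def next_render_scale(scale, gpu_ms, budget_ms=1000 / 60, lo=0.5, hi=1.0):
    # Feedback loop: if the last frame blew the frame-time budget, shrink the
    # render resolution a little; if there was clear headroom, grow it back.
    # The 5% step and the 0.5..1.0 clamp are arbitrary illustration values.
    if gpu_ms > budget_ms:
        scale -= 0.05
    elif gpu_ms < 0.9 * budget_ms:
        scale += 0.05
    return min(hi, max(lo, scale))

# A made-up stretch of GPU frame times (ms) against a 4K (2160p) target:
scale = 1.0
for gpu_ms in [15.0, 18.5, 19.0, 17.2, 16.0, 14.5]:
    scale = next_render_scale(scale, gpu_ms)
    print(f"{gpu_ms:4.1f} ms -> render at {round(2160 * scale)}p")
```

Real engines predict and filter far more carefully, but the compromise is the same: trade pixels for frame time when a scene gets heavy.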

Link to comment
Share on other sites

45 minutes ago, crispy4000 said:


Like I've already said, not to the same levels as the X1X resorts to.  Most of the games that don't average to 60fps at max settings on the 2080ti are only a few frames off the mark. (ex: MH World @ 59fps, Shadow of the Tomb Raider @ 56fps : source)  They'll still look much better than X1X quality if you have to drop a setting or two, marginally, to reach 4k60. 

The X1X just isn't running those same games at a native 4k on average (at 30fps), even with settings dialed back from PC ultra.  MHW, for example, sits at 1728p and reduces shadow detail, ambient occlusion quality, etc.  If you're going to call that a 4k30 machine, as you did, this damn well should be considered a 4k60 card.  All I'm asking is for you to be consistent.



Ghost Recon: Wildlands comes closest to being an exception, afaik.  At 4k ultra, it only manages 47fps on average on the 2080ti.  You'd have to dial back settings to very high, which reaches 65fps.  That's not insubstantial.  But neither is playing it at 1800p30 on the X1X with similar, if not more extensive, cutbacks.

It's probably just a case of Ubisoft making PC ultra settings more intensive than most.

 

You need higher than a 60fps average to maintain a solid 60fps. So, where’s this arbitrary line you’re drawing where the lowered settings/resolution no longer becomes acceptable on the X1X?
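
A quick made-up illustration of that average-vs-solid-60 point in Python (the frame times below are invented for the example, not measurements from any game):

```python
# Made-up frame times (ms): a run that *averages* above 60fps
# but still misses the 16.7 ms budget on a fifth of its frames.
frame_times_ms = [14.0] * 80 + [25.0] * 20

avg_fps = 1000 * len(frame_times_ms) / sum(frame_times_ms)
budget_ms = 1000 / 60
missed = sum(t > budget_ms for t in frame_times_ms)
worst_1pct_ms = sorted(frame_times_ms)[int(len(frame_times_ms) * 0.99)]

print(f"average: {avg_fps:.1f} fps")                          # ~61.7 fps
print(f"frames over the 16.7 ms budget: {missed}/{len(frame_times_ms)}")  # 20/100
print(f"1% worst frame time: {worst_1pct_ms:.1f} ms")          # 25.0 ms (~40 fps)
```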

 

Also,

1 hour ago, Spork3245 said:

If you think I’m contradicting myself because of the XboneX posts... jfc, the suggestion is and always has been sarcasm and is being used to show the folly of your diminished settings/resolution argument. I stated as such and said the argument for the XboneX being a 4k/30 machine sucks pages ago.

*cough*

If you're going to call that a 4k60 gpu, as you did, this damn well should be considered a 4k30 console.  All I'm asking is for you to be consistent.

 

Link to comment
Share on other sites

38 minutes ago, Spork3245 said:

 

You need higher than a 60fps average to maintain a solid 60fps. So, where’s this arbitrary line you’re drawing where the lowered settings/resolution no longer becomes acceptable on the X1X?

 


The framerate you'll encounter most often in the game.  That isn't arbitrary.

And I never said anything about lowered settings/resolution being unacceptable on the X1X.  On the contrary, it's a good thing to make compromises where needed.
 

38 minutes ago, Spork3245 said:

 

Also,

 

*cough*

If you're going to call that a 4k60 gpu, as you did, this damn well should be considered a 4k30 console.  All I'm asking is for you to be consistent.

 

 

I've got no problem calling the X1X a 4k30 console, thank you.  Many games do hit that on average, even some AAA ones.  Several more come close enough to it to make it splitting hairs, or use reconstruction techniques like checkerboarding to reach it.  That's all good enough for me.

 

This card is better equipped to hit 4k60 in games today than the X1X to its target, but you still have a hang up for some reason.  I would have no beef with that if you could admit the X1X isn't 4k30 by the criteria you're choosing to evaluate the 2080ti.
 

Link to comment
Share on other sites

1 hour ago, Mr.Vic20 said:

 

 

Nice. So this is indeed suggesting that the standard way DLSS is going to be used is rasterizing at something lower like 1440 and then using DLSS to superscale it to 4K (as opposed to rasterizing at 4K, superscaling to something higher, and then downsampling it for AA).
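
For what it's worth, here's a minimal Python sketch of that distinction, with nearest-neighbour resampling standing in for the trained DLSS network (render_scene and upscale are hypothetical placeholders for this toy example, not Nvidia's actual pipeline):

```python
import numpy as np

def render_scene(height, width):
    # Stand-in for the rasterizer: just produce an RGB frame of the given size.
    return np.random.randint(0, 256, (height, width, 3), dtype=np.uint8)

def upscale(frame, out_height, out_width):
    # Placeholder for the learned upscaler; DLSS would run a trained network here.
    # This toy version just does nearest-neighbour resampling.
    rows = np.arange(out_height) * frame.shape[0] // out_height
    cols = np.arange(out_width) * frame.shape[1] // out_width
    return frame[rows][:, cols]

# Mode described above: rasterize at 1440p, let the upscaler fill in the 4K pixels.
cheap_4k = upscale(render_scene(1440, 2560), 2160, 3840)

# The alternative ("2x"-style): rasterize at native 4K, superscale even higher,
# then average back down to 4K for anti-aliasing.
native_4k = render_scene(2160, 3840)
super_sampled = upscale(native_4k, 4320, 7680)
aa_4k = super_sampled.reshape(2160, 2, 3840, 2, 3).mean(axis=(1, 3))

print(cheap_4k.shape, aa_4k.shape)  # both (2160, 3840, 3)
```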

Link to comment
Share on other sites

Also, their analogy to the Geforce 3 is precisely the point I've been trying to make. People need to look beyond the immediate results on games as they are when it releases. It's fine to have some skepticism there and want to wait a bit longer to see how strong it really is, but you have to factor it in and thus far all signs are really positive.

Link to comment
Share on other sites

7 minutes ago, crispy4000 said:

I'm probably jumping the gun here, but since that video said that lower resolutions work too ... what about DLSS in a Switch successor?

I can dream.

A few Tensor cores are, in my mind, an inevitability in the Switch successor, provided Nintendo continues their relationship with Nvidia. It would be an excellent low-watt, high-IQ solution for portable tech of all kinds. 

Link to comment
Share on other sites

8 minutes ago, legend said:

Also, their analogy to the Geforce 3 is precisely the point I've been trying to make. People need to look beyond the immediate results on games as they are when it releases. It's fine to have some skepticism there and want to wait a bit longer to see how strong it really is, but you have to factor it in and thus far all signs are really positive.

I think there are a few camps on this subject, and I believe I get the view of each. I'm all about progress in tech, but I understand how the unwritten contract of "this costs X dollars" historically is causing most budget-conscious gamers to say:

 

[animated gif: "this deal is getting worse"]

 

However, I prefer to see it as:

 

[animated gif: "my god, it's full of stars"]

 

People disappointed with the 2080 might well become a fair bit more interested in the card once the DLSS support is there. Linus Tech has made the point that this series seems rushed, and at least from a software perspective, I can see that. But I can also look forward to Q1 of 2019 and think these features will get some solid buzz and pick up rapidly! 

Link to comment
Share on other sites

5 minutes ago, crispy4000 said:

I'm probably jumping the gun here, but since that video said that lower resolutions work too ... what about DLSS in a Switch successor?

I can dream.

 

Depends, I suppose, on how fast Nvidia can miniaturize this architecture with low power demands, and how soon Nintendo might put out a successor.

 

There is hope that miniaturization won't be too far off on Nvidia's side, at least with respect to tensor cores taking a bigger slice, because Nvidia has lots of orthogonal incentives for that. For example, there is already the Jetson Xavier, which is a small low-power SoC that boasts tensor core support.

 

That product's primary role, as you might imagine, is for those devious individuals trying to get deep learning power on robots and sensors :isee: But it also means it could be easy for them to provide something for portable devices like a new Switch successor.

Link to comment
Share on other sites

18 minutes ago, Mr.Vic20 said:

I think there are a few camps on this subject, and I believe I get the view of each. I'm all about progress in tech, but I understand how the unwritten contract of "this costs X dollars" historically is causing most budget-conscious gamers to say:

 

[animated gif: "this deal is getting worse"]

 

However, I prefer to see it as:

 

[animated gif: "my god, it's full of stars"]

 

People disappointed with the 2080 might well become a fair bit more interested in the card once the DLSS support is there. Linus Tech has made the point that this series seems rushed, and at least from a software perspective, I can see that. But I can also look forward to Q1 of 2019 and think these features will get some solid buzz and pick up rapidly! 

 

 

:lol: Yeah I totally understand people on a more sane budget for gaming balking at this price. But fuck it's exciting and those of us willing to spend that kind of cash on this kind of stuff should!

Link to comment
Share on other sites

To me, the 2080 and 2080Ti are the best cards you can buy. 

 

But, at their current prices, I can't justify the purchase with the games I expect to play in the near term.  I'm waiting for the 2180Ti.

 

If I had a 970, or lower, I probably would make a different decision.

 

On a side note, I don't know what a "4k30" console is -- it's an arbitrary adjective that is being made up in this thread.  Arguing over what a made-up term should mean will never go well.

Link to comment
Share on other sites

4 hours ago, crispy4000 said:


The framerate you'll encounter most often in the game.  That isn't arbitrary.

 

 

 

I'm talking about the amount the settings can be lowered and the amount of dynamic resolution that can intercede. :silly: 

 

Quote


And I never said anything about lowered settings/resolution being unacceptable on the X1X.  On the contrary, it's a good thing to make compromises where needed.

 

 

 

But you did say that it's totally A-OK for the 2080 Ti to use lowered settings/resolution... which is my point that you're contradicting yourself by saying it's not okay for the XboneX.

Hmmm, it's almost like you're just being obstinate and arguing for the sole sake of arguing. Nah, couldn't possibly be that!

 

Quote


 

 

I've got no problem calling the X1X a 4k30 console, thank you.  

 

 

Oh, okay, you only have a problem with my sarcastic comments when using your own logic to imply such a statement. Got it!

 

Quote

This card is better equipped to hit 4k60 in games today than the X1X to its target, but you still have a hang up for some reason.  I would have no beef with that if you could admit the X1X isn't 4k30 by the criteria you're choosing to evaluate the 2080ti.
 

 

 

6 hours ago, Spork3245 said:

If you think I’m contradicting myself because of the XboneX posts... jfc, the suggestion is and always has been sarcasm and is being used to show the folly of your diminished settings/resolution argument. I stated as such and said the argument for the XboneX being a 4k/30 machine sucks pages ago.


 

22 hours ago, Spork3245 said:

I'm using your own argument as to how the 2080 Ti is a 4k/60 card for the XboneX being a 4k/30 machine to show you how terrible your argument is. I'm glad we both agree that the argument is just awful.


 

:silly: Wow, it's almost like I never actually called it a 4k/30 machine and stated "If 4k 30fps is acceptable, might as well save money and get an Xbone X" in reply to your statement about lowering settings/resolution/framerate with the 2080 Ti :silly: 

Link to comment
Share on other sites

21 minutes ago, crispy4000 said:

 

 

:shrug:

 

 

Which was in response to:

 

On 9/19/2018 at 1:05 PM, crispy4000 said:

 

 

Still might be good for 4k 30fps though for future titles. 

 

 

That's hardly me calling it a 4k/30 console.

 

Oh, and also this like 2 replies later where I offered the first clarification:

 

On 9/19/2018 at 1:49 PM, Spork3245 said:

 

Except for the XboneX I got for $450 will run games at 4k better than that low-end AMD RX card... So... I don't understand the argument you're trying to make here?

 

EDIT: My comment about the XboneX is if you're spending $1000+ on a 2080 Ti with the goal of 4k 30fps... like... why even bother with the Ti model at that point? The goal of spending the cash for the Ti should be for 4k 60fps.


 

So, yea, as I stated, you're being purposefully obstinate here if you actually think that was me calling it a 4k/30fps console and not making a sarcastic comment about spending $1k on a GPU and being content with reduced settings, resolution and framerate. As I've stated a ridiculous number of times now, the argument for it being a 4k/30 machine is the same as the 2080 ti being a 4k/60 card to me: nonsensical and ridiculous.

You're asking me to be "consistent" and I never wasn't. :| 

Link to comment
Share on other sites

14 minutes ago, Spork3245 said:

 

So, yea, like, as I stated, you're being purposefully obstinate here if you actually think that was me calling it a 4k/30fps console and not making a sarcastic comment about spending $1k on a GPU and being content with reduced settings, resolution and framerate, as I've stated a ridiculous number of times now that the argument for it being a 4k/30 machine is the same as the 2080 ti being a 4k/60 card to me (nonsensical and ridiculous).

 

For the last time, it's not the same degree of setting reductions or resolution dips.  With that card, you wouldn't be sacrificing as much to reach the target.
 

You refuse to weigh that, so whatever.  I'm done.

Link to comment
Share on other sites
