Posts posted by legend

  1. 7 minutes ago, CastlevaniaNut18 said:

    I love PC gaming. I can afford it. 

     

    Doesn't mean I'm willing to throw away that much money, though.

     

    There are gradations to this. If you're already on something like a 1080 Ti, I understand not wanting to drop that much cash all over again. What I don't understand is the lack of excitement for these cards. What do people want Nvidia to do instead? Making big but important changes like this isn't going to be cheap, but I can't think of anything better to encourage.

     

    A routine upgrade to a standard raster card in the same price range as the last set is not exciting, and I'd be quite disappointed to stay stuck in that zone.

     

    But if we don't want to be stuck there, we have to put our money on the future we want to see.

  2. Yes, absolutely. We've been on a run of diminishing returns in graphics for some time now. The only way out is to break free of old standards. Nvidia's full embrace of a combined raster + ray tracing + deep learning pipeline is a very promising direction to take.

     

    No, it won't be perfect out of the gate, but you're still getting top-of-the-line cards, and as a community we ought to be encouraging this direction for our hobby.

  3. 5 minutes ago, Spork3245 said:

     

    Gsync is technically superior to Freesync (no clue about 2.1's VRR, but it's likely the same case) as it's a hardware solution vs a software solution, but the difference is imperceptible, the kind that requires a 600fps camera to capture - though that fraction of a fraction of a fraction of a millisecond will certainly be nVidia's reasoning for keeping it around, at least on monitors. :p The other issue is that we don't know if monitors will adopt 2.1 VRR, as they are primarily DisplayPort and I'm unaware of any VRR standard coming to DP.

    To your other point, I would highly doubt that nVidia has included a 2.0b-2.1 hybrid HDMI out on the 20xx series, as MS did on the XboneX and Samsung did on the Q9, without making a huge deal about it (both MS and Samsung made a bunch of noise about their inclusion... and this is nVidia we're talking about! :p )

     

    I suppose they could then make a case, for the time being, for sticking with Gsync on monitors since there's no DP standard, while supporting 2.1 VRR for TVs. It would still be ridiculous, though, because requiring such special hardware is a terrible plan and I really wish they'd do away with it.

  4. 7 minutes ago, Spork3245 said:

     

     

    It'd be totally fine if they stuck with GSync for monitors and DisplayPort, especially since HDMI VRR =/= Freesync. I am highly concerned about HDMI 2.1 compliance on future cards, however, especially since these cards really should have been 2.1 compliant given that nVidia was the first to adopt 2.0b to bring 4K 60Hz HDR to PC.

     

    It's perfectly fine for Gsync to coexist with 2.1 VRR, but it seems completely pointless for it to do so, so I'm more worried about them supporting Gsync exclusively for VRR.

     

    As we found in another discussion, you don't need to be fully 2.1 compliant to support 2.1 VRR, so the lack of full 2.1 support at least isn't a blocker there. Although if you wanted other 2.1 features, then yeah, that might be an issue.
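
    For anyone wondering what VRR actually buys you mechanically, here's a toy sketch. It's purely illustrative: the 48-144 Hz window is an assumed example, and neither Gsync's nor Freesync's real logic looks like this, but the idea is the same - sync the panel's refresh to the frame time when it fits in the window, and repeat frames (low framerate compensation) when it doesn't.

```cpp
#include <cstdio>
#include <initializer_list>

// Toy model of a variable-refresh-rate (VRR) panel. Illustrative only:
// the 48-144 Hz window below is an assumed example, not a real monitor spec.
struct Panel {
    double minHz; // slowest refresh the panel can hold
    double maxHz; // fastest refresh the panel can hold
};

// Given how long the GPU took to render a frame, pick the refresh rate the
// display should run at. Above the window the rate is capped; inside it the
// refresh simply tracks the frame rate; below it, low-framerate compensation
// (LFC) repeats each frame enough times to land back inside the window.
double vrrRefresh(const Panel& p, double frameMs) {
    double fps = 1000.0 / frameMs;
    if (fps > p.maxHz) return p.maxHz; // too fast: wait (or tear) at max refresh
    if (fps >= p.minHz) return fps;    // native sync: refresh == frame rate
    int repeats = 2;                   // LFC: show the frame 2, 3, ... times
    while (fps * repeats < p.minHz) ++repeats;
    return fps * repeats;
}

int main() {
    Panel p{48.0, 144.0};
    for (double ms : {5.0, 11.0, 30.0}) { // 200, ~91, and ~33 fps frames
        printf("%.1f ms frame -> panel refreshes at %.1f Hz\n", ms, vrrRefresh(p, ms));
    }
}
```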

  5. Beyond the silliness of Nvidia's hype machine, I think we should also give the company itself some praise. They are ultimately making a large business decision to push the market forward.

     

    It comes down to this: if we want the market to embrace this change, then we should be okay with paying the cost of getting there.

     

    Here's a world that would suck, IMO: the market doesn't embrace the new cards, so Nvidia goes back to pushing only raster, and it takes far longer to get to this next generation of architecture, both because Nvidia has stopped focusing on it and because developers have no audience to justify working on the software end.

     

    All things considered, they've done a commendable job maintaining top-end raster performance on their top-end cards while introducing a whole lot more. I don't think we can really ask for better than that, so if we want to see more of this, we should reward the company for trying.

     

     

    This is an occasion where Vic's compulsion to buy the newest greatest hardware really would be better if adopted by more people :p 

  6. 20 minutes ago, Mr.Vic20 said:

    This is the double-edged sword of Nvidia's approach. On the one hand, they're promising what no one else is: the so-called "holy grail" of graphics, ray tracing (though it's not a 100% true solution, but rather a very convincing alternative solution). On the other hand, they have done what Nvidia does, namely:

     

    1.) Wasting 90 minutes of our lives explaining graphics technologies to people who already have a decent understanding of said tech.

    2.) Promising the moon!

    3.) Using charts that mean very little at best and are outright lies at worst!

    4.) And price gouging the consumer with the impunity of an arrogant Sony back in the day. 

     

    Now that the dust has settled on Nvidia's reveal, the gaming "press" - who, quite frankly, spent the last month firing off breathless pieces of hyperbole like a rapid-fire machine gun - are only now turning the corner and asking some practical questions. This is not their fault specifically, as Nvidia is handing out only PR crap, if anything. But still, the backswing from the initial excitement will be brutal in the coming weeks. With prices sky high and the consumer skeptical as all Hell, there aren't too many good narratives likely to emerge between now and mid-September when these cards launch.

     

    My unproven take is that the 2080ti is likely to perform at or around the speed of the Titan V in non-RTX games. In RTX games, I suspect I'll have to downshift from 4K to 1440p to pick up the use of this new feature. So I guess it will be time to truly put the concept of "prettier pixels" vs resolution to the test.

     

     

    This is my take as well and frankly, I'm okay with it, because where they have delivered, they really have delivered. They're making an enormous shift in architecture, and developers are only now starting to use it. It's not going to hit super high performance on the new stuff immediately. As long as it's not a losing proposition in standard pure raster rendering (meaning slower than existing GTX cards), there's really no trade-off. Maybe if AMD were able to compete and were pushing out even better raster cards, there'd be a trade-off to consider. But there isn't. It's still going to be the fastest raster card on the market, and it introduces a whole new world of rendering with a wildly different architecture (a rough sketch of what that per-frame split looks like is below).

     

    I think we *should* be rewarding Nvidia for this. Despite being king, they haven't rested on their laurels. They've put in the work to start the next generation of graphics and are pulling it off. People shrugging off what they've done is odd to me.
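
    To make the per-frame split concrete, here's a rough outline of the kind of hybrid frame Turing is pitched at, with every stage stubbed out. This is only a sketch of the general raster + ray tracing + deep learning division of labor, not Nvidia's or any engine's actual pipeline.

```cpp
#include <cstdio>

// Toy outline of one hybrid frame with every stage stubbed out. A sketch of
// the general raster + ray tracing + deep learning split the new cards are
// built around, not Nvidia's or any engine's real pipeline.
struct Frame { int id; };

void rasterizeGBuffer(const Frame& f)  { printf("frame %d: raster G-buffer (shader cores)\n", f.id); }
void traceSparseRays(const Frame& f)   { printf("frame %d: few rays/pixel for reflections + GI (RT cores)\n", f.id); }
void denoiseAndUpscale(const Frame& f) { printf("frame %d: learned denoise / DLSS-style upscale (tensor cores)\n", f.id); }
void composite(const Frame& f)         { printf("frame %d: composite and present\n", f.id); }

int main() {
    for (int i = 0; i < 3; ++i) {
        Frame f{i};
        rasterizeGBuffer(f);  // classic raster still does the bulk of the work
        traceSparseRays(f);   // small ray budget, only where raster falls short
        denoiseAndUpscale(f); // deep learning cleans up the sparse, noisy result
        composite(f);
    }
}
```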

  7. 7 minutes ago, Bjomesphat said:

    Why are people still buying games on day one? They're just going to get discounted in a few months anyway, sometimes by 50%.

     

    The only games I pre-order are Nintendo games that I need to play on day one. Because I know they're not going on sale for a long time.

     

    Because they want to play sooner, and are willing to spend the money to do so? :p

  8. 2 hours ago, Kal-El814 said:

    As ever, it seems as though people discussing movies on the internet believe they live on the same figurative planet as the rest of the movie viewing public. They do not. Most people aren't sitting through 10 minutes of credit reel to see the extra scenes, fewer people than that know who the hell Captain Marvel is, fewer people than that know how many movies Chadwick's deal was for, etc.

     

    A buddy of mine watched Man of Steel for the first time the other day (pray for him) and despite being a dude on the internet who enjoys superhero movies... he didn't know it wasn't a sequel to Superman Returns. He's not an idiot. He just likes movies enough to watch them and go to the theater and aside from that spends exactly zero time or energy thinking about them. This is how most people watch movies.

     

    So to say...

     

     

    A decent number of motherfuckers who go to EVERY ONE of these movies couldn't tell you why Superman and Spider-Man won't cross over. Marvel Studios and WB are about as familiar to them as Paramount and Universal... they're logos people see before movies that mean, essentially, nothing.

     

    So... yes. Impossible as it may seem here, some people finished Infinity War thinking that was the "last one" or at the very least that everyone that died is gone forever.

     

    1 hour ago, sblfilms said:

     

    More accurately, they left the movie not thinking anything about future movies, because they never think about a next movie. I can't quantify it, but it was not at all an uncommon reaction, when I would chat with customers leaving the movie, that they were surprised to find out this was really just part 1 of a 2-part film. When the next films begin their advertising blitz, that is often the first time they think about it at all.

     

    1 hour ago, sblfilms said:

     

    Yes, there are people who leave the theater after movies like FOTR and have no conception that there is more to come and they don’t even think about it.

     

     

    Well in that case

     

    [reaction gif]

     

     

    Who am I kidding? I haven't wanted to for a long time for much more significant reasons!

     

    Still...

  9. I absolutely loathe the idea that wages for some jobs are expected to come from tips.

     

    That said, it's only for that kind of job that I feel compelled to tip, unless I happened to pay with cash for some reason and have change I don't care to carry around in my pocket.

  10. 4 minutes ago, mikechorney said:

    I believe that AMD has the consoles locked up for the foreseeable future -- so unless AMD also goes this direction, it will be hard for it to hit the mainstream.

     

    Will developers really incorporate Ray Tracing in their engines before a critical mass of people have these cards?

     

    1 minute ago, TwinIon said:

    They did mention that they worked with Epic to get it into UE4. If it's a feature that devs can more or less flip on for those special few (and use in all the promo footage), maybe it gets used. If it's much more difficult than that, I doubt we'll see it much for a while.

     

    That's kind of what I'm thinking. On the PC side today, there are any number of optional special effects in games, including Nvidia-specific ones like HBAO+. I can easily see better GI options that use some limited ray tracing becoming a similar kind of thing you can turn on.
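
    As a sketch of how that kind of toggle could look on the developer side: D3D12 exposes a raytracing tier query, so an engine can enable a ray-traced GI path only where the hardware supports it and fall back to screen-space tricks like HBAO+ everywhere else. The feature query is real DXR API, but the "GI mode" labels are made up for illustration, and this is Windows-only.

```cpp
// Minimal sketch of gating an optional ray-traced GI setting on hardware
// support (Windows + D3D12 only; link against d3d12.lib). The GI mode labels
// are hypothetical engine settings; the feature query itself is real DXR API.
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_0,
                                 IID_PPV_ARGS(&device)))) {
        printf("no D3D12 device available\n");
        return 1;
    }

    // Query the raytracing tier; older cards report NOT_SUPPORTED.
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
    bool dxr = SUCCEEDED(device->CheckFeatureSupport(
                   D3D12_FEATURE_D3D12_OPTIONS5, &opts5, sizeof(opts5)))
               && opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0;

    // Same idea as today's optional effects menus: only expose the
    // ray-traced path when the card can actually run it.
    printf("GI mode: %s\n",
           dxr ? "ray-traced GI (DXR)" : "screen-space AO (e.g. HBAO+)");
}
```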

  11. 1 hour ago, mikechorney said:

    I believe that Ray Tracing is the "future" of graphics -- but probably won't be mainstream for many years.

    Not full ray tracing, I agree. But I think in the next few years we'll see limited ray tracing used for better effects, and possibly low-quality (compared to full ray tracing) implementations of approximate GI. Better GI is, IMO, one of the best ways to improve the naturalness of games; the toy sketch at the end of this post shows why even a tiny ray budget helps.

     

    Quote

    That said, I've read that GTX may stay around with the 2060 due to its slower ray tracing performance...

     

    A split within the same product line would be interesting, but that makes sense too.
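
    Here's the toy sketch I mentioned above: a tiny CPU-side occlusion estimate that fires a handful of random hemisphere rays from points near a wall and counts how many reach the sky. It's not how RTX GI works (that traces real scene geometry and denoises the result); it's just an illustration of why even a small ray budget per point gives a usable soft-occlusion signal. The scene parameters are made up.

```cpp
#include <cstdio>
#include <cmath>
#include <random>
#include <initializer_list>

// Toy "limited ray budget" occlusion estimate. From a point on the ground at
// distance x from a wall (the plane x = 0, up to height wallTop), fire a few
// random hemisphere rays and count how many reach the sky. Points near the
// wall end up darker; points far from it end up brighter.
int main() {
    std::mt19937 rng(42);
    std::uniform_real_distribution<double> uni(0.0, 1.0);

    const int raysPerPoint = 8; // tiny budget, akin to 1-2 rays per pixel
    const double wallTop = 2.0; // wall height (made-up scene parameter)
    const double PI = 3.14159265358979323846;

    for (double x : {0.25, 1.0, 4.0}) {
        int escaped = 0;
        for (int i = 0; i < raysPerPoint; ++i) {
            // Uniform direction over the upper hemisphere (uniform in cos(theta)).
            double phi  = 2.0 * PI * uni(rng);
            double cosT = uni(rng);
            double sinT = std::sqrt(1.0 - cosT * cosT);
            double dx = sinT * std::cos(phi); // horizontal component toward/away from wall
            double dy = cosT;                 // upward component
            // The ray hits the wall if it heads toward x = 0 and is still
            // below wallTop when it gets there (the wall is infinite in z).
            bool hitsWall = dx < 0.0 && (x / -dx) * dy < wallTop;
            if (!hitsWall) ++escaped;
        }
        printf("x = %.2f from wall: ~%d/%d rays reach the sky\n",
               x, escaped, raysPerPoint);
    }
}
```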

  12. 1 hour ago, Mr.Vic20 said:

    Yup, apparently so! Also, it's RTX (ray tracing) instead of GTX from now on.

     

    Yeah, I think it's interesting to see that level of rebranding. It's been a long time with GTX, and it's notable that ray tracing is what's making them change it. If ray tracing really does start taking off, though, I can see why: it will have pretty significant ramifications for graphics.
