SimpleG Posted December 11, 2020
Zaku3 Posted December 11, 2020
26 minutes ago, NextGen said:
Sounds a lot like Intel with their "benchmarks don't matter" when Ryzen caught up in gaming.
That Nvidia is now basically doing the same thing because Radeon has essentially closed the gap in rasterization performance isn't a huge shock; in fact, it's borderline expected. They want the focus on RT and DLSS so the 6xxx Radeons look like crap.
cusideabelincoln Posted December 12, 2020
That sucks for them. They do really good reviews, and they also do about as many RT and DLSS tests as any other publication (because there's only ~5 worthwhile applications of this tech), so WTF is Nvidia on?
Mr.Vic20 Posted December 12, 2020
Nvidia has great engineers, but truly awful management!
Reputator Posted December 12, 2020
Zaku3 Posted December 12, 2020
15 hours ago, Reputator said:
Linus was ripping Nvidia last night. He even mentioned how their review of the 6900 XT, or was it their Cyberpunk video, had the 3xxx series using DLSS among the benchmarks. I don't understand what Nvidia wants. I guess they just want their videos to say "Nvidia good, go buy."
cusideabelincoln Posted December 13, 2020
On 12/11/2020 at 9:41 PM, Mr.Vic20 said:
Nvidia has great engineers, but truly awful management!
Their tech has mostly worked great, and better than the competition's, save for, what, two or three stumbles? They epitomize the evil side of capitalism. Being better isn't good enough. They engage in anti-competitive practices and spread misinformation; a decade ago they got caught paying shills to promote their products on websites and forums. I really don't think they stopped that practice completely; they just got better at it.
And then they take all the advantages they get from the above and use them to extract the highest possible margins out of their products. Hence the price creep, especially on their halo products. Their stockholders come first, not their customers. And I fucking hate that with a passion.
Spork3245 Posted December 13, 2020
This is so weird and unbelievably dumb on nVidia's part.
1 hour ago, cusideabelincoln said:
Their tech has mostly worked great, and better than the competition's, save for, what, two or three stumbles?
The GeForce 5x00/FX series was an absolute joke and only supported FP-HDR (a major new graphical feature at the time) at either 16-bit or 32-bit: 16-bit had no major performance cost but looked awful, while 32-bit looked great but came with a major performance hit... meanwhile, ATi supported 24-bit, which had very little performance impact (similar to 16-bit) and looked indiscernible from 32-bit. nVidia tried to "save face" with their refresh, which wound up just using their drivers to force 16-bit FP-HDR in games that called for 24- or 32-bit, making the cards appear to run better in benchmarks (compared to the pre-refresh) but look awful next to the same game on an ATi card. Oh, and the first iteration of the FX line had the loudest fans imaginable, sounding like a vacuum cleaner (no hyperbole here), because their engineers thought "gamers liked the sound of their video cards, kinda like car guys and their car's exhaust"... no, really. They didn't redeem themselves fully until the 8x00 series (the recovery started with the 6x00 and 7x00, but all eyes stayed on ATi as the leader until the 8x00 series).
Then there was the GTX 4xx... jfc, iirc, those things ran so damn hot they could literally heat your home in the winter. There was also some type of memory bandwidth issue, I think? They fixed it with the GTX 5xx series (and the GTX 460, whose GF104 chip was basically the cleaned-up Fermi design the 5xx series built on).
Their third misstep, I suppose, could be the GeForce 4 MX series, which was basically a GeForce 2 on steroids and lacked vertex/pixel shaders, the biggest DX8/8.1 graphical improvements. Buuuuut... they sure made a lot of money on uneducated consumers with those.
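A rough back-of-the-envelope sketch of why those precision gaps were so visible, assuming the standard mantissa widths for the DX9-era shader formats (10 bits for FP16, 16 bits for ATi's FP24, 23 bits for FP32); the numbers are illustrative arithmetic, not benchmarks from the thread:

```python
# Quantization step near 1.0 for each DX9-era shader float format.
# Assumed mantissa widths: FP16 = 10 bits, ATi FP24 = 16 bits, FP32 = 23 bits.
for name, mantissa_bits in [("FP16", 10), ("FP24", 16), ("FP32", 23)]:
    step = 2.0 ** -mantissa_bits      # smallest distinguishable step near 1.0
    levels = 2 ** mantissa_bits       # distinct gradient levels per unit range
    print(f"{name}: step ~{step:.8f}, ~{levels} levels")
```

FP16 gives only ~1,024 distinct levels per unit range, which is where the visible banding came from, while FP24's ~65,536 levels were already far beyond what an 8-bit-per-channel display could show, consistent with it looking indiscernible from 32-bit.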
cusideabelincoln Posted December 13, 2020
8 hours ago, Spork3245 said:
The GeForce 5x00/FX series was an absolute joke... [snip]
A minor hit they took was the GTX 970 3.5GB fiasco, and the 600 and 700 series in general fell on their faces over time as AMD's GCN architecture proved to perform (a lot) better in future games and driver updates.
Zaku3 Posted December 14, 2020
Nvidia just has too much mindshare. I doubt AMD will ever break it, though I do hope they get competitive again. I really don't want to wait, what, 3-4 years like after Vega 64 for another high-end card to be available.
SimpleG Posted December 15, 2020 (Author)
Nvidia, unsurprisingly, backpedaled. Jump to 21:49 to see the worst non-apology they could make.
Dre801 Posted January 28, 2021
Great products, total dick of a company.
Spork3245 Posted January 28, 2021
36 minutes ago, Dre801 said:
Great products, total dick of a company.
It’s funny looking back at how great and pro-consumer nVidia used to be. We’d have to go back to basically the GeForce 3 series and prior, but they were awesome. Now? JFC.