cusideabelincoln

Member
  • Content count: 6,067
  • Joined

  • Last visited

Community Reputation: 338

1 Follower

About cusideabelincoln

Profile Information

  • Gender: Male
  • TwitchTV: cusideabelincoln
  1. New Titan XP :)

    No, I'm not sure. I didn't mean a facelift, though. It'll be like the Tahiti-to-Hawaii revision.
  2. Justice League: official Comic-Con trailer

    It's bullshit that we have never seen the perfect Batman in a live-action movie: one that is confident, calculating, and threatening, but also cautious, methodical, and accepting of the fact that his parents were murdered right in front of him. Ya know, the Batman we all know and love in the animated universe. The '90s Batmen? Overly brooding. Bale's Batman? Not the World's Greatest Detective, and his Bruce Wayne was just a huge asshole. Snyder's Batman? MARTHA MARTHA MARTHA.
  3. Justice League: official Comic-Con trailer

    The Batman in this universe is a paranoid, gun-toting murderer. So I guess in Snyder's head it makes perfect sense for Batman to be all gung-ho about recruiting these heroes.
  4. Case recommendation

    Side-panel ventilation is a must if you're overclocking, IMO. I just swapped out my case, and my max temps dropped by 5 degrees.
  5. New Titan XP :)

    Huh? The 7970 released ahead of the GTX 680 and was just as fast at the time. Now it's faster because of DX12, better drivers from AMD, and Nvidia no longer optimizing for the 680.

    Then they released the 290X. It was faster and cheaper than the 780 at the time. Nvidia countered with the 780 Ti, which was marginally faster at launch. In today's games, the 290X is faster than the OG Titan and the 780 Ti.

    They only just fell behind the curve with the Fury X compared to the 980 Ti and Titan X, and they aren't even that far behind. Vega has the potential to be quite awesome. It's a new architecture; it won't even be the same architecture the RX 480 is using. The RX 480 is also bandwidth constrained, while Vega is going to use HBM, so bandwidth won't be an issue. As long as they fix the other hardware bottlenecks that were holding back the Fury X (which has an insane number of compute cores), they will be quite competitive with Vega 10 and then Vega 11.
  6. Nvidia Control Panel Settings VS In-Game Settings / Also -DSR vs AA?

    You could drop MSAA down to 2x with DSR, or probably just turn it off altogether. DSR effectively applies anti-aliasing to the entire screen, and since you're on a TV I assume you're sitting far enough away that the image-quality differences won't be noticeable. The best way to find out is definitely to experiment, but I will say 8x MSAA is usually a waste of performance to begin with for what you get out of it.
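    For reference, a rough sketch of how DSR factors map to actual render resolutions: the factor multiplies the total pixel count, so each axis scales by its square root. The 1920x1080 native resolution here is just an assumed example, and Nvidia's listed factors are rounded, so the printed numbers are approximate:

        import math

        def dsr_resolution(native_w, native_h, factor):
            # DSR factors scale the total pixel count, so each axis
            # scales by the square root of the factor.
            scale = math.sqrt(factor)
            return round(native_w * scale), round(native_h * scale)

        for factor in (1.78, 2.25, 4.00):   # a few of the factors Nvidia exposes
            w, h = dsr_resolution(1920, 1080, factor)
            print(f"{factor:.2f}x DSR: renders at ~{w}x{h}, downsampled to 1920x1080")
        # -> roughly 2560x1440, 2880x1620, and 3840x2160 at a 1080p native res.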
  7. New Titan XP :)

    Big chip, a 512-bit memory bus (which makes the PCB way more expensive), and miners buying these cards. The 7970 was also only $550, so at least AMD didn't increase the price. With the Fury X they had to, because of the early adoption of HBM; the Fury X also had a water cooler. But the 290X was a damn good value. It was faster than the 780 at launch, and today it's as fast as or faster than a 780 Ti. So AMD released the 290X at $550, and it was competing with the $1000 Titan. And in DX12 it's no contest. The 290X's longevity is fucking astounding if you ask me.
  8. Wonder Woman: official Comic-Con trailer

    If only Batman Forever had been better than it was. I mean... you've got Tommy Lee fucking Jones.
  9. New Titan XP :)

    That's exactly what I'm talking about! If we change our perspective to compare the 980 to the 1080, then yeah, that's a nice generational leap in performance, right on target for what we should get. But don't just look at performance. What did Nvidia do besides increase performance? Oh, yeah: the 980 retailed for $550, and the 1080 is $600-$700.

    But it gets better! Let's go back from the 980 and look at the card that was ~40% slower, the GTX 680, their first 28nm top-of-the-line card. It retailed for $500! And the card before the 680, the superb GTX 580? It was also $500! And what did the 480 cost? Oh yeah, $500! What about the GTX 285? Well, fuck, that was only $400! And look at what Nvidia tried to charge us for the GTX 280, when they thought AMD had no competition. That's right, they asked $650 for it. A few weeks later they dropped the price down to the magical $500.

    Do you guys see what's happening here? Since the 28nm generation began, they have slowly been upping the price of the xx80 cards, the reasoning being "because it's OMG faster than the previous Ti or Titan card!" But then they release a new Ti or Titan card at an even higher price point. Rinse and repeat.

    Just consider the GTX 280 -> GTX 285. That was a refresh. They increased performance (20%) but charged us the same price. Look at the 480 -> 580: again, increased performance without increasing the price. But then look at the 680 -> 780. Whoops, the 780 was $649. They fucking increased the price! Then they came out with the 780 Ti. That was $700! Another price increase for a ~20% performance boost. Then, boom, they backed off a little bit by giving us the 980. It wasn't a performance boost over the 780 Ti, but they lowered the price. However, they did not lower it back down to that $500 sweet spot. No no no no. They pulled a fast one and made it $550. And then they began the cycle again, and here we are at the 1080 situation.
  10. New Titan XP :)

    Hey, does anyone remember when Nvidia released the 8800 GT that was basically as fast as the 8800 GTX and only charged us $250? That card was LESS THAN HALF the price of the 8800 GTX, and it was going up against their own current high-end card. These days they release a GTX 1070, charge us $400, and tell us it's as fast as their previous high-end cards and also cheaper! Well, no fucking shit, that's what's supposed to happen on a new node.
  11. New Titan XP :)

    Mr. Vic said it best himself.  If he has to think twice about buying a Titan X, then you know Nvidia is flying way too close to the sun right now.   I hope this move bites them in the ass.  It should.  It really, really should.  People SHOULD NOT fall for this shit. Knock them back down to reality, please.  For the love of Ra!
  12. New Titan XP :)

    It happened with the previous node jump... http://www.techspot.com/article/928-five-generations-nvidia-geforce-graphics-compared/page3.html

    Compare the GTX 580 to the GTX 680. That's where the node jump happened. On their first try with the 28nm process, they managed to make the GTX 680 more than 40% faster than the previous fastest card, the GTX 580. And to put that in further perspective, they did it while drastically LOWERING power consumption; the TDP went way, way down with the GTX 680. If they had given the GTX 680 the same TDP budget the GTX 580 had, we're talking over a 60% performance gain on the 28nm process. And if you compare the GTX 680 to the GTX 980, on the same node at the same TDP, they gained more than 45% in performance.

    So you know what it says to me when Nvidia releases a top-of-the-line card (the 1080) and it's only 25-30% faster than the previous gen on an OLD node? It tells me they're milking the fuck out of your pockets. They're taking advantage of the long droughts between new releases. They're taking advantage of their 75% market share and their blind, loyal fans, who would buy another FX 5800 Ultra if they released it. And you can tell they are purposefully holding back, because they are releasing a fucking (cut-down) Titan X card so soon after the 1080 came out. The upcoming Titan X card is what the 1080 SHOULD have been, if we're going by the history of Nvidia's top-of-the-line cards.
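    A quick back-of-the-envelope check on that "over 60%" figure. This is just a sketch assuming performance scales roughly linearly with power budget on a fixed architecture and node, and the TDP numbers (~244 W for the GTX 580, ~195 W for the GTX 680) are my recollection of the published specs, not something taken from the benchmarks linked above:

        # GTX 680: ~40% faster than the GTX 580 despite a much lower TDP.
        gain_at_lower_tdp = 1.40
        tdp_580, tdp_680 = 244, 195   # published TDPs in watts (assumed figures)

        # If performance scaled linearly with power budget, an iso-power GTX 680
        # (i.e., one allowed the 580's full 244 W) would land around:
        iso_power_gain = gain_at_lower_tdp * (tdp_580 / tdp_680)
        print(f"~{iso_power_gain:.2f}x, i.e. roughly {(iso_power_gain - 1):.0%} faster")
        # -> ~1.75x, so "over 60%" is, if anything, conservative under this model.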
  13. New Titan XP :)

    I don't consider 25% major. That's about the bare minimum gain where the eye test can even tell there's an improvement. A major gain is at least 50% more performance.

    I double-checked some numbers, and you are right: twice the improvement is not the norm. But it did happen. The 9700 Pro doubled the previous top-of-the-line card. The 8800 GTX doubled the previous top-of-the-line card. However, when jumping to a new node, the first top-of-the-line card on that node was consistently 50% faster than the fastest card from the previous generation at launch, and that gap usually grew as more mature drivers came out and games started taking advantage of the new architecture. And if we compare the mature card of one generation to the mature card of the previous generation, there is usually twice the performance. For example, compare a GTX 580 (mature Fermi on the 40nm process) to a GTX 285 (mature GT200 on the 55nm process).

    The darkest time in graphics history? I'm going to say the DX10 era. After Nvidia killed it with the 8800, they rebadged the shit out of it for years and years because they had no competition. First they gave us the 8800 GTX. Then they gave us the 8800 Ultra. Then they shrunk that chip onto a new node and called it the 9800 GTX (only adjusting clock speeds, no architecture changes). Then they did the same thing again and called it the 9800 GTX+. Finally, they straight-up rebadged the chip as the GTS 250.
  14. New Titan XP :)

    Nvidia is releasing as small a performance boost as possible, just enough for them to say, "Look, we now have an even faster card! So give us more money." They have somehow tricked us into believing these 25% performance boosts for new cards are awesome. Back in the day we weren't impressed unless the new gen was twice as fast...
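    To put that in numbers, here's a minimal sketch using only the figures from these posts (25% per generation vs. the old twice-as-fast bar):

        # How many +25% generations does it take to match one old-school 2x jump?
        per_gen, total, gens = 1.25, 1.0, 0
        while total < 2.0:
            total *= per_gen
            gens += 1
        print(f"{gens} generations of +25% to double performance ({total:.2f}x)")
        # -> 4 generations; 1.25**3 is only ~1.95x, just short of the old bar.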
  15. Justice League: official Comic-Con trailer

    Snyder really does have a way with words.