Everything posted by Spork3245

  1. There are only a handful of games where this is achieved at 4K+; it’s typically closer to around 10%. The 4080 16GB is being stated as having the same “gain” as the 4090, but over the 3080 Ti rather than the 3090 Ti, and those two are barely apart performance-wise (it’s less than the 3080 vs 3090… IIRC, it’s around 7-8%).
  2. There’s about a 10-15% difference in framerate, depending on the game, between the 3080 and the 3090 Ti.
  3. The connection itself seems poorly thought out. They should’ve made it a bit larger so it was more stable.
  4. @stepee @Mr.Vic20 Looks like these ATX 3.0 to 2.0 adapters are kinda junk. 2.5 hours at full draw was supposedly causing melting, and the pins on the 3.0 adapter also seem to be super sensitive.
  5. I have a 1300w EVGA G2. I should be fine. should be
  6. One thing I’m noticing is that even the 4090 looks like it only has a single 8-pin connector in pics from both ASUS and MSI… this can’t be correct, can it? EDIT: wait, this may be a new/different power connector type? Looks like some cards supply an adapter that takes four 8-pin PCIe (VGA) connectors… (yeah, it’s the new ATX 3.0 power connector) EDIT2: BTW, I really like the RGB on my EVGA FTW3; it looks like Galax currently has the best design to replace that.
  7. The thing that’s worth noting, however, is that the 16GB 4080 is being compared with the 3080 Ti and advertised as 2-4x faster, while the 4090 is being advertised as 2-4x faster than the 3090 Ti. So, regardless of what’s on paper, if those comparisons are indeed true, then the 16GB card is indeed a 4080, as the 3080 Ti and 3090 Ti are barely different in terms of in-game performance (the average difference at 4K is around 5% IIRC).
  8. nVidia wants to purposely make this confusing for some reason. My guess is that the actual 4070 will be 8-10gb.
  9. Whoa. Is that why his post was trolling Trump? Lots of liberals/celebs made Truth Social accounts, many to be able to watch Trump’s nonsense in real-time.
  10. They went all in on PhysX and even encouraged people to buy a second video card to use as a dedicated PhysX card. I’m just saying. The concern would be rendering at, say, 40fps but displaying 90-120: a lot of the frames would be “guesses” that the AI is making and not actually happening in-game (rough numbers in the sketch after this list). Getting the actual latency of, say, 90+fps while displaying 160-200 wouldn’t be as noticeable, since latency at 90fps is already good. It’s possible I’m greatly overthinking it all, but it makes me want to wait for solid reviews and not previews that are limited in what they can and cannot say. At the end of the day, you can just run DLSS 2.0 on these cards if you don’t like 3.0, and it should still be significantly faster than the 3000 series, though.
  11. My concern is the game looking smoother than the response feels. Maybe it won’t even really be noticeable, like when I sold my Sony GDM-FW900 CRT and went to a 4K HDR LED, but in my head I have it feeling “off”. This is mainly because it’s adding SO many frames, roughly doubling the framerate over DLSS 2.0 in some cases.
  12. Hmmm, wait, maybe not. I'm getting conflicting info; this might just be for their mobile (cell phone) stuff?
  13. Comcast owns 50% and invested $1.7b in 2017, unless that has changed.
  14. Disclaimer: I haven't watched this whole video yet
  15. TBF, I'd only want it for emulating Gamecube and earlier consoles. Is emulation easy on the deck?
  16. Maybe this Odin: WWW.AYNTEC.COM (AYN’s Android/Windows gaming handhelds, the Odin and Loki).
  17. It'll be interesting to see if latency or any odd effects are added vs DLSS 2.0. I'm sure it will be minimal, however.
  18. That's what I'm asking; is it essentially adding frames to get the higher framerate?
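Re: the latency worry in posts 10, 11, 17, and 18, here’s a quick back-of-the-napkin sketch of the math. It’s purely illustrative: the framerates, the one-generated-frame-per-real-frame assumption, and the helper names are mine, not anything NVIDIA has published about DLSS 3. The idea is just that generated frames only smooth motion; input is still only sampled on real rendered frames.

```python
# Illustrative sketch of the frame-generation latency concern (hypothetical numbers, not measurements).

def frame_time_ms(fps: float) -> float:
    """Time between frames in milliseconds at a given framerate."""
    return 1000.0 / fps

def summarize(render_fps: float, generated_per_real: int = 1) -> None:
    # Assume one AI-generated frame inserted after each real frame.
    displayed_fps = render_fps * (1 + generated_per_real)
    print(f"rendered {render_fps:.0f} fps -> displayed {displayed_fps:.0f} fps")
    print(f"  motion smoothness follows the displayed rate: {frame_time_ms(displayed_fps):.1f} ms/frame")
    print(f"  input is only sampled on real frames:        {frame_time_ms(render_fps):.1f} ms/frame")

# The worrying case: ~40 fps rendered looks like ~80 fps, but still feels like 40 fps.
summarize(40)
# The "already fine" case: 90+ fps rendered; the gap matters much less.
summarize(90)
```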