Everything posted by cusideabelincoln

  1. Honestly, I forgot how much the "halo" price inflated during the 20 series, when Nvidia launched the 20-series Titan at that ridiculous $3000 just so they could make the 2080 Ti seem like a deal. So I considered the 3070 at $500-600 a return to normalcy, since previous x70-class cards were in the $400 range. In three generations Nvidia has doubled the price of a 70-class card: $400 1070 (Ti) -> $500 2070 (Super) -> $500 3070 / $600 3070 Ti -> $800 4070 Ti
  2. Quest removed the Facebook requirement, and it can be set up to run completely wirelessly for PC games. It's a good, cheap way to try out VR.
  3. You do have to credit Ngreedia's marketing for releasing the 4070 Ti the same day as CES. They knew it would get slammed and hoped the CES news would bury some of the negativity.
  4. This is basically how I feel about both companies, and I'm not even in the market right now.
  5. The 20 series was also price-inflated, but it did add RT and DLSS, two new features that are useful. 1070 > 980 Ti; 970 >= 780 Ti; 770 > 680; 670 > 580. In some generations even the 60-class card matched the previous 80-class. The 4070 Ti is now priced the way a flagship used to be, but delivers what used to be midrange performance, relatively speaking. Edit: The 4070 is half the specs and performance of a 4090. Historically, the 60-class cards were half the 80-class cards and the 70-class sat in between. This generation has by far the worst MSRP inflation.
  6. I wouldn't call the 3070 an insane deal; it was the norm for a 70- or 60-class card to match the previous gen's 80-class card. Now all new-gen cards are overpriced, from both AMD and Nvidia.
  7. It can be situational and vary game to game, but a 7800X3D could potentially avoid the CCD-to-CCD latency penalty because it only has one CCD, while the 79xx chips have two. If a thread bounces around the CPU cores and happens to move from a core on one CCD to a core on the other CCD, that takes longer than bouncing to a different core on the same CCD. The thread's data has to travel over the CPU's Infinity Fabric, which is how the chiplets communicate with each other. Infinity Fabric bandwidth is simply slower than the cache within a chiplet, and since gaming is one of the more bandwidth- and latency-sensitive CPU workloads, there can be a performance drop. Good software programming, or BIOS updates, could help the issue, though. (There's a quick latency sketch after this list if you want to see the penalty yourself.)

     I haven't done much research into this since the launch of the 7000-series CPUs, but I noticed a handful of reviews showing the 7700X matching or slightly besting the 7900X in some, but not all, of their gaming tests. I also noticed this CCD-to-CCD issue with older chips in those same reviews: a Ryzen 3700X was matching or slightly beating a 3950X, even though there's a pretty big clock speed difference and the 3950X should never lose to the 3700X. Strangely enough, when those CPUs launched, it never did. So the changes over the last few years could be game engines, Windows scheduling differences, or AMD microcode tweaks. Since it's very unlikely games will heavily use more than 8 cores for the foreseeable future, the extra cores on the 79xx chips don't add any benefit to make up for the latency penalty.

     With all of that said, the differences I saw were about 5% or less, and they can also come down to the quality of silicon you get, or be overcome by properly cooling and tweaking the 79xx CPUs. It's also possible AMD improves the Infinity Fabric for the X3D chips, or it's just a non-issue because the large X3D cache means less data traveling over the Infinity Fabric to begin with. We just have to wait for reviews.

     Edit: You have Intel now, right? It would definitely be interesting to see how the Intel 13900K paired with ultra-fast DDR5 memory stacks up against the X3D chips. AMD's memory wall is DDR5-6200 at max overclock; Intel can go way beyond that.
  8. He's narrowed the issue down to some kind of flaw in the reference vapor chamber. More research is needed to tell whether it's a design flaw or a manufacturing defect. Absolutely crazy that orientation matters.
  9. Use that week to get some before/after benchmarks, and if you want to post them here, I'd be curious to see the difference. What games do you plan to play? You shouldn't have to make any software/driver changes when changing the CPU. I'd suggest overclocking right out of the box; it should do 4.2-4.4 GHz without any fine tweaking or voltage adjustments. If you have good cooling and want to spend more time tweaking, you could get 4.7-5 GHz max.
  10. Take down the bosses in catacombs, caves, and evergaols. Anytime you get enough runes to level up, use them so you don't lose them. I usually save a few golden runes to use if I'm not quite close enough to leveling up. You should be having an easier time, but the enemies will get tougher. Make sure your weapon is leveled up, and put most points into health to keep you alive.
  11. It'll get you by at 4k and hold its own really well if overclocked. A few games might be bottlenecked, but most won't. I could see pairing it up with a 4080, but the upcoming 4070 Ti seems more suited as the top-end GPU to go with it. If you're running lower resolutions, then cheaper cards would be better suited. @Spork3245 had one recently; he can give you more insight on his gaming experience with it. After looking at some more information, I think it makes the most sense to grab a 4770k and overclock it as best you can. Your best GPU options would be a 6700 XT, 6750 XT, or 6800. The Nvidia equivalents would also be fine: 3060 Ti or 3070.
  12. The economical upgrade is an i7 4770k. The max upgrade is an i7 5775c, probably 10-15% faster than a 4770. An RX 6800 (non-XT) would be a good choice for 4k gaming before getting too CPU-bottlenecked. Your computer would then have more GPU power than a PS5 but a bit less CPU performance.
  13. 4770k CPUs are like $40 on eBay, so that's not much extra cost. If you also need more RAM and a new power supply, then a prebuilt starts to look good, but it will still cost more. What's your motherboard model? There might be more CPU options.
  14. Yes, older i5 CPUs can hold performance back in newer games even at 4k. You can do a drop-in upgrade to an i7 CPU to get you by for a couple more years, but you'll have to buy used. An i7 4770 + RX 6700/6800 would be a solid upgrade.
  15. That's what I was saying. No need to spend more than the P5 if you don't get a heatsink, because the faster drive will thermal throttle.
  16. I'd definitely recommend a slightly slower SSD + heatsink for the PS5 rather than a faster SSD without one.
  17. Yeah, he can see a good-sized boost in sequential reads/writes (not much of a boost in random reads/writes), so long as he has a good platform to put the SSDs in. He'll need plenty of PCIe lanes from the CPU, so he should be looking at the prosumer chips from Intel and AMD, not the normal desktop/gamer ones. And he should definitely have a backup solution in place, given the increased chance of data loss. (There's a quick throughput sketch after this list.)
  18. No one should be supporting Nvidia's market practices with the 4080: a 50% price increase over the previous gen's comparable card, while manufacturing costs most likely decreased or at worst stayed the same. Full video.
  19. TechPowerUp (WWW.TECHPOWERUP.COM): Aftermarket cards + overclocking show some decent gains at 12% faster than stock, which is the performance I was expecting normally. The reference card is power-limited with only 2x 8-pins. It looks like AMD also has room for driver optimizations, as several reviewers have shown some games barely running faster than the RX 6950.
  20. Moonlight provides better latency than Steam Link, offers a higher bitrate, and doesn't require me to turn off the AI upscaling, so N-greed-ia can go F themselves over this decision.
  21. I was expecting about 10% better raster performance than reviews are showing. RT seems to be in line with AMD's claims, and they are still a generation behind Nvidia. The worst thing is the driver bugs: the card idles at 100+ watts. Unacceptable.
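For anyone curious about the CCD-to-CCD point in post 7, here's a minimal sketch of how you could measure it yourself (my own illustration, assuming Linux with gcc and pthreads; the core IDs in the code are hypothetical and depend on your CPU's topology, so check lscpu or lstopo first). Two threads get pinned to specific cores and ping-pong an atomic flag, so the average round trip approximates core-to-core latency:

```c
// Core-to-core latency sketch (build: gcc -O2 -pthread ping.c).
// Two threads pinned to chosen cores ping-pong an atomic flag;
// the average round-trip time approximates cross-core latency.
#define _GNU_SOURCE
#include <pthread.h>
#include <sched.h>
#include <stdatomic.h>
#include <stdio.h>
#include <time.h>

#define ROUNDS 1000000

static atomic_int flag = 0;

static void pin_to_core(int core) {
    cpu_set_t set;
    CPU_ZERO(&set);
    CPU_SET(core, &set);
    pthread_setaffinity_np(pthread_self(), sizeof(set), &set);
}

static void *ponger(void *arg) {
    pin_to_core(*(int *)arg);
    for (int i = 0; i < ROUNDS; i++) {
        // Wait for the ping, then answer it.
        while (atomic_load_explicit(&flag, memory_order_acquire) != 1)
            ;
        atomic_store_explicit(&flag, 0, memory_order_release);
    }
    return NULL;
}

int main(void) {
    int ping_core = 0;  // hypothetical: a core on CCD 0
    int pong_core = 8;  // hypothetical: a core on CCD 1 (or same CCD to compare)
    pthread_t t;
    pthread_create(&t, NULL, ponger, &pong_core);
    pin_to_core(ping_core);

    struct timespec a, b;
    clock_gettime(CLOCK_MONOTONIC, &a);
    for (int i = 0; i < ROUNDS; i++) {
        // Send the ping, then wait for the answer.
        atomic_store_explicit(&flag, 1, memory_order_release);
        while (atomic_load_explicit(&flag, memory_order_acquire) != 0)
            ;
    }
    clock_gettime(CLOCK_MONOTONIC, &b);
    pthread_join(t, NULL);

    double ns = (b.tv_sec - a.tv_sec) * 1e9 + (b.tv_nsec - a.tv_nsec);
    printf("avg round trip: %.1f ns\n", ns / ROUNDS);
    return 0;
}
```

Run it once with both threads on the same CCD and once across CCDs; on a dual-CCD chip the cross-CCD round trip should come out noticeably higher, which is exactly the penalty described above.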
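And for the sequential-vs-random distinction in post 17, a crude throughput check along these lines (again just a sketch under assumptions; a real benchmarking tool like fio is the better choice) can show the sequential gain from striping. Point it at a single drive and then at the array; note the OS page cache can inflate the number, so use a file larger than RAM or drop caches first:

```c
// Crude sequential-read throughput check (build: gcc -O2 seqread.c).
// Reads the given file in large blocks and reports MB/s.
#include <fcntl.h>
#include <stdio.h>
#include <stdlib.h>
#include <time.h>
#include <unistd.h>

#define BLOCK (4 * 1024 * 1024)  /* 4 MiB reads favor sequential speed */

int main(int argc, char **argv) {
    if (argc < 2) { fprintf(stderr, "usage: %s <file>\n", argv[0]); return 1; }
    int fd = open(argv[1], O_RDONLY);
    if (fd < 0) { perror("open"); return 1; }

    char *buf = malloc(BLOCK);
    long long total = 0;
    ssize_t n;
    struct timespec a, b;
    clock_gettime(CLOCK_MONOTONIC, &a);
    while ((n = read(fd, buf, BLOCK)) > 0)  // stream through the whole file
        total += n;
    clock_gettime(CLOCK_MONOTONIC, &b);

    double secs = (b.tv_sec - a.tv_sec) + (b.tv_nsec - a.tv_nsec) / 1e9;
    printf("%.0f MB/s sequential read\n", total / 1e6 / secs);
    free(buf);
    close(fd);
    return 0;
}
```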