Everything posted by cusideabelincoln

  1. Memory speed/overclocking will most likely matter more than CPU overclocking for gaming performance.
  2. AMD has definitely been downplaying the issue, and they aren't being pro-active about it. This should probably be a recall, and they should be contacting the people who might have gotten the bad batch of vapor chambers. @Zaku3 Might be a while before you can get a replacement. Haven't heard anything about these cards dying, so you'll just have to live with reduced performance. Could try to get a refund from AMD and buy an aftermarket card.
  3. 240 hz has spoiled me. I can't take sub 60 fps in any game so I'm glad there is a performance option. I'd rather look at a slightly less sharp image than a slideshow. After 5 minutes I think I like the movement and traversal the most. Hopefully that gets expanded as we unlock abilities, and that there will actually be interesting locations to explore.
  4. If you get the best PSU you can afford right now, I'm pretty confident it will last 10 more years. You won't be missing out on too many features because high-end power supplies mostly meet the more stringent power delivery requirements of the ATX 3.0 spec. And manufacturers should provide 16-pin cables for older modular power supplies; at the very least I know Corsair has guaranteed to sell new cables for their old PSUs. Your other option wouldn't be terrible, though. You can get a decent $100 (USD) unit to fulfill your needs right now, and simply get the best one when you upgrade your system later. In this scenario, at least you'll have a spare lying around in case anything happens. You could even take up Vic's offer to get you by. I used a 550W power supply with a 2080 Ti for a few years and only ran into one problem: for some reason Elden Ring caused it to trip the over-current protection and shut down the system when using it with an Intel 10600k. But I used it for years with an AMD 3700x and never had that happen. Your 9700k uses less power than both the 3700x and 10600k, so a 600W unit should be fine.
  5. The problem with the adapters has been figured out, and the same problem can occur with the new ATX 3.0 cable too. Users weren't fully inserting and latching the connector into the video card, although part of the reason is a poor design that requires a lot of force to do it. A fully inserted adapter or cable won't have issues, so if he doesn't want to wait too long then any current power supply will do fine. There might even be more of a delay as companies tweak the 12VHPWR cable to make it easier to insert and latch. 850W will be more than enough for stock use or minor tweaking. If you want to do major tweaking or overclocking of the CPU and GPU at the same time, 1000W would provide some safety margin. 1200W is not necessary for normal use.
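The wattage guidance above can be put into rough numbers with a minimal sizing sketch. The component wattages and headroom factors here are illustrative assumptions, not official figures; real transient spikes vary by card.

```python
# Rough PSU sizing sketch. Inputs and headroom factors are illustrative
# assumptions, not official specs from any vendor.
def recommend_psu(cpu_watts, gpu_watts, other_watts=50, headroom=1.1):
    """Pick the smallest common PSU size covering the load plus headroom."""
    target = (cpu_watts + gpu_watts + other_watts) * headroom
    for size in (550, 650, 750, 850, 1000, 1200, 1600):
        if size >= target:
            return size
    return 1600

# ~250 W CPU + ~450 W GPU at stock, then with extra margin for heavy tweaking:
print(recommend_psu(250, 450))                # 850
print(recommend_psu(250, 450, headroom=1.3))  # 1000
```

With a ~10% margin a 750 W combined load lands on 850W, and bumping the margin to ~30% for simultaneous CPU and GPU overclocking lands on 1000W, which matches the recommendation above.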
  6. Consider the Alienware AW3423DWF, the Freesync alternative that improves on the Gsync one. The Gsync module doesn't really add any benefit to an OLED display because the pixels offer the same response time regardless of the refresh rate. A Gsync module's main purpose is to fine-tune a traditional LCD's pixel response dynamically, per refresh rate, to offer the best motion clarity without introducing ghosting artifacts. It needs to do this because how hard the display pushes the LCD at one specific refresh rate may not work very well at a different refresh rate, which can lead to an inconsistent experience when using variable refresh rates. Some cases:
     https://ca.pcpartpicker.com/product/CrzhP6/lian-li-lancool-205-mesh-c-atx-mid-tower-case-g99oe764cx00am
     https://ca.pcpartpicker.com/product/bmstt6/lian-li-lancool-iii-atx-mid-tower-case-lancool-3-w
     https://ca.pcpartpicker.com/product/4cPQzy/lian-li-o11-dynamic-evo-atx-mid-tower-case-pc-o11dew
     https://ca.pcpartpicker.com/product/XMhFf7/lian-li-lancool-ii-mesh-c-performance-atx-mid-tower-case-lancool-ii-mesh-c-performance-x
     https://ca.pcpartpicker.com/product/PWn8TW/corsair-5000d-airflow-atx-mid-tower-case-cc-9011210-ww
     https://ca.pcpartpicker.com/product/FdZ9TW/thermaltake-view-300-mx-atx-mid-tower-case-ca-1p6-00m1wn-00
  7. The 700 series aged like warm milk and wasn't fully DX12-capable, so the 970 is just a better card than the 780 Ti now. The 3070 was a 50% improvement, and that is just standard. I'll just never consider it a great deal. It's a normal deal imo. And to answer a previous question: I am salty at these corporations.
  8. There should be a price creep, I just think it is out of control and the crypto inflation has given these companies the perfect cover to take advantage of us well into this gen. The 3070 wasn't a deal, at least not any more than any other 70 class card like the 970/1070 before it. The 30 series only looks like a good value proposition because the 20 series was a relatively small (raster) performance jump over the 10 series, because they added Tensor and RT cores. The 20 series was at least understandable; they were using the die space to add features, and die space costs money. When the 30 series comes along, they don't need to proportion out extra die space for RT and Tensor cores; every part of the die can just be beefed up, and thus they can make it look good relative to the 20 series with minimal extra cost per die area. It's a return to normalcy before the 20 series. Well, not a complete return to normalcy, because then Nvidia makes the 3070 Ti $100 more expensive whereas the previous generations' 70 refreshes (2070 Super, 1070 Ti) came in at the same price. Now the 4070 Ti comes along and just makes an absurd price jump, and the only added benefit is Frame Generation, which doesn't require nearly as much extra die space as entire cores. Nvidia, and AMD, are upcharging based on existing performance, not based on whatever manufacturing costs were added.
  9. Honestly I forgot how much the "halo" price inflated during the 20 series, as Nvidia launched the 20 series Titan at a ridiculous $3,000 just so they could release the 2080 Ti and make it seem like a deal. So I consider the 3070 being $500-600 a return to normalcy, as the previous x70 class cards were in the $400 market. In three generations Nvidia has doubled the price of a 70 class card: $400 1070 (Ti) -> $500 2070 (Super) -> $500 3070 / $600 3070 Ti -> $800 4070 Ti
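The MSRP progression quoted above works out like this; a quick sanity-check script using only the launch prices listed in the post:

```python
# Launch MSRPs (USD) as quoted in the post above.
msrps = [("1070", 400), ("2070", 500), ("3070", 500),
         ("3070 Ti", 600), ("4070 Ti", 800)]

# Step-by-step increase between successive cards in the list.
for (prev, p0), (cur, p1) in zip(msrps, msrps[1:]):
    print(f"{prev} -> {cur}: {100 * (p1 - p0) / p0:+.0f}%")

# Overall change across the three generations.
print(f"1070 -> 4070 Ti: {msrps[-1][1] / msrps[0][1]:.1f}x")
```

The individual steps are +25%, 0%, +20%, and +33%, compounding to exactly 2.0x, which is the doubling described above.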
  10. Quest removed the Facebook requirement, and you can set it up to be completely wireless to play PC games. It's a good cheap way to try out VR.
  11. You do have to credit Ngreedia's marketing for releasing the 4070 Ti the same day as CES. They knew it would get slammed and hoped the news would bury some of the negativity.
  12. This is basically how I feel about both companies, and I'm not even in the market right now.
  13. 20 series was also price inflated, but they did add RT and DLSS, two new features that are useful.
     1070 > 980 Ti
     970 >= 780 Ti
     770 > 680
     670 > 580
     In some generations even the 60 class card matched a previous 80 class. 4070 Ti is now priced as a flagship used to be but delivers what used to be midrange performance, relatively speaking. Edit: The 4070 is half the specs and performance of a 4090. Historically speaking, the 60 class cards used to be half the 80 class cards and the 70 class sat in-between. This generation has by far the worst inflation of MSRP.
  14. I wouldn't call the 3070 an insane deal; it was the norm for a 70 or 60 class card to match the previous gen 80 class card. Now all new gen cards are overpriced from both AMD and Nvidia.
  15. It can be situational and vary game to game, but a 7800X3D could potentially avoid the CCD-to-CCD latency penalty because it only has one CCD while the 79xx chips have two. So if a thread is bouncing around the CPU cores and happens to go from a core on one CCD to a core on another CCD, that takes longer than if the thread bounced to a different core on the same CCD. This happens because the data for that thread has to travel over the Infinity Fabric, which is how the chiplets communicate with each other. The Infinity Fabric is simply slower than the cache within a chiplet, and since gaming is one of the more bandwidth- and latency-sensitive uses for a CPU, there can be a performance drop. Good software programming, or BIOS updates, could help this issue though.
I haven't done much research into this since the launch of the 7000 series CPUs, but I noticed a handful of reviews showing the 7800X matching or slightly besting the 7900 in some, but not all, of their respective gaming tests. I also noticed this CCD-to-CCD issue with older chips in these same reviews, where a Ryzen 3700X was matching or slightly beating a 3950X even though there's a pretty big clock speed difference and the 3950X should never lose to the 3700X; strangely enough, when these CPUs launched it never did. So possible changes over the last few years could be game engines, Windows scheduling differences, or AMD microcode tweaks. Since it's very unlikely that games will heavily use more than 8 cores for the foreseeable future, the extra cores on the 79xx chips don't add enough benefit to make up for the latency penalty. With all of that said, the differences I saw were about 5% or less, and they can also come down to the quality of silicon you get, or be overcome by properly cooling and tweaking the 79xx CPUs.
It's also possible AMD improves the Infinity Fabric for the X3D chips, or it's just a non-issue because the large X3D cache means there's less data traveling over the Infinity Fabric to begin with. We just have to wait for reviews.
Edit: You have Intel now, right? It would definitely be interesting to see how the Intel 13900K paired with ultra-fast DDR5 memory stacks up to the X3D chips. AMD's memory wall is DDR5-6200 max overclock; Intel can go way beyond that.
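As a rough illustration of the CCD-to-CCD penalty described above, the sketch below pins two processes to chosen cores and times a message round trip between them. It assumes Linux (`os.sched_setaffinity`), and the core numbering is an assumption (on a 7900X, cores 0-5 typically sit on CCD0 and 6-11 on CCD1). A pipe round trip is dominated by kernel overhead, so this only gives a relative feel; dedicated tools measure core-to-core cache-line latency directly.

```python
# Hedged sketch: time a ping-pong between two processes pinned to specific
# cores. Linux-only (os.sched_setaffinity); core IDs are assumptions.
import multiprocessing as mp
import os
import time

def _pong(conn, core):
    # Pin this process to one core, then echo messages back until told to stop.
    os.sched_setaffinity(0, {core})
    while True:
        msg = conn.recv()
        if msg is None:
            break
        conn.send(msg)

def ping_pong_latency(core_a, core_b, iters=5000):
    """Average round-trip time in seconds between processes on two cores."""
    os.sched_setaffinity(0, {core_a})
    parent, child = mp.Pipe()
    p = mp.Process(target=_pong, args=(child, core_b))
    p.start()
    parent.send(0)   # warm-up round trip
    parent.recv()
    t0 = time.perf_counter()
    for _ in range(iters):
        parent.send(0)
        parent.recv()
    elapsed = time.perf_counter() - t0
    parent.send(None)  # tell the echo process to exit
    p.join()
    return elapsed / iters

if __name__ == "__main__":
    n = os.cpu_count() or 1
    # Assumed 7900X-style layout; clamp so this runs on any core count.
    same = ping_pong_latency(0, min(1, n - 1))   # assumed same-CCD pair
    cross = ping_pong_latency(0, min(8, n - 1))  # assumed cross-CCD pair
    print(f"same CCD:  {same * 1e6:.1f} us")
    print(f"cross CCD: {cross * 1e6:.1f} us")
```

On a dual-CCD part the cross-CCD pair should trend measurably slower; on a single-CCD part like the 7800X3D there is no cross-CCD pair to hit in the first place, which is the whole point above.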
  16. He's narrowed the issue down to some kind of flaw with the reference vapor chamber. More research is needed to tell a design flaw from a manufacturing defect. Absolutely crazy that orientation matters.
  17. Use that week to get some before/after benchmarks, and if you want to post them here I'd be curious to see the difference. What games do you plan to play? You shouldn't have to make any software/driver changes when changing the CPU. I'd suggest overclocking right out of the box. It should do 4.2-4.4 GHz without any fine tweaking or voltage adjustments. If you have good cooling and want to spend more time tweaking, you could get 4.7-5 GHz max.
  18. Take down the bosses in catacombs, caves, and evergaols. Any time you get enough runes to level up, use them so you don't lose them. I usually save a few golden runes to use if I'm not quite close enough to leveling up. You should be having an easier time, but the enemies will get tougher. Make sure your weapon is leveled up, and put most points into health to keep yourself alive.
  19. It'll get you by at 4k, and hold its own really well if overclocked. A few games might be bottlenecked, but most won't. I could see pairing it up to a 4080, but the upcoming 4070 Ti seems more suited as the top end GPU to go with it. If you're running lower resolutions, then cheaper cards would be better suited. @Spork3245 had one recently, he can give you more insight on his gaming experience with it. After looking at some more information, I think it makes the most sense to grab a 4770k and overclock it the best you can. And your best GPU options would be a 6700 XT, 6750 XT, or 6800. The Nvidia equivalents would also be fine: 3060 Ti or 3070.
  20. Economical upgrade is a i7 4770k. Max upgrade is an i7 5775c, probably 10-15% faster than a 4770. RX 6800 (non XT) would be a good choice for 4k gaming before getting too CPU bottlenecked. Your computer would then have more GPU power than a PS5 but a bit less CPU performance.
  21. 4770k CPUs are like $40 on ebay so that's not much extra cost. Maybe if you need more RAM and a new power supply then a prebuilt looks good, but will still cost more. What's your motherboard model? There might be more CPU options.