Everything posted by cusideabelincoln

  1. Yep, the cable is just power and 12VHPWR is the official name if you were wanting to search for one to buy.
  2. That won't work with the 4000 series. It's for the 3000 series, which uses a special 12-pin connector Nvidia came up with. You want to get a 12VHPWR cable, which (correcting myself) actually has 16 pins: 12 power pins + 4 sense pins. You might also see them called PCIe 5.0 connectors.
  3. Hmm, I did have mine set to headphones before and after the update, and then I switched it to studio reference and still wasn't happy. I just now decided to test all of them, and Night Mode sounds the best and has the most punch. I had already noticed some of the drastically changed sound effects when watching the pre-release trailers and streamers playing the beta. A lot of them kept saying how much better the sound effects were, but I really can't hear it even after playing the game for myself. The characters that sound like a significant "downgrade" are Roadhog and Pharah, followed by Soldier, Sombra, Ana, Ashe, Reaper, and Torb's right click. Some of the other characters have slightly different sound effects that I wouldn't say sound worse or better, just different, and the rest have little difference from the first game. But whatever they did to Pharah's rockets is just bad. It sounds like she's shooting air pellets; the explosion and propulsion have no weight, which boggles my mind because Junkrat's explosions sound fine. Every map, no matter the location, also sounds like you're shooting in an empty rocky canyon. So much echo.
  4. Completely safe. The card will simply limit you to the default 450W power limit, whereas other cards that come with a 4x8-pin adapter will allow up to a 600W power limit. But the 600W limit is completely pointless, as the card doesn't overclock high enough to use that much. Most overclocks I've seen, even with the 600W limit, only get the card to draw a little over 500W, and the performance gains are on average 5%. At stock the 4090 uses less power than the 3090 Ti anyway. If you were to get a new adapter, I would suggest looking into whether your current PSU manufacturer is selling (or is going to sell) a new 12-pin cable for your specific unit. Corsair is going to sell a new modular cable that will be compatible with most of their recent power supplies.
  5. I see both 4080 cards having absolutely zero value to anybody outside of the new features of Frame Generation and AV1 encoding. For any game without DLSS3, they won't be a significant upgrade over a 3090... hell the 12GB looks like it's going to be slower in a lot of games. And they are incredibly overpriced compared to the sales we are seeing and will see on 3000 cards. The 16GB just doesn't make any sense at its price either, when a few bucks more gets you substantially better performance. Somehow Nvidia has managed to make the 4090 a decent value.
  6. If I were Microsoft my argument would be the very existence of Activision as an entity is anti-competitive to the market.
  7. @stepee More latency tests in MS Flight Sim, F1 2022, and Cyberpunk, showing what I was trying to say. In MS Flight Sim, there's a hard CPU bottleneck, so native and DLSS provide the same framerate and thus similar latency. But frame generation adds latency over both, albeit not that much; you won't notice a 6ms increase with a controller. In F1 and Cyberpunk, there's no CPU bottleneck at native and mostly no bottleneck with DLSS. DLSS is thus able to increase the framerate and lower the latency. Frame generation adds latency over DLSS, but is still able to provide better latency than native (there's a rough arithmetic sketch of these relationships after these posts). I suspect the reason we don't see a doubling in performance in F1 and Cyberpunk like we do in Flight Sim is the GPU load: there is not as much GPU headroom for the AI processing cores to do their thing when the rest of the GPU is already fully utilized, compared to a CPU-bottlenecked game like Flight Sim where the GPU is stuck at half utilization. Now perhaps this can be improved with driver and programming updates, or perhaps there is just some bottleneck in the architecture of the chip, because most testing shows that DLSS + frame generation actually reduces power consumption, so power can't be the problem.
  8. Oh how I wish it were that simple. But, and this is my experience in a single game, I was noticing something weird when trying to cap the framerate in Overwatch 1. Whether capping the framerate below my monitor's refresh rate in the game or using Nvidia's drivers, I noticed an ever so slight choppiness. This happened with and without G-Sync. When I uncapped the framerate, the smoothness improved a little bit simply because of the massive increase in fps. Uncapping the framerate also gave me the lowest input lag. I really had to look for it, but the choppiness was ever so slightly noticeable. But when I turned G-Sync off and just used V-Sync, with a framerate that never dipped below my refresh rate, the game definitely looked noticeably smoother: no tearing, no visual jitter. But the input lag was definitely noticeably worse compared to either G-Sync or completely uncapped. It's really difficult to get both a completely smooth visual experience and extremely low latency. The closest way to achieve it is to simply have way more computational power than you need and brute-force a higher framerate.
  9. Well, all of this is going to be tricky and take some work. Latency is impacted by so many different variables that there is no simple answer when all of those variables can be used in different combinations. V-Sync, double buffering, triple buffering, G-Sync, in-engine framerate cap, driver framerate cap, in-engine render queue, driver render queue, refresh rate, power target, and boost frequency behavior all affect input lag to various degrees. Even the resolution you run at impacts latency; I've seen tests that capped everything at the same framerate, yet the lower resolution yielded (slightly) better latency than the higher one. And the same goes for DLSS and G-Sync: input lag is slightly increased when using them even if the framerate is the same with them off.
  10. As of right now, it should be mostly impossible to have a game run at 60 fps 4K native and also run at 60 fps 4K DLSS 3. That scenario can only happen if there is a very hard CPU/system bottleneck that's capping the framerate at 60. But, and I haven't seen this tested so I'm guessing, if this situation were to ever happen then DLSS 3 would add latency. But if you have a game that runs 60 fps at native 4K, it will run faster with DLSS enabled. For an arbitrary example, if you enable DLSS 2 then the game will run at 90 fps and you will see a reduction in latency purely from the reduction in frame render time, which hypothetically goes from roughly 16 ms per frame to 11 ms per frame. If you were to then run that game with DLSS 3, you could hypothetically see framerates up to 180 fps, but the latency will be similar to DLSS 2's 90 fps performance. So, yes to the last question: 90 fps native should provide the same or better latency than 90 fps using either version of DLSS. This was in Cyberpunk, so in a totally GPU-limited situation I would expect latency to behave similarly in any game that is GPU bound. But if a game is CPU limited, the situation is going to change. And as Crispy linked to above, capping the framerate either in-engine, in-driver, or through V-Sync can hurt latency. Right now you can't use G-Sync and DLSS 3; the driver doesn't allow it. I suspect it's because the driver isn't smart enough, yet, to determine which frame it should keep or throw away if you're producing way more frames than your display can handle. The solution that Nvidia will probably come up with in time, if you want to use a framerate cap or some type of V-Sync, is dynamic DLSS 3. Basically, if the GPU is powerful enough to render the scene as fast as or faster than your framerate cap, then you don't need to AI-generate any frames at all. So, like how dynamic resolution scaling works in a lot of games (console games mostly), the GPU would only AI-generate a frame if the real render time is longer than the frame time your cap allows (there's a small sketch of that logic after these posts).
  11. Some DLSS 3 latency tests (and detailed power testing): NVIDIA GeForce RTX 4090 Founders Edition 24GB Review - Drink less, work more! (WWW.IGORSLAB.DE) TLDR: Adds a little bit more latency compared to DLSS 2, but improves latency over native resolution rendering. Uses less power than a 3090 Ti; the power fear-mongering pre-launch was not really warranted.
  12. These results are definitely surprising; however, the past generation was also basically overclocked out of the box. The 3090 Ti and basically all AMD cards have maxed-out power limits. Nvidia probably saw the reception of the 3090 Ti and decided it's best to squeeze everything out of the chip. The 3090 launched with a 350-375W power limit and that's what most reviewers use when benchmarking; however, the Ti defaulted to a 450W power limit, and that's definitely one of the reasons the performance gap between the 3090 Ti and 3090 is wider than the gap between the 3080 Ti (which also defaults to under 400W) and the 3090. I really have no doubt they did this to sell a more premium product and to try to justify a higher MSRP and higher profit margins. I am definitely interested to see how different boards perform. And when the curve optimizer is functional there could be a lot of room to tweak performance. I still haven't come across any review showing in-depth overclocking and voltage tuning.
  13. Should be simple to transfer your key if you signed into Windows with a Microsoft account (Outlook, Hotmail, etc.). The Windows installer is also really bad about formatting partitions and drives, particularly if they have been used with another system before. The last few times I tried to install Windows 10/11 onto a mechanical hard drive, I had to format the drive in a separate computer before Windows would let me install to it. You may also want to try using different SATA cables with your drives, in case a bad cable is responsible for the corruption.
  14. I honestly think the audio rework and visual style are a downgrade from the first game. The audio sounds like shit to me. A few sound effects do seem better and have more punch, but most other effects sound tinny. Some weapons sound like plastic Nerf guns to me, and they added an overall reverb/echo that ruins some of the distinctness abilities had in the first game. Visually there are just some pointless changes, like the time of day of some maps. They had a map called Night Market in OW1, but they changed it to daylight lighting and then had to change the name of the map. Like, why even do that? And I'm not a fan of the added bloom from the new lighting system, which washes out any individual style the maps had from one another.
  15. I honestly don't think it's going to be a problem to get one of these things. If not at launch, a week later or something.
  16. If this will be strictly a gaming PC, OLED does seem like the way to go. They won't be good if you game in a brightly lit room, but the contrast ratio makes up for their lack of brightness in dim or dark rooms. List of OLED monitors on the market | OLED-Info (WWW.OLED-INFO.COM)
  17. Yesterday I kept trying to log in and was greeted with 2,000 people ahead of me in queue, and after 5 minutes it would just time out and say error. But then I decided to change my region from America to Europe and joined the game instantly. Funnily enough, when I played matches my pings were completely normal for America, even though I usually get 100+ ping to Euro servers.
  18. Shhhh, don't give Nvidia more ideas on how to monopolize the market.
  19. I'm very much interested in seeing how driver updates improve performance over time. AMD was able to squeeze more performance out of GCN and RDNA after they first released. With the amount of transistors Intel has in these GPUs, there's potential for massive improvements.
  20. A lot of monitors will claim HDR support, but there are only a few where the HDR actually looks better, and you'll be paying a price premium for them. Most monitors, practically every budget and mid-range option, simply do not have the contrast and brightness capabilities to properly display HDR content.
  21. I highly recommend doing a clean install of Windows 11. I first did the upgrade from 10 to 11, keeping my files, and had weird issues that I couldn't fix. Most notably, Windows Security just wouldn't open, even though I think it was still functioning in the background. But after a clean install everything works fine, although compared to previous editions of Windows, Microsoft has made a lot of things harder to get to.
  22. It could do that. It depends on how you're capping the frame rate. Afaik any modern CPU can push over 60 fps in all games besides sim-type games, so there's no need to reduce load. I wonder how DLSS 3 would handle a frame rate cap: which frames would it decide to keep and which to throw away.
  23. GPUs basically use this same algorithm. Transistors at lower temperatures need less power, so the algorithm will boost more because there's more power headroom left before hitting the limit (there's a toy model of this after these posts). So go full custom on both CPU and GPU. I definitely like this algorithm over their first versions. I cannot get my 3700X to even meet the advertised frequency, let alone overclock past it. I've only seen 1 core even get close to max boost; most are 100 MHz lower. I got one really shit piece of silicon. Luckily my 5900X can boost 200 MHz over the advertised max, so that improvement is appreciated. But now I'm happy they are guaranteeing the advertised clocks as the minimum achieved, instead of the maximum achievable like in the 3000 days. We can thank Intel for this improvement. AMD could have sold these CPUs at 105W, sacrificed maybe 5% performance, but they would have looked much worse against Intel if they did.
  24. I would put it as DLSS 3 bypassing system bottlenecks. It requires the same system resources as running the game at a lower resolution, but sends twice the frames to your monitor.
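
A rough way to see the latency relationships in the DLSS latency posts above is to work through the frame-time arithmetic. This is a minimal sketch under made-up assumptions: the 60/90 fps figures, the fixed pipeline overhead, and the frame-generation cost are illustrative numbers, not measurements from any review.

```python
# Illustrative frame-time arithmetic for native vs. DLSS 2 vs. DLSS 3 (frame generation).
# All numbers are hypothetical; real latency also depends on render queue, sync mode, etc.

def frame_time_ms(fps: float) -> float:
    """Convert a framerate into per-frame render time in milliseconds."""
    return 1000.0 / fps

PIPELINE_OVERHEAD_MS = 10.0   # assumed fixed input/display overhead, same in every mode
FRAME_GEN_COST_MS = 3.0       # assumed extra cost of interpolating/pacing generated frames

# Native 4K: 60 real frames per second.
native_fps = 60
native_latency = frame_time_ms(native_fps) + PIPELINE_OVERHEAD_MS           # ~26.7 ms

# DLSS 2: lower internal render resolution, so more real frames, e.g. 90 fps.
dlss2_fps = 90
dlss2_latency = frame_time_ms(dlss2_fps) + PIPELINE_OVERHEAD_MS             # ~21.1 ms

# DLSS 3: an AI-generated frame between real frames roughly doubles displayed fps,
# but input is still sampled at the real (DLSS 2) framerate, plus the generation cost.
dlss3_displayed_fps = dlss2_fps * 2                                          # ~180 fps shown
dlss3_latency = frame_time_ms(dlss2_fps) + FRAME_GEN_COST_MS + PIPELINE_OVERHEAD_MS

print(f"native : {native_fps:3d} fps shown, ~{native_latency:.1f} ms latency")
print(f"DLSS 2 : {dlss2_fps:3d} fps shown, ~{dlss2_latency:.1f} ms latency")
print(f"DLSS 3 : {dlss3_displayed_fps:3d} fps shown, ~{dlss3_latency:.1f} ms latency")
```

Under these assumptions frame generation shows the most frames but sits slightly above DLSS 2 in latency, while still landing below native, which is the ordering those posts describe.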
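The "dynamic DLSS 3" idea in the framerate-cap post above boils down to a single comparison: only generate a frame when real rendering can't keep up with the cap. Below is a minimal sketch of that decision; the function name and structure are my own illustration, not anything Nvidia has announced.

```python
# Hypothetical decision logic for a "dynamic frame generation" mode:
# only AI-generate a frame when real rendering is too slow to hit the cap.

def should_generate_frame(last_render_time_ms: float, fps_cap: float) -> bool:
    """Return True if an AI-generated frame is needed to sustain the cap."""
    target_frame_time_ms = 1000.0 / fps_cap      # frame-time budget the cap allows
    return last_render_time_ms > target_frame_time_ms

# Example with a 120 fps cap (about an 8.3 ms budget):
print(should_generate_frame(last_render_time_ms=6.9, fps_cap=120))   # False: GPU keeps up on its own
print(should_generate_frame(last_render_time_ms=14.0, fps_cap=120))  # True: insert a generated frame
```

This mirrors how dynamic resolution scaling works: the expensive trick is only applied when the real render time exceeds the budget.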
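The boost behavior described in the CPU/GPU boost post above is opportunistic: cooler silicon draws less power at a given clock, so the algorithm keeps stepping the clock up until it hits the power limit. The toy model below only illustrates that relationship; the power formula, coefficients, and clock ranges are invented for the example and are not AMD's or Nvidia's actual tables.

```python
# Toy model of opportunistic boosting: raise the clock while estimated power
# stays under the limit. Cooler silicon needs less power, so it boosts higher.

def estimated_power_w(clock_mhz: float, temp_c: float) -> float:
    """Made-up power model: draw grows with clock (dynamic) and temperature (leakage)."""
    return 30.0 + 2.8e-6 * clock_mhz ** 2 + 0.5 * temp_c

def boost_clock_mhz(temp_c: float, power_limit_w: float,
                    base_clock: float = 3400.0, max_clock: float = 5000.0,
                    step: float = 25.0) -> float:
    """Step the clock up until the next step would exceed the power limit or max clock."""
    clock = base_clock
    while clock + step <= max_clock and estimated_power_w(clock + step, temp_c) <= power_limit_w:
        clock += step
    return clock

# Same power limit, different temperatures: the cooler chip ends up boosting higher.
print(boost_clock_mhz(temp_c=60.0, power_limit_w=120.0))   # ~4625 MHz in this toy model
print(boost_clock_mhz(temp_c=85.0, power_limit_w=120.0))   # ~4100 MHz in this toy model
```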