Everything posted by cusideabelincoln

  1. I think the tolerance for how much you can bend this connector is much tighter because it's so small, which is why more people are running into issues doing the same things we've all been doing with the 8-pin connector for years. And personally, it's just a poorly thought-out design. While the 8-pin connector was extremely overbuilt, Nvidia overdid the shrinking and left little room to use it in real-world, less-than-ideal conditions. You've simply been more careful and haven't bent the wires far enough for this issue to happen. The wires connect to a metal jacket, and the jacket slots over the pins on the card. There is a seam on the jacket, so when you bend the wires a certain way while the connector is on the pins you can potentially spread it apart, and since this connector is smaller than the 8-pin one, it's much easier to do that.
  2. Oh dear, more people are reporting melted power connectors. If any of you have to unplug your connector, examine the adapter before inserting it back into the card. One theory floating around is that when the cable gets bent too sharply close to the connector, the metal jackets in the connector can spread open and no longer make good contact with the power pins on the card. That raises the resistance at the contact, and the connector gets hot enough to melt plastic. (There's a quick back-of-the-envelope version of this below, after the last post.)
  3. My theories: 1. Early BIOS issues; might have to wait for an update. I'm assuming you've already reset the BIOS/cleared CMOS. 2. The CPU is not making good contact with the socket. The stock retention bracket is known to be inadequate for LGA1700 and puts uneven pressure on the CPU. Look up LGA1700 contact frames to get a replacement that may solve this issue; even if this isn't the cause, a contact frame will also lower your temps by ~5 degrees. 3. Defective CPU.
  4. From the limited analyses I've seen, the biggest hurdle to improving the garbled image will be HUD elements, especially transparent ones like waypoint markers. The blocky, imperfect AI-generated frame mostly helps motion clarity because it exists between two good frames and 95% of that frame's pixels are constantly changing anyway. HUD elements, however, are fairly static and typically sit in the same pixel locations every frame, so when the AI-generated frame doesn't get the HUD right you notice flicker. I don't see an easy fix for any HUD element that is also transparent because of the way frame generation works now - it needs the entire frame already rendered. They could tell the algorithm to basically "crop" the portions of the frame with HUD elements and keep those pixels unchanged between normal frames (see the rough sketch below, after the last post), but then there may be noticeable jitter if those pixels were supposed to change because of the transparency. Even so, that could be less noticeable than the AI trying to guess those pixels. The next evolution of this tech would be to implement frame generation into the game engine, so the developer can tell the algorithm what is and is not a HUD element. Otherwise, Nvidia will have to tune the algorithm on a per-game basis to identify the HUD elements. A fascinating comparison I would love to see is how Nvidia's AI stacks up against HDTV motion interpolation, strictly in terms of motion clarity. I have no idea how good smart TVs have gotten since my 4K TV does a shit job at motion interpolation, so this could be a good way to judge how much improvement Nvidia has left to go.
  5. "Consoles need this" is the first thought when they announced it, but we should also see how it functions and performs on lower tier GPUs before putting it into everything. If developers can't even hit 30 fps as a baseline for Frame Generation, it may look like ass when pumped up to 50-60 fps.
  6. I think the time jumps in the show are probably my main issue, since we don't get enough character investment. Alicent's transition is basically skipped, left up to our imaginations. One episode we get a calm, calculated, and likeable young Alicent, and then we jump years ahead to a high-strung adult Alicent. Olivia Cooke's portrayal is also unsettling overall; even in the calm and empathetic moments she always looks highly anxious. The time jumps also made every episode feel formulaic until the King died: someone is born (and we get to see it!), someone dies, people argue over who should be heir to X location, and throw in the surprise gore scene. I'm glad we can actually move on and get a story now, and I also hope that means we get to spend more time in other locations besides the Red Keep.
  7. They should lean more into selling Alicent's manic side. The only scene we've gotten that shows her as crazy is when she slashed Rhaenyra. Every other scene is empathetic, and that's why I'm not buying into her dumb decision-making.
  8. The show is willing to throw away consistent logic and motivations in favor of depicting a "bet you didn't see that coming" scene (usually involving gore) in just about every episode. Criston Cole should not be alive, let alone a member of the Kingsguard. Alicent pleaded desperately to make sure every member of her late husband's family could survive, yet she is pushing the one agenda that will guarantee a war. And she also tries to persuade the once-should-be Queen away from supporting her own niece, who has been named heir and Queen.
  9. Well, I already see the effects of this in the latest ad. Holy shit, deals are actually twice as bad as they were a few weeks ago.
  10. That's hilarious, but I also find it so very comforting that we were able to bitch enough to actually get them to change it. I wonder how many cardboard boxes AIBs will have to throw away now. We also shouldn't have let Nvidia get away with this bullshit in the past. In recent history, I think they started pulling this malarkey with the GTX 1060 6GB vs 3GB, which had different specs beyond RAM. There was a variant of the RTX 2060 that used a different chip, but at least they called it a KO, and then there was a 12GB version that had different specs. The 3080 12GB is closer in specs to a 3080 Ti than to a 3080 10GB. So fucking please, if the card has different specs, call it something else, Nvidia. There are more than 5 numbers you can use, and more than 2 letters!
  11. Heard there is an issue with accurate tracking updates due to severe time dilation caused by these massive cards.
  12. PCIe 5 should actually refer to the PSU's ability to output power. It's a more stringent set of requirements to meet for the new CPUs and GPUs that draw higher power and have bigger transient loads. That's why you see people conflating the names; it's basically a shortcut in communication because 12VHPWR is kind of a shitty name to say lol, and any PSU that meets the PCIe 5.0 spec will have the new connector anyway.
  13. Yep, the cable is just power, and 12VHPWR is the official name if you want to search for one to buy.
  14. That won't work with the 4000 series. It's for the 3000 series, which uses a special 12-pin connector Nvidia came up with. You want to get a 12VHPWR cable, which (correcting myself) actually has 16 pins: 12 power pins + 4 sense pins. You might also see them called PCIe 5.0 connectors.
  15. Hmm, I did have mine set to headphones before and after the update, and then I switched it to studio reference and still wasn't happy. I just now decided to test all of them, and Night Mode sounds the best and has the most punch. But I had already noticed some of the drastically changed sound effects when watching the pre-release trailers and streamers playing the beta. A lot of them kept saying how much better the sound effects were, but I really can't hear it even after playing the game for myself. The characters that sound like a significant "downgrade" are Roadhog and Pharah, followed by Soldier, Sombra, Ana, Ashe, Reaper, and Torb's right click. Some other characters have slightly different sound effects that I wouldn't say sound worse or better, just different, and the rest have little difference from the first game. But whatever they did to Pharah's rockets is just bad. It sounds like she's shooting air pellets; the explosion and propulsion have no weight, which boggles my mind because Junkrat's explosions sound fine. Every map, no matter the location, also sounds like you're shooting in an empty rocky canyon. So much echo.
  16. Completely safe. The card will simply limit you to the default 450 W power limit, whereas other cards with 4x8-pin will allow up to a 600 W power limit. But the 600 W limit is completely pointless, as the card doesn't overclock high enough to use that much. Most overclocks I've seen, even with the 600 W limit, only get the card to draw a little over 500 W, and the performance gains are around 5% on average. At stock the 4090 uses less power than the 3090 Ti anyway. If you were to get a new adapter, I would suggest looking into whether your current PSU manufacturer is selling (or is going to sell) a native 12VHPWR cable for your specific unit. Corsair is going to sell a new modular cable that will be compatible with most of their recent power supplies.
  17. I see both 4080 cards having absolutely zero value to anybody outside of the new features of Frame Generation and AV1 encoding. For any game without DLSS 3, they won't be a significant upgrade over a 3090... hell, the 12GB looks like it's going to be slower in a lot of games. And they are incredibly overpriced compared to the sales we are seeing and will see on 3000 cards. The 16GB doesn't make any sense at its price either, when a few hundred bucks more gets you substantially better performance. Somehow Nvidia has managed to make the 4090 a decent value.
  18. If I were Microsoft my argument would be the very existence of Activision as an entity is anti-competitive to the market.
  19. @stepee More latency tests in MS Flight Sim, F1 2022, and Cyberpunk, showing what I was trying to say. In MS Flight Sim there's a hard CPU bottleneck, so native and DLSS provide the same framerate and thus similar latency. Frame generation adds latency over both, albeit not much; you won't notice a 6 ms increase with a controller. In F1 and Cyberpunk there's no CPU bottleneck at native and mostly no bottleneck with DLSS, so DLSS is able to increase the framerate and lower the latency. Frame generation adds latency over DLSS, but still provides better latency than native. I suspect the reason we don't see a doubling in performance in F1 and Cyberpunk like we do in Flight Sim is GPU load: there isn't as much headroom for the AI processing cores to do their thing when the rest of the GPU is already fully utilized, compared to a CPU-bottlenecked game like Flight Sim where the GPU is stuck at half utilization. Perhaps this can be improved with driver and programming updates, or perhaps there is just some bottleneck in the architecture of the chip, because most testing shows that DLSS + frame generation actually reduces power consumption, so power can't be the problem.
  20. Oh, how I wish it were that simple. But, and this is my experience in a single game, I was noticing something weird when trying to cap the framerate in Overwatch 1. Whether I capped the framerate below my monitor's refresh rate in the game or through Nvidia's drivers, I noticed an ever so slight choppiness. This happened with and without Gsync. When I uncapped the framerate, the smoothness improved a little bit simply because of the massive increase in fps, and uncapping also gave me the lowest input lag. I really had to look for it, but the choppiness was just barely noticeable. But when I turned Gsync off and just used Vsync, with a framerate that never dipped below my refresh rate, the game looked noticeably smoother: no tearing, no visual jitter. The input lag, however, was noticeably worse compared to either Gsync or completely uncapped. It's really difficult to get both a completely smooth visual experience and extremely low latency. The closest way to achieve this is to simply have way more computational power than you need and brute-force a higher framerate.
  21. Well, all of this is going to be tricky and take some work. Latency is impacted by so many different variables that there is no simple answer when those variables can be used in different combinations. Vsync, double buffering, triple buffering, Gsync, in-engine framerate caps, driver framerate caps, the in-engine render queue, the driver render queue, refresh rate, power target, and boost frequency behavior all affect input lag to various degrees. Even the resolution you run impacts latency; I've seen tests where everything was capped at the same framerate yet the lower resolution still yielded (slightly) better latency than the higher resolution. The same goes for DLSS and Gsync: input lag is slightly increased when using them even if the framerate is the same as with them off.
  22. As of right now, it should be mostly impossible to have a game run at 60 fps 4K native and also run at 60 fps 4K with DLSS 3. That scenario can only happen if there is a very hard CPU/system bottleneck capping the framerate at 60. But, and I haven't seen this tested so I'm guessing, if that situation were ever to happen then DLSS 3 would add latency. If you have a game that runs at 60 fps at native 4K, it will run faster with DLSS enabled. For an arbitrary example, if you enable DLSS 2 then the game will run at 90 fps, and you will see a reduction in latency purely from the reduction in frame render time, hypothetically going from about 16 ms per frame to about 11 ms per frame. If you were to then run that game with DLSS 3, you could hypothetically see framerates up to 180 fps, but the latency will be similar to DLSS 2's 90 fps performance (rough numbers in the sketch below, after the last post). So, yes to the last question: 90 fps native should provide the same or better latency than 90 fps using either version of DLSS. This was in Cyberpunk, so in a totally GPU-limited situation I would expect latency to behave similarly in any game that is GPU bound. But if a game is CPU limited, the situation is going to change. And as Crispy linked above, capping the framerate in-engine, in the driver, or through Vsync can hurt latency. Right now you can't use Gsync and DLSS 3; the driver doesn't allow it. I suspect it's because the driver isn't smart enough yet to determine which frame it should keep or throw away if you're producing way more frames than your display can handle. The solution Nvidia will probably come up with in time, if you want to use a framerate cap or some type of Vsync, is dynamic DLSS 3: if the GPU is powerful enough to render the scene as fast as or faster than your framerate cap, then you don't need to AI-generate any frames at all. So, like how dynamic resolution scaling works in a lot of (mostly console) games, the GPU would only AI-generate a frame if the render time is longer than the frame time of the framerate cap.
  23. Some DLSS 3 latency tests (and detailed power testing): "NVIDIA GeForce RTX 4090 Founders Edition 24GB Review - Drink less, work more!" (WWW.IGORSLAB.DE). TLDR: Adds a little more latency compared to DLSS 2, but improves latency over native-resolution rendering. Uses less power than a 3090 Ti; the pre-launch power fear-mongering was not really warranted.
  24. These results are definitely surprising; however, the past generation was also basically overclocked out of the box. The 3090 Ti and basically all AMD cards have maxed-out power limits. Nvidia probably saw the reception of the 3090 Ti and decided it was best to squeeze everything out of the chip. The 3090 launched with a 350-375 W power limit and that's what most reviewers use when benchmarking, whereas the Ti defaulted to a 450 W power limit, and that's definitely one of the reasons the performance gap between the 3090 Ti and 3090 is wider than the gap between the 3080 Ti (which also defaults to <400 W) and the 3090. I have no doubt they did this to sell a more premium product and to try to justify a higher MSRP and higher profit margins. I am definitely interested to see how different boards perform, and when the curve optimizer is functional there could be a lot of room to tweak performance. I still haven't come across any review showing in-depth overclocking and voltage tuning.
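
A quick back-of-the-envelope sketch for the melted-connector theory in post 2 above. The numbers are assumptions for illustration only (600 W at 12 V spread across six power pins, and a contact resistance that rises from a few milliohms to tens of milliohms when a terminal spreads open), not measurements:

```python
# Back-of-the-envelope heating estimate for a 12VHPWR connector.
# Assumed figures for illustration: 600 W total load at 12 V, shared
# across 6 power pins, with per-contact resistance rising from
# ~5 mOhm (good contact) to ~50 mOhm (spread/loose terminal).

TOTAL_POWER_W = 600.0
VOLTAGE_V = 12.0
POWER_PINS = 6

current_per_pin = (TOTAL_POWER_W / VOLTAGE_V) / POWER_PINS  # ~8.3 A per pin

for label, resistance_ohm in [("good contact", 0.005), ("spread terminal", 0.050)]:
    # Joule heating at the contact: P = I^2 * R
    dissipated_w = current_per_pin ** 2 * resistance_ohm
    print(f"{label}: ~{dissipated_w:.1f} W dissipated in a single pin")

# With these made-up numbers, a good contact wastes a fraction of a watt
# while a degraded one dissipates a few watts -- concentrated in a tiny
# plastic housing, which is the "bent cable -> loose terminal -> melted
# connector" theory in a nutshell.
```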
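
A minimal sketch of the HUD "crop" idea from post 4. The frame arrays and hud_mask here are made-up placeholders; real frame generation runs on the GPU inside the driver, so this only illustrates the compositing logic:

```python
import numpy as np

# Hypothetical frames: H x W x 3 arrays of pixel colors.
H, W = 1080, 1920
real_frame = np.random.rand(H, W, 3)       # last fully rendered frame
generated_frame = np.random.rand(H, W, 3)  # imperfect AI-interpolated frame

# Hypothetical HUD mask: True where a HUD element lives (here, a box in
# the top-left corner standing in for a waypoint marker or health bar).
hud_mask = np.zeros((H, W), dtype=bool)
hud_mask[40:120, 40:400] = True

# "Crop" approach: keep HUD pixels unchanged between real frames by
# pasting them from the real frame into the generated one.
composited = generated_frame.copy()
composited[hud_mask] = real_frame[hud_mask]

# Trade-off noted in the post: if the HUD is transparent and the scene
# behind it moved, these pasted pixels are one frame stale, which can
# read as jitter instead of flicker.
```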
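
And a rough version of the frame-time arithmetic from posts 19 and 22. The fps figures are the same hypothetical ones used in post 22, and the 6 ms frame-generation overhead is the ballpark from post 19; the point is that latency tracks the rendered framerate, not the displayed one:

```python
# Latency roughly follows the frame time of the frames the GPU actually
# renders; DLSS 3 frame generation only doubles how many get displayed.

def frame_time_ms(fps: float) -> float:
    """Milliseconds per frame at a given framerate."""
    return 1000.0 / fps

FRAME_GEN_OVERHEAD_MS = 6.0  # assumed ballpark, per the tests cited in post 19

scenarios = [
    # (label, rendered fps, frame generation on?)
    ("Native 4K",              60, False),
    ("DLSS 2",                 90, False),
    ("DLSS 3 (2 + frame gen)", 90, True),
]

for label, rendered_fps, frame_gen in scenarios:
    displayed_fps = rendered_fps * 2 if frame_gen else rendered_fps
    latency_ms = frame_time_ms(rendered_fps) + (FRAME_GEN_OVERHEAD_MS if frame_gen else 0.0)
    print(f"{label}: ~{displayed_fps} fps displayed, ~{latency_ms:.1f} ms latency floor")

# With these made-up numbers: native ~16.7 ms, DLSS 2 ~11.1 ms, DLSS 3
# ~17.1 ms -- frame gen shows ~180 fps but feels closer to the 90 fps it
# actually renders, which is what post 22 is getting at.
```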