Everything posted by cusideabelincoln

  1. Gotham Knights should get better the more you play it. It has to build a shader cache the first time it renders anything and you can't brute force past it lol. But you should go into the Nvidia control panel and increase the shader cache size.
  2. Looks like a 3-way provided by Nvidia; perhaps the new 3-way adapters use the same shitty manufacturing as the 4-way. There seem to be two big theories out there, perhaps working in tandem: 1. Poor soldering where the wire attaches to the terminals. 2. A cheaper terminal that has two splits in it where the spec calls for only one, which makes it easier to bend the terminal when inserting it onto the pins and gives you a bad connection. Dang, if I were you guys I'd examine your adapters. And when you insert it, do it straight on. I definitely wouldn't put one side of the connector in and then line up the other side before finally pushing it on. 100% straight.
  3. All of these Free Speech chuds are championing Musk buying Twitter, and he wants to copy China
  4. EGS is the absolute worst launcher of them all. EA's may be buggy, but you can do a lot more with it.
  5. What I love is how I'm as easily compelled by the Empire's side of the story (ISB) as I am the Rebellion's.
  6. I definitely wouldn't use one of those 4-way adapters after seeing how it's designed now. @stepee @Mr.Vic20 Throw them out. The 3-way is fine, or a native modular cable straight to the PSU will be fine.
  7. It was an act of honest necessity. Jensen ran out of gas for his oven and he needed another heat source!
  8. Finally, I've been wanting to see what the inside of these adapters looks like. The presence of solder is definitely the reason you shouldn't bend the adapters aggressively. A native 12VHPWR connector should be fine, as the wires in it should be clamped down with force over the pins. And the reason we see melting mostly on the outer pins can certainly be explained by improper current balancing, since it's possible for the entire current of one 8-pin cable to be forced through just a single pin on the 12VHPWR connector, which could be made even worse if that pin gets damaged from bending or isn't properly seated (rough per-pin numbers in the first sketch after this list). Now, is this some supplier cheaping out, or did Nvidia know about and approve of this design?
  9. The only logical explanation is the ghost of Fermi was tired of being the joke of Geforce thermals.
  10. I think the tolerance for how much you can bend this connector is much tighter because it is so small, which is why more people are running into issues doing the same things we have all been doing with the 8-pin connector for years. And personally, I think it's just a poorly thought-out design. While the 8-pin connector was extremely overbuilt, Nvidia overdid shrinking this one, leaving little room to use it in real-world, un-ideal conditions. You've just been more careful and haven't bent the wires enough for this issue to happen. The wires connect to this, and this slots over the pins on the card. There is a seam on the jacket, so when you bend the wires a certain way while the connector is on the pins you can potentially spread it apart, and since this connector is smaller than the 8-pin one it's much easier to do that.
  11. Oh dear, more people are reporting melted power connectors. If any of you have to unplug your connector, examine the adapter before reinserting it into the card. One theory floating around is that when the cable gets bent too much close to the connector, it can cause the metal jackets in the connector to spread open, so they won't make good contact with the power pins on the card. That raises the resistance, and the connector gets hot enough to melt plastic (see the quick I²R math after this list).
  12. My theories: 1. Early BIOS issues; you might have to wait for an update. I'm assuming you already reset the BIOS/cleared CMOS. 2. The CPU is not making good contact with the socket. The stock retention bracket is known to be inadequate for LGA1700 and puts uneven pressure on the CPU. Look up LGA1700 contact frames for a replacement that may solve this issue; even if this isn't the cause, a contact frame will also lower your temps by ~5 degrees. 3. Defective CPU.
  13. From the limited analyses I've seen, the biggest hurdle to improving the garbled image will be HUD elements, especially transparent ones like waypoint markers. The blocky, imperfect AI-generated frame mostly improves motion clarity because it exists between two good frames and 95% of that frame's pixels are constantly changing. HUD elements, however, are fairly static and typically sit in the same pixel locations every frame, so when the imperfect AI-generated frame doesn't get the HUD right you notice flicker. I don't see an easy fix for any HUD element that is also transparent because of the way frame generation works now - needing the entire frame already rendered. They could tell the algorithm to basically "crop" the portions of the frame with HUD elements and keep those pixels unchanged between normal frames (toy sketch of this idea after this list), but then there may be noticeable jitter if those pixels were supposed to change due to the transparency. Even so, that could be less noticeable than the AI trying to guess those pixels anyway. The next evolution of this tech would be to implement frame generation in the game engine, so the developer can tell the algorithm what is and is not a HUD element. Otherwise, Nvidia will have to tune the algorithm on a per-game basis to recognize the HUD elements. A fascinating comparison I would love to see is how Nvidia's AI stacks up against some HDTVs' motion interpolation, strictly in terms of motion clarity. I have no idea how good smart TVs have gotten, since my 4K TV does a shit job at motion interpolation, but this could be a good way to judge how much improvement Nvidia has left to make.
  14. "Consoles need this" is the first thought when they announced it, but we should also see how it functions and performs on lower tier GPUs before putting it into everything. If developers can't even hit 30 fps as a baseline for Frame Generation, it may look like ass when pumped up to 50-60 fps.
  15. I think the time jumps in the show are probably my main issue, since we don't get enough character investment. Alicent's transition is basically skipped, left up to our imaginations. One episode we get a calm, calculated, and likeable young Alicent, and then we jump years ahead to a high-strung adult Alicent. Olivia Cooke's portrayal is also unsettling; even in the calm and empathetic moments, she always looks highly anxious. The time jumps also made every episode feel formulaic until the King died: someone is born (and we get to see it!), someone dies, people argue over who should be heir to X location, and throw in the surprise gore scene. I'm glad we can actually move on and get a story now, and I also hope that means we get to spend more time in other locations besides the Red Keep.
  16. They should lean more into selling Alicent's manic side. The only scene we've gotten that shows her as crazy is when she slashed Rhaenyra. Every other scene is empathetic, and this is why I'm not buying into her dumb decision-making.
  17. The show is willing to throw away consistent logic and motivations in favor of depicting a "bet you didn't see that coming" scene (usually involving gore) in just about every episode. Criston Cole should not be alive, let alone a member of the Kingsguard. Alicent pleaded desperately to make sure all members of her late husband's family would survive, yet she is pushing the one agenda that will guarantee a war. And she also tries to persuade the once-should-be Queen away from supporting her own niece, who has been named heir and Queen.
  18. Well, I already see the effects of this in the latest ad. Holy shit, deals are actually twice as bad as they were a few weeks ago.
  19. That's hilarious, but I also find it so very comforting that we were able to bitch enough to actually get them to change it. I wonder how many cardboard boxes AIBs will have to throw away now. We also shouldn't have let Nvidia get away with this bullshit in the past. In recent history, I think they started pulling this malarkey with the GTX 1060 6GB vs 3GB, which had different specs beyond RAM. There was a variant of the RTX 2060 that used a different chip, but at least they called it a KO, and then there was a 12GB model that had different specs. The 3080 12GB is closer in specs to a 3080 Ti than to the 3080 10GB. So fucking please, Nvidia: if the card has different specs, call it something else. There are more than 5 numbers you can use, and more than 2 letters!
  20. Heard there is an issue with accurate tracking updates due to severe time dilation caused by these massive cards.
  21. "PCIE 5" here actually refers to the PSU's ability to output power. It's a more stringent requirement to meet for the new CPUs and GPUs that draw higher power and have bigger transient loads. That's why you see people conflating the names; it's basically a shortcut in communication, because 12VHPWR is kind of a shitty name to say lol, and any PSU that meets the PCIE 5.0 spec will have the new connector anyway.
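
To put rough numbers on the current-balancing theory from the adapter teardown post (#8) above, here's a back-of-envelope sketch. The 600 W draw, six power pins, and ~9.5 A per-pin rating are my assumptions, not measured values.

```python
# Back-of-envelope per-pin currents for a 12VHPWR connector.
TOTAL_WATTS = 600   # assumed worst-case draw through the adapter
VOLTS = 12
POWER_PINS = 6      # assumed: six 12 V pins carry the load
PIN_RATING_A = 9.5  # assumed per-pin rating for this terminal style

total_current = TOTAL_WATTS / VOLTS            # 50.0 A overall
balanced_per_pin = total_current / POWER_PINS  # ~8.3 A each when shared evenly

# Failure mode from the post: the solder bridge lets one 8-pin input's
# entire 150 W funnel through a single 12VHPWR pin.
single_pin_current = 150 / VOLTS               # 12.5 A through one pin

print(f"balanced per pin: {balanced_per_pin:.1f} A (rating {PIN_RATING_A} A)")
print(f"one 8-pin through one pin: {single_pin_current:.1f} A")
```

Even the perfectly balanced case leaves only about 1 A of headroom per pin, so one damaged or half-seated pin is enough to push its neighbors past the rating.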
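As for why a slightly spread jacket (post #11) gets hot enough to melt nylon: heat at a contact scales with the square of the current, P = I²R. A minimal sketch; the current and resistance figures below are illustrative assumptions.

```python
# Power dissipated at one contact as its resistance degrades.
current_a = 8.3  # roughly one pin's share of a 600 W / 12 V load

for contact_mohm in (1, 10, 50):  # tight, worn, and badly spread contact
    watts = current_a ** 2 * (contact_mohm / 1000)
    print(f"{contact_mohm:3d} mOhm contact -> {watts:.2f} W of heat at the pin")
```

A few watts doesn't sound like much, but it's concentrated on a contact patch the size of a pinhead with almost no heatsinking, which is plenty to soften the housing over time.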
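And the "crop the HUD" idea from the frame-generation post (#13) could look something like this in principle. This is a toy numpy sketch under my own assumptions, not Nvidia's actual algorithm; composite_hud and the developer-supplied hud_mask are hypothetical names.

```python
import numpy as np

def composite_hud(generated, last_real, hud_mask):
    """Copy HUD pixels from the last fully rendered frame into the
    AI-generated in-between frame instead of trusting the guess.

    generated, last_real: (H, W, 3) float arrays
    hud_mask: (H, W) bool array, True where HUD elements live
    """
    out = generated.copy()
    out[hud_mask] = last_real[hud_mask]
    return out

# Toy usage: flag one pixel as a waypoint marker and verify it was held.
gen = np.random.rand(4, 4, 3)
real = np.random.rand(4, 4, 3)
mask = np.zeros((4, 4), dtype=bool)
mask[0, 0] = True

frame = composite_hud(gen, real, mask)
assert np.array_equal(frame[0, 0], real[0, 0])
```

The trade-off is exactly the one described in the post: the HUD can never garble, but transparent HUD pixels that should have changed will hold for a generated frame and can look jittery.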