Everything posted by imthesoldier

  1. You just activated so many frustrating memories. That one race where you race that big fucker who actually starts running before the lights go green always infuriated me as a kid. ...I think some redemption is in order. I must replay DKR! But seriously, this game looks fun, and I'll be curious to see more as we get closer to launch. I realize I'm less inclined to play sim racers these days, and stick to arcade, so I could use a good fix.
  2. I never finished ZombiU back in the day, and was always so concerned about dying, and then never finding my zombified version of myself. And then I remember there were the random players that became zombified, and you could kill them to loot their shit before they could. I wonder if that feature will still exist after March 27th, or if it's already been shut down. Hell, can I even play ZombiU on Wii U still? (I sure hope so.)
  3. Right? This is the first time I've seen that too, and holy fuck. I don't know how many fans of Color Splash (all five fans I'm sure) actually will defend this method of attacking, but it is seriously appalling. Wonder if modders improved the game by removing that via emulation or something.
  4. I mean, it's quite obvious that Nvidia doesn't care as much as we think about gaming GPUs. Their bread and butter is enterprise hardware and automotive. Gaming has become a smaller slice of their profits from what I can gather.
  5. I'm still holding out hope we'll get Switch conversions of WWHD, TPHD, and XCX, especially the latter, because there are a lot of optimizations to be had: fixing the object pop-in, higher resolution, tweaks to the UI so text is more legible on a small screen, higher quality sound, etc. The former two really just need improved frame rates and higher resolutions, and call it good.
  6. At this rate, the 4050 will be like $550 when it eventually comes out. Makes me feel better about paying $500 for my 3060 at least.
  7. I always appreciate his input since he is an actual developer, but quite transparent in his expectations. He does bring up an excellent point that some of you already mentioned regarding the PS4, and how the Jaguar CPU was already quite slow at the time of release. The Pro helped keep it relevant, especially when 4K TVs were becoming cheap and available to most consumers; same with the Xbox One X. It is a bit different now considering how balanced the hardware is relative to its respective components. One of the things I remember during the PS3 era was this push for much more powerful CPUs, and honestly pretty weak GPUs all things considered. The 360 was definitely more balanced between its CPU and GPU, and also had a single pool of memory, whereas the PS3, as we all know, had split memory between the CPU and GPU, and a much greater emphasis on using the Cell processor. With the PS4, that dynamic appeared to shift toward greater emphasis on the GPU, and less on the CPU. I recall this is when general-purpose computing was showing up more in GPUs, so the CPU seemed like a less important part of the whole package. I think nowadays people such as ourselves, plus developers, have a much better understanding of bottlenecks within the hardware.
  8. "What do you have in mind?" A great question. I already briefly mentioned things like sound in video games, because sound is waves. But what else has waves? This ties more into physics, but something physical such as water could be calculated the same way, since a fluid behaves as both waves and particles. Take it a step further, and you get gravitational waves: physics interactions dynamically tied to the environment you stand on. Another "particle" based one could be the use of wind in a game, which we haven't really seen yet. Wind, water, and gravity all tie into physics calculations, but in a more ray-traced form they could all be completely dynamic in ways we haven't seen before. That said, fluid simulations themselves are nothing new on the enterprise side, such as in engineering, so how that truly differs from what I'm imagining for a ray-traced version (or maybe that's already how fluid simulations work? I'm not sure; I don't do engineering design), I can't say. As for your second part, I'm glad you brought up gameplay elements tied to stealth and horror, because those were the first areas where I thought ray-tracing could truly benefit gameplay design. Someone else also mentioned above how the use of SSDs in the PS5 and Xbox Series would have effects too. Mark Cerny even detailed how that could work, with, say, no longer needing duplicate assets to save on file sizes, which is a good thing. Loading times could also be practically eliminated, and level design could change dynamically without having to account for "loading" between sections (i.e. elevators, small corridors, etc.). I'm sure there are other gameplay elements that could be "improved" with ray-tracing, but stealth and horror were the first ones I thought of. Going back to SSDs, it goes without saying that the drives themselves are only one part of the picture.
The issue then comes with decompressing all that information, which is where "Kraken" comes in for the PS5. Hardware-based decompression is the secret sauce for SSD tech, and for how it'll be utilized in future titles built from scratch around it. I'm excited to see what developers have in mind with it, and same on the Xbox Series. So factoring in ray-tracing plus the use of SSDs with hardware decompression will, I think, have a much bigger impact. I wonder if developers may almost have to relearn some aspects of game and level design, because they will no longer be tied to "Oh, I need this small hallway to hide streaming assets in," or "This elevator is crucial to load the next bit of data." How that'll also reflect (no pun intended) on the use of ray-tracing, I'm excited to see. So yeah, I'm more concerned with how ray-tracing will affect gameplay vs. simply how pretty the game will look. We're already approaching that plateau in photorealism where the differences get more and more minute every generation. The PS2 era was a massive step up from PS1, and PS3 was a huge leap from PS2. But PS3 to PS4? The difference was smaller. And PS5 vs. PS4? Well, we're just starting to get out of cross-gen, so you can make your own interpretation.
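The "ray-traced sound" idea for stealth and horror above can be sketched in miniature: cast a ray from a sound source to the listener and attenuate the volume by how much geometry sits in the way. This is purely illustrative (real engines fire many rays and model diffraction and reverb on top); every function name and number here is a made-up assumption, not any engine's API.

```python
# Toy ray-based audio occlusion on a 2D grid: 1 = wall cell, 0 = open space.
# All names and constants are hypothetical, for illustration only.

def trace_occlusion(grid, src, dst, steps=100):
    """Count distinct wall cells a straight ray from src to dst crosses."""
    (x0, y0), (x1, y1) = src, dst
    visited = set()
    for i in range(steps + 1):
        t = i / steps
        x = round(x0 + (x1 - x0) * t)
        y = round(y0 + (y1 - y0) * t)
        visited.add((x, y))
    return sum(1 for (x, y) in visited if grid[y][x] == 1)

def audible_volume(base_db, hits, loss_per_hit_db=6.0):
    """Each occluding cell knocks a fixed number of dB off the sound."""
    return max(0.0, base_db - hits * loss_per_hit_db)

room = [
    [0, 0, 0, 0, 0],
    [0, 0, 1, 0, 0],  # one wall cell sits between source and listener
    [0, 0, 0, 0, 0],
]
hits = trace_occlusion(room, (0, 1), (4, 1))  # source at left, listener at right
print(audible_volume(60.0, hits))             # 60 dB base, minus 6 dB per wall hit
```

The stealth/horror hook is that an AI guard could run the same query in reverse: if the attenuated volume at the guard's position is above a hearing threshold, the player has been heard.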
  9. Given that practically anything with waves or particles can be simulated using ray-tracing techniques, I can't see why things besides just graphics fidelity can't be calculated on the hardware. If ray-tracing is here to stay, then it has a lot of ground to cover before it's the standard across the board. Graphics alone, imo, is not enough to make ray-tracing relevant.
  10. I could be wrong, but considering ray-tracing runs on dedicated cores in the silicon, I can't see any reason why those cores can't be used for things besides graphics. I mean, it's possible to use CPU resources for graphics, though it's typically not recommended. Same with the tensor cores for AI upscaling: why couldn't they be used for other AI-based tasks besides resolution scaling and frame interpolation? It all comes down to how you use the hardware, from my understanding.
  11. Great points, the both of you. I have played Portal RTX, and the difference is huge for sure, and it admittedly looks absolutely gorgeous (when your card can run it well), but we're also comparing against a game whose vanilla form came out back in 2007, and we're slapping ray-tracing on it without improving any of the geometry or textures, which can also have a profound effect. Not to mention Source traces its roots back to the early 2000s, so it's already using an outdated feature set. I'd like to see what a Source 2 version of Portal would look like with, and without, RT. And you're both also right that developers have had limited opportunities to really utilize this, even though the technology itself has been available to animation studios in Hollywood for years now. There are also other use cases for ray-tracing, including but not limited to sound, AI, and physics. I'd like to see how developers can utilize all of those. I, for one, have been yearning for developers to improve AI and physics in particular for nearly two decades now, and for the most part, they've been relatively held back. The last game to truly wow me in terms of physics was Half-Life 2 back in 2004, and F.E.A.R. for enemy AI. Supposedly, the later Killzone titles such as 2 and 3 used more sophisticated enemy AI, but I don't remember it being THAT good. And Killzone Shadow Fall was so forgettable, improved AI wouldn't have saved it anyway.
  12. Hot take of the day: RT isn't the next-gen leap in graphical fidelity we all think it is. When I see those comparison shots folks like Digital Foundry post, yeah, there's a difference, but a generational leaps-and-bounds difference? Fuck no. Part of the issue, imo, is that graphical artists are so good these days at faking lighting and shadows when needed that some of the differences don't stick out as an "OMG! t3h MEGATON!" type thing. I could even see artists (if they haven't already) use RT as a template for what a room or piece of scenery should look like with baked-in lighting, in order to save on resources. The only real long-term advantage I see for RT is that it means less work for the artists during development. Like you said, diminishing returns, and we're already seeing it. I'd like to be proven wrong, though. I'd like to think RT will become the new shaders, like how those were a big deal in the mid-2000s. I'm just skeptical, and think we're overselling what RT can do, or what it ought to do. Now, going back to the actual topic, I'm all for a PS5 Pro, though I personally would be fine with a PS5 Slim. But if both Sony and Microsoft feel they need these mid-gen refreshes in the same generational cycle, and gamers will buy them, then there's a market for it.
  13. RT is just this generation's version of Blast Processing. But seriously, besides saying, "Better shadows, better lighting, better reflections!" what is the actual mainstream draw for ray-tracing? CG animators know what it is, as they've been using it for almost two decades now. We know what it is because we're enthusiasts, but the general populace? Like you said, how do you convince the public that "Improved Ray-Tracing!" is a big selling point? (Spoiler: it's not.) Now, slogans such as "8K Ready!" or "120FPS or 240FPS modes included!" have more sway, I think, in terms of what the mainstream would look for. The age of prettier graphics just doesn't have the same ring to it like it did back in the 2000s, let alone in the '80s and '90s, when things like 8-bit, 16-bit, 32-bit, and 64-bit graphics were all the rage in advertisements.
  14. Not only that, but all things considered, CPUs haven't gotten THAT much faster since, say, 2012. Yes, with more cores and threads, plus other things such as more cache, CPUs have gotten much faster overall, but metrics such as IPC haven't really made leaps-and-bounds improvements. You can argue that single-core performance has gotten decently better, and again, more cores do allow better multi-threaded execution. But the fact that I can still use a CPU from over ten years ago and have a reasonable gaming experience on it, or run a reasonably new OS on it, is impressive enough. Those 3rd and 4th gen Intel chips from a decade ago can still be useful, and if you add more RAM to your mobo, plus use an NVMe drive via an M.2 add-in card in one of your unused PCIe slots, you could easily be mistaken into thinking your CPU was much faster and newer. And if you decide to go full bore with an ultrawide 1440p monitor, or even 4K, your GPU will be doing the hard carry anyway. That said, there are other reasons to upgrade a mobo besides just the CPU: things such as USB 3/4, Type-C ports, Thunderbolt, 2.5-gigabit LAN or above, built-in HDMI/DisplayPort, plus newer-gen PCIe lanes are also things to consider. Given I'm on PCIe Gen 4 right now, have three M.2 slots (one 4.0, two 3.0), and could easily get a ~40-50% boost in CPU performance by going from my current 11400F to an 11700K, I could stay with this mobo setup for a long time. My hope is that 3D stacking will become more common, especially with AMD pushing the tech in their flagship CPUs, plus more chiplet designs to help eke out more performance as we approach the limits of die shrinking.
  15. I think at this point, that would be the reason for me to upgrade sooner, especially if I get an Ultrawide Monitor. But I'd have to see how my 3060 would handle ultrawide gaming at all first.
  16. My last rig was 8 years old when I upgraded last year, but that was a more unusual situation given the pandemic and all. Knowing what I know now, I probably would've just stuck with my Steam Deck and not upgraded until later. I convinced myself that I needed to, though, because I could no longer really edit video. My previous editing program was PowerDirector, and I no longer had the product key to even use it, so I ended up downloading the free version of DaVinci Resolve. The issue with that was my GTX 770 was terrible at running it and could barely edit at all, plus it was out of date and no longer receiving driver updates. Serious Sam 4 was the game that made me go, "Alright, time for an upgrade." EDIT: I did manage to run the game at low settings at a nearly locked 30fps, so I felt good about that at least.
  17. Personally, I think that rig you got should last you at least 3 years before you should consider upgrading, but I'm also about maximizing my dollar for what I have. My previous rig built in 2014 had an FX 8320 + a GTX 770, and took me 8 years before I did more or less a full upgrade (minus case, PSU, and storage). That said, handing that down to your son would be one hell of a rig for him. Make him earn it.
  18. Yeah. I want to say you only really need that kind of power if you're going for really high frame rates, like beyond 240Hz, while still at 1440p, and have a 4090 to match it for maximum fidelity. To each their own, though. I realized I have a sort of calculus for determining how much to spend on a CPU and GPU: if I spend, say, $500 on a GPU (which I did for my 3060, for better or worse), then I'll only spend a max of half that on a CPU. I only spent $170 for my 11400F at the time, so I got a pretty good deal, I thought. Could've gone for an 11600K perhaps, but I didn't feel the need, plus I liked that it was only a 65W CPU, and I couldn't care less about overclocking. I'm on a dead-end platform in that I can only go up to 11th gen, but if I find an 11700KF for cheap, I might upgrade down the road.
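The budget calculus above boils down to one rule: cap CPU spend at half of whatever the GPU cost. A minimal sketch, using the post's own dollar figures (the function name is just an illustration):

```python
# Rule of thumb from the post: never spend more than half the GPU price
# on the CPU. Hypothetical helper name; figures are the post's examples.

def max_cpu_budget(gpu_price):
    """CPU budget cap = half the GPU price."""
    return gpu_price / 2

gpu_paid = 500   # what was paid for the 3060
cpu_paid = 170   # what was paid for the 11400F
cap = max_cpu_budget(gpu_paid)
print(cap, cpu_paid <= cap)  # 250.0 True -- the 11400F fit the rule
```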
  19. I know that is Canadian pricing, but even at the US price of $700 according to Newegg, that's a shitload of money for a consumer CPU. Yes, it's the king of gaming CPUs, but I still believe that's too much. Personally, the 7800X3D looks to be a better value based on early reviews, though I still think spending even $500 or so on a CPU is expensive. I'm a cheapskate, though; $250 is about my limit for what I'd pay for a CPU.
  20. What I find strange, though, is that given this information, it would appear VRAM capacity means jack shit. Despite those cards having only 8-10 gigs, they're still pretty damn fast, especially compared to my 3060, which I think has something like 360GB/sec of bandwidth for its 12GB framebuffer. So the lack of physical RAM should be offset by how fast the pool is.
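For reference, the bandwidth figure cited above falls out of simple arithmetic: per-pin data rate (in Gbps) times bus width (in bits), divided by 8 to convert to bytes. The 3060's 15 Gbps GDDR6 on a 192-bit bus gives exactly the ~360 GB/s mentioned (the function name is just an illustration):

```python
# Memory bandwidth = per-pin data rate (Gbps) x bus width (bits) / 8.
# Plugging in the RTX 3060's 15 Gbps GDDR6 on a 192-bit bus.

def vram_bandwidth_gbs(data_rate_gbps, bus_width_bits):
    """GB/s of peak theoretical memory bandwidth."""
    return data_rate_gbps * bus_width_bits / 8

print(vram_bandwidth_gbs(15, 192))  # 360.0
```

The same formula shows why a narrower bus with faster memory can still beat a wider, slower one, which is the trade-off behind those 8-10GB cards feeling fast.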