Posts posted by imthesoldier

  1. 3 minutes ago, crispy4000 said:

    I want a sequel to Diddy Kong Racing.  This might come close enough.

     

    You just activated so many frustrating memories. That one race where you race that big fucker who actually starts running before the lights go green always infuriated me as a kid. 

     

    ...I think some redemption is in order. I must replay DKR! 

     

     

    But seriously, this game looks fun, and I'll be curious to see more as we get closer to launch. I realize I'm less inclined to play sim racers these days and stick to arcade ones, so I could use a good fix. 

  2. 13 hours ago, BloodyHell said:

    Color Splash is awful. I have it. Just fundamentally bad. I really disliked Origami King, but that’s a competent game next to Color Splash.

     

    The only WiiU game I sometimes think of going back to is ZombieU. I have no illusion that it’s a great game, but I definitely enjoyed it around launch.

     

    I never finished ZombiU back in the day; I was always so worried about dying and then never finding the zombified version of myself. I also remember random players' zombified characters could show up in your game, and you could kill them to loot their shit before they could get you. I wonder if that feature will still exist after March 27th, or if it's already been shut down.

     

    Hell, can I even play ZombiU on Wii U still? (I sure hope so.)

  3. 2 hours ago, stepee said:

    They should just release them all for 1k and let people fight out the stock.

     

    I mean, it's quite obvious that NVIDIA doesn't care as much as we think about gaming GPUs. Their bread and butter is enterprise hardware and automotive; gaming has become a smaller slice of their profits, from what I can gather.

  4. I’m still holding out hope we’ll get Switch conversions of WWHD, TPHD, and XCX, especially the latter, because there are a lot of optimizations to be had: reduced object pop-in, higher resolution, UI tweaks so text is more legible on a small screen, higher-quality sound, etc. 

     

    The former two really just need improved frame rates and higher resolutions, and call it good. 


  5. I always appreciate his input since he's an actual developer, and he's quite transparent about his expectations. 
     

    He does bring up an excellent point that some of you already mentioned regarding the PS4, and how the Jaguar CPU was already quite slow at the time of release. The Pro helped keep it relevant, especially as 4K TVs were becoming cheap and available to most consumers; same with the Xbox One X. 
     

    It is a bit different now, considering how balanced the hardware is across its components. One of the things I remember from the PS3 era was the push for much more powerful CPUs paired with honestly pretty weak GPUs, all things considered. The 360 was definitely more balanced between its CPU and GPU, and also had a single pool of memory, whereas the PS3, as we all know, split its memory between the CPU and GPU and leaned much harder on the Cell processor. 
     

    With the PS4, that dynamic appeared to shift, with greater emphasis on the GPU and less on the CPU. I recall this is when general-purpose computing on GPUs was showing up more, so the CPU seemed a less important part of the whole package. I think nowadays people like us, plus developers, have a much better understanding of where the bottlenecks in the hardware are. 

  6. 8 hours ago, legend said:

     

    Plausibly, but I would still guess fewer than tensor cores. What do you have in mind?

     

     

    I think dividing games into "graphics" and "gameplay" and bemoaning that raytracing only impacts "graphics" is an overly simplistic way to critique games. Graphics have an enormous impact on the entire experience. Have you ever played a game with developer art? It doesn't hit the same way at all. Graphics are inherently entwined in games because they're the medium by which feedback, rewards, etc. are conveyed to players. And even if you want to talk about the artistic angle and how "low tech games" can have good art direction, new technology gives artists new tools to do things they couldn't do before. Minimizing graphics technology as some less important auxiliary aspect would be like telling film directors the lighting doesn't matter. Of course it does. Again, there is a reason ray tracing is the standard in CGI movies: it maximally enables artists to realize their vision.

     

    And even putting all that aside, you have the more explicit impact of graphics on the "gameplay." It's easy to imagine how things like reflections affect how you play in an environment, and how shadows impact stealth and horror games. 

     

    "What do you have in mind?" 

     

    A great question. I already briefly mentioned things like sound in video games, because sound is waves. But what else has waves? This ties more into physics, but something physical like water could be calculated the same way, since any fluid behaves as both waves and particles. Take it a step further and you get gravitational waves: physics interactions dynamically tied to the environment you're standing on. 

     

    Another "particle" based one could be the use of Wind in a game, which we haven't really seen yet. Wind, Water, and Gravity all tie into Physics calculations, but in a more ray-traced way, they could all be completely dynamic in ways we haven't seen before. That said, fluid simulations themselves are nothing new in the realm of enterprises such as the engineering side, so how that truly differs than what I'm imagining for a ray-traced version (or maybe that's how fluid simulations work? I'm not sure. I don't do engineering design), I'm not sure. 

     

     

    As for your second part, I'm glad you brought up gameplay elements tied to stealth and horror, because those were the first places I thought ray-tracing could truly benefit gameplay design. Someone else also mentioned above how the SSDs in the PS5 and Xbox Series will have effects of their own. Mark Cerny even detailed how that could work, with things like no longer needing duplicate assets, which saves on file sizes. Loading times could be eliminated practically entirely, and level design could change dynamically without having to account for "loading" between sections (i.e. elevators, small corridors, etc.). I'm sure there are other gameplay elements that could be "improved" with ray-tracing, but stealth and horror were the first ones I thought of. 

     

    Going back to SSDs, it goes without saying that the drives themselves are only one part of the picture. The bottleneck then becomes decompressing all that data, which is where "Kraken" comes in for the PS5. Hardware-based decompression is the secret sauce of the SSD tech and of how it'll be utilized in future titles built from scratch around it. I'm excited to see what developers have in mind for it, and the same goes for the Xbox Series. 

     

    So ray-tracing plus SSDs with hardware decompression will, I think, have a much bigger combined impact. I wonder if developers will almost have to relearn some aspects of game and level design, because they'll no longer be tied to "Oh, I need this small hallway to hide asset streaming" or "this elevator is crucial to load the next bit of data." How that will also reflect (no pun intended) on the use of ray-tracing, I'm excited to see. 
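
    To make the streaming idea concrete, here's a rough software-only sketch of "read and decompress in the background while the game keeps rendering." The file names are hypothetical, and Python's zlib is just standing in for Kraken (or BCPack on Xbox), which do this step in dedicated hardware:

        import queue, threading, zlib

        # Fake a couple of compressed assets on disk so the sketch actually runs.
        for name in ("corridor.bin.z", "arena.bin.z"):
            with open(name, "wb") as f:
                f.write(zlib.compress(b"level geometry " * 1000))

        asset_queue = queue.Queue()

        def stream_assets(paths):
            # Background loader: read compressed chunks, decompress, hand off.
            # The decompress call is the work the new consoles moved into hardware.
            for path in paths:
                with open(path, "rb") as f:
                    asset_queue.put((path, zlib.decompress(f.read())))

        loader = threading.Thread(target=stream_assets,
                                  args=(["corridor.bin.z", "arena.bin.z"],),
                                  daemon=True)
        loader.start()

        # The "game loop" just polls each frame; no elevator ride needed to hide the work.
        received = 0
        while received < 2:
            try:
                name, data = asset_queue.get(timeout=0.016)  # ~one 60fps frame
                received += 1
                print(f"streamed in {name}: {len(data)} bytes")
            except queue.Empty:
                pass  # nothing ready this frame; render and poll again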

     

    So yeah, I'm more interested in how ray-tracing will affect gameplay than in simply how pretty the game looks. We're already approaching a plateau in photorealism where the differences get more and more minute every generation. The PS2 era was a massive step up from the PS1, and the PS3 was a huge leap from the PS2. But PS3 to PS4? The difference was smaller. And PS4 to PS5? Well, we're just starting to get out of cross-gen, so you can make your own interpretation. 

  7. 47 minutes ago, legend said:

     

    Blast processing is at best a hardware architecture (and in reality more a marketing term). Ray tracing is a categorically different rendering methodology that better simulates how light actually works. It is the standard for all pre-rendered movies for a reason. But it seems you know this, so I'm not sure why you would draw the comparison! :p 

     

     

     

    Tensor cores can be, and are, used for more than just image upscaling. For games, they've primarily been used for AI upscaling, but in research they're used to accelerate any AI models that use deep learning methods (which is almost all of modern AI), and they could even be used for anything that involves very large matrix operations. As games integrate more AI technology, they'll absolutely benefit from tensor cores.

     

    RT cores I'm less certain about. If you can cast an operation as ray intersection, they're probably useful for it! But maybe less general than tensor cores.

     

    Given that practically anything involving waves or particles can be simulated using ray-tracing techniques, I can't see why things besides just graphics fidelity couldn't be calculated on the hardware. If ray-tracing is here to stay, then it has a lot of ground to cover before it's the standard across the board. Graphics alone, imo, is not enough to make ray-tracing relevant. 
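
    To put it concretely: the RT cores accelerate one primitive, the ray/geometry intersection test. Here's that math against a sphere in plain Python (my toy version, not anything an engine ships). It's the same query whether you're asking "what does this pixel see," "can this sound reach the player," or "does the guard have line of sight":

        import math

        def ray_hits_sphere(origin, direction, center, radius):
            # Returns the distance to the nearest hit, or None on a miss.
            # This intersection test is the primitive that RT cores accelerate.
            ox, oy, oz = (origin[i] - center[i] for i in range(3))
            b = 2 * (direction[0] * ox + direction[1] * oy + direction[2] * oz)
            c = ox * ox + oy * oy + oz * oz - radius * radius
            disc = b * b - 4 * c  # 'direction' is assumed normalized, so a = 1
            if disc < 0:
                return None       # ray misses the sphere entirely
            t = (-b - math.sqrt(disc)) / 2
            return t if t >= 0 else None  # a hit behind the origin doesn't count

        # e.g. an occlusion query: is a pillar between the noise and the player?
        print(ray_hits_sphere((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0))  # -> 4.0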

  8. 18 minutes ago, stepee said:

    Nothing gameplay-wise will be introduced via ray tracing; it's for graphics, not gameplay. There are things you could point to, like "oh, you can use reflections to see such and such," but there are planar reflections and other techniques that work well enough for gameplay.

     

    For gameplay advancements you're looking more at CPU, bandwidth, and memory. Physics could be something tackled on the GPU end that would change gameplay, but not ray tracing; it's just a better rendering technique. I like it a lot because it makes games look better and closer to CG movie quality, but if you don't care about graphics-only advances and only gameplay, then not caring about ray tracing makes sense. Not everything matters to everyone.

     

    Honestly if I was looking for new gameplay experiences right now I’d be focusing on vr the most.

     

    I could be wrong, but considering RT uses dedicated cores on the silicon for ray-tracing, I can't see any reason why they couldn't be used for things besides graphics. I mean, it's possible to use CPU resources for graphics, though it's typically not recommended. Same with the tensor cores used for AI upscaling: why couldn't they be used for other AI-based work besides resolution scaling and frame interpolation? 

     

    It's all down to how you use the hardware, from my understanding. 
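
    To illustrate legend's point above: what tensor cores actually accelerate is fused multiply-accumulate over matrix tiles. Here's a toy NumPy version of that operation, one dense layer of a network; DLSS-style upscaling and most deep-learning workloads boil down to piles of exactly this:

        import numpy as np

        rng = np.random.default_rng(0)
        x = rng.standard_normal((1, 512)).astype(np.float16)    # activations, fp16
        W = rng.standard_normal((512, 256)).astype(np.float16)  # layer weights, fp16
        b = np.zeros((1, 256), dtype=np.float32)

        # D = A @ B + C is the tensor-core primitive: fp16 inputs, fp32 accumulate.
        # Upscaling, frame generation, or any other NN workload reduces to these.
        y = x.astype(np.float32) @ W.astype(np.float32) + b
        print(y.shape)  # (1, 256)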

  9. 59 minutes ago, Spork3245 said:


    Indeed. You’re looking at limited RT usage as if it’s the peak. Portal RTX and Racer RTX show what fully path-traced RT can be, and it’s night and day. (Zero games besides those use path tracing, and only two or three have full RT, meaning every light source, shadow, and reflection, but without path tracing.)

     

    26 minutes ago, crispy4000 said:


    I don’t think RT gives diminishing returns the harder it’s pushed, at least not yet.  We’re not at that point.  If anything it’s the opposite, with developers having to be so careful with their cutbacks on consoles.  There’s only so much that can be done, and only a select few developers seemed equipped to make a strong case.  

     

    I look at Portal RTX and see a generational leap, absolutely. From the little we’ve seen, I think Cyberpunk’s Overdrive RT will do so as well, especially in providing a direct comparison to a rasterized-only alternative. Even the Marbles at Night tech demo makes the case.

     

     

     

    Great points, the both of you. 

     

    I have played Portal RTX, and the difference is huge for sure; it admittedly looks absolutely gorgeous (when your card can run it well). But we're also talking about a game that in vanilla form came out back in 2007, and we're slapping ray-tracing on it without improving any of the geometry or textures, which also have a profound effect. Not to mention Source traces its roots back to the early 2000s, so it's already built on an outdated feature set. I'd like to see what a Source 2 version of Portal would look like with and without RT. 

     

    And you're both right that developers have had limited opportunities to really utilize this, even though the technology itself has been available to animation studios in Hollywood for years now. There are also other use cases for ray-tracing, including but not limited to sound, AI, and physics. I'd like to see how developers can utilize all of those. I, for one, have been yearning for developers to improve AI and physics in particular for nearly two decades now, and for the most part those areas have been relatively held back. 

     

    The last game to truly wow me in terms of physics was Half-Life 2 back in 2004, and F.E.A.R. for enemy AI. Supposedly the later Killzone titles, such as 2 and 3, used more sophisticated enemy AI, but I don't remember it being THAT good. And Killzone Shadow Fall was so forgettable that improved AI wouldn't have saved it anyway. 

     

     

  10. 1 minute ago, crispy4000 said:


    I’d venture to say that the mainstream doesn’t care about VRR displays and 8K, and won’t for some time. It’s a case of diminishing returns. Plus, I think the industry will want to push for better quality visuals once 4K60 is truly baseline. And beforehand too, as this gen continues.


    RT is more marketable, IMO, for the simple reason that you can show whip-pan comparisons on YouTube and see differences on pretty much any display. Yes, it’s had some bumps in the beginning, but we’ll be seeing more games take fuller advantage of it over the coming years, and be less “selective” in how they do.

     

    If all RT needs is a marketing schtick that resonates, then it’s “fully path-traced.” That statement might need some qualification, similar to how 4K gets thrown around for games (and other media) that don’t reach it. But we’ll start seeing more devs mention it soon on the PC side.

     

     

    Hot take of the day: RT isn't the next-gen leap in graphical fidelity we all think it is. When I see those comparison shots folks like Digital Foundry post, yeah, there's a difference, but a generational leaps-and-bounds difference? Fuck no. 

     

    Part of the issue, imo, is that graphical artists are so good these days at faking lighting and shadows when needed that some of the differences don't stick out as an "OMG! t3h MEGATON!" type thing. I could even see artists (if they haven't already) using RT as a template for what a room or piece of scenery should look like with baked-in lighting, in order to save on resources.

     

    The only real long-term advantage I see for RT is that it means less work for artists during development. Like you said, diminishing returns, and we're already seeing them. I'd like to be proven wrong, though. I'd like to think RT will become the new shaders, the way those were a big deal in the mid-2000s. I'm just skeptical, and I think we're overselling what RT can do, or what it ought to do. 

     

     

    Now, going back to the actual topic: I'm all for a PS5 Pro, though I personally would be fine with a PS5 Slim. But if both Sony and Microsoft feel they need these mid-gen refreshes within the same generational cycle, and gamers will buy them, then there's a market for it. 

  11. 22 hours ago, ShreddieMercury said:

    Ray-tracing is probably the least impressive graphical upgrade that I've seen in all my years of gaming.  Like lots of buzzy GFX terms through the ages, its impact on enhancing/innovating/improving gameplay is extremely minimal so far.  It can be impressive if somebody points it out and shows the difference between a RT and non-RT image, but once a game is in motion, I don't understand why we should give a shit.  In terms of actually affecting the image, to my eyes HDR makes more of a difference, but it also does not enhance the gameplay to any degree.  This gen is a bust.

     

    RT is just this generation's version of Blast Processing. 

     

    :p

     

    But seriously, besides saying "better shadows, better lighting, better reflections!", what is the actual mainstream draw of ray-tracing? CG animators know what it is; they've been using it for almost two decades now. We know what it is because we're enthusiasts. But the general populace? Like you said, how do you convince the public that "Improved ray-tracing!" is a big selling point? (Spoiler: it's not.) 

     

    Now, saying things like "8K Ready!" or "120FPS or 240FPS modes included!" has more sway, I think, in terms of what the mainstream would look for. The age of prettier graphics just doesn't have the same ring to it that it did back in the 2000s, let alone in the '80s and '90s, when 8-bit, 16-bit, 32-bit, and 64-bit graphics were all the rage in advertisements. 

  12. 16 hours ago, Spork3245 said:

    FWIW, with CPUs, I need to see at least a 25-30% performance jump in gaming, and the jump needs to be below my refresh rate (since I use vsync :p ) before I’ll typically even consider it. Mostly because changing motherboards is such a pain.

     

    Not only that, but all things considered, CPUs haven't gotten THAT much faster since, say, 2012. Yes, with more cores and threads CPUs have gotten much faster overall, plus other things like more cache, but IPC hasn't really made leaps-and-bounds improvements. You can argue that single-core performance has gotten decently better, and again, more cores allow more multi-threaded tasks to be executed. 

     

    But the fact that I can still use a CPU from over ten years ago and have a reasonable gaming experience on it, or run a reasonably new OS on it, is impressive enough. Those 3rd- and 4th-gen Intel chips from a decade ago can still be useful; add more RAM to your mobo, plus an NVMe drive via an M.2 add-in card in an unused PCIe slot, and you could easily mistake your CPU for something much faster and newer. And if you go full bore with an ultrawide 1440p monitor, or even 4K, your GPU will be doing the hard carry anyway. 

     

    That said, there are other reasons to upgrade a mobo besides just the CPU. Things like USB 3/4, Type-C ports, Thunderbolt, 2.5-gigabit LAN or above, built-in HDMI/DisplayPort, plus newer-gen PCIe lanes are also worth considering. 

     

    Given I'm on PCIe Gen 4 right now, have three M.2 slots (one 4.0, two 3.0), and could get a ~40-50% boost in CPU performance (at least in multi-threaded workloads) by going from my current 11400F to an 11700K, I could stay on this mobo setup for a long time. 

     

    My hope is 3D stacking will become more common, especially with AMD pushing the tech in their flagship CPUs, plus more chiplet designs to help eke out more performance as we approach the limits of die shrinking. 

  13. 21 minutes ago, stepee said:

    I typically like 4 years for the cpu/mb/etc though I’m doing a 3 year because I moved from 60 to 120fps.

     

    I think at this point, that would be the reason for me to upgrade sooner, especially if I get an Ultrawide Monitor. But I'd have to see how my 3060 would handle ultrawide gaming at all first. 

     


  14. 4 hours ago, SuperSpreader said:

    How often do you guys upgrade your PCs? Not like more RAM but the MB/CPU?? 

     

    Cause I'm like on a 8 year cycle. I buy fairly high end and keep it for a long time.

     

    My last rig was 8 years old when I upgraded last year, but that was a more unusual situation given the pandemic and all. Knowing what I know now, I probably would've just stuck with my Steam Deck and not upgraded until later. I convinced myself I needed to, though, because I couldn't really video edit anymore. My previous editing program was PowerDirector, and I no longer had the product key to use it, so I ended up downloading the free version of DaVinci Resolve. The issue was that my GTX 770 was terrible at running it and couldn't edit well at all, plus it was out of date and no longer receiving driver updates. Serious Sam 4 was the game that made me go, "Alright, time for an upgrade." 

     

    EDIT: I did manage to run the game at low settings at a nearly locked 30fps, so I felt good about that at least. 

  15. 31 minutes ago, AbsolutSurgen said:

    The issue is less with how long this rig will last versus how long the other one will. (I went longer than I normally do before building this one because of pandemic pricing.) When you have 4 gamers in the house, you need multiple places for people to play.

     

    #Firstworldproblems

     

    :p

  16. 3 minutes ago, AbsolutSurgen said:

    I understand the value of high frame-rate gaming in certain situations -- I could see the difference in my lap-times in Forza when playing at 60 fps on console vs. 90-100 on the computer.  But, personally, for most games, I don't consciously perceive a huge difference above 60 fps.  So, I won't pay a huge premium for a super-fast CPU.

     

    I actually went more extreme than what you are suggesting: I spent about triple on my GPU (4080) vs. my CPU (i7-13700KF).

     

    While I would like to think I will upgrade my current rig, more realistically I will pass it down as the next "kid's computer" in my house in a couple of years and build something new.  (Not that it isn't used extensively by the kids, but my son now calls my old rig "his computer".)

     

    Personally, I think that rig you've got should last you at least 3 years before you should consider upgrading, but I'm also all about maximizing my dollar. My previous rig, built in 2014, had an FX 8320 plus a GTX 770, and it took me 8 years before I did a more or less full upgrade (minus case, PSU, and storage). 

     

    That said, handing that down to your son would make for one hell of a rig. Make him earn it. ;)

  17. 1 minute ago, AbsolutSurgen said:

    I didn't get one of those, I bought a i7-13700kf for C$520 or ~US$385.  I don't feel I need a faster/more powerful CPU than that, but understand why others might.

     

    Yeah. I want to say you only really need that kind of power if you're going for really high frame rates, like beyond 240Hz, while staying at 1440p with a 4090 to match for maximum fidelity. To each their own, though. 

     

    I realized I have a sort of calculus for how much to spend on a CPU versus a GPU: if I spend 500 dollars on a GPU (which I did for my 3060, for better or worse), then I'll spend at most half that on a CPU. I only spent 170 on my 11400F at the time, so I thought I got a pretty good deal. I could've gone for an 11600K perhaps, but didn't feel the need, plus I liked that it was only a 65W CPU, and I couldn't care less about overclocking. 
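
    Written out as code, the rule of thumb is just this (numbers from my own build):

        def max_cpu_budget(gpu_price):
            # Personal cap: never spend more than half the GPU price on the CPU.
            return gpu_price / 2

        print(max_cpu_budget(500))  # -> 250.0 (I paid 170 for the 11400F, well under)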

     

    I'm on a dead-end platform in that I can only go up to 11th gen, but if I find an 11700KF for cheap, I might upgrade down the road. 

  18. 15 hours ago, AbsolutSurgen said:

    (image: screenshot of Canadian retail CPU pricing)

     

    I know that's Canadian pricing, but even at the US price of 700 dollars according to Newegg, that's a shitload of money for a consumer CPU. Yes, it's the king of gaming CPUs, but I still believe that's too much. Personally, the 7800X3D looks to be the better value based on early reviews, though I still think spending even 500 or so dollars on a CPU is expensive. 

     

    I'm a cheapskate, though. 250 is about my limit for what I'd pay for a CPU. 

  19. What I find strange, though, is that given this information, it would appear VRAM capacity means jack shit. Despite having only 8-10 gigs, those cards are still pretty damn fast, especially compared to my 3060, which I think has something like 360GB/sec of bandwidth for its 12GB framebuffer. So the smaller physical pool seems to be offset by how fast that pool is. 
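
    For reference, the bandwidth figure just falls out of bus width times effective memory speed. Here's the arithmetic for my 3060 versus a 3070 (specs quoted from memory, so double-check them):

        def bandwidth_gb_per_s(bus_width_bits, data_rate_gbps):
            # GB/s = (bus width in bytes) * effective per-pin data rate in Gbps
            return bus_width_bits / 8 * data_rate_gbps

        print(bandwidth_gb_per_s(192, 15))  # RTX 3060: 192-bit GDDR6 @ 15 Gbps -> 360.0
        print(bandwidth_gb_per_s(256, 14))  # RTX 3070: 256-bit GDDR6 @ 14 Gbps -> 448.0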
