
TomCat


Posts posted by TomCat

  1. On 3/9/2021 at 2:03 PM, Fizzzzle said:

    It wouldn't surprise me if Microsoft tried to do some kind of Game Pass thing on the Switch. They're already releasing games on Steam, and Nintendo isn't necessarily a competitor for them the way Sony is.

     

    I do wonder if they'd release games on PS5, but charge full price for them. Like, "you could buy this on PS5 for $70, or you could just subscribe to Game Pass" or something. They win either way, depending on what they put on Game Pass.

    MS has said over and over that Sony is not their competition.

  2. On 11/21/2020 at 3:48 PM, Duderino said:

     I think we could see that 15-20% advantage on paper manifest with games that have really excellent CU utilization and limited GPU idle time.  Further improvements Microsoft makes to the API will help, but I suspect developers will have an equal role to play in getting there.

     

    Results that favor the PS5 right now could come down to the higher clocked CUs counteracting some bottlenecks and game inefficiencies.

     

    The reason the PS5 is running neck and neck with the Series X right now is due to clock speed and Sony's version of Infinity Cache. Last-gen games benefit from the higher GPU clocks, and the cache benefits from the increased speed as well. NONE of these launch games are using Velocity Architecture; once they do, you will start seeing the difference. Remember, the Series X has feature parity with AMD's top-of-the-line card, WHICH JUST RELEASED. So of course it's going to take a while for devs to properly utilize those features.

  3. On 11/17/2020 at 5:42 AM, JPDunks4 said:

    Dirt 5 dev said the same.  There simply aren't many 4K 120fps HDR TVs out there, and the team definitely didn't have one at each person's home for testing.  He said there wasn't an easy way to test due to Covid, whereas they'd typically have something at the studio for them to test on.

     

    He also said they simply have to learn the RDNA2 features as there hadn't been an RDNA2 GPU out yet.  Said the 120fps mode used only 5ms of 8ms on the Xbox, so plenty of overhead left on the CPU.  Now they just need to learn the new GPU features to get more out of the games visually.

     

     

     

    Damn, that sure sounds a lot like what someone I know has been saying. He also said they didn't have the SDKs for the Series X; they had to develop the game on an Xbox One GDK. Something else ole TomCat said.
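
    (For context on those numbers: at 120fps the frame budget is 1000 ms / 120, which is about 8.3 ms per frame, so a 5 ms CPU cost leaves roughly 3.3 ms of headroom every frame. My arithmetic, not the dev's.)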

  4. 1 minute ago, crispy4000 said:


    What other barometer do we have?

     

    A cross gen game should be able to push RT plenty.  It’s adding to what’s already optimized to run on weaker hardware.
     

    Even Control could be described as such.  Was it not a fair barometer for RTX cards?  ...Of course it is.

     

     

    Control will be a good example, because they are actually patching the game, so they should actually leverage the hardware, and the dev is technically sound.

  5. 1 minute ago, AbsolutSurgen said:

    What game do you want him to use?

    Not these launch games that are still designed to work on last-gen machines. Look, the main performance cost of RT is frame rate. What do the Velocity Architecture and RDNA2 features like VRS and Mesh Shaders do? They buy back frame time (VRS, for instance, shades low-detail parts of the screen at a coarser rate), which you can take as higher frame rates or spend on stronger raytracing. And once MS adds their DLSS-style solution with ML, you will be blown away.

  6. 9 minutes ago, crispy4000 said:


    Did the same.  He speaks about the SDK being less ready on Microsoft’s side.  But nothing about RT specifically.  And certainly no claims that WD:L is being significantly hamstrung by it.
     

    We’ll see RT optimizations and improvements throughout the gen.  It can only get better down the road.

     

    But I don’t think we’ll be seeing 2080 performance out of these consoles with RT.  That’s not realistic.  We’re far from it now.

    I don't understand why you're using CROSS-GEN games as a barometer for raytracing performance.

  7. 15 minutes ago, BloodyHell said:

    To be fair, I believe the video and think you’re wrong. Both are upscaling. 

    I'm not denying the fact that the X uses upscaling. Crispy is bringing up an OLD post where I repeated what Ubisoft themselves said about the game: they said the Series X would be native 4K and the PS5 would use dynamic 4K.

  8. 2 minutes ago, crispy4000 said:


    They said PS5 was 4K too at one point.  Your mistake was taking the PR as gospel.  There’s a long track record of 4K being fungible to Ubisoft (and most AAA publishers), especially with dynamic drops,  sub-4K averages and reconstruction.  
     

    So yeah.  Don’t be so trusting, and you won’t be disappointed.

    Ole TomCat is just fine. Not my fault Ubisoft don't know what their own teams are doing.

  9. 39 minutes ago, Xbob42 said:

    Ubisoft having to specifically design several settings lower than PC's lowest is... a little worrying.

     

    On the other hand, Legion runs like shit no matter your machine. You can only get it running "acceptably," even with RT off completely. It's not a good gauge of anything.

    Exactly, but crispy is on this mission to prove me wrong and is failing badly.

  10. Just now, crispy4000 said:


    If the config file is to blame, how come it dips down to 1440p?

     

    Keep those blinders on.

    Because it's been reported by developers that AMD's raytracing is not performing correctly and needs a driver update. Remember how badly Nvidia's raytracing performed when first released? It got a lot better after updates. I don't have blinders on; I use my knowledge of this stuff.

  11. 13 minutes ago, crispy4000 said:

     

    The denial here pretty much speaks for itself.
     

    Your 2080 comment stands as false.  The Series X is performing at 2060 S level (sans DLSS) with RT.

     

    It’s even more absurd to think the PS5 is what’s holding it back when the PC exists as it does.

    No, it's the config file that's holding it back, but keep trying. The PS5 is the 2060's equal.

  12. 8 hours ago, crispy4000 said:

    Yep. Xbox Series X runs raytracing in this game like a 2060 Super without DLSS.  Maybe just a tad worse in fidelity, but with dynamic resolution to help framerates along if things dip.  (which isn't possible in the PC build)

    Console settings on Series X (and PS5 too) are below medium RT on PC, the lowest that can be selected in game.


    But a 2060 Super can also run RT on Ultra settings at similar framerates, using DLSS 2.0 for reconstructed 4k.


    Obligatory @TomCat quote:

     

    I've already said that raytracing was a last-second add-on for the Series X because MS was late with the API. I STILL STAND BY MY 2080 COMMENT. Even though he didn't have a PS5 to do an actual test, he did have the configuration files, and they showed the exact same settings for the PS5 and the Series X. We know the Series X has the hardware advantage over the PS5 due to CU count, so it's the PS5 that performs at a 2060 Super level, not the Series X, because we know the config file was set to perform on the least powerful system. They didn't leverage the extra Series X power.

     

    Don't talk to me about cross-gen games. Holler at me when the ACTUAL next-gen games release that leverage the systems' power. Since dynamic resolution keeps coming up, here's what it actually does under the hood; see the sketch below.
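
    The basic idea is a feedback loop: watch GPU frame time, shrink or grow the render resolution to hold the frame-rate target, then upscale the result to 4K. Below is a minimal C sketch of that loop. It's purely illustrative, not Ubisoft's engine code, and every number in it (the 33.3 ms budget, the 0.05 step, the 0.67 floor) is made up for the example:

        /* Toy dynamic-resolution controller. Call once per frame with the
           measured GPU time; returns the render scale for the next frame. */
        float update_render_scale(float scale, float gpu_frame_ms)
        {
            const float budget_ms = 33.3f;            /* 30fps target */

            if (gpu_frame_ms > budget_ms)
                scale -= 0.05f;                       /* over budget: render fewer pixels */
            else if (gpu_frame_ms < budget_ms * 0.9f)
                scale += 0.02f;                       /* headroom: creep back toward native */

            if (scale < 0.67f) scale = 0.67f;         /* floor: ~1440p from a 4K target */
            if (scale > 1.0f)  scale = 1.0f;          /* ceiling: native 4K */
            return scale;                             /* image is then upscaled to output res */
        }

    Something like this is why the resolution dips at night in Legion: the RT reflections blow the frame budget, and the controller trades pixels for frame time.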

  13. 10 hours ago, crispy4000 said:


    Second source now saying that both consoles use image reconstruction.  They found lower resolution bounds and averages than NX Gamer did, more in line with the PS4 Pro's typical resolution counts:
     



    @TomCat counted his chickens before they hatched.

     


     

    Claims of "4K native" for AAA games should always be taken with a heaping spoonful of salt until there's evidence of typical resolution / no reconstruction techniques.  It's like a developer calling their game 60fps when it averages 50fps or lower.

    There was reason to be skeptical here anyway, with Ubisoft being wishy-washy about what framerate the game would run at initially.  I'm glad they made the development choices they did to get it to 60fps, even if the press-release promises were sketchy.

    lol, because I used the WORDS FROM UBISOFT themselves, I'm wrong? Man, you're on a life mission to prove ole TomCat wrong. Keep trying. Ubisoft themselves are the ones that said the Series X would be native 4K and the PS5 would be dynamic.

  14. 14 minutes ago, JPDunks4 said:

     

     

    All I saw on the PS5 thread was about how huge and ugly it is? :p

    Well, they can't talk about features because all they have is the extra-fast SSD. Did that help them get native 4K60 in Valhalla? NOPE. That extra-fast SSD only leads to a couple seconds' faster load times. THAT'S IT.

  15. 12 minutes ago, Mercury33 said:

    Soooo the PS5 thread is filled with “Cant wait to play xxxx” “Cant wait to try out the controller” “Here’s a look at the UI” “here’s a look at xxxx game”

     

    and this thread is just filled with theoretical tech dick measuring. Is this system gonna have/play games, innovate in any way? Or is this just basically X1X 2? Nothing new or interesting, just some cool sick new loading times for your 12 year old games. 

     

    *yawn

    I'm glad a DLC has you so happy in the games department. Keep on believing them Sony lies and their smoke-and-mirrors routine, Mr. "I don't understand what they're talking about so I'll insult the whole thread." Typical SDF response.

  16. 1 hour ago, crispy4000 said:

     

    I expect the PS5 to be weaker with RT, with scaled back settings.  But larger picture, both consoles will show us early on why a mid-cycle upgrade will be needed.  We’re already seeing it with a cross-gen Watch Dogs Legion dipping to ~1440p30 on Series X at nighttime, when it focuses exclusively on RT reflections (rather than Shadows and GI). It wouldn’t be surprising if Ubisoft was using their same upscaling solution as on the One X to get there.

     

    It’s exciting to think everything could get much better with next-gen upscaling.  But it remains to be seen what the compromises and performance gains will be on AMD’s architecture, or even when the hell it’s coming.  Will it have to be trained per game, like DLSS 1.0?  That might be cheaper in performance impact.

     

    Whatever it is, it’s not ready.  We’ll be seeing old upscaling methods used in the meanwhile.

     

    This is why I'm excited.

     

    WCCFTECH.COM

    The first DXR raytracing performance benchmarks of the AMD Radeon RX 6800 "Big Navi" RDNA 2 GPU-based graphics card have leaked out.

     

     

    According to that, the 6800, which has 60 CUs, performs better than a 3070 in raytracing without DLSS active. That's big if true.

     

    WWW.EXTREMETECH.COM

    Hot Chips 32, the (virtual) conference that delves into silicon designs from across the industry, is in full swing this week, and AMD is one of the companies teeing up to talk about its hardware.

     

     

    Now, we know that the Series X has 52 CUs and that MS added the hardware accelerators for raytracing to the die. Here is an excerpt:

     

    The remarks on “Ray-box” and “ray-tri” are likely references to ray-box intersection tests and ray-triangle intersection tests. Both of these are crucial steps in ray tracing, though simply knowing these two numbers doesn’t tell us anything about what the solution will be capable of or how it compares with Nvidia hardware. The talk of a minor area cost for a 3x – 10x acceleration is interesting. One thing we don’t know yet about RDNA2 is whether or not it has any specialized hardware blocks for these types of calculations. Spending more die size to improve performance is not a bad tradeoff, though obviously it’s more impressive when the speedup is towards the 10x end of the range.
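
    Quick translation of the jargon, since "ray-box" and "ray-tri" sound more exotic than they are: a ray-box test just asks whether a ray crosses an axis-aligned bounding box, and the textbook way to do it is the "slab" method, a handful of arithmetic ops per axis. Here's a generic C sketch of that test (the classic software version from the textbooks; names are mine, and this is NOT AMD's hardware implementation):

        #include <math.h>
        #include <stdbool.h>

        typedef struct { float x, y, z; } Vec3;

        /* Slab-method ray/AABB intersection: clip the ray against the
           box's three pairs of parallel planes and see if a non-empty
           interval survives. Assumes no zero direction components;
           real code handles that edge case. */
        bool ray_box(Vec3 o, Vec3 d, Vec3 bmin, Vec3 bmax)
        {
            float org[3] = { o.x, o.y, o.z };
            float dir[3] = { d.x, d.y, d.z };
            float lo[3]  = { bmin.x, bmin.y, bmin.z };
            float hi[3]  = { bmax.x, bmax.y, bmax.z };
            float tmin = 0.0f, tmax = INFINITY;

            for (int i = 0; i < 3; i++) {
                float inv = 1.0f / dir[i];
                float t0  = (lo[i] - org[i]) * inv;
                float t1  = (hi[i] - org[i]) * inv;
                if (t0 > t1) { float t = t0; t0 = t1; t1 = t; }
                if (t0 > tmin) tmin = t0;
                if (t1 < tmax) tmax = t1;
                if (tmin > tmax) return false;   /* slabs don't overlap: miss */
            }
            return true;                         /* ray hits the box */
        }

    Ray-tri is the same idea against a triangle (Möller-Trumbore is the usual algorithm). The dedicated units exist to run enormous numbers of these tests while walking the BVH, which is presumably where the claimed 3x - 10x speedup over doing it in shader code would come from.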

     

     

    But now we know how AMD's raytracing compares to Nvidia's: it's actually better. So add that better performance to the hardware accelerators, and the Series X should perform exceedingly well. I doubt that cross-gen games like Watch Dogs will take advantage of them, because the devs didn't know the hardware was there. But it will be exciting to see how Control performs when they release the patch, and also the future games that are designed from the ground up with all this tech in mind.

     

     

    Oh my, just released. Sounds a lot like what ole TomCat was saying.

     

     

  17. On 10/31/2020 at 1:23 PM, Massdriver said:
    WCCFTECH.COM

    The first DXR raytracing performance benchmarks of the AMD Radeon RX 6800 "Big Navi" RDNA 2 GPU-based graphics card have leaked out.


    I’m very skeptical of this, because if it were this good, I would have thought AMD would have shown it off by now. It’s not confirmed.

    I told y'all it was going to be this good, but I'm a quack according to some of you. Hell, it's better than I actually thought.
