
David Cage talks next-gen console benchmarks


crispy4000


No discussion of framerate or upscaling techniques, unfortunately.

 

Quote

Cage admitted that new technology does have a lot of strengths such as “a significant improvement in CPU power, which will mean significant improvements in physics and AI. GPU improvement should be enough to get Ray Tracing in Full HD [1080p] or Full 4K resolution (without Ray Tracing).” Despite how great these features sound, David Cage was sure to state that “All in all, we believe that there will be serious improvements in next-gen games, but maybe not the ones that are currently promoted the most.”

 

Two of the biggest things that David Cage says fans currently seem to be unaware of are how “all parameters are linked” for the aforementioned improvements and how “improving one part of the hardware only makes sense if all is proportionally accelerated to avoid bottlenecks.” As an example, Cage discussed the 8K support that is being teased by companies like Microsoft. “You can have 8K content only if you have an 8K screen. You can do 8K, but probably not 8K AND Ray Tracing. If you have 8K content, the volume of your assets will grow very significantly, their size in memory and on your hard drive too. You will also need to load them very quickly from your storage device to the memory, so this pipeline will also need to be proportionally faster.”

 

While 8K sounds great on paper, Cage demonstrated to DualShockers how these seemingly simple improvements can snowball into a whole suite of new problems that need solutions. Quantic Dream has even already analyzed how the PS5 and Project Scarlett will likely be utilized by developers. “Our current analysis is that few studios will go for 8K because it will necessitate too many compromises on the overall quality of the game. Ray Tracing is going to be so costly that we will probably only see Full HD [1080p] titles using it (there is a direct connection between resolution, Ray Tracing and performances), at least in the first generation of titles,” he said.
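
For a rough sense of scale on the “everything grows with resolution” point, here’s some back-of-the-envelope math. The bytes-per-pixel figure is just an assumed round number for illustration, not something pulled from a real engine:

```python
# Rough back-of-envelope math on how pixel counts (and everything tied to
# them) scale from 1080p to 8K. BYTES_PER_PIXEL is an assumption, purely
# illustrative, not a measurement from any real engine.

RESOLUTIONS = {
    "1080p": (1920, 1080),
    "4K":    (3840, 2160),
    "8K":    (7680, 4320),
}

BYTES_PER_PIXEL = 16   # assumed G-buffer cost per pixel (several render targets)
TARGET_FPS = 60

for name, (w, h) in RESOLUTIONS.items():
    pixels = w * h
    framebuffer_mb = pixels * BYTES_PER_PIXEL / 1e6
    # Bandwidth needed just to write those render targets once per frame
    bandwidth_gbs = framebuffer_mb * TARGET_FPS / 1e3
    scale = pixels / (1920 * 1080)
    print(f"{name}: {pixels / 1e6:5.1f} Mpixels ({scale:4.1f}x 1080p), "
          f"~{framebuffer_mb:4.0f} MB of render targets, "
          f"~{bandwidth_gbs:5.1f} GB/s at {TARGET_FPS} fps")
```

Pixel count alone goes up 16x from 1080p to 8K, and that multiplier cascades into asset sizes, memory and streaming, which is the snowballing Cage is describing.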
 


He goes on a bit further about next-gen budgets being impacted:
https://www.dualshockers.com/ps5-xbox-scarlett-problems-david-cage-quantic-dream/


Sounds like a 1440p+ Raytracing standard might take another mid-cycle refresh.  Or an Nvidia card. :p


Can't wait for him to squander all the potential on some hackneyed pile of shit with a basic and obvious morality tale so thinly veiled in its allegories that only David Cage himself could consider it anything but a literal explanation of what you're supposed to take away from the story!


Top it all off with some awful gameplay and you've made a Cage Pie!


3 minutes ago, SFLUFAN said:

I see nothing to disagree with in his analysis that lighting will be more important than resolution for the best visual fidelity in the next generation.

 

I'd agree with that up to a point.  But I wouldn't agree that 1080p is that threshold.


2 minutes ago, crispy4000 said:

Wow, not the reaction I was expecting.  You guys really think he doesn't know his tech?

Oooooooohhhh he probably does. I just can't take the guy whose pitch for his recent game was "what if the robots in blade runner were actually the sympathetic characters" seriously on anything. 


The thing about "ray tracing" is that you can have VERY little of it happening in your engine and it can still, technically, be said that the game supports ray tracing. You can guarantee that we will be fighting about the level and quality of RT happening on consoles and PCs for years to come! :p 
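
To put that in concrete terms, here's a toy sketch of why the "supports ray tracing" checkbox means so little. All the class and field names are made up for illustration; the point is just that tracing rays for a single effect, at a quarter of output resolution, still technically counts:

```python
# Toy illustration: an engine can trace rays for one effect, at a fraction of
# output resolution, and "supports ray tracing" is still technically true.
# Every name and number here is made up for illustration.

from dataclasses import dataclass

@dataclass
class RayTracingConfig:
    reflections: bool = False
    shadows: bool = False
    global_illumination: bool = False
    ray_resolution_scale: float = 1.0   # fraction of output resolution traced

    def supports_ray_tracing(self) -> bool:
        # Any single effect enabled is enough for the bullet point on the box.
        return self.reflections or self.shadows or self.global_illumination

minimal = RayTracingConfig(shadows=True, ray_resolution_scale=0.25)
full_fat = RayTracingConfig(reflections=True, shadows=True,
                            global_illumination=True, ray_resolution_scale=1.0)

for cfg in (minimal, full_fat):
    print(cfg, "->", "RT supported" if cfg.supports_ray_tracing() else "no RT")
```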


27 minutes ago, crispy4000 said:

Wow, not the reaction I was expecting.  You guys honestly think he doesn't know his tech?

I'm sure he knows it a lot better than I do, I just don't really care what he has to say about it! Also he hasn't appeared to come to any conclusions we didn't independently come to while shooting the shit here; power is far less powerful with bottlenecks, even a single one (that's why it's called a bottleneck, after all), developers prioritize the flashiest shit they can, so they're likely to come up with some mix of graphical settings that makes for impressive screenshots or trailers, while running at a sub-optimal resolution and absolutely fucking definitely a sub-optimal framerate.


5 minutes ago, AbsolutSurgen said:

If a 2080Ti struggles to go above 1080p with ray tracing, and can’t run modern games at 8k, then why would anyone think an AMD tech console would do better?

 

I thought his comments equated to “captain obvious” talks about next gen limitations. 

Can't wait to hear DF salivate over some blurry ass garbage and how it looks "almost as good as real 8k despite only being 1925 x 1083"


53 minutes ago, AbsolutSurgen said:

If a 2080Ti struggles to go above 1080p with ray tracing, and can’t run modern games at 8k, then why would anyone think an AMD tech console would do better?

 

I thought his comments equated to “captain obvious” talks about next gen limitations. 

 

It’s not entirely accurate to say Nvidia’s upper end cards struggle at 1080p.  If you kick up the rays to high or ultra, perhaps, depending on the utilization.  But if you stick to medium rays (as consoles will inevitably resort to) there’s a lot more wiggle room and potential to hit high resolutions and 60fps.  
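
Some rough arithmetic on that resolution / framerate / ray-quality triangle. Nothing here is a benchmark; it just shows how the ray workload scales relative to a 1080p / 30fps / 1-ray-per-pixel baseline (the samples-per-pixel figure is an assumption for illustration):

```python
# Rough arithmetic: rays per second needed at different resolution/framerate
# combinations, relative to a 1080p30 baseline. Not a benchmark of any card.

SAMPLES_PER_PIXEL = 1   # assumed rays traced per pixel per frame

CONFIGS = [
    ("1080p @ 30", 1920 * 1080, 30),
    ("1080p @ 60", 1920 * 1080, 60),
    ("1440p @ 30", 2560 * 1440, 30),
    ("1440p @ 60", 2560 * 1440, 60),
    ("4K    @ 30", 3840 * 2160, 30),
    ("4K    @ 60", 3840 * 2160, 60),
]

baseline = 1920 * 1080 * 30 * SAMPLES_PER_PIXEL
for label, pixels, fps in CONFIGS:
    rays_per_second = pixels * fps * SAMPLES_PER_PIXEL
    print(f"{label}: {rays_per_second / 1e6:7.1f} Mrays/s "
          f"({rays_per_second / baseline:4.1f}x the 1080p30 cost)")
```

Going from 1080p30 to 4K60 is an 8x jump in ray work at the same per-pixel quality, which is why dropping to medium rays (fewer rays per pixel, or tracing at a lower resolution) buys back so much headroom.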

 

1440p, dynamic res, temporal upscaling, and cutting back to 30fps are all options on the table too.  Those with weaker RTX cards have been able to resort to some of that.  Though it could change with next gen game demands.  (One of the reasons I’m not buying these cards yet)
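
Dynamic res in particular is pretty simple conceptually. A minimal sketch of the idea, with arbitrary step sizes and thresholds, nothing tuned the way a shipping engine would be:

```python
# Minimal sketch of a dynamic resolution controller: nudge the internal render
# scale up or down based on how the last frame compared to the frame-time
# budget. Step sizes and thresholds are arbitrary assumptions.

TARGET_FRAME_MS = 33.3   # 30 fps budget
MIN_SCALE, MAX_SCALE = 0.5, 1.0

def update_render_scale(scale: float, last_frame_ms: float) -> float:
    if last_frame_ms > TARGET_FRAME_MS * 1.05:    # over budget: drop resolution
        scale -= 0.05
    elif last_frame_ms < TARGET_FRAME_MS * 0.90:  # comfortably under: raise it
        scale += 0.02
    return max(MIN_SCALE, min(MAX_SCALE, scale))

scale = 1.0
for frame_ms in [30.0, 36.0, 38.0, 34.0, 31.0, 28.0, 27.0]:
    scale = update_render_scale(scale, frame_ms)
    print(f"frame took {frame_ms:4.1f} ms -> render at {scale:.2f}x "
          f"({int(2560 * scale)}x{int(1440 * scale)})")
```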

 

My takeaway here still is that AMD hasn’t found a way to effectively compete with Nvidia’s raytracing acceleration.  If 1080p with RT is all they can muster at console AAA standards (30 FPS, etc), I’m waiting for a hardware revision.

 

I’m not going to be happy unless 1440p30 is doable with RT.  There’s never been a console generation before where resolution standards went backwards (or even stayed static) for the sake of fidelity.  Maybe the mid-cycle revisions are to blame.


2 hours ago, Xbob42 said:

I'm sure he knows it a lot better than I do, I just don't really care what he has to say about it! Also he hasn't appeared to come to any conclusions we didn't independently come to while shooting the shit here; power is far less powerful with bottlenecks, even a single one (that's why it's called a bottleneck, after all), developers prioritize the flashiest shit they can, so they're likely to come up with some mix of graphical settings that makes for impressive screenshots or trailers, while running at a sub-optimal resolution and absolutely fucking definitely a sub-optimal framerate.

 

It’s never been said until now that consoles would be limited to 1080p with RT.  Maybe it wasn’t likely, but it was never probable enough to be a foregone conclusion IMO.  Especially with consoles generally being able to punch above their weight due to optimizations.

 

This is also the first time it’s been alluded to by a developer.  And the first time I’ve seen any dev call out the 8K thing as mostly unrealistic for these machines.


6 minutes ago, crispy4000 said:

 

It’s never been said until now that consoles would be limited to 1080p with RT.  Maybe it wasn’t likely, but it was never probable enough to be a foregone conclusion IMO.  Especially with consoles generally being able to punch above their weight due to optimizations.

 

This is also the first time it’s been alluded to by a developer.  And the first time I’ve seen any dev call out the 8K thing as mostly unrealistic for these machines.

I'm like 90% sure we actually discussed the assumption that next gen consoles would require a low resolution to use any real amount of ray tracing, not as devs and not specifically citing 1080, but definitely close enough based on the real world performance of current cards and AMD being weaker at RT on top of limited budgets and time constraints -- the timing just seemed really bad to get anything resembling mature tech for this. Hence why we're not exactly surprised. It's interesting to hear it directly from a developer, of course, but I'd be more interested in whether or not they're gonna try faking it somehow, like with checkerboarding. Get some checkerboarded ass ray tracing, that'd be funny.
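
Checkerboarded ray tracing isn't even that far-fetched as a joke: shading half the pixels per frame and reconstructing the rest is exactly the kind of trick that tends to get used. A toy version of the idea (real implementations use motion vectors and much smarter reconstruction; this just shows the alternating pattern):

```python
# Toy checkerboard: trace only every other pixel each frame, alternating the
# pattern, and keep whatever the previous frame produced for the gaps.

WIDTH, HEIGHT = 8, 4   # tiny "framebuffer" so the pattern is easy to see

def trace_pixel(x: int, y: int, frame: int) -> str:
    return str(frame)   # stand-in for an actual ray-traced result

def checkerboard_frame(frame: int, previous: list) -> list:
    image = [row[:] for row in previous]
    for y in range(HEIGHT):
        for x in range(WIDTH):
            if (x + y + frame) % 2 == 0:   # this frame's half of the pixels
                image[y][x] = trace_pixel(x, y, frame)
    return image

image = [["." for _ in range(WIDTH)] for _ in range(HEIGHT)]
for frame in range(2):
    image = checkerboard_frame(frame, image)
    print(f"after frame {frame}:")
    print("\n".join(" ".join(row) for row in image))
```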


1 hour ago, Keyser_Soze said:

Benchmarks don't matter much when you make glorified point and click games.

 

To me that sounds like part of the problem.  Simple game, still won’t run higher than 1080p with Raytracing.

 

Makes me wonder if even that is feasible for open world stuff (early gen).

