
Rumour: PlayStation 5 Is More Powerful Than Xbox Scarlett, According To Devs


AbsolutSurgen

Recommended Posts

4 hours ago, TomCat said:

Phil has said no mid-gen refresh; this system will be strong enough that it won't need one.

 

3 hours ago, TomCat said:

He did, it wasn't the Giant Bomb interview.

 

I’m still getting through their E3 stuff.  But even if you’re right that he actually said it (which I would find hard to believe), taking that seriously is a joke.

 

Of course a mid-gen refresh could, and likely would, happen again.  How long has it been since the X, the “most powerful console ever”? And how soon are we getting a new console now?

Link to comment
Share on other sites

3 hours ago, JPDunks4 said:

People will still claim superiority of course, but the difference between 1080p and 720p is a lot more apparent and obvious than 4K vs. 4K checkerboarded, or whatever the differences might be going forward in future hardware.

 

The higher the resolutions get, the more you run into diminishing returns.  With all the different rendering techniques in use, it's difficult to figure out exactly what is what anyway.

 

360 vs PS3 was mostly an architecture disparity.  Cell sucked to work with; the 360 was easy for developers.  Sony learned their lesson.

 

XB1 vs PS4 was a straight-up power issue along with architecture.  The PS4 was a dream to work with; the XB1 was an underpowered, convoluted multimedia device.  Microsoft learned their lesson.

 

Now unless one of them is dumb enough to repeat old mistakes, I expect both to have a very developer-friendly box, both 100% focused on game performance.  I think one of the main differentiators may be PSVR, and what MS is going to do about VR.

 

I’d be with you if raytracing wasn’t a thing.  It’s as simple as that, really.  Performance and resolution will remain an issue because of it.  Even more so if there’s nothing specific in AMD’s chip design (i.e., a tensor-core equivalent) to facilitate it.

 

So no, these boxes won’t be 100% performance-focused.  New rendering techniques, which they’re already both bragging about, will eat away at that priority significantly.

 

Some devs will do a great job balancing RT with performance.  Others will have shit optimization and still stick with it.  And with a mid-gen refresh, those problems could compound themselves over time for these start-of-gen machines.
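
For reference, the raw pixel math behind both posts is easy to sketch. A minimal back-of-envelope in Python (the "checkerboard shades roughly half the pixels per frame" figure is a common approximation of the technique, not an exact spec):

```python
# Back-of-envelope pixel counts behind the diminishing-returns argument.

RESOLUTIONS = {
    "720p":  (1280, 720),
    "1080p": (1920, 1080),
    "4K":    (3840, 2160),
    "8K":    (7680, 4320),
}

def pixels(name):
    w, h = RESOLUTIONS[name]
    return w * h

steps = ["720p", "1080p", "4K", "8K"]
for lo, hi in zip(steps, steps[1:]):
    print(f"{lo} -> {hi}: {pixels(hi) / pixels(lo):.2f}x the pixels")

# Checkerboard rendering shades roughly half the pixels of native each
# frame and reconstructs the rest (an approximation, not an exact spec):
print(f"4K checkerboard shades ~{pixels('4K') // 2:,} of {pixels('4K'):,} pixels")
```

Each step up multiplies the pixel cost by 2.25-4x while the visible difference keeps shrinking, which is the diminishing-returns point; and a half-pixel checkerboard gap is far subtler than the 2.25x gap between 720p and 1080p ever was.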

Link to comment
Share on other sites

1 hour ago, crispy4000 said:

 

I’d be with you if raytracing wasn’t a thing.  It’s as simple as that, really.  Performance and resolution will remain an issue because of it.  Even more so if there’s nothing specific in AMD’s chip design (i.e., a tensor-core equivalent) to facilitate it.

 

So no, these boxes won’t be 100% performance-focused.  New rendering techniques, which they’re already both bragging about, will eat away at that priority significantly.

 

Some devs will do a great job balancing RT with performance.  Others will have shit optimization and still stick with it.  And with a mid-gen refresh, those problems could compound themselves over time for these start-of-gen machines.

 

I honestly don't know a ton about ray tracing, so that may be why I'm just looking at it wrong.  I just think both will be aiming at roughly the same benchmarks, performance, and price range, and both are probably working pretty closely with developers on deciding what specs to aim for.  If Microsoft goes to a dev and asks whether this is suitable for what they need, and they're like, "well, it's good, but something else might be better," I think MS would re-evaluate what they're doing, and vice versa.

 

Either way, we won't know shit til at least next E3 probably.  

Link to comment
Share on other sites

14 hours ago, TomCat said:

Phil has said no mid-gen refresh; this system will be strong enough that it won't need one.

 

Lol

Except when people are actually trying to render games in 8K in five-ish years and they do a refresh to make that plausible, just like with this gen and 4K. :p 

Link to comment
Share on other sites

17 hours ago, JPDunks4 said:

 

I honestly don't know a ton about ray tracing, so that may be why I'm just looking at it wrong.  I just think both will be aiming at roughly the same benchmarks, performance, and price range, and both are probably working pretty closely with developers on deciding what specs to aim for.  If Microsoft goes to a dev and asks whether this is suitable for what they need, and they're like, "well, it's good, but something else might be better," I think MS would re-evaluate what they're doing, and vice versa.

 

Either way, we won't know shit til at least next E3 probably.  

 

It’s not a matter of Microsoft budging if developers want more.  It’s a matter of the tech not being affordable, or frankly, just not there yet.  Giving developers everything they want RT-wise (shadows + lighting + reflections) isn’t possible today, even in games with current-gen targets.  Nvidia is attempting to give them some of what they want and is designing its chips with that focus.  AMD, from what we know, is behind even that curve.

 

Hitting the same benchmarks the Pro aims for today (4K checkerboarded at 30fps) at medium raytracing settings would take a $500 Nvidia GPU.  That’s also with current-gen targets, not games that push other aspects of the GPU further.  4K60 checkerboarded?  Bump that up to an $800 GPU.  4K60 native?  $1,000+.

 

This is why performance will continue to be a problem.  Some developers will opt out of raytracing for the sake of framerate and resolution.  It’ll be nice for that option to exist.

 

But for games that want to be bleeding edge, performance will remain an issue.  And unless AMD cooks up some secret sauce themselves, the metrics can't be expected to be good.
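
Rough numbers make the scaling problem concrete. Here's a sketch of the ray budget at various targets, assuming a purely illustrative one ray per pixel (real effects cast several, plus bounces):

```python
# Ray throughput needed just to trace ONE ray per pixel at each target.
# Real ray-traced effects (shadows + lighting + reflections) cast several
# rays per pixel, plus secondary bounces, so multiply accordingly.

def gigarays_per_sec(width, height, fps, rays_per_pixel=1):
    return width * height * fps * rays_per_pixel / 1e9

targets = [
    ("1080p60", 1920, 1080, 60),
    ("4K30",    3840, 2160, 30),
    ("4K60",    3840, 2160, 60),
]

for name, w, h, fps in targets:
    print(f"{name}: ~{gigarays_per_sec(w, h, fps):.2f} Gigarays/s at 1 ray/pixel")
```

For scale, Nvidia quotes ~10 Gigarays/s for a 2080 Ti, but the usable budget in a real game is far lower once shading and denoising take their share of the frame, which is exactly why resolution and framerate get traded away.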

Link to comment
Share on other sites

On 7/1/2019 at 6:31 PM, AbsolutSurgen said:

PlayStation 5 APU To Offer Better Performance Than NVIDIA’s GTX 1080 – Rumor

 

A rumour going the other way, this one comparing the PS5 to a 1080 -- which only has ~8 TFLOPS of power.  I had expected something closer to the 12-14 TFLOPS range.

The current gen at release in 2013 was roughly equivalent to the previous-gen Nvidia x80 GPU, so this seems about right. Even though it's a little dated, it's still a lot of power to give devs as an across-the-board standard to optimize to.
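
For context on the numbers being thrown around: peak FP32 TFLOPS is just shader cores x clock x 2 (one fused multiply-add counts as two FLOPs). A quick sketch using the published specs of the parts mentioned in this thread:

```python
# Peak FP32 throughput: cores x clock (GHz) x 2 FLOPs per FMA, per cycle.
# Core counts and clocks are the published specs for each part.

def tflops(cores, clock_ghz):
    return cores * clock_ghz * 2 / 1000  # giga-FLOPs -> tera-FLOPs

gpus = {
    "PS4 (2013)":        (1152, 0.800),  # 18 CUs x 64 shaders
    "Xbox One X (2017)": (2560, 1.172),  # 40 CUs x 64 shaders
    "GTX 1080 (boost)":  (2560, 1.733),
}

for name, (cores, clock) in gpus.items():
    print(f"{name}: ~{tflops(cores, clock):.1f} TFLOPS")
```

That gives ~1.8, ~6.0, and ~8.9 TFLOPS respectively. The usual caveat: TFLOPS across different architectures (GCN vs. Pascal) are not directly comparable, which is part of why these rumor comparisons are slippery.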

Link to comment
Share on other sites

1 hour ago, Dexterryu said:

The current gen at release in 2013 was roughly equivalent to the previous-gen Nvidia x80 GPU, so this seems about right. Even though it's a little dated, it's still a lot of power to give devs as an across-the-board standard to optimize to.

That would be a 2080 Super w/ tensor cores?

Link to comment
Share on other sites

2 hours ago, Dexterryu said:

The current gen at release in 2013 was roughly equivalent to the previous-gen Nvidia x80 GPU, so this seems about right. Even though it's a little dated, it's still a lot of power to give devs as an across-the-board standard to optimize to.

 

The launch Xbone’s GPU is roughly equivalent to a GTX 750 or an HD 7790; the launch PS4’s is roughly equivalent to a GTX 760 or an HD 7850. That would be a long way off from a 1080 equivalent next fall.

Link to comment
Share on other sites

4 hours ago, jaethos said:

The 1080 is going to be four years old by the time these come out. If they can't match that, then it's really more a testament to how poorly GPUs have advanced over that time period.

 

Honestly, I'd hope they'd both be at least as powerful as a 1080 Ti or 2080, considering they're hoping for native 4K at 30-60fps with ray tracing.

Link to comment
Share on other sites

On 7/1/2019 at 6:31 PM, AbsolutSurgen said:

PlayStation 5 APU To Offer Better Performance Than NVIDIA’s GTX 1080 – Rumor

 

A rumour going the other way, this one comparing the PS5 to a 1080 -- which only has ~8 TFLOPS of power.  I had expected something closer to the 12-14 TFLOPS range.

 

I hope that rumor isn't true. That would be a massive disappointment for me.

Link to comment
Share on other sites

➤ PC Games Hardware: Does the previously mentioned 4x figure relate to the performance of the complete console?

Phil Spencer: No, that's a pure CPU statement. It would also be a little too simplistic to apply it to the whole system, as much as I would like to, because so many components feed into it. Take the Xbox One X: in its development, memory bandwidth was the bottleneck. It had to be large enough to feed the GPU content without idle time. We could have brought the console to market a year earlier, but we waited another year to get all 6 of the GPU's teraflops up and running.

Our primary goal with Scarlett was to improve the graphics capabilities and GPU of the console, not least because another goal was to integrate a CPU into the system that can keep pace with the GPU. Unlike PCs, consoles have historically been arm wrestlers with one strong arm - the GPU - and one weak arm - the CPU, which does little more than push out the frames calculated by the GPU as fast as possible, often capped at 30 fps.
Now we are talking about 120 Hertz or variable refresh rates, because if the timing of the game loop - the core routines of a game - matches the refresh rate, it reduces input latency and ensures a smooth gaming experience. And that depends largely on the CPU and memory bandwidth. That's why a statement like "Scarlett is x times faster than Xbox One X" has to be viewed with a bit more nuance.

 

Once again I was right: the 4x power figure refers only to the CPU.

 

He also confirmed that the CPU in the 1X was the main reason they couldn't do 60fps, like I said.

 

https://www.pcgameshardware.de/Xbox-Next-Scarlett-Konsolen-268616/Specials/Phil-Spencer-Das-Mega-Interview-1293543/amp/
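
The frame-budget arithmetic behind Spencer's game-loop point is simple to lay out. A sketch (the worst-case latency model here is an illustrative assumption, not from the interview):

```python
# Per-frame time budget at common refresh rates. If the game loop (input,
# simulation, draw-call submission) overruns the budget, the frame slips
# and input latency grows -- the CPU-side problem Spencer describes.

for hz in (30, 60, 120):
    budget_ms = 1000.0 / hz
    # Illustrative worst case: input sampled at the start of one frame
    # isn't visible until the end of the next frame (two full frames).
    worst_latency_ms = 2 * budget_ms
    print(f"{hz} Hz: {budget_ms:6.2f} ms/frame, ~{worst_latency_ms:5.1f} ms worst-case input lag")
```

At 120 Hz the loop has only ~8.3 ms per frame, a quarter of the 30fps budget, which is why a CPU that merely kept pace with last gen's 30fps targets won't cut it.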

Link to comment
Share on other sites

On 7/3/2019 at 1:17 PM, Spork3245 said:

 

The launch Xbone’s GPU is roughly equivalent to a GTX 750 or an HD 7790; the launch PS4’s is roughly equivalent to a GTX 760 or an HD 7850. That would be a long way off from a 1080 equivalent next fall.

 

Yep... and I hope it's more along the lines of a 2080, personally. The point I'm trying to make is that both the 1080 and the 2080 can produce some damn good graphics if set as the absolute minimum baseline. As in, the games we're seeing now are mostly developed to achieve 1080p/30fps on a GTX 750/760, since the base Xbox One and PS4 are the specs all developers have to optimize for. So even if it only ends up being roughly equal to a 1080, that's still a huge jump.

Link to comment
Share on other sites

48 minutes ago, Dexterryu said:

 

Yep... and I hope it's more along the lines of a 2080, personally. The point I'm trying to make is that both the 1080 and the 2080 can produce some damn good graphics if set as the absolute minimum baseline. As in, the games we're seeing now are mostly developed to achieve 1080p/30fps on a GTX 750/760, since the base Xbox One and PS4 are the specs all developers have to optimize for. So even if it only ends up being roughly equal to a 1080, that's still a huge jump.

 

Considering the X is using roughly the equivalent of a 580/1060, which isn’t that far off from a 1080 in terms of a generational improvement, and it struggles with rendering 4K at 30fps, I’d say that’s still a not-so-great metric. A 1080 Ti equivalent is the bare minimum one should be hoping for.

Link to comment
Share on other sites

50 minutes ago, Spork3245 said:

 

Considering the X is using roughly the equivalent of a 580/1060, which isn’t that far off from a 1080 in terms of a generational improvement, and it struggles with rendering 4K at 30fps, I’d say that’s still a not-so-great metric. A 1080 Ti equivalent is the bare minimum one should be hoping for.

The GPU was hampered by an underperforming CPU. That won't be the case with the new systems. I expect the GPU in Scarlett to have twice the performance of the 1X's GPU, putting it in the 12-14 teraflop range.

Link to comment
Share on other sites

21 minutes ago, TomCat said:

The GPU was hampered by an underperforming CPU. That won't be the case with the new systems. I expect the GPU in Scarlett to have twice the performance of the 1X's GPU, putting it in the 12-14 teraflop range.

 

CPUs are not the bottleneck at 4K.

Link to comment
Share on other sites

38 minutes ago, Spork3245 said:

 

Considering the X is using roughly the equivalent of a 580/1060, which isn’t that far off from a 1080 in terms of a generational improvement, and it struggles with rendering 4K at 30fps, I’d say that’s still a not-so-great metric. A 1080 Ti equivalent is the bare minimum one should be hoping for.

The X is roughly equivalent to a 1070 at ~6 TFLOPS.  It certainly struggles at 4K60 -- I thought it handled 4K30 in most cases?
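
One way to see the 4K30-vs-4K60 gap: spread the X's ~6 TFLOPS across the pixels it has to shade each second. A rough per-pixel budget, pure arithmetic that ignores memory bandwidth, CPU limits, and non-shading GPU work:

```python
# Rough per-pixel shading budget: peak FLOPS divided by shaded pixels/sec.
# Ignores memory bandwidth, CPU limits, and non-shading GPU work.

def flops_per_pixel(tflops, width, height, fps):
    return tflops * 1e12 / (width * height * fps)

X1X_TFLOPS = 6.0  # Xbox One X peak FP32

for fps in (30, 60):
    budget = flops_per_pixel(X1X_TFLOPS, 3840, 2160, fps)
    print(f"4K{fps}: ~{budget:,.0f} FLOPs per pixel per frame")
```

Doubling the framerate at the same resolution halves the per-pixel budget, so when a game is GPU-bound, 4K60 needs roughly twice the GPU that 4K30 does.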

Link to comment
Share on other sites
