
AbsolutSurgen

Members
  • Posts: 14,953
  • Joined
  • Last visited
  • Days Won: 2

Everything posted by AbsolutSurgen

  1. IMHO, the biggest upgrade for next-gen consoles will be replacing the mobile CPUs with proper CPUs. A 6x GPU increase sounds reasonable given the current progress on desktop GPUs -- which would only put them at about 2x the processing power of an Xbox One X. The biggest challenges they will face are memory, which is ridiculously expensive, and storage. Hard drives are really holding them back today, but SSD prices still haven't come down enough to make them affordable for a console.
  2. Why next-gen consoles shouldn't focus on 'true 4K'

"In the wake of E3, Sony sent over high quality 4K versions of their trailers, allowing us to get a closer, more granular look at how PlayStation 4 Pro is set to deliver the next wave of first-party exclusives for owners of ultra HD displays. Technologies like checkerboarding and temporal injection persist and in all cases, the results are impressive. And that's a good thing, as these techniques - or evolved versions of them - are likely to be a component of games designed for the next generation of consoles. By extension, what that also means is that marketing a new PlayStation or Xbox as a 'true 4K' console, or pushing developers to maximise pixel-counts first and foremost, may not be the best idea.

So, why should this be the case? After all, 1080p resolution effectively became the standard for the PlayStation 4 generation, a 1:1 pixel match for the majority of the displays the consoles were connected to - and by the time PS5 arrives, 4K will be the new standard. Indeed, in terms of what TVs are available on store shelves, it already is. But this introduces the uncomfortable reality that we'll be looking at the biggest gen-on-gen increase in pixel-count since the transition from PS1 to PS2. Gigantic generational leaps in graphical power were commonplace in the early years of the 21st century, but these days, the gains are more slender. And that's a problem bearing in mind how vast a leap 4K actually is - a jump entirely at odds with increases in resolution seen in almost every prior console generation.

Indeed, boosts to pixel count have actually been reducing gen-on-gen as a general trend for over a decade now. PS2 to PS3 saw the jump to high definition, but this still represented a circa 3x boost to the number of pixels the GPU needed to drive the display. And moving forward to the present day, PS4's 1080p standard represented a 2.25x increase over the PS3's 720p. Were the same increase applied in the next transition, we'd be targeting a 2880x1620 resolution - a mere 56 per cent of the area demanded by our 4K flat panels.

Based on the power of the GPUs AMD has delivered and what its roadmaps for future products hint at, a 6x increase in graphics power over PlayStation 4 is conceivable for a next-gen console - 8x at a real stretch - and this ballpark increase in processing power is the general threshold that typically defines a gen-on-gen leap in console performance. However, when looking at prior transitions, the danger in prioritising 'true 4K' across the board is that too much of those extra GPU resources will be spent painting pixels, with not enough power dedicated to providing an actual leap in graphical fidelity - the stuff that actually matters in defining new experiences associated with a new wave of console hardware.

In the meantime, let's also put the GPU into context with what we should expect from the rest of the system. We can reasonably assume that the new wave of consoles will feature far more capable CPUs than the current machines - indeed, Xbox chief Phil Spencer has already talked about next-gen as a rebalancing between CPU and GPU power, opening the door to 60fps and perhaps support for 120Hz displays.

If we are to assume that today's 30fps PS4 Pro experiences are tomorrow's PS5 60fps titles, doubling frame-rate ensures that a good chunk of the extra GPU power is already spoken for, even before we've looked at boosting resolution or introducing additional features that genuinely provide a generational leap in visual quality. Ryzen could also be deployed to simulate far more realistic, more immersive worlds at 30fps - but even then, it'll still need GPU resources to render them, power that won't be available if too much of the graphics core is put to work on servicing the 8.3m pixels on a 4K screen.

With all of this in mind, looking at Sony's latest 'smart upscaling' techniques gives us a valuable insight into the ways that developers of next-gen games can still deliver a proper generational leap without expending too much in the way of GPU power on pixel-pushing. In taking a look at the 4K outputs offered by the likes of Death Stranding, Spider-Man, Ghost of Tsushima and The Last of Us Part 2, it's clear that the foundations are there to ensure that next-gen can actually mean next-gen in terms of that boost to visual quality we really want to see. Insomniac's temporal injection technique does a phenomenal job of transforming native 1440p rendering into a presentation worthy of a 4K screen (Ubisoft's For Honor and Bluepoint's Shadow of the Colossus are similarly impressive), while the checkerboarding techniques of Guerrilla Games are second to none. Meanwhile, The Last of Us Part 2 reminds us that temporal stability and an almost complete removal of aliasing go a long way in delivering excellent image quality, and that core resolution is only one component in what makes for a great-looking game. TLOU2 is a 1440p game and doubtless it would look cleaner running at a higher resolution, but there's no denying that you can see where the GPU power is going and that the trade is obviously worthwhile.

It's also worth taking a look at Guerrilla Games and Kojima Productions' presentation on the Decima Engine, which outlines how 2160p output is achieved, along with quality examples showing the base console's 1080p image, a prospective Pro 1512p output (the highest the team could push native resolution) along with the checkerboard result. Stacked up against an actual 2160p render with 16x super-sampling (!), Guerrilla's engineers get very close in terms of replicating detail, only compromising on sharpness - a topic the team discussed with Digital Foundry previously. If Guerrilla can produce this kind of image quality and 'bang for the buck' on PS4 Pro, just think about the enhanced techniques it could deliver with more powerful hardware.

And let's remember that these results are achieved with relatively slight hardware resources. To scale effectively from 1080p rendering to PS4 Pro's 4K output with only a 2.3x boost to compute, a 24 per cent increase in bandwidth and just an extra 512MB of RAM is a remarkable achievement. It's also worth remembering that what we're seeing today in terms of smart upscaling solutions may only be an intermediate step in the journey, with improved techniques to come. After all, PS4 Pro's checkerboarding is based in part on reprojection techniques first seen in the multiplayer segment of Killzone Shadow Fall, which extrapolated a 1920x1080 output from a native framebuffer based on half that pixel count.

With the experience gained from Pro development combined with more lavish GPU resources, more bandwidth and more RAM, we should expect even more in the generation to come.

Of course, the main challenge facing the platform holders here is ensuring that the kind of results seen from Sony first-parties is accessible across the board to all developers. Pro was initially designed to offer 4K display support to game-makers without the requirement for a lot of development time. However, great checkerboarding solutions from the likes of Guerrilla Games and Housemarque (seriously, check out Resogun on PS4 Pro) took months to properly implement. Meanwhile, too many Pro titles have simply stuck to pushing native resolution to 1440p or thereabouts, while Xbox One X has delivered the power to push more games more closely to native 4K. It turned out that pure horsepower was easier to work with than the implementation of these smart upscaling techniques, and that's something that needs to be addressed.

As Mark Cerny and his Sony team continue to architect PlayStation 5, that's an outstanding issue that we hope to see resolved. As a proof of concept, the engineering team can look back on PS4 Pro as a genuine success in terms of disrupting the relationship between output resolution and the GPU power required to get the job done. In fact, many of the techniques pioneered for PlayStation 4 Pro ended up being deployed on Xbox One X in a range of titles, demonstrating that even when you do have a significant power advantage, technologies like checkerboarding and temporal super-sampling do have a role to play, in addition to more established techniques like dynamic resolution scaling. The question is whether the platform holders should invest more silicon in bespoke hardware - like the Pro's id buffer - or continue to leave the work to developers. 4K displays will be the standard within the lifespan of PlayStation 5, so as difficult as it may be in terms of engine integration, further hardware support at the silicon level could pay dividends.

So, does this mean that we're advocating that no next-gen games should run at 4K? Obviously not. Take the Forza Motorsport franchise, for example. Turn 10 prides itself on running at native resolution. It believes that precision, clean lines and a pristine presentation are a crucial part of the series' DNA - it's key to the developer's vision for the game. But similar to 60fps rendering in the current generation, it's a design goal built into the technology at a very early stage of development, and compromises are made. But equally, this generation has also seen many studios move to a more photo-realistic, filmic aesthetic and, as Spider-Man and The Last of Us Part 2 demonstrate in particular, the importance of native rendering resolution isn't so pronounced - and this kind of aesthetic can integrate beautifully with smart upscaling techniques.

The big takeaway here from my point of view is clear - next-gen hardware design and marketing shouldn't really be defined by native resolution. It was a key point of differentiation for Microsoft with Xbox One X, a product very much aimed at a hardcore niche looking to get the best out of their expensive new TV purchases - but the new wave of machines will need the mainstream appeal that propelled PlayStation 4 to over 80 million sales. The display upgrade in itself is no longer the focus of the experience, and 4K screens can be addressed more efficiently without the need to focus on native resolution rendering.

And that in turn opens the door to a more profound question: just what is next-gen? What are the new ideas that'll shift new hardware? It'll be fascinating to see what Sony and Microsoft come up with, but it'll almost certainly be the case that to achieve these goals, technologies like checkerboarding, temporal super-sampling and dynamic resolution will have a big part to play." (A few rough sketches of the resolution arithmetic, checkerboard reconstruction and dynamic resolution scaling discussed above follow after this list.)
  3. Everything I have played so far has been pretty generic. (Not sure it has much in common with other Ubisoft games.) I'm glad they got rid of the fake edginess from the first game (the random swearing and weird sex is gone). I'm having fun -- but so far, I don't understand why some people claimed it was a big improvement over the first game.
  4. Blackmagic Design Announces Blackmagic eGPU for MacBook Pro ($699). $699 for a 580!
  5. Just googled it. I must be old. We always called them Dungeon Masters.
  6. This may be a stupid question. What does "The GM Screen" mean? (GM means General Motors to me.)
  7. An Apple car will probably have more processing power than a Mac. (For the record, an autonomous vehicle will probably have more processing power than a high end gaming PC too.)
  8. For the record, you were posting about iPhone operating systems -- not Macs.
  9. Because it has the same picture as our supreme leader's avatar...
  10. It's your decision. But including announcements of hipster Mac laptops in a forum that primarily talks about building gaming PCs, video cards, CPU benchmarking and monitors makes little sense to me...
  11. Macs aren't for gaming.... I would suggest the consumer technology forum...
  12. Why do we now have a Mac included with the PC gaming board? +1 for the old names
  13. Watch Dogs 2. Continuing to play a little God of War (after the sun goes down -- my family room is torn apart with no window coverings and I'm getting too much glare on my screen).
  14. Yes, that is IPS -- but it is still a significantly cheaper panel than the one in the $1,800 G-Sync monitor. That monitor has a very low light output (350 cd/m²) and only covers sRGB (which makes it debatable whether it should even be called an HDR monitor).
  15. The $1,800 monitor (at Microcenter) is a premium 144Hz Quantum Dot IPS panel with a 384-zone FALD backlight (don't you love the marketing terms?) and the $400 one is a budget 60Hz TN panel. I wouldn't want to pay $1,800 for a monitor.... But I still want one of these...
  16. That number was based on pure speculation by one review site. It is not a credible number. The original quote that is getting re-reported was: "I wouldn't be surprised to see that this FPGA alone makes up $500 of the final price point of these new displays, let alone the costly DDR4 memory."
  17. Topic thread should be -- Unreal Asset Marketplace is getting its ass kicked by Unity. Unreal lowers their cut in the hope that creators will bother to upload their assets for Unreal...
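
To go with post 2 above, here are a few rough sketches of the numbers and techniques the quoted article describes. First, the gen-on-gen resolution arithmetic is easy to verify; the figures and resolutions come from the article, the PS2-era 640x480 baseline is an assumption, and the Python below is purely illustrative:

```python
# Gen-on-gen pixel counts, as discussed in the quoted Digital Foundry piece.
RESOLUTIONS = {
    "480p (assumed PS2-era baseline)": (640, 480),
    "720p (PS3)": (1280, 720),
    "1080p (PS4)": (1920, 1080),
    "1620p (hypothetical next-gen target)": (2880, 1620),
    "2160p / 4K": (3840, 2160),
}

pixels = {name: w * h for name, (w, h) in RESOLUTIONS.items()}

# ~3x from SD to 720p, 2.25x from 720p to 1080p, 4x from 1080p to 4K.
print(pixels["720p (PS3)"] / pixels["480p (assumed PS2-era baseline)"])   # 3.0
print(pixels["1080p (PS4)"] / pixels["720p (PS3)"])                       # 2.25
print(pixels["2160p / 4K"] / pixels["1080p (PS4)"])                       # 4.0

# 2880x1620 covers only ~56% of a 4K panel's pixel area.
print(pixels["1620p (hypothetical next-gen target)"] / pixels["2160p / 4K"])  # 0.5625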
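The checkerboard rendering and reprojection techniques mentioned in the article shade only half of the output pixels each frame and fill in the rest from the previous frame. The toy NumPy sketch below is not Sony's implementation: it ignores motion vectors, the id buffer and every real-world complication, and simply alternates the checkerboard phase over a static image to show the basic idea.

```python
import numpy as np

def checkerboard_mask(height, width, phase):
    """Boolean mask selecting half the pixels in a checkerboard pattern."""
    yy, xx = np.mgrid[0:height, 0:width]
    return (xx + yy + phase) % 2 == 0

# "Ground truth" image we would like to present at full resolution.
truth = np.random.rand(4, 8)

history = np.zeros_like(truth)            # previously reconstructed frame
for frame in range(2):                    # two frames cover every pixel once
    mask = checkerboard_mask(*truth.shape, frame % 2)
    current = history.copy()              # start from the old frame...
    current[mask] = truth[mask]           # ...and shade only half the pixels
    history = current

assert np.allclose(history, truth)        # a static scene converges in two frames
```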
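Dynamic resolution scaling, also name-checked in the article, is the simpler cousin of these techniques: the render resolution is adjusted frame to frame to hold a frame-time budget, and the result is upscaled to the display. A minimal sketch, assuming a hypothetical per-frame GPU timing measurement feeds it:

```python
TARGET_MS = 16.7                     # frame budget for 60fps
NATIVE_W, NATIVE_H = 3840, 2160      # 4K output target
MIN_SCALE, MAX_SCALE = 0.5, 1.0      # clamp the per-axis resolution scale

def next_scale(current_scale, last_frame_ms):
    """Pick the next per-axis render scale from the last measured GPU frame time."""
    area_ratio = TARGET_MS / max(last_frame_ms, 1e-3)   # over/under budget (pixel area)
    new_scale = current_scale * area_ratio ** 0.5       # sqrt converts area to axis scale
    return min(MAX_SCALE, max(MIN_SCALE, new_scale))

# Example: a frame that took 20ms at full 4K drops the next frame's render target
# to roughly 3509x1974, which would then be upscaled to the 2160p output.
s = next_scale(1.0, 20.0)
print(round(NATIVE_W * s), round(NATIVE_H * s))
```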