
Microsoft announces "DirectX 12 Ultimate" to unify PC and Xbox Series X APIs


Commissar SFLUFAN

Recommended Posts

DirectX 12 Ultimate brings Xbox Series X features to PC gaming

Quote

Today, Microsoft is announcing a new version of its gaming and multimedia API platform, DirectX. The new version, DirectX 12 Ultimate, largely unifies Windows PCs with the upcoming Xbox Series X platform, offering the platform's new precision rendering features to Windows gamers with supporting video cards.

 


Variable rate shading and sampler feedback look like they could do wonders for performance if devs make good use of them. The sampler feedback in particular is supposed to allow the Series X to behave as though it has more than 16GB of RAM, according to MS.

Since EVGA replaced my 1080ti with an RTX 2080 I'm ready to go for these things on my PC.
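For what it's worth, whether a given PC card actually exposes these features is something a game can query at runtime through the D3D12 feature-support structs. A minimal sketch, assuming an already-created ID3D12Device; the function name and printout are just illustrative, only the D3D12 structs and enums are real:

```cpp
#include <d3d12.h>
#include <cstdio>

// Report whether the "DirectX 12 Ultimate" feature set is exposed by the driver.
// (If a CheckFeatureSupport call fails, the zero-initialized structs read as "not supported".)
void ReportDx12UltimateSupport(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};  // raytracing tier
    D3D12_FEATURE_DATA_D3D12_OPTIONS6 opts6 = {};  // variable rate shading tier
    D3D12_FEATURE_DATA_D3D12_OPTIONS7 opts7 = {};  // sampler feedback + mesh shader tiers

    device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5, &opts5, sizeof(opts5));
    device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS6, &opts6, sizeof(opts6));
    device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS7, &opts7, sizeof(opts7));

    std::printf("DXR 1.1: %s\n",
        opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_1 ? "yes" : "no");
    std::printf("VRS tier 2 (per-draw, per-primitive, screen-space rates): %s\n",
        opts6.VariableShadingRateTier >= D3D12_VARIABLE_SHADING_RATE_TIER_2 ? "yes" : "no");
    std::printf("Sampler feedback: %s\n",
        opts7.SamplerFeedbackTier != D3D12_SAMPLER_FEEDBACK_TIER_NOT_SUPPORTED ? "yes" : "no");
    std::printf("Mesh shaders: %s\n",
        opts7.MeshShaderTier != D3D12_MESH_SHADER_TIER_NOT_SUPPORTED ? "yes" : "no");
}
```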


19 hours ago, jaethos said:

Variable rate shading and sampler feedback look like they could do wonders for performance if devs make good use of them. The sampler feedback in particular is supposed to allow the Series X to behave as though it has more than 16GB of RAM, according to MS.

Since EVGA replaced my 1080ti with an RTX 2080 I'm ready to go for these things on my PC.

I've read that VRS can improve fps by up to 50%.

 


3 minutes ago, crispy4000 said:

 

“Variable rate shading adds to the performance tally but with RT active, the boost only comes in at around four per cent across the length of the game's internal benchmark.”

 

Digital Foundry on Wolfenstein Youngblood

And you posted that to say what? That is about RT cores and how Nvidia does it. MS already said that their DX cores run in parallel with the shaders and don't affect their performance.

 


36 minutes ago, TomCat said:

And you posted that to say what? That is about RT cores and how Nvidia does it. MS already said that their DX cores run in parallel with the shaders and don't affect their performance.

 


It’s the 50% performance gains I took issue with. As that article elaborates, next gen we should be expecting those types of performance gains from upscaling techniques, not VRS.  
 

It’s a welcome addition, but it’s not a magic bullet.  Fortunately, devs can use both.


56 minutes ago, crispy4000 said:


It’s the 50% performance gains I took issue with. As that article elaborates, next gen we should be expecting those types of performance gains from upscaling techniques, not VRS.  
 

It’s a welcome addition, but it’s not a magic bullet.  Fortunately, devs can use both.

It's okay, you have a problem with everything I read or see. I forgot where I saw it, but it was a dev talking about the progress of MS's patented version, not the version Nvidia is using.

 


3 hours ago, TomCat said:

It's okay, you have a problem with everything I read or see. I forgot where I saw it, but it was a dev talking about the progress of MS's patented version, not the version Nvidia is using.


Most of what you post tends to be hyperbolic statements.  You assume they are reality without evidence or good reason.  It’s not just me calling you on it.  I’d defer to DF here on how much VRS matters, not expect a tenfold plus improvement on it because Microsoft.


13 hours ago, crispy4000 said:


Most of what you post tends to be hyperbolic statements.  You assume they are reality without evidence or good reason.  It’s not just me calling you on it.  I’d defer to DF here on how much VRS matters, not expect a tenfold plus improvement on it because Microsoft.

Evidence or good reason, LOL. I'm using my own reasoning based on what I've read. When I read things I don't go "ooooooh, let me save this link for the guys on D1P." Also, just because I read something doesn't mean it's true; I'm just passing on the info I read. Now, my opinion of VRS is that it will improve framerates enough that they can run ray tracing full bore and it won't affect framerates. Rumour is that Forza 8 will be a showcase for ray tracing, running at 120fps while utilizing the full force of ray tracing.


1 hour ago, TomCat said:

Also, just because I read something doesn't mean it's true; I'm just passing on the info I read.

 


Yeah, we definitely need less of this as a society right now.

 

1 hour ago, TomCat said:

Now, my opinion of VRS is that it will improve framerates enough that they can run ray tracing full bore and it won't affect framerates.


We read it here, folks: VRS will make framerate hits from path-traced ray tracing non-existent!

 

Let’s all go leak the news on 4chan.


The video in the OP says that VRS can improve performance "up to 15%" in Wolfenstein. 15% is a huge number. It's roughly the difference between a 2080 Super and a 2080 Ti. That DF analysis showed 7% with RT off and 4% with RT on. I would suspect that, given the nature of the tech, some games will benefit from it far more than others. A ~5% increase in performance is nothing to sneeze at, and if it can partially or entirely offset the additional load required by RT, that's a pretty tangible benefit.


30 minutes ago, TwinIon said:

The video in the OP says that VRS can improve performance "up to 15%" in Wolfenstein. 15% is a huge number. It's roughly the difference between a 2080 Super and a 2080 Ti. That DF analysis showed 7% with RT off and 4% with RT on. I would suspect that, given the nature of the tech, some games will benefit from it far more than others. A ~5% increase in performance is nothing to sneeze at, and if it can partially or entirely offset the additional load required by RT, that's a pretty tangible benefit.


We already know RT is much more than a ~5% hit, even at lower settings and less comprehensive uses.


Every little bit matters, yes, especially with RT.  But conservative applications, resolution drops, upscaling techniques, even 30fps caps in some cases, are going to be the key to offset the performance hit at large. 


Everything I have read suggests that VRS degrades IQ where the user "is paying less attention." I assume that the amount of degradation and the area of the screen affected are selectable by the dev, so the % improvement should vary by game. If you're getting a 50% improvement, there has to be a significant hit to IQ on a significant part of the screen. Or am I missing something?
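That matches how the API is structured: both the coarseness and where it applies are dev-controlled. A minimal sketch of the per-draw half of that knob, using the real D3D12 VRS enums; the function name and which draws actually get coarsened are hypothetical:

```cpp
#include <d3d12.h>

// Coarsen shading for draws the player is unlikely to scrutinise, then restore
// full-rate shading. On Tier 2 hardware a screen-space shading-rate image can
// additionally vary the rate per region of the frame.
void RecordDrawsWithVrs(ID3D12GraphicsCommandList5* cmdList)
{
    // Combiners control how the base rate merges with the per-primitive rate
    // and the screen-space shading-rate image.
    const D3D12_SHADING_RATE_COMBINER combiners[2] = {
        D3D12_SHADING_RATE_COMBINER_PASSTHROUGH,  // per-primitive rate
        D3D12_SHADING_RATE_COMBINER_MAX           // image rate wins if coarser
    };

    // One shading result per 2x2 block of pixels for low-detail geometry.
    cmdList->RSSetShadingRate(D3D12_SHADING_RATE_2X2, combiners);
    // ... record draws that tolerate coarser shading ...

    // Back to one shading result per pixel for detail the player focuses on.
    cmdList->RSSetShadingRate(D3D12_SHADING_RATE_1X1, nullptr);
}
```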

 

On 3/21/2020 at 11:55 AM, TomCat said:

Evidence or good reason, LOL. I'm using my own reasoning based on what I've read. When I read things I don't go "ooooooh, let me save this link for the guys on D1P." Also, just because I read something doesn't mean it's true; I'm just passing on the info I read. Now, my opinion of VRS is that it will improve framerates enough that they can run ray tracing full bore and it won't affect framerates. Rumour is that Forza 8 will be a showcase for ray tracing, running at 120fps while utilizing the full force of ray tracing.

[image attachment]


Yes, but it was boring as fuck. The Xbox performed better at 4K than the 2080 Ti at 1440p when culling images. I have an extensive background in electronics; I troubleshot radars down to the transistor level in the Navy. But y'all can keep thinking I'm dumb as fuck and don't know what I'm talking about; I don't care about internet characters' opinions of me.


11 hours ago, TomCat said:

Yes, but it was boring as fuck. The Xbox performed better at 4K than the 2080 Ti at 1440p when culling images. I have an extensive background in electronics; I troubleshot radars down to the transistor level in the Navy. But y'all can keep thinking I'm dumb as fuck and don't know what I'm talking about; I don't care about internet characters' opinions of me.

I'm not ashamed to admit I understood about 10% of the video -- its target audience was obviously programmers of game engines.  Not understanding very technical details of something you haven't been trained for does not make you stupid.

I really would like to understand more of it -- can you help me:

1) What is an Input Assembler?  How does it work?

2) What is a Primitive Assembler?  How does it work?

3) What is SV_CullPrimitive?

4) What is attribute shading?

5) What do you think is the best culling method?

 

11 hours ago, TomCat said:

The Xbox performed better at 4K than the 2080 Ti at 1440p when culling images.

How did you come to that conclusion? I didn't see a comparison where the 2080 Ti was run through a mesh shader when rendering the dragon (I only saw a number with the vertex shader).

 

I don't really think the presenter ever intended to do a comparison between the 2080 Ti and the Series X.


43 minutes ago, AbsolutSurgen said:

I really would like to understand more of it -- can you help me:

1) What is an Input Assembler?  How does it work? Magic

2) What is a Primitive Assembler?  How does it work? Magic

3) What is SV_CullPrimitive? Magic

4) What is attribute shading? None of your god damned business

5) What do you think is the best culling method? Whatever MS is using in the next Xbox, obviously 

 


https://www.youtube.com/watch?v=CFXKTXtil34

 

That's a pretty interesting video discussing how they've cleaned up the pipeline. It's quite nice. 

 

On 3/23/2020 at 10:58 PM, TomCat said:

But y'all can keep thinking I'm dumb as fuck and don't know what I'm talking about; I don't care about internet characters' opinions of me.

It's not about whether you're smart or dumb. You keep making claims and extrapolations about things you've shown you don't understand, for purely fanboyish reasons. I'm telling you this as both a former game developer and a graphics pipeline programmer. Just cut it out and you'll get called out less.

 

On 3/24/2020 at 10:33 AM, AbsolutSurgen said:

5) What do you think is the best culling method?

It's not really an either/or scenario. It's about reducing the workload sent through the GPU for things you don't see. The standard culling method mentioned in the video is something you would still do to save on geometry being sent to the GPU for transformation and processing. The video is about work you can do in the geometry stage to cull further, before anything is sent further down the pipeline to later stages (tessellation/fragment/ROP).
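As a rough sketch of where that extra culling lives: the CPU side just launches threadgroups over meshlets, and the rejection happens in the amplification/mesh shader stages. The function and variable names below are hypothetical; only the D3D12 call itself is real:

```cpp
#include <d3d12.h>

// Launch one threadgroup per meshlet. The amplification shader can throw away
// whole meshlets (frustum/occlusion tests) before any vertex work happens, and
// the mesh shader can mark individual triangles via the SV_CullPrimitive
// output so they never reach the rasterizer.
void DrawMeshlets(ID3D12GraphicsCommandList6* cmdList, UINT meshletCount)
{
    // Assumes a pipeline state with amplification + mesh shaders is already bound.
    cmdList->DispatchMesh(meshletCount, 1, 1);
}
```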

