
Xbox Series X | S OT - Power Your Dreams, update: FTC case unredacted documents leaked, including XSX mid-generation refresh, new gyro/haptic-enabled controller, and next-generation plans for 2028


Pikachu

Recommended Posts

I had only watched the Digital Foundry videos. Now that I've read the article, I see that there are some quotes and explanations of the decisions made, as well as a little more in-depth info on the specs.

2 hours ago, cusideabelincoln said:

Audio and VR:

Clear advantage to Sony, as the only mention of audio from Microsoft was that it would be running on the generic CPU cores while Sony has a dedicated chip for processing MOAR sounds than you've ever heard in your life.

Actually, in the DF video they state that it has dedicated 3D audio hardware for their Project Acoustics tech. There's a video about the software version of it at the link below.

https://www.microsoft.com/en-us/research/video/project-triton-making-waves-with-acoustics/

Link to comment
Share on other sites

On 3/19/2020 at 9:57 PM, cusideabelincoln said:

I thought the DF article was about as detailed on the XSX as the presentation Sony gave on the PS5.  Sony just added a lot of history and context for people to follow along, particularly with the audio, but MS gave us pretty detailed specs and (a short version as to) why they did it that way.

 

My take-away differences:

 

CPU/GPU performance:

-Microsoft targets consistent clock speeds and built the console and heatsink to handle the worst-case scenario, as the power requirements will fluctuate.

-Sony targets consistent power delivery by offering adjustable clock speeds. So the system (or devs, presumably) can allocate more clock speed/power to the GPU if the CPU doesn't need it, and vice versa. They should be able to build a smaller form factor since they know what the worst-case power scenario will be, and the PS5 SoC should use less power anyway, allowing them to use a smaller heatsink/fan.
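To make that trade-off concrete, here's a toy model of the idea. Every number in it is invented (the real SoC power curves aren't public); it's just meant to show the shape of "fixed budget, variable clocks":

```python
# Toy model of "constant power budget, variable clocks" (all numbers invented).
# Power rises super-linearly with frequency, so shaving a little clock speed
# off one unit frees a lot of watts for the other.

POWER_BUDGET_W = 200.0                  # hypothetical total SoC budget
CPU_MAX_GHZ, GPU_MAX_GHZ = 3.5, 2.23    # PS5's advertised peak clocks

def sustainable_clock(share_w, coeff, cap_ghz):
    # invert a crude cubic power curve: watts = coeff * GHz**3
    return min(cap_ghz, (share_w / coeff) ** (1 / 3))

def allocate_clocks(cpu_load, gpu_load):
    """Split one fixed power budget toward whichever unit is busier."""
    gpu_w = POWER_BUDGET_W * gpu_load / (cpu_load + gpu_load)
    cpu_w = POWER_BUDGET_W - gpu_w
    return (sustainable_clock(cpu_w, 1.4, CPU_MAX_GHZ),
            sustainable_clock(gpu_w, 12.6, GPU_MAX_GHZ))

print(allocate_clocks(cpu_load=0.3, gpu_load=0.9))  # GPU-heavy: GPU holds peak
print(allocate_clocks(cpu_load=0.9, gpu_load=0.4))  # CPU-heavy: GPU downclocks
```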

 

GPU features:

-Both seem to use a form of mesh shading.

-Both seem to have the same Ray Tracing acceleration built into the shaders of the GPU.

-Microsoft mentioned Variable Rate Shading while Sony didn't, but I don't see why the PS5 won't be able to do it.  I think this is just an assumed feature much like 120 Hz and Variable Refresh Rate support, neither of which Cerny talked about.

 

Overall the Xbox will be slightly more powerful than the PS5 at its max potential, but this is really not going to be a difference you'll notice. It's a smaller difference than between the Xbox One S and PS4 Slim, and smaller than the difference between the PS4 Pro and X1X. My guess is the biggest noticeable difference that could emerge will be the Xbox's ability to dedicate more hardware horsepower to ray-tracing effects.

 

Storage:

The PS5's SSD capabilities will be up to twice as fast as the XSX's. While going about redesigning storage access in slightly different ways, both companies are effectively doing the same thing: hardware-accelerated decompression to keep the CPU from being overwhelmed. In Sony's description they talked more about the custom I/O hardware they created to achieve this, while Microsoft talked more about the custom I/O software they needed to create, but in either case you need hardware and software working together, so the end goal is the same: both companies want the SSD to act as extended memory in order to free up actual system memory for data that is actively being used. In current-gen games most of the data sitting in memory isn't actually needed; it's just there as a buffer to keep the frametime from spiking in case the game needs to fetch data from the currently slow-as-shit drives.
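To illustrate what "SSD as extended memory" means in practice, here's a rough sketch. The asset names, sizes, and eviction policy are all made up, and zlib is standing in for the consoles' dedicated decompression hardware:

```python
# Sketch of "SSD as extended memory": keep only the active working set in RAM
# and fetch-plus-decompress on demand. On the real consoles the decompression
# runs on fixed-function hardware, so the CPU cores stay free for game code.

import zlib

RAM_BUDGET_ASSETS = 3     # pretend RAM only fits 3 assets at once
ssd = {f"asset_{i}": zlib.compress(bytes(1024)) for i in range(10)}  # fake SSD
ram = {}                  # the data that's actively being used

def fetch(name):
    if name not in ram:
        if len(ram) >= RAM_BUDGET_ASSETS:
            ram.pop(next(iter(ram)))            # evict the oldest resident asset
        ram[name] = zlib.decompress(ssd[name])  # "hardware" decompression step
    return ram[name]

for name in ["asset_0", "asset_1", "asset_2", "asset_7", "asset_0"]:
    fetch(name)
print(sorted(ram))        # only the recent working set is resident
```

The faster the fetch path, the smaller the buffer you need to keep frametimes from spiking, which is exactly why a faster SSD lets you commit more of the RAM to data the GPU is actually using.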

 

Memory:

The differences are basically a wash. Microsoft does have overall faster memory, but it also needs the faster memory in order to keep its faster CPU and GPU fed properly. The amount is the same for both consoles, but again each company is trying to free up memory by utilizing the SSDs. Sony's SSD is faster, so it can brute-force this method better. But Microsoft did mention one technique they are going to use to free up even more memory that Sony didn't, and that's Sampler Feedback Streaming. This method works concurrently with the SSD by having a less-memory-intensive, lower-quality texture loaded into memory and utilized in the final rendered scene while it waits for the SSD to load in the higher-quality version. Maybe Sony doesn't need this method since its SSD is fast enough to... not need it? The end result is that there will be a bigger generational leap in the effective RAM games can utilize than what the paper specs show us, the same deal as how the performance difference will be more than what the TFLOP increase shows us, because one RDNA2 compute unit is more efficient than a GCN compute unit.
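If I'm reading the Sampler Feedback Streaming description right, the idea boils down to something like this toy version (the mip numbers and names are hypothetical; the real thing is a D3D12 feature, not Python):

```python
# Toy version of the Sampler Feedback Streaming idea: render with whatever
# mip level is resident now, record that a better one was wanted, and let
# the SSD stream the high-quality version in for a later frame.

resident = {"rock": 3}    # only the low-detail mip 3 is in memory
pending_loads = set()     # feedback: textures the sampler wished it had

def sample(texture, desired_mip):
    have = resident[texture]
    if have > desired_mip:                      # lower mip number = more detail
        pending_loads.add((texture, desired_mip))
        return f"{texture}@mip{have}"           # draw the cheap version now
    return f"{texture}@mip{desired_mip}"

print(sample("rock", 0))  # frame N: blurry mip 3 on screen, mip 0 requested
resident["rock"] = 0      # streaming system finishes the SSD read
print(sample("rock", 0))  # a few frames later: full-detail mip is used
```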

 

Audio and VR:

Clear advantage to Sony, as the only mention of audio from Microsoft was that it would be running on the generic CPU cores while Sony has a dedicated chip for processing MOAR sounds than you've ever heard in your life.

 

Not mentioned in the DF article I linked, but I stand corrected: both consoles will have dedicated audio processors. Microsoft didn't provide any details beyond that, so who knows which one will be better.

 

Backwards Compat:

-Microsoft wins here by guaranteeing multiple generations back, but despite some of the memes I'm seeing, Sony will at least support PS4 compatibility. However, they will have to verify games, much like how the Pro couldn't run every game in boost mode when it launched.

Very nice assessment. However, I have to disagree with you on the heatsink for the PS5. Although their die is smaller, you have to remember that they are running at higher frequencies, which will produce more heat. I've read that the cooling solution for the PS5 is pricey and it's the most Sony has ever spent on cooling. Sony doesn't have a good history of handling heat, so I don't expect it to be as small as you think.

 

I also believe that the power delta is bigger than the one from the PS4 Pro to the X1X: over 100 GB/s of bandwidth difference, and 36 CUs versus 52. I was gonna say something about 3D audio and ray-traced audio but saw you x'ed it out. PlayStation also doesn't support Dolby Atmos; perhaps they feel that their solution is better. The only advantage the PS5 has is that it will load games in 1 or 2 seconds instead of the 5-6 of the X.

Link to comment
Share on other sites

You could be right about the cooling. I should amend my statement by saying the size differences would be small regardless. As both consoles will be close to the same performance, they should also be close in power and heat output, but we don't know yet how this RDNA2 architecture built on the 7nm+ process scales in power, frequency, voltage, and transistor count. Right now the only metric we really know is performance, but these other factors will also impact how big the heatsink is. So there's really no way to speculate on size.

Link to comment
Share on other sites

  • 3 weeks later...

Minecraft RTX is launching on PC in a few days, and Nvidia is talking frame rate numbers/specs:


1080p across cards:

[image: 1080p RT benchmark chart]


2080 Ti in particular:

[image: 2080 Ti benchmark chart]


 

Without knowing if the Series X demo was running at max settings/draw distance, it's impossible to say how they truly compare yet. But taken at face value, there's a real possibility for the Series X to match an RTX 2060, or even a Super. DF did say that the Series X had a 30fps low point, eyeballing it. But overall, good news.

Glass-half-empty view would be that the RTX 2060 (S) isn't much of a powerhouse to begin with for RT. It's still the entry-level card. So while it'd be great if the consoles could match that, there would still be major trade-offs with RT enabled. I don't expect we'll see many games using it upscaled to 4K-ish, or running at 60fps.

As for the 2080 Ti? Probably closer to what the mid-cycle console refreshes would target. And even that is far, far away from the native 4K/60fps fully path-traced RT dream. At least upscaling should get it to a rock-solid 30fps... in Minecraft.


 

On 2/25/2020 at 8:21 PM, TomCat said:

Don't know why y'all are so full of doubt. AMD has said that their RDNA2 GPU is going to be DISRUPTIVE to the 4K market. The GPU in the X is based on this tech. MS helped AMD develop their tech. It will perform better than ray tracing on the 2080. People didn't believe AMD would do what they are doing to Intel at the current moment. They are about to do the same thing to Nvidia. If they can get their drivers in order they will be golden.

 

 

Saved for posterity.

Link to comment
Share on other sites

7 hours ago, crispy4000 said:

Minecraft RTX is launching on PC in a few days, and Nvidia is talking frame rate numbers/specs:


1080p across cards:

[image: 1080p RT benchmark chart]


2080 Ti in particular:

[image: 2080 Ti benchmark chart]


 

Without knowing if the Series X demo was running at max settings/draw distance, it's impossible to say how they truly compare yet. But taken at face value, there's a real possibility for the Series X to match an RTX 2060, or even a Super. DF did say that the Series X had a 30fps low point, eyeballing it. But overall, good news.

Glass-half-empty view would be that the RTX 2060 (S) isn't much of a powerhouse to begin with for RT. It's still the entry-level card. So while it'd be great if the consoles could match that, there would still be major trade-offs with RT enabled. I don't expect we'll see many games using it upscaled to 4K-ish, or running at 60fps.

As for the 2080 Ti? Probably closer to what the mid-cycle console refreshes would target. And even that is far, far away from the native 4K/60fps fully path-traced RT dream. At least upscaling should get it to a rock-solid 30fps... in Minecraft.

 

I think this raises a number of questions:

1) With DLSS 2.0 being so effective (according to DF, they thought it was largely indistinguishable from native) -- is the proper benchmark for comparison with RTX: DLSS on or off?

2) We're comparing SeX with a $300 video card -- one that is potentially months away from being replaced with Ampere. Are these results a good thing or a bad thing vs. what we've seen SeX running? (30 fps on SeX vs. an RTX 2060 at ~55 fps [30 fps with DLSS disabled])

 

Looking forward to DF comparing the RTX beta with the SeX video they've already captured.

Link to comment
Share on other sites

1 hour ago, AbsolutSurgen said:

I think this raises a number of questions:

1) With DLSS 2.0 being so effective (according to DF, they thought it was largely indistinguishable from native) -- is the proper benchmark for comparison with RTX: DLSS on or off?

2) We're comparing SeX with a $300 video card -- one that is potentially months away from being replaced with Ampere. Are these results a good thing or a bad thing vs. what we've seen SeX running? (30 fps on SeX vs. an RTX 2060 at ~55 fps [30 fps with DLSS disabled])

 

Looking forward to DF comparing the RTX beta with the SeX video they've already captured.

 

1) I'd still consider it proper with DLSS off, because traditional upscaling techniques (like checkerboarding) will still be used on the Series X as well. They're still just as effective at bringing frame times down, even if the Faux-K isn't nearly as convincing.

 

2) I'd be surprised if the 2060 line doesn't see some substantial price drops this year.  The consoles will still likely be a stronger value proposition by several metrics.  The question I have is how quickly raytracing ambitions will outpace the hardware.

 

I think we've already seen that happen. Control, a current-gen game with next-gen lighting, can bring a 2080 Ti to its knees. Even with DLSS-upscaled 4K and medium RT settings it doesn't net a consistent 60fps. How much further could a truly next-gen game push the issue? How many devs will be able to optimize well for it?

 

That's the kind of thing that makes me hesitant to accept weak RT capabilities, even in a console. Either Control is a Crysis 1 degree of forward-looking, or some of the lower-end hardware options will seem utterly outdated real quick.

Link to comment
Share on other sites

1 hour ago, crispy4000 said:

 

1) I'd still consider it proper with DLSS off, because traditional upscaling techniques (like checkerboarding) will still be used on the Series X as well. They're still just as effective at bringing frame times down, even if the Faux-K isn't nearly as convincing.

 

2) I'd be surprised if the 2060 line doesn't see some substantial price drops this year.  The consoles will still likely be a stronger value proposition by several metrics.  The question I have is how quickly raytracing ambitions will outpace the hardware.

 

I think we've already seen that happen. Control, a current-gen game with next-gen lighting, can bring a 2080 Ti to its knees. Even with DLSS-upscaled 4K and medium RT settings it doesn't net a consistent 60fps. How much further could a truly next-gen game push the issue? How many devs will be able to optimize well for it?

 

That's the kind of thing that makes me hesitant to accept weak RT capabilities, even in a console. Either Control is a Crysis 1 degree of forward-looking, or some of the lower-end hardware options will seem utterly outdated real quick.

1) I think it is unfair to compare it to traditional upscaling techniques. DF said: "The quality of the image reconstruction is so good, you'll have a hard time noticing whether it is on or off - and aspects of the image can look even better than native rendering, simply because DLSS doesn't feature some of the deficiencies found in some temporal anti-aliasing solutions." The key is how well it replicates an image rendered at a higher res. We'll have to see how it holds up across a broader gamut of games, but it may be the killer app of next gen.

 

2)  I was trying to suggest that there will be a $300 video card available with an Nvidia chip that will be substantially more powerful than the 2060 by the time SeX launches.

 

I still believe that the next gen consoles will really struggle when it comes to ray tracing.  It will be the next few gens of PC graphics cards that will really shine.

Link to comment
Share on other sites

Definitely agree that DLSS 2.0 is next level upscaling tech.  My point was that older techniques still give very efficient performance yields without too much compromise.

 

If the consoles can only do checkerboarding (and the like), I think I'd still be plenty happy. I take more issue with the consistency: there are a lot of games that really should attempt this sort of temporal reconstruction but don't.
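For anyone who hasn't seen how the technique works under the hood, here's a stripped-down toy of checkerboarding. Real implementations also use motion vectors and ID buffers to reproject the reused pixels, which this skips entirely:

```python
# Minimal checkerboard reconstruction: each frame shades only half the pixels
# in an alternating checker pattern and borrows the other half from the
# previous frame, so per-frame shading cost is roughly halved.

W, H = 8, 8

def shade(frame_idx, x, y):
    return (frame_idx, x, y)          # stand-in for an expensive pixel shade

def checkerboard_frame(frame_idx, prev):
    out = [[None] * W for _ in range(H)]
    for y in range(H):
        for x in range(W):
            if (x + y) % 2 == frame_idx % 2:        # shade half the grid
                out[y][x] = shade(frame_idx, x, y)
            else:                                   # reuse last frame's pixel
                out[y][x] = prev[y][x] if prev else shade(frame_idx, x, y)
    return out

f0 = checkerboard_frame(0, None)
f1 = checkerboard_frame(1, f0)
shaded_now = sum(p[0] == 1 for row in f1 for p in row)
print(shaded_now, "of", W * H, "pixels were shaded in frame 1")   # 32 of 64
```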

Link to comment
Share on other sites

But how effective is DLSS vs what consoles already use (checkerboarding)? Quality-wise, DLSS seems better; to clarify, I’m speaking in terms of performance gains vs native 4K.
ie: it’s possible the DLSS framerates may be realistic to go by for consoles.

Link to comment
Share on other sites

1 hour ago, Spork3245 said:

But how effective is DLSS vs what consoles already use (checkerboarding)? Quality-wise, DLSS seems better; to clarify, I’m speaking in terms of performance gains vs native 4K.
ie: it’s possible the DLSS framerates may be realistic to go by for consoles.


Performance gains should be similar, given that both can reconstruct from resolutions with 1/4 the pixels as native.  AMD is apparently developing their own AI upscale algorithms too.

 

DLSS 2.0 seemingly does very well quality-wise at lower resolutions.  So games with DLSS that can target a reconstructed 1440p/1080p might realistically be where there could be a big advantage.  After all, you wouldn’t want to see a game checkerboarded up to 1080p.  That could extend the relevance of NVIDIA’s older RTX cards a bit.


I seem to remember some base-hardware current-gen games opting for lower-res checkerboarding in any case (Battlefront 2?).

Link to comment
Share on other sites

2 hours ago, crispy4000 said:


Performance gains should be similar, given that both can reconstruct from resolutions with 1/4 the pixels as native.  AMD is apparently developing their own AI upscale algorithms too.

 

DLSS 2.0 seemingly does very well quality-wise at lower resolutions.  So games with DLSS that can target a reconstructed 1440p/1080p might realistically be where there could be a big advantage.  After all, you wouldn’t want to see a game checkerboarded up to 1080p.  That could extend the relevance of NVIDIA’s older RTX cards a bit.


I seem to remember some base-hardware current-gen games opting for lower-res checkerboarding in any case (Battlefront 2?).


Did you mean up “from”? The goal with these techniques is to make 4k+ (or ridiculously wide resolutions) with high framerates a reality without needing ridiculous horsepower for it, correct? I haven’t heard of anyone using it for standard HD resolutions (most games won’t even let you enable it unless you’re running at 1440p or higher).

Link to comment
Share on other sites

20 minutes ago, Spork3245 said:


Did you mean up “from”? The goal with these techniques is to make 4k+ (or ridiculously wide resolutions) with high framerates a reality without needing ridiculous horsepower for it, correct? I haven’t heard of anyone using it for standard HD resolutions (most games won’t even let you enable it unless you’re running at 1440p or higher).

 

Nope, I meant "to." The goal was, and still is, to make 4K more feasible with modern bells and whistles. But DLSS 2.0 has also become a game changer for lower-res gaming in that regard. Watch this!
 

 

If you're on PC playing on a 1080p or 1440p monitor, or aren't as picky about needing everything to be 4k(-ish), this is a huge boon for the longevity of the initial RTX line.  Same goes for a future Switch revision potentially.

Link to comment
Share on other sites

6 minutes ago, AbsolutSurgen said:

I had thought that DLSS relied pretty heavily on Nvidia's tensor cores. Why do we think DLSS on an AMD GPU without them would produce similar results?

 

DLSS won't be running on an AMD GPU afaik.  AMD is R&D'ing their own AI upscaling method.  But for now, checkerboarding (and the like) is the closest option that produces similar results, performance-wise.

Link to comment
Share on other sites

25 minutes ago, AbsolutSurgen said:

I had thought that DLSS relied pretty heavily on Nvidia's tensor cores. Why do we think DLSS on an AMD GPU without them would produce similar results?

DLSS is an Nvidia-proprietary technique that does indeed rely on tensor cores, and as far as I know the Series X doesn't have any equivalent hardware to offload an AMD equivalent to.

 

The other thing that comes to mind is that Nvidia spent who knows how long on DLSS 1.0, which wasn't very good at all, and then spent another 18 months to get to the worthwhile 2.0 update, which is still only available in a small number of games. For all their recent strides in hardware, AMD doesn't have Nvidia's considerable expertise in AI. AMD has been working on similar efforts, but I agree with you that there's no reason to expect AMD to replicate DLSS 2.0-like results anytime soon.

Link to comment
Share on other sites

26 minutes ago, crispy4000 said:

 

DLSS won't be running on an AMD GPU afaik.  AMD is R&D'ing their own AI upscaling method.  But for now, checkerboarding (and the like) is the closest option that produces similar results, performance-wise.

Checkerboarding doesn't produce a similar result. That's the point. What I find exciting about DLSS 2.0 is how the output appears to be [almost] as good as rendering at native resolution, or potentially better in some cases (due to how it works). If it proves to be as effective across the board, DLSS 2.0 is a potential game changer.

What I want to hear is how dependent it is on the tensor cores, and whether something similar would even be feasible on RDNA2 cores.

Link to comment
Share on other sites

1 hour ago, AbsolutSurgen said:

Checkerboarding doesn't produce a similar result. That's the point. What I find exciting about DLSS 2.0 is how the output appears to be [almost] as good as rendering at native resolution, or potentially better in some cases (due to how it works). If it proves to be as effective across the board, DLSS 2.0 is a potential game changer.

 

In terms of performance uptick, yes it does.  They're two temporal reconstruction techniques attempting to solve the same problem, and yield similar framerate gains.

 

But like you say, DLSS 2.0 is more of a next-gen leap in the fidelity of it.  I'm not denying that.  I'm only saying that checkerboarding can always be resorted to on consoles.  Particularly for upscaling to 4k, where it's still pretty convincing if done well.  (ie: first party Sony games like Horizon, God of War, Days Gone, etc)

 

So for 4k, I'd say DLSS 2.0 is more of a strong step forward than a game changer.  Supersampling applications might also be neat with future hardware.  But for games targeting 1080p/1440p... it makes upscaling to those resolutions tenable now.  That's the wildest application, IMO.  

 

With DLSS 2.0, games with heavy RT effects can choose to render at 360/PS3-like resolutions and scale up to what the Pro generally produces.  It's bonkers.  I'd feel much more comfortable buying a console if AMD can come up with an equivalent.

Link to comment
Share on other sites

3 hours ago, crispy4000 said:

 

In terms of performance uptick, yes it does.  

Based on the reporting, DLSS can run at much lower rendering resolutions than checkerboarding can.

3 hours ago, crispy4000 said:

They're two temporal reconstruction techniques attempting to solve the same problem, and yield similar framerate gains.

I think that comment is a very reductive way of looking at what DLSS 2.0 seems to be. 

3 hours ago, crispy4000 said:

But like you say, DLSS 2.0 is more of a next-gen leap in the fidelity of it.  I'm not denying that.  I'm only saying that checkerboarding can always be resorted to on consoles.  Particularly for upscaling to 4k, where it's still pretty convincing if done well.  (ie: first party Sony games like Horizon, God of War, Days Gone, etc)

 

So for 4k, I'd say DLSS 2.0 is more of a strong step forward than a game changer.  Supersampling applications might also be neat with future hardware.  But for games targeting 1080p/1440p... it makes upscaling to those resolutions tenable now.  That's the wildest application, IMO.  

Checkerboarding does not produce anything like the image quality reported from using DLSS 2.0, nor the frame rate improvements.

Everything I've read suggests that a 4K image can be created from a 1080p image that is, for all intents and purposes, just as good as if it had been rendered at 4K. That feels like "magic" -- look at the 4K frame rate improvement that you posted above for the 2080 Ti: going from 16 fps to 49 fps. That's bonkers -- going from completely unplayable to pretty fricking good -- with reportedly NO REDUCTION IN IMAGE QUALITY.
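Quick back-of-the-envelope on those numbers, assuming frame time splits into per-pixel work plus a fixed chunk (the split is my own guesswork):

```python
# Rough estimate from the 2080 Ti figures above: DLSS performance mode at 4K
# shades 1/4 the pixels, so if everything scaled with pixel count we'd expect
# a clean 4x. The shortfall hints at fixed per-frame work plus the upscale pass.

native_fps, dlss_fps = 16.0, 49.0
native_ms = 1000 / native_fps        # 62.5 ms per frame at native 4K
dlss_ms = 1000 / dlss_fps            # ~20.4 ms per frame with DLSS

ideal_ms = native_ms / 4             # ~15.6 ms if cost were purely per-pixel
overhead_ms = dlss_ms - ideal_ms     # fixed work + tensor-core upscale cost
print(f"implied fixed + upscale overhead: {overhead_ms:.1f} ms/frame")  # ~4.8
```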

Checkerboarding doesn't get that kind of improvement, and it is a NOTICEABLE DOWNGRADE in image quality.

 

TBH, I really want to see more benchmarks across a broader sample of games, because DLSS 2.0 looks too good to be true.

Link to comment
Share on other sites

DLSS 2.0 is certainly a major step forward, but its existence does not make checkerboarding techniques any less valuable to console developers in the here and now.  A good checkerboard implementation can still be quite convincing and serve its performance purpose.

IMO the bigger quality difference between the two has more to do with DLSS's utility as an anti-aliasing alternative/enhancement. At today's resolutions, temporal AA artifacts are arguably a bigger detractor from overall image quality than reconstruction artifacts.

Link to comment
Share on other sites

15 hours ago, AbsolutSurgen said:

Based on the reporting, DLSS can run at much lower rendering resolutions than checkerboarding can.

I think that comment is a very reductive way of looking at what DLSS 2.0 seems to be. 

Checkerboarding does not produce anything like the image quality reported from using DLSS 2.0, nor the frame rate improvements.

Everything I've read suggests that a 4K image can be created from a 1080p image that is, for all intents and purposes, just as good as if it had been rendered at 4K. That feels like "magic" -- look at the 4K frame rate improvement that you posted above for the 2080 Ti: going from 16 fps to 49 fps. That's bonkers -- going from completely unplayable to pretty fricking good -- with reportedly NO REDUCTION IN IMAGE QUALITY.

Checkerboarding doesn't get that kind of improvement, and it is a NOTICEABLE DOWNGRADE in image quality.

 

TBH, I really want to see more benchmarks across a broader sample of games, because DLSS 2.0 looks too good to be true.


Everything I’ve read states that the 4x image scaling done in DLSS 2.0’s “Performance” mode (ie: 1080p to 4K) won’t quite match the quality of native 4K, or whatever resolution is being targeted. That’s more what the “Balanced” mode aims to achieve, while “Quality” mode (2x scaling) attempts to go even further. I think you got it into your head somewhere that their 4x scaling was so magical there would be no visual compromise. That simply isn’t the case. I mean, Nvidia itself is tacitly admitting that by offering the other two options.

 

Performance mode’s gains in fidelity and frame rate may very well be better than checkerboarding at 2x. But that doesn’t negate the fact that the latter is still quite effective and efficient on AMD hardware and older machines too, and it still does a pretty good job of faking a large resolution bump. Checkerboarding was already the paradigm shift, IMO, and I see DLSS as more of the next big quality and performance pass on the concept. And hopefully the thing that helps popularize upscaling among devs.
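For reference, here's what the commonly reported per-axis scale factors work out to at a 4K output. Treat these as approximate (and note “Quality” is about 2.25x in pixel count rather than a clean 2x):

```python
# Internal render resolutions for the DLSS 2.0 modes, using the per-axis
# scale factors as commonly reported; exact factors may vary per game.

MODES = {"Quality": 1 / 1.5, "Balanced": 1 / 1.72, "Performance": 1 / 2}

def internal_res(out_w, out_h, mode):
    s = MODES[mode]
    return round(out_w * s), round(out_h * s)

for mode in MODES:
    print(mode, internal_res(3840, 2160, mode))
# Quality (2560, 1440), Balanced ~(2233, 1256), Performance (1920, 1080)
```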

Link to comment
Share on other sites

8 hours ago, crispy4000 said:


Everything I’ve read states that the 4x image scaling done in DLSS 2.0’s “Performance” mode (ie: 1080p to 4K) won’t quite match the quality of native 4K, or whatever resolution is being targeted. That’s more what the “Balanced” mode aims to achieve, while “Quality” mode (2x scaling) attempts to go even further. I think you got it into your head somewhere that their 4x scaling was so magical there would be no visual compromise. That simply isn’t the case. I mean, Nvidia itself is tacitly admitting that by offering the other two options.

 

Performance mode’s gains in fidelity and frame rate may very well be better than checkerboarding at 2x. But that doesn’t negate the fact that the latter is still quite effective and efficient on AMD hardware and older machines too, and it still does a pretty good job of faking a large resolution bump. Checkerboarding was already the paradigm shift, IMO, and I see DLSS as more of the next big quality and performance pass on the concept. And hopefully the thing that helps popularize upscaling among devs.

You're right, I'm playing a little fast and loose with the different modes.

We're also unlikely to see very many "real" games that are 100% ray traced, so it's probably all academic anyway.

 

I think we'll have to agree to differ on how good checkerboarding actually is, relative to rendering in native resolution.

Link to comment
Share on other sites

31 minutes ago, AbsolutSurgen said:

I think we'll have to agree to differ on how good checkerboarding actually is, relative to rendering in native resolution.


Maybe.  Checkerboarding isn’t perfect - it’s a little soft and blurry compared to native.  I seldom notice the artifacts in good implementations, but they are there.  But given the choice between a 1440p native and well implemented 4K checkerboarding, I’d take the latter every time.

 

It sort of kills me that my aging 1060 could have gotten so much more mileage if PC devs had actually been on board with the tech a few years back. It’s even spottier in support than HDR on PC.

 

Link to comment
Share on other sites

Sony's approach has been kinda annoying.  It wouldn't be if MS hadn't been so open and transparent.  We saw the thing months ago, we've seen the internals, and are about to see games.  Sony has shown us a logo and the controller.  I hope they make up for it with some big ass event like an all PS5 State of Play or something.

Link to comment
Share on other sites
