
The Last of Us: Part 1 - update: PC Update 1.1 released, officially Steam Deck verified


Recommended Posts

5 hours ago, Zaku3 said:

 

Not having a 4090 and 7900 XTX. SMH. I'm viewing the 4090 as the better value in this comparison, which says a lot about the GPU market.

 

The 4090 was only a ~6% price increase over the 3090’s launch price but offered an insane performance gap compared to last gen. Despite its stupid price, it's somehow the best value if you're comparing it vs last-gen cards. Price per frame, though, it's the "worst" value of the 4000-series cards.


5 minutes ago, cusideabelincoln said:

 

 

Don't buy an 8GB card in 2023. Probably won't be long before the 4070's 12GB starts having issues.

 

 

 

That's if you play triple-A games and it's a current-gen-only game without DirectStorage. The console ultimately has direct storage and a PC doesn't, unless the game supports DirectStorage and the PC has an NVMe drive.

 

Don't have proof of this per se, but high RAM, CPU, and VRAM utilization implies a lot of data flowing around, and an APU wouldn't have that since the GPU/CPU are on the same die sharing the main system memory. So the game addresses that discrepancy by using RAM, cores, and VRAM to keep all the needed data for, say, a level in memory.
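For anyone curious, a minimal sketch of what a DirectStorage read looks like using Microsoft's dstorage.h API (the asset path, sizes, and the D3D12 device/buffer here are placeholders, and error handling is skipped): the request goes from the NVMe file straight into a GPU buffer instead of bouncing through big CPU-side copies.

```cpp
// Minimal DirectStorage sketch (links against dstorage.lib plus the DirectStorage runtime DLLs).
// Asset path, sizes, and the D3D12 device/buffer are illustrative placeholders.
#include <dstorage.h>
#include <d3d12.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

void StreamAssetDirectToGpu(ID3D12Device* device, ID3D12Resource* gpuBuffer, uint32_t sizeBytes)
{
    ComPtr<IDStorageFactory> factory;
    DStorageGetFactory(IID_PPV_ARGS(&factory));

    ComPtr<IDStorageFile> file;
    factory->OpenFile(L"level01_textures.bin", IID_PPV_ARGS(&file)); // placeholder asset

    // A queue that feeds file requests to a GPU-visible destination.
    DSTORAGE_QUEUE_DESC queueDesc{};
    queueDesc.SourceType = DSTORAGE_REQUEST_SOURCE_FILE;
    queueDesc.Capacity   = DSTORAGE_MAX_QUEUE_CAPACITY;
    queueDesc.Priority   = DSTORAGE_PRIORITY_NORMAL;
    queueDesc.Device     = device;
    ComPtr<IDStorageQueue> queue;
    factory->CreateQueue(&queueDesc, IID_PPV_ARGS(&queue));

    // One request: read straight from the NVMe file into a GPU buffer,
    // no detour through a large CPU-side staging copy.
    // With DirectStorage 1.1+ you can also set
    // request.Options.CompressionFormat = DSTORAGE_COMPRESSION_FORMAT_GDEFLATE
    // so the decompression itself runs on the GPU.
    DSTORAGE_REQUEST request{};
    request.Options.SourceType      = DSTORAGE_REQUEST_SOURCE_FILE;
    request.Options.DestinationType = DSTORAGE_REQUEST_DESTINATION_BUFFER;
    request.Source.File.Source      = file.Get();
    request.Source.File.Offset      = 0;
    request.Source.File.Size        = sizeBytes;
    request.UncompressedSize        = sizeBytes;
    request.Destination.Buffer.Resource = gpuBuffer;
    request.Destination.Buffer.Offset   = 0;
    request.Destination.Buffer.Size     = sizeBytes;

    queue->EnqueueRequest(&request);
    queue->Submit(); // completion is normally tracked with an ID3D12Fence
}
```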


  • Commissar SFLUFAN changed the title to The Last of Us: Part 1 - update: PC version hilarity ensues

Alex Battaglia from DF:

[attached screenshot]
 

I’m not defending nVidia’s penny pinching on VRAM, but Hardware Unboxed has been on kind of a one sided mission lately to try and get ‘dem views. The only decent YouTuber for PC Hardware is GamersNexus IMO, and Linus when he’s doing “entertainment” videos vs attempting serious reviews. For graphics and performance, it’s DF or nothing for me.


I think it’s kinda like both sides of this argument are often wrong. The side that argued heavily, especially when the 3080 released, that 10gb would have no problems was wrong, because you really need to have the foresight with these things to account for bad or just not-well-optimized ports. Just because it shouldn’t be a problem doesn’t mean it isn’t.

 

The side that says 10gb is just useless now in 2023 is weighing too heavily on some new, demanding AAA ports with a refusal to tone down settings and adjust things accordingly. People using native res as an example of running out of VRAM in a game with DLSS support, and things like that. And then forgetting about the 95%+ of all games that have no issue.

 

Basically, when you’re buying a product and looking at the specs, always keep future un-optimized ports in mind and determine for yourself if that’s worth paying more to overcompensate for, or decide that you will just adjust settings when those come out if you don’t want to skip them.

 

As far as who to blame, just blame everyone! Nvidia shouldn’t cheap out so much on RAM, especially at the price points they have now (I think the 3080 could have been the 1080 of this gen for longevity if it had been 16gb... now I feel like we are still waiting for that with the 4000-series prices), Naughty Dog should have optimized this better, Sony shouldn’t have forced this out the door, AMD should check their partners on their VRAM usage because that hurts their own mid-range gaming cards, and I shouldn’t support games released in this state!

 

Looking forward to playing this after RE4 though since none of this affects me rn :D

  • Like 1

14 hours ago, Spork3245 said:

Alex Battaglia from DF:

[attached screenshot]
 

I’m not defending nVidia’s penny pinching on VRAM, but Hardware Unboxed has been on kind of a one sided mission lately to try and get ‘dem views. The only decent YouTuber for PC Hardware is GamersNexus IMO, and Linus when he’s doing “entertainment” videos vs attempting serious reviews. For graphics and performance, it’s DF or nothing for me.

 

There are more new games than TLOU pushing VRAM.

 

Hardware Unboxed provides the most extensive and updated performance numbers, regardless of what their actual opinions are. I like GamersNexus' transparency, but their benchmarks are pretty outdated. Their choice of games to test was outdated even when they last updated it a few years ago. And NGL, every benchmarker needs to retire Shadow of the Tomb Raider in favor of newer games; we have enough data points on this one game that literally everyone uses.


1 hour ago, cusideabelincoln said:

There are more new games than TLOU pushing VRAM.


I didn’t say otherwise; however, this game is a ridiculous example to use, especially for clickbait. Those other games tend to become an issue at 4k, btw (the cards with sub-10gb tend to be targeted at 1440p and lower), while this game wants 10gb of VRAM at 1080p, making holding it up as a shining example even more poorly thought out, as there’s something incredibly wrong with the port.

 

 

1 hour ago, cusideabelincoln said:

Hardware Unboxed provides the most extensive and updated performance numbers, regardless of what their actual opinions are. I like GamersNexus' transparency, but their benchmarks are pretty outdated. Their choice of games to test was outdated even when they last updated it a few years ago. And NGL, every benchmarker needs to retire Shadow of the Tomb Raider in favor of newer games; we have enough data points on this one game that literally everyone uses.


Lol, then skip to the numbers and ignore their clickbait videos like that TLOU:P1 video :p 


15 hours ago, stepee said:

 

I think it’s kinda like both sides of this argument are often wrong. The side that argued heavily, especially when the 3080 released, that 10gb would have no problems was wrong, because you really need to have the foresight with these things to account for bad or just not-well-optimized ports. Just because it shouldn’t be a problem doesn’t mean it isn’t.

 


What games are really pushing over 10gb unless you’re intentionally trying? At 4k native, sure; however, the 3080 always needed DLSS for 4k, which puts VRAM closer to 1440p levels of usage. nVidia has always been weird with VRAM for some reason: even look back multiple generations at their 768mb, 1.5gb, 3gb, and 6gb weirdness. Then, instead of going from 8 to 12, they first did 11gb, then dropped to 10gb, before finally putting out a 12gb consumer card (the 3080 Ti, then the 12gb 3080 variant).
It’s likely them looking to push people to upgrade sooner instead of holding onto cards forever, which is scummy of nVidia and anti-consumer. Look at the 980 Ti: if you’re at a mix of high and max settings (sans RT, obviously :p ), the card still pulls 60fps at 1440p in most modern titles… but now you’re fighting 6gb of VRAM. AMD has done this before, back with the (ATi) 4870 cards: they released the 512mb version initially before releasing the real 1gb cards months later, and even at release the 512mb was barely enough, hitting VRAM limits at 1080p in new titles that released the same year as the card. Thankfully they at least learned from the backlash and started putting high amounts of VRAM on their cards from then on.
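Quick aside on the "DLSS at 4k puts VRAM closer to 1440p" point: the commonly cited DLSS per-axis scale factors (approximate) make that concrete. A tiny sketch of the arithmetic:

```cpp
#include <cstdio>

// Commonly cited DLSS per-axis scale factors (approximate, for illustration).
struct DlssMode { const char* name; double scale; };

int main() {
    const int outW = 3840, outH = 2160; // 4K output
    const DlssMode modes[] = {
        {"Quality",           2.0 / 3.0}, // ~2560x1440 internal render resolution
        {"Balanced",          0.58},
        {"Performance",       0.50},      // ~1920x1080 internal
        {"Ultra Performance", 1.0 / 3.0},
    };
    for (const auto& m : modes) {
        std::printf("%-17s -> %d x %d internal\n", m.name,
                    int(outW * m.scale), int(outH * m.scale));
    }
}
```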


42 minutes ago, Spork3245 said:


I didn’t say otherwise; however, this game is a ridiculous example to use, especially for clickbait. Those other games tend to become an issue at 4k, btw (the cards with sub-10gb tend to be targeted at 1440p and lower), while this game wants 10gb of VRAM at 1080p, making holding it up as a shining example even more poorly thought out, as there’s something incredibly wrong with the port.

 

 


Lol, then skip to the numbers and ignore their clickbait videos like that TLOU:P1 video :p 

 

I believe I came across a claim that the game was loading in assets it didn't need to, but I think a patch just dropped yesterday claiming to improve memory performance, so I'm sure we'll soon see just how well they do.

 

 

 

And don't take away my guilty pleasure of watching clickbait videos, knowing full well the community is going to have a meltdown, and engulfing myself in the chaos of malding. Sometimes I just need a little bit more than my daily LTT laugh video. Although as a matter of principle, I agree with the claim HUB is making about Nvidia shortchanging gamers on VRAM. I still find it so silly that the 3080 has less VRAM than the 1080 Ti and 2080 Ti.


16 minutes ago, cusideabelincoln said:

 

I believe I came across a claim that the game was loading in assets it didn't need to, but I think a patch just dropped yesterday claiming to improve memory performance, so I'm sure we'll soon see just how well they do.

 

 

 

And don't take away my guilty pleasure of watching clickbait videos, knowing full well the community is going to have a meltdown, and engulfing myself in the chaos of malding. Sometimes I just need a little bit more than my daily LTT laugh video. Although as a matter of principle, I agree with the claim HUB is making about Nvidia shortchanging gamers on VRAM. I still find it so silly that the 3080 has less VRAM than the 1080 Ti and 2080 Ti.


As I mentioned in my post to Stepee, they’re absolutely being weird with, and shortchanging, VRAM on their cards. However, sub-12gb isn’t the end of the world unless you’re trying for native 4k, though you lack breathing room, which indeed sucks. A chunk (not even close to all) of the problem in this game is that it’s reserving over 4gb of VRAM for Windows, when in other games that’s typically a few hundred megabytes afaik: its memory management is just broken to the point of hilarity. I’m assuming what @Zaku3 mentioned has something to do with it, in that this PS5 game was built around the m.2 streaming to the console’s RAM: the PC port lacking DirectStorage, Iron Galaxy being far from the best coders, and ND not being PC devs led to them having a hard time removing the m.2 streaming aspects, and thus it has stupidly high VRAM usage instead (of course, I’m just spitballing here :p ).
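On the "reserving VRAM for Windows" bit: a PC game only gets the budget DXGI hands it, and the OS plus other apps eat into that. A minimal sketch of how an engine can check that budget with the standard DXGI 1.4 query (adapter selection and error handling skipped):

```cpp
// Minimal sketch: query the process's current VRAM budget via DXGI 1.4 (link dxgi.lib).
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <cstdio>
using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<IDXGIFactory4> factory;
    CreateDXGIFactory1(IID_PPV_ARGS(&factory));

    ComPtr<IDXGIAdapter1> adapter;
    factory->EnumAdapters1(0, &adapter); // first GPU; a real engine picks the adapter properly

    ComPtr<IDXGIAdapter3> adapter3;
    adapter.As(&adapter3);

    DXGI_QUERY_VIDEO_MEMORY_INFO info{};
    adapter3->QueryVideoMemoryInfo(0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL, &info);

    // Budget = what the OS will currently let this process use,
    // CurrentUsage = what the process has actually committed.
    std::printf("VRAM budget: %.1f GB, in use: %.1f GB\n",
                info.Budget / (1024.0 * 1024.0 * 1024.0),
                info.CurrentUsage / (1024.0 * 1024.0 * 1024.0));
}
```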


The consoles, by virtue of their APUs, punch above their weight. Sharing overall system memory, and it being faster GDDR6, would improve GPU/CPU communication. I wonder what the limitations are. I'd drop up to 2k for a monster Zen 8/RDNA-whatever APU, even if it needs an AIO or a custom water cooling loop. I'd pay for a high-end mobo for OCing and would buy GDDR-whatever if it was flat out faster than our current separate GPU/CPU setup.

 

I could even be persuaded to do 3k+ for such a desktop/laptop APU. It has to be faster than the current setup, and DirectStorage may render the difference meaningless due to PCIe-whatever speeds, but we're probably gonna hit a wall with silicon, so might as well YOLO while we wait for something better.


Besides all the tech and configuration talk going on in here, are you guys actually playing this absolute masterpiece?

 

I just beat it in a day and a half on PS5 and it's in my top 5 games ever made.

 

Play this in 4K 120fps and stop talking about how to properly play this thing.

 

Better yet, play it on the PS5 like it's meant to be played, with a slight resolution reduction that won't even be noticeable on you guys' $5K rigs.


I mean, it’s all subjective, right? But it’s just running at 1440p/60; there isn’t anything special going on beyond that, it’s just regular flat 1440p. 4k looks way sharper than 1440p to me on a 65” tv at normal viewing distance. Not to mention higher settings on everything else and double the frame rate, but even just doubling the resolution, it’s not like the human eye stops noticing beyond 1440p all of a sudden just cuz you think a game looks nice lol

  • Hugs 1

5 minutes ago, stepee said:

I mean, it’s all subjective, right? But it’s just running at 1440p/60; there isn’t anything special going on beyond that, it’s just regular flat 1440p. 4k looks way sharper than 1440p to me on a 65” tv at normal viewing distance. Not to mention higher settings on everything else and double the frame rate, but even just doubling the resolution, it’s not like the human eye stops noticing beyond 1440p all of a sudden just cuz you think a game looks nice lol

 

Lol, I'm just trying to rile you up my man. I know there's a difference but I just felt like messin with you! 

 

But, have you played any of this for a significant amount of time or are you testing it out and putting your focus on RE4?


Just now, best3444 said:

 

Lol, I'm just trying to rile you up my man. I know there's a difference but I just felt like messin with you! 

 

But, have you played any of this for a significant amount of time or are you testing it out and putting your focus on RE4?

 

Ya, I know you are, but still gotta correct the record! Na, I’m getting super pumped to start though; I’m going to start later this afternoon as I’m on the last chapter of RE4 now!


5 minutes ago, stepee said:

 

Ya, I know you are, but still gotta correct the record! Na, I’m getting super pumped to start though; I’m going to start later this afternoon as I’m on the last chapter of RE4 now!

 

Perfect. That opening in TLOU still tears me up every fuckin time. Plus, the overall game is a big step above anything on the market. 

 

Part 2 surprisingly looks basically just as good, minus maybe a little resolution decrease, but it's BARELY noticeable. Part 2, for a PS4 game, still looks absolutely amazing, and 60fps on PS5 is the cherry on top. These games are in my top 3 of all-time.


12 minutes ago, stepee said:

Holy fuck this looks amazing though. Like seriously hot damn. One thing I didn’t think of is yeah the amazing animation at 120fps looks extra crazy 

I wonder if many of the people with issues are pushing the VRAM above what the game says they should? I wish I had more time to play, but kids and shit.


13 minutes ago, Ominous said:

I wonder if many of the people with issues are pushing the VRAM above what the game says they should? I wish I had more time to play, but kids and shit.

 

Sounds like the issue is that high textures are weirdly demanding, medium textures are a way bigger step down than they should be, and it has a huge CPU hit in a way that scaling down settings doesn’t help to mitigate.

 

Basically perfectly designed to run just fine and look amazing on the absolute top of the line systems and run/look like trash on everything else.


47 minutes ago, Ominous said:

I wonder if many of the people with issues are pushing the VRAM above what the game says they should? I wish I had more time to play, but kids and shit.


Basically, the game was designed around using direct storage with hardware decompression and they didn’t bother attempting to change that (or couldn’t) when porting it over. Not having that available on PC is what’s causing unusually high CPU usage and awful VRAM management. If you have a ton of VRAM and an upper end CPU then you might be mostly fine.
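Purely illustrative, since nobody outside ND/Iron Galaxy knows what the port's streaming code actually looks like: below is roughly the shape of an all-CPU decompression path, with stand-in helper functions. On PS5 the decompress step runs on a dedicated hardware block; doing it on CPU threads for every streamed chunk is the kind of thing that would explain the CPU usage.

```cpp
// Illustrative only: an all-CPU asset-streaming path. The three helpers are stand-ins,
// not the game's real code; they exist just so the sketch compiles and runs.
#include <cstdint>
#include <thread>
#include <vector>

static std::vector<uint8_t> ReadChunkFromDisk(size_t) { return std::vector<uint8_t>(1 << 20); }
static std::vector<uint8_t> Decompress(const std::vector<uint8_t>& in) { return in; } // CPU-heavy in reality
static void UploadToVram(const std::vector<uint8_t>&) {}

void StreamLevelAssets(size_t chunkCount)
{
    std::vector<std::thread> workers;
    for (size_t i = 0; i < chunkCount; ++i) {
        // Every chunk burns CPU time on decompression before it ever reaches VRAM.
        // On PS5 this inflate step is handled by a dedicated decompression block instead.
        workers.emplace_back([i] {
            auto compressed = ReadChunkFromDisk(i);
            auto pixels     = Decompress(compressed);
            UploadToVram(pixels);
        });
    }
    for (auto& t : workers) t.join();
}

int main() { StreamLevelAssets(8); }
```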

  • True 1

3 hours ago, Spork3245 said:


Basically, the game was designed around using direct storage with hardware decompression and they didn’t bother attempting to change that (or couldn’t) when porting it over. Not having that available on PC is what’s causing unusually high CPU usage and awful VRAM management. If you have a ton of VRAM and an upper end CPU then you might be mostly fine.

I wish DS would get support on the PC. Such a shame. I guess 24 GB of vram and a 7950x is picking up the slack.


5 hours ago, Ominous said:

I wish DS would get support on the PC. Such a shame. I guess 24 GB of vram and a 7950x is picking up the slack.


It’s a shame only a single game offers DirectStorage so far, but what really needs to be adopted is RTX IO. NVidia actually made that as an “open platform” that doesn’t need their hardware to run.

  • True 2

Nope, husband tried it tonight and it was poop. Said it was like a streaming video trying to load way too frequently and too much stuttering. Said he got about the same performance at 1080p as he did at 4k. Same for going ultra to low settings. He said it’s really pretty at ultra and 4k but wildly inconsistent. Maybe a new CPU will help, but I dunno. Either way we’re upgrading when the new AMD drops, but I’m gonna wait on playing this until it can run effectively. 


WWW.EUROGAMER.NET

It's impossible to review an obviously unfinished game, but here are Digital Foundry's thoughts right now.

 

For people who find it easier to read than watch videos:

 

Quote

The Last of Us Part 1 on PC puts us in a somewhat difficult situation at Digital Foundry. How can you produce a technical review for a game that is essentially unfinished and - at the time of writing - has already received two hotfixes? We're told that a further patch is coming next week that should have more significant improvements, so the best we can provide is a snapshot of how the game looks right now, based on testing conducted last Friday. We'll return to the game if there's a noticeable improvement, but in the short term, we've got to move on to other projects.

....

What's clear is that there is a long road ahead because there's the sense right now that the PC version is effectively a beta - that is, feature complete but in need of a profound level of optimisation and bug-fixing. The key problem here is that while the game basically works, the hardware requirement is unexpectedly large - something that is explainable (though perhaps not justified) in some areas, while a complete mystery in others.

 

We'll start with memory management, where The Last of Us Part 1 drastically falls short. This is a game rich in high quality artwork but how those assets are streamed from system to video memory is clearly sub-optimal. The big takeaway from our simultaneous playthrough is the extent to which owners of 8GB graphics cards are disadvantaged by this conversion. The high quality texture option is broadly equivalent to the PS5 version and requires around 10-11GB of VRAM depending on the surrounding settings. However, medium is drastically cut down in environment artwork quality to the point where the game almost looks as if it's been vandalised, while the low setting offers only a broad approximation of the art.

....

Yes, we expect VRAM requirements to rise as we shift away from cross-gen development (TLOU Part 1 is a PS5 exclusive, built around a console with around 12.5GB of memory available to developers) but the bottom line is that 8GB is the sweet spot for the GPU market right now and the results for so many users will disappoint. Ultra textures require even more VRAM, but the end result is barely any different to the high setting, which in turn is very close indeed to the PS5 experience. Looking to use high textures on an 8GB card? You can do it, but expect a highly compromised experience plagued with stutter as assets stream in and out of VRAM into system memory.

 

The most obviously noticeable issue with the port can be overcome simply by possessing a GPU that has 10-11GB of memory. The RTX 3070 8GB and RTX 2080 Ti 11GB are broadly comparable in performance, but the 2080 Ti produces a far superior experience simply because the art in the game is displayed the way it was intended - the high setting works without issue. However, there's every reason you may still experience poor results, because the port's demands on the CPU are frankly astonishing.

....

Ultimately, this PC port is a profound disappointment and it's hard to believe that the developers and publishers were unaware that there were big issues before putting this out for sale. Hotfixes have already come out addressing some issues (though we've yet to see any dramatic changes, bar a modest reduction in shader compilation times and loading times) but we'd venture to suggest that there's foundational, systemic problems that need looking at - and that's going to require a lot of time. Firstly, the texture management issue is of urgent concern because right now, only a minority of PC users will actually get to see this gorgeous game in the way it was intended to be seen. It can be done - the Marvel's Spider-Man ports prove it.

 

Secondly, addressing the exceptionally high CPU requirement is also a priority. It seems likely that the hardware decompression of assets used on PS5 is replaced with an entirely CPU-based system on the PC port. This would explain the remarkably long loading times and a massively prolonged shader compilation step on first boot, not to mention the depressed performance in-game when new areas stream in as you play. Can PS5-like loading be achieved on PC without wiping out CPU performance? Again, Marvel's Spider-Man suggests it can be done.

 

The bottom line is that the requirement for top-tier CPU simply to achieve a consistent 60 frames per second puts even a console-like experience out of reach for the majority of PC users. Finally, GPU utilisation is also very, very high to a degree we find hard to explain and we shouldn't need to rely on upscaling to achieve 60fps on graphics hardware like an RTX 2070 Super.

 

There's a lot of work to be done then - and it's hard to imagine that there are any easy fixes that can be delivered in a short amount of time, but let's see how the first big patch looks. Right now at least, this port is hard to recommend. Hard, but not impossible, because it is possible to get a perfectly good experience - it's just that the multiple requirements to do so limits smooth, attractive gameplay to users with a high-end CPU and a performant GPU with a lot of VRAM. If your kit fits the bill, you're good to go unless you encounter problematic bugs. For everyone else though, it would be wise to pass on this release for now.

 

It's a bad port, and doesn't even seem to use the tools that are around to minimize VRAM usage (DLSS and DirectStorage). Even if it did, it seems to have some other fundamental issues.

=====================

On a side note, running native 4k is hard.  It takes a buttload of resources.  VRAM, GPU power, etc.

 

If you want to run 4K at high frame rates -- you need to use technologies that enable that. DLSS and DirectStorage are two of those.  Yes, that does mean if you want a console experience on a PC Game, you may need a PCIe 4.0 SSD and an appropriately powerful GPU going forward.

 

The fact that this game was released in this state, with SO many issues is inexcusable (beyond just the VRAM issue).

 

That said, if you bought a 3080/3070 and thought you would be able to run 4k/Ultra for four years (at the same time as a new console generation was launching), that's on you.

 


Played about 7 hours yesterday and it’s so damn good, the gameplay tweaks and graphic boost really put it up there with Part 2 now, though a bit less open. It’ll still look good for a long time once they figure out the performance issues and it’s going to be something that should be an evergreen for them so I’m sure they’ll fix it for everyone else soon enough. 

