
The Last of Us Part I PC Features Trailer- Ultra-Wide Support, Left Behind, and More

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Yeah, I do a few other bits and pieces where I’ve seen usage go up and I typically have other apps open in the background when playing games. I can imagine that more games are going to start asking for 32GB RAM in the not too distant future too. No harm in upgrading while it’s cheapish!

Also, we’ve not played the game yet, that requirement could actually be accurate. We’ll need to wait and see.
Ahh.
If you are multitasking while gaming, then yeah, 48GB for 180 dollars makes sense.
But realistically, games on the high end use about 12GB; I don't see this game suddenly being the one to break the 20GB threshold that would require a 32GB kit to run.
Hell, I'd be shocked if this game even uses 10GB of RAM.
 

Hoddi

Member
It's way more than just a softer image; look at the hair in various places. There are way worse examples still.
The whole concept of games running at X resolution kinda doesn't exist anymore. Modern games have countless different buffers and render targets running at resolutions that are different from the output resolution. Depth-of-field effects often run at quarter resolution, for example, so they'll only run at 1080p when your output is 4k. There's no real way to quantify something like that as running at X resolution.
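As a rough illustration of that point (the scale factors below are made-up examples, not any particular game's settings), here's how per-effect render targets end up at different sizes than the output:

```python
# Hypothetical per-effect render targets scaled relative to the output resolution.
output = (3840, 2160)  # 4K output

render_targets = {
    "main_color":     1.0,   # full resolution
    "depth_of_field": 0.5,   # half width/height -> a quarter of the pixels (1920x1080 at 4K)
    "volumetric_fog": 0.25,  # a sixteenth of the pixels
    "bloom":          0.5,
}

for name, scale in render_targets.items():
    w, h = int(output[0] * scale), int(output[1] * scale)
    print(f"{name}: {w}x{h} ({w * h / (output[0] * output[1]):.0%} of output pixels)")
```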

It's the same reason that 1080p games often look blurry as hell nowadays. Older TAA games like Doom 2016 always looked perfectly fine at 1080p while newer games often look godawful at the same resolution.
 

BigBeauford

Member
I wonder what the general consensus is among PS5 players.

Get Last of Us or Returnal?
Honestly, as a roguelike junkie, I think Returnal is fucking amazing. If the TLOU package had the Factions multiplayer, TLOU would be the no-brainer pick, as Factions is one of the most underrated multiplayer experiences in existence. Without Factions, I prefer Returnal.
 
I got a 4080 but no 4K monitor so I should be good for 1440p ~100fps hopefully.

does it have RTX/DLSS3?

I love that 32GB is becoming more common on PC now. I'm going to be upgrading to 64GB RAM soon. It's getting too close for comfort seeing games run at 20-25GB lol.
 

Senua

Member
I wonder what the general consensus is among PS5 players.

Get Last of Us or Returnal?
Kenan Thompson Snl GIF by Saturday Night Live
 

hinch7

Member
It's not; it often runs into extreme performance bottlenecks at 1440p and above.

It literally says 1080p on the box; it's a GPU targeted at 1080p. It can match the PS5 like for like at 1080p, but once you go beyond that, it starts having massive performance slowdowns.

Here's how it plays out:


At 4K, the 5700 XT is 12% faster.
At 1440p, they're matched.
At 1080p, the 6600 XT is 8% faster.

That's a whopping 20% relative performance drop going from 1080p to 4K.

Cyberpunk is even more brutal:


At 4K, the 5700 XT is 23% faster.
At 1440p, they're matched.
At 1080p, the 6600 XT is 10% faster.

A whopping 23% relative drop going from 1440p to 4K, and another 10% from 1080p to 1440p.
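To be clear about how those percentages are derived, here is the arithmetic with hypothetical framerates (the fps values below are placeholders for the first chart's 12%/8% case; only the method matters):

```python
# Placeholder framerates, not the actual benchmark figures from the charts above.
fps = {
    "1080p": {"6600xt": 108, "5700xt": 100},
    "1440p": {"6600xt": 75,  "5700xt": 75},
    "4K":    {"6600xt": 44,  "5700xt": 50},
}

deltas = {}
for res, r in fps.items():
    # Relative standing of the 6600 XT versus the 5700 XT at each resolution.
    deltas[res] = r["6600xt"] / r["5700xt"] - 1
    print(f"{res}: 6600 XT is {deltas[res]:+.0%} vs the 5700 XT")

# The "20%" figure is the swing in relative standing between 1080p and 4K,
# i.e. from roughly +8% down to -12%, about 20 percentage points.
print(f"swing from 1080p to 4K: about {abs(deltas['1080p'] - deltas['4K']):.0%}")
```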

The 6600 XT is a gimped product that only works to its potential at 1080p.

Problems are even more pronounced with ray tracing:



At native 1080p with RT set to High (the PS5's ray tracing settings), it gets around 50-70 FPS.

The PS5 gets these framerates at native 4K in its Fidelity mode:



As I said, the 6600 XT has pathetic bandwidth at 256 GB/s, and Infinity Cache only works well enough at lower resolutions where cache hit rates are high. At 4K, at 1440p, and in ray tracing situations, it falls quite a bit below the PS5.
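A very rough way to picture that Infinity Cache point, assuming made-up hit rates and a ballpark cache bandwidth (none of the numbers below are measured values; only the trend matters):

```python
# Illustrative-only model: effective bandwidth as a blend of cache and VRAM bandwidth,
# weighted by cache hit rate. Hit rates and cache bandwidth here are assumptions.
VRAM_BW = 256    # GB/s, the 6600 XT's raw GDDR6 bandwidth
CACHE_BW = 1000  # GB/s, rough ballpark for Infinity Cache (assumption)

def effective_bw(hit_rate):
    return hit_rate * CACHE_BW + (1 - hit_rate) * VRAM_BW

for res, hit in [("1080p", 0.60), ("1440p", 0.45), ("4K", 0.30)]:
    print(f"{res}: ~{effective_bw(hit):.0f} GB/s effective (assumed {hit:.0%} hit rate)")
```

The point is just the trend: as the hit rate falls at higher resolutions, the effective bandwidth sinks back toward the raw 256 GB/s.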

There really is no GPU that is a proper match for the PS5. The 6700 XT overshoots, and the 6600 XT is a situational, gimped card. The best way is to compare the 6700 XT to the PS5 and see how much of a lead it has; otherwise comparisons will be moot.

As I said, just because the 6600 XT matches the PS5 in terms of TFLOPS does not mean it will match it in practice. There are other factors, and bandwidth is the most crucial one.

The PS5's GPU is more akin to the 6700 (not XT): same shader count, CUs (36), ROPs and TMUs, and even similar clock speeds, albeit with a lesser 192-bit bus and bandwidth and 10GB of VRAM, but with 80MB of L3 cache (Infinity Cache).

Though you can't really compare 1:1 even with the same GPU layout. Each platform uses different APIs and performs differently, and consoles, being closed systems, are likely to be more optimised and perform better like for like, especially if you put them side by side. A Windows PC running DX will have a lot more overhead with different configurations and drivers. Plus there are other features like cache scrubbers and native decompression hardware on console, versus the higher cache on RDNA 2 dGPUs.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
I got a 4080 but no 4K monitor so I should be good for 1440p ~100fps hopefully.

does it have RTX/DLSS3?

I love that 32GB is becoming more common on PC now. I'm going to be upgrading to 64GB RAM soon. It's getting too close for comfort seeing games run at 20-25GB lol.
What game runs at 20 - 25GB?



Please don't say endgame Anno or custom Skylines.
 

Kataploom

Member
The PS5's GPU is more akin to the 6700 (not XT): same shader count, CUs (36), ROPs and TMUs, and even similar clock speeds, albeit with a lesser 192-bit bus and bandwidth and 10GB of VRAM, but with 80MB of L3 cache (Infinity Cache).

Though you can't really compare 1:1 even with the same GPU layout. Each platform uses different APIs and performs differently, and consoles, being closed systems, are likely to be more optimised and perform better like for like, especially if you put them side by side. A Windows PC running DX will have a lot more overhead with different configurations and drivers. Plus there are other features like cache scrubbers and native decompression hardware on console, versus the higher cache on RDNA 2 dGPUs.
Yet it seems to perform around the 6600 to 6600 XT (best case) in most cases... I think it's still pretty good performance, I just wouldn't use it for 4K or RT... BTW, I'm mostly talking about performance mode, since that's the one I'm interested in.

OT: This game is the trigger for me to get 32 GB of RAM. I know it won't necessarily use it, but I'm on 16 GB of DDR4 at 2400 MHz and have never had a single problem, since I'm almost always GPU bound, but I wanted to upgrade to 3200 MHz just to not leave performance on the table, and this seems like the perfect excuse. I held off a little since DDR5 is already out, but it's too expensive to migrate to DDR5 if I'm getting almost the same performance in GPU-bound scenarios anyway.
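For the 2400 MHz vs 3200 MHz part, the raw bandwidth difference is easy to put a number on; a quick sketch assuming dual-channel DDR4:

```python
# Peak theoretical bandwidth for DDR4: MT/s x 8 bytes per channel x number of channels.
def ddr4_bandwidth_gbs(mt_per_s, channels=2):
    return mt_per_s * 8 * channels / 1000  # GB/s

for speed in (2400, 3200):
    print(f"DDR4-{speed}: {ddr4_bandwidth_gbs(speed):.1f} GB/s peak (dual channel)")
# DDR4-2400: 38.4 GB/s vs DDR4-3200: 51.2 GB/s -- about a third more headroom,
# though in GPU-bound scenarios the real-world gain is usually small.
```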
 

Alex11

Member
I think it's time to get a PS5 and be done with all these different requirements for every game that releases; it's becoming a chore.
It is a very pretty game, no doubt, but a 4080 and 32 GB of RAM seem a bit much for a game that is linear, has no time-of-day system, and uses pre-calculated lighting. And yeah, I know you can't compare PS5 specs 1:1 to a PC, but still.
 

Kataploom

Member
These two games are some of the ones I've tested myself on my 6700 XT just this past month:

A Plague Tale: Requiem: Around 50 to 65 fps at native 1440p... On PS5 it doesn't even render at native 1440p for its 30-40 fps modes.

Forspoken: Around 60 to 80 fps at native, non-dynamic 1440p... On PS5 it needs FSR to reach dynamic 1440p, and the framerate still falls to around 45 fps quite frequently.

Also, DF put Returnal on PS5 against a 2060 Ti and it performs similarly even at 1440p, which the PS5 reaches with checkerboarding and TAA from 1080p. The 2060 Ti was using DLSS Quality, which is still higher than 1080p internally.

Even if you account for CPU bottlenecks, the PC doesn't have them, so it can show the full power of the GPU in scenarios where the PS5 can't.

I'll be playing TLOU Part I a little after it releases (gotta wait for performance reviews/patches first and get the 32 GB of RAM), so I'll be comparing.
 

Mr Moose

Member
These two games are some of the ones I've tested myself on my 6700 XT just this past month:

A Plague Tale: Requiem: Around 50 to 65 fps at native 1440p... On PS5 it doesn't even render at native 1440p for its 30-40 fps modes.

Forspoken: Around 60 to 80 fps at native, non-dynamic 1440p... On PS5 it needs FSR to reach dynamic 1440p, and the framerate still falls to around 45 fps quite frequently.

Also, DF put Returnal on PS5 against a 2060 Ti and it performs similarly even at 1440p, which the PS5 reaches with checkerboarding and TAA from 1080p. The 2060 Ti was using DLSS Quality, which is still higher than 1080p internally.

Even if you account for CPU bottlenecks, the PC doesn't have them, so it can show the full power of the GPU in scenarios where the PS5 can't.

I'll be playing TLOU Part I a little after it releases (gotta wait for performance reviews/patches first and get the 32 GB of RAM), so I'll be comparing.
Image quality stats are a close match to the first game. Both PS5 and Series X each stay in place at a native 1440p resolution, reconstructing up to 4K using a temporal solution.
 

hinch7

Member
Yet it seems to perform around the 6600 to 6600 XT (best case) in most cases... I think it's still pretty good performance, I just wouldn't use it for 4K or RT... BTW, I'm mostly talking about performance mode, since that's the one I'm interested in.

OT: This game is the trigger for me to get 32 GB of RAM. I know it won't necessarily use it, but I'm on 16 GB of DDR4 at 2400 MHz and have never had a single problem, since I'm almost always GPU bound, but I wanted to upgrade to 3200 MHz just to not leave performance on the table, and this seems like the perfect excuse. I held off a little since DDR5 is already out, but it's too expensive to migrate to DDR5 if I'm getting almost the same performance in GPU-bound scenarios anyway.
CPU bottleneck. The Zen 2 CPUs in the consoles are fairly slow compared to current CPUs in terms of IPC, and they're clocked at 3.5GHz with a small cache. Hence why, when pushed to higher framerates, they dip a lot compared to more performant hardware, and fare better at 30fps. As I said, you can't compare 1:1.
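One way to see why higher framerate targets expose the console CPUs is the per-frame time budget; a minimal sketch:

```python
# Frame-time budget: all CPU-side work for a frame has to fit inside this window.
for target_fps in (30, 60, 120):
    budget_ms = 1000 / target_fps
    print(f"{target_fps} fps -> {budget_ms:.1f} ms per frame for CPU-side work")
# 33.3 ms at 30 fps vs 16.7 ms at 60 fps: a slower Zen 2 CPU that comfortably fits
# the 30 fps budget can become the limiting factor once that budget is halved.
```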
 

Gaiff

Member
I think it's time to get a PS5 and be done with all these different requirements for every game that releases; it's becoming a chore.
It is a very pretty game, no doubt, but a 4080 and 32 GB of RAM seem a bit much for a game that is linear, has no time-of-day system, and uses pre-calculated lighting. And yeah, I know you can't compare PS5 specs 1:1 to a PC, but still.
Where did you see that a 4080 is required to play this game? You can play it just fine with a GPU 1/10th of the price.
 

Alex11

Member
Where did you see that a 4080 is required to play this game? You can play it just fine with a GPU 1/10th of the price.
I didn't write required; of course I was referring to 4K60 at max settings. I have a 1050 Ti now, but low settings at 720p, with all due respect, thanks but no thanks. And where I am, a GPU for 1080p High settings is in the same price range as a PS5, which runs at higher settings and resolution.

Don't take it the wrong way, I'm not looking to start something, and I ain't a fan of any console or anything; it's just a conclusion I've reached if I want to keep gaming, and it's just for my specific situation.
 

Kataploom

Member
CPU bottleneck. The Zen 2 CPUs in the consoles are fairly slow compared to current CPUs in terms of IPC, and they're clocked at 3.5GHz with a small cache. Hence why, when pushed to higher framerates, they dip a lot compared to more performant hardware, and fare better at 30fps. As I said, you can't compare 1:1.
Not everything is a CPU bottleneck on those consoles. The GPU can punch above its weight at 4K (30 fps) against one with similar specs on PC thanks to bandwidth, but at 1440p it's not even at 6700 (non-XT) level. I don't care though, 1440p is good enough for me, just to clarify... I got hyped by my new 6700 XT and have been doing tests against DF and other sources just for the lols, and in most cases it's around 20% above the consoles.
 
The whole concept of games running at X resolution kinda doesn't exist anymore. Modern games have countless different buffers and render targets running at resolutions that are different from the output resolution. Depth-of-field effects often run at quarter resolution, for example, so they'll only run at 1080p when your output is 4k. There's no real way to quantify something like that as running at X resolution.
That's the problem. It's like with CRTs, which are still perfect for modern games, but because no one decided to improve that screen tech while significantly shrinking the tubes and monitors, we have to deal with BS tech like IPS, VA etc., which have a ton of problems everyone is looking for solutions to, and devs had to implement motion blur (which a lot of people hate), whereas CRTs don't even need it. Same goes for image reconstruction tech, the sole purpose of which is to help with performance, but at the cost of ruining the image, which in my eyes is not a worthy trade-off. All of the above are solutions in search of more problems that someone has to deal with at some point, creating their own problems for others to fix, and so on; the cycle never ends. The only tech which is (arguably) better than native res is DLSS, but even the latest version has problems, BUT consoles can't use it.

I mean, okay, not a lot of people actually use photo modes in games to take screenshots, but for those of us who do, any kind of image reconstruction tech defeats the whole purpose of photo modes and makes them pointless. Insomniac are the only devs actually allowing their games to run at a much higher res, so image reconstruction (if it's active in the game in some form) is not a problem. Native res is always better than any image reconstruction tech, no matter if it's only being used for certain visual aspects of the game like hair rendering, fog, lighting, shadows, foliage etc.

The whole problem of the gaming industry in general, when it comes to tech and hardware, is that it's running ahead of the train without actually having the hardware necessary to do it (PC hardware included), so it cheats and gives the impression of the opposite. No one is using the full potential of UE3 or UE4 (which are not perfect, but Arkham Knight is the sole example of what you can do with UE3 to beat any other game on that engine visually and tech-wise) or any other game engine, because fuck it, let's use UE5 and think up some BS tech to make games run on underpowered hardware, while overpriced-af PC hardware spends years catching up just to run games at native res.
It's the same reason that 1080p games often look blurry as hell nowadays. Older TAA games like Doom 2016 always looked perfectly fine at 1080p while newer games often look godawful at the same resolution.
It depends on the amount, complexity and quality of assets on screen and the overall artistic vision. Take the latest RE4 demo for example, which will look blurry af at 1080p with TAA; same goes for any other visually complex game. Also, why the fuck is no one using SSAA in modern games? It looks amazing in Metro: Last Light at 1080p when downsampling from 4K, without TAA or any other modern BS AA solution. Modern high-end GPUs have zero problems running SSAA.
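Downsampling from 4K to 1080p is effectively 4x ordered-grid SSAA; a minimal sketch of the idea, using a random array as a stand-in for a rendered 4K buffer:

```python
import numpy as np

# 4x SSAA sketch: render at 2x width and 2x height (4K for a 1080p output),
# then box-filter each 2x2 block of samples down to one display pixel.
supersampled = np.random.rand(2160, 3840, 3)  # stand-in for a 4K render target

h, w, c = supersampled.shape
downsampled = supersampled.reshape(h // 2, 2, w // 2, 2, c).mean(axis=(1, 3))

print(downsampled.shape)  # (1080, 1920, 3) -- four samples averaged per output pixel
```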
 