
Graphical Fidelity I Expect This Gen

CGNoire

Member
But isn't it all just opinion? No game that has come out has impressed me like a next gen title should. Just take the title of this thread for example: it's called "I expect", that's far from objective.
I think his point is that, with it being top down with no fourth wall and no geometry behind the camera, the technical side of it isn't impressive enough to warrant praise like "next-gen looking".
 

CGNoire

Member
What I meant is that the PS4 can do almost all the graphical features that the PS5 does in GoW:R; I never said the PS4 can do anything, ok? I was referring to GoW.
But like I said, there aren't many really next gen titles yet.
But ok, let's say Diablo 4 doesn't qualify as a next gen title; it's nonetheless a pretty looking game.
It's really pretty, and the art style, for once, is good.
 

Schmendrick

Member
Are you also saying Unreal Engine 5 is not a next gen engine because it can scale down to mobile devices?
You're comparing the biggest commercial multiplatform engine, with unprecedented features like Nanite, to a no-name in-house engine originally targeting exactly one platform.
Tiny difference.....
 

sendit

Member
You're comparing the biggest commercial multiplatform engine, with unprecedented features like Nanite, to a no-name in-house engine originally targeting exactly one platform.
Tiny difference.....
Right. One of the largest development studios in the world is incapable of making an engine that scales.
 

Lysandros

Member
Yeah, I fucking hate having dirty IQ, and The Order was the dirtiest game I played recently. It's not just a matter of low res (and the game is not 1080p btw, more like 800p if I remember correctly); they use pretty much all the tricks in the book to ruin IQ:

Heavy blur
Low res soft image
Black bars
Film grain
Heavy postprocessing


Thank god for the absence of chromatic aberration (I think)

The game could look MUCH better.

I'm waiting for the remaster before re-playing Bloodborne, at least that one still has a chance to get one, and I could not care less about driving games :lollipop_grinning_sweat:
The game runs at 1920x800, which is ~7% more pixels than Ryse's 900p (1600x900), without any upscaling on a 1080p screen.

If I remember correctly it also had a slight, high quality chromatic aberration implementation.
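For anyone who wants to sanity-check those numbers, here's a quick back-of-the-envelope calculation (a minimal Python sketch; the only inputs are the resolutions quoted above):

```python
# Compare the raw pixel counts behind the resolutions mentioned in this thread.

def pixels(width, height):
    """Total pixel count of a framebuffer."""
    return width * height

the_order = pixels(1920, 800)   # The Order: 1886 (letterboxed 1920x800)
ryse      = pixels(1600, 900)   # Ryse: Son of Rome (900p)
full_hd   = pixels(1920, 1080)  # native 1080p, for reference

print(f"The Order: {the_order:,} px")                     # 1,536,000
print(f"Ryse:      {ryse:,} px")                          # 1,440,000
print(f"Order vs Ryse:  {the_order / ryse - 1:+.1%}")     # about +6.7%
print(f"Order vs 1080p: {the_order / full_hd - 1:+.1%}")  # about -25.9%
```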
 

GymWolf

Member
The game runs at 1920x800, which is ~7% more pixels than Ryse's 900p (1600x900), without any upscaling on a 1080p screen.

If I remember correctly it also had a slight, high quality chromatic aberration implementation.
There is no such thing as high quality chromatic aberration, it always looks like shit in videogames.

The game was so full of visual noise that I didn't even notice if CA was in or not.
 

winjer

Member
The game runs at 1920x800, which is ~7% more pixels than Ryse's 900p (1600x900), without any upscaling on a 1080p screen.

If I remember correctly it also had a slight, high quality chromatic aberration implementation.

The Order 1886 has terrible image quality, due to the use of film grain, chromatic aberration and motion blur.
The game might have some nice texture and modeling work, but with all the crap they smear on screen, the game ends up looking quite bad.
 

GymWolf

Member
The Order 1886 has terrible image quality, due to the use of film grain, chromatic aberration and motion blur.
The game might have some nice texture and modeling work, but with all the crap they smear on screen, the game ends up looking quite bad.
I thought I was going crazy...

It seems like not many people care about clean IQ.
 

GymWolf

Member
Games on console rarely have options to disable crap like chromatic aberration, motion blur, depth of field and film grain.
So most gamers on console don't even know what a clean image looks like. They have always lived in the muck and are used to these effects.
Some games let you disable film grain and other stuff, and I play all the Sony exclusives on console; no game comes close to the visual noise of The Order. Not sure about third parties, since I play all of them on PC.

Sony games usually have great IQ.
 

Blendernaut

Neo Member
You're comparing the biggest commercial multiplatform engine, with unprecedented features like Nanite, to a no-name in-house engine originally targeting exactly one platform.
Tiny difference.....
I must say that you are right, but he is right too. I'm an Unreal Engine user. I work with it every day. It's my job.
And Unreal Engine 5 is an overpowered version of Unreal Engine 4. 99% of the engine is still exactly the same, with a polished and improved UI. And yes: its new features, especially Lumen and Nanite, are incredibly awesome, and what you can do is just amazing. Even unbelievable. It's a next gen engine for sure. People don't truly know how powerful and incredible this engine is.
But that means that in-house engines can also be improved with new features that make them next-gen too, even if their base code stays the same.
So both of you could be correct somehow.
 

Blendernaut

Neo Member
Games on console rarely have options to disable crap like chromatic aberration, motion blur, depth of field and film grain.
So most gamers on console don't even know what a clean image looks like. They have always lived in the muck and are used to these effects.
Real cameras in the real world record video with, at least, a bit of motion blur. It's just how physics, light and cameras work. A real video with zero motion blur would look very unnatural. Like a weird stuttering effect. Not fluid at all.
So motion blur is NEEDED for a natural look. At least a little bit.
 

winjer

Member
Real cameras in the real world record video with, at least, a bit of motion blur. It's just how physics, light and cameras work. A real video with zero motion blur would look very unnatural. Like a weird stuttering effect. Not fluid at all.
So motion blur is NEEDED for a natural look. At least a little bit.

The only reason movies need some motion blur is because they still use very low frame rates.
If movies were filmed at 120 fps, like we can play games, then motion blur would be unnecessary.
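As a rough illustration of that point, here's a small sketch (the panning speed is a made-up example value, and it assumes the common 180-degree shutter, i.e. the exposure is half the frame interval):

```python
# Per-frame blur shrinks as the frame rate rises, because the exposure can't be
# longer than one frame interval (and with a 180-degree shutter it's half of it).

def blur_streak_px(speed_px_per_s, fps, shutter_degrees=180):
    """Length of the smear (in pixels) a moving object leaves on one frame."""
    exposure_s = (shutter_degrees / 360.0) / fps
    return speed_px_per_s * exposure_s

pan_speed = 1920.0  # object crosses a 1920px-wide frame in one second (arbitrary)

for fps in (24, 30, 60, 120):
    print(f"{fps:3d} fps -> {blur_streak_px(pan_speed, fps):5.1f} px of smear per frame")
# 24 fps -> 40.0 px, 30 fps -> 32.0 px, 60 fps -> 16.0 px, 120 fps -> 8.0 px
```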
 

Schmendrick

Member
I must say that you are right, but he is right too. I'm an Unreal Engine user. I work with it every day. It's my job.
And Unreal Engine 5 is an overpowered version of Unreal Engine 4. 99% of the engine is still exactly the same, with a polished and improved UI. And yes: its new features, especially Lumen and Nanite, are incredibly awesome, and what you can do is just amazing. Even unbelievable. It's a next gen engine for sure. People don't truly know how powerful and incredible this engine is.
But that means that in-house engines can also be improved with new features that make them next-gen too, even if their base code stays the same.
So both of you could be correct somehow.
Software is just software, ofc it can be improved/built upon.
But you're not doing major changes to the base tech while 99% of your workforce is in the middle of game production with said tech, which was the situation with GoW:R.
And studios like SSM simply don't have the workforce to keep up with monsters like UE on the side while actually developing games.
 

Blendernaut

Neo Member
The only reason movies need some motion blur is because they still use very low frame rates.
If movies were filmed at 120 fps, like we can play games, then motion blur would be unnecessary.
Man. If you know ANYTHING about how cameras work, you will know that it's absolutely IMPOSSIBLE to get no motion blur at all if there are moving objects or the camera is moving.

It's just physics. It's not about being necessary or not.

And just for you to know, when we render CGI videos from Unreal or from any renderer, we always, ALWAYS, use motion blur, not because we like the "effect", but because not having motion blur makes the video look totally unnatural. Why is that? Because of how cameras in the real world work.

PERIOD.
 

winjer

Member
Man. If you know ANYTHING about how cameras work, you will know that it's absolutely IMPOSSIBLE to get no motion blur at all if there are moving objects or the camera is moving.

It's just physics. It's not about being necessary or not.

Depending on the camera settings, motion blur can be greatly reduced, to the point of being almost unnoticeable.

And just for you to know, when we render CGI videos from Unreal or from any renderer, we always, ALWAYS, use motion blur, not because we like the "effect", but because not having motion blur makes the video look totally unnatural. Why is that? Because of how cameras in the real world work.

PERIOD.

That's because we have spent over a century with cinema using very low frame rates with motion blur. So our brains have a certain expectation of what is being shown.

And games are a different medium from cinema. We don't need to have the same issues as cinema.
We don't need 24 fps, we don't need motion blur, we don't need chromatic aberration, nor film grain and other useless crap.
 

Blendernaut

Neo Member
Even videos recorded/rendered at 120 fps have motion blur. Of course it's a lot less intense, again because of how cameras work and how much light the sensor captures every frame. But to eliminate 100% of the motion blur you would need an infinitely high shutter speed, which is physically impossible, and even if it were possible, the result would be a black image.

Motion blur is needed for a natural look. And yes, for higher frame rates, motion blur is and should be less intense, whether in the real world, CGI or videogames. Of course, you can turn it off in videogames if it annoys you. It's not the real world, so you can do that. But do not underestimate its value for visual fidelity, or the reasons why it could be important to achieve a certain look for a product/videogame.
 

01011001

Member
Even videos recorded/rendered at 120 fps have motion blur. Of course it's a lot less intense, again because of how cameras work and how much light the sensor captures every frame. But to eliminate 100% of the motion blur you would need an infinitely high shutter speed, which is physically impossible, and even if it were possible, the result would be a black image.

Motion blur is needed for a natural look. And yes, for higher frame rates, motion blur is and should be less intense, whether in the real world, CGI or videogames. Of course, you can turn it off in videogames if it annoys you. It's not the real world, so you can do that. But do not underestimate its value for visual fidelity, or the reasons why it could be important to achieve a certain look for a product/videogame.

you're aware that a camera isn't a natural look right? your eyes do not have a shutter, your eyes process light the moment it comes in, at a nearly infinite and dynamic framerate so to speak.

motion blur isn't natural, motion blur is an artifact of how cameras work. videogames should not emulate movies/cameras, movies aren't interactive, movies do not have fast camera pans during which you have to parse the world around you for enemies, items or the path forward.

motion blur isn't something that should be used in games by default and should always be an optional checkbox in the menus.
 

SF Kosmo

The Trigglypuff
motion blur isn't natural
Neither is flashing a sequence of discrete frames 30 to 60 times a second to simulate motion. Perfectly sharp images flashed in sequence do not look natural to the eye.

Motion blur needs to be calibrated to the frame-rate, and sometimes is not. A game running at 60fps should have half the blur of a game at 30 fps, and a game at 120fps should have half the blur of that, because it should be simulating a shorter "exposure" time. When done right, motion blur looks very natural and more like how our eye sees, but usually we see it in 30fps games that are trying to look like movies instead.
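A minimal sketch of that calibration idea (the values below are invented for illustration, not taken from any engine): if the blur is derived from a virtual exposure that is a fixed fraction of the frame interval, it halves automatically every time the frame rate doubles; a hard-coded blur length doesn't.

```python
# Calibrated vs. uncalibrated motion blur across frame rates (illustrative values).

SHUTTER_FRACTION = 0.5          # a "180-degree" virtual shutter: expose half the frame
HARDCODED_BLUR_S = 1.0 / 60.0   # hypothetical blur tuned once for 30fps, never rescaled

def calibrated_exposure_s(fps):
    """Simulated exposure time when the virtual shutter tracks the frame rate."""
    return SHUTTER_FRACTION / fps

for fps in (30, 60, 120):
    print(f"{fps:3d} fps: calibrated {calibrated_exposure_s(fps) * 1000:4.1f} ms of smear, "
          f"hard-coded {HARDCODED_BLUR_S * 1000:4.1f} ms")
# calibrated: 16.7 -> 8.3 -> 4.2 ms (halves as fps doubles); hard-coded stays at 16.7 ms
```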
 

RaduN

Member
The human eye sees much, much worse than some here might think.
To simplify, what we see is a combination of the information the eyes send to the brain (which is pretty crap) and what we expect to see based on prior experience.
And yes, we also perceive things with some kind of motion blur and depth of field, and yes, these effects should remain in some form or other in all visual media, because they are completely natural.

In film, framerates have nothing to do with motion blur, it's the shutter speed that's responsible for that.
 

SF Kosmo

The Trigglypuff
In film, framerates have nothing to do with motion blur, it's the shutter speed that's responsible for that.
These aren't unrelated though. High framerate cameras have to have a much shorter exposure. I get that they're not totally 1:1, but it's misleading to say they have nothing to do with each other.
 

01011001

Member
Wave your hands in front of you and say that again.

that's once again a false equivalence.

in a game you usually follow an object with your eyes. this can be an enemy, an object or the spot you want to move towards.

if you do this IRL you will not see any blur; in a game you will if there is motion blur added (and even if there isn't, due to the still imperfect display tech we are stuck with)

meanwhile if something moves fast across the screen and you're not focusing on it with your eyes, you will naturally get the blurriness that you get IRL when something fast that you're not focusing on moves past you.

additional motion blur makes games look unnatural, that's simply a fact, and it's why VR headsets are actively fighting against motion blur and usually have super low persistence displays with faster response times than typical TV or monitor displays
 

01011001

Member
Neither is flashing a sequence of discrete frames 30 to 60 times a second to simulate motion. Perfectly sharp images flashed in sequence do not look natural to the eye.

Motion blur needs to be calibrated to the frame-rate, and sometimes is not. A game running at 60fps should have half the blur of a game at 30 fps, and a game at 120fps should have half the blur of that, because it should be simulating a shorter "exposure" time. When done right, motion blur looks very natural and more like how our eye sees, but usually we see it in 30fps games that are trying to look like movies instead.

that's simply not true.
as soon as you follow an object, or text, or anything with your eyes and you can't read the text because it's blurred due to motion blur, it's instantly unnatural and beyond that also detrimental to gameplay.

and playability should always come first. more blur = worse playability due to worse motion clarity which leads to a lessened ability to parse your surroundings
 

GymWolf

Member
Man. If you know ANYTHING about how cameras work, you will know that it's absolutely IMPOSSIBLE to get no motion blur at all if there are moving objects or the camera is moving.
It's just physics. It's not about being necessary or not.
And just for you to know, when we render CGI videos from Unreal or from any renderer, we always, ALWAYS, use motion blur, not because we like the "effect", but because not having motion blur makes the video look totally unnatural. Why is that? Because of how cameras in the real world work.

PERIOD.
Well, it looks like ass in videogames; I don't play in 4K just to have everything turning into a blurry mess the moment I move the camera.


MENSTRUAL CRAMPS.
 

rofif

Member
Well, it looks like ass in videogames; I don't play in 4K just to have everything turning into a blurry mess the moment I move the camera.


MENSTRUAL CRAMPS.
You are mixing up a blurry mess and natural motion blur.
To be able to disable in-game motion blur and still get natural-looking motion, you need something like 300fps. For me, the 240hz result was VERY similar to 60hz with good motion blur, in Doom 2016 at the very least.

Essentially, good motion blur should "blur" the frame to show its movement over the time of its existence. So if you play at 60fps, each frame should be showing/containing 16ms of movement, NOT A STILL frozen frame.
The collection of 16ms movements creates realistic movement. A collection of 60 stills is not the same.

Look at photography. This is what shutter speed is. If you film a waterfall or rotating objects, you would need to set a 1/2000 shutter speed to avoid motion blur, and it looks very jerky and unnatural.
So you set the shutter speed to double your framerate: 1/60 for 30fps filming is the usual.

Great comparison here. Notice how unnatural a fast shutter speed looks. If you look at these objects in reality, you don't see them as sharp and jerky.


Wave a hand in front of your face, or the famous pencil bending illusion. That is the "natural motion blur" effect. Eyes don't use frames. The photons are continuous. But there is some delay in processing by your brain. That's why it happens.
And no, the monitor will not create motion blur for you. You are looking at a static object (the monitor) and it is not moving; only the contents it's displaying are changing. So if you animate something at 60fps on that monitor without motion blur, there is not enough frame data for your brain to make it look as it should. 300hz? Yep, that's good.

Where I agree with you is on terrible in-game motion blur implementations, but recently most stuff is good and we are long past RE4 or GTA3 levels of terrible blur.
Stuff like Uncharted 4's per-object motion blur is fantastic. And believe it or not, camera motion blur GREATLY helps on OLED at 30/40 fps. OLEDs are too fast and 30fps really looks like ass without good motion blur. Even 60fps looks kinda dry.
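To make the "each frame should contain ~16 ms of movement" idea concrete, here's a deliberately tiny sketch (plain Python on a 1D scanline; the object, its speed and the sample count are all made up, and real engines do this per pixel in a post-process pass using velocity buffers rather than by re-rendering):

```python
# Average an object over the frame's exposure window instead of freezing it at
# a single instant. At 60fps a frame covers ~16.7ms, so an object moving at
# 240 px/s smears across ~4 px within that one frame.

WIDTH = 24

def render_instant(pos):
    """A one-pixel-wide white object on a black scanline, at an instantaneous position."""
    line = [0.0] * WIDTH
    line[int(pos) % WIDTH] = 1.0
    return line

def render_with_blur(pos, velocity_px_per_s, frame_time_s, samples=8):
    """Accumulate several sub-frame positions across the frame's time window."""
    acc = [0.0] * WIDTH
    for i in range(samples):
        t = (i / samples) * frame_time_s              # sub-frame time offset
        sub = render_instant(pos + velocity_px_per_s * t)
        acc = [a + v / samples for a, v in zip(acc, sub)]
    return acc

frame_time = 1.0 / 60.0
sharp = render_instant(4)
blurred = render_with_blur(4, velocity_px_per_s=240.0, frame_time_s=frame_time)

print("frozen: ", "".join("#" if v > 0 else "." for v in sharp))
print("blurred:", "".join("#" if v > 0 else "." for v in blurred))
print([round(v, 2) for v in blurred if v > 0])        # energy spread over ~4 px
```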
 

GymWolf

Member
You are mixing up a blurry mess and natural motion blur.
To be able to disable in-game motion blur and still get natural-looking motion, you need something like 300fps. For me, the 240hz result was VERY similar to 60hz with good motion blur, in Doom 2016 at the very least.

Essentially, good motion blur should "blur" the frame to show its movement over the time of its existence. So if you play at 60fps, each frame should be showing/containing 16ms of movement, NOT A STILL frozen frame.
The collection of 16ms movements creates realistic movement. A collection of 60 stills is not the same.

Look at photography. This is what shutter speed is. If you film a waterfall or rotating objects, you would need to set a 1/2000 shutter speed to avoid motion blur, and it looks very jerky and unnatural.
So you set the shutter speed to double your framerate: 1/60 for 30fps filming is the usual.

Great comparison here. Notice how unnatural a fast shutter speed looks. If you look at these objects in reality, you don't see them as sharp and jerky.


Wave a hand in front of your face, or the famous pencil bending illusion. That is the "natural motion blur" effect. Eyes don't use frames. The photons are continuous. But there is some delay in processing by your brain. That's why it happens.
And no, the monitor will not create motion blur for you. You are looking at a static object (the monitor) and it is not moving; only the contents it's displaying are changing. So if you animate something at 60fps on that monitor without motion blur, there is not enough frame data for your brain to make it look as it should. 300hz? Yep, that's good.

Where I agree with you is on terrible in-game motion blur implementations, but recently most stuff is good and we are long past RE4 or GTA3 levels of terrible blur.
Stuff like Uncharted 4's per-object motion blur is fantastic. And believe it or not, camera motion blur GREATLY helps on OLED at 30/40 fps. OLEDs are too fast and 30fps really looks like ass without good motion blur. Even 60fps looks kinda dry.

I just turn that shit down because I hate blurry stuff on my 4K TV, don't care how natural it looks.
 

rofif

Member
I just turn that shit down because I hate blurry stuff on my 4K TV, don't care how natural it looks.
It's not blurry. Without it, the motion is often jittery, unless you play at a high framerate (not 60).
Maybe you should try it. Why would you rid yourself of smooth animation?
 

SlimySnake

The Contrarian
Horizon was the first time I disabled motion blur. I honestly don't notice it. People like Alex see issues with native 4K images and I'm like wtf, I can't even see shimmering or jaggies at 1440p, let alone 4K. I still have no idea what TAA does to a native 4K image. It looks amazing to me in every single game.

HFW had that insane brightness flickering in its native 4K 30 fps mode that just gave me literal headaches. But for whatever reason, turning down camera acceleration and turning off motion blur fixed it, and I never turned it back on. Fucking John from Digital Foundry told GG that it was a sharpness issue, so GG turned off the sharpness, completely ruining the pristine IQ of the native 4K 30 fps version. God, I hate DF sometimes, and how devs don't think twice before taking their advice.
 

GymWolf

Member
Horizon was the first time I disabled motion blur. I honestly don't notice it. People like Alex see issues with native 4K images and I'm like wtf, I can't even see shimmering or jaggies at 1440p, let alone 4K. I still have no idea what TAA does to a native 4K image. It looks amazing to me in every single game.

HFW had that insane brightness flickering in its native 4K 30 fps mode that just gave me literal headaches. But for whatever reason, turning down camera acceleration and turning off motion blur fixed it, and I never turned it back on. Fucking John from Digital Foundry told GG that it was a sharpness issue, so GG turned off the sharpness, completely ruining the pristine IQ of the native 4K 30 fps version. God, I hate DF sometimes, and how devs don't think twice before taking their advice.
TAA on PC usually blurs the shit out of everything; this is why people use alternative stuff from the NVCP.

I straight up don't use AA when I play at 4K in 90% of cases.
 
Horizon was the first time I disabled motion blur. I honestly don't notice it. People like Alex see issues with native 4K images and I'm like wtf, I can't even see shimmering or jaggies at 1440p, let alone 4K. I still have no idea what TAA does to a native 4K image. It looks amazing to me in every single game.

HFW had that insane brightness flickering in its native 4K 30 fps mode that just gave me literal headaches. But for whatever reason, turning down camera acceleration and turning off motion blur fixed it, and I never turned it back on. Fucking John from Digital Foundry told GG that it was a sharpness issue, so GG turned off the sharpness, completely ruining the pristine IQ of the native 4K 30 fps version. God, I hate DF sometimes, and how devs don't think twice before taking their advice.

This!!! The game does not appear higher than 1440p in resolution mode... when the game came out you could see the insanely detailed texture work easily! Now you can't. I can't stand DF anymore either, and I used to be a big fan. What you said about devs making a change and then forgetting about it is spot on.
 
No seriously GAF.

WHERE ARE THE NEXT GEN GAMES?!!!!!

How long are we now into this gen, and still there's not a game in sight that we can say for sure is really NEXT GEN. Where's the leap in lighting and physics or other graphics effects that wouldn't be at all possible on PS4?

I want to see a game and go 'Fuck, PS4 and Xbox One will spontaneously combust trying to run this'. Hasn't happened yet.
 

amigastar

Member
No seriously GAF.

WHERE ARE THE NEXT GEN GAMES?!!!!!

How long are we now into this gen, and still there's not a game in sight that we can say for sure is really NEXT GEN. Where's the leap in lighting and physics or other graphics effects that wouldn't be at all possible on PS4?

I want to see a game and go 'Fuck, PS4 and Xbox One will spontaneously combust trying to run this'. Hasn't happened yet.
Yep, there aren't really next gen looking games right now. I think GTA 6 will be one of the first real next gen games.
 
Just played the demo on PS5 in performance mode... the image quality is terrible! This has me bummed, because I want to play this at 60 fps. Fuck! Why, Capcom? RE8 has excellent IQ and appears to be a much higher resolution! Same with RE2/3! Wtf happened here?

PS - if not for the low res and disappointing IQ, RE4 would be impressive for a cross gen game. Also, I highly recommend turning off chromatic aberration in the settings if you're on PS5. It helps a little but not enough. The resolution is not good on PS5!
 
No seriously GAF.

WHERE ARE THE NEXT GEN GAMES?!!!!!

How long are we now into this gen, and still there's not a game in sight that we can say for sure is really NEXT GEN. Where's the leap in lighting and physics or other graphics effects that wouldn't be at all possible on PS4?

I want to see a game and go 'Fuck, PS4 and Xbox One will spontaneously combust trying to run this'. Hasn't happened yet.

I'm convinced it's never happening at this point. I've played everything worth playing this gen. I know what the limitations of these systems are by now. The best we can hope for is something like Ratchet, Demon's Souls, Forbidden West or Ragnarok. Only first party Sony games can look great at both high resolutions and high framerates, and even those are compromised in SOME way, such as lacking ray tracing or fidelity (like Ragnarok, which looks good but isn't pushing graphics).

Everything that's received a PS5 upgrade from last gen is only able to get around 1440p/60 with a mix of max/high settings... and those are last gen games. Every cross gen, new game has similar limitations in terms of resolution at 60 fps: Elden Ring, Dying Light 2, Cyberpunk, etc.

Whenever there's a game that really pushes graphics, there are drawbacks in terms of resolution, like this RE4 demo, which would look stunning if not for the low resolution and poor image quality. Dead Space Remake and Callisto Protocol are only able to do 1440p/60 without RT (and not even "next gen" visuals).

Ever since the start of this gen up until the present, there have been obvious limitations on everything that's come out for these consoles.
 

Minsc

Member
I'm convinced it's never happening at this point. I've played everything worth playing this gen. I know what the limitations of these systems are by now. The best we can hope for is something like Ratchet, Demon's Souls, Forbidden West or Ragnarok. Only first party Sony games can look great at both high resolutions and high framerates, and even those are compromised in SOME way, such as lacking ray tracing or fidelity (like Ragnarok, which looks good but isn't pushing graphics).

Everything that's received a PS5 upgrade from last gen is only able to get around 1440p/60 with a mix of max/high settings... and those are last gen games. Every cross gen, new game has similar limitations in terms of resolution at 60 fps: Elden Ring, Dying Light 2, Cyberpunk, etc.

Whenever there's a game that really pushes graphics, there are drawbacks in terms of resolution, like this RE4 demo, which would look stunning if not for the low resolution and poor image quality. Dead Space Remake and Callisto Protocol are only able to do 1440p/60 without RT (and not even "next gen" visuals).

Ever since the start of this gen up until the present, there have been obvious limitations on everything that's come out for these consoles.
Wasn't one of the biggest points realized though? The blazing-super-duper fast SSD? We are back at 1-2 second loading times instead of 50 second load screens, so there's the real next-gen benefit.

I agree with you; it's very clear this generation of consoles isn't able to run 4K, high framerate and ray tracing all at once, which I think upsets people. The minute you start trying to focus on framerate, you need to sacrifice resolution or something else.
 

CGNoire

Member
You are mixing up a blurry mess and natural motion blur.
To be able to disable in-game motion blur and still get natural-looking motion, you need something like 300fps. For me, the 240hz result was VERY similar to 60hz with good motion blur, in Doom 2016 at the very least.

Essentially, good motion blur should "blur" the frame to show its movement over the time of its existence. So if you play at 60fps, each frame should be showing/containing 16ms of movement, NOT A STILL frozen frame.
The collection of 16ms movements creates realistic movement. A collection of 60 stills is not the same.

Look at photography. This is what shutter speed is. If you film a waterfall or rotating objects, you would need to set a 1/2000 shutter speed to avoid motion blur, and it looks very jerky and unnatural.
So you set the shutter speed to double your framerate: 1/60 for 30fps filming is the usual.

Great comparison here. Notice how unnatural a fast shutter speed looks. If you look at these objects in reality, you don't see them as sharp and jerky.


Wave a hand in front of your face, or the famous pencil bending illusion. That is the "natural motion blur" effect. Eyes don't use frames. The photons are continuous. But there is some delay in processing by your brain. That's why it happens.
And no, the monitor will not create motion blur for you. You are looking at a static object (the monitor) and it is not moving; only the contents it's displaying are changing. So if you animate something at 60fps on that monitor without motion blur, there is not enough frame data for your brain to make it look as it should. 300hz? Yep, that's good.

Where I agree with you is on terrible in-game motion blur implementations, but recently most stuff is good and we are long past RE4 or GTA3 levels of terrible blur.
Stuff like Uncharted 4's per-object motion blur is fantastic. And believe it or not, camera motion blur GREATLY helps on OLED at 30/40 fps. OLEDs are too fast and 30fps really looks like ass without good motion blur. Even 60fps looks kinda dry.

Zero motion blur at 60fps+ looks correct on a plasma display. When I track objects with my eyes they become clear every time on it. Same with CRT.
Not a fan of anything more than subtle object motion blur at 60fps+.
 

CGNoire

Member
TAA on PC usually blurs the shit out of everything; this is why people use alternative stuff from the NVCP.

I straight up don't use AA when I play at 4K in 90% of cases.
For me the usual target is native 4K with just FXAA. At 4K the FXAA does a good enough job, and the blur at that resolution, when downsampled, isn't noticeable.
 

rofif

Member
Zero motion blur at 60fps+ looks correct on a plasma display. When I track objects with my eyes they become clear every time on it. Same with CRT.
Not a fan of anything more than subtle object motion blur at 60fps+.
Proper motion blur gets lighter and lighter the more fps you've got.
 

Aaron Olive

Member
PS4.5 is already too generous imho. The lighting is atrocious.
AO seems nonexistent in many cases, shadow rendering distance is laughable, the bright areas are basically just bloom at 500%, indirect lighting is just permaglow-everything.....
We've had lots of PS4 games doing this much better. Side by side with something like RDR2 this game looks older, not newer....
Shhhhh! Don't let rofif know you said this. Forspoken is his sweet baby Jesus.
 

OCASM

Member
If you track your hand it won't blur.
I get what he is saying. He's saying we should rely on our own vision for the blur and not manually insert it.
Except our vision can't add motion blur to a series of static images. Motion blur should be a standard rendering feature, but so should the option to disable it for those who dislike it. A more advanced implementation should be possible in VR now that some headsets have eye tracking.
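As a purely hypothetical sketch of what an eye-tracking-aware implementation could do (every name and number below is invented for illustration; this doesn't describe any shipping headset API): scale each object's blur by how fast it moves relative to the player's gaze, so whatever the eye is tracking stays sharp while untracked fast movers still smear.

```python
# Hypothetical gaze-relative motion blur weighting. Screen-space velocities are
# in px/s; anything the gaze is following gets little or no blur, anything
# moving fast relative to the gaze gets the full amount.

def gaze_relative_blur_scale(object_velocity, gaze_velocity, max_velocity=2000.0):
    """Return a 0..1 blur weight: 0 = keep sharp, 1 = apply full motion blur."""
    relative_speed = abs(object_velocity - gaze_velocity)
    return min(relative_speed / max_velocity, 1.0)

# The player's eyes are following an enemy moving at 1500 px/s:
print(gaze_relative_blur_scale(object_velocity=1500.0, gaze_velocity=1500.0))   # 0.0 -> sharp
# A projectile whips past the other way while the gaze stays on the enemy:
print(gaze_relative_blur_scale(object_velocity=-500.0, gaze_velocity=1500.0))   # 1.0 -> full blur
# The camera pans (gaze fixed on a landmark moving at -800 px/s on screen):
print(gaze_relative_blur_scale(object_velocity=-800.0, gaze_velocity=-800.0))   # 0.0 -> sharp
```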
 

CGNoire

Member
Except our vision can't add motion blur to a series of static images. Motion blur should be a standard rendering feature, but so should the option to disable it for those who dislike it. A more advanced implementation should be possible in VR now that some headsets have eye tracking.
Plasma isn't a series of static images; it uses a 600hz sub-field drive to reproduce images.
 

OCASM

Member
Plasma isn't a series of static images; it uses a 600hz sub-field drive to reproduce images.
Well, that's not really the common case. If we had a super high-framerate sample-and-hold display then sure, motion blur would occur naturally. Though in that case motion blur haters would be screwed, because then they wouldn't be able to turn it off lol.
 