
Digital Foundry: Why the return of 30fps console games is inevitable


Slava

Recommended Posts

5 minutes ago, Akrioz said:

This technology is only good for video, because it adds a lot of input lag. I guess some people just don't care about it or don't notice. But if you really want the smooth and responsive gameplay that 40+ FPS provides, then it just defeats the purpose.

 

Yeah, true. I forgot about that. 
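For a rough sense of the numbers, here's a minimal sketch (plain Python; it assumes the interpolator has to buffer at least one real frame before it can synthesise an in-between one, which is where the extra input lag comes from):

```python
# Rough frame-time / latency arithmetic for native vs interpolated output.
# Assumption (not from the thread): a frame interpolator has to hold the next
# real frame before it can synthesise the in-between one, so it adds at least
# one source frame of delay on top of the normal render latency.

def frame_time_ms(fps: float) -> float:
    """Time between displayed frames in milliseconds."""
    return 1000.0 / fps

for fps in (30, 40, 60, 120):
    print(f"native {fps:>3} fps -> {frame_time_ms(fps):5.1f} ms per frame")

source_fps = 30          # what the game actually renders
display_fps = 60         # what the TV / frame-generation step shows
added_delay = frame_time_ms(source_fps)   # >= one buffered source frame

print(f"\n{source_fps} fps interpolated to {display_fps} fps:")
print(f"  looks as smooth as {display_fps} fps, but adds >= {added_delay:.1f} ms of input lag")
print(f"  (a native {display_fps} fps game updates every ~{frame_time_ms(display_fps):.1f} ms)")
```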

Link to comment
Share on other sites

21 hours ago, DrBloodmoney said:

...but then with some smaller indie games, I do sometimes find myself wondering "what's the point in asking me whether I want Performance or Graphics - it's immaterial" 😂

Peppa Pig's performance mode (solid 60fps!) on PS5 completely changed my experience of that game

20 hours ago, KingGuy420 said:

I've never understood the "we need high framerates" crowd. High framerate doesn't automatically make a game good. Who gives a shit. Most of the widely considered "best games ever" didn't run at 60fps.

 

The obsession over it, when it's been roundly proven to not matter at all, is just mind-boggling.

Games are about interaction in the end. A great framerate with perfect frame pacing allows for incredible responsiveness; games feel a lot better to play like that. Recently I played Dirt 4 and Rally on all three platforms (PS4, XONE and PC). The PS4 version on PS5 was by far the worst: blurred 1080p graphics running at a poorly paced 60fps due to the crappy PS4 emulation on PS5. The XONE version on a Series X felt much better to play, as the better upscaling from 1080p made things less blurry, and the fps boost to 60 was okayish (still not perfect). Comparing that to my PC running the game at native 4K and 144fps is ridiculous: the game felt much more pleasant to play, I had a lot more control over the car, and I was able to smash my PS4 and XONE times every time.
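To put a number on "poorly paced", here's a small illustrative sketch (Python, with invented frame times rather than anything measured from Dirt) showing how two captures can both average 60fps while one of them stutters:

```python
# Illustration of frame pacing: average fps can hide stutter.
# These frame times are invented for the example, not measured from any game.

well_paced  = [16.7] * 6                           # steady ~60 fps
badly_paced = [8.3, 25.0, 8.3, 25.0, 8.3, 25.0]    # also ~60 fps on average

def average_fps(frame_times_ms):
    return 1000.0 / (sum(frame_times_ms) / len(frame_times_ms))

for name, times in (("well paced", well_paced), ("badly paced", badly_paced)):
    print(f"{name:>12}: avg {average_fps(times):.0f} fps, "
          f"worst frame {max(times):.1f} ms")
```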

  • Like 1
Link to comment
Share on other sites

If the game is well optimised for 30fps then I don't see the problem. I actually prefer 30fps on story-driven cinematic games as it appears more filmic.

 

I do generally play adventure games rather than twitch shooters or multiplayer games though, so I do get why people prefer 60fps on those.

 

A lower fps is not a reason to avoid a great game.

Link to comment
Share on other sites

3 minutes ago, savageamusement said:

I actually prefer 30fps on story-driven cinematic games as it appears more filmic.

 

That's a good point actually. As much as I love 60fps and will always choose it when given the option, even in cinematic games, 60fps does give off this somewhat weird(?) look in certain situations, cutscenes especially. It isn't always that noticeable, but sometimes it is. 30fps is definitely more filmic in some situations. I read somewhere a while back that some developers purposely use 30fps for cutscenes as a stylistic choice. I could have sworn that Rift Apart did, and I think Ghost of Tsushima Director's Cut on PS5 did as well. There are probably more examples.

 

Personally I don't mind cutscenes running at 30fps, because they do tend to look better, and because you're not going to notice it anyway due to the way the camera works, with it being fixed or suddenly shifting perspective. You're also not interacting with the game during cutscenes. Movies run at 24fps, but nobody notices for the same reasons.

Link to comment
Share on other sites

38 minutes ago, The Alchemist said:

 

That's a good point actually. As much as I love 60fps and will always choose it when given the option, even in cinematic games, 60fps does give off this somewhat weird(?) look in certain situations, cutscenes especially. It isn't always that noticeable, but sometimes it is. 30fps is definitely more filmic in some situations. I read somewhere a while back that some developers purposely use 30fps for cutscenes as a stylistic choice. I could have sworn that Rift Apart did, and I think Ghost of Tsushima Director's Cut on PS5 did as well. There are probably more examples.

 

Personally I don't mind cutscenes running at 30fps, because they do tend to look better, and because you're not going to notice it anyway due to the way the camera works, with it being fixed or suddenly shifting perspective. You're also not interacting with the game during cutscenes. Movies run at 24fps, but nobody notices for the same reasons.

 

There were the Hobbit movies, which were shot at 48fps, and many people got nauseous and dizzy watching them. Since then, movies have basically reverted to the 24fps standard. Apparently the new Avatar movie will be at 48fps too, so we'll see if people still have the same issues. I would imagine so.

 

I think there is something to the idea that people have to get used to visually processing higher framerates from a physiological standpoint, and not just as a matter of preference or which is better. Even for myself, cutscenes in games that run at framerates higher than 30fps start giving me motion sickness right away. When playing the game itself at 60fps I'm usually fine, but sometimes I still get nauseous. My new TV has helped with its advanced motion blur and smoothness settings, but I'm still not 100% accustomed to 60fps, let alone higher framerates than that.

 

So in a way, 30fps may actually be the preferred option for more cinematic games that are closer to movies than they are to games, for that reason alone. Also, like you said, it just looks weird for cutscenes to run at higher framerates; people aren't used to it. Just like the Hobbit movies looked so weird to me all those years ago. I agree with the film critics: 24fps is definitely the best framerate for cinema, and therefore also for games that are more cinematically oriented.

Link to comment
Share on other sites

As long as it doesn't fall below that I don't really care; choppy games are a nightmare.

21 hours ago, Deadly_Ha_Ha said:

The first time I played Kingdom Hearts in 60fps instead of 30 I could not BELIEVE what I was witnessing. It's cringe reading people in this thread man. 

Absolutely. I lost count of how many times I played KH2 on PS2, and when I first played it on PS4 it took me a while to get the guard timing right lol

  • Like 2
Link to comment
Share on other sites

If you can't tell the difference between 30 and 60 FPS, either you're legally blind and should book your next eye doctor's appointment, or you have a shitty monitor.

60fps is a great way to experience some games. It turned me around on playing some PS4 titles like Horizon Zero Dawn and Days Gone. Seriously, I challenge anyone to play the latter and tell me that the experience is not better than playing on an older console that can barely manage 30fps. I played Control in the same performance mode, and it was one of the best action games I've experienced in a long time.

We are in an age of advanced technology where there is really no excuse not to include it, pretty graphics be damned. You can still offer two modes for those who want them, but you can't tell me that 60fps doesn't matter when it makes so much difference.

 

  • Like 2
Link to comment
Share on other sites

Of course it's inevitable. Sony increasing the value of the hardware financially doesn't change the fact that the hardware is constantly ageing and losing value technologically.

 

Running PS4 games, remasters, and remakes of linear games at 60fps was never really impressive enough to make me expect it for the entire generation.

  • Like 1
Link to comment
Share on other sites

On 27/10/2022 at 3:40 PM, DrZero_1983 said:

Me still trying to understand the difference between 30fps and 60fps:

 

[image]

If you play Uncharted 4 in single player and then switch to Uncharted 4 multiplayer, you'll understand the difference. You feel it but don't see it. It's a night-and-day difference when going for headshots.

  • Like 2
Link to comment
Share on other sites

On 10/27/2022 at 1:36 PM, farradono said:

It's a pity, because we struck a good balance lately, with games asking you whether you prefer a performance mode, which usually nets you 60fps, or a graphics mode, for those interested in seeing grass or water detail. This should be the standard.

 

shouldn't have to choose in 2022 tbh

 

60fps/4k with ray tracing should be the standard 

  • Like 1
Link to comment
Share on other sites

11 hours ago, MonaSaxPayne said:

60fps/4k with ray tracing should be the standard 

4K what? 1080p upscaled to 4K? lmao, because native 4K / locked 60fps / ray tracing / max settings is not the standard on the highest-end PCs

 

in reality, so many PS5 games having close-to-native 4K resolutions is actually impressive, even at 30fps; even on PC, most people don't play at native 4K

 

people REALLY need to stop calling everything 4K. 4K is 3840x2160, over 8 million native pixels, people. Sorry, but Returnal at 1080p upscaled to 4K on PS5 is not "4K", and Horizon 2's 1800p checkerboard performance mode, which works out to a little over 1080p in native pixels per frame, is not "4K" either. Just because a game outputs a 4K frame doesn't mean anything in the game actually renders at 4K.
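For anyone who wants the arithmetic behind that, here's a quick sketch (Python; the checkerboard line assumes roughly half the output pixels are shaded natively per frame, and the resolutions are the ones quoted above):

```python
# Native pixel counts behind the resolutions mentioned above.
# Assumption: checkerboard rendering natively shades roughly half the output
# pixels each frame and reconstructs the rest.

def pixels(width: int, height: int) -> int:
    return width * height

uhd_4k   = pixels(3840, 2160)         # 8,294,400
full_hd  = pixels(1920, 1080)         # 2,073,600
cb_1800p = pixels(3200, 1800) // 2    # ~2,880,000 shaded per frame

print(f"native 4K:           {uhd_4k:>9,} pixels")
print(f"native 1080p:        {full_hd:>9,} pixels ({full_hd / uhd_4k:.0%} of 4K)")
print(f"1800p checkerboard:  {cb_1800p:>9,} pixels shaded (~{cb_1800p / full_hd:.1f}x 1080p)")
```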

  • Like 3
Link to comment
Share on other sites

I'm expecting 30fps games to become the norm towards the end of the PS5's generation. Gotham Knights, though, doesn't 'earn' its excuse to be 30Hz, being a former cross-gen game and not demonstrating the kind of optimisation mastery that taps out the PS5's true power. I read on Twitter that some Rocksteady developer was blaming the Series S for this limitation, with production values stretched too thin for proper optimisation.

 

Mark Cerny designed the PS5 so that many of the bottlenecks of a traditional PC have been removed, across compute throughput, memory streaming and the CPU. The console is even easier to develop for than the PS4 was. 60fps should absolutely be the standard from developers in the coming years.

Edited by Eraezr
Link to comment
Share on other sites

5 minutes ago, Eraezr said:

I read on Twitter that some Rocksteady developer was blaming the Series S for this limitation, with production values stretched too thin for proper optimisation

The Series S has been getting a lot of flak lately over claims that it's holding back the generation, but looking at the specs I can't see why it couldn't run the game at 60fps. Then again, I'm not a big tech guy.

Link to comment
Share on other sites

So, a few takeaways after reading this thread in its entirety:

 

Firstly, that ridiculous lie from that Rocksteady developer about the Series S "holding games back"... guys, think about it: games are mostly still cross-gen at the moment, so if the Series S is "holding games back", then what about last-gen consoles??? Come on.

 

Now, regarding the bulk of the matter, I disagree with DF's assessment that Plague Tale Requiem is more reflective of what we can expect going forward because, like Gotham Knights, PTR apparently runs badly on high-end PCs and consoles as well. I don't think games that have clearly been rushed out poorly optimised should be used as examples, and I am almost certain that a 60fps patch will be coming for one or both of those games down the line. That being said, God of War Ragnarok was mentioned earlier as the gold standard and, while it's awesome that one of the graphics modes will be targeting 120fps, we have to remember that Ragnarok is a PS4 game at heart and was made with that last-gen console in mind. Therefore, it won't be as technically demanding of the PS5's hardware as future Sony games (which are rumoured to be current-gen only). It's a case of "wait and see", but I really hope there's a way to push gaming forward while maintaining higher frame rates, rather than it being a "one or the other" type of thing.

 

I also think folks sometimes say "I can't tell the difference" when what they really mean is "I don't care about the difference", which is not the same thing. Of course there's a difference; it's immediately noticeable when you switch from one to the other. Whether or not one cares, though, is subjective. I do care, and I always choose higher frame rates when given the option. The difference is, frankly, staggering. Going directly from the PS5 versions of Far Cry 6 and Scarlet Nexus to the PS4 versions was rough. They were so blurry and uncomfortable to play; it felt like my eyes were going. As an experiment, I challenge those who claim not to see the difference to play Ratchet and Clank: Rift Apart at 60fps for 3 hours and then switch to 30fps, and let's see how long you last before you switch it back lol

 

The only game where I could stomach the switch from high frame rates to 30fps was Control, and that's because the 30fps mode suited the motion blur and film grain aesthetic which, coupled with the design of The Oldest House and the generally muted colour tone, made the entire game look very cinematic. But this is an exception. Most games that have both modes look infinitely better at 60fps no matter what the genre is, because 30fps is a compromise rather than a design choice. That's why games like Assassin's Creed Odyssey unlock the frame rate on current-gen consoles with no way back unless you play them on PS4. Developers would ideally have you play their games at higher frame rates.

 

Lastly, I reject this tiresome notion that "graphics don't matter" and that the preoccupation with them is a young person thing. I've been gaming for over 30 years and I think it's reasonable to expect games to improve on what came before in all aspects, not just what each individual deems important. Our bar for games has changed, so yes, it makes sense that we demand more when it comes to graphics as well. The way a game looks ties into its overall presentation and our first impressions of it. Do great graphics necessarily guarantee a great game? No, but they make for a more pleasant experience than the alternative. Seemingly insignificant details like the aforementioned "blades of grass" add up when they're peppered throughout a game, making it more immersive. One of the first things I noticed about The Witcher 3, for instance, that made it truly stand out as next gen (at the time) was the way Geralt's ponytail moved when he ran, or the way the foliage reacted to the wind. That may seem minor, but stuff like that matters over time in time-sink games, and I can't wait to revisit the world of The Witcher 3 when the remaster is released... in 60 frames per second.

Edited by Vault-TecPhantom
  • Like 2
Link to comment
Share on other sites

3 hours ago, Vault-TecPhantom said:

Firstly, that ridiculous lie from that Rocksteady developer about the Series S "holding games back"... guys, think about it: games are mostly still cross-gen at the moment, so if the Series S is "holding games back", then what about last-gen consoles??? Come on

This is a poor argument, especially with current-gen games like GK. Where do you get this idea? It makes complete sense that a 4TF console can't compete with 10-13TF machines, and that it holds current-gen games back.

Your other points are 💯.

Edited by Markemmanuel
Link to comment
Share on other sites

2 hours ago, Markemmanuel said:

This is a poor argument, especially with current-gen games like GK. Where do you get this idea? It makes complete sense that a 4TF console can't compete with 10-13TF machines, and that it holds current-gen games back.

Your other points are 💯

 

What "current gen games"? Most games being released are cross gen, i.e. last gen games with current gen enhancements. In fact, somebody made a list of current gen exclusives the other day and the list was pitiful.  My point is, no one can say whether or not Series S is holding anything back because games are still being made for last gen consoles. Therefore, if developers have no problem releasing games on old consoles then they certainly should have no problem releasing it on the Series S.

 

And we've established that Gotham Knights is not a good example to use, since that game has issues in general. To be frank, it just seems like the developers on that game were incompetent. There is absolutely nothing I have seen from Gotham Knights that demands it be a current-gen exclusive, certainly not its graphics, which look worse than last-gen games that came before it. If Avengers can run on last-gen consoles, so can Gotham Knights. Also, Deathloop is a current-gen exclusive as well and it runs fine on the Series S at 60fps.

 

It's worth noting that the developer who made these comments later apologised for scapegoating the Series S for all his problems, and that Rocksteady recently lost its founders while the Suicide Squad game is still in development, so it looks like there are issues at that company in general.

Link to comment
Share on other sites

2 hours ago, Markemmanuel said:

This is a poor argument, especially with current-gen games like GK. Where do you get this idea? It makes complete sense that a 4TF console can't compete with 10-13TF machines, and that it holds current-gen games back.

Your other points are 💯.

 

Teraflops is a terrible measurement of overall performance (probably a terrible measurement in general, to be fair) because it doesn't tell you the whole story, certainly not anything important. It's kinda like saying your car has 500 BHP. Cool... but what does that tell you about its acceleration, or its handling, or its drag coefficient, or its fuel efficiency, etc.? It doesn't really mean anything without context. It's just a number that Microsoft themselves latched onto as a marketing gimmick to make it seem more important than it actually is, and something that console warrior fanboys hang onto to sling shit at each other on Twitter. It should be completely ignored.
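For what it's worth, the headline teraflops figure is basically just three spec-sheet numbers multiplied together, which is exactly why it says nothing about memory bandwidth, cache, or CPU. Here's a rough sketch using the commonly quoted GPU specs for these consoles:

```python
# How the headline TFLOPS figure is derived: shader ALUs x 2 ops per clock x clock (GHz).
# The specs below are the commonly quoted figures for each console's GPU;
# treat them as approximate, and note the PS5 clock is a variable peak.

def tflops(shader_alus: int, clock_ghz: float) -> float:
    # One fused multiply-add (2 floating-point ops) per ALU per clock.
    return shader_alus * 2 * clock_ghz / 1000.0

consoles = {
    "Xbox Series S": (1280, 1.565),   # 20 CUs x 64 ALUs
    "PS5":           (2304, 2.23),    # 36 CUs x 64 ALUs
    "Xbox Series X": (3328, 1.825),   # 52 CUs x 64 ALUs
}

for name, (alus, clock) in consoles.items():
    print(f"{name:>14}: ~{tflops(alus, clock):4.1f} TFLOPS")

# Nothing in that multiplication captures memory bandwidth, RAM amount, cache,
# or the CPU, which is why the number alone tells you so little.
```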

 

That said, the Series S does have some issues that will only become more apparent as the generation progresses. The main issue is memory, and this is something that's been brought up a good number of times by multiple developers and other people in the industry. As it stands, Microsoft has a mandate that games have to be developed for both the S and the X, but I can see them dropping this eventually once third-party devs start kicking up a fuss about it. They might try to continue the mandate internally for first-party studios, given how popular the S has been for them sales-wise, but that will inevitably result in gimped versions of games. Thankfully, the PS5 has no such worry because it's a single SKU. Microsoft really should've just gone with an all-digital Series X instead for $100 less, like Sony did with the PS5. Now they're caught in this weird situation that will cause a lot of headaches for developers in the next couple of years, once the cross-gen period ends and we start to get pure 'next-gen' titles that push these machines to the limit.

 

I say scrap the Series S and focus purely on the true 'next-gen' machines. We could do with one less resource hog if we're to stand any chance of 60fps performance modes sticking around going forward. 

  • Like 1
Link to comment
Share on other sites

On 10/29/2022 at 0:41 PM, The Investigator said:

people REALLY need to stop calling everything 4K. 4K is 3840x2160, over 8 million native pixels, people. Sorry, but Returnal at 1080p upscaled to 4K on PS5 is not "4K", and Horizon 2's 1800p checkerboard performance mode, which works out to a little over 1080p in native pixels per frame, is not "4K" either. Just because a game outputs a 4K frame doesn't mean anything in the game actually renders at 4K.

 

Doesn't matter if it's upscaled 4K or whatever. If it looks better than 1080p or 1440p, then people are happy.

DLSS on PC is fantastic, for example, even if it is not native 4K. But it makes it so that ray tracing is suddenly a viable option at 4K on weaker hardware.
Also, the differences between DLSS and native 4K are much less noticeable, if noticeable at all, than the differences in performance.
 

On 10/29/2022 at 0:37 AM, MonaSaxPayne said:

 

shouldn't have to choose in 2022 tbh

 

60fps/4k with ray tracing should be the standard 

 

you cannot expect this and, at the same time, expect consoles to cost 500 bucks and games only about 60-70 bucks.

The hardware necessary to get 60fps with RT in 4K (natively) on PCs right now is super expensive.

4K RT only got "viable" with the recent release of the 4090. That GPU is priced at an MSRP of $1599.

Edited by Sicho
  • Like 1
Link to comment
Share on other sites

30 FPS is still what I've played most of my life (GameCube to the present). I've never had a gaming PC, so 60 FPS and up as a standard is not my experience.

 

However, I would take 1440p/60fps as the bare minimum for virtually every game (even for calmer games) if it meant sacrificing shaders, draw distance, lighting, and some other graphical effects. Heck, I'd take a lot of those compromises and more for a 4K/60fps minimum.

 

1080p/60fps on a PS5 or Xbox Series X is quite laughable in most cases. They should be able to pull off more than that.

Link to comment
Share on other sites
