this post was submitted on 23 Nov 2024
121 points (89.0% liked)

PC Gaming

[–] [email protected] 37 points 7 months ago* (last edited 7 months ago) (2 children)

I'm going to sound a little pissy here but I think most of what's happening is that console hardware was so limited for such a long time that PC gamers got used to being able to max out their settings and still get 300 FPS.

Now that consoles have caught up and cranking the settings actually lowers your FPS like it used to, people are shitting themselves.

If you don't believe me then look at these benchmarks from 2013:

https://pcper.com/2013/02/nvidia-geforce-gtx-titan-performance-review-and-frame-rating-update/3/

https://www.pugetsystems.com/labs/articles/review-nvidia-geforce-gtx-titan-6gb-185/

Look at how spiky the frame time graph was for Battlefield 3. Look at how, even with triple SLI Titans, you couldn't hit a consistent 60 FPS in maxed-out Hitman Absolution.

And yeah, I know high-end graphics cards are even more expensive now than the Titan was in 2013 (due to the ongoing parade of BS that's been keeping GPU prices high), but the systems in those reviews are close to the highest-end hardware you could get back then. Even if you were a billionaire you weren't going to be running Hitman much faster (you could put one more Titan in SLI, which had massively diminishing returns, and maybe overclock everything).

If you want to prioritize high and consistent framerate over visual fidelity / the latest rendering tech / giant map sizes then that's fine, but don't act like everything was great until a bunch of idiots got together and built UE5.

EDIT: the shader compilation stuff is an exception. Games should not be compiling shaders during gameplay. But that problem isn't limited to UE5.

[–] [email protected] 12 points 7 months ago* (last edited 7 months ago)

The issue is not that games' performance requirements at reasonable graphics settings are absolutely destroying modern hardware. The issue is that once you set the game to low settings, it still performs like shit while looking worse than a 10-year-old game.

[–] [email protected] 10 points 7 months ago (1 children)

EDIT: the shader compilation stuff is an exception. Games should not be compiling shaders during gameplay. But that problem isn’t limited to UE5.

You can preload them if you want, but that leads to load screens. It's a developer issue, not an Unreal one

[–] [email protected] 5 points 7 months ago (1 children)

No matter what, you've got to compile the shaders, either on launch or when needed. The game should be caching the results of that step, though, so the next time it's needed it can be skipped entirely.
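The compile-once, cache-on-disk idea this comment describes can be sketched in a few lines. This is a minimal illustration, not any engine's real API: `compile_shader` below is a hypothetical stand-in for the driver's expensive compile step, and the cache directory name is invented.

```python
import hashlib
from pathlib import Path

CACHE_DIR = Path("shader_cache")

def compile_shader(source: str) -> bytes:
    # Hypothetical stand-in for the expensive driver compile step.
    return source.encode("utf-8")[::-1]

def get_shader(source: str) -> bytes:
    """Compile on first use, then reuse the cached binary on later runs."""
    CACHE_DIR.mkdir(exist_ok=True)
    key = hashlib.sha256(source.encode("utf-8")).hexdigest()
    cached = CACHE_DIR / f"{key}.bin"
    if cached.exists():
        return cached.read_bytes()       # cache hit: compilation skipped entirely
    binary = compile_shader(source)      # cache miss: compile exactly once
    cached.write_bytes(binary)
    return binary
```

A real engine would also fold the driver version and GPU model into the cache key, since a driver update can invalidate previously compiled binaries.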

[–] [email protected] 2 points 7 months ago

GPUs do cache them.

That's why compiling on launch/loading screens works.

[–] [email protected] 33 points 7 months ago (1 children)

I don't agree with this at all. I'm sure there are projects where it wasn't a great choice, but I've had no consistent problems with UE5 games, and in several cases the games look and feel better after switching -- Satisfactory is a great example.

[–] [email protected] 12 points 7 months ago* (last edited 7 months ago) (2 children)

Dead by Daylight switched to UE5 and immediately had noticeably bad performance.

Silent Hill 2 Remake is made in UE5 and also has bad performance and stuttering. Though Bloober is famously bad at optimization, so it's possible this is just Bloober being Bloober.

STALKER 2 is showing some questionable performance even on high-end PCs, and that is also made in UE5.

Now, just because the common denominator in all these examples is UE5 doesn't mean that UE5 is the cause, but it is certainly quite the coincidence.

[–] [email protected] 8 points 7 months ago

It's the responsibility of the game developer to ensure their game performs well, regardless of engine choice. If they release a UE5 game that suffers from poor performance, that just means they needed to spend more time profiling and optimising their game. UE5 provides a mountain of tooling for this, and developers are free to make engine-side changes, since the full engine source is available to licensees.

Of course Epic should be doing what they can to ensure their engine is performant out of the box, but they also need to keep pushing technology forward, which means things may run slower on older hardware. They don't define a game's minspec hardware, the developer does.

[–] [email protected] 3 points 7 months ago

Subnautica 2 is going to be UE5 also, I'm already worried about it.

[–] [email protected] 27 points 7 months ago (2 children)

I've seen a lot of talented devs explain that UE5 does give devs the tools to pre-cache shaders, but since AAA studios rush everything, it ends up being a low priority compared to maximizing the graphics. It's not hard to believe considering games are pushed out the door with game-breaking bugs nowadays.

But it does raise the question of why the engine doesn't do that itself. UE4 games ran like a dream, but this generation has felt like nothing but stuttering and 20 minutes of compiling shaders every time you open a game for the first time...
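The trade-off this comment is pointing at — precompile everything behind a load screen versus compile on first use and hitch mid-game — might look like the sketch below. The 50 ms compile cost is an invented illustration, not a measured figure, and the function names are hypothetical.

```python
import time

def compile_shader(name: str) -> bytes:
    time.sleep(0.05)                 # pretend compilation costs 50 ms
    return name.encode("utf-8")

def draw(cache: dict, material: str) -> bytes:
    # Lazy path: first use of a material compiles mid-frame -> a visible hitch.
    if material not in cache:
        cache[material] = compile_shader(material)
    return cache[material]

def precompile(cache: dict, materials: list[str]) -> None:
    # Eager path: pay the whole cost up front, during a load screen,
    # so draw() never hits the compile branch during gameplay.
    for m in materials:
        draw(cache, m)
```

The catch, as the comment implies, is that the engine can't always enumerate every shader permutation a game will use up front, which is why this tends to end up as per-game developer work rather than something the engine does automatically.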

[–] [email protected] 6 points 7 months ago

20 minutes of compiling shaders every time you open a game for the first time...

Shiiit, Stalker 2 be compiling shaders every time I launch it!

[–] [email protected] 1 points 7 months ago

A lot of UE4 games had big issues with shader compilation stutter. This is nothing new.

[–] S_H_K 25 points 7 months ago

I think the main problem is how the industry became a crunching machine. Unreal has been sold as a one-size-fits-all solution, whereas there are obviously things it does well and others it doesn't.

[–] [email protected] 15 points 7 months ago* (last edited 7 months ago)

I rarely have a good time with UE4/UE5 games; performance is often rough, and while the graphics are 'better' on a technical level, I often don't think they look as pleasant or feel as immersive as older games.

[–] [email protected] 11 points 7 months ago

Most games made in UE are AAA games, where every A stands for more scam, jankiness, and less value overall. Very rushed, no love, made to barely work on "my machine" (4090). Many Unity games are smaller cash grabs.

Most devs that fulfill at least one criterion well (e.g. gameplay, performance, stability) are either small studios with their own engine (4A Games, Croteam, Minecraft) or publishers with one banger per 5 years or so: Valve (lost it with CS2 tho), Rockstar. Because those devs put either love, time, or both into their games.

[–] [email protected] 8 points 7 months ago

means you won't do great, as shaders are being fully recompiled all over again, and all that heavy lifting is being done on the fly.

Luckily, when using DXVK, they get cached beforehand.

[–] [email protected] 3 points 7 months ago

People are finally catching on.