Why are computer games so huge these days?

8 GB - average Debian install on a server

20 GB - my Arch install

30 GB - a backup of a postgres database for three web services I admin, which have been running for almost a decade

60 GB - Divinity: Original Sin 2

@wolf480pl that's not fair, you didn't account for the data from Arch's upgrades. ~500 MB per week.

@username well ok, I didn't count /var, but still, if you download 500MB per upgrade and keep the last 3 upgrades in the pacman cache, it's like 1.5GB more

@wolf480pl Have you ever heard of the new Call of Duty: MW? 250GB of game... It's ludicrous.

@captainepoch @wolf480pl And every update is easily 30GB too .. And they update often.. Disgusting..

@wolf480pl Static libraries and high resolution game data?

I am annoyed that tensorflow + cuda + friends take up around 10 to 20 GB. Add pacman cache of these packages and my root partition is nearly full.

@ashwinvis have you considered using a separate partition for /var?

Yes I have, but it needs to be mounted all the time, right? Esp. for running flatpak apps. Maybe a network file system for /var might help

Perhaps in the long run. For now I remove these specific packages using paccache --remove --keep 0 ...

@wolf480pl Never mind computer games. Why is Debian so large?? 8 gigs? I remember when 2 was about average.

@JordiGH @wolf480pl debian isn't that large, I think that must be debian + all the additional services that are hosted there.

@kline @JordiGH
correction, / is 5.3GB not counting /home.

The main services (their data and python venvs) are on a separate partition.

Other than that, there's nginx, ssh, ntp, rabbitmq, memcached, postgresql client...

@wolf480pl @JordiGH so subtracting other services, dependencies, and data, we're probably talking 4GB.

they've forgotten how to compress their games.

Less-compressed or uncompressed textures take less compute power to decompress

@r000t @wolf480pl compute power should be damn near as free as storage space these days, though.

Now do it thousands of times per second as shit is copied in and out of vram

@r000t @wolf480pl old consoles had no problem with this, why should modern computers?
@r000t @tn5421 @wolf480pl wait till games become a thing on Electron. That will be the ultimate bullshit.
@GNUxeava @r000t @wolf480pl

i can see it now and i want to die

@tn5421 @GNUxeava @r000t @wolf480pl

This is legitimately something I considered for my little wasm game... It's hard not to want your game to be universally accessible.

How do you get your windows friends to even try your game?

I decided against it after a long second and a half of thought, though, but others may not have my willpower

All the launchers are either Electron or Chromium Embedded Framework so we're already halfway there!
@tn5421 @wolf480pl

@wolf480pl but muh 16K resolution!

In all seriousness, though, I would suspect that a decent part of the games' size goes into resources like images, music, cutscenes and other sound effects.

@wolf480pl huge resolution textures and 4k ingame cinematics.

And maybe uncompressed sfx and music in some cases.
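For scale, a sketch of what uncompressed audio costs versus a lossy encode. The 160 kbit/s Vorbis figure is an illustrative assumption (a common quality setting), not something from the thread:

```python
# Rough size of raw PCM audio vs a lossy encode, to show why shipping
# uncompressed WAVs inflates an install.

def pcm_bytes(seconds, sample_rate=44100, channels=2, bytes_per_sample=2):
    """Bytes for raw PCM: CD-quality 16-bit stereo by default."""
    return seconds * sample_rate * channels * bytes_per_sample

one_hour = 3600
raw_mb = pcm_bytes(one_hour) / 1e6      # 16-bit stereo 44.1 kHz PCM
ogg_mb = one_hour * 160_000 / 8 / 1e6   # assumed ~160 kbit/s Vorbis

print(round(raw_mb), round(ogg_mb))  # 635 72 (MB per hour)
```

So every hour of soundtrack and dialogue shipped as raw PCM costs roughly 9x what a decent lossy encode would.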

Assuming you want an actual answer and not just memes - the reason this is the case with larger AAA titles is that they pack absurdly oversized textures into their games, and sometimes even the same textures repeated at a lower resolution.

Couple this with intricate sound middleware like Wwise and FMOD, which can lead games to ship many *uncompressed* audio files.

There's also general developer laziness, packing console assets into a lazily done PC port of a game for example.

Last but definitely not least - stuff like Denuvo and VMProtect can bloat a game's executable code.

I definitely don't agree with some of the practices AAA studios engage in in this regard - Rainbow 6 Siege is like 100 GB+ if you install the "high resolution" pack. Why? What's the point? >_>
@wolf480pl one reason I've heard is that a lot of resources are duplicated to avoid seek time. Basically, if the same piece of texture is used in two different game levels, it will be included as two separate copies.
@tn5421 @wolf480pl that shouldn't be an issue in the first place but it is. I heard this about relatively recent games released within the last 5 years.

@wolf480pl some games are breaching the 100GB boundary.

It's dumb.

Especially considering there are contemporaries who are able to provide a quality experience while staying under the 20GB limit.

@wolf480pl All the art assets are really large, and that's going to be the bulk of it. The engine itself is probably smaller than any of these.

That being said, I remember Doom 2016 having, like, 20GB patches in the form of engine updates, so I might be totally wrong there 😅
