How much memory does the average game use today?

So far my game uses 60 megs of RAM. I estimate it will probably be a little over 100 megs by the time my game is finished. Is this a lot? The game isn't terribly complicated graphically; it's 2D and probably somewhere on the level of a game from the late '90s.
It isn't a lot. A standard computer today will have gigabytes of RAM.

A more important issue is download size. If you want people to download your game, then keeping the download size small is a good idea.
Right now the download is around 5.5 megabytes.

Probably will be double that when finished. Not too bad I guess.
Quote:Original post by Chris_6713
Right now the download is around 5.5 megabytes.

Probably will be double that when finished. Not too bad I guess.

Considering that GTA IV was a 17 GB download over Steam, I guess 11 MB isn't too bad :)

Seriously though, RAM usage isn't much of a concern anymore. Usually people will have more physical RAM than the virtual address space an application can address under a 32-bit OS in its standard configuration, and that's around 2 GB.
Define "average".
If you mean Crysis, Assassin's Creed, or GTA IV, then less than 2 GB is fine [lol]

60 MB of RAM is fine.
You should look at Valve's hardware survey to define your target market, and then your primary OS.

How much RAM you can use depends on how much is available.
Windows XP takes a full 128 MB for itself. With some background programs running, it can take around 200 MB.
If your target is users with at least 512 MB of RAM, then you have around 312 MB to use.
If the target is 256 MB of RAM, you will have 56 MB free. The OS will move the background programs to the page file, though, so you should be able to use more than 56 MB of physical RAM.

If your main target is Windows Vista, it takes 512 MB for itself.
However, Vista users with only 512 MB already suffer from slow response times everywhere.

60 MB is more than fine (unless you're making a standard Tetris clone). What's reasonable also depends on the kind of game you're making.

Worry about memory leaks rather than memory usage.
And, like others said, worry about download size or whatever else matters in your situation.

Everything is relative

"640kb of memory ought to be enough for anybody" -Bill Gates
Edit: Argh, a quick Google search for that B. Gates quote turned up that it's a hoax. I've been fooled for so long...
Anyway, 60 MB is fine ;)
If you're developing for PC, I wouldn't worry too much, but consoles have no paging or virtual memory. If you run out of physical space, you're toast.

With that said, even the Wii has more memory than 60 MB (not too much more, though). Handhelds will obviously have less.
Quote:Original post by Yann L
Seriously though, RAM usage isn't much of a concern anymore. Usually people will have more physical RAM than the virtual address space an application can address under a 32-bit OS in its standard configuration, and that's around 2 GB.
(emphasis added)

For the exact reason that you've given, that's not necessarily true, sadly. In particular, if you have many DLLs and need to either allocate larger blocks of memory or map larger files, you can get a nasty surprise. Random image base generation (which is advertised as a good thing, since it saves relocations) produces even worse results.
Also, if you VirtualAlloc in blocks of 4 KB, believing that this is nice and all is good because it's the page size, you may eventually discover that your 4 KB pages are placed at 64 KB boundaries. If you've done a few of these, that's the moment you start cursing.
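You can see the granularity for yourself with something like this (untested Win32 sketch; the 64 KB figure is SYSTEM_INFO::dwAllocationGranularity):

    #include <windows.h>
    #include <cstdio>

    int main()
    {
        SYSTEM_INFO si;
        GetSystemInfo(&si);
        // Typically prints 4096 and 65536.
        printf("page size: %lu, allocation granularity: %lu\n",
               si.dwPageSize, si.dwAllocationGranularity);

        // Two "4 KB" allocations...
        char* a = (char*)VirtualAlloc(NULL, 4096, MEM_RESERVE | MEM_COMMIT, PAGE_READWRITE);
        char* b = (char*)VirtualAlloc(NULL, 4096, MEM_RESERVE | MEM_COMMIT, PAGE_READWRITE);

        // ...usually land 64 KB apart, because every reservation is rounded
        // up to the allocation granularity.
        printf("distance between them: %td bytes\n", b - a);

        VirtualFree(a, 0, MEM_RELEASE);
        VirtualFree(b, 0, MEM_RELEASE);
        return 0;
    }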

Until not long ago, I memory-mapped a 600 MB data file because of the obvious advantages. Until... one day it failed. It turned out that the address space was so badly fragmented by a couple of smaller mappings, half a dozen heaps, thread stacks, and DLLs that there were no contiguous 600 MB of address space left (the machine had around 2.5 GB of unused physical RAM at the time, but to no avail).
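For reference, the call that blows up in that situation looks like this (untested sketch; "data.bin" is just a placeholder name):

    #include <windows.h>
    #include <cstdio>

    int main()
    {
        HANDLE file = CreateFileA("data.bin", GENERIC_READ, FILE_SHARE_READ,
                                  NULL, OPEN_EXISTING, FILE_ATTRIBUTE_NORMAL, NULL);
        if (file == INVALID_HANDLE_VALUE) return 1;

        HANDLE mapping = CreateFileMappingA(file, NULL, PAGE_READONLY, 0, 0, NULL);
        if (!mapping) { CloseHandle(file); return 1; }

        // Mapping a view needs one contiguous range of address space,
        // no matter how much physical RAM is free. On a fragmented
        // 32-bit address space, this is the call that fails:
        void* view = MapViewOfFile(mapping, FILE_MAP_READ, 0, 0, 0); // whole file
        if (!view)
            printf("MapViewOfFile failed (error %lu)\n", GetLastError());
        // A fallback is to map smaller windows of the file on demand.

        if (view) UnmapViewOfFile(view);
        CloseHandle(mapping);
        CloseHandle(file);
        return 0;
    }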
I'm usually more concerned with correct and smart RAM usage rather than the exact number. This means making sure memory is properly freed so that it's returned to the system (i.e., it doesn't leak), that resources aren't needlessly duplicated, that resources are only in memory when they're being used or pre-cached for quick access, and that large buffers and/or standard containers are the right ones for the job and are correctly sized.

Once all that is satisfied, if you still feel a need to reduce RAM usage, then you have to start looking at more efficient data structures or pre-computing less where you can get away with it from a performance standpoint.
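To illustrate the "not needlessly duplicated" part, here's a minimal sketch of a cache that hands out shared references instead of loading the same file twice (untested C++11; Texture and LoadTextureFromDisk are made-up names):

    #include <map>
    #include <memory>
    #include <string>

    struct Texture { /* pixel data, dimensions, ... */ };

    // Hypothetical loader, stands in for your actual file/decode code.
    std::shared_ptr<Texture> LoadTextureFromDisk(const std::string& path);

    class TextureCache
    {
    public:
        std::shared_ptr<Texture> Get(const std::string& path)
        {
            // Reuse the existing texture if something still holds it.
            auto it = cache_.find(path);
            if (it != cache_.end())
                if (std::shared_ptr<Texture> existing = it->second.lock())
                    return existing;

            std::shared_ptr<Texture> fresh = LoadTextureFromDisk(path);
            // weak_ptr, so the cache alone never keeps a texture in memory.
            cache_[path] = fresh;
            return fresh;
        }

    private:
        std::map<std::string, std::weak_ptr<Texture>> cache_;
    };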

Valve's survey is great for the gamer segment of the market, but you might also like to check out the Unity Web Player stats for something that might be closer to the indie market.
Quote:Original post by samoth
For the exact reason that you've given, that's not necessarily true, sadly. In particular, if you have many DLLs and need to either allocate larger blocks of memory or map larger files, you can get a nasty surprise.

Memory fragmentation has nothing to do with total memory usage. Fragmentation does not depend on how much memory you allocate, but on how you do it. You can make the entire (32-bit) virtual address space unusable by allocating only very small amounts of memory, if you get the allocation patterns wrong.
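To make that concrete, here's an untested illustration (build it as a 32-bit process): reserve the address space in 64 KB pieces, release every other one, and a single 1 MB reservation will likely fail even though half the space is free again.

    #include <windows.h>
    #include <cstdio>
    #include <vector>

    int main()
    {
        // Reserve address space in 64 KB chunks until nothing is left.
        std::vector<void*> blocks;
        for (;;)
        {
            void* p = VirtualAlloc(NULL, 64 * 1024, MEM_RESERVE, PAGE_NOACCESS);
            if (!p) break;
            blocks.push_back(p);
        }

        // Release every other chunk: half the address space is free again...
        for (size_t i = 0; i < blocks.size(); i += 2)
            VirtualFree(blocks[i], 0, MEM_RELEASE);

        // ...but no single hole is likely to be larger than 64 KB, so:
        void* big = VirtualAlloc(NULL, 1024 * 1024, MEM_RESERVE, PAGE_NOACCESS);
        printf("1 MB reservation afterwards: %p\n", big); // most likely NULL
        return 0;
    }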

That's why every serious game or application should always use a custom memory allocator which reduces (or avoids) fragmentation. But this is an entirely different question than the one the OP asked.
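The classic building block for that is a fixed-size pool: carve one big block into equal slots on a free list, so allocations of that size never scatter across the address space. An untested sketch:

    #include <cassert>
    #include <cstddef>
    #include <cstdlib>

    class FixedPool
    {
    public:
        FixedPool(size_t slotSize, size_t slotCount)
            : memory_((char*)std::malloc(slotSize * slotCount)), freeList_(NULL)
        {
            assert(slotSize >= sizeof(Node)); // a slot must hold a free-list link
            // Thread every slot onto the free list.
            for (size_t i = 0; i < slotCount; ++i)
            {
                Node* n = (Node*)(memory_ + i * slotSize);
                n->next = freeList_;
                freeList_ = n;
            }
        }

        ~FixedPool() { std::free(memory_); }

        void* Alloc()
        {
            if (!freeList_) return NULL; // pool exhausted
            Node* n = freeList_;
            freeList_ = n->next;
            return n;
        }

        void Free(void* p)
        {
            Node* n = (Node*)p;
            n->next = freeList_;
            freeList_ = n;
        }

    private:
        struct Node { Node* next; };
        char* memory_;   // one contiguous slab for the whole pool
        Node* freeList_;
    };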

And about random DLL bases: you shouldn't use random ones. You should select a distinct area where you'd like to put all your DLLs, and then try to organize them as sequentially as possible. If you get it slightly wrong, it doesn't matter. If you overestimate, you get a little bit of fragmentation, but nothing too serious. If you underestimate, the OS will rebase.
