What is the modern go-to server solution for a client-server multiplayer mode?


Are there specialized services (including paid ones) or frameworks or do people usually build custom solutions from scratch? If custom then is a setup in the cloud like AWS's EC2 or DigitalOcean's VPS or Firebase + some server framework common among game developers? If there are no particularly popular choices, what do people usually do / what would you do?

I am asking this both from the perspective of a person who wants to try to build a multiplayer game and from the perspective of a person who wants to potentially help the community should it turn out that multiplayer is a big pain point for game developers.



Are there any? Yes, plenty.

One extremely common one right now is Valve's Steamworks. Amazon's Lumberyard is another, and can be cheaper/easier than building directly on their raw EC2 boxes. Microsoft, Sony, and Nintendo have their own libraries. Unreal and Unity have their own networking solutions which integrate with the consoles. There are many libraries like ENet, RakNet, and more. Major companies often develop their own suites of systems for internal use.

For simple games it is still possible to build your own, but if you want to start including all the major features (modern encrypted communications, voice chat, automatic NAT punchthrough and UPNP autoconfiguration, advanced bitpacking and compression techniques, etc) you're going to be better off using an existing system. It is certainly possible to develop all those systems on your own, but then you'll be spending several years building a communications library rather than building a game.
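For a sense of what using one of these libraries looks like in practice, here is a minimal sketch of a server loop built on ENet's C API (one of the libraries mentioned above). The port number, peer count, and timeout are arbitrary choices for this example and error handling is trimmed; ENet handles connection management, channels, sequencing, and optional reliability so you don't have to:

    #include <enet/enet.h>
    #include <cstdio>

    int main()
    {
        if (enet_initialize() != 0)
            return 1;

        ENetAddress address;
        address.host = ENET_HOST_ANY;   // listen on all interfaces
        address.port = 7777;            // arbitrary port for this sketch

        // Up to 32 peers, 2 channels, no bandwidth limits.
        ENetHost* server = enet_host_create(&address, 32, 2, 0, 0);
        if (server == nullptr)
            return 1;

        bool running = true;            // a real server would flip this on shutdown
        while (running) {
            ENetEvent event;
            // Pump the network; 10 ms timeout keeps the loop responsive.
            while (enet_host_service(server, &event, 10) > 0) {
                switch (event.type) {
                case ENET_EVENT_TYPE_CONNECT:
                    std::printf("peer connected\n");
                    break;
                case ENET_EVENT_TYPE_RECEIVE:
                    // Game-specific packet handling would go here.
                    enet_packet_destroy(event.packet);
                    break;
                case ENET_EVENT_TYPE_DISCONNECT:
                    std::printf("peer disconnected\n");
                    break;
                default:
                    break;
                }
            }
            // ...run one server simulation tick here...
        }

        enet_host_destroy(server);
        enet_deinitialize();
        return 0;
    }

Even in this toy, notice how much the library hides: retransmission, sequencing, connection timeouts, and packet fragmentation are ENet's problem, not yours.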

What the server looks like depends a lot on what the game looks like.

An asynchronous multiplayer game like Clash of Clans or Backyard Monsters will have fundamentally different architecture than an intense twitch-sensitive game like Battlefield. And a shooter like Overwatch will have a different architecture than a click-based MMORPG like World of Warcraft. And an MMORPG will have a different architecture than a turn-based game like Magic Arena.

Then the question is whether you're “big single world” or “many small levels,” and whether you're “player hosted” or “central servers only.”

Multiplayer is actually, 100%, a big pain point for developers. It requires the entire game architecture and implementation to be instrumented for networking, from the ground up, to mask lag, to avoid client cheating, and to enforce efficient network and server usage. The way to solve this is to build networking into the game engine. Unreal Engine provides good support for this. Unity, not as much, although they're slowly improving. In other engines, you'll have to lay more groundwork, specific to how your game thinks about entities, state, and simulation.
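As a rough illustration of what "instrumented for networking from the ground up" means, here is a sketch of the server-authoritative pattern most engines steer you toward: clients send only inputs, the server simulates at a fixed tick (which is also where cheat prevention lives), and snapshots flow back for clients to interpolate and predict against to mask lag. All type and function names here are invented for the example and don't come from any particular engine:

    #include <cstdint>
    #include <unordered_map>
    #include <vector>

    // Invented types for illustration only.
    struct ClientInput { uint32_t tick = 0; float moveX = 0, moveY = 0; bool fire = false; };
    struct PlayerState { float x = 0, y = 0; int health = 100; };

    class GameServer {
    public:
        // Clients never send positions, only inputs; the server is authoritative,
        // which is the basic defence against client-side cheating.
        void OnClientInput(int clientId, const ClientInput& input) {
            pendingInputs_[clientId].push_back(input);
        }

        // Called at a fixed rate, e.g. 30 or 60 Hz.
        void Tick(uint32_t tick) {
            for (auto& [clientId, inputs] : pendingInputs_) {
                for (const ClientInput& in : inputs)
                    ApplyInput(players_[clientId], in);   // validate, then simulate
                inputs.clear();
            }
            BroadcastSnapshot(tick);   // clients interpolate between snapshots
        }

    private:
        void ApplyInput(PlayerState& p, const ClientInput& in) {
            // A real server would clamp and validate inputs here before applying them.
            p.x += in.moveX * kSpeed * kTickDt;
            p.y += in.moveY * kSpeed * kTickDt;
        }
        void BroadcastSnapshot(uint32_t /*tick*/) { /* serialize state and send */ }

        static constexpr float kSpeed = 5.0f;
        static constexpr float kTickDt = 1.0f / 30.0f;
        std::unordered_map<int, PlayerState> players_;
        std::unordered_map<int, std::vector<ClientInput>> pendingInputs_;
    };

The point is that the input/snapshot split, the fixed tick, and the validation hooks have to exist in the core simulation loop; they are very hard to bolt onto a game that was written single-player first.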

Now, can you be “a big help” when you've never done it before? I hate to say it, but, probably not. Each genre has its own approach, and each engine (or custom game codebase) has its own special niche where it fits best, versus where it goes against what you need to do. Once you have seen the insides of more than one game/engine using more than one networking approach, you might start being able to give general advice, though, so keep researching! And, most importantly, trying real code, for real applications! It will probably not work all that great in the beginning, but that is what experience is all about.

enum Bool { True, False, FileNotFound };

Big-world multiplayer is still not something that Just Works, although UE5 is trying to get there. UE5 has some kind of content streaming system for big worlds. Anyone tried that yet?

My own interest is in big virtual worlds that allow building. Nothing in that space is really good yet. Only the voxel worlds (Minecraft, Roblox, Dual Universe) seem to scale well, and they look blocky. Improbable's SpatialOS looked promising, but turned out to be really expensive to operate; you have to host on Google's cloud, for one thing. The first four SpatialOS games, including Worlds Adrift and Nostos, all shut down due to high operating costs. The indie free-to-play, monetize-later model does not work well with high server costs.

Improbable, lacking paying customers but having accumulated way too much venture capital, started their own in-house game studio. The first result is Scavengers. This is yet another post-apocalyptic first person shooter. Someone who likes FPS, please go play it and report back. Supposedly they got 9000 users in the same shard in testing. Unclear if you can travel long distances through a big world. Gameplay is infantry combat, so movement is slow.

UE5 has some kind of content streaming system for big worlds. Anyone tried that yet?

As far as I understand it, it's an enhancement on top of World Composition, where you don't need to manually slice things into areas yourself. World Composition, in turn, was an enhancement on top of Level Streaming, helping make and manage multiple sub-levels for your world.

More importantly, this is for content-streaming-from-disk, not content-streaming-from-network. Thus, it may make it easier and more practical to build a large world that loads/unloads bits as you move around the world, but that's only really helpful on the client. On the server, you either need to do a seamless-handoff implementation between zone servers, or fit all the important (e.g., collision and physics) geometry in RAM and hope not too many players join in a single instance.

Perhaps surprisingly to an outsider, Roblox isn't particularly voxelized in its physics. The terrain system, and terrain-based lighting, are voxel based, but the physics is a custom rigid-body simulation engine. A game that runs a tornado through a village of bricks will have 40,000 simulated pieces, all replicating over the network, and the only reason that works is that they've spent many man-years profiling and optimizing how that works. Still, this is all “vertical” scalability, not “horizontal” scalability: when you hit the limit, you split into multiple levels that players travel between; you can't “just add servers” to simulate more stuff in the same place.

There was a lot of hoopla in the early 2000s around seamless-zoning games, all the way from Asheron's Call to There.com. All those games implemented variations on a theme of “each area server has proxies for objects that are nearby on the other servers,” and some variation of forwarding messaging between proxies and authoritative instances. (I think Spatial does the same thing, btw.) And all of them have the same two problems:

  1. The proxies end up needing to be very large if you want to support long-distance interactions, like sniper rifles, or just regular rifles with realistic ballistics. This leads to a massive N×M explosion in proxies, which in turn loads up the CPU on all zone servers that are even remotely close to the interacted-with entities. And, for the person shooting the sniper rifle, every entity is potentially an interacted-with entity.
  2. There is a round-trip lag for the proxies. E.g., on step X, a proxy is interacted with, and sends a message to its authority. This message can't arrive and be processed until step X+1. With server tick rates of 30 Hz, this very quickly became noticeable. A server tick rate of 100 Hz might make that better, but then you add an additional constant factor on top of the N×M proxy problem. (Both problems are illustrated in the sketch after this list.)
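Here is a toy sketch, with invented names, of the proxy bookkeeping described above. It is only meant to show where the two problems come from, not how any particular engine implemented it:

    #include <cstdint>
    #include <unordered_map>
    #include <vector>

    using EntityId = uint64_t;

    struct Proxy {
        EntityId id = 0;
        int      authorityZone = 0;   // the zone server that owns the real entity
        float    x = 0, y = 0;        // last replicated state
    };

    class ZoneServer {
    public:
        ZoneServer(int zoneId, float interestRadius)
            : zoneId_(zoneId), interestRadius_(interestRadius) {}

        // Problem 1: the larger interestRadius is (sniper rifles, realistic
        // ballistics), the more entities from neighbouring zones must be proxied
        // here, so every border zone pays CPU and bandwidth for the N x M overlap.
        bool WantsProxyOf(float x, float y) const {
            return DistanceToMyBorder(x, y) < interestRadius_;
        }

        // Problem 2: an interaction with a proxy is not resolved locally; it is
        // forwarded to the authoritative zone, and the result comes back at least
        // one server tick later (33 ms at 30 Hz, before any network latency).
        void InteractWithProxy(EntityId id, const std::vector<uint8_t>& msg) {
            const Proxy& p = proxies_.at(id);
            ForwardToZone(p.authorityZone, id, msg);   // the round trip starts here
        }

    private:
        float DistanceToMyBorder(float, float) const { return 0.0f; }            // stub
        void  ForwardToZone(int, EntityId, const std::vector<uint8_t>&) const {} // stub

        int   zoneId_;
        float interestRadius_;
        std::unordered_map<EntityId, Proxy> proxies_;
    };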

The “networking” bit of “distributing large built structures” is, by comparison, more straightforward to solve. (But see above about Roblox – this doesn't come for free!)

There's a reason approximately all successful MMOs end up with “instanced dungeons” and “zone travel,” even though they sometimes hide it really well through game design. (Flying over oceans between islands, or whatever.) Not only is the technology more expensive to operate, but the game design is actually harder. When players are definitely in one particular zone, the game design can take that into account and naturally keep various game elements contained. When the world is fully open, players would expect gameplay to spread out over the entire world. As a kitschy example: a player kiting a dragon and running across the entire map should presumably pull the dragon along with them, which would then leave the dragon's hoard unguarded. (And all the counter-designs for that end up with other trade-offs.)

It's games. They're supposed to be fun. Technology isn't fun on its own, and, perhaps more controversially, FREEDOM TO ROAM in a multi-player world isn't fun on its own. Play a single-player game if you want that.

enum Bool { True, False, FileNotFound };

There was a lot of hoopla in the early 2000s around seamless-zoning games, all the way from Asheron's Call to There.com. All those games implemented variations on a theme of “each area server has proxies for objects that are nearby on the other servers,” and some variation of forwarding messaging between proxies and authoritative instances. And all of them have the same two problems:

  1. The proxies end up needing to be very large…
  2. There is a round-trip lag…

Interestingly, Second Life does not do it that way. There are no proxies. If one region needs to find out something about the next one, the regions have to talk.

Cross-region target practice

Shooting the bow causes a temporary object to be created, with all the features of any other object. It's given a velocity, and the physics engine takes over. The arrow reaches the region boundary. The simulator running the region it is leaving packages up the object, including the byte code of any scripts it may contain, and ships it over the network to the gaining region. The gaining region re-creates the object, gives it the position and velocity it had in the previous region, and turns it loose. The arrow hits the target, and there's a region-local collision, to which arrow and target react.
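In code terms the handoff looks roughly like the sketch below. All names are invented for illustration (Second Life's actual simulator code isn't public in this form), so treat it as a paraphrase of the description above:

    #include <cstdint>
    #include <string>
    #include <vector>

    constexpr float kRegionSize = 256.0f;   // a Second Life region is 256 m on a side

    struct SimObject {
        uint64_t             id = 0;
        float                x = 0, y = 0, z = 0;    // region-local position
        float                vx = 0, vy = 0, vz = 0; // velocity carried across the border
        std::vector<uint8_t> scriptBytecode;         // scripts travel with the object
    };

    // Hypothetical plumbing, stubbed so the sketch is self-contained.
    std::vector<uint8_t> Serialize(const SimObject&) { return {}; }
    SimObject Deserialize(const std::vector<uint8_t>&) { return SimObject{}; }
    void SendToSimulator(const std::string&, const std::vector<uint8_t>&) {}
    void SpawnAndSimulate(const SimObject&) {}

    // Losing region: package up the object, scripts and all, and ship it to the
    // simulator running the gaining region.
    void HandOffEast(const SimObject& obj, const std::string& neighbourAddr) {
        SendToSimulator(neighbourAddr, Serialize(obj));
    }

    // Gaining region: re-create the object with the position and velocity it had,
    // shifted into local coordinates, and let the local physics engine take over.
    void OnObjectArrived(const std::vector<uint8_t>& blob) {
        SimObject obj = Deserialize(blob);
        obj.x -= kRegionSize;   // it came in across this region's west edge
        SpawnAndSimulate(obj);
    }

The serialize/ship/re-create round trip between simulators is presumably where the extra delay mentioned below comes from.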

This works, but there's about 300ms of extra delay between releasing the bow and getting a reaction from the target when the arrow has to cross a region boundary. Second Life is not a good platform for the pew-pew crowd. Bullets are not a special case.

The original design, for which there is an old patent, did call for proxies, but they were never implemented. This leads to annoying artifacts at region boundaries. If the target is within 1 m of the region edge, the arrow goes right through it, because objects have to get 1 m into the gaining sim before the region crossing triggers. (This is to prevent rapid toggling between regions.) Walls at the edge of a region don't stop movement into the region. That kind of thing.
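The 1 m rule is just hysteresis on the boundary test; a sketch of the idea (not the actual simulator code):

    // Hysteresis on the region-crossing test: the handoff only fires once the
    // object is a full margin past the boundary, so it can't flip back and
    // forth between regions every frame.
    constexpr float kRegionSize     = 256.0f;  // Second Life region edge, metres
    constexpr float kCrossingMargin = 1.0f;    // must be this far into the gaining sim

    bool ShouldHandOffEast(float localX) {
        return localX > kRegionSize + kCrossingMargin;
    }

    // Side effect of the margin: anything within kCrossingMargin of the edge in
    // the gaining region (a target, a wall) isn't seen by the losing region's
    // physics, so a fast arrow can pass straight through it before the handoff fires.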

Combining this approach with just a few meters of proxying at region boundaries would get rid of the artifacts, but not the delays.

This works, but there's about 300ms of extra delay between releasing the bow and getting a reaction


“works” :-)

Speaking of alternative technologies: RPG text MUDs work just fine, too, and are almost infinitely scalable! If I yell “a huge nuclear explosion just went off,” everyone sees a huge nuclear explosion in their minds. Magic!

enum Bool { True, False, FileNotFound };

Maybe you can look at World_Gate.

World_Gate is a platform where you can build, run, and share a multi-agent strategic environment, i.e., a world. It is an ideal tool for online board games, but if you are talking about FPS games you may want a different solution.


DinS said:

World_Gate.

Looks like that's for turn-based RPGs.

This is all about time scale. If you are willing to accept 1 second of lag, it's easy. If you will only tolerate 8 ms of lag (1 frame at 120 FPS) it's impossible over a long-haul network. Second Life technology can get you down to 200 ms or so. Below that, it gets hard. You have to start predicting and correcting and assuming.
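To put rough numbers on "impossible over a long-haul network": signals in fibre cover roughly 200 km per millisecond, so even an idealized cross-continent round trip eats several 120 FPS frames before the server does any work at all. A back-of-the-envelope sketch (distances are assumptions for illustration):

    #include <cstdio>

    int main() {
        // Best-case propagation only: no routing, queueing, or server processing.
        const double kmPerMsInFibre = 200.0;   // light in fibre, roughly 2/3 of c
        const double oneWayKm       = 4000.0;  // e.g. roughly coast-to-coast US
        const double rttMs          = 2.0 * oneWayKm / kmPerMsInFibre;  // ~40 ms

        const double frameMs120 = 1000.0 / 120.0;  // ~8.3 ms frame budget

        std::printf("best-case RTT %.0f ms vs. frame budget %.1f ms\n",
                    rttMs, frameMs120);
        // 40 ms >> 8 ms: you can't wait for the server inside one frame, hence
        // prediction, interpolation, and correction instead.
        return 0;
    }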

This topic is closed to new replies.
