Hi all, I've been developing a multiplayer solution for a game engine for almost 2 years now in my spare time. It's a server-authoritative model where clients interpolate between server snapshots and all that stuff. I've been reading some posts here lately, and also some articles online, about a model/architecture where the server and the client send packets at different rates, but I can't wrap my head around it.
In my current model, the server and the client both run at a “network fixed rate” (independent of framerate), let's say 60 Hz. When the client produces an input, it says “this input is for your tick 88, server”, and when it reaches the server it is queued for a (hopefully) very short time and then processed. It works great!
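Here's roughly what my current setup looks like, as a minimal sketch (all the type and function names here are made up for illustration, not from a real engine):

```cpp
#include <cstdint>
#include <map>
#include <vector>

// Hypothetical input payload; a real game would pack movement axes,
// buttons, view angles, etc.
struct PlayerInput {
    float moveX = 0.0f;
    float moveY = 0.0f;
    bool  jump  = false;
};

// What the client actually sends: the input plus the server tick it is
// intended for ("this input is for your tick 88, server").
struct InputPacket {
    uint32_t    targetTick; // client's estimate of the matching server tick
    PlayerInput input;
};

// Server side: inputs wait in a small per-tick buffer until the
// simulation reaches that tick, then get consumed.
class InputQueue {
public:
    void Enqueue(const InputPacket& pkt) {
        pending[pkt.targetTick].push_back(pkt.input);
    }

    // Called once per fixed tick (e.g. at 60 Hz). Returns all inputs
    // addressed to this tick and drops anything that arrived too late.
    std::vector<PlayerInput> Consume(uint32_t serverTick) {
        std::vector<PlayerInput> out;
        auto it = pending.find(serverTick);
        if (it != pending.end()) {
            out = std::move(it->second);
        }
        // Erase every bucket at or before the tick we just simulated.
        pending.erase(pending.begin(), pending.upper_bound(serverTick));
        return out;
    }

private:
    std::map<uint32_t, std::vector<PlayerInput>> pending;
};
```

With both sides ticking at 60 Hz, each bucket normally ends up holding exactly one input, which is why the queue stays small.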
But I've seen some articles online about the client sending inputs every frame (every render frame, not every fixed tick), so if the player is running at 120 fps it will send 120 pps, and at 40 fps it will send 40 pps… which actually makes sense to me, because if you're running the game at 144 fps, why should I limit you to 60 pps?
If I were to implement this right now, the server's queue would be flooded, because it would be receiving packets at a much faster rate than it processes them, resulting in a never-ending backlog of pending inputs. So I'd appreciate it if someone could help me understand this better; maybe I've been thinking about my approach for so long that I can't think outside the box.
Now that I'm thinking about it… maybe, even though the client is sending an arbitrary number of inputs per second, I could still mark each one “for fixed tick X” and then process them together on the server? So if the client produces 20 inputs during fixed tick 44, all of those inputs get marked “for tick 44” and are processed together on the server… that actually might work. I'd still like some second opinions. Thanks!
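Something like this is what I have in mind, server side (again just a sketch; the merge policy of averaging analog axes and OR-ing buttons is only one option I made up, sub-tick timestamped samples would be another):

```cpp
#include <cstdint>
#include <vector>

// Same hypothetical input shape as the sketch above.
struct PlayerInput {
    float moveX = 0.0f;
    float moveY = 0.0f;
    bool  jump  = false;
};

// Client side: sampled every *render* frame, but every sample produced
// while fixed tick N is current gets tagged "for tick N".
struct FrameSample {
    uint32_t    targetTick;
    PlayerInput input;
};

// Server side: collapse however many samples arrived for one tick
// (many at high framerates, maybe just one at low framerates) into a
// single effective input, so the simulation cost per tick stays constant.
PlayerInput CoalesceForTick(const std::vector<FrameSample>& samples) {
    PlayerInput merged;
    if (samples.empty()) return merged;
    for (const FrameSample& s : samples) {
        merged.moveX += s.input.moveX;               // average analog axes
        merged.moveY += s.input.moveY;
        merged.jump   = merged.jump || s.input.jump; // latch button presses
    }
    merged.moveX /= static_cast<float>(samples.size());
    merged.moveY /= static_cast<float>(samples.size());
    return merged;
}
```

The nice side effect is that this also bounds the queue problem from above: no matter how many packets arrive, each tick's bucket is consumed and discarded once per fixed tick, and latching buttons means a quick tap between two ticks isn't lost.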