How to innovate the first person shooter (FPS) genre?


JoeJ said:
This is not like bitcoin, NFTs, or metaverse. This is no made up crap from desperate tech CEOs. This is real and serious, and it will change the world, maybe much more than the internet did.

You're speaking to a senior blockchain developer :D but I agree that most of it is a big meme. And yes, it will change the world, but I simply opt out of it. I am already being left behind by not buying into “gamer hardware”. The only real use case for blockchains is the coordination of public services without a central owner. Which basically nobody has ever used them for.

JoeJ said:
We won't have a job, we won't own much, so a little box with compute power exceeding phones is all we can hope for. That's the ideal case. The worst case is no more boxes for the masses at all - just plug your phone into the cloud to enjoy the same one and only server-side path traced MMO everybody else is playing too.

My humour exactly. Sounds like those MMO animes, just more dystopian.

JoeJ said:
I believe the future is APUs, dGPUs are EOL to consumers.

I'm working on my own ISA as the main target for my compiler, which I hope to migrate from emulation to an FPGA one day. Basically, instead of a CPU+GPU on one chip, my ISA defines a hybrid computation model that reflects both modes of computation in a single processing unit, along with some other changes that make it more future-proof than the current approach of bolting on new opcode extensions for ever-bigger SIMD registers. And in that ISA, you don't need drivers for peripherals, much like in the x86 BIOS era.

JoeJ said:
To get it done at all, we have to sacrifice some performance for progress and maintainability. Do you code in Assembly or C++? It's C++, right?

I program in my own language, which sadly still transpiles to C++ because the self-hosted compiler isn't fully featured yet. But yeah, C++ makes it really easy to write terrible code, performance-wise. I hope to change that with my language. I've been at it on and off for 8 years now, although most of the early years were wasted effort, because back then I didn't really have enough insight into computing to make anything other than a reskin of C++.

JoeJ said:
If there are no kids in my class, i can not distribute X cookies to them.

Yes. And that's why no distribution occurs. A literal no-op. And that is why everything remains as-is, with full remainder. 😉

Think of dividing a flock of sheep fairly among your descendants. Oh, you're an incel? You do nothing and everything remains yours.
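In code terms, it's just division with remainder where a zero divisor distributes nothing. A minimal C++ sketch of that semantic (the names are only illustrative, since my ISA isn't public):

#include <cassert>
#include <cstdint>

struct DivMod { std::uint64_t quotient, remainder; };

// Division as distribution: hand out `cookies` to `kids`.
// With zero kids, no distribution occurs: quotient 0, full remainder.
// The invariant cookies == quotient * kids + remainder holds either way.
DivMod divmod(std::uint64_t cookies, std::uint64_t kids) {
    if (kids == 0)
        return {0, cookies}; // literal no-op: everything remains as-is
    return {cookies / kids, cookies % kids};
}

int main() {
    assert(divmod(10, 3).quotient == 3 && divmod(10, 3).remainder == 1);
    assert(divmod(10, 0).quotient == 0 && divmod(10, 0).remainder == 10);
}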

Anyway, to sum it up: since I plan on using an FPGA-based CPU for all my hobby computing in the long term, I won't be expecting modern-hardware speeds anyway, so I will naturally be forced to be as efficient and frugal as possible with the software I plan to write. The language must not cost performance, the OS+driver mess must not cost performance, and the CPU must let me squeeze out everything the FPGA can offer. And then the code must be well-written.

JoeJ said:
Have not played it, but i doubt they mess things up. Their gfx lead dev is one of the best across the whole industry, it seems. … you have to upgrade … :D

I just booted up TQ, all graphics maxed but disabled HBAO+, ran smoothly on my Intel integrated graphics at 1080p. The game is very similar to PoE in every aspect, yet it runs way better at almost-max graphics than PoE does at absolute minimum. So their lead graphics dev should learn from the TQ lead graphics dev or something. There is no excuse for their poor performance at lowest settings. It's not me who needs to upgrade, it's the entire contemporary software industry that needs to git gud, because we have been there before. The new generation just didn't try hard enough.

Walk with God.


RmbRT said:
Which basically nobody has ever used them for.

Kinda sorry to hear that.

RmbRT said:
I'm working on my own ISA as the main target for my compiler

Haha, now that's what i call a hobby.

Well, if your div by 0 gives 0, i'm fine with that. I guess it would save me more work than it costs.

But i need those little GPU cores for parallel processing. Don't forget about that. : )

RmbRT said:
I just booted up TQ, all graphics maxed but disabled HBAO+, ran smoothly on my Intel integrated graphics at 1080p. The game is very similar to PoE in every aspect, yet it runs way better at almost-max graphics than PoE does at absolute minimum.

Watching some TQ video, that's easy to explain. TQ uses very simple directional lighting with a single source; PoE uses complex lighting - all those little particle effects are light sources, casting soft shadows in screen space. TQ is low poly in comparison. The physics geometry is also a coarse approximation - i saw a shot flying through a big rock. Likely there is a lot more. And i assume it's not worth it for them to support very old HW. Few sales, but a lot of extra work which is no investment into the future. Don't blame the devs for optimizing their business too.

RmbRT said:
The new generation just didn't try hard enough.

There will never be a better rock band than Led Zeppelin, never a better FPS than Doom.
It's hard for later generations to impress after that. It was easy only for the first generation, when everything was new and unexplored.

JoeJ said:
There will never be a better rock band than Led Zeppelin, never a better FPS than Doom.
It's hard for later generations to impress after that. It was easy only for the first generation, when everything was new and unexplored.

There will never be a better Doom than Doom.

Doom is an FPS. FPS is not Doom. It doesn't make sense to compare apples to oranges by comparing Doom against other FPSes that are trying to do different things.

krumpet291 said:

There will never be a better Doom than Doom.

Doom is an FPS. FPS is not Doom. It doesn't make sense to compare apples to oranges by comparing Doom against other FPSes that are trying to do different things.

I think the problem is that early games have a much higher sentimental value compared to today's games. Even though I did not experience that era myself, I can tell that those games have a very strong personality. Modern games do not offer that feeling. I think it is the clumsiness of the controls, the constrained graphics and complexity, and the purity of the game design that make those games very relatable. Modern games are perhaps too polished: they do not offend or inconvenience the player, and due to their appearance of perfection, it is not apparent to the player that this is someone's hard work of art. It's like how well-adjusted, “perfect” people do not properly register as a person to whoever they are interacting with.

JoeJ said:
But i need those little GPU cores for parallel processing. Don't forget about that. : )

I actually do have GPU-kernel-like processing that is superior to x86-style SIMD because it does not use fixed vector dimensions. The hardware gets an arbitrarily-sized workload and can then perfectly schedule it onto the available ALU count, register size, etc. And the semantics of that SIMD/GPU-kernel mode are defined so that it's not vector instructions in a linear program, but an entire sequence of a vector program. So there is no actual loop explicitly written in the assembly code that would have to be executed linearly, but a single 2-D workload (instruction sequence × workload dimension) that the CPU can choose how to schedule.

This allows the hardware to be potentially much more efficient, because iterations have zero causal dependency (except at barrier/gather instructions, such as sums), and each actual instruction in the program represents an arbitrary amount of work (based on the workload dimension). That gives the CPU the opportunity for more sophisticated tweaking, scheduling, and pipelining, because any throughput gained that way easily amortises the increased latency of configuring an ideal pipeline just in time. You don't have those timing economics in regular SIMD code.

Additionally, when instructions have huge dimensions compared to the register count, it becomes economical to create a pipeline out of a short sequence of instructions that get executed per input, amortising multiple instructions into a single cycle, and potentially feeding multiple sets of data into that pipeline per cycle (depending on the actual hardware implementation).

So my ISA does not limit the CPU in any way regarding the amount of optimisations it can perform, and the actual code does not need to change to fully make use of a more powerful CPU.
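As a rough C++ analogy (hypothetical names, obviously, and the host-side loop is just a stand-in; on the real ISA the whole dispatch would be a single 2-D instruction sequence, not a loop):

#include <cstddef>

// Classic SIMD style: the loop - and, with intrinsics, the vector
// width - is baked into the code, implying a linear execution order.
void scale_fixed(float *data, std::size_t n, float s) {
    for (std::size_t i = 0; i < n; ++i)
        data[i] *= s;
}

// Workload style: the code states only the per-element program and the
// workload dimension. How many lanes run per cycle, and in what order,
// is entirely the hardware's choice, because iterations are declared
// causally independent.
template<typename F>
void dispatch_workload(std::size_t dim, F per_element) {
    for (std::size_t i = 0; i < dim; ++i) // stand-in for HW scheduling
        per_element(i);
}

void scale_workload(float *data, std::size_t n, float s) {
    dispatch_workload(n, [=](std::size_t i) { data[i] *= s; });
}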

So I do expect to get some serious performance out of it, even though it will run at slower clock rates due to being FPGA-emulated.

Walk with God.

krumpet291 said:
Doom is an FPS. FPS is not Doom. It doesn't make sense to compare apples to oranges by comparing Doom against other FPSes that are trying to do different things.

It was not meant literally. When i got my hands on Doom, Quake was already out. So to me personally, Doom was neither awesome nor influential.

My Doom was Blake Stone, some Doom clone my girlfriend had shown me on her school laptop.
And it blew me away, and it brought me back to playing games and programming, which at that time i had already left behind.

But i do not really see any FPS that tried to do things differently from Doom. The concept was extended, but kept the same in its essentials. Adding some story, or weapon reload, a jetpack, cover mechanics… It's still just looking and moving around, aiming at moving things and shooting them, while dodging their shots. That's still the core gameplay loop, and the same moment-to-moment gameplay. So it does make sense to compare, or even reduce, any FPS to Doom, imo. The reduction is useful to identify what really matters most.

On the other hand, it's maybe annoying to glorify Doom. To me it is, but i still use the game in discussions to make a point.
To remove the glorification, i like to say ‘Doom is just a twin stick shooter, but with a first person camera.’
Which means: the only ‘brilliant innovation’ here was to develop a 3D engine. It's surely not a game changer in terms of game design, and i've played 3D games on the C64 long before.
What made the FPS genre really great after that was the addition of mouse look. Iirc, Doom did not have it yet, and Quake added it as an optional, experimental feature. You had to constantly keep a button pressed to use mouse look all the time. I used the right mouse button.
And only after it was shown that players learned and used mouse look were the FPS genre's essentials finally defined.

Id Software were not the only guys innovating here. Other devs did similar things, players contributed as well, so it was a larger-scale movement, not just the single game we now use as a symbol to express all this.

RmbRT said:
I think the problem is that early games have a much higher sentimental value compared to today's games. … It's like how well-adjusted, “perfect” people do not properly register as a person to whoever they are interacting with.

Yeah, the problem with nostalgia.
But i think that's a big warning. If nostalgia becomes big, and it currently does (Remasters, retro-games, even retro HW), that's simply an indicator we are beyond some peak, and the arrow goes down.

It means we need to come up with something truly new, i'm afraid.
Analyzing old masterpiece games likely won't help us here. Personally i do it only to understand our history.
Identifying things we have lost over time, mostly because they did not work anymore with the new and awesome stuff, helps me identify some design goals. To list my findings:

Shitty Atari 2600 games were great at spurring imagination of the player. You came up with your own story while looking at abstract boxes moving on screen. Sadly we have lost this while improving graphics and overall realism. We show it all, so there is no more point to imagine while playing. That's a huge loss.
Scorn is maybe not the perfect game, but it manages to bring imagination back. You do not understand the world you see, so your imagination works hard to make sense out of it. That's awesome.

2D games had a perfect overview. Moment to moment gameplay was more tactical even in the simplest action games, as mentioned earlier. We've lost this completely with 3D games, especially in first person. But i see ways to bring it back without any sacrifice.

That's all, but it's just my list. I guess everybody has their own, related ideas.
So why is one AAA game like the other? Why do they even make only ‘franchises’ these days? Why are they afraid to innovate?

The reasons are obviously purely economic: risk aversion due to production costs.

It can be fixed, but the players need to help out too: They must reduce their expectations (potentially inflated by our bad marketing).
If players fail at this, they are just as guilty of the downfall of gaming as the devs they criticize.
But i'm optimistic. Personally, as a player, i'm totally fine with smaller games made by about 10 people or less. That's even the majority of the games i play.

RmbRT said:
So there is no actual loop explicitly written in the assembly code that would have to be executed linearly, but a single 2-D workload (instruction sequence × workload dimension) that the CPU can choose how to schedule.

I can not imagine what it means to build your own CPU / GPU. But if FPGA makes this possible for a single guy, that's pretty mindblowing.
I think flexibility is much more important than raw performance. But that's no ideology, it's because i want performance.
And GPU vendors, or maybe just NV, fail to understand this.
RTX for example is useless to me, because the BVH data structure is blackboxed. So i can't do any traceable, fine-grained LOD. It's impossible. For the same reason, Epic can not make their Nanite stuff traceable.
And that's not even a hardware limitation - it's actually an API problem. Likely the API designers overlooked that dynamic geometry is important, because with proper LOD, any geometry becomes dynamic.
Getting this right from the start would have been difficult, but much easier than trying to fix multiple broken APIs now after multiple GPU vendors added their own custom BVH data formats to the equation.

So, as a result, i'm actually afraid whenever i hear about awesome new fixed function features they add to their HW.
It's useless crap. But if everybody has it in his box, we are basically forced to use the crap to get the most out of the HW.
Initially, HW acceleration made things easier. But nowadays, it only makes it harder.
We are past that fixed function era. We need flexibility first, so we can make better software, instead of relying on better HW.

In that sense you're on the right track, and i'm envious.
Maybe, it's even the future. Idk, but maybe AMD's new ‘AI CPU’ features are FPGA based? It's Xilinx stuff, afaik.
So maybe we could reprogram it to turn AI crap into something actually useful?

Guess not, but that's certainly a future i would like to see happen.

JoeJ said:
Yeah, the problem with nostalgia. But i think that's a big warning. If nostalgia becomes big, and it currently does (Remasters, retro-games, even retro HW), that's simply an indicator we are beyond some peak, and the arrow goes down. It means we need to come up with something truly new, i'm afraid. Analyzing old masterpiece games likely won't help us here.

Game dev has become too inundated with commercial interests, and software development has become too hard. Back in the 80s, it was normal to ship an in-house OS + game on a diskette. Everyone could write an OS and a game. But now, you are forced to depend on pre-existing OS APIs that are derived from mainframes, and you have no idea what hardware your game will run on (availability of certain SIMD extensions, GPU vendor & model, etc.). If you want to play it safe and target old OpenGL versions, you're wasting your CPU time. You have this immense complexity in an increasingly brittle software stack that didn't exist in the 80s and early 90s.
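Just to illustrate the kind of ceremony I mean: this sketch shows the runtime-dispatch boilerplate you need on x86 merely to use AVX2 if it happens to exist (GCC/Clang builtins assumed; MSVC would need a different dance via __cpuid):

#include <cstddef>

// The same loop twice: one version compiled for AVX2, one kept generic.
// The binary must ship both and choose a path at runtime.
__attribute__((target("avx2")))
static void scale_avx2(float *data, std::size_t n, float s) {
    for (std::size_t i = 0; i < n; ++i) data[i] *= s;
}

static void scale_generic(float *data, std::size_t n, float s) {
    for (std::size_t i = 0; i < n; ++i) data[i] *= s;
}

// Ask the CPU what it supports, then pick a code path - ceremony
// that simply did not exist when you shipped an OS on a diskette.
void scale(float *data, std::size_t n, float s) {
    if (__builtin_cpu_supports("avx2"))
        scale_avx2(data, n, s);
    else
        scale_generic(data, n, s);
}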

You have to put a lot of effort in to even get some basic prototype going. Software studios need more and more specialised experts for everything (think of the whole Docker-in-a-VM debacle, just to be able to even compile something), and the efficiency of programmers is going down as a result. Software is becoming extremely expensive to develop, especially if it has to perform well. Which means almost nobody has the means to just try stuff out. And that's why everyone who has the means is risk-averse. Any software that communicates either with hardware or with other software is basically impossible to maintain. And as long as it stays that way, we will not see a resurgence in software overall.

Once we make development simple again, people will be able to try out new things with much less effort, less of a learning barrier, and in turn, less financial risk.

JoeJ said:
Shitty Atari 2600 games were great at spurring imagination of the player. You came up with your own story while looking at abstract boxes moving on screen. … Scorn is maybe not the perfect game, but it manages to bring imagination back. You do not understand the world you see, so your imagination works hard to make sense out of it. That's awesome.

Old games had really strong atmosphere, but crappy mechanics, especially the early 3D games. Maybe those games were built the other way around, centered around the story or world, while modern games are almost exclusively centered around their mechanics. Old games like Gothic had an intent and entered a dialogue with the player (what you called spurring the imagination), while modern games seem to just drop the player into a world, where all he really experiences is the mechanics. Maybe the mechanics have far eclipsed the rest of modern games, and the balance is messed up.

Even though Gothic had a shitty camera and combat system, the enemies weren't even that great, and the economy was unbalanced, it still is a great world that somehow feels alive. Meanwhile, other games treat the game world and characters as mere background, and put moment-to-moment action in the foreground. But moment-to-moment action is not memorable, and it is not emotionally relatable. It's like old games gave you a world or story to experience, while modern games give you a core gameplay loop and then only have a world as a vehicle for that loop. And then there are the weird AAA games that are basically interactive movies, where player action is mere filler between cutscenes.

For example, RPGs are basically all derived from DnD, which had always been about making a fantasy world accessible for adventuring. But modern RPGs, and especially MMORPGs, are now some weird kind of action / battle / powerup simulator that just happens to be set in a fantasy world. And they try to make that world appealing, but they cannot hide the fact that they are no longer about that world and its adventure; they are about their core gameplay loop, which is slaughter, loot, powerup, repeat. Virtually all quests and storylines are just a facade to gloss over the uninspired gameplay loop.

JoeJ said:
I can not imagine what it means to build your own CPU / GPU. But if FPGA makes this possible for a single guy, that's pretty mindblowing.

Well, it's complicated, and mostly like writing a hand-optimised assembly program. You can fit only a very limited amount of logic on an FPGA in comparison to regular hardwired chips, but recent FPGA lines have increased the available logic capacity by an order of magnitude. I'm mostly waiting for them to become a bit cheaper, and then I'll have to write a program that can output tweakable versions of the overall same circuit, so that it can be used on different FPGAs with different logic capacities. I often think about circuits when I'm in bed, so I think I'm familiar enough with the domain to produce something acceptable. You can get a very primitive CPU going without too much effort, but if you want pipelining, moderately complex instruction sets, etc., you'll really have to put your nose to the grindstone.

JoeJ said:
I think flexibility is much more important than raw performance. But that's no ideology, it's because i want performance.

Well, it's a more or less known fact that current CPUs and current programming languages don't fit together properly. CPUs want to handle large batches of tightly packed sequential data in branchless loops. Current programming languages want indirection, subroutines, objects. CPUs want structs of arrays, not arrays of structs. And so on.
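The structs-of-arrays point is the classic example. A minimal C++ sketch of the same particle update in both layouts (the names are just for illustration):

#include <cstddef>

// Array of structs: how languages want you to model things.
// Updating only positions strides over velocity and mass too,
// wasting cache bandwidth and defeating vectorisation.
struct ParticleAoS { float x, y, z, vx, vy, vz, mass; };

void step_aos(ParticleAoS *p, std::size_t n, float dt) {
    for (std::size_t i = 0; i < n; ++i) {
        p[i].x += p[i].vx * dt;
        p[i].y += p[i].vy * dt;
        p[i].z += p[i].vz * dt;
    }
}

// Struct of arrays: what the CPU actually wants.
// Each stream is contiguous, so loads are dense and vectorise trivially.
struct ParticlesSoA {
    float *x, *y, *z, *vx, *vy, *vz, *mass;
    std::size_t count;
};

void step_soa(ParticlesSoA &p, float dt) {
    for (std::size_t i = 0; i < p.count; ++i) {
        p.x[i] += p.vx[i] * dt;
        p.y[i] += p.vy[i] * dt;
        p.z[i] += p.vz[i] * dt;
    }
}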

The discovery phase of what computing is and how CPUs work is over. We have pipelining, caches, HW prefetching, branch prediction, out-of-order execution, all that stuff. And we have written a lot of software, too. We know all the major features that we need to express computation. Now we should just scrap all the junk and start over with a holistic design that incorporates all the lessons we have learned, without all the mistakes that propagate themselves throughout languages and ISAs.

It's astonishing how much sophistication modern CPUs put into their JIT translation to microcode (which is what they actually execute; the ISA is really just a programming language / API at this point), trying to parse semantic patterns out of x86 ASM to activate some hardware acceleration for a sequence of instructions, and so on. We could relieve the CPU of that burden and directly tell it the semantic pattern of what we want to achieve, and all that circuitry could then be spent on more ALUs or something, which would improve the throughput of the dynamic GPU-like SIMD mode.

And if all the logic circuitry and cache space of on-chip GPUs were directly available to the regular CPU as well, you could boost general computation performance by a lot, and then do graphics in software, using the GPU-like SIMD mode instead of a dedicated, separate part of the chip that is dead weight whenever we're not doing graphics. The achievable performance should be roughly the same for graphics, but massively increased for general computation. And that is before even talking about actual multi-core.

JoeJ said:
Initially, HW acceleration made things easier. But nowadays, it only makes it harder. We are past that fixed function era. We need flexibility first, so we can make better software, instead of relying on better HW.

Definitely. Graphics is just a form of massively parallel general compute; all we want to do is draw pixels on the screen. GPUs would be obsolete if CPUs stopped being linear. And CPUs would be cheaper and more efficient if they didn't require complex circuitry just to interface with an ISA that hasn't actually reflected the hardware for many years now, with the mismatch only growing.
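That's the whole job, really: one independent little program per pixel. A toy sketch of “graphics as parallel general compute”, using nothing but standard C++17 parallel algorithms (on GCC this needs TBB as the parallel backend):

#include <algorithm>
#include <cstdint>
#include <execution>
#include <numeric>
#include <vector>

// Toy software "shader": every pixel is an independent computation -
// exactly the shape of work a sufficiently parallel CPU could absorb
// without a separate GPU.
int main() {
    const int w = 1920, h = 1080;
    std::vector<std::uint32_t> framebuffer(w * h);
    std::vector<int> px(w * h);
    std::iota(px.begin(), px.end(), 0); // one index per pixel

    std::for_each(std::execution::par_unseq, px.begin(), px.end(),
        [&](int i) {
            const int x = i % w, y = i / w;
            const std::uint32_t r = x * 255 / w, g = y * 255 / h;
            framebuffer[i] = (255u << 24) | (r << 16) | (g << 8); // ARGB gradient
        });
}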

JoeJ said:
Maybe, it's even the future. Idk, but maybe AMD's new ‘AI CPU’ features are FPGA based? It's Xilinx stuff, afaik. So maybe we could reprogram it to turn AI crap into something actually useful?

I doubt it, because FPGA circuitry is inefficient. I would expect them to hardwire certain ANN topologies with a hardwired activation function; they could probably fit at least 10 times as much circuitry in there (likely far more) by hardwiring instead of using an FPGA. And it could operate at much higher frequencies.

Walk with God.

When I play a game, I don't think to myself “wow, I want to kill enemies, find loot, and upgrade my character”. I want to have fun, have an adventure in another world. Success at a game shouldn't be measured in levels and powerups or something like that, but in immersion in its world and story. The fun should not arise out of numbers going up or out of completely dominating the battlefield. FPSs are probably about engaging in thrilling combat, or other kinds of thrills arising from first-person interaction with a world. After all, that's why the first-person perspective is a key feature. Amnesia had a great way of utilising the first-person perspective to make for a really thrilling and immersive game, even though I hate horror.

Walk with God.

RmbRT said:
When I play a game, I don't think to myself “wow, I want to kill enemies, find loot, and upgrade my character”. I want to have fun, have an adventure in another world.

I'm the same. Going further, i do not even care that much about ‘playing a game’. All i want is the other world.
To give it some sense and a purpose, let's make a game out of it. But that's the secondary objective, not the first.

I wonder if this is a kind of typical programmer mentality, and game designers might actually hate this way of thinking. : )

edit: nothing further to add to the discussion.

