When did game developers start using 3d libraries to develop 2d games?

37 comments, last by JoeJ 2 months, 3 weeks ago

I'm not part of the trend toward retro. The Super Bowl, for example, had a “Tron” theme and roller skates. It's like an appreciation for the proverbial “good taste” in those old things.

But in my case, I lived in a time when Shinobi by Sega was around; I was 9. I had no idea what an arcade was. It seemed like I lived under a rock. I played Pac-Man at the convenience store back then, and sometimes Pole Position in a store where the clerk later got shot. When I ate out with family at ‘the loop’ I occasionally played The Legend of Kage.

I was always enamored by the palette and the painted quality of games like Shinobi and The Legend of Kage.

So these days one could be fascinated by the palette and painted qualities of those old games and then render them with modern conveniences much more easily - or so you say, and I agree; I've long known those images are easily made with a paint program like Krita.

But when we get to the hardware, yes, I am interested in the old ASICs, VLSI, and the idea of hardware acceleration - those were the hallmarks of my time.

Now we still have ASICs - really good ones - and I am wondering how the old ones worked. You can get a white paper on some 1990s ASIC and also look through books about hardware from that time.

Hardware accelerators exist, and with GPUs that have a special multi-core architecture I hear CUDA is like assembly code - which I like and would prefer over using DirectX. But compared to an engine, I do like the idea of DirectX programming. I like how DirectX is just rendering and not all the other stuff in a game engine.

My first computer was an '80s machine, the C64, but by then 16-bit CPUs like the 68000 in the Amiga and the Intel 80286 in the PC were already heading into obsolescence, and the 80386 was the bargain machine since it was more than five years old. My second computer was an 80386SX at 16 MHz running DOS 5.0. I played Commander Keen and Hugo's House of Horrors. I always wondered how the games were made but didn't get an idea until about eight years later, in my 20s.

But a lot of stuff happened from 20 to 40 and I didn't decide to try again until recently. I did go to school around age 30 for electrical construction. Some of the classes I took were in advanced math, programming, and electronics.


rei4 said:
So these days one could be fascinated by the palette and painted qualities of those old games and then render them with modern conveniences much more easily - or so you say, and I agree; I've long known those images are easily made with a paint program like Krita.

One point you seem to be missing is that it's not mainly about the graphics itself. It's the convenience of modern computers, coding languages, etc. If I understood you correctly, you are working on a PC with Windows 3.11? An OS from the early '90s is not going to let you work as productively as a modern system.

As for coding, even standards from a few years ago are considered obsolete. Coding in C++ pre-11 is a big waste of time. Even in modern languages like C#, when you use some of the older standards, you'll constantly be wasting time by not having features available that came in later standards.
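
To make that concrete, here is a small, illustrative before/after sketch (the function names and the scores map are made up, not from any real project). Both versions sum the values of a map; the pre-11 one needs the iterator type spelled out by hand, the modern one doesn't:

```
#include <map>
#include <string>

// Pre-C++11: iterator types written out manually.
int sumScoresOld(const std::map<std::string, int>& scores) {
    int total = 0;
    for (std::map<std::string, int>::const_iterator it = scores.begin();
         it != scores.end(); ++it) {
        total += it->second;
    }
    return total;
}

// C++11 and later: auto and range-based for remove the boilerplate.
int sumScoresNew(const std::map<std::string, int>& scores) {
    int total = 0;
    for (const auto& entry : scores)
        total += entry.second;
    return total;
}
```

Multiply that kind of saving across lambdas, smart pointers, move semantics and so on, and the productivity gap adds up fast.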

It seems you have your mind set, and that is fine. But you can't pretend that working with 16-bit CPUs and an antiquated OS will give you the same level of productivity. It does not. It will force you to work around issues that don't exist anymore. You will be forced to solve problems that are already solved on modern hardware. You will face limitations that don't apply to modern systems.
For example, my own 2D game in SNES style currently uses about 200 MB of RAM, and the editor around 1 GB or more. Why? Because, while I care about optimizing for performance, there is no point in trying to make the game fit in any less RAM. Any modern PC has multiple GBs of RAM, so trying to bring that down would be a huge waste. Some of the budget is also spent on engine features that allow me to create game content faster. But if I were to make that game for SNES-era hardware, I would have to sink so much time into fitting that older memory budget. I wouldn't be able to build certain convenience features, because there would be no memory for them.

Again, it's fine if you actually do want to work for and on that older hardware, but it seems to me you are convinced that there is no difference in production quality/speed compared to using a modern system, which is just blatantly wrong. You might not notice the difference if all you make are small games, but if you try to make a 2D game with any amount of content, like an RPG, you will be wasting a lot of time. And you won't even notice the difference unless you've spent enough time in the modern ecosystem to understand it.

Ancient 2D-only accelerated APIs, such as GDI's BitBlt, don't exist any more as native interfaces (they are wrapped and emulated on top of modern protocols internally). So the industry switched to 3D APIs to get the maximum available performance and functionality.
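
For reference, a call into that old 2D path looked roughly like this - a minimal Win32/GDI sketch, where the window handle and the memory DC holding a back buffer are assumed to be set up elsewhere. It still runs today, but it goes through the wrappers mentioned above rather than a dedicated 2D blitter:

```
#include <windows.h>

// Minimal sketch: present an offscreen back buffer with GDI's BitBlt.
// hwnd is assumed to be a valid window, and memDC a memory DC that
// already has the back-buffer bitmap selected into it.
void presentBackBuffer(HWND hwnd, HDC memDC, int width, int height)
{
    HDC screenDC = GetDC(hwnd);
    // SRCCOPY: a plain rectangle copy, the classic 2D "blit".
    BitBlt(screenDC, 0, 0, width, height, memDC, 0, 0, SRCCOPY);
    ReleaseDC(hwnd, screenDC);
}
```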

rei4 said:
Hardware accelerators exist, and with GPUs that have a special multi-core architecture I hear CUDA is like assembly code

That's not true. NVidia does not publish its ISA, afaik, and from other vendors we know the ISA can change a lot across chip generations; there are even bugs which compilers have to work around. So coding assembly for GPUs is practically impossible. Instead we provide some specified byte code to the vendor's GPU driver, which compiles the program for the GPU, so there is no precompiled binary we could ship like we do with CPU code.

You could write the byte code manually, which would then be a bit like assembly programming. But I've never heard of anybody doing this, maybe because the specified language is meant to be generated and processed by the computer, not written by a human programmer.

What we do instead is use C-like languages, which are then compiled to that byte code automatically.
So in practice, coding for the GPU is like C, not like assembly.
But it is the lowest-level thing people do these days, so maybe you'd like it.
Personally I do. The fact that many threads execute the same program but on unique data changes everything, and new ways open up for even the simplest problems.
I've gone from Basic on the C64 up to C++, crossing a lot of other languages where needed, some assembly as well, but more or less it is all the same.
In contrast, parallel programming on the GPU was something really new to me. Very interesting and exciting.
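
To make that concrete, here is a minimal CUDA sketch (illustrative only, the array size is arbitrary). Every thread runs the same C-like kernel; the only thing that distinguishes them is the index they derive from their thread and block IDs:

```
#include <cstdio>
#include <cuda_runtime.h>

// Every thread executes this same function; threadIdx/blockIdx give each
// thread a unique index, so each one works on its own element.
__global__ void saxpy(int n, float a, const float* x, float* y)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)                      // guard: the grid may be a bit larger than n
        y[i] = a * x[i] + y[i];
}

int main()
{
    const int n = 1 << 20;
    float *x, *y;
    cudaMallocManaged(&x, n * sizeof(float));   // unified memory, for brevity
    cudaMallocManaged(&y, n * sizeof(float));
    for (int i = 0; i < n; ++i) { x[i] = 1.0f; y[i] = 2.0f; }

    int threads = 256;
    int blocks = (n + threads - 1) / threads;
    saxpy<<<blocks, threads>>>(n, 3.0f, x, y);  // same program, ~n threads
    cudaDeviceSynchronize();

    printf("y[0] = %f\n", y[0]);                // expect 5.0
    cudaFree(x);
    cudaFree(y);
    return 0;
}
```

As you can see, it reads like C; the driver turns it into whatever the actual chip wants.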

Sadly, the low-level code, the vendor dependency, related proprietary standards, and simply the lack of a robust cross-platform standard are too much of a hurdle to use GPUs for general programming. Even though I have good experience with GPGPU, I use it only if the CPU is really much too slow for a task. Also because the C++ code for the CPU is maintainable, while GPU code might need to be rewritten just because a certain API dies out, for example.

That's really the problem with low-level things. The HW changes too quickly, so the work you do becomes temporary. It works on some HW, but a few years later everything is gone.
Personally I do not love any HW at all, but if I did, I'd still balk at paying such a high price.

Regarding your interest in ‘hardware acceleration’, that's also something standing on wonky legs, imo.
What we really want is the flexibility to do anything. But HW acceleration rules this out. It can't be flexible, because once it does become flexible over time, it is no longer ‘hardware accelerated’ but actually just general-purpose programming. This is what happened with CUDA, compute shaders, etc. People use them to implement their own software rasterizer in compute, because the actual hardware-accelerated fixed-function blocks are too slow for their needs. Current games using UE5 do a lot of software rendering, like back in the 90s.
Programmer beats HW engineer.
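
As a toy sketch of that idea (nothing like a production rasterizer; the points are assumed to be projected to screen space already): one thread per point, writing the nearest depth per pixel with an atomic, instead of going through the fixed-function rasterizer and depth hardware:

```
// Toy "software rasterization" in CUDA: one thread per point, keeping the
// nearest depth per pixel via atomicMin. Real engines rasterize triangles
// and pack depth plus payload into 64-bit atomics, but the principle is the
// same: the programmable cores do the work, not fixed-function units.
__global__ void splatPoints(int numPoints, const float3* pts,
                            unsigned int* depthBuf, int width, int height)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= numPoints) return;

    float3 p = pts[i];                       // already in screen space here
    int x = (int)p.x, y = (int)p.y;
    if (x < 0 || x >= width || y < 0 || y >= height || p.z < 0.f) return;

    // Quantize depth so that a smaller value means closer to the camera.
    unsigned int d = (unsigned int)(p.z * 16777215.0f);
    atomicMin(&depthBuf[y * width + x], d);
}
```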

Isn't that much more awesome and interesting than restricted and boring hardware acceleration? What do you think?

Ok. This was my last attempt to warn you about doing what you want to do. Just had to say it.
Now go and have fun. : )

Yes. It seems like some important details aren't getting across - considering the last couple of persuasive messages by Julian and JoeJ. But it isn't really a big deal. I plan to use modern methods at some point, but right now I'm studying the organization of now-obsolete hardware. I have just always wondered how it worked. The trick is to start with one particular area, because the programming books from back then covered many areas. We could have had a similar conversation about operating systems or local networks.

rei4 said:
I plan to use modern methods at one point but right now I'm studying the organization of now obsolete hardware.

Just remember you also need to get the software of the same era. They are tightly coupled.

Most of it is on floppies, and few are still for sale. But because of emulators it is still possible to find compilers, games, paint programs, and networking software. The trick is that you can't just write an image back to a new floppy disk, as that won't preserve certain electrical characteristics of old MFM-based disks from the 1980s, nor the encoding that came later with 1.2 MB 5¼-inch disks or 1.44 MB 3½-inch disks. I use a CD-RW drive: I extract the files from floppy disk images and burn the files to a CD. Then I can often run the software that way. It usually works. But it is still possible to get a bad floppy image from the Internet Archive or WinWorld PC.
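
For example, here is a rough sketch of how one might peek inside a raw FAT12 floppy image before extracting anything. The file name floppy.img is made up, and it assumes a standard DOS-formatted image with no copy-protection tricks; it just lists the 8.3 names in the root directory as a sanity check:

```
#include <cstdint>
#include <cstdio>
#include <vector>

int main()
{
    FILE* f = fopen("floppy.img", "rb");          // hypothetical file name
    if (!f) { perror("floppy.img"); return 1; }
    std::vector<uint8_t> img;
    uint8_t buf[4096];
    size_t got;
    while ((got = fread(buf, 1, sizeof buf, f)) > 0)
        img.insert(img.end(), buf, buf + got);
    fclose(f);

    // BIOS Parameter Block fields (little-endian) from the boot sector.
    auto u16 = [&](size_t off) { return (uint16_t)(img[off] | (img[off + 1] << 8)); };
    uint16_t bytesPerSector  = u16(11);
    uint16_t reservedSectors = u16(14);
    uint8_t  numFats         = img[16];
    uint16_t rootEntries     = u16(17);
    uint16_t sectorsPerFat   = u16(22);

    size_t rootOffset = (size_t)(reservedSectors + numFats * sectorsPerFat) * bytesPerSector;
    if (rootOffset + (size_t)rootEntries * 32 > img.size()) {
        fprintf(stderr, "unexpected image layout\n");
        return 1;
    }

    // Each root directory entry is 32 bytes: 8-char name, 3-char extension, attributes, ...
    for (int i = 0; i < rootEntries; ++i) {
        const uint8_t* e = &img[rootOffset + i * 32];
        if (e[0] == 0x00) break;                  // 0x00: no more entries
        if (e[0] == 0xE5) continue;               // 0xE5: deleted entry
        if (e[11] == 0x0F) continue;              // 0x0F: long-file-name entry
        unsigned size = e[28] | (e[29] << 8) | (e[30] << 16) | ((unsigned)e[31] << 24);
        printf("%.8s.%.3s  %u bytes\n", (const char*)e, (const char*)(e + 8), size);
    }
    return 0;
}
```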

rei4 said:
But it is still possible to get a bad floppy image from the Internet Archive or WinWorld PC.

There were also security systems that relied on properties of the floppy. These include data written in places other than files; normally ignored data like a floppy disk's serial number; values written to normally unused areas like empty file entries in the file allocation table or the disk boot sector; data in unallocated sectors; or even requirements that specific files be located at specific physical disk locations like a particular track, cluster, or sector. On the more extreme end, sometimes they'd mark sections of the disk as bad/unreadable by the hardware, and those had to match; this prevented a clone or disk image, since the clone would have a readable spot where the corruption was on the original.

They were built so even if you copied all the files, you still wouldn't have everything. A hardware emulator would need to also handle attempts at reading these.

frob said:
There were also security systems that relied on properties of the floppy. These include…

Meh. Spare the effort. It's hopeless.

X-Copy Pro for Amiga - Amedeo Valoroso

:D

(Actually these programs did not always work. I could never copy my GEOS disk which came with the C64, iirc.)

