  • david-gpu 8 hours ago

    I began writing GPU drivers in 2006 and had to support some legacy fixed-function chips from the late 90s at one point.

    I think you and other commenters pretty much summarized what it was like. Documentation was often poor, so you would sometimes have to reach out to the folks who had actually written the hardware (or the simulator) for guidance.

    That is the easy part of writing a driver, even today: just follow the specification. The code in a GPU driver is relatively simple and doesn't vary that much from one generation to the next. In the 90s some features didn't have hardware support, so the driver would do a bunch of math on the CPU instead, which was slow.

    In contrast, the fun part is the times when the hardware deviates from the specification, or where the specification left things out and different people filled in the blanks with their own ideas. This is less common nowadays, as the design process has become more refined.

    But yeah, debugging hardware bugs essentially boils down to:

    (1) writing the simplest test that triggers the unexpected behavior that you had observed in a more complex application, then

    (2) providing traces of it to the folks who wrote that part of the hardware or simulator,

    (3) waiting a few days for them to painstakingly figure out what is going wrong, clock by clock, and

    (4) implementing the workaround that they suggest, often something like "when X condition happens on chips {1.23, 1.24 and 1.25}, then program Y register to Z value, or insert a command to wait for the module to complete before sending new commands" (sketched below).
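
    To make (4) concrete, here's a minimal sketch of what such an errata workaround might look like in driver code. The chip revisions, register name, and values are all invented for illustration; they are not from any real driver:

    ```c
    #include <stdbool.h>
    #include <stddef.h>
    #include <stdint.h>

    #define REG_FIFO_THROTTLE 0x1230u  /* invented register offset */
    #define CMD_DRAW          0xD000u  /* invented draw command    */

    struct chip_rev {
        uint8_t major;
        uint8_t minor;
    };

    /* The errata applies only to a few steppings, e.g. 1.23 through 1.25. */
    static bool needs_fifo_workaround(const struct chip_rev *rev)
    {
        return rev->major == 1 && rev->minor >= 23 && rev->minor <= 25;
    }

    /* Append a draw command, inserting the workaround write first on
     * affected chips: "program Y register to Z value". */
    static size_t emit_draw(const struct chip_rev *rev,
                            uint32_t *cmdbuf, size_t n)
    {
        if (needs_fifo_workaround(rev)) {
            cmdbuf[n++] = REG_FIFO_THROTTLE;
            cmdbuf[n++] = 0x8;  /* magic value suggested by the HW team */
        }
        cmdbuf[n++] = CMD_DRAW; /* the command that trips the bug */
        return n;
    }
    ```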

    It was more tedious than anything. Coming up with the simplest way to trigger the behavior could take weeks.

    Well, that's what it was like to write user mode drivers. The kernel side was rather different and I wasn't directly exposed to it. Kernel drivers are conceptually simpler and significantly smaller in terms of lines of code, but much harder to debug.

  • MisterTea 3 days ago

    You can download the Voodoo 2 programming manual, which is only around 250-something pages long. The Voodoo2 was fixed function: you loaded assets into the Voodoo's memory, then called functions to operate on those assets. The driver takes care of those two roles by loading and managing the assets in the Voodoo's memory and by providing an API to program the registers on the card, which execute functions on the loaded assets. There were more steps involved with geometry processing, which happened in the driver, but I am unsure whether those were handled in user space by the libraries the application called or by the driver code itself.

    This isn't 250-something pages, only 132, so maybe I was wrong, but it's a good look into how the Voodoo2 worked: https://www.dosdays.co.uk/media/3dfx/voodoo2.pdf
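
    The programming model it describes boils down to writing triangle parameters into memory-mapped registers and then kicking a command register. A toy sketch of that style, with invented offsets and names rather than the real Voodoo2 register map:

    ```c
    #include <stdint.h>

    #define REG_VERTEX_AX    0x008u  /* invented offsets */
    #define REG_VERTEX_AY    0x00Cu
    #define REG_TRIANGLE_CMD 0x100u

    static volatile uint32_t *mmio;  /* card registers, mapped elsewhere */

    static void reg_write(uint32_t offset, uint32_t value)
    {
        mmio[offset / sizeof(uint32_t)] = value;
    }

    static void draw_triangle(int32_t ax, int32_t ay /* , ... */)
    {
        reg_write(REG_VERTEX_AX, (uint32_t)ax);
        reg_write(REG_VERTEX_AY, (uint32_t)ay);
        /* ...remaining vertices, color, and texture setup elided... */
        reg_write(REG_TRIANGLE_CMD, 1);  /* launch rasterization */
    }
    ```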

    See also: https://3dfxarchive.com/reference.htm

    A fun tidbit: the Voodoo2 had a 2D mode, but it was not VESA compliant, so it could not be used in a PC without being paired with a VESA card for 2D graphics. I believe that ability was there for custom and non-PC platforms.

    • markus_zhang 3 days ago

      Thanks, that’s interesting. I never owned a Voodoo card but I was definitely drooling over it when I saw a demo machine running Unreal on it.

      And the second link definitely has everything one needs to work with those cards.

    • ferguess_k 3 days ago

      Thanks a lot! I'll take a look. It's really a treasure trove. But I probably need to purchase some Voodoo card to do real work on it.

  • bluedino 6 hours ago

    Graphics programming "in the 90s" encompasses a pretty wide range.

    You had raw access to things like VGA cards, and then different tricks for each individual card when it came to things like SVGA:

    http://qzx.com/pc-gpe/index.php

    You had a few things like UniVBE that tried to create a common platform for using VESA cards:

    https://wiki.sierrahelp.com/index.php/Scitech_UniVBE

    Meanwhile, you had things like video drivers for Windows, usually basic 2D functionality:

    https://www.os2museum.com/wp/windows-9x-video-minidriver-sou...

    https://www.os2museum.com/wp/antique-display-driving/

    In the mid-90s we started getting 3D hardware like the 3Dfx and Rendition cards; each had its own proprietary interface.

    And then, finally, 3D hardware came to be exposed through standard APIs like OpenGL and DirectX.

  • TapamN 8 hours ago

    I created a (currently not publicly released) driver for the 1998 Sega Dreamcast's video hardware from scratch. It supports additional features over the driver in the open source homebrew OS, KallistiOS (KOS), like better render-to-texture support (the KOS driver only supports rendering to framebuffer-sized textures), tile multipass (which allows for accumulation-buffer-style effects or soft shadows), and dynamically toggling anti-aliasing on the fly (with KOS it's fixed after init). Some screenshots of what my driver can do are here: https://imgur.com/a/DyaqzZD

    I used publicly available documentation (like https://www.ludd.ltu.se/~jlo/dc/ and the now defunct dcdev Yahoo Group), looked at the existing open source KOS driver, and looked at the source for Dreamcast emulators to figure out how things worked.

    The GPU in the Dreamcast is a bit more complicated than PSX/PS2/GC since it doesn't accept polygons and draw them directly to the framebuffer. It's a tile-based deferred renderer, like many mobile GPUs, so it instead writes the polygons to a buffer in video RAM, then later walks through the polygons and renders the scene in tiles to an on-chip 32x32 pixel buffer, which finally gets written to RAM once.

    This allows the Dreamcast to have a depth-only fillrate close to the 360 and PS3 (DC is 3.2 GPix/s vs 360/PS3 4.0 GPix/s), and it basically performs a depth-only prepass to avoid doing texture reads for obscured texels. It can also perform per-pixel transparency sorting (order-independent transparency) with effectively no limit on the number of overlapping pixels (but the sorter is O(n^2), so a lot of overlap can become very expensive).

    To get a working driver for the Dreamcast, you have to set up some structures in video RAM so that the hardware knows which polygons are in which tile. The driver also needs to coordinate the part of the hardware that takes polygon commands and writes them to video RAM with the part that actually does the rendering. You typically double buffer the polygons, so that while the hardware is rendering one frame, user code can submit polygons for the next frame to another buffer in parallel.
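
    A minimal sketch of that double-buffering scheme (the hw_* functions and the buffer layout are placeholders, not the real PowerVR2 interface):

    ```c
    #include <stddef.h>
    #include <stdint.h>

    struct poly_buffer {
        uint32_t *vram;  /* polygon data region in video RAM */
        size_t    used;  /* amount written so far            */
    };

    static struct poly_buffer buf[2];
    static int submit = 0;  /* index user code is currently filling */

    /* Placeholders: the real driver pokes GPU registers here. */
    static void hw_start_render(uint32_t *vram) { (void)vram; }
    static void hw_wait_render_done(void) { }

    void frame_end(void)
    {
        /* The buffer we are about to flip into was the source of the
         * previous frame's render, so wait for that render to finish. */
        hw_wait_render_done();
        hw_start_render(buf[submit].vram);  /* kick this frame's render */
        submit ^= 1;                        /* flip: the CPU now fills  */
        buf[submit].used = 0;               /* the other buffer while   */
                                            /* the GPU renders          */
    }
    ```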

    My driver started as just code in "int main()" to get stuff on the screen, then I gradually separated stuff out from that into a real driver.

  • thibaut_barrere 7 hours ago

    There were quite a few books and resources on the topic: Michael Abrash's Zen of Graphics Programming (https://archive.org/details/zenofgraphicspro00abra), Ralf Brown's Interrupt List (https://ctyme.com/rbrown.htm), books like "PC Interdit" in France, etc.

    And online stuff as well.

    Graphics programming without a GPU, and even without an FPU, was quite interesting (here is a realtime-ish Phong rendering I implemented circa 1995, without any floating-point numbers: https://m.youtube.com/watch?v=eq5hzUkOJsk).
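
    The no-float trick is fixed-point arithmetic. A minimal sketch in 16.16 format, as an illustration of the general technique (not the code from the video):

    ```c
    #include <stdint.h>

    typedef int32_t fx;  /* 16.16 fixed point */

    #define FX_ONE    (1 << 16)
    #define INT2FX(i) ((fx)(i) << 16)
    #define FX2INT(f) ((f) >> 16)

    static fx fx_mul(fx a, fx b)
    {
        return (fx)(((int64_t)a * b) >> 16);  /* widen to keep precision */
    }

    /* Diffuse term of a Phong-style shade: max(N.L, 0). */
    static fx diffuse(fx nx, fx ny, fx nz, fx lx, fx ly, fx lz)
    {
        fx d = fx_mul(nx, lx) + fx_mul(ny, ly) + fx_mul(nz, lz);
        return d > 0 ? d : 0;
    }
    ```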

    A lot of stuff can be found online these days.

    Have fun!

  • dapperdrake 2 days ago

    Most of the math and most of the problem domain are essentially the same as today; it's just that the hardware has become many orders of magnitude more capable. That makes all the difference.

    As always: talk to hardware engineers about their hardware. Drivers are involved, because they are all about the hardware. The way that software wants to view the world doesn’t really matter there. (You probably already knew most or all of this.)

    Disclaimer: Didn’t program drivers back then. But I do go deeply into that field of math.

  • Lumoscore 3 days ago

    From what I've seen, a lot of 90s driver work was exactly that mix of partial docs, trial and error with registers, and mailing some engineer at the card vendor hoping they'd admit to a bug. It wasn't glamorous, but it's kind of wild how much of it came down to persistence and a bit of luck.

    • ferguess_k 3 days ago

      Thanks. I bet there were a lot of battle stories like what I read. Alas most of those went into history's garbage bin :/

      I was even thinking about getting my hands on a few cheap physical cards (not sure which ones are cheaper) and a Pentium box, to see if I can do anything -- even displaying some colors is fun enough.

  • jacquesm 5 hours ago

    Funny, I've been reviving an OS I wrote in the 90s, as well as the graphics package, window manager, and a couple of demo apps that I wrote to go with it. It's been a long and arduous trip down memory lane, but after a couple of weeks of work I have it working under QEMU or VirtualBox. The thing that still befuddles me is how to get the COM ports to work; for some reason they really don't emulate those well in either one of those.

    Yesterday evening I finally got VGA modes 16 and 18 working, so now I have 640x350 and 640x480 graphics. Major milestone, and lots of fun to get it working this far. The last time it ran was 26/Jan/1994 so it's been mothballed since then and the old hardware is long gone.

    As for how it is done: you use the BIOS to switch modes, then you have some basic info about the memory map of the video mode, and usually a page register where you can set which part of the memory on the graphics card is exposed in a window starting at A000:0000. Later graphics modes, such as VESA-based ones, used a slightly different mode switch and tended to offer the option to address all of the RAM in one contiguous stretch, doing away with the bank switching, which greatly improved low-level performance.
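
    A rough sketch of that flow, assuming a period real-mode compiler like Borland or Open Watcom; mode 0x13 (320x200, 256 colors) is used because its framebuffer fits entirely in the 64 KB window at A000:0000, so no bank switching is needed:

    ```c
    #include <dos.h>

    static void set_mode(unsigned char mode)
    {
        union REGS r;
        r.h.ah = 0x00;  /* BIOS int 10h, function 0: set video mode */
        r.h.al = mode;
        int86(0x10, &r, &r);
    }

    int main(void)
    {
        unsigned char far *vram = (unsigned char far *)MK_FP(0xA000, 0);
        int x;

        set_mode(0x13);
        for (x = 0; x < 320; x++)
            vram[100 * 320 + x] = (unsigned char)x;  /* a color ramp */

        set_mode(0x03);  /* back to 80x25 text mode */
        return 0;
    }
    ```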

    All of the graphics primitives were in software, no HW acceleration at all.

    • toast0 14 minutes ago

      > The thing that still befuddles me is how to get the COM ports to work; for some reason they really don't emulate those well in either one of those.

      What are you having trouble with for COM ports? Checking status (port + 5) and interrupt identification (port + 2) might help if you're missing interrupts?
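
      For what it's worth, a hypothetical polling sketch of those two checks on a 16550-style UART at the standard COM1 base (inb/outb are left as externs, since port I/O is OS-specific):

      ```c
      #include <stdint.h>

      #define COM1_BASE 0x3F8
      #define LSR (COM1_BASE + 5)  /* line status register          */
      #define IIR (COM1_BASE + 2)  /* interrupt identification reg. */

      extern uint8_t inb(uint16_t port);          /* OS-specific */
      extern void outb(uint16_t port, uint8_t v); /* port I/O    */

      int com1_getc(void)
      {
          if (inb(LSR) & 0x01)  /* LSR bit 0: data ready */
              return inb(COM1_BASE);
          return -1;            /* nothing waiting       */
      }

      void com1_putc(uint8_t c)
      {
          while (!(inb(LSR) & 0x20))  /* LSR bit 5: THR empty */
              ;
          outb(COM1_BASE, c);
      }
      ```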

  • JohnFen 3 days ago

    During that time, I had a job for a major games company doing nothing but developing Windows graphics card drivers. They were moderately complex beasts (enormously complex compared to other device drivers), but not really that huge of a thing.

    The biggest effort was reverse-engineering certain cards. The games often used very strange video settings, and the card manufacturers had poor, sometimes nonexistent, documentation about their operation at a low level.

    • ferguess_k 3 days ago

      Thanks for sharing. I'm surprised that game companies would do that. Was it the Voodoo era or slightly earlier (S3)? Does that mean you actually shipped drivers with games? What did the card manufacturers say about this?

      • JohnFen 2 days ago

        Earlier. The standard video drivers didn't support the video modes needed for more advanced video game graphics, making custom drivers necessary. The card manufacturers didn't mind. Why would they? If a hit video game supported their card, they'd sell more cards. Even better when they didn't have to put any substantial resources into that support.

        • prox 8 hours ago

          Do you still write video drivers, or is that mostly covered now, like with Nvidia's own drivers?

          • JohnFen 7 hours ago

            It was a long time ago. The trajectory of my career took me in an entirely different direction.

  • vertnerd 6 hours ago

    Fun link. I never wrote GPU drivers, but it does remind me of writing my first Ethernet card driver back in the day. I felt like I had decoded the Rosetta Stone, and there was absolutely no one to talk to who understood how that felt.

  • trollbridge 6 hours ago

    Writing graphics card drivers in the 1990s was a nightmare, and often the most complex part of an operating system, particularly ones that supported protected mode and virtual DOS machines.

    Windows/386's graphics card driver was so integrated with the protected-mode kernel that it had an entirely different kernel for each graphics card. It came out of the box with support for CGA and EGA, and a later version added Hercules, VGA, and 8514/A. That was it.

    By Windows 3.0, it had a more modular architecture - but this made the drivers even harder to write. A driver that would run in user space had to be written in 16-bit assembly or C. And then a secondary driver that would run in kernel space had to be written in 32-bit assembly, and that driver had to completely virtualise the device for DOS apps (and also had to provide a virtual device the aforementioned 16-bit driver could use).

    Drivers for OS/2 got even more complex: the above 16-bit Windows driver was still needed. Plus a 32-bit protected-mode driver for OS/2 apps. Plus a "base" 16-bit driver for character mode when the system was booting, although most cards could use the generic VGA or SVGA driver for that. Plus the 32-bit virtualisation driver for DOS apps. Plus a specialised 16-bit Windows driver that would share the display with the 32-bit protected-mode driver so that both Windows and OS/2 apps could be on the display at the same time.

    XFree86 was the usual way to support graphics cards on Linux, alongside very simple kernel-based drivers for character support - or more complex drivers for framebuffer support. If you wanted to run XFree86 on OS/2, you had to have a separate executable (just like Linux) for the specific graphics driver. So an OS/2 machine could end up having this many graphics drivers:

    - Base video driver, e.g. BVHSVGA
    - 16-bit Windows full screen driver
    - 16-bit Windows seamless driver
    - 32-bit OS/2 Presentation Manager driver
    - 32-bit virtualisation driver
    - 32-bit XFree86 driver

    NT would be yet another driver, although at least on NT it was just a single kernel-space driver, and Windows 95 yet another. Windows 95 needed the same virtualisation driver as prior versions; NT didn't typically provide that, and was notorious for poor DOS game support because of it.

    Some vendors actually supported all of these. You'd get a couple disks with a graphics card for GEM, DOS (VESA), OS/2 (16-bit and 32-bit), Windows 2.x, 3.x, NT 3.x, and even SCO Unix. The workload to write all these drivers was insane.

  • justincormack 7 hours ago

    SGI ran OpenGL in hardware on their cards.

  • PaulHoule 3 days ago

    These were pretty proprietary, I remember.

    • ferguess_k 3 days ago

      Yeah, I think that was the case, and it's still the case for many companies (nVidia). From what I briefly looked up, the good thing is that we can now develop drivers for virtual graphics cards, and there are OSS drivers from both Intel and AMD.

  • Eisenstein 7 hours ago

    Not sure if this helps you, but John Carmack wrote a guide to writing GPU drivers for Quake 3 Arena back in '99:

    * https://web.archive.org/web/20000303225420/http://www.quake3...

  • contingencies 8 hours ago

    In that era I remember I had one computer with a particular Oak(?) VGA card and it ran Heretic at something like 2x the framerate of any other system. No idea why. Might have been the motherboard or the memory speed or something. Never got to the bottom of it. Total mystery.

    • fredoralive 6 hours ago

      Assuming other factors (CPU etc) are similar, possibly the card used a faster bus like VESA local bus or PCI whilst the others were plain ISA?

      • contingencies 3 hours ago

        Pretty sure this was the ISA era, so yeah, VLB would make sense. While I don't recall it being any longer than regular cards, I just checked and that was only a 12mm difference. On the face of it you may have just solved the mystery, but from talking to the AI, the supposedly faster chipsets of the time, like S3, Cirrus Logic, ET4000, etc., were definitely on my radar ... so I'm not sure that's the reason, because this system definitely outperformed others. My running theory was that the developer hand-tweaked their graphics code on a similarly configured system. In the absence of any hard info (honestly, I can't really trust my memory after ~30 years!) I guess that'll remain the guesstimate explanation.