It's probably just me being out of touch, but I don't think the GeForce RTX 4000 or 5000 series really mattered/matters that much.
At the same time I'd add the S3 ViRGE and the Matrox G200. Both mattered a lot at the time, but not long term.
Or the S3 Savage3D, which, while being inferior to the TNT2, pioneered texture compression.
https://en.wikipedia.org/wiki/S3_Texture_Compression
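To make the pioneering concrete: S3TC (later standardized as DXT1/BC1) packs each 4x4 texel block into 8 bytes, two RGB565 endpoint colors plus 32 bits of 2-bit palette indices. A minimal decode sketch in Python, following the published block layout (illustrative only, not driver code):

    import struct

    def decode_dxt1_block(block: bytes):
        # one 8-byte DXT1/S3TC block -> 4x4 grid of RGB tuples
        c0, c1, bits = struct.unpack('<HHI', block)

        def rgb565(c):
            # expand packed 5:6:5 color to 8 bits per channel
            r, g, b = (c >> 11) & 0x1F, (c >> 5) & 0x3F, c & 0x1F
            return (r * 255 // 31, g * 255 // 63, b * 255 // 31)

        p0, p1 = rgb565(c0), rgb565(c1)
        if c0 > c1:
            # 4-color mode: two extra colors interpolated between the endpoints
            palette = [p0, p1,
                       tuple((2 * a + b) // 3 for a, b in zip(p0, p1)),
                       tuple((a + 2 * b) // 3 for a, b in zip(p0, p1))]
        else:
            # 3-color mode: midpoint color, last slot doubles as 1-bit alpha
            palette = [p0, p1,
                       tuple((a + b) // 2 for a, b in zip(p0, p1)),
                       (0, 0, 0)]
        # 2-bit indices select palette entries, row-major from the LSB
        return [[palette[(bits >> (2 * (4 * y + x))) & 3] for x in range(4)]
                for y in range(4)]

A fixed 4 bits per texel regardless of content is a big part of why the scheme survived in hardware for decades.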
Recency bias, probably. IIRC the 3000 and 4000 series did make significant improvements in RT performance, so compared to the 2000 series they're far more useful today.
Matrox G200 GPUs came integrated with servers for absolute ages, like well past the 2010s.
The G200 mattered to some degree for a long time, because most x86 servers up until a few years ago would ship a G200 implementation or at least something pretending to be a G200 card as part of their BMC for network KVM.
Like virtualized NICs pretending to be an NE2000? That's interesting, do you know why they'd use a G200 and not something like an older ATI chip?
Probably started out as a real G200 chip, which might've been the cheapest and easiest to integrate in the 2000s? Or it had the I/O features needed to support KVM (since this would've involved reading the framebuffer from the BMC side), or Matrox was amenable to adding them.
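For the KVM-over-IP part, the concept is just framebuffer scraping: the BMC periodically reads whatever the G200 is displaying, compresses it, and ships it to the remote client. A toy sketch of the idea against Linux's standard fbdev interface (illustrative only; grab_framebuffer is a made-up helper, and a real BMC reads the framebuffer on its own side of the chip, not through the host OS):

    def grab_framebuffer(fb='/dev/fb0'):
        # snapshot one raw frame, standing in for what BMC KVM firmware does
        with open('/sys/class/graphics/fb0/virtual_size') as f:
            width, height = map(int, f.read().split(','))
        with open('/sys/class/graphics/fb0/bits_per_pixel') as f:
            bpp = int(f.read())
        with open(fb, 'rb') as f:  # needs permissions on the fb device
            raw = f.read(width * height * bpp // 8)
        return width, height, bpp, raw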
A lot of GPUs in this list are basically just the previous GPU but faster or with more RAM. I kind of thought it was going to focus on interesting new architecture innovations.
Like the PS3? Seems like everything is using PC architecture now. It does have RDNA.
I think pairing the RX 5700 XT with Control as the "defining game" is an interesting choice, considering that 1. AMD cards were incapable of RT at the time, and 2. Control was basically the first game with a good, comprehensive RT implementation that had a massive positive impact on the graphics.
> massive positive impacts on graphics
I remember the main noticeable difference being ray-traced reflections. However, that was mostly on immovable objects in extremely simple scenes (an office building). Old techniques could've gotten 90% of the way there using cubemaps (sketched below), screen-space reflections, and/or rasterized overlays for dynamic objects like player characters. Or maybe just completely rasterize the reflections, since the scenes are so simple and everything is flat surfaces with right angles anyway. It might've even looked better, because you don't get issues with shaders written for a rasterized world showing up on reflected objects.
Games that heavily advertise raytracing typically don't use traditional techniques properly at all, making it seem like a bigger graphical jump than it really is. You're not comparing to a real baseline.
Overall that was pretty much the poorest way to advertise the new tech. It's much more impressive in situations where traditional techniques struggle (such as reflections in situations with no right angles or irregular surfaces).
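To make the cubemap point concrete: the technique boils down to reflecting the view direction about the surface normal and using the result to index a pre-rendered six-face environment map, which is one reflect() plus one texture lookup in a shader. A toy sketch of the math in Python (function names are illustrative):

    def reflect(view, normal):
        # r = v - 2*(v . n)*n, assuming a unit-length normal
        d = sum(v * n for v, n in zip(view, normal))
        return tuple(v - 2 * d * n for v, n in zip(view, normal))

    def cubemap_face(r):
        # the component with the largest magnitude picks which face to sample
        ax = max(range(3), key=lambda i: abs(r[i]))
        return ('+' if r[ax] > 0 else '-') + 'xyz'[ax]

The catch is that the cubemap is pre-rendered and static, which is exactly why the approach breaks down on dynamic objects and irregular scenes, the cases where RT genuinely earns its cost.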
The other elephant in the room is the consoles: even if they're capable of RT, developers have to weigh the performance cost against the visual payoff. As I see it, the PC versions of games like Control from studios like Remedy are trailblazers. It's an early implementation (the GeForce 20 series released in 2018, Control in 2019), shipped as the ultra option to shake down the implementation and start iterating early so future games benefit; the baseline, however, is non-RT.
I had the Voodoo 1 with VGA passthrough from the 2D card. When you loaded a game you'd hear a little clunk from a relay on the Voodoo taking over the VGA signal, and you knew you were about to have a good time. Doesn't seem that long ago!
Honorable mention, the Rendition Vérité 1000 https://fabiensanglard.net/vquake/index.html
Released before the Voodoo 1, with vQuake and native Tomb Raider support.
Very interesting culture difference between Rendition and 3dfx in their chip design approaches.
Agreed, those early manufacturers/models that experimented more feel more relevant than the more incremental listings of multiple 2000, 3000, and 4000 series Nvidia GPUs.
Absolute nostalgia fever. About a month ago, I dug up an old desktop in the corner, took the drives out and gave away the machine. It felt like putting a racehorse to pasture: i7-4790k, 1080 Ti. It was my dream machine when I got it. Dual-boot (as we did back in the old days when Proton wasn't here) to Ubuntu, then Elementary, then Arch. By the time I gave it away it wasn't worth the power cost.
And that brought to mind my older dream machine, an 8800 GT from generations past. Before that we made do with a VIA UniChrome that worked well enough on the OpenChrome driver that I could edit open-source software (Freespace only needed a few constants changed) so it would render; some of the image was smeared and so on, but I could play!
I'm still rocking a Z97, i7-4790k and a 980 Ti :) I'm still waiting until I need an upgrade. DDR3 still performs well enough for the games I run.
I was running a 970 Ti for the longest time; it was only when I wanted to get into some VR gaming that it was time for an upgrade.
Same. Still play StarCraft2 on a 4790k and AMD R9 Fury X.
I also have that exact setup sitting around, but am just using my ryzen laptop now.
I used my 1080 Ti for about eight years. The successor GPU is in some ways way faster (ray tracing, AI features, etc.), but in others quite stagnant considering the huge stretch of time that passed between them. Roughly 10 years for 2-3x performance, at higher nominal and real price points, shows how slow silicon advances have become compared to the 90s and 2000s; the same span from 2000 to 2010 would've seen 1000x the performance, if not more. The difference between a 1080 Ti and a more expensive RTX 50 card is that the RTX can render ideally triple the frames in synthetic benchmarks, double the frames in some rasterizing games (most games won't see gains that high), and do a few relatively tame ray-tracing tricks at performance levels that are still not really good. At the same throughput it consumes maybe half the power or a bit less. The difference between a GeForce 2 and, e.g., a Radeon HD 4000 series card is several planes of existence.
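Putting rough numbers on the pacing claim above (the ratios are the debatable estimates from this comment, not measurements):

    def yearly_factor(total_gain, years):
        # annualized growth implied by a total gain over a span of years
        return total_gain ** (1 / years)

    print(yearly_factor(2.5, 8))    # ~1.12x/year, 1080 Ti -> RTX 50 raster guess
    print(yearly_factor(1000, 10))  # ~2.0x/year, the claimed 2000-2010 pace

Even if the endpoint ratios are off by a factor of two, the gap between the two eras barely moves.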
Hey, I could have used that i7-4790k!
I've been running the worst gaming setup I can get away with, which atm is a 3080 10GB, using random DDR3 RAM, a budget WD 512GB SSD, and an i5 on the same socket as the i7-4790k that doesn't even support hyperthreading and can't run more than 4 threads in parallel.
It's absolutely laughable at this point, but I'm unironically looking for a deal on that cpu lmao, it would be a huge upgrade.
The 8800 GT is easily the most impactful GPU in my mind. The combination of that video card with Valve's Orange Box was an insane value proposition at the time.
I'd put the 5700 XT at #2 for being the longest-lived GPU I've owned, by a very wide margin. It's still in use today.
Came here for this omission. I saved up for a long time to get an 8800 GTX, and I had that card for 5 years before upgrading again.
I retired my 5700 XT a few years ago. Wasn't there some kind of hardware problem with it? It kept locking up my Linux kernel.
Still using my RX 5700 XT. The amdgpu driver had a major issue resuming from suspend a few months ago[0], but other than that, I'm not aware of (nor have I experienced) any stability issues. Maybe you had a bad card.
0: https://gitlab.freedesktop.org/drm/amd/-/issues/4531
I don't like to spend much on hardware, so I bought a 5700 XT a few years ago and run a "steam machine" of sorts. Never had any Linux-related problems.
I know sheets.works was made with an agent; still, good taste in the design.
Those mattered in the PC's evolution, but it misses many others, e.g. the TMS34010.
https://en.wikipedia.org/wiki/TMS34010
We had the Riva TNT2 in our family computer, so it was fun to see it again. I think it was paired with an AMD K6-2 chip.
One day one of my friends from school wanted to optimize the airflow in our computer and re-did the cabling, but he managed to block the CPU fan from spinning. I'm not sure how, but we didn't realise it for a couple of months.
When I got my own PC, it had an AMD Barton chip, and it allowed me to play Half-Life 2.
Missing the Radeon RX Vega 64!
I don't see my first GPU on there, it was the humble GeForce4 MX440. It could run almost any game I cared about for a surprisingly long time, even if it's not a true modern card. These days almost all my machines are on iGPUs baked into the CPU. There's way less fun for me, but they are a lot more compact at least.
That will probably be my next GPU.
I'm on a 3060 currently and the changes in the 4xxx and 5xxx just aren't appealing to me. As soon as iGPUs get 3060 performance I'll probably switch. And they aren't far off.
The MX440 is a nearly 25-year-old GPU; it performed somewhere between a GeForce2 and a GeForce3 Ti 200.
It was a good budget option those decades ago.
Yes the MX440 deserves to be on this list. More important than the GeForce2 imo.
The 9400 GT mattered to me as it was my first GPU. I had bought NFS Carbon only to find that the home PC only had a CD drive, not a DVD drive, lol. So along with that drive upgrade finally came the 9400 GT, and fun ensued.
I really want to see TDP over time.
If I can at least tell myself that our technological achievements come with efficiency gains instead of just upping power throughput, I can rest a little better.
Here's one anecdotal datapoint:
About a decade ago, I discovered that the HD 530 iGPU included with my budget-oriented i3-6300 CPU performed better than the physically impressive SLI pair of 9800 GTs I had been using, at something like 1/10th the power consumption (rough math below).
(It didn't do PhysX, but nobody cared.)
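Back-of-envelope on that power claim, using rated TDPs rather than measured draw (the 9800 GT was rated around 105 W; the HD 530's share of a desktop Skylake package is a guess):

    SLI_WATTS  = 2 * 105   # two 9800 GTs at rated TDP
    IGPU_WATTS = 25        # rough guess for the HD 530's share of the package

    # assuming roughly equal frame rates, as observed
    print(SLI_WATTS / IGPU_WATTS)  # ~8x better perf/watt for the iGPU

Close enough to the 1/10th figure above, given how fuzzy the inputs are.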
I have fond memories of borrowing a Voodoo 2 from a friend when I was moving from a 486 to a K6-based system, component by component. At that time I was still using my old ISA VGA card, which meant 2D performance was horrible and I couldn't really watch videos on that thing, but thanks to the Voodoo I could play Unreal Tournament without problems.
This brings back so many memories. I remember how badly I wanted a GeForce 6800. Sadly, I was never able to justify spending that much money on a GPU. Still holds true, even today.
I had the 6600 GT, insane price-perf ratio, kept it for like 8 years
Ah, I was just trying to remember the model names last week and this website pops up like magic; weird how the internet works sometimes. The 560 Ti was a dream for teenage me and most of my friends back then, but I must say my Radeon HD 4870 powered most of my favourite Team Fortress 2 years.
Yeah, the 560 Ti was insanely popular in my group of friends. Around 2004 there were a good number of FX 5700s, some people struggling on GeForce 4s, and some on FX 5900 Ultras. Some were upgrading every two years, some closer to every four. When the 560 Ti came out, everyone got it.
I don't understand this - where's Trident VGA?
Surprised PUBG was the defining game for so many. I don’t recall it being a demanding one.
My old GTX770 sitting in a drawer somewhere appreciates this post.
Worth noting this covers consumer gaming GPUs only — the cards most of us are nostalgic about, but a different lineage than what actually drives Nvidia's revenue today. That said, gaming silicon is where most of the foundational architecture innovations originated: unified shaders, async compute, hardware ray tracing all debuted on consumer cards before being repurposed for datacenter workloads. The H100 exists because of the engineering path that ran through the 8800 GTX and Volta Titan. A companion visualization of "every GPU that mattered for AI" would be much shorter and start much later.
Not a very good list; from a historical perspective it's missing many important cards, as mentioned by others.
Also, the term "GPU" did not exist until 1999.
Looks like this was created for engagement.
1999? You sure?
The point is that Nvidia popularized the term, I'd guess.
Nvidia called the GeForce 256 the first-ever GPU.
This is such a cool visualization. Thanks for creating it!
The title of the site should probably have "for gaming" at the end, as it doesn't consider GPUs for compute, such as the A100 or the GTX 580 3GB that AlexNet was trained on.
You all fell for a marketing site for https://sheets.works.
I have to say that this site is complete low-effort slop.
not the whitehouse.gov design language
Oh, my beloved TNT2 Ultra.
mine too
I was so sad when I retired my 1060 6GB. That thing served me well for almost a decade.
Gaming GPUs only, which are the ones we're all nostalgic about, but hardly the ones that matter for Nvidia now.
I see it as similar to virtual reality: it was born and grew up with gaming demands and influences, but other disciplines may be more attractive for a mature product.
Turns out corporations and governments can pay way more than individuals.
Missed the Voodoo 5 5000, which laid the groundwork for NVLink.
> We build visual stories like this for companies
Combined with the color scheme of this site, this might be a cleverly disguised Nvidia ad.
Edit: Clicking through to their main page [1]: yeah, that's definitely an Nvidia ad.
1: https://sheets.works/data-viz/hire
I made this, and it's not an ad. I chose Nvidia colours, thinking that a GPU website should seem familiar.
You seem to be affiliated with sheets.works, so it appears to be an ad for that site then.
I noticed that the list seemed a little Nvidia-heavy when there were other cards that absolutely deserved a mention in the earlier years.
I don't think there's strong evidence of this being an ad. I was surprised to see the Intel Arc A770, a GPU I've never heard of, included on this list. I think it's just that Nvidia has been the dominant force in consumer-level GPUs for a while now.
> I don't think there's strong evidence of this being an ad.
There is strong evidence. Click on the link above. It was posted by a viral marketing company. They even feature the GPU story on their website: https://sheets.works/data-viz
> I was surprised to see the Intel Arc A770, a GPU I've never heard of, included on this list.
Yes, because otherwise the ad would be too obvious.
I think it's a terrible UI: it requires three different actions to see the GPUs. You scroll down to reach the Era buttons (the page then scrolls and hides them even if you have enough vertical screen space), click an Era button, and click the < > buttons to page through the GPUs of that era.
I can't remember the last time I saw such a confused design.
Appreciate the feedback, fixed it
Why didn't datacenter GPUs make the list? AI trained on them is such a significant part of computing today.
Because consumers don't care about them, probably. They're never going to be remembered fondly like gaming cards.
The website is called "Every GPU that mattered". The GPUs that trained AlexNet, GPT-1, and GPT-2 are probably the most consequential GPUs in compute history.
Sure, I just explained why they probably aren't there. Every GPU that gamers cared about isn't as catchy, I suppose.
The reason datacenter cards even exist is gaming GPUs; gaming basically funded GPU development up to the point of the AI explosion.
So no, the most important AI card isn't an AI card; it's the gaming GPUs that funded that mess.
>No RX480
Hard pass.