Intel's CEO wanted to buy NVidia 20 years ago. Unfortunately for Intel, the board of directors killed the proposal as "too expensive" at $20B. Granted, no one could see the future of AI on GPUs, but think of the ROI on that investment.
Nvidia would've acquired the culture of Intel, so it wouldn't be the company it is today if that had happened.
Not sure that would have mattered once LLMs emerged. NVidia's success is mostly a matter of being the only major advanced GPU player at the time LLMs exploded onto the scene. That won't be the case for long. The big players are already heavily investing in high-performance AI chips to wean themselves off a dependence on NVidia.
Nvidia’s success is due to a decade+ of investment into their software stack (CUDA and friends), built on a world view that more software can and should be massively parallel.
They were a key cause for LLMs being a thing in the first place.
This. I worked in HPC between 2010 and 2013, and people were trying to compete with NVidia in the GPGPU space even back then, seeing the way the wind was blowing. Compute bandwidth and FLOPS/watt have steadily become more and more important over the last 15 years, even before AI.
NVidia has continued to stay ahead because every alternative to CUDA is half-baked trash, even when the silicon makes sense. As a tech company, trading $$ for time not spent dealing with compatibility bugs and broken drivers pretty much always makes business sense.
> half baked trash
> dealing with compatibility bugs
> broken drivers
Describes my experience trying to use CUDA perfectly.
We have a long way to go and we haven't even started yet.
Given that experience, think about what state the alternatives must be in!
To be fair, CUDA has improved a lot since 2014 or so. I messed up my Linux box multiple times trying to install CUDA, but the last time it was just an apt install and maybe setting LD_LIBRARY_PATH, and it all just worked.
It’s pretty bad. Just when you think you can order AMD chips, since there is no shortage, use a translation layer, and have a cheap AI datacenter, it turns out AMD is fumbling the ball at every step of the way.
More than a decade.
I was using it as soon as it came out in 2007, and my dinky desktop workstation was outperforming the mainframe in the basement.
> a world view that more software can and should be massively parallel.
Twenty years ago I was thinking we'd be speccing machines in kilocores by now.
https://en.wikipedia.org/wiki/Connection_Machine
4090 has 16384 cuda cores, so there's that!
Nvidia's marketing is misleading. Those "CUDA cores" are more SIMD lanes than cores. The number of SMs is a more appropriate equivalent to a CPU core count.
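A rough sketch of the arithmetic behind this, using the published RTX 4090 spec figures (128 FP32 lanes per SM on Ada; numbers for illustration only):

    # Nvidia's "CUDA core" count is really SMs x SIMD lanes.
    cuda_cores = 16384    # the marketing number for the 4090
    lanes_per_sm = 128    # FP32 lanes ("CUDA cores") per SM on Ada
    sms = cuda_cores // lanes_per_sm
    print(sms)            # 128 -> the CPU-core-like count is 128 SMs

So the CPU-comparable figure is 128 SMs, not 16384 cores.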
128 and 512 cores are normal now, so not that far off.
I think this undersells the story. NVidia's success is built on innovating and scaling for 20 years. Vastly oversimplifying it:
- CUDA's conception in 2006 to build supercomputers for scientific computing.
- CUDA influencing GPU designs to be dual-purpose, with the major distinction being RAM amounts (scientific compute needs a lot more RAM than gaming).
- The crypto craze driving extreme consumer GPU demand, which enabled them to invest heavily in R&D and scale up production.
- AI workload explosion arriving right as the crypto demand was dying down.
- Consistently great execution, or at least not making any major blunders, during all of the above.
As someone who's intimately familiar with both cultures, I'm convinced Intel would have killed any innovation Nvidia had going for it before it had a chance to really take off.
Management at the two could not be more opposite if they tried.
Intel's third-wave management-by-MBA would have ruined Nvidia way back then, forcing them into something dumb and blocking all the R&D they've done since, because that's what they also did at Intel.
Other companies are capable of making big GPUs; they aren't the only TSMC customer. Intel themselves have perfectly fine GPUs. Their issue is that their management never allocates enough space in their chips to let them perform well.
Nvidia's advantage is that they have by far the most complete programming ecosystem for them. (Also honestly… they're a meme stock.)
Other companies don't have CUDA.
You are completely ignoring the impacts of Covid and GPU cryptocurrency mining on Nvidia profits.
It could have been AMD/ATI profiting from such random events, as they were the ones that financed AI development.
This seems to have cause and effect backwards. LLMs emerged because of Nvidia.
LLMs absolutely did not emerge because of NVIDIA. You're the one, IMHO, who is mistaking correlation with causality.
The first transformer models were developed at Google. NVIDIA were the card du jour for accelerating it in the years since and have contributed research too, but your statement goes way too far
I was around the space before Alexnet came out. Without NVidia the last 15 years of AI would not have happened.
Deep learning was only possible because you could do it on NVidia cards with Cuda without having to use the big machines in the basement.
Trying to convince anyone that neural networks could be useful in 2009 was impossible - I got a grant declined and my PhD supervisor told me to drop the useless tech and focus on something better like support vector machines.
I was around then too and people forget that AMD also had GPU compute APIs.
The difference is AMD killed theirs to use OpenCL and NVIDIA kept CUDA around as well.
Just like how ROCm is supposed to be competitive today and it isn't unless you have an army of grad students to look after your data center cards.
I tried using AMD Stream and it lacked documentation, debugging information, and most of the tools needed to get anything done without a large team of experts. NVidia, by comparison, could be - and was - used by single grad students on their franken-stations, which we built out of gaming GPUs.
The less we talk about the disaster that was the move to OpenCL, the better.
I agree ROCm was a mess and has only recently become really usable.
That said, ROCm is quite a recent thing, born out of acknowledging that OpenCL 2 was a disaster.
OpenCL 1 had a reasonable shot and was gaining adoption but 2 scuppered it.
Did you ever train anything past mnist on AMD? I try every five years or so and promise myself never again each and every time.
Not with ROCm, since I've moved my personal stack to NVIDIA (for rendering) and Macs for day-to-day use.
I did write quite a bit of OpenCL prior to that on Intel/AMD/NVIDIA, both for training and for general rendering, though, and did some work with Stream before then.
Was it OpenCL 1? That's the only one I hadn't tried out for AMD GPUs. Everything else I have, and I can say with absolute certainty that you spend more time fighting the hardware than writing code.
CUDA, by comparison, JustWorks™.
Both 1 and 2. I haven't done much with 3, as OpenCL is effectively a dead API at this point.
1 was definitely a lot easier to work with than 2. CUDA is easier than both but I don’t think I hit anything I could do in CUDA that I couldn’t do in OpenCL, though CUDA of course had a larger ecosystem of existing libraries.
Damn, I missed the best one.
Thanks for the trip down memory lane.
I think their point is that easily accessible hardware capable of supporting the research is the reason the research has come as far as it has, and I would tend to agree. At the very least, GPUs keeping the PCI ecosystem going has played a major role in allowing the specialized accelerator market to flourish.
But that could apply to any of the GPU manufacturers. CUDA made for an easier ecosystem but if it didn’t exist it would have been any of the other APIs.
The first transformer models didn’t even use CUDA and CUDA didn’t have mass ecosystem inroads till years later.
I’m not trying to downplay NVIDIA but they specifically mentioned cause and effect, and then said it was because of NVIDIA.
First transformer != LLM. Imagine a world where you had to use AMD or CPUs: there would be no AlexNet, and there would be no LLMs. Nvidia seeded universities with gifted hardware accelerators for over a decade. Nvidia built the foundations of modern ML on which the transformer rests; it's just one step on a long road to LLMs.
AMD had GPU compute APIs at the time as well. They also used to contribute GPUs (albeit in much smaller quantities). They just ended up killing them in favor of OpenCL which then withered on the vine.
NVIDIA absolutely contributed to the foundation but they are not the foundation alone.
Alexnet was great research but they could have done the same on other vendors at the time too. The hardware didn’t exist in a vacuum.
OpenCL itself is kinda fine, but the libraries never existed, because every academic from postdoc up could get a free NVidia card. You literally filled out a form and they sent you one.
I agree the library situation for OpenCL never stood up. Khronos tried at the start but then lost steam.
It’s one of the projects I think the Khronos group mishandled the most unfortunately.
Do they have a project that achieved success?
OpenGL, Vulkan, WebGL?
Alexnet could not have been done on AMD, nope.
Please, this is just revisionism. There’s nothing inherent to AlexNet that relied on NVIDIA hardware or even software. It was just what happened to be available and most straightforward at the time.
To say it wouldn't have been possible on AMD is ludicrous, and there is a pattern to your comments where you dismiss other companies' efforts and capabilities but are quite happy to lay all the laurels on NVIDIA.
The reality is that multiple companies and individuals got us to where we are, and multiple products could have done the same. That's not to take away from NVIDIA's success, it's well earned, but if you took them out of the equation, there's nothing that would have prevented the tech existing.
The crux of the issue is this:
> It was just what happened to be available and most straightforward at the time
AMD made better hardware for a while and people wanted OpenCL to succeed. The reason why nvidia became dominant was because their competitors simply weren’t good enough for general purpose parallel compute.
Would AI still have happened without CUDA? Almost certainly. However, Nvidia still had a massive role in shaping what it looks like today.
> The first transformer models didn’t even use CUDA and CUDA didn’t have mass ecosystem inroads till years later.
I graduated college in 2010 and I took a class taught in CUDA before graduating. CUDA was a primary driver of NN research at the time. Sure, other tools were available, but CUDA allowed people to build and distribute actually useful software which further encouraged the space.
Could things have happened without it? Yeah, for sure, but it would have taken a good deal longer.
Nvidia will make general-purpose GPUs while the AI players will make ASICs. I guess most data centers will prefer GPUs so customers can run whatever AI model they need.
Surely Intel would have killed off CUDA.
100% on target. The Intel culture was a big contributor to Intel's fall.
And the danger of Nvidia buying Intel is the same.
See also MD/Boeing
Nvidia buying Intel would mean some damn fast CPUs. I'd like to see that.
And most likely the death of Intel being one of the largest contributors to the Linux kernel.
Nvidia buying Intel would be doing Intel a great service.
Can you expand on what is the Intel culture and why it would hurt growth?
Some of my favourite reading on this topic is Matt Pharr's retrospective on ISPC: https://pharr.org/matt/blog/2018/04/30/ispc-all
And if the Intel board jumped on every expensive acquisition novelty that came to the table, how does that go?
Every day I look at stocks and wish I'd known to buy the ones that went up today.
Overall, pretty uninteresting and uninsightful, unless you have Doc Brown's DeLorean with the Mr. Fusion upgrade (BTTF II). Even then, would it actually be good if you built your own reality to such an extent? I'm sure life's weird for the UHNWs; unclear if it's actually better. We're all still stuck on the same planet, and our kids and kids' kids will all face the same struggles.
Even still, today is a pretty interesting temporal location to occupy!
Not gonna say they jumped on every one, but… McAfee, Havok(!), Infineon's wireless business, Mobileye, etc.
Yeah, McAfee..
https://youtube.com/watch?v=bKgf5PaBzyg
metadat slowly saunters off into the nearby foggy embankment, disappearing, Homer Simpson style
Intel involvement would have killed any initiative that would lead to what Nvidia is now.
So, the ROI would have been much, much lower.
Before AI, Nvidia made some nice money from the crypto craze. Also, during the pandemic and after, but before AI, there was a severe shortage of Nvidia GPUs.
> during the pandemic and after, but before AI, there was a severe shortage of Nvidia GPUs.
You'd think that such a multi-year shortage of a product would be used as an opportunity by other players to jump in and make great money. Yet that major machinery of capitalism is failing here.
AMD at one point wanted to buy Nvidia too.
Imagine where we'd (not) be if that happened. Acquisitions/consolidation kill innovation.
The "Dow Jones Industrial Index" isn't that industrial any more. Six of the companies are financial. Only about half actually run factories.
And MTV doesn't play music. The DJIA is curated slice of the American economy with a goofy weighting.
This is how the dump begins. Onto the index huggers and pension funds
Major index funds don't follow the DJIA. $DIA has 5% of $SPY's assets under management.
DIA has ~$35.62 billion and IYY has ~$2.5B in the DJIA. There are only 30 companies in it (~$12T total value), so that averages roughly $1.3B per company.
But the S&P 500 index funds have somewhere around ~$2.5T invested, spread across 500 companies (~$40T total valuation), so it averages $5B per company.
Thus you are correct that the S&P 500 is more influential in terms of index fund reallocations, but only roughly 4x more influential.
The Dow has nothing to do with index funds. The Dow industrial average is price-weighted; the S&P 500 is market-cap-weighted. No competent index fund would follow the Dow: it doesn't even make sense conceptually, it has zero relation to the economics of the companies when you use price weighting, and 30 companies is a stupidly low number.
There is, as far as I can tell, zero point to the Dow. It's a completely useless tracker that is reported on because people talk about it, because it's reported on.
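To make the price-weighting complaint concrete, here's a toy sketch in Python (all prices and share counts invented):

    # Price weighting vs. market-cap weighting, with made-up numbers.
    stocks = {
        # ticker: (share_price, shares_outstanding)
        "A": (400.0, 1_000_000),     # small company, expensive shares
        "B": (50.0, 80_000_000),     # 10x the market cap, cheap shares
    }

    def price_weighted(quotes, divisor):
        # DJIA-style: sum of per-share prices over a divisor
        return sum(p for p, _ in quotes.values()) / divisor

    def total_cap(quotes):
        # S&P-style input: company size actually matters
        return sum(p * n for p, n in quotes.values())

    print(price_weighted(stocks, divisor=2.0))   # 225.0, dominated by A
    print(total_cap(stocks))                     # 4.4e9, dominated by B

    # A 2-for-1 split of A is economically a no-op...
    stocks["A"] = (200.0, 2_000_000)
    print(price_weighted(stocks, divisor=2.0))   # ...yet the average falls to 125.0
    print(total_cap(stocks))                     # unchanged at 4.4e9

(The real DJIA compensates for splits by adjusting its divisor, as discussed further down the thread, but the weighting distortion remains.)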
This may make sense in theory, but in reality it's wrong. The DJIA has historically had a very close correlation to the S&P. Plus there is an inescapable psychological aspect to the Dow that is unique: when the Dow is off 1500 points, it hits much differently than the S&P falling 200 points.
> The dow has nothing to do with index funds
I am not sure what point you're making, but there is ~$38B invested in index funds (the ones I mentioned in the previous post) that track the DJIA.
Granted, it is a fraction of the index funds which track the S&P 500.
The S&P 500 is cap-weighted, so the average value per company is meaningless. Nvidia is currently 7% of the index, while the smallest member is 0.01%.
Since Nvidia was already a part of the S&P 500 (and other similar indices) prior to its big run, those index investors generally profited from its rise. New flows into those funds do help prop it up, though.
The DJIA is a weird historical relic, and there's little reason for anyone to buy a fund tracking it. It's possible that those who did anyway will end up holding a tiny fraction of the bag due to this change, but it's not a big effect.
Put another way, IYY and DIA combined hold <0.1% of NVDA's market cap.
Sure but the daily volume is only $300M.
I think you're looking at the volume in shares per day, not dollars per day.
That's half the story; there are a lot of private funds that passively track as well. It just comes down to the net flow due to the rebalance of public + private indices. Could be a lot, could be not that much.
You are correct, this will affect both Intel's and NVIDIA's stock prices, although it may already have been baked in, depending on how predictable this was.
Index funds are big business. Dropping Intel and replacing it with NVIDIA will cause a rebalancing of the DJIA index fund investments from Intel to NVIDIA. Yup, there is an index premium.
Index huggers already own plenty of Nvidia (and Intel)? S&P 500 and total stock market funds have 4-7% in Nvidia at this point.
Could you please explain how you see the proposed dump?
Index funds follow indices, and to follow an index an asset manager holds a weighted basket of the constituent stocks. If a stock is excluded from the index, asset managers sell it off, and that puts downward pressure on the stock price. The reverse happens if a stock is included in the index.
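A minimal sketch of that mechanical tracking, with invented weights, prices, and fund size; the index membership alone dictates the trades:

    # Target holdings are a pure function of index weights; a membership
    # change forces trades regardless of the manager's views.
    def target_shares(aum, weights, prices):
        return {t: aum * w / prices[t] for t, w in weights.items()}

    prices = {"INTC": 23.0, "NVDA": 135.0, "MSFT": 410.0}  # made-up quotes

    old = target_shares(1e9, {"INTC": 0.02, "MSFT": 0.98}, prices)
    new = target_shares(1e9, {"NVDA": 0.02, "MSFT": 0.98}, prices)

    # The rebalance: sell the dropped name, buy the added one.
    for t in sorted(set(old) | set(new)):
        delta = new.get(t, 0.0) - old.get(t, 0.0)
        print(t, f"{delta:+,.0f} shares")

Multiply that across every fund tracking the index and you get the price pressure described above.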
Liquidity is held by funds. Funds (some of them pensions) have rules about where to allocate. They just follow these rules and allocate. Now this change will shift the allocation to Nvidia. Investors can tap into this liquidity. The $$$ are then transferred from the funds (some of them are pensions, uh, I mentioned that) to the investors.
The AI bubble then bursts. Funds are “adjusted” and someone is not getting his pension.
Pumping a big momentum trade and selling it upon index rebalance is a classic buy-side trade.
Nobody cares about DJI
The symbolism here is pretty nice.
A long time ago, Nvidia tried to make its own x86 CPU but was threatened with lawsuits by Intel. Now Nvidia can buy Intel for a song.
I wonder if they have enough clout to change DJIA -> DJAI ?
Only if they can make it French too
DJIA is not taken very seriously outside of mainstream media, it seems. The news loves to quote big drops or increases in the DJIA, due to the index being price-weighted. “Wow, the news is saying the DJIA dropped 1000 points today! That's huge.” Nah, that's like 2%.
This seems like another media opportunity about a nothing burger event for an index that has lost relevance, when other indices like the S&P 500 and QQQ incorporated NVIDIA a while ago. They're just playing catch-up.
100%. It’s a good example of how the media wants to entertain, not inform.
The DJIA is a price-weighted index and doesn’t track dividends, yet is somehow supposed to reflect investor sentiment.
> the media wants to entertain, not inform
The media wants to not go bankrupt. It’s the customers that decide whether and how that’s possible.
> The media wants to not go bankrupt.
I’d say that depends. Many media organizations operate without profits — or even incurring losses — in order to serve the public good, promote an ideology, engage in activism, spread misinformation, etc.
spread misinformation? which major media operations are operating at a loss how for the public good (discounting state run) and how?
> spread misinformation?
Yes, several state-run and private media organizations operate without profits or at a loss in order to spread misinformation.
> which major media operations are operating at a loss how for the public good (discounting state run) and how?
Sorry, I was unable to properly understand your question.
In any case, I never said the organizations that are not primarily concerned with chasing profits are “major” ones. I believe that applies best to lesser-known entities, like those that live on social media or the blogosphere. Though there are some instances of major state-run companies, even on TV and on the radio, that operate in a similar fashion.
Also, I am surely not equating the spread of misinformation with serving the public good — these are just distinct objectives that may be sought by media organizations, rather than avoiding bankruptcy.
The Guardian has turned a profit in maybe 4 of the last 30 years.
As one example, the Chicago NPR affiliate, WBEZ, acquired the Chicago Sun-Times as what was widely described as a "no-cash" deal.[1] A rather more truthful description is that WBEZ was paid $61 million (largely through philanthropic support) to take over a long-ailing newspaper, as described by the city's other long-ailing paper, the Chicago Tribune:
"Chicago Sun-Times becomes nonprofit newspaper with $61 million in backing as WBEZ merger closes"
<https://www.chicagotribune.com/2022/01/31/chicago-sun-times-...>
Of newspapers operating at a loss or as philanthropies, there are the Baltimore Banner,[2] The Guardian,[3] and ProPublica,[4] which all operate as non-profits, relying on a mix of advertising, subscriptions, and philanthropy. The privately held Washington Post is a for-profit paper that's operated at a loss for years, and this before losing ~10% of its subscribers due to recent editorial decisions.[5][6]
There are numerous propaganda institutions (usually labeled as "think tanks") promulgating various ideologies or interests, with the Atlas Network being amongst the largest and most influential:
<https://www.sourcewatch.org/index.php?title=Atlas_Network>
________________________________
Notes:
1. See for example the WSJ's coverage: "Chicago Public Media to Acquire Chicago Sun-Times, Creating a Nonprofit Local-News Powerhouse" <https://www.wsj.com/amp/articles/chicago-public-media-to-acq...> archive/paywall: <https://archive.is/LP3Q6>.
2. A non-profit newspaper established in 2022: <https://en.wikipedia.org/wiki/The_Baltimore_Banner>.
3. A slightly dated take on 2016 turmoil at The Guardian: "Everything you need to know about the Guardian’s giant bust-up" (2016-05-18) <https://www.standard.co.uk/lifestyle/london-life/fueding-and...> and Wikipedia's entry on the Scott Trust Limited, which underwrites the paper: <https://en.wikipedia.org/wiki/Scott_Trust_Limited>.
4. ProPublica was established as a 501(c)(3) in 2007, with funding from the Sandler, Knight, MacArthur, and Ford foundations, along with the Pew Charitable Trusts, Carnegie Corporation, and Atlantic Philanthropies: <https://en.wikipedia.org/wiki/ProPublica>.
5. "The Washington Post publisher disclosed the paper lost $77 million last year. Here’s his plan to turn it around" (2024-5-23) <https://www.cnn.com/2024/05/23/media/washington-post-will-le...>
6. "Washington Post cancellations hit 250,000 – 10% of subscribers" (2024-10-29) <https://www.theguardian.com/media/2024/oct/29/washington-pos...>
Or it's a time-honored, neutral way to communicate the impact of business news. YMMV; in my experience it's based not on facts, but on how willing I am to catastrophize things in order to make things I don't like simple to blame on someone.
> DJIA is not taken very seriously outside of mainstream media it seems
AFAIK its only real utility is for making very long-term comparisons against old values of itself, where its long baseline may be valuable compared to other metrics that don't stretch as far.
Everything else is just flim-flam for getting views/clicks or comforting very old viewers with something that is a familiar staple.
Do any index funds track it?
Quite a few; pretty much every major ETF provider should have one. I recall Vanguard not having one, and a quick Google seems to confirm that. I do wonder why?
The one I know of off the top of my head is DIA.
The Dow is price-weighted. Vanguard, being a sort of non-profit, probably doesn't want people investing in an obviously horrible index that should have gone away decades ago.
They don't care about people investing in losing bets. But they do care about wasting resources on running products that don't sell, and not having a plus followed by a big number and a percentage sign in the column next to the name is a bad seller.
In the end, how funds are marketed and presented is an important part to understand. Also, it is better for them to sell an actively managed fund with a bigger number on it than an index fund with a negative one.
> nothing burger event for an index that has lost relevance
This argument has been around since time immemorial. The right way to think of it is more like a country club or a who's who, rather than a survey or a directory.
As for the news at hand, it's really more about Intel than Nvidia. Sic transit gloria mundi.
So you can just replace underperforming/flailing companies and still call it an average?
Yes, that's how it's always worked: https://en.m.wikipedia.org/wiki/Historical_components_of_the...
According to this, the original DJIA was 11 stocks, comprising 9 railroad companies, 1 steamship company, and 1 telegraph company. Capital was deployed very differently 140 years ago.
It was and wasn't.
Those were the telecoms, transport, and infrastructure companies of their day. They connected the industrial and agricultural output of the United States in much the way Apple, Amazon, Boeing, Verizon, and Walmart (all current components of the DJIA) do today.
That said, yes, it's interesting to watch how the components and industrial sectors represented change over time.
The first DJIA proper (26 May 1896) featured cotton oil, sugar, tobacco, gas & coke (coal), cattle feed, electrical utilities, lead, railroads, leather, rubber, and a holding company (trust) largely engaged in utilities and transportation, which was dropped later the same year, along with US Rubber. Changes to the average have been a consistent feature since its origins.
Think of those which aren't directly comparable to modern concerns (e.g., oil & gas, electric utilities) as raw materials (mining and ag), transport and logistics, and food (or feed).
DJIA is kind of a joke index. SP500 is much better.
The Dow Jones index serves a different purpose: it's meant to be easy to update and publish without the use of a computer, since the math is a lot simpler and doesn't involve pulling in hundreds of quotes and tabulating a weighted average.
It's not that useful now that we have computers, but in the early 1900s it was a reasonably good approximation of the market using fast math.
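A minimal sketch of that fast math, with invented prices: the whole index is a running sum and one divisor, and the divisor is recomputed on splits or component swaps so the published level stays continuous:

    # Dow-style index: sum of component prices over a divisor.
    prices = [100.0, 60.0, 40.0]
    divisor = 3.0
    level_before = sum(prices) / divisor       # 66.67

    # Component 0 does a 2-for-1 split: its price halves overnight.
    prices[0] = 50.0

    # Recompute the divisor so the level doesn't jump across the split.
    divisor = sum(prices) / level_before       # 2.25
    level_after = sum(prices) / divisor        # still 66.67

    print(round(level_before, 2), round(level_after, 2))

That's the whole computation: a clerk with the day's closing prices could publish the average in minutes.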
Index funds didn't exist until 1975 while the Dow was formed in 1896. So I suppose it didn't really matter if the Dow was tracking things well, it wouldn't have impacted your investments.
It was still useful to have a general idea of how stock markets were valued, particularly in light of the events on and about 29 October 1929, a/k/a Black Tuesday. Valuation of equities markets themselves has a profound impact on the monetary and financial systems as a whole.
On which point, John K. Galbraith's The Great Crash: 1929 (1954) remains an excellent history of those events (and notes the DJIA's value frequently), as well as a general primer on equities and investments, and how they may go wrong.
I love that so many people keep saying this so confidently, yet if anyone bothered to look they'd see that the correlation between the two is nearly perfect over any timeframe, including the past five years.
Yes, it is entirely symbolic and has been for decades. The only purpose it really serves is differentiating those who know a little bit about stock markets and those who don't. It's annoying that serious media outlets continue to publish it, alongside valid indexes, implying that it has anywhere near the same level of legitimacy. Also, I don't like it.
May I know why you consider it a joke? I obviously don't have much of a clue about it all.
If you want to estimate the stock market, would you rather:
1. sum the 500 biggest companies by size (price * n shares).
or
2. have WSJ editors select 30 companies by any criteria they see fit, but you don't get to see the size of the companies, only the share price.
Note that the DJIA also changes constantly over time (as discussed elsewhere in this thread), as do the weights assigned to the individual companies constituting it.
The way the DJIA changes isn't the same as the way an index of, say, the n most highly capitalised equities might change (Fortune 5, 10, 20, S&P 500, etc.).
Wonder how much DJIA had to give to get them to list.
It's not a market average; it's an average of members curated by an arbitrary set of rules.
Many exist.
This is how it has always been.
Nobody should take the Dow Jones Industrial Average seriously. It's only relevant because it's been around for 139 years. It only tracks 30 arbitrarily selected large companies, and it's not even weighted by market cap.
It was made this way because 139 years ago we didn't have computers and someone had to calculate the average manually.
A cautionary tale for Nvidia to not become the next Intel.
I've been assuming that Intel will eventually catch up, because that's the world we grew up in, but that's not going to happen, is it?
May I suggest a merger of intel with Boeing?
Two companies that were engineering-driven and future-forward back in the day, and today are run into the ground by a Quarterly Reports And Nothing Else Matters culture and meandering around.
They'd be a lovely cultural fit.
Does that mean it's gonna top...
Ok so why switch?
Symbolic of the times. The new wizzy guy edges out the old school lumbering giant.
For some value of “new”, anyway. I remember attending a talk on how GPU programming was cool and how GPGPU was the next big thing, in the mid-2000s, as a middle schooler. I don't believe CUDA was a thing yet (looking at Wikipedia, the first release might have been several months later?); instead, Cg and NV_fragment_program4 were the new hotness.
Eh, not really? Nvidia is 31 years old, that's hardly new or wizzy. Sure, Intel is a bit older at 56, but I feel like Nvidia has been around more than long enough.
> Intel shares were down 1% in extended trading on Friday. Nvidia shares rose 1%.
Riveting