Transmeta made a technology bet that dynamic compilation could beat OOO superscalar CPUs on SPEC.
It turned out to be wrong, but at the time it was genuinely controversial among experts.
I'm glad they tried it even though it turned out to be wrong. Many of the lessons learned are documented in systems conferences and have been incorporated into modern designs, e.g. GPUs.
To me Transmeta is a great example of a venture investment. If it had beaten Intel at SPEC by a meaningful margin, it would have dominated the market. Sometimes the only way to get to the bottom of a complex system is to build it.
The same could be said of scaling laws and LLMs: it was theory before Dario, Ilya, OpenAI, et al. trained it.
I think of it more as the timing being incorrect: betting on software in an era of exponential hardware growth was unwise (software performance can't scale that way). The problem is that you need to marry it to a significantly better CPU/architecture, because the JIT is about not losing performance while retaining backward compatibility.
However, layered on top of a better CPU it's a fine technique to bet on; case in point, Apple's move away from Intel onto homegrown CPUs.
One aspect of Transmeta not mentioned by this article is the "Code Morphing" technique used by the Crusoe and Efficeon processors. This was a low-level piece of software, similar to a JIT compiler, that translated x86 instructions into the processor's native VLIW instruction set.
Similar technology was developed later by Nvidia, which had licensed Transmeta's IP, for the Denver CPU cores used in the HTC Nexus 9 and the Carmel CPU cores in the Magic Leap One. Denver was originally intended to target both ARM and x86 but they had to abandon the x86 support due to patent issues.
https://en.wikipedia.org/wiki/Project_Denver
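For anyone who hasn't poked at one of these systems: the core shape is translate once, cache the result, and re-execute from the cache. Below is a toy sketch of just that shape in C. The two-instruction "guest ISA" is invented for illustration, and the "translation" is an array of C function pointers rather than real host machine code; actual code morphing emitted native VLIW and did far more (scheduling, speculation, rollback), so take this as the general idea, not how Crusoe worked.

    /* Toy translate-and-cache loop, the general shape of dynamic binary
     * translation. Everything here (the two-opcode guest ISA, translating
     * into C function pointers instead of host machine code) is invented
     * for illustration. The point is that a guest block is translated
     * once and cached by guest PC, so re-executing a hot block skips the
     * translation step entirely. */
    #include <stdio.h>

    typedef struct { int acc; } Cpu;
    typedef void (*HostOp)(Cpu *, int);

    static void host_add (Cpu *c, int imm) { c->acc += imm; }
    static void host_halt(Cpu *c, int imm) { (void)c; (void)imm; }

    enum { G_ADD = 0, G_HALT = 1, MAX_OPS = 16 };

    typedef struct {             /* one cached translation of a guest block */
        HostOp op[MAX_OPS];
        int    arg[MAX_OPS];
        int    len;
        int    valid;
    } Translation;

    static Translation cache[256];   /* indexed by guest PC */

    /* Translate the guest block starting at pc, stopping at HALT. */
    static Translation *translate(const int *guest, int pc) {
        Translation *t = &cache[pc];
        if (t->valid) return t;      /* cache hit: reuse prior translation */
        int i = 0;
        for (;;) {
            int opc = guest[pc + 2 * i], imm = guest[pc + 2 * i + 1];
            t->op[i]  = (opc == G_ADD) ? host_add : host_halt;
            t->arg[i] = imm;
            i++;
            if (opc == G_HALT || i == MAX_OPS) break;
        }
        t->len = i;
        t->valid = 1;
        return t;
    }

    int main(void) {
        /* guest program: acc += 2; acc += 3; halt (opcode, immediate pairs) */
        const int guest[] = { G_ADD, 2, G_ADD, 3, G_HALT, 0 };
        Cpu cpu = { 0 };
        Translation *t = translate(guest, 0);
        for (int i = 0; i < t->len; i++)
            t->op[i](&cpu, t->arg[i]);
        printf("acc = %d\n", cpu.acc);   /* prints: acc = 5 */
        return 0;
    }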
A very similar approach is used in MCST Elbrus CPUs: https://en.wikipedia.org/wiki/Elbrus-8S#Supported_operating_...
Didn't Transmeta's technology end up in Apple's PowerPC emulator Rosetta, following the switch to Intel?
IIRC Transmeta's technology came out of HP(?) research into dynamic inlining of compiled code, giving performance comparable to profile-guided optimization without the upfront work. It worked similarly to an inlining JIT compiler, except that it was working with already-compiled code. Very interesting approach, and one I think could be generally useful. Imagine if, say, your machine's bootup process were optimized for the hardware you actually have. I'm going off decades-old memories here, so the details might be incorrect.
No, you are confusing Transmeta with Transitive. https://en.wikipedia.org/wiki/QuickTransit
A lot ended up in HotSpot for the JVM. I know a number of extremely good engineers whose career path went Transmeta -> Sun -> Google.
Dynamo <https://www.cse.iitm.ac.in/~krishna/courses/2022/odd-cs6013/...>?
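If memory serves, Dynamo's trick was to interpret cold code, count how often loop heads (targets of backward branches) execute, and once a counter crosses a threshold, record the executed path as a straight-line trace that gets optimized and inlined. A toy sketch in C of just the hot-trace detection part; the mini bytecode and the threshold of 50 are invented for illustration, and a real system would go on to record and optimize the trace rather than just print a message:

    /* Dynamo-style hot-trace detection (toy sketch). Interpret a tiny
     * invented bytecode, bump a per-PC counter each time a backward
     * branch is taken, and flag the loop head once it crosses the
     * (made-up) threshold. */
    #include <stdio.h>

    #define HOT_THRESHOLD 50

    enum { OP_INC, OP_BRLT, OP_END };            /* tiny invented bytecode  */
    typedef struct { int op, a, b; } Insn;

    static int heat[64];                         /* per-PC execution counts */

    int main(void) {
        /* for (x = 0; x < 100; ) x++;  expressed in the toy bytecode */
        const Insn prog[] = {
            { OP_INC,  0,   0 },                 /* pc 0: x++               */
            { OP_BRLT, 100, 0 },                 /* pc 1: if x < 100 goto 0 */
            { OP_END,  0,   0 },
        };
        int x = 0, pc = 0;
        while (prog[pc].op != OP_END) {
            switch (prog[pc].op) {
            case OP_INC:
                x++; pc++;
                break;
            case OP_BRLT:
                if (x < prog[pc].a) {
                    int target = prog[pc].b;     /* taken backward branch:
                                                    target is a loop head  */
                    if (++heat[target] == HOT_THRESHOLD)
                        printf("pc %d is hot: record & optimize trace\n",
                               target);
                    pc = target;
                } else {
                    pc++;
                }
                break;
            }
        }
        printf("x = %d\n", x);                   /* prints: x = 100 */
        return 0;
    }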
I remember it being in one of Sony's VAIO product lines, the PictureBook, notable for its small form factor and swivel webcam.
> But they were still a technology company, and if their plans had gone well, they would have sold their product to dotcoms
I'm not sure that that's really correct; they were very desktop-oriented.
I liked the Transmeta web page from before they launched. It was just bare HTML with no styling. It said:
This page is not here yet.
The product hype and the lack of knowledge about what the product actually was meant that nobody knew what to expect. With expectations that hyped, and with Torvalds on board, everyone assumed that everything would be different. But it wasn't.
A similar product launch was the Segway, where we went from an incredible vision of everyone on Segways to nobody wanting one.
The hype was part of the problem with Transmeta. Even in its delivered form it could have found a niche. For example, the network computer was in vogue at the time, thanks to Oracle. A different type of device, like a Chromebook, might have worked.
Between Torvalds's connection to Transmeta and the stealthy development, we never did get to hear who was really behind Transmeta and why.
All I know about Transmeta is that Linus Torvalds moved over from Finland to the USA to work at this startup.
Other than that, it seems to have sunk without a trace.