To deploy a second-hand Cray-1 at UQ, we had to raise the ex-IBM 3033 floor; it turned out the bend radius for the Fluorinert plumbing was NOT the same as for a water-cooled machine. We also installed a voltage re-generator, which is basically a huge spinning mass: you convert Australian mains to DC, spin the machine up, and take off re-generated high-frequency power for the Cray, as well as 110 V at the right Hz for the boring stuff alongside. The main bit ran off something like 400 Hz power; for some reason the CPU needed faster mains going in.
The Fluorinert tank had a ball valve, like a toilet cistern. We hung a plastic lobster in ours, because we called the Cray "Yabbie" (a Queensland freshwater crayfish).
That re-generator's circuit breakers were... touchy. The installation engineer nearly wet his trousers flipping them on; the spark-bang was immense. Brown-trouser moment.
Front-end access was via Unisys X11 Unix terminals. They were built like a brick shithouse (to use the Australianism) but were nice machines. I did the acceptance testing, which included running up X11 and compiling and running the largest Conway's Game of Life design I could find on the net (a minimal sketch of that kind of workload follows this comment). It seemed to run well.
We got the machine as a tax offset for a large Boeing purchase by Australian defence. At end of life, one of the operators got the love-seat and turned it into a wardrobe in his bedroom.
Another, more boring Cray was installed at the Department of Primary Industries (Qld government) to do crop and weather modelling. The post-Cray-1 stuff was... more ordinary. The circular compute unit was a moment in time.
(I think I've posted most of this to HN before)
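A minimal sketch of the kind of workload mentioned above (the Conway's Game of Life acceptance test), written here in plain Python purely as an illustration; the original test was a much larger design compiled and run on the Cray itself:

    # Minimal Conway's Game of Life on an unbounded grid.
    # Illustrative sketch only; not the original acceptance-test code.
    from collections import Counter

    def step(live):
        """Advance one generation; `live` is a set of (x, y) live cells."""
        counts = Counter(
            (x + dx, y + dy)
            for (x, y) in live
            for dx in (-1, 0, 1)
            for dy in (-1, 0, 1)
            if (dx, dy) != (0, 0)
        )
        # Alive next tick: exactly 3 neighbours, or 2 neighbours and already alive.
        return {c for c, n in counts.items() if n == 3 or (n == 2 and c in live)}

    def show(live, width=10, height=8):
        for y in range(height):
            print("".join("#" if (x, y) in live else "." for x in range(width)))
        print()

    if __name__ == "__main__":
        cells = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}  # a glider
        for _ in range(4):
            show(cells)
            cells = step(cells)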
I used a Cray C-90 and a 256-core T3D from 1995-1999. The T3D used commodity Alpha 21164s and was already behind the T3E when we got it (Cray refurbished). By the end it was outclassed by an 8-CPU SGI Oxygen box. I'd already ported a lot of software from SunOS and HP-UX to Irix and Unicos (Cray), and it was easy to move it all to Linux in the end.
There's a lot of discussion here https://retrocomputing.stackexchange.com/questions/7412/why-... but nothing seems conclusive. I would wager the last answer, "IBM was using 400Hz", is the most directly causal reason. The motor-generator configuration might provide galvanic isolation and some immunity to spikes and transients as well.
Smaller transformers and capacitors in all the linear power supplies. 400 Hz is still common in aircraft. Distribution losses are higher, but you're going across the room, not across the country.
> The main bit ran off something like 400 Hz power; for some reason the CPU needed faster mains going in.
Aerospace originally did that to reduce component size; CDC and IBM took advantage of the standard in the early '60s.
Strangely, it seems mainframes didn't adopt switching power supplies until the end of the '70s, despite the tech being around since the end of the '60s.
Cray-1 or Cray-2? IIRC, Fluorinert was new with the Cray-2, while Wikipedia suggests that the Cray-1 used Freon as coolant.
There you go. Not to doubt what you say, but we definitely had the love seat, yet we also had a tank of vaguely fluorescing green liquid. Maybe we had some intermediate state: the Cray-1 CPU form but the Cray-2's upgraded coolant.
It wouldn't surprise me if we had the bastard love-child of leftovers from Boeing.
400 Hz is really the next best thing to a switching supply, as the transformers and filter capacitors can be smaller than they would need to be at 50/60 Hz. It saves cost and space on filter capacitors in particular, especially in a three-phase system, where there's not as much ripple to deal with.
Another rationale may have been that the flywheel on the motor-generator would cover a multitude of power-quality sins.
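A back-of-the-envelope sketch of the capacitor-sizing point: for a full-wave rectifier, ripple is roughly V_ripple ≈ I_load / (2 f C), so the capacitance needed for a given ripple scales as 1/f, and going from 50 Hz to 400 Hz cuts it by 8x. The load and ripple numbers below are illustrative assumptions, not Cray-1 figures:

    # Filter capacitance needed to hold a target ripple on a full-wave rectifier,
    # at different mains frequencies. Illustrative numbers, not Cray-1 values.
    def cap_for_ripple(i_load_amps, v_ripple_volts, mains_hz):
        """Capacitance (farads) for a full-wave rectifier: C = I / (2 * f * Vripple)."""
        return i_load_amps / (2 * mains_hz * v_ripple_volts)

    for f_hz in (50, 60, 400):
        c = cap_for_ripple(5.0, 0.5, f_hz)   # assume a 5 A load and 0.5 V ripple
        print(f"{f_hz:>3} Hz: {c * 1e6:9,.0f} uF")
    # 50 Hz needs ~100,000 uF; 400 Hz needs ~12,500 uF (an 8x reduction).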
Hello Bandit!
I knew a guy who worked at one of the national labs that had its own Cray supercomputer, in a computer room with a big observation window that visitors could admire it through, of course (all Crays required observation windows to show them off).
Just before a tour group came by, he hid inside the Cray and waited for them to arrive. Then he casually strolled out from the back of the Cray, pulling up the zipper of his jeans with a sheepish, relieved expression on his face, looked up and saw the tour group, acted startled, and scurried away.
The Cray-1-ish machines I had access to at Shell and Chevron were most definitely tucked away in rooms with no visibility into them. In fact, the Chevron machine room had pretty stern "no photography" placards, which I took seriously, which is sadly why I don't have a photo of myself sitting on the loveseat of their machine.
Getting access took just short of an act of God, and I was a sysadmin in the central support group! They didn't want us puttering around on the machines, so as far as I could tell they mostly sat idle.
The CRAY-1 was so ridiculously ahead of its time that it took until the Pentium MMX (1997) for “ordinary” computers to catch up to its raw performance.
That's 20 years, or about 10,000x the available VLSI transistors via Moore's Law.
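A quick sanity check of that figure, assuming a Moore's Law doubling period of roughly 18 months (an assumption; the real rate varied over 1977-1997):

    # 20 years of transistor-count doubling at an assumed 18-month period.
    years = 20
    doubling_period_years = 1.5
    print(f"{2 ** (years / doubling_period_years):,.0f}x")  # ~10,321x, i.e. on the order of 10,000x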
I wonder how many times faster my iPhone 17 Pro Max is?
Sometimes I like to remind myself we are living in the future. A future that seemed like SciFi when I was a kid in the 70s!
Sadly I don’t think we will ever see Warp Drives, Time Travel or World Peace. But we might get Jet Packs!
Recently I've found myself wanting a tricorder type device.
With the visual analysis of LLMs, throw in some extra ingredients with a camera, microphone, speaker, and display, and we're getting close. Add Bluetooth for biosensors and... iterate a couple of generations...
We're getting there!
From the Wikipedia article on the Cray 1:
"The 160 MFLOPS Cray-1 was succeeded in 1982 by the 800 MFLOPS Cray X-MP, the first Cray multi-processing computer. In 1985, the very advanced Cray-2, capable of 1.9 GFLOPS peak performance
...
By comparison, the processor in a typical 2013 smart device, such as a Google Nexus 10 or HTC One, performs at roughly 1 GFLOPS,[6] while the A13 processor in a 2019 iPhone 11 performs at 154.9 GFLOPS,[7] a mark supercomputers succeeding the Cray-1 would not reach until 1994."
> while the A13 processor in a 2019 iPhone 11 performs at 154.9 GFLOPS
Sustained? Or just for a few ms until the thermals kick in?
> how many times faster my iPhone 17 Pro Max is...
Sadly, most of that power is not working for you most of the time, but against you: spying on, tracking, and manipulating you.
Bet that was not included in your sci-fi dreams in the '70s...
Oh, but we had Colossus: The Forbin Project and later WarGames. Not to mention Star Trek episodes with malignant computers. And there was I Have No Mouth, and I Must Scream.
In the 70s, science fiction fed me Cold War fears of world-controlling mainframes.
I'd love to see you substantiate that - bet you can't.
At the risk of sounding cliché, I'll point out that iOS probably uses several times the capacity of a Cray-1 just to get the keyboard to work.
Yes, but the impressive part is that the iPhone does it on a few mW instead of a few MW
Seymour Cray never designed a silicon microprocessor. I wonder what he would have created had that opportunity presented itself.
https://en.wikipedia.org/wiki/Seymour_Cray
In his memory, some have said his initials are the last two characters in RISC and CISC.
Not terribly impressive, considering an average 20-year-old supercomputer c. 2005 is still about 100x as fast as today's best consumer CPUs.
Well, yeah, Dennard scaling ended around 20 years ago!
For a much more in-depth description of its predecessor, see https://archive.computerhistory.org/resources/text/CDC/cdc.6...
I don't think there is a comparable book about the Cray-1?
Blew my mind at age 4. Then I found out about the imos transputer. And robotics magazines. The '70s were popping. Ponging?
XMOS is still keeping the Inmos dream alive, more or less!
Inmos, not imos, if my memory cells serve me correctly. I lived overseas at the time, so I did not hear about the Cray until about 1980/81. My friend (we were about 12) had an idea to write a simulator for digital circuits, and I was puzzled as to why you would want to simulate a circuit when you could build it and test it. He was way ahead of his time.
> The aesthetics of the machine have not been neglected. The CPU is attractively housed in a cylindrical cabinet. The chassis are arranged two per each of the twelve wedge-shaped columns. At the base are the twelve power supplies. The power supply cabinets, which extend outward from the base are vinyl padded to provide seating for computer personnel.
My first time on a Cray, I thought I had a core dump, my program ended so quickly. No, the job had just finished.
Same with "go test". :)
Am I right in thinking there are no working Cray-1s?
I know a couple of museums have them, but I don't think any software has ever surfaced, am I right?
No, both COS (but missing Cray Fortran) and Unicos have been located.
> "The compact mainframe occupies a MERE 70 sq. ft. of floor space."
(emphasis mine)
I remember doing a report on this in high school in the late '80s. I'd love to do an order of magnitude comparison to a modern M4 Mac... Amazing how far we've come.
I just did a BOTE calculation for my iPhone (A17 Pro chip; GPU rated at 4 TFLOPS). According to the sales blurbage in TFA, the Cray-1 performed at 80 MFLOPS. (Yes, that is OBVIOUSLY not comparing apples to Apples -- pun intended.) Unless I've dropped a decimal point, my iPhone is (capable of) 50,000 times the floating-point speed of a Cray-1.
In my back pocket. To watch cat videos.
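The arithmetic behind that ratio, using the peak figures quoted above (peak vs. peak only; word sizes, precision, and sustained throughput all differ):

    # A17 Pro GPU peak (~4 TFLOPS, as quoted) vs. the Cray-1 brochure's 80 MFLOPS.
    a17_pro_flops = 4e12
    cray_1_flops = 80e6
    print(f"{a17_pro_flops / cray_1_flops:,.0f}x")  # 50,000x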
Cray 1 - 160 MFLOPS
M4 CPU - 280 GFLOPS
M4 GPU - 2900 GFLOPS
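And the ratios implied by those figures, against the 160 MFLOPS Cray-1 number (again, peak vs. peak only):

    # Peak-throughput ratios from the figures listed above.
    cray_1, m4_cpu, m4_gpu = 160e6, 280e9, 2900e9
    print(f"M4 CPU: {m4_cpu / cray_1:,.0f}x the Cray-1")  # ~1,750x
    print(f"M4 GPU: {m4_gpu / cray_1:,.0f}x the Cray-1")  # ~18,125x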
> I'd love to do an order of magnitude comparison to a modern M4 Mac... Amazing how far we've come.
Yes, it would be nice to compare the capabilities (multiuser, multitasking, security, RCE). Did we get _so_ far? How many users can a Mac sustain?
Who was the industrial designer? The Cray-1 fits so well it looks straight out of 2001: A Space Odyssey.
Cray himself. Here's a talk about designing it. My favorite part is his description of the aluminum machining they had to invent in order to move the Freon through the frame to keep the machine from melting. It's a great talk.
https://www.youtube.com/watch?v=vtOA1vuoDgQ
somewhat related:
"Seymour said he thought it was odd that Apple bought a Cray to design Macs because he was using Macs to design Crays. He sent me his designs for the Cray 3 in MacDraw on a floppy.” reports KentK.
https://cray-history.net/2021/07/16/apple-computer-and-cray-...
In the past few years, whenever I re-watch 2001 and Dave is shutting down HAL, I see a spaceship-capable data center. And HAL finally sings "Daisy..." at the foundational, bare-metal layer.
Why the thin black curtain on the window?
Thoughts:
1. To block some sunlight from getting in.
2. It’s a secure facility and wanted to prevent people from looking in.
3. To not have to look at something outside.
4. It’s a secure facility and wanted to prevent the chance of taking a picture of someone or something outside that could compromise the location of the computer or someone’s identity; sometimes the first place a photogenic computer was built was at a customer site.
I suspect the curtains are closed to avoid causing exposure issues etc. in the photos being taken.
As for windows in a computer room, seems a bit unusual, but a nicer working environment than the usual windowless box I'd guess...
It was the 70s, and that was simply the style at the time.
Or, you know: Sometimes, a window shade is just a window shade.
Documentation is fantastic.
I had the cover of this pinned up on my wall! Supercomputer porn.