Microcomputers – The Second Wave: Toward a Mass Market

(technicshistory.com)

73 points | by cfmcdonald 3 days ago

40 comments

  • andrehacker 3 days ago

    Great article. It’s wild to look back at 1977/1978 and realize how suddenly the personal computer era exploded into the mainstream. The PET, TRS-80, and Apple II all hit the market within months of each other, and while hobbyists had already been tinkering with machines like the Altair, IMSAI, KIM-1, and Apple I, this was the moment computers truly became “consumer” products.

    From a technical perspective, the timing made sense—there was a foundation of microprocessor-based systems and a growing community of enthusiasts. But for the general public, it felt like computers went from obscure to omnipresent overnight. They were suddenly on TV, in magazines, featured in books, and even depicted in movies and shows. That cultural shift was massive. For many of us, it marked the beginning of having computers in our homes—something that’s never changed since.

    I appreciated the article’s attention to detail too. The bit about the TRS-80 monitor being repurposed from an existing product (with a "Mercedes Silver" color to boot), and the PET’s sheet metal casing being a practical choice rather than a design one—those are the kinds of behind-the-scenes decisions that rarely get spotlighted but say a lot about how fast things were moving back then.

  • K0balt 3 days ago

    It’s truly remarkable to me that in the late 70s/early 80s, programming your own computer in BASIC was considered something that didn't require special skills or technical ability.

    It just goes to show how far our expectations have dropped, with basic human ingenuity and capability for expression having been crippled by reliance on increasingly advanced automation with increasingly simple interfaces.

    Humanity is not going to fare well in the world of pervasive synthetic intelligence with simple language interfaces. I fear we will see an unprecedented dumbing down of the population, a new “dark age” perhaps.

    • zoeysmithe 3 days ago

      I was very little during this time, and the percentage of adults who owned computers was tiny; the percentage who could program them was even tinier. I think it's very easy to fall for "le wrong generation" narratives because it's so ego-pleasing to think things only got this way recently, that people are somehow magically 'dumbed down' now.

      When instead it's always been like this: certain types of people do certain things and others are highly disinterested in it, and this sort of modern 80s or 90s renaissance never occurred. It was the same tiny community of people doing the highly technical work, just like today.

      • Supernaut 2 days ago

        > it's always been like this: certain types of people do certain things and others are highly disinterested in it

        I agree. I remember 1984, when all of my schoolyard peers had a home computer. It was only me and one other guy who ever experimented with our systems' integrated BASIC. Everyone else exclusively used their computers for gaming. Unsurprisingly, the other programmer and I are the only two who subsequently made careers in IT.

    • Jtsummers 3 days ago

      > It just goes to show how far our expectations have dropped

      Those computers were vastly simpler, and many weren't connected to any kind of external network (or the networks were, again, vastly simpler). It's like the difference between a Model T era car (many people possess the technical ability to maintain them, even today) versus a modern car.

      It's not that we expect less of people today; we've just produced something much more complex, so what it even means to understand or use a computer has changed, because it's not the same thing as 40-50 years ago. If I threw an Apple II-level computer at someone today, I'd expect as much from them as I would from someone handed the same computer in 1980 (actually more, since they'd have more foundational knowledge than a random person in 1980 would have).

      • HankStallone 2 days ago

        Also, as programming languages have gotten more powerful, I think they've gotten to the point where they require a particular kind of mind to work with. I could teach anyone with a little interest and aptitude to program in BASIC 2.0, because there aren't any hard concepts: variables are global, there's no recursion, no objects, not really even functions, just subroutines. So you've got looping and variables, just enough to let you do some calculations and put things on a screen. Pretty simple. Also, my Commodore 128 came with a System Guide in which over 100 pages were a pretty solid BASIC tutorial and an explanation of every command. So if you bought one of those machines, there was a good chance you'd at least tinker with BASIC. You had to learn a couple of commands just to use the thing, after all.
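
        Everything a first program needed fit in a handful of lines. Something like this (a from-memory sketch of Commodore BASIC 2.0, so the details may be slightly off) was about the whole conceptual toolbox: one global variable, one loop, one GOSUB "subroutine":

          10 REM EVERYTHING IS GLOBAL; N IS VISIBLE EVERYWHERE
          20 FOR N = 1 TO 5
          30 GOSUB 100 : REM A "SUBROUTINE": NO ARGUMENTS, NO RETURN VALUE
          40 NEXT N
          50 END
          100 PRINT "THE SQUARE OF"; N; "IS"; N*N
          110 RETURN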

        Moving to something modern, even a language that's considered easy to learn like Python, you very quickly get into more complicated concepts. That seems to have created a situation where the easier and more powerful we make programming for programmers, the further it gets out of reach of everyone else. Though there are still languages specifically for learning (like BASIC), so that doesn't have to be a problem.

    • microtherion 2 days ago

      I think it's the opposite: our expectations have RISEN to the point where it's harder for beginners to get started.

      One of my sons was interested in learning programming, but the goal he envisioned was to write an AAA game. It was rather discouraging to have to tell him that it would take a minimum of 5 years (realistically more) to get to the level where he could consider getting hired into a team of hundreds to work on such a game.

      In contrast, the first computer I had access to had a whopping 7167 bytes of RAM to work with, and a 25x40 character screen. Correspondingly, our ambitions were much more limited.

      I recently started reading back issues of Dr. Dobb's (the Internet Archive has issues up to about 1990: <https://archive.org/details/dr_dobbs_journal?and%5B%5D=creat...>), and many articles seem to fall into one of two categories: fairly simple programs (mostly games), or visions of overly ambitious projects that likely never came to fruition (a multi-user UNIX system implemented on an 8080 with 32K RAM and a floppy drive, to be written by somebody who had encountered his first computer two years earlier…).

      To be sure, there were also quite sophisticated programs, such as a 6502 floating point package co-developed by Steve Wozniak.

      • K0balt 2 days ago

        To be sure, the barrier to entry on professional-level development is higher, but we also have CHIP-8, Python, Arduino, and other languages and systems that are arguably much more approachable than BASIC with a line-only editor.

        Many commenters have pointed out that it was always a tiny fraction of people who could do this kind of thing, and I think that's probably true, in retrospect. My ideas about the subject are probably colored by the fact that it always seemed easy to me, and I assumed (at the time) that others would also find it easy, because I hadn't come to grips with the bell curve yet. But if I think critically, there was basically no one I knew who was doing anything more ambitious than typing in games from magazines at that time… So perhaps my worries about the dumbing down of society are overblown?

        After all, it’s not like no one is graduating high school or something.

    • cmrdporcupine 2 days ago

      I grew up in this era, owned a VIC-20, and it's simply not true that many people were doing anything useful with the rather awful BASIC dialects that shipped in microcomputers. They were just a pain to work with: no real editor, and they were often missing the ability to even use the graphics (etc.) functions of the computer they shipped on (especially on Commodore devices). Professional developers mostly wrote in assembly.

      We did BASIC (& Logo) programming in my elementary school, on Apple IIs. It was rare for anybody to get past basic "hunt the wumpus" programs.

      There were program listings for simple games in the back of computer magazines of the era. Invariably the better ones were full of DATA statements at the end that were bits of 6502 or Z80 machine code to do the "real" work. Woe befell you if you typed them in slightly wrong.
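
      The pattern in those listings was usually a tiny BASIC loader with the real routine hidden in the DATA bytes. A from-memory sketch in Commodore 64 BASIC (addresses and opcodes chosen purely for illustration): the loop POKEs six bytes of 6502 code into free RAM and jumps to it, and the routine just sets the border colour and returns. Mis-type a single number and the SYS call lands in garbage, which is exactly the "woe" in question.

        10 FOR I = 0 TO 5
        20 READ B : POKE 49152 + I, B : REM COPY THE MACHINE CODE INTO FREE RAM AT $C000
        30 NEXT I
        40 SYS 49152 : REM JUMP TO THE ROUTINE
        50 DATA 169, 1, 141, 32, 208, 96 : REM LDA #$01 / STA $D020 / RTS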

      Later, mostly in the "16-bit" era, we got structured BASIC varieties with better/real editors, and that definitely changed things. I got a lot done in GFA Basic on my Atari ST. But it's debatable whether GW-BASIC, GFA Basic (and later Visual Basic), etc. were "really" BASICs... they were more like... permissive and weird Pascal.

    • Pamar 8 hours ago

      I am from that era, so I might add something that perhaps is not obvious at all nowadays.

      The microcomputer explosion gave birth to a large number of actual paper magazines, and at least 50% of their content was... actual source listings you had to manually retype. BASIC was already fragmented into a billion different flavors and dialects (especially if your program had any kind of graphics), so the more ambitious user could also try their hand at translating a listing from - say - the TRS-80 to Apple BASIC.
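
      To give a flavour of that fragmentation, here is roughly what lighting a single point looked like in two of the popular dialects (from memory, so the exact syntax may be a little off): TRS-80 Level II used SET on a 128x48 block-graphics grid, while Applesoft made you switch into low-res mode and pick a colour first.

        10 REM TRS-80 LEVEL II
        20 CLS
        30 SET(64, 24)

        10 REM APPLESOFT (APPLE II)
        20 GR
        30 COLOR= 15
        40 PLOT 20, 20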

      In any case you were directly exposed to the actual source code, and tweaking or experimenting with it felt very natural.

    • simmerup 2 days ago

      People probably say the same about how we don't have looms in the house anymore and no one can repair their own clothes.

      Just because it's a skill you value doesn't mean it's a skill others value.

    • HeyLaughingBoy 2 days ago

      No, it's that those expectations were wrong in the first place.

      I spent a fair amount of time in the early-to-mid '80s tutoring engineering students in Computer Science (basically a CS101 for non-CS students). A fair number of otherwise very smart people simply couldn't grasp the concepts beyond the very basics.

    • forinti 2 days ago

      I think you overestimate the number of people who could actually do something useful with BASIC.

      • K0balt 2 days ago

        That could be. It was easier than assembly, at least, and as long as you could use PEEK and POKE you could still get into the nitty-gritty when you needed to. I remember using DATA statements as a memory space to communicate between my machine code and BASIC, since that wasn't really supported lol.
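
        For comparison, the more usual way to pass values back and forth was just to agree on a couple of spare addresses. A from-memory C64 sketch, assuming a machine-language routine is already sitting at 49152 that reads its input from address 251 and leaves its result at 252 (the routine and all three addresses are purely for illustration):

          10 POKE 251, 7 : REM INPUT FOR THE ML ROUTINE
          20 SYS 49152 : REM ASSUMED ROUTINE READS 251, WRITES ITS ANSWER TO 252
          30 PRINT PEEK(252) : REM READ THE RESULT BACK FROM BASIC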

    • Cthulhu_ 2 days ago

      And yet, people struggle; I read a post earlier about someone who tried to get elderly people onboarded with their iDevices, and they couldn't make heads or tails of it, already struggling with the PIN input. Mind you, I'm sure they would've struggled with BASIC and everything in between too.

  • MomsAVoxell 2 days ago

    There is a .. "new wave" (?) .. happening around microcomputers associated with that era, in the modern context. I think the reason this article has impact is that this 'new wave' of microcomputers is tangible, visible, and not just on the horizon but happening every single day, in a kind of quiet revolution/resurgence of platforms once considered 'outdated' suddenly becoming relevant again.

    You can still use these machines - this holy trinity of computers still gets regular software made for it - and as platforms, there is a resurgence happening.

    The ZX Spectrum Next. The myriad FPGA-based consoles that allow full access to entire "retired architectures". The Apollo A6000 'next-gen Amiga' .. all of these new 'hobbyist systems' fulfil the original need that the 'trinity' systems first addressed. There are, literally, hundreds of different ways to get a modern reproduction of the 'second wave' systems.

    So I imagine a day, in the not-too-distant future, when really useful applications are released for the 'next-gen 8-bit hobby microcomputer' systems. Put a well-fitted Apple II environment in a wrist-watch, and watch the devs arrive .. ;)

    By way of example, I have in my (admittedly extensive) retro computing collection an Oric-1/Atmos system that was used consistently, every single week for 40 years, to record motorbike club membership details, statistics, visit logs, and so on. For 40 years that system did its job as an on-site membership database and fuel log. That it was offline and only physically accessible was a feature, not a bug.

    I think there are plenty of other places that the modern, new-school 'retro-' microcomputers can find their setting: just a little bit above embedded, perhaps, sideways to mobile, and very definitely competing alongside the desktop in terms of active user experience.

    Not to mention, all the 'new retro consoles' are an awesome market for games and entertainment, of course ..

    • HankStallone 2 days ago

      It'd cost me about $10k to replace the stack of Commodore equipment I gave away to a scrap dealer 20-ish years ago, if I bought it all used now. But I'm still tempted to start, maybe with just one or two pieces.

      • microtherion 2 days ago

        I recently picked up a used Epson HX-20 for a very reasonable price. It's still an amazing design, and could probably still be used for actually useful tasks.

        • MomsAVoxell 7 hours ago

          I have the Olivetti M10 version of the HX-20 - it's very definitely something I could type on all day, writing a book or some such thing.

          My iPad with smart keyboard, not so much. I just get distracted by all the apps.

    • justsomehnguy 2 days ago

      > there are plenty of other places that the modern, new-school 'retro-' microcomputers can find their setting

      Quite amusingly, it tends to be very narrow and specialized things, usually. Quite the opposite of the universality of modern computers.

      • MomsAVoxell 2 days ago

        I think seeing a narrow/specialized use case for older computers demonstrates the general applicability of computers as a whole.

        They’re “infinite machines” inasmuch as there is an infinity of uses for a computer, and not all of those cases are obvious.

  • iberator 2 days ago

    I never used or even saw those computers, yet out of pure curiosity I picked up 6502 (Apple II, Commodore 64, NES) assembly as a hobby project. It's surprisingly easy, even for a dumb person like me.

    A great learning tool if you dream of building your own CPU architecture :)

    Try it!

    • dspillett 2 days ago

      > yet out of pure curiosity I picked up 6502 (Apple II, Commodore 64, NES) assembly as a hobby project.

      I cut my programming teeth on an Acorn Electron (a somewhat cut-down BBC Micro), then later a Master 128. The fact that the BASIC ROM included a decent multi-pass assembler was great for learning the deeper workings of the machine and such things in general. Being able to easily mix BASIC and assembly provided good lessons wrt choosing where optimisations were worth bothering with and where you should just stick with BASIC to save dev time. Those machines were also based around variants of the 6502.
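
      For anyone who never saw it: BBC BASIC let you drop into the assembler between square brackets, with BASIC itself acting as the macro layer, so BASIC variables and expressions worked inside the assembly source. A from-memory sketch (details may be slightly off) that assembles a tiny routine printing 'A' via the OSWRCH call at &FFEE, then calls it from BASIC:

        10 DIM code% 100              : REM reserve memory for the routine
        20 FOR pass% = 0 TO 2 STEP 2  : REM two passes so forward references resolve
        30 P% = code%                 : REM P% is the assembler's program counter
        40 [OPT pass%
        50 LDA #65                    \ ASCII 'A'
        60 JSR &FFEE                  \ OSWRCH: write a character to the screen
        70 RTS
        80 ]
        90 NEXT
        100 CALL code%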

      • cmrdporcupine 2 days ago

        This was the most annoying thing about the Commodore machines, frankly: the lack of out-of-the-box tooling for working in 6502 assembly. I had a VIC-20, quickly hit the limits of its BASIC, and started having to POKE 6502 machine code into RAM to do useful things, by reading opcode tables for the processor -- because growing up my parents simply could not afford to go buy me an assembler cartridge for the machine.

        I was always jealous of people who had that available to them, and that it was built into the BASIC of the BBC machines is pretty cool. Even the monitor program built into the Apple II was superior.

        • dspillett 2 days ago

          > because growing up my parents simply could not afford to go buy me an assembler cartridge for the machine

          I was lucky in that my elder brother did the research, and my parents (who paid for the things at the time) saw the wisdom in spending the extra on the Acorn machines, which were the best educationally (while falling behind considerably wrt the availability of games, despite being more than capable alongside other popular machines on the market). The extra cost was, though I didn't realise it back then, a very significant issue for my parents at the time.

          That sort of “investment” in us kids, and general encouragement of our interests, by our parents did a lot for us long term, me in particular (by luck rather than favouritism: I was just the most techie-minded of me and my brothers; they invested in my younger brother's interest in music too). They didn't have much, but they made every effort to make the most of it for us.

        • HankStallone 2 days ago

          The monitor built into the C128 was a big step up. Not as good as an assembler since it didn't have labels, but you could do a lot with it.

      • forinti 2 days ago

        As well as letting you mix assembly into BASIC, BBC BASIC allowed you to use its internal routines from assembly. You could, for instance, call the floating-point routines.

        So the BASIC ROM was not a lost 16 KB of address space if you were only using assembly.

        • dspillett 2 days ago

          Aye. The graphics primitives, and just about everything else, too. A lot of thought went into making those machines conveniently programmable, which made them great for both learning and tinkering.

          Though the main two reasons I had for using assembly were faster maths loops and graphics (via direct access to screen memory, including, on the Master, the shadow screen for double-buffered drawing), so neither of those parts of the ROM was something I called much.

  • gabrielsroka 2 days ago

    > the original Apple Computer (later called the Apple I)

    All the original manuals/ads/etc (that I can find online) called it the Apple-1 (Arabic, not Roman). I think Woz was already working on the Apple II by the time the Apple I was released.

  • jnaina 2 days ago

    Surprised the Atari 8-bit series was left out. It was arguably one of the first home computers designed with an appliance-like philosophy, rather than the typical open, hobbyist-oriented design of its time. Features like ROM cartridges, the Atari SIO serial bus (which even influenced the design of USB), multiple joystick ports that doubled as input for external switches or sensors, and support for smart peripherals like disk drives and drawing tablets set it apart.

    I still have my Atari 800 — it was my first real step into learning 6502 machine language.

    • andrehacker 2 days ago

      The Atari 800 came a bit later (1979); the three machines discussed were all introduced in 1977. 1979 was also when the TI-99/4 came out, which eventually became popular when the price was dropped below cost.

  • sbecker 3 days ago

    > “Three factors were required to join this holy ensemble: the technical expertise to design a capable and reliable microcomputer, a nose for the larger business opportunity latent in the hobby computer market…”

    I think the same opportunity exists now in the hobby robot market.

    • ChuckMcM 3 days ago

      Okay, that made me chuckle, as someone[1] who has also predicted the opportunity for hobby robots based on my experience with home computers. And yet it has really never materialized. I mean, there are common household robots today, things like Roombas and robot lawn mowers, but the whole "app" ecosystem, a robot that can do lots of different things, etc., is still not in the cards. Part of that has to do with how insanely hard vision-to-inverse-kinematics is; "fuzzy logic" was going to fix that before, now "AI" is the buzzword of choice, but realistically you need a lot of things to go right to make this work.

      Back in the day we had everyone in the club set some goals for their robot. Mine were simple: on voice command, have my robot go to a special refrigerator, get out a cold Diet Dr. Pepper, replace it with one from stock, then bring the cold one to me, wherever I was in the house.

      Even allowing for a lot of customized environment to support the robot, that is a really high bar today (much less in 1998, when I was thinking I'd have it working by the early 2000s).

      End effectors, vision, a custom fridge that can open electronically on demand with structured storage to hold beverages in pre-specified places, etc. etc. I could probably get a lot closer.

      Of course, my eldest child could do that at 3 years old without any programming at all and no custom engineering of the appliances. So some things that seem effortless for people are really challenging for computers.

      [1] I was President of the Home Brew Robotics Club (hbrc.org) for 10 years around the turn of the century (after Dick Prather had been President for 10 years and before Wayne Gramlich became President for 10 years :-)

      • microtherion 2 days ago

        In 1977, a company named Quasar Industries demonstrated a household robot that was claimed to understand a vocabulary of 4,000 words and to be capable of vacuuming, washing dishes, and teaching the children of the household French. It would ship within the next two years, at a cost of $4,000.

        Some CMU graduate students attended one of the demos and discovered two people discreetly loitering in the background, one doing the "speaking" for the robot and the other remote-controlling it.

        Luckily in today's more advanced times, we have Optimus shipping any day now, which definitely has never been teleoperated at any of its demos…

  • zabzonk 3 days ago

    Just thinking how, shall we say, "svelte" Woz looked in the first photo, compared with the "portly" (but cheerful and brilliant) look of later years.

    • andrehacker 3 days ago

      Silly remark, but talking about "svelte", I can't parse the picture with Mike Markkula: the Apple II in the front is not exactly a small computer. It has a full-size keyboard with several inches of space between the keyboard and the edges, so either I don't understand the basics of perspective or this is a scaled-down model, no?