The Amiga 1200 was arguably the last and best of the golden era "console computers" (meaning all-in-ones with the CPU inside the keyboard that could display on a standard analog TV). This style of home computer largely reigned over the consumer market between 1980 and 1990, beginning with 8-bit machines like the Commodore 64, Atari 400/800 and Apple II, evolving to 16-bit with the Amiga 500 and Atari 1040STE, and ending with the 32-bit Amiga 1200.
The Amiga 1200 was based on a Motorola 68020, had a hardware-based bit blitter and could display 320x240 in 256 colors, basically the holy grail of analog gaming graphics modes. It was the only widely distributed home computer capable of faithfully emulating the vast majority of 1980-90 dedicated arcade machines with few compromises. I have all the machines mentioned above in pristine working condition and still enjoy playing around with them.
It was a special era in computer history because every platform had its own unique hardware and operating system, providing different capabilities and trade-offs. This gave each platform an unmistakable (and strongly opinionated) 'personality'.
The 68020 is probably the last great CPU to write pure assembly in, too. The instruction set is rich enough to make it straightforward to write by hand - unlike RISC-ier chips, C is almost superfluous for many tasks - but simple and clean enough to keep in your head (unlike x86). It cleans up some of the rough edges from the 68000, and adds some great features and a ton of speed. But: no MMU, no complex pipeline or branch prediction, and the cache is instruction-only, meaning you rarely need to think about it to get optimum performance.
I disagree that RISC chips are especially hard to write for by hand - plenty of people did exactly that for the Acorn Archimedes. (And writing by hand can still be worthwhile even when you're lacking fancy CISC instructions, because you can design bespoke, custom-use "calling conventions" that make way better use of the register file. This is hard to do in C even when it's theoretically possible (via fancy, non-standard asm intrinsics), because you don't know beforehand how the compiler will choose to allocate registers.) The MC68k architecture proved a dead end, and this is ultimately what would have doomed the Amiga installed base, even if Commodore hadn't been mismanaged to the extent it was.
The ARM chips were brilliant and definitely the most pleasant of the RISC chips to handwrite for, particularly with a nice assembler that handles the quirks of dealing with constants for you.
While the 68k ISA was a dead-end, that's mostly because Motorola was all-in on PowerPC; AFAIK there's no fundamental reason they couldn't have pulled an Intel and thrown transistors at it until it could keep up despite the decode complexity.
FYI, the 68060 was fast, providing a good upgrade path from the 68040s found in A4000s.
Near the end, after the initial bankruptcy, PowerPC boards, some with 060s and PPC chips both, hit the scene.
Entire SDKs were introduced, and many bits of software were ported. This was a good bridge, and since Apple made this sort of jump (e.g. they didn't die when switching from 68k to PPC), I don't think a healthy Commodore would have either.
So I disagree. The Amiga could have transitioned too.
The biggest problem I have with RISC assembly is that only being able to access memory through dedicated load/store instructions tends to make the code more complex with additional register shuffling.
I think it's too much to lump all RISC chips in the same bucket.
I have written some assembler for MIPS and Alpha AXP back in the day. The MIPS assembler was quite straightforward and reasonable, but I found the Alpha to be very hard.
Another thing that made this era special is that these were meant to be mass market devices in which it was kind of assumed that most owners would want to learn basic programming (usually in BASIC, no pun intended) to get the most out of their devices. The machines came with manuals including programming tutorials.
> The Amiga 1200 was arguably the last and best of the golden era "console computers" (meaning all-in-ones with the CPU inside the keyboard that could display on a standard analog TV).
Surely if "last and best" is your criterion, one can hardly discount the Atari Falcon. (And the earliest models in the Acorn Archimedes line were definitely "all-in-ones" integrating a keyboard although they might have lacked in-built TV out.)
The Acorn A3010 was the only "home" computer, targeted at the non-educational market, that Acorn ever made. It is distinguished by being the only Acorn computer with green (not red or grey) function keys, and also by being the only Acorn computer to which Econet could not be fitted. And it had a TV-out.
At the time it was launched, 1992, the ARM2 processor inside it was considerably more powerful than workstations that could be bought for ten times the price. It was insanely powerful, but hamstrung by not having any form of graphics acceleration. (The CPU was so fast it was just assumed you'd do everything in the CPU.)
It's a shame, really. RISC OS running on ARM deserved to be the British standard of computing, but Acorn's failure in the US market meant they could not compete with the sheer PR power of the large American competition.
There must be some rose-tinted glasses in the retrospective pipeline here. Even assuming a generous IPC advantage, a 12 MHz ARM wasn't likely to be competitive with a contemporaneous SPARCstation @ 40 MHz or an Intel 486DX2 @ 50 MHz.
What made the Amiga great is that it was not just a bit but way more advanced than anything out there. When I had to, reluctantly, switch to a 386 PC, it was very painful because it felt like going backwards in time by several years. A friend of mine had an Amiga 1000 (it came out in 1985, before the 500, which surfaced in 1987): going to his home was witnessing the future.
Fun sidenote: I recently found I still have my bootleg 5.25" Amiga floppy drive: this was way cheaper than buying 3.5" disks. If you planned to buy more than 30 floppies, it was cheaper to buy a 5.25" drive and 30 5.25" floppies than it was to buy just 30 3.5" floppies. We'd add a switch at the back of the Amiga to decide which drive to boot from. I plan to open that drive one of these days and see how it was made.
My Dad bought the family computer for Christmas, 1985. It was an Amiga 1000 and I still vividly remember him teaching me to type "dpaint" at the command line to fire up DeluxePaint. (IIRC, the DeluxePaint disk didn't come with even a minimal Workbench environment on it, just an AmigaDOS window). I was 6 years old.
I used that machine for 10 years. I made drawings and animations (DP III), added titles to the silly camcorder tapes my friends and I made (DeluxeVideo), wrote book reports (ProWrite), made Christmas and birthday cards (various), composed music (Music Studio), played games, and did some light exploring of the system itself, playing with startup-sequence to improve boot times and customize things.
It wasn't until I was a junior in high school that I switched to a 486, and it felt like a step backward in many ways. Of course it was much faster, and had a hard-drive (my first ever), and the resolution was (only a bit!) better. But it was still limited to 16 colors, only 640x480, and all the software was bland business-y bullshit.
I have nothing but fond memories of that machine, and I wish Amiga had been more successful in the market. It took a decade to get back to where they would have been had the trajectory continued from their 1985 launch.
When I went with the 10 MHz 8088 with a 40 MB hard drive I also looked at the alternatives of getting either the Amiga 500 or just getting a Lt. Kernal 20 MB hard drive for my 128.
The driving factor for needing more storage of course was for my BBS at the time.
Wow, “Lt. Kernal 20 MB” is something I haven’t thought about in ages. A friend of mine bought whatever the competitor to the Lt. Kernal was at the time, used, and it was a complete lemon — he never got it working and never recouped his money. I wish I could remember the name of it — he was going to start his own BBS called and I was going to be a subop.
I was a hard-core BBS user back in that era but never actually ran one, so I went 64 > 128 > Amiga 500 > 386.
I remember, sometime in '91 or '92 getting into a flaming argument on a local BBS message board about why the Amiga was better than the PCs at the time. While Amigas weren't common in the U.S. mid-Atlantic at the time they did have a small following of die-hards (even then!) and even a very nice retail store at the local mall. A couple of my friends even had Amigas that I enjoyed playing games on from time to time.
The arguments even then were frozen in time: look at all the Amiga HAM mode colors vs your 286's CGA! Ha! PC beeper vs the Amiga's sweet 4-channel digital sound. And on and on.
Meanwhile, I typed my angry responses on my 33 MHz 80386 with a 387 math coprocessor, a Sound Blaster, and VGA graphics, on a system with a case actually designed to hold a hard drive and 2 entire MB of RAM. I was bewildered at this stranger's insistence that their thoroughly hacked A500 was a better system. I had also recently been introduced to the demoscene, and while the Amiga demos had better design, the PC demos of the time were really quite spectacular, introducing real-time 3D in ways nothing at the time could hope to match.
It was an angry, bitter argument that went on for weeks until we both tired of it and gave up.
Looking back what I really remember was the resentment at losing their platform. Of investing and heading down a spectacular evolutionary dead-end and a desire for it not to be wiped away by boring beige boxes that lacked any of the passion or wit that the creators of the Amiga had infused into their creation.
What made Amigas great was the creative life energy that was woven throughout it, more than any specific technical considerations. If the Apple Macs were created with the taste of expert graphic designers, and PCs by accountants, the Amiga was made by the kind of anarchist creatives who would later go on to make things like the early Burning Man, perfect the 90s counter-culture digital art movements like the demoscene, produce off-beat public access videos with the Video Toaster and the epic Babylon 5, and allow bedroom game coders to absolutely maximize their art. It was the seed that gave visual representation to earlier cyberpunk.
It influenced everything and no lessons were learned from it.
> Looking back what I really remember was the resentment at losing their platform. Of investing and heading down a spectacular evolutionary dead-end and a desire for it not to be wiped away by boring beige boxes that lacked any of the passion or wit that the creators of the Amiga had infused into their creation.
You put into words the feeling I've held onto for 25 years. When I finally gave up on my A1000 and moved to Windows on a 486, it felt like a loss to be grieved.
> What made Amigas great was the creative life energy that was woven throughout it, more than any specific technical considerations. If the Apple Macs were created with the taste of expert graphic designers, and PCs by accountants, the Amiga was made by the kind of anarchist creatives who would later go on to make things like the early Burning Man, perfect the 90s counter-culture digital art movements like the demoscene, produce off-beat public access videos with the Video Toaster and the epic Babylon 5, and allow bedroom game coders to absolutely maximize their art. It was the seed that gave visual representation to earlier cyberpunk.
Seriously, you're capturing my memory of this platform perfectly. The Mac was the nearest thing to an Amiga, but it was still lacking in some hard-to-pin-down way that I think you just nailed.
IMHO Macs were not all that great at that time (early 90's). I think what came closest to the "Amiga experience" were NeXT and Silicon Graphics workstations, at 20x..100x the price.
>Meanwhile, I typed my angry responses on my 33 MHz 80386 with a 387 math coprocessor, a Sound Blaster, and VGA graphics, on a system with a case actually designed to hold a hard drive and 2 entire MB of RAM.
... which cost you a fortune and yet was still worse than the oldest and cheapest Amiga, never mind the A1200, which was made available around that time.
I finally "upgraded" to an Athlon with Linux in 2000. I don't feel like I missed anything in between: through the years, I had lots of chances to extensively use 386/486 machines with DOS and Pentiums with Windows 9x away from home, and at no point did I think the user experience held a candle to my Amiga, which by the way was just an A500 with 1MB RAM and no hard drive.
By '91 the PC was finally starting to catch up. In the late 80s the Amiga was far superior, at least as far as consumer-priced equipment goes. I was in the process of switching from my beloved A500 to a 386sx-40 (IIRC) in '91 when law enforcement confiscated them all, but I remember wanting a PC for more power.
Got caught hacking. Charged with a couple felonies and a misdemeanor. Did probation for a few years and was supposed to stay off computers but I ended up back online with a 300 baud modem and a green screen terminal.
Unlike other books about retro computing, it doesn't just try to trigger your nostalgia; it's an actual deep dive into the history and the technology of this machine.
For example: It explains in detail how the famous "Boing Ball" demo used all of the Amiga's custom chips to keep the CPU almost idle while running. This was important for showing off the preemptive multitasking capabilities of the Amiga OS.
It also includes a thorough look into the implementation details of Deluxe Paint I and how the IFF format came to be.
I haven't read that book, but your description reminds me of this recent video which I thought did a good job explaining why the Amiga was so great: https://youtu.be/PHN8ANlR8KI
As a kid in a computer store in 1987, it was a pretty easy choice. The PC was either monochrome or 4-color CGA graphics. The Amiga had a 3D ray-traced animation of juggling spheres.
The Mac II could display millions of colours, but not at the same time.
From wikipedia:
"The Macintosh II includes a graphics card that supports a true-color 16.7 million color palette and was available in two configurations: 4-bit and 8-bit. The 4-bit model supports 16 colors on a 640×480 display and 256 colors (8-bit video) on a 512×384 display ..."
There were optional video RAM upgrades, but none that gave you 'millions of colours at 640x480'.
Wikipedia also notes: "The video card does not include hardware acceleration of drawing operations."
Also, for comparison, in 1987 when the Mac II came out (two years after the Amiga 1000) you could buy a Mac II for USD$5,500 or the Amiga for USD$1300.
That depended entirely on what video card you used; Radius, RasterOps, and others were doing full 24-bit video on the Mac II in 1988-89 on two-page displays. Yes, the video card and monitor each cost about $3000 at the time, to install into your $8000 Mac II or IIx, but they were broadly compatible (not needing drivers for particular applications) and were commonly used in print & publishing.
Sure, but we went from 'as a kid in a computer store in 1987' to 'if you waited a few years, and stumped up an order of magnitude more money, you could buy something nearly as good from Apple and another company'.
That rough timeframe would see the Toaster arriving in the Amiga world, which would put the Apple offerings even further back in comparison.
Woz is / was a legend, but I'm certainly no Apple Inc apologist, so my reaction to a story about what made the Amiga so great is a bit different, I guess.
As much as I respect Dillon, I can't say I've seen anything about DragonflyBSD that suggests it includes anything of what made AmigaOS interesting. Maybe there's more in the internals. I'd love to know more about it.
That doesn't really feel like it's giving you anything worthwhile. AROS and UAE provide more than good enough abilities to run old AmigaOS code. But old AmigaOS code doesn't really bring any of what was great to a modern system.
I hope it's exaggerating the similarities of Hammer and FFS, because FFS was dated already a couple of decades ago. It was never a strength of AmigaOS.
Yeah, I'm mostly being tongue-in-cheek. Looking forward to when my kid is old enough to start tinkering with an Arduino, for example (nothing quite like that around when I was a kid).
My kids (5, 7) got a Lego Boost for Christmas this year.
They've spent basically 4 whole days with it, with breaks for food and sleep. It was amazing. Lego has found a spectacular balance between physical play and screen time.
No generation of Amiga that was commercially available could come even close to rendering the real-time 3D graphics of Minecraft. The Amiga was impressive but even running something like Doom (or Gloom - since Doom was never officially ported) was a struggle on the cutting edge Amiga 1200.
Bitplane graphics were probably a bad decision. They're good for games, but terrible for any program where you want to change one pixel at a time. If you have a linear frame buffer, lighting up a pixel means a read from memory, some OR'ing, and a write. When you have 5 bitplanes, you need a read-modify-write on each plane: up to 5 reads and 5 writes. Good for games, but bad for mostly everything else.
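To make that cost concrete, here's a small sketch of the two memory layouts (Python as an illustration, not Amiga code; buffer sizes and the 5-plane depth are just example assumptions):

```python
def put_pixel_chunky(fb, x, color):
    # chunky: the whole color value lives at one address, so one write
    fb[x] = color

def put_pixel_planar(planes, x, color):
    # planar: bit i of the color value goes into plane i, so every
    # plane needs a read-modify-write of the byte holding pixel x
    byte, bit = x // 8, 7 - (x % 8)
    for i, plane in enumerate(planes):
        if (color >> i) & 1:
            plane[byte] |= 1 << bit       # read, OR, write
        else:
            plane[byte] &= ~(1 << bit) & 0xFF  # read, AND, write
```

Setting one pixel of a 5-bit color touches one address in the chunky case and up to five in the planar case, which is the overhead being described.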
Also, the keyboard: it came out a good couple of years after the DEC LK201, which is very similar to both the Amiga's and the ST's. The IBM Enhanced came out at about the same time, but, IIRC, the 122-key layout dates back to the Model F family.
I wonder how hard it is for people who were not old enough back then to properly place these machines in their historical context.
Changing a single pixel at a time wasn't a common operation in the 80s when the Amiga was popular.
The main class of application that would have used single-pixel modifications would be paint-style programs, and the cpu was generally fast enough to keep up with user input.
There are advantages to planar graphics that can help applications too. Being able to dynamically change the number of bitplanes based on an application's color requirements both saves RAM and increases CPU performance.
It wasn't really until about 1992 that the lack of chunky graphics really became a liability, with the rise of 3D games like Doom and various multimedia applications, like streaming video off CD-ROM and more advanced productivity apps. If Commodore had actually delivered the AAA chipset in 1993 with its various chunky graphics modes, maybe the platform would have survived.
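This is why chunky modes mattered for 3D: engines like Doom render into a chunky buffer, and on a planar Amiga you then had to pay for a "chunky-to-planar" (c2p) conversion every frame. A naive sketch of what c2p has to do (Python illustration; real Amiga c2p routines used clever word-level bit merges rather than a per-pixel loop):

```python
def chunky_to_planar(chunky, depth=5):
    # scatter each pixel's bits across `depth` bitplanes:
    # bit i of pixel x lands in plane i, at bit (7 - x % 8) of byte x // 8
    n = len(chunky)
    planes = [bytearray((n + 7) // 8) for _ in range(depth)]
    for x, color in enumerate(chunky):
        byte, bit = x // 8, 7 - (x % 8)
        for i in range(depth):
            if (color >> i) & 1:
                planes[i][byte] |= 1 << bit
    return planes
```

Even done cleverly, this was per-frame work a chunky frame buffer simply doesn't need, which is the liability being described.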
> single-pixel modifications would be paint-style programs
Any app that draws a pixel as a single operation does that. To make this faster, you'll need to keep a bitmap holding the image and then, maybe, use the blitter to render it to the screen. Bitplanes allow some tricks with games, when you split the planes into more than one playfield; you could even use them for windowing, putting the active window on one group of bitplanes above the rest so that you wouldn't have to redraw windows, because occlusion isn't destructive.
Yes, but I'm having a hard time thinking of many applications from the late 80s and very early 90s that required per-pixel control of the frame buffer.
Text rendering is not done per-pixel, it's done in character-sized blocks. Most UI elements at the time were tile-based. You can do a lot with just blocks of pixels. And with anything that's done in large blocks (especially blocks that are multiples of 8 pixels wide), the per-pixel cost is amortized and becomes much less of an overhead.
If you are predominantly working with bitmaps, there aren't many reasons to operate on a pixel basis, but, take, for instance, drawing a line of n pixels. For a linear 8-bit frame buffer, you'll need up to n reads and n writes (because the 68000 had a 16-bit bus). For an 8-plane buffer (for the sake of simplicity), you'll need 8 times as many operations, one for each plane. If you know you'll be setting or unsetting a pixel in the same 16 pixel region, you can collapse those in-plane ops into a single write, but that requires some cleverness (like off-screen rendering to fast RAM).
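The collapsing idea mentioned above can be sketched like this (a Python illustration, with 8-pixel bytes standing in for the 68000's 16-bit words): a horizontal run becomes one masked read-modify-write per (plane, byte) instead of one per pixel.

```python
def hline_planar(planes, x0, x1, color):
    # draw pixels x0..x1 (inclusive) on one row: build one mask per
    # byte, then apply it to every plane with a single masked
    # read-modify-write, collapsing the in-byte per-pixel ops
    first, last = x0 // 8, x1 // 8
    for b in range(first, last + 1):
        lo = x0 - b * 8 if b == first else 0
        hi = x1 - b * 8 if b == last else 7
        mask = 0
        for px in range(lo, hi + 1):
            mask |= 1 << (7 - px)
        for i, plane in enumerate(planes):
            if (color >> i) & 1:
                plane[b] |= mask
            else:
                plane[b] &= ~mask & 0xFF
```

For a run of n pixels the per-plane cost drops from n operations to roughly n/8 (n/16 with real 16-bit words), which is the cleverness the comment alludes to.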
Fun fact regarding bitplane: a friend of mine realized that he could use only 16 colors instead of 32, have all colors from, say, 0 to 15 be black, then clear the whole screen by zero'ing only the first bitplane (so effectively only blitting one-fifth of what would typically need to be blitted in case you wanted to clear the whole screen between two frames).
Problem is: running the Amiga in 16 colors mode (instead of 32) was faster than using that trick in 32 colors mode.
However... That trick worked on the Atari ST. It was a very fast way to clear the whole screen. But as the Atari ST only had 4 bitplanes to start with, then you'd end up with only 8 usable colors if you used that trick.
We're talking decades ago, but I'm pretty sure I remember this correctly (as in: we actually tried and timed all this on both the Amiga and the Atari ST back in the days).
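A sketch of the trick as described above (Python model, not timed Amiga code; the palette values and screen size are made up): with palette entries 0-15 all black, you only have to clear the plane holding each pixel's top bit, since that forces every pixel value into the black 0-15 range at one-fifth the cost of clearing all five planes.

```python
BLACK = (0, 0, 0)
# entries 0-15 all black, 16-31 arbitrary visible colors
PALETTE = [BLACK] * 16 + [(i, i, i) for i in range(1, 17)]

def pixel_value(planes, x):
    # reassemble a pixel's color index from its bitplane bits
    byte, bit = x // 8, 7 - (x % 8)
    return sum(((p[byte] >> bit) & 1) << i for i, p in enumerate(planes))

def clear_via_top_plane(planes):
    # zero only the most significant plane: every pixel value drops
    # into 0-15, which the palette maps to black
    planes[-1][:] = bytes(len(planes[-1]))
```

The catch the comment mentions still holds: you pay for the fifth plane all the time, which is why a genuine 16-color mode could beat the trick.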
> I wonder how hard it is for people who were not old enough back then to properly place these machines in their historical context.
Yup, what an era. At least I've got my Raspberry Pis today.
The original Amiga shipped in 1985, and at that time the bitplane approach offered a reasonable trade-off versus chunky pixels by enabling some uniquely powerful capabilities. When Commodore died of self-inflicted wounds in 1994, a much more powerful new chipset codenamed "Hombre" had been created and was working on prototype boards. Hombre had chunky pixels, among other cool things: https://en.wikipedia.org/wiki/Amiga_Hombre_chipset. Too bad it never shipped.
Bitplane graphics allowed the Amiga to do a lot of graphical tricks other machines, given the limited memory of the times, just couldn’t do. Examples include halfbrite and HAM modes. It’s part of the reason anyone who saw Amigas in the early days felt like it was decades ahead of the competition.
By the time chunky mode graphics and 3D rendering really started to have an advantage, computers were moving to separate graphics cards anyway, so I don't see it as a bad choice. Had Amiga continued, I'm sure we would still be using Nvidia and AMD graphics cards in them now. In fact, third-party graphics cards did start appearing for the Amiga.
The bad decisions for Amiga all came down to Commodore sucking it dry and not putting any money into the engineering division.
I think about half the comments I've ever made here on HN are about my Amiga 1000 that I got in 1985. By 1989 it had a 14MHz 68020/68881 with 4MB Fast RAM, an 80 MB SCSI drive, and a MIDI interface that I built myself from diagrams in a Byte Magazine article. Such an awesome piece of kit. I worked in front of a Sun 3/50 (that's where I got the SCSI disk) and used the Sun compilers to write my Amiga code. A poor person's Sun workstation it was.
I see a lot of hymns to the Amiga, and people reading them would naturally think: if only I'd had an Amiga, think what I could've done...
Well, I had an Amiga, and all I did was use it to play games; maybe those games were a little aesthetically better, but in the end, I didn't do much with it.
So even if you didn't have an Amiga, don't worry - you might never have done anything great with it anyway (like me).
The A500 helped me study how digital gates and ICs work, using software that emulated them, plus the excellent Don Lancaster cookbooks. I managed to "build" a device that could grab a value on a port, store it in memory, get it back, and send it over a serial line to be reconstructed on the other side. It was awfully slow, of course, but the experience turned out to be of great help to me.
I also made some bucks by building and selling MIDI interfaces, ADC0804 based parallel port audio samplers and a very simple device which would activate a beeper whenever there was a disk write. Most viruses replicated by writing themselves on the bootblock as soon as a diskette was inserted so an unexpected beep could be a sign that something was going on.
With the A2000 plus the Dr. T's KCS sequencer I managed to compose and record my first prog rock song over 20 minutes long, using a Roland R8 drum machine, a Yamaha TX81Z expander for bass, a TX802 for guitars and other instruments, and a Roland Juno 2 for solos and pads. It was amazing how the 8MHz Amiga 500 and 2000 (actually 7.09 MHz, as the system clock was synced to the video clock, which in my case was PAL) could record a very fast realtime double stroke roll without a hiccup (sw/hw limit was 1/384). I used this seemingly unique feature of the R8 by pressing both flam and roll buttons while modulating the dynamics on the instrument pad, and the Amiga didn't complain. (Any Neil Peart fans out there should understand what I was trying to accomplish :^)
On all machines I made my first experiences with 68K assembly and other languages, also developing along with a friend a chat system for the Dialog Pro BBS software.
The Amiga also taught me how fast a user interface can be if well developed, and how to make use of system resources without wasting them. I would probably never think that Android on a 2GHz phone today is slow if I hadn't experienced AmigaOS on a 25MHz computer back then.
Also, AmigaOS had no resource tracking (or memory protection), so if I allocated, say, one byte and exited from my program, that byte would remain unavailable until the next reboot (short of using dangerous low-level tools to free it). This forced developers to be very careful when using resources: each allocation needed its corresponding free before exiting, and situations where one would need to allocate A, then B, then C, and one of them wasn't available had to be taken into account (usually by conditionally freeing in reverse order).
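The discipline described here can be sketched like so (a Python toy, with a made-up `ChipRam` class standing in for exec's AllocMem/FreeMem, which tracked nothing for you and returned NULL on failure):

```python
class ChipRam:
    # toy stand-in for the OS allocator: whatever you don't free
    # yourself stays gone until "reboot"
    def __init__(self, total):
        self.available = total
    def alloc(self, size):
        if size > self.available:
            return None          # AmigaOS returned NULL, no exception
        self.available -= size
        return size              # the "pointer" is just the size here
    def free(self, size):
        self.available += size

def run_program(mem):
    # allocate A, then B, then C; on any failure, conditionally free
    # what is already held, in reverse order
    a = mem.alloc(100)
    if a is None:
        return False
    b = mem.alloc(200)
    if b is None:
        mem.free(a)
        return False
    c = mem.alloc(300)
    if c is None:
        mem.free(b)
        mem.free(a)
        return False
    # ... do the actual work ...
    mem.free(c)
    mem.free(b)
    mem.free(a)
    return True
```

The point is the shape of the error paths: every exit from the program has to hand back exactly what it took, because nothing else will.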
So did I do anything great with the Amiga? Probably not, but the knowledge and experience turned out useful in all my jobs, and still is today. To me, using the Amiga alone was the great accomplishment.
C was a pain in the ass, at least for those like me who only had one disk drive.
That's why assembly language was the natural progression from Basic back then.
Aztec C was a very popular, commercially available C compiler for the Amiga at the time. And for any reasonably sized program, it was fast enough. I worked on one unusually large program for the comparable Atari ST that took about half an hour to build. Most "normal" programs compiled a lot faster.
I seem to recall that Dice-C was faster than Lattice/SAS.
If C compatibility wasn't required, then Amiga E was possibly the best option if one wanted a (very) fast compiler paired with a high-level, powerful language.
I wonder sometimes about the opportunities and impetuses surrounding different people, thinking that we need a lot of the right things before anything is at all likely to germinate.
The A1000/A500 reigned from 1985 to 1990 or so, more or less the same platform for 5 years; a platform on which I could write all the programs I wanted in pure assembler, eventually knowing every register and bit of HW in the entire system. Essentially just the right amount of complexity for individuals (kids, even) to learn without the knowledge getting obsolete before you mastered it. It was "lightning in a bottle", so to speak.
I'm glad the Arduino, RPi and other small devices exist today, that don't evolve as fast as desktop OSes or mobile SDKs do, that give the chance for curious minds to learn before their knowledge gets obsoleted.
The Amiga was hackable. While the PC ecosystem cut corners, and the Macintosh tried to keep people from knowing the internal details, the Amiga enjoyed a hacker/tinkerer culture from the start.
13:21 in the linked video: using the copper to change palette below a certain scanline - that is genius! I really like how creative the 80's and early 90's platforms were!
I think that was also the idea behind the HAM(?) image format, which allowed you to display images with far more colors than the palette could hold (HAM itself did it by modifying one color component of the previous pixel as the frame is drawn; related tricks swapped the palette per scanline). A similar trick was used to change the location of the screen memory midway through the frame, and that's how they implemented the guru meditation error drop-down. Unfortunately, IIRC, the CPU was so slow the raster would travel about ten pixels between changes, so it was mostly just good for effects that used full scanlines, like raster bars, etc. You couldn't use it to programmatically draw a high-resolution image or anything like that.
>A similar trick was used to change the location of the screen memory midway through the frame, and that's how they implemented the guru meditation error drop-down.
A simpler example: the support for showing multiple virtual desktops (Amiga called them screens) at once. The DMA registers in Agnus and the video mode in Denise would be switched to the next screen at a certain raster position, triggered by the copper.
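The copper essentially executed a tiny program of WAIT/MOVE pairs per frame, which is what makes both the palette-below-a-scanline trick and the screen-switching trick possible. Here's a toy simulation (Python; 0x180 is the real COLOR00 register offset, everything else is simplified and made up for illustration):

```python
COLOR00 = 0x180   # background color register offset on the real chipset

def run_copper(copper_list, num_lines):
    # toy copper: "WAIT" stalls until the video beam reaches a given
    # scanline, "MOVE" pokes a chip register; returns the COLOR00
    # value in effect on each scanline of the frame
    colors, regs, pc = [], {COLOR00: 0x000}, 0
    for line in range(num_lines):
        while pc < len(copper_list):
            op, a, b = copper_list[pc]
            if op == "WAIT" and line < a:
                break            # beam not there yet; resume next line
            if op == "MOVE":
                regs[a] = b
            pc += 1
        colors.append(regs[COLOR00])
    return colors

# blue background above scanline 100, red below, with zero CPU work
copper = [("MOVE", COLOR00, 0x00F),
          ("WAIT", 100, 0),
          ("MOVE", COLOR00, 0xF00)]
```

Since the real copper list lived in chip RAM and ran in lockstep with the beam, these mid-frame register changes cost the CPU nothing, which is why the trick was everywhere.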
So many great (and bad) things: the Aminet repository. The ARexx language. The Directory Opus customizable file manager. Ability to sync with TVs. Hundreds of rich assembler-coded video, audio, and graphics demos of international origin. Cinemaware games. NewTek DigiPaint. The Diga telecom tool. Miami TCP/IP (I helped H.K. get PPP working with NT Server). Unixy and open source tools available. Lattice C. Very capable multitasking. Hardware accelerators, FPU, SCSI boards. Draggable drop-down virtual screens. A single fixed root vector for the OS (I no longer remember why it was important). Logical volumes and automatic mount requesters; good for small storage systems, as the system knows to search and otherwise ask you to insert the correct disk, data or program. Dungeon Master. FA-18. Music trackers and mods. Good-enough-quality sound. Again, Directory Opus, and I'm sure I missed a lot.
Doom is probably the most obvious example of how the PC eventually came to overtake the Amiga’s graphical capabilities, but I think it’s less well known that it took even longer for anything to overtake the Amiga’s audio capabilities.
Demoscene creators were putting out fantastically creative polyphonic rich music on Amigas for years while Doom-playing PCs were still blip-blopping.
Even when Sound Blaster type cards (with sampled instruments) became popular, MIDI was the standard of PC audio, which couldn’t hold a candle to MODs.
The MOD music format actually outlived the Amiga, being used in 1998 in Unreal and in 2000 in Deus Ex.
Doom is actually what made me scrap my Amiga and get a PC. Some guy showed me his collection of pirated id Software games, and although the machine was big and ugly and the sound was crap, I knew I just had to have one.
The Gravis Ultrasound was a 3rd or 4th generation PC soundcard that came out 7 years after the Amiga. Its claim to fame was basically that it was as good as an Amiga.