This point is more important than many people give it credit for;
Thanks.
<snip> sunk a lot of costs into the x86 instruction set, and it was simpler to throw a few thousand engineers at the hardware AND a few hundred thousand at the software
Yep (but here you're agreeing with me, and now I'm agreeing with you agreeing with me ;-) )!
<snip> An early ARM running a microkernel pushed a lot more onto the individual software developer (also giving them a lot more freedom) compared to x86.
Right, and this explains how ARM started out and expanded its market too. As a recap for those who don't know (which probably doesn't include
@adespoton, but might include some future reader of this comment), ARM began at Acorn Computers in Cambridge, UK.
Acorn had achieved some early success with their 2MHz (i.e. 2x faster than an Apple ][ or a VIC-20/C64) 6502-based 8-bit computer: the BBC Micro. The BBC Micro was indeed commissioned by the British Broadcasting Corporation in 1980-1981, because the UK government had been shocked by the sudden implications of ICs and how far behind the UK (perhaps) had fallen. The BBC figured that if they commissioned and endorsed a computer, they could educate the British public with it. And they did: they built an educational TV series around it, and something like 80% of UK schools bought BBC Micros. The spec was good, which made the computer relatively expensive (£400 in its day), but it was a fast 6502 running a very fast version of a structured BASIC.
This was a similar situation to IBM's: "Nobody ever got fired for buying a BBC Micro for a school". If you knew nothing about computers (which went for most teachers and head teachers), you would feel safe buying it. But by the mid-1980s it was obvious that 8-bit computers had had their day, and Acorn needed to find a proper successor.
I was at the 30th anniversary of the BBC Micro in 2012, at ARM in Cambridge, where Steve Furber told the story. Furber and Sophie Wilson (who designed the BASIC) went to Intel in the US to see if they would licence the 80286 to Acorn. They liked the CPU and figured they could design a more efficient bus interface, but Intel wouldn't let them create a custom 80286. Before they returned to the UK, they dropped in on the Western Design Center, because they knew it was designing the 16-bit successor to the 6502: the 65816 (the 65C816 CMOS version came a bit later). They thought it might plug the gap.
What they didn't expect was that WDC was basically a two-person outfit with some students dropping in to help out. And this made them think: "If these guys can develop a CPU, then Acorn can too". They'd heard about the early RISC research, so they set about designing the ARM (Acorn Risc Machine) CPU, modelling the design in MODULA-2 on a BBC Micro itself. Furber did the architecture, while Wilson wrote BBC BASIC for it, and the two efforts went hand in hand: the language influenced the design of the CPU. The key thing is that
ARM came out small and low-power, because the limited engineering resources forced the design to be minimal.
The first CPU taped out in 1985 at about 25,000 transistors. The next iteration, the ARM2, was used in the earliest Acorn Archimedes computers. These computers ultimately attracted the attention of Apple after they'd given up on the AT&T Hobbit CPU for the Newton. Apple then persuaded Acorn to spin off the CPU division into a separate company, trading under a modified expansion of the name: Advanced RISC Machines.
And this is how ARM made it work. The earliest embedded applications for ARM were single customers shipping high volumes of hardware with low power requirements. Here, the cost of the programmer is small compared to the cost of manufacturing the device, which makes the application viable. It didn't matter that drivers and OSs had to be ported or rewritten: ARM had the basic advantage that embedded applications needed. With this, the portfolio of ARM chips grew over time.
In both cases, ARM progressed from pretty much entrenched positions: schools, to embedded systems, to general-purpose computers. Of course, the first ARM application was a general-purpose computer; it's just that those computers weren't in an entrenched position, which is why they were out-competed by the PC in the end.
That's really why I enjoyed using the MC68k series so much, after all, even if it was self-limiting
I like 68K coding, because it's relatively easy.
I think people overthink why x86 won. IBM made an open system where nobody had to pay money to make hardware upgrades. Large software companies made programming languages for x86 to support office and industry, and tons of applications came about because of that. Finally, sales were so great that Intel and AMD (a second-source CPU supplier from the start) had the money to make much better and faster CPUs, killing everyone else.
Some of this is true, but some of it isn't really what happened at the time. The IBM PC 'won' because execs knew "Nobody ever got fired for buying IBM." So, technical rationalisations along the lines of "it's an open architecture" are themselves overthinking the situation, because the IBM PC had already won before those factors became major drivers of its success.
Nevertheless, it's still very true that Digital Research and Microsoft produced high-quality tools for x86 very early on, and this massively leveraged the popularity of the architecture from a development viewpoint. You're correct that AMD was a second source from the start (an IBM requirement), but neither company had the money to out-compete other CPU manufacturers until after the PC became overwhelmingly dominant. Intel, for example, wasn't yet the all-powerful company it later became, even at the beginning of the 80386 era: the 8086 had been designed when the company was struggling, and the 80286 second-sourcing was already challenging its business model by the time the 80386 was launched (this is one of the reasons Intel sent AMD buggy microcode).
The 8086/8088 and the 68000 came out around the same time (1978 and 1979 respectively), and the only reason Intel won with a 16-bit CPU against Motorola's 16/32-bit 68000 was the volume sold to IBM.
Before the IBM PC, people had to relearn the OS and apps every time they upgraded machines, so they were used to it. Even in the early days of x86 you had a few versions of DOS, Windows 2.x/3.x, OS/2, CP/M-86, GEM, hell, even GEOS came out for the PC, plus assorted UNIXes. There were just so many choices and so many apps.