I think there's a lot of fondness for Acorn, at least here in the UK, but I'd like to offer a slightly more accurate history.
I was the owner of many Acorn machines, including the BBC B, Master, A410 and RiscPC 600. The hardware, clearly designed or at least originated by Sophie Wilson, was remarkable: robust, well designed and incredibly expandable. To this day I've not used a single computer that made as much sense as the machines Acorn kicked out. A human could learn everything about one in intimate detail without a problem.
However, the software was a source of constant pain. Firstly, nothing was finished when the Archimedes came out. The Arthur OS was apparently named for "A Risc operating system by THURsday", because their internal OS project, apparently Unix-like, went down the crapper during development and they had to hack something up quickly to have a minimum viable product. What I ended up with for my £1400 investment (a hell of a lot back then, and even now) was a barely usable OS consisting of a quick port of Acorn MOS from the BBC Master series with a naff GUI chucked on top. It wasn't fixed properly until RISC OS 2 came out in 1989, so I sat there with a lemon for a year. After that we were stuck with a cooperatively multitasked operating system whose worldview was completely different to anything else at the time, or since. A lot of progress was made, but it never had any real prospects, despite a lot of us clinging on to our initial investment.
Now, I certainly enjoyed the platform, but in retrospect I'd have invested my money in something else back then if I'd known what was going to happen.
I fully respect the achievements here and, more importantly, the legacy (I have 12 ARM processors still in various things around my house!), but for us footsoldiers who paid up back then, it wasn't all love and happiness.
I learned almost all my low-level computer knowledge from the Archimedes & Risc PC through the 90s. Very glad that ARM code is still a useful skill, even if knowledge of the absolute memory addresses of the OS SWI patch tables went in the dustbin :)
But it was only after Acorn imploded in 1998, and a couple of years of working with Linux, that I thought "hmmm, you mean I can write shared libraries for my C code that aren't kernel modules?" and "what, you mean the computer can just switch away from my task even though I've not called Wimp_Poll? What if I'm not done?" and "what, you mean the OS will just kill my task if I address some memory I'm not supposed to? How does it know I didn't intend to patch the OS from my desktop application?" etc. etc.
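For anyone who never wrote for the Wimp: the whole desktop only kept moving because every application volunteered control. Here's a minimal sketch of the obligatory polling loop in BBC BASIC, from memory; the SWI names and overall structure are the real ones, but the task name is illustrative and all error handling is omitted:

    DIM block% 256
    REM &4B534154 is "TASK"; 200 = minimum Wimp version * 100
    SYS "Wimp_Initialise", 200, &4B534154, "Example" TO version%, task%
    quit% = FALSE
    REPEAT
      REM Nothing else gets to run until we hand control back here:
      SYS "Wimp_Poll", 0, block% TO reason%
      CASE reason% OF
        WHEN 17, 18 : IF block%!16 = 0 THEN quit% = TRUE : REM Message_Quit
      ENDCASE
    UNTIL quit%
    SYS "Wimp_CloseDown", task%, &4B534154

Forget to call Wimp_Poll, or take too long between calls, and the entire desktop froze with you.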
Also the Archimedes (at least) was pretty much the most expensive computer on the planet at the time - something like £3000 in 1988 money - it's amazing they sold so many to people just on the strength of Zarch :) ( https://www.youtube.com/watch?v=ALfnZjCiuUQ )
Windows and the Mac OS were both cooperatively multitasked until Windows 95 in 1995 and Mac OS X in 2001 respectively, although they did have more memory protection than RISC OS.
As to the worldview, everyone had their own before the internet: / vs \ vs : vs . as the path separator.
Mac OS Classic had filetypes, same as RISC OS. It's a bit odd that the web still leans on filename extensions, really, considering it has MIME types.
The RISC OS GUI is still miles ahead of everyone else's. Its application packaging was pretty good and easy to edit (although the !Boot system should be restricted in what it can do).
But RISC OS never went through the kind of transition that Windows did (twice), or that the Mac did to OS X.
The GUI was amazing, and is still one of the easiest-to-use GUIs I've ever encountered, even though there were some major omissions, the main one being the total lack of accessibility: you couldn't drive it from the keyboard at all. (Win 3.1, OTOH, had amazingly good accessibility and still beats a lot of modern software.)
The underlying OS was not amazing. There were some good parts: the relocatable module system was very elegant, if rather alien by modern standards; the ability to resize nearly all memory areas on the fly (by dragging bars in the GUI! see the sketch below) was great; proper pluggable filesystem modules in the 1980s were a revelation; and the built-in BASIC was super fast and reasonably comfortable to program in, even though it didn't have structured types (some of the built-in ROM software was written in BASIC!)...
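Those draggable bars were just a front end on a single SWI, so programs could do the same thing. A one-line sketch, with the area number (4 for the font cache) quoted from memory:

    REM Grow the font cache (dynamic area 4) by 32KB at runtime,
    REM the same mechanism as dragging its bar in the Task Manager:
    SYS "OS_ChangeDynamicArea", 4, 32*1024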
But the bad parts are bad. The underlying technology is really primitive, being a non-preemptive OS with no memory protection worth anything, and it has suffered from many years of ad-hoc organic growth... e.g. a lot of the core APIs pass flags in the top 8 bits of addresses (sketched below), because back then the ARM only had a 26-bit address space and nothing interesting lived above the bottom 16MB. Running on a modern machine with more than 16MB of RAM? Good luck with that. APIs are duplicated everywhere with slight changes. There are lots of undeclared dependencies between modules (including recursive ones, which aren't really supported). Platform independence isn't really a thing, except where it was crudely bolted on for some platforms.
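An illustrative sketch of the idiom (not any specific API): as long as every pointer fits in 24 bits, one register can carry both an address and some flags, and that's exactly what breaks once real addresses grow past bit 23:

    DIM buffer% 255
    flags%  = &80000000               : REM e.g. a "do it in the background" flag
    packed% = buffer% OR flags%       : REM only safe while buffer% < &1000000
    addr%   = packed% AND &00FFFFFF   : REM the callee masks the flags back off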
Plus there are some... questionable... design decisions. My favourite is the big chunk of mysterious code in the main system allocator which gropes its way up the call stack every time you try to allocate memory. Why? It's looking for stack frames belonging to the system allocator, so that it can tell whether it's being called reentrantly. Why is it doing that? So it can detect whether it's being called from an interrupt, at which point it goes through an alternative call path and returns memory from a different pool!
If anyone's interested, a few years back I wrote a proof-of-concept RISC OS kernel reimplementation called R2: http://cowlark.com/r2/index.html
Unlike the real thing, it runs everything in user mode except for a tiny stub to handle SWIs. (There's a built-in ARM emulator.) It's complete enough to run Basic. While reverse engineering the operating system I found out way, way too much about how RISC OS worked inside. *shudder*
Weird and wonderful it was, but also amazingly hackable. Using tricks similar to the allocator's, I was able to "ROM patch" RISC OS 3.1 to look like 3.5 [1] using only a few bytes of module space. I do miss those days :)
I too owned a whole bunch of Acorn stuff. Most of the software was excellent: the best BASIC interpreter around and a very good shot at a DOS.
The Acorn 'Unicorn' was excellent but too expensive, and this is roughly where the parent comment comes in. So Acorn did a lot of good stuff before they lost the plot, roughly around the time they released the Archimedes. It was an amazing machine for the time, but the Amiga had far surpassed anything Acorn could offer software-wise, and the ST and Commodore were gobbling up the lower end.
Yes, the BASIC interpreter was the best thing on the platform, mainly (for me at least) because it was heavily structured, didn't require line numbering like earlier versions, and had a stupidly powerful inline assembler.
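To give a flavour of it, a small sketch from memory: BASIC V would assemble ARM code straight into a DIMmed buffer inside an ordinary program, and CALL/USR pass A% to H% into R0 to R7. The label name here is just illustrative:

    DIM code% 256
    FOR pass% = 0 TO 2 STEP 2
      P% = code%                 : REM assemble at this address
      [ OPT pass%
      .double                    ; entry point; R0 arrives holding A%
      ADD R0, R0, R0             ; return twice the argument in R0
      MOV PC, R14                ; 26-bit era return
      ]
    NEXT
    A% = 21
    PRINT USR(double)            : REM prints 42

Two passes, a label, and you could call the result immediately. No separate toolchain required.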
I had a 1040ST and an Amiga 500 at the time as well (spot the geek) and the software wasn't that great on those platforms either IMHO. Even PDS on DOS was nicer to program in with the 2kg pile of manuals.
The killer was the rise of the PC, and you know what? I'm glad it killed everything. Perhaps controversially, a couple of years down the line, and as someone who wanted to get shit done back then, things like Windows 3, VB, Word, Excel and OLE appearing were clearly the future.
... by being a careful GUI clone of the existing successful product, Lotus 1-2-3, right down to being largely formula-compatible and having the same hotkeys. That's why F2 is "edit current cell".
The plastic which held them in place was always broken on the school computers :-)
In all seriousness though, I personally owe a great debt to Sophie Wilson et al. for building computers and software that made it very easy to start programming the second after turning the computer on: a BASIC interpreter and an assembler available instantly from ROM. I had endless hours of fun building parallax-scrolling starfields by poking directly into video RAM and slowly rendering 3D scenes in Render Bender. It was just a shame that the computers were so expensive.
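For the curious, the video-RAM trick looked roughly like this on an Archimedes. A sketch assuming MODE 13 (320x256, one byte per pixel), using VDU variable 149 to find the screen base; the star count and colour are arbitrary:

    MODE 13
    DIM in% 8, out% 8
    in%!0 = 149 : in%!4 = -1      : REM ask for ScreenStart, then the -1 terminator
    SYS "OS_ReadVduVariables", in%, out%
    screen% = out%!0
    FOR i% = 1 TO 200             : REM scatter some "stars" straight into video RAM
      screen%?(RND(320*256) - 1) = 255
    NEXT

Move each star sideways at a speed tied to its brightness and you had parallax.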
I spent about a year of savings on my 'beeb' and it was the best money I ever spent. All tricked out, it cost as much as a very decent second-hand car, a very large amount of money for me back then. I very much recall that I bought the drive enclosure long before I could afford the drives (I basically bought the whole thing piece by piece as money became available). Because I couldn't afford the drives, I hacked my Sony cassette deck to function as a sort of poor man's disk drive by connecting the buttons to the user port and the end-of-tape detector to an input pin, so I could tell roughly where the head was on the tape. Really slow, especially during on-tape sorting, but it worked.
A friend of mine who had rich parents got his, plus two HD drives, for his birthday. I had to take the long way around but eventually got there.
Without that machine I'm pretty sure my career would have started 5 years later.
I was lucky enough to have an Archimedes when I was 14, and despite the Arthur OS it was the best thing in the world for writing games: the fastest desktop computer in the world, an easy-to-use 256-colour sprite system and interpreted BASIC as a programming language. When the BASIC compiler came out a few years later it all got even faster. The desktop UI seemed pretty amazing to me compared with anything else around that I had seen. But it was an expensive machine, granted.
"Now I certainly enjoyed the platform but in retrospect, I'd have invested my money in something else back then if I knew what was going to happen."
I take your point about Acorn as a business, but I used a 'lab' full of RISC OS 2 based A310s at college, with 20MB Rodime hard drives. We did scanning, DTP, Genesis multimedia packages &c and various home-grown projects.
What else could we have bought at that time for similar use cases? Not trolling; my memory of the time is hazy, and I recollect being extremely underwhelmed by the DOS-based PCs in another 'lab'.
EDIT: flashbacks to Aldus Framemaker on Apricot PCs, Amstrad PCW spreadsheet applications being used in a theatre box office, and an early 9" screen Mac being used with some form of DTP software.
My college had an (experimental?) machine they got from Acorn that dual-booted into a 'nix OS. It got stuck in a corner and forgotten about, but if it had been a few years later I think it could have been an amazing project for them.
When I was in secondary school, we had a plain grey, all-in-one keyboard and system desktop case that had a microdrive and was connected to a hacked colour TV via a massive umbilical cord. It turned out to be a BBC Micro prototype.
This is true; it's like the Domesday Project laserdisc/BBC micro system. Or Minitel. Ahead of its time at launch, but a technological dead end that ends up as a sunk investment.
Is there a good source of information about ARX (the originally-planned OS for the Acorn ARM machines) out there? I did search around a bit a few years back and found very, very little.
The dramatisation "Micro Men" https://en.wikipedia.org/wiki/Micro_Men covers some of the early history here, including the super-rapid creation of the BBC Micro and a cameo by Sophie Wilson.
RISC OS was roughly contemporary with Windows 3.0 and Apple's System 7, offering a cheaper and seemingly faster system (although rather idiosyncratic in its use of the mouse's menu button, and of drag-and-drop instead of save dialogs). It booted from ROM in something like a second.
Micro Men is available on YouTube [1] and is definitely worth a watch. The scene where one of the characters is eating noodles using multimeter probes as chopsticks made me smile.
You may struggle to find Sophie in that film because at that time she went by Roger; she's the one with the long hair (if I recall correctly).
Very informative programme, some funny scenes with Sinclair driving past in his C5.
I still love my Beeb, great machines. Very well written manuals too!
Second that, especially since David Braben was recognized formally as well, and he wouldn't have stood a chance of doing what he did without Sophie Wilson doing her part first.
If you took her work out of the world a surprising number of items would suddenly stop working.
Our public library had an Acorn Electron for anyone to use: I spent many free afternoons there until my parents doubled up my savings (at age 12) and I bought my own.
Around the same time I entered high school, and they had a whole classroom of networked BBC Micros!
Going from a BBC Master to an Acorn Archimedes 310 at the College I was working in then was like going from an Apple ][ to a Mac.
Archimedes 310s(?) were in use for medical graphics applications (expanded RAM, and I think I recall they had floating-point units of some kind). There was also a parametric CAD application that was used by Lucas(?) for a bit. The music software Sibelius was first released on this platform, having been coded in assembler by the Finn brothers for speed.
The subsequent financial and ownership history of the ARM processor design isn't as inspiring I'm afraid.
This is ridiculous. An entire article about the creation of ARM, and only a passing reference to Steve Furber, the guy who actually designed the chip itself. Either there was an intentional bias here, or it was very poorly researched.
I interviewed Steve Furber for a magazine feature about the ARM many years ago and he confirmed to me that Sophie pretty much designed and implemented the whole ARM instruction set in her head. It worked first time when switched on.
The way I hear it[1], she did the instruction packing on a pub napkin, while IBM were devoting quite a bit of computing horsepower to do the same thing.
[1] Vague recollections of an article in Acorn User. I'd really have to dig out those old magazines to be sure.
That wouldn't surprise me at all. And the really staggering thing is how little the ARM's design and instruction set have changed since then. The fact that you can still use Peter Cockerell's 1980-something tutorial on ARM assembly today shows how well it was designed. I put it up there with such quality-without-a-name pieces of engineering as the SR-71, the HP-12C and the Leica M3.
The SR-71 was so far ahead of its time, it's a little ridiculous. Look-down/shoot-down capability in the mid-1960s. Synthetic aperture radar mapping. An IMU/astronavigation system almost as capable as the early GPS units of a few decades later. An airframe that, by design, actually becomes stronger the more you fly it. Thermal management techniques that weren't used elsewhere for decades. I could keep going, but basically, for almost every single system in the aircraft, the next comparable system you can find was developed at least one to two decades later.
Oh, without a doubt. ARM assembly is a joy to code in, much more so than any other. It just feels... right. There's no fluff or cruft and, at the same time, it never felt like anything was really missing. RISC processors have a reputation for poor code density, but the ARM, even when it first came out, was the exception to that rule: it was pretty easy to get code density comparable to that of the 680x0 and x86.
I had a series of Acorn machines over the years. I used a trusty A410 with a 200MB hard drive as my main machine right up to the early 2000s, and taught myself both design and programming with it. I knew that thing inside and out.
At the time - and I still sort of do - I thought that, despite its idiosyncrasies, RiscOS was way ahead of Windows and the Mac; for example, the Mac only got scaling scrollbar widgets with System 9, and RiscOS's boot time was amazing. (Upgrading from RiscOS 3 to RiscOS 4 by carefully prising out and replacing a bunch of ROM chips was fun.)
In hindsight they made the right choice, but I was bitterly disappointed when my parents bought a Mac over a RiscPC. Teenage me didn't let them forget how unhappy I was for quite a while...
Getting RiscOS ported to the Raspberry Pi was a great move, a real blast of nostalgia, but I'd love to see what a modern-day version would be like – 64-bit, great graphics, lightning fast, weird extra mouse button.
It makes me happy that their technical legacy lives on in ARM, and their educational spirit is continued with the Pi.
Are you sure about that? (I hear they use low-end x86 cores and ARC cores but I haven't heard about ARM. ARC in general is very hackable while ARM is not hackable at all, so those rumors made sense to me.)
A lot of flash media controllers use an ARM core, for example. Also, Intel uses ARM's graphics in some of its low-power chips [2]. Because ARM chips sip power so well, you find them in some really surprising places.
It's historically relevant that at the time of those remarkable accomplishments Sophie was named "Roger". And this is an article about history, isn't it?
It might be an interesting adjunct to how a part of the tech community treated one of their own with acceptance but beyond that, no, not terribly relevant. If anything, that's entirely incidental to the story, other than the fact that the Acorn community was very accepting of her transition.
Consider the following scenario. Let's say Sophie was born female and never needed to transition. Now, let's say that she got married after Acorn folded in the late '90s, and took her husband's name. In that case, we'd refer to her by the name she uses now, not the name she used then. The same goes in this case.
The article includes a short paragraph of biographical information, including her date and place of birth - maiden name would usually be mentioned with that, especially if she had done relevant work under her maiden name.
Another good example is Lynn Conway [1][2], who, together with Carver Mead, wrote the bestseller "Introduction to VLSI Systems", which catalysed the Mead & Conway revolution [3] in VLSI design in the late '70s and early '80s.
It's certainly interesting to find out that a technically accomplished woman is transgender. It is also interesting (and perhaps relevant) that she accomplished these things in an environment where she was perceived as male, and therefore had none of the friction and other problems that might be encountered by women in such a male-dominated field. It's great that she was able to keep her role, and was accepted as a woman afterwards, as well.
It also seems more likely (than in other fields) to find trans women in IT, but I worry that this is simply confirmation bias at work. I wonder if there is any research on the relative statistics of trans folk in the IT industry?
EDIT: Thinking about it, perhaps it is just that, as with the GP comment, people are more likely to bring the fact up in a field like IT, where 'facts' are held to be more important, and in other areas it would simply be ignored or not mentioned?
The relevance is that Roger Wilson was a notable figure in the 80s and 90s -- he and Steve Furber were a powerhouse team. As far as I know, she transitioned to Sophie after her rise to notability, and she's quite a private person, so it's likely that many people who are very familiar with Roger's work will never have heard of Sophie.
It's an important clarification, just as we call the artist who insisted on using a new graphic symbol as his stage name "the artist formerly known as Prince".
How else are we going to keep track of one person's name changes over the years? It's just common sense, really.