That is something I found out in a ‘discussion’ (I don’t think it really deserves that name) earlier this week. Some guy was talking about the fact that it took a long time for 32-bit OSes to catch on after the introduction of the 80386. This part is true: simple fact-checking will reveal that the 80386 was introduced in 1985, while the first 32-bit version of Windows (NT 3.1), the most popular x86 OS, was introduced in 1993. And if you want to get more detailed, another reasonably popular 32-bit OS was OS/2 2.0, which was introduced not much earlier, in 1992. There were some more obscure early 32-bit OSes, such as Microsoft’s Xenix, which was already available in a 32-bit version in 1987, but that’s about it.
Now, where this guy went wrong was in his claim that Microsoft deliberately frustrated the 32-bit market by keeping MS-DOS 16-bit only. From there he launched into a whole argument about how OS developers deliberately limit the capabilities of hardware, so they can enable it as a paid ‘feature’ later, claiming that even iOS and Android still do this today, and that the hardware would be much more powerful with an unrestricted OS. This sounds like something that you either want to believe, or that tastes too much like a conspiracy theory to take seriously. It all depends on how you roll.
In my case, I actually experienced the 16-bit DOS era, and the rather painful uptake of 32-bit software. So I know for a fact that, at least in this instance, his claims are completely false. A major factor he ignored in his argument is price. Computers were a LOT more expensive back in the day, and progress was not as fast. Today, Intel launches a new line of CPUs every year (tick-tock), and each new line is immediately launched as a full vertical product line, replacing older products.
But in the 80s and 90s, things were different. A new line of CPUs was only introduced when a fully new microarchitecture was developed, which generally took 3-4 years, for example:
- 8086/8088: 1978
- 80186/80286: 1982
- 80386: 1985
- 80486: 1989
- Pentium: 1993
- Pentium Pro: 1995
In between, there were ‘new products’ launched from time to time, which were generally the same architecture at a higher clockspeed, thanks to an improved manufacturing process and/or some tweaks to the architecture. Since the latest generation of CPUs was very expensive, and there were generally only 2 or 3 clockspeed variations of it, the mainstream and low-end markets made do with the earlier generations. So, although the 8088 was introduced back in the late 70s, PCs with 9.54 MHz 8088 CPUs were still common mainstream computers around 1987/1988, and were actually still sold at prices exceeding those of the Commodore Amiga 500 or Atari ST, which technically had more powerful 68000 processors, and more advanced graphics and audio. So for consumers, even these PCs were already quite expensive.
For me personally, my first 32-bit PC was a 386SX-16, which is pretty much as low-end as 32-bit PCs got. I got it around 1990 or 1991. It was still quite an expensive machine at the time, and among my friends I was one of the first to have a 32-bit machine (not to mention the Sound Blaster Pro I had in it, which cost me as much as my entire Amiga had earlier; PCs really were horribly expensive). Many people still used an 8088 machine, or a 286. My machine came with 1 MB of memory, because memory was still incredibly expensive around that time. This meant that even though the CPU was capable of running 32-bit software, it was still very limited.
I later expanded the memory to 5 MB, and then tried to install Windows NT on it. It worked, but not much more than that: it ran like a dog, because the machine was quite underpowered. Besides, most software was still DOS-based, so you would run it inside the NTVDM anyway (which was neither fast nor very compatible). I also tried OS/2 Warp on it, which left roughly the same impression.
Around 1993 or 1994 I got my first 486, a DX2-66. I started out with 4 MB of memory, which again was barely enough to run early 32-bit games such as Doom, and Windows NT was still not much of a success. Memory was still quite expensive. Around the time Windows 95 came out, though, I upgraded to 8 MB, and that was the first time I had a 32-bit system that was actually powerful enough to run a full 32-bit OS. Windows 95 was also the first 32-bit PC OS (or well… 32-bit-ish) to become popular in the mainstream.
In those days, we also had some early linux machines at university. They were 486SX-25 machines with 8 MB of memory, if I recall correctly. They too were quite underpowered for running linux with XFree86 and a lightweight window manager. So it’s not as if Microsoft and IBM did such a horrible job with OS/2 and Windows NT either: a full 32-bit OS with proper memory protection, virtual address spaces, virtual memory, and multitasking simply carries quite a bit of overhead.
So anyway, blaming it all on Microsoft is a bit strange. For the most part, the problem was that 32-bit PCs were ridiculously expensive in the early years, not to mention underpowered. It is perfectly natural that there were no mainstream 32-bit office/home OSes before the early 90s, because the machines were just too expensive. 386 and 486 machines were initially sold mainly as servers, where mostly UNIX-like OSes would be used, and Microsoft did offer Xenix at an early stage.
You can still find old PC Magazines from those days on Google Books. Take this one from December 1992, for example, and see the high price of a 486DX2-66. And in 1990, you’d pay similar prices for a 386. Note also that they start with 1 MB of memory, and how quickly prices go up when you want a system with 4 MB. Memory made up a significant chunk of the total price tag of a PC back then. A PC with 8 MB or more, enough to run a 32-bit OS comfortably, was just not very feasible for regular consumers in those days. Let’s throw this one in for fun as well: one of the first 386 systems in 1986, reviewed in PC Magazine. At a whopping 16 MHz, with only 1 MB of memory, you’d pay $6,499 with a 40 MB HDD, or $8,799 with a 130 MB HDD. You want 4 MB more RAM? That will cost you $2,999! In 1986 money, that is more expensive than a small family car! Just to illustrate how ridiculous the notion of 32-bit consumer computing was back in the late 80s. And those are 1986 prices. Corrected for inflation, it would be more than twice that in today’s money. Computers of more than $20,000? Yes, that is what we are talking about here.
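To put those prices in perspective, here is a rough back-of-the-envelope sketch in Python. The inflation factor of about 2.2 is my own assumption (roughly the cumulative US CPI change from 1986 to the time of writing); substitute current CPI data for a precise figure.

```python
# Back-of-the-envelope inflation adjustment for the 1986 386 system
# prices quoted above. INFLATION_FACTOR is an assumption, roughly the
# cumulative US CPI change since 1986; check official CPI tables for
# an exact number.
INFLATION_FACTOR = 2.2

prices_1986 = {
    "386/16, 1 MB RAM, 40 MB HDD": 6_499,
    "386/16, 1 MB RAM, 130 MB HDD": 8_799,
    "4 MB RAM upgrade": 2_999,
}

for item, price in prices_1986.items():
    adjusted = price * INFLATION_FACTOR
    print(f"{item}: ${price:,} in 1986 ~ ${adjusted:,.0f} today")

# The 130 MB system plus the RAM upgrade comes to $11,798 in 1986,
# i.e. comfortably over $20,000 in today's money.
total = 8_799 + 2_999
print(f"Upgraded system: ${total:,} in 1986 ~ ${total * INFLATION_FACTOR:,.0f} today")
```

Even at a conservative factor of 2, the upgraded system lands well above $20,000, which is the point: this was not consumer hardware.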
As another funny bit of trivia: because of a bug, some early 80386 CPUs would not work reliably in 32-bit mode, and their packages are marked “16-bit S/W only”. Which wasn’t that big a deal, since most 386s were mainly running MS-DOS and Windows anyway.
The guy then said that he tried Slackware, which was ‘already 32-bit and multitasking’. So I said: “What do you mean, ‘already’?” By the time the first linux distributions arrived, Windows NT was also on the market: a fully 32-bit, production-ready, multitasking OS. Then he argued that Windows NT was not aimed at consumers, and that it could have been if there had been more drivers for consumer hardware. Say what? He himself admitted that Slackware did not have many drivers for consumer hardware either, and clearly Slackware was not aimed at consumers either.
Besides, drivers for consumer hardware? I would say that the biggest problem was that consumers simply could not AFFORD the hardware, so marketing it to consumers would not have made sense. Drivers were never that much of a problem, in my experience. A lot of the hardware for my 386SX and 486 already came with Windows NT drivers at an early stage. The biggest problems were that my systems were not powerful enough to run the OS, and that there was no reason to run the OS anyway, because hardly any of the software I used had a 32-bit version at all, and when it did, it was usually based on a DOS extender, and as such ran much better directly from DOS. It was mostly a chicken-and-egg problem, which was solved when prices of 386 and 486 systems started coming down in the early 90s. An obvious problem for software developers in those days was that going 32-bit would lock out everyone still using 286 and older machines. DOS extenders were not common before 1992 either, so they arrived more or less at the same time as the 32-bit versions of OS/2, Windows, and linux/386BSD.
And DOS extenders were not a good permanent solution, because it was still DOS: you still had no multitasking and no memory protection. A native 32-bit version of MS-DOS would have been more or less the same, and would not have been a very compelling OS anyway. When you go 32-bit, you should move to a more advanced OS as well, and that is exactly what Microsoft did, with Xenix, OS/2, and Windows NT (and by backporting much of NT’s Win32 API to the Win32s extensions and to Windows 9x as a more lightweight solution for consumers).
But well, it seems that people who weren’t actually around in those days would rather believe the conspiracy theory of evil Microsoft than common sense and easily verifiable facts.