Someone at Intel decided to shift a segment address left 4 bits before
adding the 16-bit offset. Had they shifted 8 bits instead of 4, the memory
limit would have been 16MB instead of 1MB (well, at the cost of 4 more pins
for the extra address lines).
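Just to put the arithmetic in one place (a quick sketch of my own, with
made-up function names, not anything Intel ever wrote):

    #include <stdint.h>
    #include <stdio.h>

    /* What the 8086 actually does: segment shifted left 4 bits plus a
     * 16-bit offset, giving a 20-bit (1MB) physical address space. */
    static uint32_t phys_shift4(uint16_t seg, uint16_t off)
    {
        return ((uint32_t)seg << 4) + off;
    }

    /* The hypothetical variant: shift 8 bits instead, giving a 24-bit
     * (16MB) space, at the cost of 4 more address pins. */
    static uint32_t phys_shift8(uint16_t seg, uint16_t off)
    {
        return ((uint32_t)seg << 8) + off;
    }

    int main(void)
    {
        /* Highest reachable addresses: just over 1MB vs. just over 16MB. */
        printf("shift 4: 0x%06lX\n", (unsigned long)phys_shift4(0xFFFF, 0xFFFF));
        printf("shift 8: 0x%06lX\n", (unsigned long)phys_shift8(0xFFFF, 0xFFFF));
        return 0;
    }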
And who was it that decided to put memory-mapped video at the 640K location? IBM?
I used machines based on the same hardware, compatible enough to run some
PC software, and they didn't have that limitation.
After a few brief forays into Linux (where there was always some essential
component that didn't work at all, and a few more that worked badly), I was
glad to get back to Windows, which looked much more professional.
How do you configure a printer that uses TCP/IP?
Hint: It's not a "network" printer. It's a local printer directly connected
to this machine. Using one of COM1, LPT, or... "standard TCP/IP port".
Windows was fairly good at *looking* professional. But there were a couple
of years during which plugging a USB mouse into a NetBSD machine worked
nearly instantaneously and quite reliably, while plugging a USB mouse into a
Windows machine might or might not work at all; when it did, it took ten
seconds or longer to identify and install drivers.
(And probably someone normally using Mac OS would say the same about
Windows.)
To some extent, this is certainly true, but baby duck syndrome is not nearly
sufficient to explain some of these things. No amount of Mac users growing
up with it made the Classic MacOS design decision rational or justifiable:
to add 1MB of virtual memory to a system, you needed N+1 megabytes of disk
space for backing store, and most applications required a lot more memory if
you didn't have virtual memory enabled. It was broken, whether by design or
otherwise.
And while Win32/GDI was a nightmare to work with, I understand that X11
wasn't that much better...
Perhaps it wasn't, but you could always replace it.
The thing is... Once you get past the initial baby duck syndrome and
not-used-to-that, and start looking at the documentation and writeups by
experienced professionals who really do like a given system... Windows
loses. By a gigantic margin. It's genuinely, objectively, awful. It
is unstable, insecure, and actively hostile to long-term code maintenance.
If you look at a Windows app, you can tell when it was written by which APIs
it's built on, because those get replaced every few years. There are Unix
apps I'm still using now that were last significantly updated in the early
90s. Anything new enough to have function prototypes typically still works.
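(By "prototypes" I mean ANSI-style function declarations, roughly the line
between pre-1989 C and everything since. A made-up example, just to
illustrate:)

    /* K&R style, pre-ANSI: the declaration says nothing about parameters,
     * and the definition lists their types separately.
     * (Accepted from C89 through C17; removed in C23.) */
    int add_kr();

    int add_kr(a, b)
        int a;
        int b;
    {
        return a + b;
    }

    /* ANSI prototype style, C89 and later: parameter types are part of the
     * declaration, so the compiler can actually check calls. */
    int add_proto(int a, int b);

    int add_proto(int a, int b)
    {
        return a + b;
    }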
There's totally some baby duck syndrome involved, but... there are also some
real differences in design philosophies, and in the long run, differences
in design philosophies *matter*.
Think about the stuff sandeep keeps proposing to do to C. Windows is an
operating system where people like him were given carte blanche.
-s