Vincent Manis
You are correct. On thinking about it further, my source was some kind of internal developer seminar I attended round about 1985 or so, where an Apple person said `don't use the high order bits, we might someday produce machines that use all 32 address bits', and then winked at us.

On Tue, 10 Nov 2009 16:05:01 -0800, Vincent Manis wrote:
> That is incorrect. The original Inside Mac Volume 1 (published in 1985)
> didn't look anything like a phone book. The original Macintosh's CPU (the
> Motorola 68000) already used 32-bit addressing internally, but the high
> eight bits were ignored because the CPU physically lacked the address
> pins corresponding to those bits.
> In fact, in Inside Mac Vol II, Apple explicitly gives the format of
> pointers: the low-order three bytes are the address, and the high-order
> byte holds flags. Bit 7 was the lock bit, bit 6 the purge bit, and bit 5
> the resource bit; the other five bits were unused.
You are also correct (of course) about the original `Inside Mac': my copy was indeed two volumes in looseleaf binders; the phone book came later.
That's my point. I first heard about Moore's Law in 1974 from a talk given by Alan Kay. At about the same time, Gordon Bell had concluded, independently, that one needs an extra address bit every 18 months (he was looking at core memory, so the cost factors were somewhat different). All of this should have suggested that relying on any `reserved' bits is always a bad idea.
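To put rough numbers on that rule of thumb (my arithmetic, not Bell's or Kay's): one extra address bit every 18 months means N spare bits buy only about 1.5 * N years, so the 68000's eight unused bits could never have been safe for much more than a decade.

    #include <stdio.h>

    /* Back-of-envelope only: one extra address bit needed every 18 months. */
    int main(void) {
        for (int spare_bits = 1; spare_bits <= 8; spare_bits++) {
            double years = spare_bits * 18.0 / 12.0;
            printf("%d spare bit(s) ~ %4.1f years of headroom\n",
                   spare_bits, years);
        }
        return 0;
    }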
> By all means criticize Apple for failing to foresee 32-bit apps, but
> criticizing them for hypocrisy (in this matter) is unfair. By the time
> they recognized the need for 32-bit clean applications, they were stuck
> with a lot of legacy code that was not clean, including code burned into
> ROMs.
I am most definitely not faulting Apple for hypocrisy; I am just saying that programmers sometimes assume that because something works on one machine, it will work forevermore. And that's unwise.
-- v