is None or == None ?


Vincent Manis

On Tue, 10 Nov 2009 16:05:01 -0800, Vincent Manis wrote:
That is incorrect. The original Inside Mac Volume 1 (published in 1985)
didn't look anything like a phone book. The original Macintosh's CPU (the
Motorola 68000) already used 32-bit addressing, but the high eight bits of
every address were ignored, since the CPU physically lacked the pins
corresponding to those bits.

In fact, in Inside Mac Vol II, Apple explicitly gives the format of
pointers: the low-order three bytes are the address, the high-order byte
is used for flags: bit 7 was the lock bit, bit 6 the purge bit and bit 5
the resource bit. The other five bits were unused.
You are correct. On thinking about it further, my source was some kind of internal developer seminar I attended around 1985 or so, where an Apple person said `don't use the high-order bits, we might someday produce machines that use all 32 address bits', and then winked at us.

You are also correct (of course) about the original `Inside Mac', my copy was indeed 2 volumes in looseleaf binders; the phonebook came later.
By all means criticize Apple for failing to foresee 32-bit apps, but
criticizing them for hypocrisy (in this matter) is unfair. By the time
they recognized the need for 32-bit clean applications, they were stuck
with a lot of legacy code that was not clean, including code burned into
ROMs.
That's my point. I first heard about Moore's Law in 1974 from a talk given by Alan Kay. At about the same time, Gordon Bell had concluded, independently, that one needs an extra address bit every 18 months (he was looking at core memory, so the cost factors were somewhat different). All of this should have suggested that relying on any `reserved' bits is always a bad idea.

I am most definitely not faulting Apple for hypocrisy, just saying that programmers sometimes assume that just because something works on one machine, it will work forevermore. And that's unwise.

-- v
 

Vincent Manis

I must have inadvertently deleted a paragraph in the response I just posted. Please add: The pointer format would have caused me to write macros or the like (that was still in the days when Apple liked Pascal) to hide the bit representation of pointers.
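A rough sketch (in Python, for illustration, since this is a Python list; the helper names are made up, and the real thing would of course have been Pascal or C macros) of the kind of flag-hiding wrappers meant here, using the master-pointer format quoted earlier:

# Illustration of the 24-bit-address-plus-flag-byte format described above.
# The names are invented for this sketch; only the bit layout comes from
# Inside Macintosh.

LOCK_BIT     = 1 << 31     # bit 7 of the high-order byte: handle is locked
PURGE_BIT    = 1 << 30     # bit 6: handle is purgeable
RESOURCE_BIT = 1 << 29     # bit 5: handle belongs to a resource
ADDR_MASK    = 0x00FFFFFF  # low-order three bytes hold the actual address

def strip_flags(ptr):
    """Return just the 24-bit address, hiding the flag byte."""
    return ptr & ADDR_MASK

def is_locked(ptr):
    """True if the lock flag is set in the high-order byte."""
    return bool(ptr & LOCK_BIT)

# Example: a locked resource handle whose master pointer holds address 0x01A2B3.
p = LOCK_BIT | RESOURCE_BIT | 0x01A2B3
assert strip_flags(p) == 0x01A2B3
assert is_locked(p)

Code written against wrappers like these would have kept working unchanged once all 32 address bits became significant; code that poked at the bits directly is exactly what broke.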

-- v
 

greg

Vincent said:
That's my point. I first heard about Moore's Law in 1974 from a talk given
by Alan Kay. At about the same time, Gordon Bell had concluded, independently,
that one needs an extra address bit every 18 months

Hmmm. At that rate, we'll use up the extra 32 bits in our
64 bit pointers in another 48 years. So 128-bit machines
ought to be making an appearance around about 2057, and
then we'll be all set until 2153 -- if we're still using
anything as quaintly old-fashioned as binary memory
addresses by then...
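A quick back-of-the-envelope check of that arithmetic, assuming one extra address bit every 18 months and taking 2009 (when 64-bit pointers already leave 32 bits spare) as the starting point:

# One extra address bit every 18 months = 1.5 years per bit.
YEARS_PER_BIT = 1.5
start_year = 2009

# Years to consume the 32 unused bits of a 64-bit pointer.
year_128_bit_needed = start_year + 32 * YEARS_PER_BIT            # 2057
# A 128-bit pointer then buys another 64 bits of headroom.
year_256_bit_needed = year_128_bit_needed + 64 * YEARS_PER_BIT   # 2153

print(int(year_128_bit_needed), int(year_256_bit_needed))  # 2057 2153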
 
