Francois
Hello everybody,
(sorry for my English, I'm French, I'll do my best)
I'm a beginner in C, and I'm learning mostly from books. I started and
finished a book for novices (a French book, "Livre du C premier
langage" by Delannoy). I'm now reading K&R (Kernighan and Ritchie,
second edition), which is of course far more difficult, but very
precise. However, I'm not fully satisfied with some things.
I find that this book keeps too much distance from the computer itself
(or the architecture, I'm not sure of the right word): how data is
coded in binary, how the question of "signed" versus "unsigned" arises
from the computer's point of view...
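For example, here is the kind of thing I'd like a book to show (a
minimal sketch; if I understand correctly, the signed result depends
on the machine's representation, so please correct me if it's wrong):

#include <stdio.h>

int main(void)
{
    /* The same 8-bit pattern, all bits set... */
    unsigned char bits = 0xFF;

    /* ...read through an unsigned type gives 255... */
    printf("as unsigned: %u\n", (unsigned)bits);

    /* ...but reinterpreted as a signed char it prints -1 on a
       two's-complement machine (the exact value is
       implementation-defined in C). */
    printf("as signed:   %d\n", (signed char)bits);

    return 0;
}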
Obviously, C is a "high-level" language, which in principle shouldn't
refer to the computer. So it's logical that a book about C doesn't
talk much about problems that depend on the machine. And yet, I'd like
to know whether such a book exists. I mean a book that presents the C
language and doesn't hesitate to talk from time to time about what
happens inside the computer: for example, it could explain things from
the point of view of a "classical" computer. I'd definitely like to
avoid electronics, though. For example, two's complement could be
treated by explaining how it makes things easier from the computer's
point of view, avoiding purely electronic considerations if possible.
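To make that concrete, here is the sort of illustration I'm hoping for
(a sketch that assumes a two's-complement machine, which as far as I
know C itself doesn't guarantee): since -1 is stored as all bits set,
the same addition circuit works for signed and unsigned values.

#include <stdio.h>

int main(void)
{
    signed char   s = -1;   /* stored as 11111111 on two's complement */
    unsigned char u = 0xFF; /* the very same bit pattern, read as 255 */

    /* Adding 5 to either one yields the same bit pattern (4),
       so the hardware needs only one adder for both cases. */
    printf("bits of -1      : %02X\n", (unsigned)(unsigned char)s);
    printf("5 + (-1)        : %d\n", 5 + s);
    printf("(5 + 255) %% 256 : %d\n", (5 + u) % 256);

    return 0;
}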
I hope I've been clear enough about the compromise I'm looking for
(but does it really exist?). Those of you who understand French can
have a look at this discussion (in which my login is sisco), which
gives more details about what I'm after:
http://www.siteduzero.com/forum-83-...codage-binaire-du-contenu-d-une-variable.html
Thanks a lot in advance for your help.
Sincerely
François