Rui Maciel said:
BartC wrote:
I don't believe this is true. I don't remember ever seeing anyone encourage
the use of macros: quite the opposite, actually. They are seen as a
last-resort kind of thing.
What is indeed encouraged is the practice of writing your own routines to
solve your own problems. This isn't a bad thing to do, because it is
essentially all that a programmer does.
Writing routines is OK; that's just programming. And it doesn't change the
syntax.
Using macros however can effectively change the language.
That's essentially what every API or every helper routine is: a dialect.
Whenever someone develops their own library or adopts one developed by a
third-party, that person is defining their own dialect.
But it's a well-understood way of extending a language. A function call
looks like a function call. A macro invocation can involve anything. You
look at the body of a function, and you still see normal C code. You look
at a macro definition and it's quite often gobbledygook.
What's the difference between for(int I = A; i < B; i++) and FOR(I, A, B)?
My FOR(I,A,B) macro iterates between A and B inclusive, and does so
reliably, without having to write the loop index 3 times (you've mixed up
I with i), without having to remember to use <= instead of < (you've used
<), and without having to bother writing all those parts of a loop which
are the compiler's job, not mine:
#define FOR(i,a,b) for (i=a; i<=b; ++i)
#define TO(i,x) for (i=x; i; --i)
With FOR, I just give it the 3 elements that are really needed to define
the loop, and can concentrate on the loop body!
Because the suggestions that have been presented are already a part of the
language. A concept might not be expressible with a particular piece of
syntactic sugar, but that doesn't mean the language doesn't support it,
and if we have a choice between a concise language and an inflated one to
express the exact same concepts, conciseness always wins.
Syntax, especially basic constructs that practically every other language
has had for decades, will hardly inflate the language. What does inflate
it is the 1000 or so functions in the run-time library, and endless blocks
of macros like this:
#define INT_MAX +32767
#define INT_MIN -32767
#define LONG_MAX +2147483647
#define LONG_MIN -2147483647
#define LLONG_MAX +9223372036854775807
#define LLONG_MIN -9223372036854775807
#define MB_LEN_MAX 1
#define SCHAR_MAX +127
#define SCHAR_MIN -127
#define SHRT_MAX +32767
#define SHRT_MIN -32767
#define UCHAR_MAX 255
#define USHRT_MAX 65535
#define UINT_MAX 65535
#define ULONG_MAX 4294967295
#define ULLONG_MAX 18446744073709551615
when all that is really needed is a single syntax feature that can be
applied to any type in the same way as sizeof().