Alf P. Steinbach
* Ian Collins:
> So whatever "gnuc" is, it is invoking g++ in pedantic mode. You have
> the choice of being pedantic, or pragmatic.

Using pedantic mode /is/ the pragmatic choice.
Not using pedantic mode isn't pragmatic.
It's just ostrich policy: hide yer head in the sand and assume that
since one doesn't see the enemy (non-portable and dubious code
constructs), the enemy isn't there, or at least, doesn't attack you.

> If you had to do 64 bit math, which would you prefer?

Weigh the need for 64-bit integer math against the need for using
compilers that don't yet support it, or alternatively the cost of
implementing it. On the surface it's not so much an engineering
decision as a management decision, a cost/benefit decision. However,
most compilers, even pedantic-mode MinGW g++ 3.4.4, support 64-bit
integers, just not as a bare non-standard "long long", so implementing
it seems to be very trivial (weasel-words "seems to be": this is just
off-the-cuff):

T:\> cat <bah.cpp
#include <iostream>
#include <ostream>
#ifdef __GNUC__
# define EXTENSION __extension__
#else
# define EXTENSION
#endif
EXTENSION typedef unsigned long long ULongLong;
int main()
{
    using namespace std;
    cout << hex << ULongLong(-1) << endl;
}
T:\> gnuc bah.cpp & a
ffffffffffffffff
T:\> _
I think this is the proper, pragmatic, pedantic way... ;-)

> By easy to test for, I was thinking of something along the lines of
>
>   #include <limits.h>
>   int main() {
>   #if !defined ULLONG_MAX
>       SomeBigIntClass bah;
>   #else
>       long long bah;
>   #endif
>   }

Uhm, I think you're right that it's easy to test for, using some
implementation of the C header that provides fixed-size ints
(<stdint.h>; I think its C++ version is part of TR1), but the above
fails:

T:\> cat <bah.cpp
#include <iostream>
#include <ostream>
#include <climits>
#define STRINGVALX(tok) #tok
#define STRINGVAL(tok) STRINGVALX( tok )
int main()
{
    using namespace std;
    cout << STRINGVAL(ULLONG_MAX) << endl;
}
T:\> gnuc bah.cpp & a
(2ULL * 9223372036854775807LL + 1)
T:\> _
The reason for the test failure is that this <climits> provides the
C99 macros regardless, so ULLONG_MAX is defined even in pedantic
C++98 mode. But see also the middle of this posting.
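
Off the cuff again: since ULLONG_MAX can't be trusted as a feature
test, keying on the compiler instead seems safer. A minimal sketch --
assuming only GCC's __extension__ and MSVC's __int64, with a plain
"unsigned long" (possibly just 32 bits) as the last-resort fallback,
not some real big-int library:

  #include <iostream>
  #include <ostream>

  #if defined( __GNUC__ )
      // __extension__ silences the -pedantic diagnostic for "long long".
      __extension__ typedef unsigned long long ULongLong;
  #elif defined( _MSC_VER )
      typedef unsigned __int64 ULongLong;   // MSVC's built-in 64-bit type
  #else
      typedef unsigned long ULongLong;      // last resort, may be 32 bits
  #endif

  int main()
  {
      using namespace std;
      cout << hex << ULongLong(-1) << endl;
  }

With that in place the presence or absence of ULLONG_MAX no longer
matters for the 64-bit math itself.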