peter.havens
Hi all,
I'm attempting to build Python 2.4.1 on Solaris 10 using gcc 3.4.3. I
get the following build error:
<snip>
gcc -c -fno-strict-aliasing -DNDEBUG -g -O3 -Wall -Wstrict-prototypes
-I. -I./Include -DPy_BUILD_CORE -o Objects/complexobject.o
Objects/complexobject.c
Objects/complexobject.c: In function `complex_pow':
Objects/complexobject.c:479: error: invalid operands to binary ==
Objects/complexobject.c:479: error: wrong type argument to unary minus
Objects/complexobject.c:479: error: invalid operands to binary ==
Objects/complexobject.c:479: error: wrong type argument to unary minus
make: *** [Objects/complexobject.o] Error 1
</snip>
I've poked around a bit, and it looks like there is a macro called
Py_ADJUST_ERANGE2 which is being called at that point in the code. It
uses a defined value for Py_HUGE_VAL. I'm guessing that the definition
of Py_HUGE_VAL is the problem. The comments above the define for
Py_HUGE_VAL, and the notes in the Misc/NEWS file lead me to believe
that I need to correct the defined value of Py_HUGE_VAL (which is set
to HUGE_VAL now, and I'm assuming that comes from a C standard).
However, I'm not sure what to set that value to.
I found two files in Solaris that seem to define HUGE_VAL in some way.
/usr/include/iso/math_c99.h
/usr/include/iso/math_iso.h
...and math_iso.h looks like the best choice. Here are the relevant
sections of the header:
<snip>
#if !defined(_STDC_C99) && _XOPEN_SOURCE - 0 < 600 && \
!defined(__C99FEATURES__)
typedef union _h_val {
unsigned long _i[sizeof (double) / sizeof (unsigned long)];
double _d;
} _h_val;
#ifdef __STDC__
extern const _h_val __huge_val;
#else
extern _h_val __huge_val;
#endif
#undef HUGE_VAL
#define HUGE_VAL __huge_val._d
#endif /* !defined(_STDC_C99) && _XOPEN_SOURCE - 0 < 600 && ... */
</snip>
...my C isn't very strong, so that doesn't quite make sense to me -- is
HUGE_VAL being set to a type rather than a value? Anyway, I set
Py_HUGE_VAL to LONG_MAX. It compiled, and 'make test' seems to pass.
Will this cause problems? Should I report this as a Python bug?
Thanks,
Pete