Compile-time initialization of data


S.K.Mody

typedef unsigned long long int uint64;
typedef unsigned char uint8;

Class Simple
{
union { uint64 x; uint8 r[8]; };

public:
Simple(uint64 n) : x(n) {;}
//....
};


Class Simple_user
{
static const Simple simple_array[8];

public:
//....
};

const Simple Simple_user::simple_array[8] =
{ 0x01, 0x02, 0x03, 0x04, 0x05, 0x06, 0x07, 0x08 };


Will the array of simples always be constructed at compile time?
Can an optimizing compiler construct the array at compile time?

Thanks for any answers/comments.
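
For what it's worth, later C++ makes the hoped-for behaviour guaranteed: with a C++11 constexpr constructor and constant initializers, the array must be constant-initialized before any runtime code executes. A minimal sketch (the union is dropped here, since reading the inactive member of a union is not permitted in a constant expression):

```cpp
#include <cassert>

typedef unsigned long long int uint64;   // 'long long' is standard from C++11
typedef unsigned char uint8;

class Simple
{
    uint64 x;   // union dropped: reading the inactive union member
                // is not allowed in a constant expression

public:
    constexpr Simple(uint64 n) : x(n) {}
    constexpr uint64 value() const { return x; }
};

// With a constexpr constructor and constant initializers, this is
// constant initialization: the array exists fully formed at compile time.
constexpr Simple simple_array[8] =
    { 0x01, 0x02, 0x03, 0x04, 0x05, 0x06, 0x07, 0x08 };

// Proof that the values are known at compile time:
static_assert(simple_array[3].value() == 0x04, "built at compile time");
```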
 

Victor Bazarov

S.K.Mody said:
typedef unsigned long long int uint64;

Not valid C++ (yet). There is no 'long long' type in C++. You
should probably have used 'double'.
typedef unsigned char uint8;

Class Simple

class Simple

, maybe?
{
union { uint64 x; uint8 r[8]; };

public:
Simple(uint64 n) : x(n) {;}
//....
};


Class Simple_user

class Simple_user

, maybe?
{
static const Simple simple_array[8];

public:
//....
};

const Simple Simple_user::simple_array[8] =
{ 0x01, 0x02, 0x03, 0x04, 0x05, 0x06, 0x07, 0x08 };


Will the array of simples always be constructed at compile time?

No. It's unspecified (8.5.1/14).
Can an optimizing compiler construct the array at compile time?

It may. Whether a particular one _can_ or not, depends on its
implementors, doesn't it?

V
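
What "unspecified" means here can even be observed from within the program: a namespace-scope object defined earlier in the same translation unit is dynamically initialized first, so its initializer sees either the constructed values (if the compiler promoted the array to static initialization) or zeros (if it did not). A sketch of the hazard:

```cpp
#include <cassert>

struct Simple
{
    unsigned long long x;
    Simple(unsigned long long n) : x(n) {}   // not constexpr: C++03-style
};

extern const Simple arr[2];

// Dynamically initialized before arr's own dynamic initializer would run
// (same translation unit, defined earlier). If the compiler constructed
// arr at compile time, this reads 1; if arr is built at run time, arr is
// still only zero-initialized here and this reads 0. Either is conforming.
const unsigned long long peeked = arr[0].x;

const Simple arr[2] = { 1, 2 };
```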
 

S.K.Mody

Victor said:
Not valid C++ (yet). There is no 'long long' type in C++. You
should probably have used 'double'.

In the actual code, I used the C99 header provided with
glibc (stdint.h) which has nice fixed width types like
uint16_t, uint64_t, int_fast8_t etc, which may make it to C++
eventually. I'll have to use some preprocessor conditionals or
roll my own uint64 if I decide to compile it for other platforms.
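
Such a fallback might look like the following sketch; the branches shown are illustrative, not exhaustive:

```cpp
// Portable fixed-width fallback: use the C99 <stdint.h> types where the
// implementation provides them, otherwise a compiler-specific type.
#if defined(__GNUC__)
  #include <stdint.h>                  // gcc/glibc ship this header
  typedef uint64_t uint64;
#elif defined(_MSC_VER)
  typedef unsigned __int64 uint64;     // MSVC's built-in 64-bit type
#else
  typedef unsigned long long uint64;   // and hope for the best
#endif

typedef unsigned char uint8;
```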
No. It's unspecified (8.5.1/14).


It may. Whether a particular one _can_ or not, depends on its
implementors, doesn't it?

True, but I am hoping that there is some agreement among compiler
writers as to what constitutes an acceptable level of optimization.
After all, there is technically no requirement for example that
a compiler inline any methods at all, but for some types of code
that would prove to be unacceptable.

I'm using g++ on x86 Linux.
 

Victor Bazarov

S.K.Mody said:
[..] I am hoping that there is some agreement among compiler
writers as to what constitutes an acceptable level of optimization.
After all, there is technically no requirement for example that
a compiler inline any methods at all, but for some types of code
that would prove to be unacceptable.

There is no other "agreement" among compiler writers except the
Standard Document, I hope.
I'm using g++ on x86 Linux.

Good for you. It doesn't matter here, though.

V
 

S.K.Mody

Victor said:
S.K.Mody said:
[..] I am hoping that there is some agreement among compiler
writers as to what constitutes an acceptable level of optimization.
After all, there is technically no requirement for example that
a compiler inline any methods at all, but for some types of code
that would prove to be unacceptable.

There is no other "agreement" among compiler writers except the
Standard Document, I hope.

Why do you hope? Would there be any problems if a subset
of the set of all compilers behaved somewhat predictably in
some respects even though the exact behaviour is left
unspecified by the standard?
 

Victor Bazarov

S.K.Mody said:
Victor said:
S.K.Mody said:
[..] I am hoping that there is some agreement among compiler
writers as to what constitutes an acceptable level of optimization.
After all, there is technically no requirement for example that
a compiler inline any methods at all, but for some types of code
that would prove to be unacceptable.

There is no other "agreement" among compiler writers except the
Standard Document, I hope.

Why do you hope? Would there be any problems if a subset
of the set of all compilers behaved somewhat predictably in
some respects even though the exact behaviour is left
unspecified by the standard?

Yes. The problem is simple: if there is nothing _governing_
the behaviour, nothing is there to prevent it _changing_ some
sunny day, and therefore none of it can be _relied upon_. What
else did you expect me to tell you?
 

S.K.Mody

Victor said:
S.K.Mody said:
Victor said:
S.K.Mody wrote:
[..] I am hoping that there is some agreement among compiler
writers as to what constitutes an acceptable level of optimization.
After all, there is technically no requirement for example that
a compiler inline any methods at all, but for some types of code
that would prove to be unacceptable.

There is no other "agreement" among compiler writers except the
Standard Document, I hope.

Why do you hope? Would there be any problems if a subset
of the set of all compilers behaved somewhat predictably in
some respects even though the exact behaviour is left
unspecified by the standard?

Yes. The problem is simple: if there is nothing _governing_
the behaviour, nothing is there to prevent it _changing_ some
sunny day, and therefore none of it can be _relied upon_. What
else did you expect me to tell you?

I think the philosophy of C++ provides the governing principle
- To achieve the right balance between portability and efficiency.
For example without inlining, to which the original question is
closely related, it would often be unacceptably inefficient to
have deeply nested calls to small functions. But such calls may
be necessary for a variety of reasons related to good C++ design.
So should one go back to writing macros and forget about design
principles or can one compromise a little and ask for some informal
guarantees from the specific compiler (or class of compilers)
that one may be working with? The latter seems to me to be a better
option, since the choice is unmaintainable spaghetti code versus
well designed code with some compiler specific preprocessing.

You may regard this as a strictly compiler related question but
it seems to me that the C++ efficiency goals virtually require
the compiler to provide such informal albeit non-portable
guarantees. The original question could therefore be rephrased
as "Is there any sort of uniformity among compilers in this regard?"
I'm not sure whether your answers were based on specific knowledge
of widely varying implementations or on the legal position of the
standard.
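
The macro-versus-small-function trade-off being described can be made concrete. The standard obliges the compiler to inline nothing, which is exactly the "informal guarantee" in question; the macro avoids that reliance, at the usual cost. A sketch:

```cpp
#include <cassert>

// The macro route: no call overhead to rely on the compiler for, but no
// type checking, and the argument is expanded textually (the classic
// double-evaluation trap).
#define SQUARE_MACRO(x) ((x) * (x))

// The C++ route: a small inline function, type-checked, argument
// evaluated once. The standard does not require it to actually be
// inlined -- that is the informal, compiler-specific guarantee.
inline int square(int x) { return x * x; }

static int calls = 0;
static int next_value() { ++calls; return 4; }

int demo_macro()
{
    calls = 0;
    int r = SQUARE_MACRO(next_value());  // next_value() runs twice
    (void)r;
    return calls;
}

int demo_function()
{
    calls = 0;
    int r = square(next_value());        // next_value() runs once
    (void)r;
    return calls;
}
```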
 

Ben Pope

S.K.Mody said:
I think the philosophy of C++ provides the governing principle
- To achieve the right balance between portability and efficiency.
For example without inlining, to which the original question is
closely related, it would often be unacceptably inefficient to
have deeply nested calls to small functions. But such calls may
be necessary for a variety of reasons related to good C++ design.
uh-huh.

So should one go back to writing macros and forget about design
principles or can one compromise a little and ask for some informal
guarantees from the specific compiler (or class of compilers)
that one may be working with? The latter seems to me to be a better
option - since the choice is unmaintainable spaghetti code v/s
well designed code with some compiler specific preprocessing.

If you know that you will never compile your program on anything other than the compiler you're using today, do what you like. But rest assured that there are lots of people who expected their code not to still be around 20 years later. (Such as that in banking systems... remember Y2K? It's not like the programmers didn't know about the problem; they just expected the code not to be around when the problem surfaced.)

The point is that one day, somebody might try to re-use the code (I have heard of it happening in the past). They may not expect it to rely upon non-standard behaviour.
You may regard this as a strictly compiler related question but
it seems to me that the C++ efficiency goals virtually require
the compiler to provide such informal albeit non-portable
guarantees.

Not in the slightest. Optimisation is nice, but there is loads of code that relies upon side effects that should not exist, and optimising that code breaks it. Again, reliance upon non-standard behaviour (such as doing anything in a copy constructor that is not purely related to copying the object) might break your code some day.
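
The copy-constructor caveat in concrete form: under the copy-elision rules of the time, the compiler may or may not call the copy constructor when returning by value, so any extra side effect in it fires an unpredictable number of times. A minimal sketch:

```cpp
#include <cassert>

static int copies = 0;

struct Tracker
{
    Tracker() {}
    Tracker(const Tracker&) { ++copies; }  // side effect beyond plain copying
};

static Tracker make_tracker()
{
    return Tracker();  // the copy may be elided -- the standard allows either
}

// Code that counts on `copies` having a particular value after
//   Tracker t = make_tracker();
// relies on behaviour the standard leaves to the implementation, and a
// change of compiler or optimisation flags can silently alter the count.
```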
The original question could therefore be rephrased
as "Is there any sort of uniformity among compilers in this regard?"

Yeah, of course there is. But that's NOT the question. The question is "Should you rely upon it?"
I'm not sure whether your answers were based on specific knowledge
of widely varying implementations or on the legal position of the
standard.

Probably a bit (or more) of both.

Ben
 
