delete POD array - on which platforms does it not work?


Bo Persson

Triple-DES said:
The _CrtDumpMemoryLeaks() function can be used when running under
the debugger. I too tried this, and was unable to produce a memory
leak on VC8.

How many times did you run the test, and under what conditions?

What if it only fails first Wednesday after a full moon on leap years,
and only if an important customer is watching. That's usually when
Undefined Behavior goes bad.
Without further evidence, I wouldn't go as far as saying that this
is 100% safe on VC8, but it would seem that the most trivial cases
are handled gracefully.


It actually does work, but just by chance. Usually new[] has to store
a count, so the system can later know how many times delete[] has to
call the destructor for the objects in the array. As a space
optimization, the count is not stored for objects without a
destructor, so by pure chance a char[10] object has the same layout as
any other 10 byte memory block.
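
To make that concrete, here is a rough sketch of what an implementation
*might* do for array-new of a type with a destructor. The names and layout
are purely illustrative, not VC8's actual code:

#include <cstddef>
#include <new>

struct NonPod { char c; ~NonPod() {} };  // has a destructor, so delete[] needs the count

// Conceptually, "new NonPod[n]" does something like this:
NonPod* array_new_nonpod(std::size_t n)
{
    // allocate room for a count "cookie" in front of the elements
    void* raw = ::operator new[](sizeof(std::size_t) + n * sizeof(NonPod));
    *static_cast<std::size_t*>(raw) = n;   // remember how many destructors to run later
    // the program gets a pointer *past* the cookie, so plain delete or free
    // on that pointer would hand the allocator the wrong address
    return reinterpret_cast<NonPod*>(static_cast<char*>(raw) + sizeof(std::size_t));
    // (element construction omitted - this only illustrates the layout)
}

// For char there is no destructor, so no cookie is stored and the pointer
// the program sees is exactly what operator new[] returned. That is the
// "pure chance" that makes plain delete appear to work.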

Do you want to rely on this??


Bo Persson
 

Default User

Martin said:
Hi all!

char* p = new char[n]; // POD!
...
delete[] p; // std compliant
delete p; // will work on VC8
free(p); // will also work on VC8

How do you know it works?




Brian
 

stan

Triple-DES said:
The _CrtDumpMemoryLeaks() function can be used when running under the
debugger. I too tried this, and was unable to produce a memory leak on
VC8.

Without further evidence, I wouldn't go as far as saying that this is
100% safe on VC8, but it would seem that the most trivial cases are
handled gracefully.

This certainly seems slightly more comfortable than "100% sure". That
level of confidence always sets off warning flags "100%" of the time :)

To be honest, I'm not convinced that even this claim stands up very well.
How do optimization, debugging, and other environment changes impact this
outcome? I'm not asking anyone to actually test this. I'm personally
willing to take the standard at its word and consider it undefined. From
the first time I heard about it I've been averse to doing anything that
might cause demons to fly out of my nose. My children have moved back in
with me and I'm already pressed for space.

I can't actually imagine why this question would ever come up. I can't
see any benefit from skirting the rules. Am I missing some possible
valid reason to investigate this?
 

Triple-DES

How many times did you run the test, and under what conditions?

I tried various constructs that I could come up with involving new[] /
delete and ran them with the normal debug/release flags.
What if it only fails first Wednesday after a full moon on leap years,
and only if an important customer is watching. That's usually when
Undefined Behavior goes bad.

It may very well fail under certain circumstances. I am not planning to
take "advantage" of this in my own code, as you are implying.
Without further evidence, I wouldn't go as far as saying that this
is 100% safe on VC8, but it would seem that the most trivial cases
are handled gracefully.

It actually does work, but just by chance. Usually new[] has to store
a count, so the system can later know how many times delete[] has to
call the destructor for the objects in the array. As a space
optimization, the count is not stored for objects without a
destructor, so by pure chance a char[10] object has the same layout as
any other 10 byte memory block.

It is obvious that the implementation needs to know the number of
objects allocated, and IMO it is a flaw in the language that this
information is not accessible (e.g. via p.length), but that's OT.
I wouldn't go as far as calling it pure chance, since a new'ed char
array is suitably aligned to store any object.
Do you want to rely on this??

No, but unfortunately my post may be read as support for this
"idiom". That was not my intention.

DP
 

Triple-DES

Triple-DES wrote: [snip]
Without further evidence, I wouldn't go as far as saying that this is
100% safe on VC8, but it would seem that the most trivial cases are
handled gracefully.

This certainly seems slightly more comfortable than "100% sure". That
level of confidence always sets off warning flags "100%" of the time :)

Yes, and I guess I should moderate myself even further. What I should
have written was something along the lines of: Under certain
conditions, it is possible to write code in such a way that mixing
new[] / delete will not leak memory using VC8.
To be honest, I'm not convinced that even this claim stands up very well.
How do optimization, debugging, and other environment changes impact this
outcome? I'm not asking anyone to actually test this. I'm personally
willing to take the standard at its word and consider it undefined. From
the first time I heard about it I've been averse to doing anything that
might cause demons to fly out of my nose. My children have moved back in
with me and I'm already pressed for space.

I can't actually imagine why this question would ever come up. I can't
see any benefit from skirting the rules. Am I missing some possible
valid reason to investigate this?

My own reason for investigating this was that I wanted to convince the
OP that it was unsafe. Unfortunately, I haven't had any luck so far :)

DP
 

Martin T.

Triple-DES said:
Triple-DES wrote: [snip]
Without further evidence, I wouldn't go as far as saying that this is
100% safe on VC8, but it would seem that the most trivial cases are
handled gracefully.
This certainly seems slightly more comfortable than "100% sure". That
level of confidence always sets off warning flags "100%" of the time :)

Yes, and I guess I should moderate myself even further. What I should
have written was something along the lines of: Under certain
conditions, it is possible to write code in such a way that mixing
new[] / delete will not leak memory using VC8.
To be honest, I'm not convinced that even this claim stands up very well.
How do optimization, debugging, and other environment changes impact this
outcome? I'm not asking anyone to actually test this. I'm personally
willing to take the standard at its word and consider it undefined. From
the first time I heard about it I've been averse to doing anything that
might cause demons to fly out of my nose. My children have moved back in
with me and I'm already pressed for space.

I can't actually imagine why this question would ever come up. I can't
see any benefit from skirting the rules. Am I missing some possible
valid reason to investigate this?

My own reason for investigating this was that I wanted to convince the
OP that it was unsafe. Unfortunately, I haven't had any luck so far :)

No convincing was needed, actually. :)
I'm not going to do this, but having a clue as to *how* evil something
is can never hurt. Not so that you can repeat it, but so you know how
urgent it is to fix should you encounter it.

thanks for your insights,
br,
Martin
 

Martin T.

Yannick said:
I find "new" useful, albeit rarely, but "new[]", virtually never.
Have you ever compared the performance of std::vector<int>(n) vs. new
int[n] (in a release build, but without optimization)?

Why would I not want to let the compiler optimize the code?

This is a typical C hand coder way of opting out: "C++ is slow. Look I
compile with debugs and no optimization and std::vector::at() is 10x
slower than accessing a malloc'ed array and std::vector::push_back()
is 50x slower than memcpy".

I was only referring to the *allocation* in unoptimized _release_ (not
debug) code on VC8. Since we do NOT use any optimization flags here,
this is what will end up in production.
My results are somewhere around a factor of 5 on VC8 (VS2005) in favor of new[].

Try again with optimization on. You will probably find:

std::vector<int>(n) is a bit slower than new int[n] but not much

Great to know, thanks, and I do believe you, but for my production
environment non-optimized code is what matters.
(...)
I agree though - I do not use new[] except in the classes that hide it
from me.
Also - technically the (2003) std does not guarantee that the memory of
std::vector is allocated contiguously, right?

Wrong, it is guaranteed.
std::string is not yet guaranteed to be contiguous (although I have yet
to see one that isn't), but vector is guaranteed.
See:
http://www.parashift.com/c++-faq-lite/containers.html#faq-34.3

Oh. Thanks for the info!
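Good to know, because that guarantee is exactly what lets you hand a
vector's buffer to a C-style API. A quick sketch; some_c_api is a made-up
function, not anything from a real library:

#include <vector>
#include <cstddef>

extern "C" void some_c_api(char* buf, std::size_t len);  // hypothetical C function

void call_it(std::size_t n)
{
    std::vector<char> v(n);
    if (!v.empty())
        some_c_api(&v[0], v.size());  // fine: vector storage is contiguous
}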

br,
Martin
 

Yannick Tremblay

I was only referring to the *allocation* in unoptimized _release_ (not
debug) code on VC8.

Sorry, I didn't mean to suggest that you said the above. I was
exaggerating a bit to illustrate a point :) But that's not far from
reality for some apologists who try to hide their refusal to learn new
(!? 20-year-old) technology.

Since we do NOT use any optimization flags here, this is what will end
up in production.

If you do not let the compiler do any optimisation, then clearly speed
is of no importance at all for your company. Hence you may as well take
the safety and speed of development of std::vector and ignore all
execution speed issues.

My results are somewhere around a factor of 5 on VC8 (VS2005) in favor of new[].

Try again with optimization on. You will probably find:

std::vector<int>(n) is a bit slower than new int[n] but not much

Great to know, thanks, and I do believe you, but for my production
environment non-optimized code is what matters.

Guess that leaves you with a couple of options:

1- Explain to your employer that they are being stupid. :)
(I suspect you'll not take that option :-(

2- It is still worth comparing. Using the Stepanov code, with g++ -O0
I get:
1st run:

   size    array   vector with pointers   vector with iterators
      10    1.62           1.93                   3.79
     100    1.04           1.09                   2.36
    1000    1.25           1.28                   2.39
   10000    1.16           1.19                   2.17
  100000    1.05           1.08                   1.93
 1000000    1.14           1.17                   2.14

2nd run:

   size    array   vector with pointers   vector with iterators
      10    1.54           1.82                   3.73
     100    1.06           1.05                   2.27
    1000    1.24           1.29                   2.39
   10000    1.17           1.16                   2.20
  100000    1.06           1.07                   1.90
 1000000    1.15           1.13                   2.05

i.e. array and vector with pointers show the same performance once
size >= 100, without any optimisation. Iterators are slower. Obviously
this test is not quite your particular case, but it certainly shows
that assumptions about slow vectors are not correct. With -O2 or -O3
the results are even more interesting.

You are talking mostly about memory allocation. Try testing just
memory allocation. A very basic test could be written in a few lines:

#include <cstddef>
#include <cstring>
#include <vector>

const size_t loopIter = 100000;
const size_t dataSize = 10000;

void test1()
{
    for(size_t i = 0; i < loopIter; ++i)
    {
        std::vector<char> v(dataSize);   // allocates and zero-initialises
    }
}

void test2()
{
    for(size_t i = 0; i < loopIter; ++i)
    {
        char *v = new char[dataSize];    // allocates, uninitialised
        delete[] v;                      // note: delete[], not delete
    }
}

void test3()
{
    for(size_t i = 0; i < loopIter; ++i)
    {
        char *v = new char[dataSize];
        std::memset(v, 0, dataSize);     // zero it, to match test1
        delete[] v;
    }
}

void test4()
{
    for(size_t i = 0; i < loopIter; ++i)
    {
        std::vector<char> v;
        v.reserve(dataSize);             // allocates, but constructs nothing
    }
}

Both with and without optimisation. On my system, regardless of
optimisation, test1 is pretty close to test3. test4 is a bit slower
than test2 but far less than a 2-1 margin, more like 10 to 50%.
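
If anyone wants to reproduce the comparison, a bare-bones harness is
enough; this one just uses clock() from <ctime> and assumes the test
functions above are in the same file:

#include <cstdio>
#include <ctime>

// runs one of the tests above and prints the elapsed CPU time in seconds
void time_it(const char* name, void (*fn)())
{
    std::clock_t start = std::clock();
    fn();
    std::clock_t stop = std::clock();
    std::printf("%s: %.2f s\n", name, double(stop - start) / CLOCKS_PER_SEC);
}

int main()
{
    time_it("test1", test1);
    time_it("test2", test2);
    time_it("test3", test3);
    time_it("test4", test4);
}

Crude, but good enough to see differences of the size discussed here.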

But again, it is worth repeating: the above is unlikely to make much
difference in your application. It would be a rare case where the
cost of new[] vs vector is making a significant difference in the
overall speed of the application. Optimizing a little trivial loop
like above might only gain you 0.01% speed in the whole application.


Yannick
 

stan

Martin said:
Triple-DES said:
Triple-DES wrote: [snip]
Without further evidence, I wouldn't go as far as saying that this is
100% safe on VC8, but it would seem that the most trivial cases are
handled gracefully.
This certainly seems slightly more comfortable than "100% sure". That
level of confidence always sets off warning flags "100%" of the time :)

Yes, and I guess I should moderate myself even further. What I should
have written was something along the lines of: Under certain
conditions, it is possible to write code in such a way that mixing
new[] / delete will not leak memory using VC8.
To be honest, I'm not convinced that even this claim stands up very well.
How do optimization, debugging, and other environment changes impact this
outcome? I'm not asking anyone to actually test this. I'm personally
willing to take the standard at its word and consider it undefined. From
the first time I heard about it I've been averse to doing anything that
might cause demons to fly out of my nose. My children have moved back in
with me and I'm already pressed for space.

I can't actually imagine why this question would ever come up. I can't
see any benefit from skirting the rules. Am I missing some possible
valid reason to investigate this?

My own reason for investigating this was that I wanted to convince the
OP that it was unsafe. Unfortunately, I haven't had any luck so far :)

No convincing was needed, actually. :)
I'm not going to do this, but having a clue as to *how* evil something
is can never hurt. Not so that you can repeat it, but so you know how
urgent it is to fix should you encounter it.

Seems like the risk is determined by the worst case possibility since
you can never be sure when that might happen. Given the right
circumstances it could format the drive. Without going down to the
assembly code you can't determine the factors involved and you can't be
sure the worst case won't happen, even if it didn't happen the last time
you ran the program. "I did it before and nothing really bad happened"
is the kind of thinking used by drunk drivers. It just doesn't seem like
a good model to follow. To me it seems like you have to put this in with
the top priority bugs to be fixed soon, and I certainly would fix it
prior to release.

I suppose this might be an issue where past programming experience might
impact one's view. I don't mean years of programming here, I'm talking
about the type and environment of programs. It's probably also influenced
by the number of hours of debugging and lost sleep :)
 
