Steven T. Hatton
I read Stroustrup's article of the day:
http://www.research.att.com/~bs/C++.html
Programming with Exceptions. InformIt.com. April 2001.
http://www.research.att.com/~bs/eh_brief.pdf
Some of these ideas are finally beginning to sink in. I believe I looked at
the same article a while back and decided I wasn't quite ready for it. If I
understood things correctly, there seems to be a slight problem with the
design of his exception safe assignment operator. I believe I have
correctly extracted all the relevant code, and appropriately commented it
to describe the various issues discussed in the article. My questions are
in the final comment block.
class Vector { // v points to an array of sz ints
    int sz;
    int* v;
public:
    explicit Vector(int n);             // create vector of n ints
    Vector(const Vector&);
    ~Vector();                          // destroy vector
    Vector& operator=(const Vector&);   // assignment
    int size() const;
    void resize(int n);                 // change the size to n
    int& operator[](int i);             // subscripting
    const int& operator[](int i) const; // subscripting
};
Vector::Vector(int i)          // constructor
    : sz(i), v(new int[i]) { } // establishes class invariant: v points to sz ints
Vector::~Vector() { delete[] v; }
int Vector::size() const { return sz; } //no effect on class representation
struct Bad_range { }; // I want an Über-exception to derive from!
int& Vector::operator[](int i)
{
    if (0<=i && i<sz) return v[i];
    throw Bad_range(); // no effect on class representation if thrown
}
/*
This is a bad assignment operator implementation because it can
lead to bad things like double deletion when exceptions are thrown.
That's because it fails to maintain the class invariant that says
a Vector holds an array.
*/
Vector& Vector::operator=(const Vector& a)
{
    sz = a.sz;            // get new size
    delete[] v;           // free old memory
    v = new int[a.sz];    // get new memory; if this throws, v dangles
    copy(a.v,a.v+a.sz,v); // copy to new memory
    return *this;
}
/*
This is a better version because it maintains the invariant even
if the memory allocation fails. We can probably trust copy() not
to throw an exception, or to otherwise fail, so we don't worry
about losing *p.
*/
Vector& Vector::operator=(const Vector& a)
{
    int* p = new int[a.sz]; // get new memory [or throw bad_alloc]
    copy(a.v,a.v+a.sz,p);   // copy to new memory
    /* invariant is violated by the next statement */
    sz = a.sz;              // get new size
    delete[] v;             // free old memory
    /* invariant is reestablished by the next statement */
    v = p;
    return *this;
}
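(An aside: a standard idiom that gets the same strong guarantee with less bookkeeping is copy-and-swap. This is not from Stroustrup's article; the class below is my own minimal sketch, assuming the same sz/v layout as the Vector above.)

```cpp
#include <algorithm> // std::copy, std::swap

// Minimal copy-and-swap sketch; names are mine, not the article's.
class Vec {
    int sz;
    int* v;
public:
    explicit Vec(int n) : sz(n), v(new int[n]) {}
    Vec(const Vec& a) : sz(a.sz), v(new int[a.sz]) {
        std::copy(a.v, a.v + a.sz, v);
    }
    ~Vec() { delete[] v; }

    void swap(Vec& other) {          // cannot throw: just swaps ints and pointers
        std::swap(sz, other.sz);
        std::swap(v, other.v);
    }

    Vec& operator=(const Vec& a) {
        Vec tmp(a);   // may throw bad_alloc; *this is untouched if it does
        swap(tmp);    // cannot throw
        return *this; // tmp's destructor frees the old array
    }

    int size() const { return sz; }
    int& operator[](int i) { return v[i]; }
};
```

The invariant is never violated because all the throwing work happens on a local temporary, and the commit step is a non-throwing swap. Like the article's improved version, though, it allocates the new array before freeing the old one.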
/*
This is the example of where the problem with the first form
of Vector::operator=(const Vector& a) might arise. Vector::v is
deleted once by the assignment operator, and then again when the
destructor Vector::~Vector() is called upon exit of the containing
block.
*/
int main() {
    try
    {
        Vector vec(10);
        cout << vec.size() << '\n'; // so far, so good
        Vector v2(40*1000000);      // ask for 160 megabytes
        vec = v2;                   // use another 160 megabytes
    }
    catch(Bad_range) {
        cerr << "Oops: range error!\n";
    }
    catch(bad_alloc) {
        cerr << "Oops: memory exhausted!\n";
    }
}
/*
Now my observation is this:
If I don't free the memory in Vector::v until the penultimate statement
in the assignment operator function, then I will not be able to
allocate as much memory as I would with the original (bad)
implementation.
Earlier in the article Stroustrup provides this example of a
destructor: ~File_ptr() { if (p) fclose(p); }
If I were to use the comparable ~Vector() { if(v) delete[] v; }
in conjunction with the first form of the assignment operator
function that would seem to solve the problem of not freeing
potentially usable memory before allocating the new array.
A problem with that approach seems to be that it violates the
class invariant: v would sometimes be null. Is this a correct
observation on my part? Is there a way to accomplish both goals,
freeing the old memory prior to allocating more, and preserving
the class invariant? I can think of some approaches, but they all
seem to add overhead to the assignment operator function.
*/
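One possible answer, sketched below (my own code, not the article's), is to relax the invariant slightly to "v is null and sz is 0, or v points to an array of sz ints". Then the old array can be freed first, and if the new allocation throws, the vector is left empty but valid. Note delete[] of a null pointer is well-defined, so the destructor needs no null check; the price is that this gives only the basic guarantee (the target loses its old contents on failure), not the strong one.

```cpp
#include <algorithm> // std::copy

// Sketch of a free-before-allocate assignment under the relaxed
// invariant "v is null and sz is 0, or v points to sz ints".
class Vec2 {
    int sz;
    int* v;
public:
    explicit Vec2(int n) : sz(n), v(new int[n]) {}
    Vec2(const Vec2& a) : sz(a.sz), v(new int[a.sz]) {
        std::copy(a.v, a.v + a.sz, v);
    }
    ~Vec2() { delete[] v; } // delete[] of a null pointer is a no-op

    Vec2& operator=(const Vec2& a) {
        if (this == &a) return *this;  // freeing first makes this guard essential
        delete[] v;                    // free old memory before allocating new
        v = 0;                         // restore the (relaxed) invariant...
        sz = 0;                        // ...before anything that can throw
        v = new int[a.sz];             // may throw bad_alloc; vector stays empty
        sz = a.sz;
        std::copy(a.v, a.v + a.sz, v);
        return *this;
    }

    int size() const { return sz; }
    int& operator[](int i) { return v[i]; }
};
```

With this version the 160-megabyte example in main() needs only one array's worth of memory at a time, at the cost of the self-assignment check and the weaker guarantee.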