I do not understand, though, the reluctance to use C-style casts.
They can act as a reinterpret_cast, but they try the weaker casts
first. It's just an automatic way of doing what you would do manually:
first try const_cast, then static_cast, and finally reinterpret_cast.
Yes, a C-style cast can end up as a reinterpret_cast, but you would
have had to use one anyway in any particular case where you really
need it. So what's wrong with a little automation, provided that you
know what you are doing with casts?
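The trouble with that "automation" is that it is silent. The same cast
syntax means three different things depending on context, and the
compiler never tells you which one you got. A minimal sketch (the type
names are mine):

struct Base { virtual ~Base() {} };
struct Derived : Base {};
struct Unrelated {};

void demo(const int* p, Derived* d, Unrelated* u)
{
    int* a = (int*)p;    // removes const: acts as a const_cast
    Base* b = (Base*)d;  // derived-to-base: acts as a static_cast
    Base* c = (Base*)u;  // unrelated types: acts as a reinterpret_cast
    (void)a; (void)b; (void)c;
}

All three lines look identical at a glance, and if the types around
them change, a line can quietly slide from one meaning to another.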
Perhaps a story would help you understand.
A while back I worked under a guy who used C++ but basically feared
everything about it. Boy, did he love inheritance, though. He used
inheritance as his only method of code reuse. He created these
gigantic hierarchies that spanned multiple concepts, attempting to
tie them together. Trying to add a new object to this system was a
nightmare, because much of the time he put in "assert(false); // this
function shouldn't be called" as the base implementation instead of
using pure virtuals. He did this because his hierarchy was a total
pain in the ass and required you to implement more functions than you
needed. This of course left you not knowing which functions you
needed to override until you ran the program and hit an assert.
Eventually this guy quit and moved on to other things and places.
Hopefully he learned something about code design, but he was pretty
stubborn. Once he was gone, I set about trying to fix the project and
put it into a state that allowed for more code reuse and made it
easier to add new behavior.
Did I mention this guy hated everything about C++? That included
new-style casts. He used C casts everywhere, even where they were
not necessary. For example (reconstructed from memory, so the names
are approximate):
class ObjectA;                  // only a forward declaration in scope
ObjectA* parseObjectA(char* c); // factory defined elsewhere; name is mine

class Object
{
public:
    // ObjectA is incomplete here, so the C-style cast can't do the
    // derived-to-base conversion and quietly becomes a reinterpret_cast:
    Object* readObjectA(char* c) { return (Object*)parseObjectA(c); }
    // ... lots more of these ...
};

class ObjectA : public Object { /* ... */ };
Now, I did not realize this was going on at the time. I split the
various responsibilities that Object had in its interface into
multiple base classes and used multiple inheritance to apply those
interfaces only where they were actually used. I sent the code off to
testing, and it started acting really stupid. What happened?
Well, the original developer had seen something in the compiler output
like "cannot convert ObjectA* to Object*; use a reinterpret_cast or
C-style cast," and so he did that, instead of realizing he needed to
put these functions in a place that could see the full definition of
ObjectA so that the compiler COULD do the conversion the way it's
supposed to. To someone who believed only in all things C, this kind
of thing came completely naturally. Of course, a reinterpret_cast to
one base among many is a very bad thing to do.
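To see why, consider what multiple inheritance does to object layout
(a minimal sketch; the class names are mine):

#include <cstdio>

struct A { int a; virtual ~A() {} };
struct B { int b; virtual ~B() {} };
struct D : A, B {};

int main()
{
    D d;
    // static_cast knows D's layout and adjusts the pointer to the
    // B subobject inside D:
    B* good = static_cast<B*>(&d);
    // reinterpret_cast just relabels the address; on typical
    // implementations it still points at the A subobject, so using
    // it as a B is undefined behavior:
    B* bad = reinterpret_cast<B*>(&d);
    std::printf("%p\n%p\n", (void*)good, (void*)bad); // usually differ
}

Every virtual call through the wrongly cast pointer goes through the
wrong vtable, which is exactly the "acting really stupid" I was seeing.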
This problem was trivial to fix, but later I ran into further issues.
Because there were C-style casts all through the code, there was
plenty of opportunity for more to come up. The project used the Win32
API and had a lot of callback functions. These functions would take a
numeric type ID (the guy hated RTTI and dynamic casting) and then cast
to the appropriate type, having already cast the long or void* to the
base type. Often this code would assume it was only ever going to get
a certain subset of objects passed to it, because those were the only
ones with some somewhat related functionality. If the type wasn't one
of them, it would assume the other, using C-style casts to do so. So
the code would promptly cast to the wrong derived class and start
calling functions on it. This issue did not turn up until I started
messing with things; it was of course eliciting undefined behavior
all over the place, but nobody was the wiser, because it just worked,
and had worked for 15 years.
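The pattern looked roughly like this (a sketch from memory; the IDs
and names are made up):

enum TypeId { TYPE_BUTTON = 1, TYPE_SLIDER = 2 };

struct Widget { TypeId id; virtual ~Widget() {} };
struct Button : Widget { void click() {} };
struct Slider : Widget { void drag() {} };

void onEvent(void* param)
{
    Widget* w = (Widget*)param;  // C-style cast from the callback's void*
    if (w->id == TYPE_BUTTON)
        ((Button*)w)->click();   // fine, as long as the id is honest
    else
        ((Slider*)w)->drag();    // "must be the other one": wrong, and
                                 // undefined behavior, the day a third
                                 // widget type shows up
}

A dynamic_cast that returned null here would at least have failed
loudly at the point of the mistake.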
Because this guy used C-style casts instead of the appropriate
new-style cast (or even the inappropriate one), it was quite
impractical to hunt down casts like these and fix them. They had to
turn up as errors, which they occasionally did when we started adding
new objects to the tree that performed roles previously assumed to be
performed only by one group of classes. We knew we needed to start
doing dynamic casts for these and raising errors or exceptions (which
the guy also hated, of course) instead of allowing undefined behavior,
but we were stuck, because there is no such thing as a regular
expression that will match C-style casts.
The product tree had to be scrapped and redone. We were stuck with
old interfaces, lacked any amount of I18N while trying to get into
foreign markets, etc....things that nobody really thought of when the
project started. Because of poor design choices, and especially
because of the abundant use of C-style casts, the amount of work
necessary to fix the code was astronomical compared to simply redoing
the whole thing. It cost the company many thousands of dollars, and
they're still at it 3 years later.
C-style casts are horrible because they can do anything at any time
without any warning. One minor code change can turn a static cast
into a reinterpret cast. You can't hunt these things down, because
regular expressions are useless against them. The undefined behavior
they cause may or may not turn up any time soon....it may be two
decades before your code starts crashing in some place utterly
unrelated to the cast, in an object that has nothing to do with what
is actually in the memory it's using. NOBODY can keep track of every
place that needs to cast, especially within complex desktop
applications that use casting quite regularly. Stuff gets lost, and
bad things happen. The new-style casts are a much better method
because they turn up more errors (you get a compiler error instead of
a successful compile when your static_cast is no longer appropriate),
and they can be searched for quite easily. They should be used in all
cases where they can be, which is all cases in most projects.
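To make the "one minor code change" point concrete, here is a minimal
sketch (the names are mine):

struct Base { virtual ~Base() {} };
struct Thing : Base {};

Base* store(Thing* t)
{
    Base* b1 = (Base*)t;               // today: behaves as a static_cast
    Base* b2 = static_cast<Base*>(t);  // the same conversion, spelled out
    (void)b2;
    return b1;
}

// Now suppose someone later decides Thing shouldn't derive from Base:
//
//     struct Thing {};
//
// The C-style cast still compiles, silently becoming a reinterpret_cast,
// and every caller of store() gets a garbage pointer. The static_cast
// line refuses to compile and points you straight at the problem.

One of them fails at compile time; the other fails two decades later
in somebody else's stack trace.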