[ ... ]
If I needed some "concept" I would create an abstract class
that has all those properties and the compiler could check
that the given type conforms to all the properties of the
specified class.
Not so. If you use inheritance, you quickly end up with something
virtually indistinguishable from Smalltalk. In particular, your type
checking ends up being done at run time rather than at compile time.
The problem is fairly simple: inheritance means selecting a type far
enough up the tree that it's an ancestor of EVERY type you might care to
work with. Unfortunately, that almost inevitably includes a lot of other
types you don't want to work with.
You also end up really warping the inheritance tree to make this work.
Consider, for example, sorting collections of objects. We can compare
strings, so we want to support sorting collections of strings. We can
also compare integers, so we want to support sorting collections of
integers. To support that, we have to apply 'sort' to some common
ancestor of both 'string' and 'integer' -- so far so good.
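In code, that design ends up looking something like this (Comparable,
String and Integer are purely illustrative names, not anybody's actual
library):

#include <algorithm>
#include <string>
#include <vector>

struct Comparable {
    virtual ~Comparable() = default;
    virtual bool less_than(const Comparable& other) const = 0;
};

struct String : Comparable {
    std::string value;
    bool less_than(const Comparable& other) const override {
        // unchecked downcast -- nothing guarantees 'other' really is a String
        return value < static_cast<const String&>(other).value;
    }
};

struct Integer : Comparable {
    int value = 0;
    bool less_than(const Comparable& other) const override {
        return value < static_cast<const Integer&>(other).value;
    }
};

// sort applies to the common ancestor -- but that same parameter type
// happily accepts a container that mixes Strings and Integers.
void sort_comparables(std::vector<Comparable*>& items) {
    std::sort(items.begin(), items.end(),
              [](const Comparable* a, const Comparable* b) {
                  return a->less_than(*b);
              });
}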
Now we run into rather a problem, though: contrary to your earlier
statement, floating point numbers really canNOT be sorted. In
particular, sorting (at least as normally implemented) requires that if
!(a<b) && !(b<a), then (a==b). A floating point NaN violates that
requirement -- a NaN doesn't compare equal to another NaN, and doesn't
even compare equal to itself!
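A quick demonstration in plain C++:

#include <cmath>
#include <cstdio>

int main() {
    double a = std::nan("");
    double b = 1.0;

    bool neither_less = !(a < b) && !(b < a);   // true: NaN is "unordered"
    bool equal        = (a == b);               // false
    bool self_equal   = (a == a);               // false: NaN != NaN
    std::printf("%d %d %d\n", neither_less, equal, self_equal);   // prints 1 0 0

    // Handing a range containing a NaN to std::sort is undefined behaviour,
    // precisely because operator< is then not a strict weak ordering.
}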
Since we want to allow assignment from integer to floating point, but
not the reverse, floating point should be an ancestor of integer. We've
already seen, however, that string and integer need a sortable ancestor
that is NOT an ancestor of floating point (otherwise floating point
itself would count as sortable). In a tree, the ancestors of integer
form a single chain, so that sortable ancestor has to sit below floating
point -- ergo, floating point ends up as a base of both string and
integer. This allows something we really DON'T want though -- an
implicit conversion from any string to floating point.
The alternative is for string and integer to be only distantly related
(sensible) and to do the sorting on some abstract type near the root of
the tree that's an ancestor of integer, floating point AND string, and
then check at run time to guard against attempts to sort things like
floating point numbers that can't be sorted.
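One way that might look -- again purely a sketch, with made-up Object,
Sortable and Double names:

#include <algorithm>
#include <stdexcept>
#include <vector>

struct Object {                     // abstract type near the root of the tree
    virtual ~Object() = default;
};

struct Sortable : Object {          // the things that really can be ordered
    virtual bool less_than(const Sortable& other) const = 0;
};

struct Double : Object { double value = 0.0; };   // deliberately NOT Sortable

// sort takes the near-root type, so the compiler can't help us; whether
// the elements can actually be ordered is only discovered at run time.
void sort_objects(std::vector<Object*>& items) {
    for (const Object* item : items)
        if (dynamic_cast<const Sortable*>(item) == nullptr)
            throw std::runtime_error("element cannot be sorted");

    std::sort(items.begin(), items.end(), [](Object* a, Object* b) {
        return static_cast<Sortable*>(a)->less_than(*static_cast<Sortable*>(b));
    });
}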
The only reason that this is not done is that OO is no longer
"in", i.e. the OO "FAD" has disappeared. We have new fads
now.
Quite the contrary. 30+ years of experience with inheritance has shown
that while it models some relationships quite well, there are many other
relationships it models quite poorly.
Inheritance requires us to specify similarity in an all-or-nothing
fashion. Specifically, a derived object shares ALL the characteristics
of its base. If there is too much difference to allow that, then we
can't use inheritance, which prevents us from taking advantage of ANY
similarity.
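For instance (a made-up Number/Matrix pair, purely to illustrate the
all-or-nothing problem):

#include <stdexcept>

struct Number {
    virtual ~Number() = default;
    virtual void add(const Number&)    = 0;
    virtual void divide(const Number&) = 0;   // comes with the package
};

// A matrix can share 'add' with numbers, but division isn't generally
// defined for it. Inheritance offers only two bad options: stay out of
// the hierarchy entirely (losing the shared 'add' interface as well), or
// accept the whole contract and fail at run time:
struct Matrix : Number {
    void add(const Number&) override { /* elementwise addition */ }
    void divide(const Number&) override {
        throw std::logic_error("matrix division is not supported");
    }
};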
Concepts allow us to specify the exact degree of similarity between
types necessary for a particular operation. There's no need to create a
new type specifically to encapsulate the similarity needed for this
particular operation, and there's no need to operate on excessively
abstract objects and then do run-time checks to ensure that the objects
really do support the operations we care about. Likewise, we don't end
up with unwanted implicit conversions just because SOME operations imply
a relationship that doesn't work otherwise.
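A minimal sketch of the concept-based version, using C++20 and
std::totally_ordered (sort_all is just a name I picked):

#include <algorithm>
#include <concepts>
#include <string>
#include <vector>

// sort_all states exactly the similarity it needs -- a total ordering on
// T -- and nothing else: no common base class, no run-time checks, no
// unwanted conversions.
template <std::totally_ordered T>
void sort_all(std::vector<T>& items) {
    std::sort(items.begin(), items.end());
}

int main() {
    std::vector<std::string> words{"pear", "apple"};
    std::vector<int>         numbers{3, 1, 2};

    sort_all(words);     // OK: std::string models std::totally_ordered
    sort_all(numbers);   // OK: int models std::totally_ordered
    // A type with no ordering at all is rejected at compile time, with a
    // diagnostic naming the unsatisfied concept.
}

Note that std::totally_ordered checks the required syntax at compile
time; the semantic requirement that the ordering really is total -- the
one NaN breaks -- is still a stated precondition rather than something
the compiler proves.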