Juha said:
And I have noticed that often when someone doesn't have any *actual*
good arguments, they instead go meta, criticizing someone for the
concepts or terminology they are using, rather than their actual
arguments.
Yeah, that was a bit snarkier than it perhaps should have been.
I don't think the invocation of category errors in this sort of
discussion is likely to be productive. Different languages live in
semi-orthogonal domains, so there tends not to be a lot of
synergy. Good language systems allow for interfacing to
other language systems with a minimum of fuss.
Ok, you lost all credibility. I don't think it's necessary to go further.
I've built significant, non-embedded, production systems using only
'C' that used no dynamic allocation at all. It doesn't get any
safer than that, and it wasn't difficult. You can do that in C++ too,
but it's a lot less ... sensible.
It is possible there is a selection bias I missed, but memory leaks
were much less of a problem when the state of the art was 'C', at
least within the things I saw. As a *cultural* artifact, C++
made things worse for a while. That said, newer
languages are beginning to improve on this significantly.
It may well be that 'C' simply kept otherwise sensible people
from *doing* programming at all, through its barrier to entry. Ironic,
since it was shrink-wrapped 'C' compilers that helped popularize it...
I have never understood why managing memory manually was
considered so onerous. Perhaps it's a personal failing. And
limitations can be extremely important to the general creative process.
I suppose my main gripe about C++ is just how much harder it made it
to dynamically reconfigure a running system. If all the cardinalities and
sizes have to be declared when the objects are instantiated, then you
have to tear them all down and rebuild them whenever a message to
reconfigure comes through. That can be a serious liability.