Revisiting using return value as reference


Erik Wikström

Funny though, at some point in their education, someone new to C++ will
realize that references _are_ "crippled pointers" (that are
automatically dereferenced), and then comes the inevitable question of
why they are in the language. Then come these almost inevitable
discussions... :)

Maybe if it was explained at the outset that references are "crippled
pointers", useful precisely because they are "less powerful" (and
therefore more focused on a task), people wouldn't get confused when they
come to the realization themselves.

That would require that you teach them what pointers are early on. If
you teach them references first they might instead come to consider
pointers as references on steroids.
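To make the "automatically dereferenced" part concrete, here is a minimal
sketch (just an illustration):

#include <iostream>

int main()
{
    int x = 1;

    int* p = &x;   // a pointer must be dereferenced explicitly
    *p = 2;

    int& r = x;    // a reference is "dereferenced" automatically
    r = 3;

    std::cout << x << '\n';   // prints 3
}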
 

Daniel T.

Erik Wikström said:
That would require that you teach them what pointers are early on.
If you teach them references first they might instead come to
consider pointers as references on steroids.

Good point. I'm looking at this from the POV of a C programmer coming
to C++, and from the POV of standard instructional texts that do in fact
teach pointers first.
 

johanatan

Just what don't you understand? An expression in Java never
evaluates to an object. An expression is always either a
reference or a basic type.

Well, please re-read my question. I will quote here again for
clarity:

The question was: can an expression evaluate to a reference to an
object? The answer seems to me to be yes, and it seems to be yes
according to you as well. So, you can do something like:

class derived : public object {}

class mainClass
{
    derived f( derived one, derived two )
    {
        // do some sort of combination involving omitted derived members
        derived retVal = one.somemethod(two);
        return retVal;
    }

    int main()
    {
        derived one, two, three;
        // initialize one, two, and three
        derived four = f(f(one, two), three);
    }
}

So, the result of f is a variable (a reference variable) and is thus
an 'l-value' (according to the Java definition of such). As far as I
know this is no problem in Java, but in C++ the result of f(one, two)
would be an r-value and thus could not be passed to the outer call
f(rval, three).

Is that not correct?
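For concreteness, a minimal C++ sketch (made-up names, separate from the
code above) of the rule in question: a temporary returned by value is an
rvalue, so it won't bind to a non-const lvalue reference, although it can
still be passed by value or by const reference.

struct derived {};

derived make() { return derived(); }     // returns a temporary (an rvalue)

void by_value(derived d) {}               // accepts rvalues and lvalues
void by_const_ref(const derived& d) {}    // const references bind to rvalues
void by_ref(derived& d) {}                // lvalues only

int main()
{
    derived d;
    by_value(make());       // OK: the temporary is copied into the parameter
    by_const_ref(make());   // OK
    by_ref(d);              // OK: d is an lvalue
    // by_ref(make());      // error: cannot bind an rvalue to a non-const reference
}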

--Jonathan
 

johanatan

Funny though, at some point in their education, someone new to C++ will
realize that references _are_ "crippled pointers" (that are
automatically dereferenced), and then comes the inevitable question of
why they are in the language. Then come these almost inevitable
discussions... :)

Maybe if it was explained at the outset that references are "crippled
pointers", useful precisely because they are "less powerful" (and
therefore more focused on a task), people wouldn't get confused when they
come to the realization themselves.

Yeah, I think that order would be preferred (or at least teaching them
in parallel with assembly and C). But I think it somewhat depends on the
individual student. And, if Joel (from Joel on Software) is correct,
then a student who can't understand pointers is in the wrong field.
I tend to agree with that. Trying to simplify the situation by
calling them references or aliases is just going to prolong the
inevitable. (I would refer you back to Joel's article on 'Java
Schools' to support this point.)

And incompleteness, as it pertains to the 'alias' explanation, is also a
type of 'incorrectness'. The little detail that a reference occupies some
memory might be viewed by some as incompleteness, but by others as
incorrectness. There are other details it is 'incorrect' about as well.
As I said previously, any abstraction is 'imperfect'.
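As a quick sketch of that "little detail" (with the caveat that the
standard leaves it unspecified whether a reference requires storage; as a
class member it usually does on real implementations):

#include <iostream>

struct with_ref {
    int& r;                       // a reference member
    with_ref(int& x) : r(x) {}
};

int main()
{
    int i = 0;
    with_ref w(i);
    // On typical implementations the reference member occupies
    // pointer-sized storage inside the object.
    std::cout << sizeof(int) << ' ' << sizeof(w) << '\n';
}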

Another thought has occurred to me about '&': why do you think the
same symbol was used for the 'address-of' operator and the 'reference'
marker? It seems quite a strange coincidence when there are still
other perfectly usable symbols unused in C++ (take $, for instance).

--Jonathan
 

Jerry Coffin

[ ... ]
Another thought has occurred to me about '&': why do you think the
same symbol was used for the 'address-of' operator and the 'reference'
marker? It seems quite a strange coincidence when there are still
other perfectly usable symbols unused in C++ (take $, for instance).

The dollar sign is part of ASCII, but it has never been part of the C or
C++ basic character set. That symbol is also absent from quite a few other
character sets, such as ISO 646. C++ has gone to a fair amount of effort
to include features (trigraphs, digraphs, alternative tokens) that allow
writing its source code with restricted character sets, so using this
symbol would be _quite_ a strange addition to the language.
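For instance, a small sketch of what those alternative spellings allow; the
digraphs and alternative operator tokens below let the braces and the
ampersand be written another way entirely:

#include <iostream>

int main()
<%                           // digraph for the opening brace
    int a = 6, b = 3;
    int c = a bitand b;      // 'bitand' is the alternative token for &
    if (a and b)             // 'and' for &&
        std::cout << c << '\n';
    return 0;
%>                           // digraph for the closing brace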

There seems to be quite a bit of resistance to adding any new or different
symbol or keyword to the language, even when avoiding it means
adding a new meaning to one that already means too much.
 

johanatan

[ ... ]
Another thought has occurred to me about '&': why do you think the
same symbol was used for the 'address-of' operator and the 'reference'
marker? It seems quite a strange coincidence when there are still
other perfectly usable symbols unused in C++ (take $, for instance).

The dollar sign is part of ASCII, but it has never been part of the C or
C++ basic character set. That symbol is also absent from quite a few other
character sets, such as ISO 646. C++ has gone to a fair amount of effort
to include features (trigraphs, digraphs, alternative tokens) that allow
writing its source code with restricted character sets, so using this
symbol would be _quite_ a strange addition to the language.

There seems to be quite a bit of resistance to adding any new or different
symbol or keyword to the language, even when avoiding it means
adding a new meaning to one that already means too much.


Well, there are already plenty of other symbols in use too (some
of which are much less 'overloaded'); ?, %, ^, and ~ come to mind.
Those could just as easily have been overloaded as '&', which is already
serving triple duty, and quadruple duty if you count '&&': address-of,
reference marker, bitwise AND, and logical AND.
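To make that quadruple duty concrete, a tiny sketch with all four jobs of
'&' in one place:

int main()
{
    int a = 6, b = 3;

    int* p = &a;                     // address-of
    int& r = a;                      // reference declarator
    int  m = a & b;                  // bitwise AND
    bool t = (a != 0) && (b != 0);   // logical AND

    (void)p; (void)r; (void)m; (void)t;   // unused, shown only for the syntax
    return 0;
}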

Maybe there's a connection there between 'address-of' and reference.
I know there's always been one in my mind, but it is true that this
may not have been the intent of the designer (much as references came to
be used as 'syntactic sugar' well beyond their original purpose of
supporting operator overloading). But I find it a pretty convincing
coincidence.

--Jonathan
 
