When did K&R function declarations become obsolete?

88888 Dihedral

That was the Pascal and Fortran era in 198x.
p2c and f2c were available, so a lot of people just switched to programming
in C.

I don't think p2c and f2c had much influence (I was looking for
CORAL66toc myself).
But nowadays we are on the 3rd generation of programming languages.


how do you count generations?

1. FORTRAN, COBOL, ALGOL, LISP
2. Pascal, Occam, Ada, C, C++
3. Java, Perl, Python, Ruby, C#


1st generation: just pure machine instructions, without any macros, libraries,
or pseudo-instructions. This is somewhat like the relay era of long ago.

2nd generation: assembler with enhanced tools

3rd generation: algol family, basic, fortran, pascal, c .....
// modular and structured, with algorithms

4th: forth, c++, object pascal, lisp // object oriented

5th: python, ruby, erlang, perl..... //generic, object, system level

This is a personal classification of various computer languages.
Where do Scheme, Snobol, Haskell, Prolog go?

Generations 4-6 are mixed; some languages were just experiments and were not
popular enough to gain millions of users.
 
ralph

.... On
the other hand, as far as I am concerned, K&R function declarations became
obsolete the first time a function prototype caught an error of mine that
would have taken me hours to track down otherwise.
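
To make the quoted point concrete, here is a minimal sketch (hypothetical
names, not from this thread) of the kind of error a prototype catches at
compile time but an old-style declaration lets through:

#include <stdio.h>

double average();   /* old-style declaration: says nothing about the parameters */

int main(void)
{
    int v[3] = {1, 2, 3};
    /* Wrong first argument type (int * instead of const double *).  With only
     * the old-style declaration in scope this compiles silently and has
     * undefined behaviour at run time.  With the prototype
     *     double average(const double *v, int n);
     * in scope instead, the call is a constraint violation and the compiler
     * reports it. */
    printf("%f\n", average(v, 3));
    return 0;
}

double average(const double *v, int n)
{
    double sum = 0.0;
    for (int i = 0; i < n; i++)
        sum += v[i];
    return n ? sum / n : 0.0;
}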

It wasn't just the compiler 'parser' either. 'Standard C' also made it
much easier to work with code browsers, formatters, and
static-checkers. Initially home-grown, but quickly supplemented with
external commercial utilities.

-ralph
 
Kaz Kylheku

It wasn't just the compiler 'parser' either. 'Standard C' also made it
much easier to work with code browsers, formatters, and
static-checkers. Initially home-grown, but quickly supplemented with
external commercial utilities.

Given that such things work fine for languages that don't have header files and
declarations, I'm somewhat skeptical.
 
Ben Bacarisse

Scott Fluhrer said:
I suspect that this is the problem; the compiler sees the tokens:

foo(a, b, c, d, e, f)

Is this a call to the function foo, or is this the start of a K&R definition
of the function 'foo'?

Your use of "the problem" is confusing. Alan Mackenzie explained what the
problem was that he and I were talking about, and it's not to do with
compiling but with parsing inside an editor.

I think you are suggesting that there is "another problem".
Now, in standard C, there really isn't such an ambiguity; a call to a
function can occur only within another function, while function definitions
can occur only at global level. However, I suspect that the OP is talking
about GCC (or a similar compiler), which allows functions to be defined in
other functions -- in that case, this ambiguity might happen a lot.

Personal opinions: I'd suspect that just forbidding implicit types of
functions-within-other-functions (so it'd have to be 'int foo(a,b,c,d,e,f)')
would be sufficient to break up the ambiguity in the most common cases. On
the other hand, as far as I am concerned, K&R function declarations became
obsolete the first time a function prototype caught an error of mine that
would have taken me hours to track down otherwise.

Standard C (C99 that is) both forbids nested functions *and* has removed
implicit int in declarations, so there is (doubly!) no ambiguity.
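
To make the point concrete: even under C89 rules (where implicit int is still
allowed), the same tokens are disambiguated by where they appear. A minimal
sketch, with hypothetical names:

/* At file scope, "foo(a, b)" begins an old-style definition
 * with an implicit int return type ... */
foo(a, b)
int a, b;
{
    return a + b;
}

int main(void)
{
    int a = 1, b = 2;
    return foo(a, b);   /* ... but inside a function body the same tokens are a call */
}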

My own personal opinion: I think it's a shame that ',' was chosen to
separate function parameter declarations. I would have used the compact
form available for declarations generally:

int foo(int a, b, c; double *sum);

When they don't need to be named, one would prefer

int foo(int; int; int; double *);

but, just to keep things uniform, I think the syntax might have been

int foo(int ,,; double *);
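
For reference, the "compact form available for declarations generally" is the
ordinary comma-separated declarator list; a standard prototype has to repeat
the type, and the ';' and ',,' forms above are hypothetical syntax, not valid C:

/* Ordinary declarations let one type specifier cover several declarators: */
int a, b, c;
double *sum;

/* A standard C prototype must repeat the type for each parameter: */
int foo(int a, int b, int c, double *sum);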
 
ralph

Given that such things work fine for languages that don't have header files and
declarations, I'm somewhat skeptical.

Skeptical of what?

That it isn't easier to write a browser, editor macros, or other
development utilities for C code that has prototypes compared to free
form code that doesn't? That's kind of silly, IMHO.

-ralph
 
Kaz Kylheku

Skeptical of what?

That it isn't easier to write a browser, editor macros, or other
development utilities for C code that has prototypes compared to free
form code that doesn't? That's kind of silly, IMHO.

Yes.

It's no harder to extract the same information from

int foo(a, b)
char *a;
double b;
{

than from

int foo(char *a, double b)
{

which is what I think your point boils down to.

The type info is there. Prototypes are just a crutch for type checking,
which allows C to retain the model of translation taking place without
any global knowledge about the program, other than what is brought in
via #include. They are a quick and dirty way to almost achieve support
for modularity in the language with the assistance of a textual preprocessor,
nothing more.
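
A minimal sketch of that model, with illustrative file names: the prototype
lives in a header that each translation unit textually includes, so every call
can be checked without any global knowledge of the program:

/* sum.h */
#ifndef SUM_H
#define SUM_H
double sum(const double *v, int n);   /* the shared prototype */
#endif

/* sum.c */
#include "sum.h"

double sum(const double *v, int n)
{
    double s = 0.0;
    for (int i = 0; i < n; i++)
        s += v[i];
    return s;
}

/* main.c */
#include <stdio.h>
#include "sum.h"

int main(void)
{
    double v[] = {1.0, 2.0, 3.0};
    printf("%f\n", sum(v, 3));   /* call checked against the included prototype */
    return 0;
}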

Code cross referencing and browsing tools cannot reasonably follow the same
limitation of dealing with one translation unit.

Do prototypes make it easier to make such tools? I don't see how. If you write
a great tool that cross-references material from K&R C code bases, it is extra
work to adapt that tool to handle prototypes, even just to recognize and ignore
them.
 
Seebs

It's no harder to extract the same information from

int foo(a, b)
char *a;
double b;
{

than from

int foo(char *a, double b)
{

It's harder for me to read. It's also quite a lot harder in the
case where I'm looking at headers, and the choice is between:

extern int foo();
extern int foo(char *, double);

The type info is there. Prototypes are just a crutch for type checking,
which allows C to retain the model of translation taking place without
any global knowledge about the program, other than what is brought in
via #include. They are a quick and dirty way to almost achieve support
for modularity in the language with the assistance of a textual preprocessor,
nothing more.

They're good enough modularity for me to use binaries for source code that's
never been on my machine, which is good for me.

Do prototypes make it easier to make such tools?

Well, they certainly do in the fairly common case where there's a reason
for which you have headers but not the complete source.

-s
 
Joe keane

extern int foo();
extern int foo(char *, double);

My onion is that no one gives a rat's behind about the syntax for how
functions are -defined- [i like old one better] but putting types where
they are -declared- is worth about five other features combined.

Maybe you could make a utility that checks across source files?
 
Ben Pfaff

extern int foo();
extern int foo(char *, double);

My onion is that no one gives a rat's behind about the syntax for how
functions are -defined- [i like old one better] but putting types where
they are -declared- is worth about five other features combined.

I'm not sure of what relationship you are suggesting between an
onion and a rat's behind, but I think that it's rather nice to
have the declaration and the definition take the same form, that
is, I appreciate not having to write the definition differently
from the prototype.
 
Seebs

My onion is that no one gives a rat's behind about the syntax for how
functions are -defined- [i like old one better] but putting types where
they are -declared- is worth about five other features combined.

I mostly agree with this. If it weren't for the "default promotions"
applying to old-style definitions, I might well use them by preference.

-s
 
Keith Thompson

extern int foo();
extern int foo(char *, double);

My onion is that no one gives a rat's behind about the syntax for how
functions are -defined- [i like old one better] but putting types where
they are -declared- is worth about five other features combined.

Maybe you could make a utility that checks across source files?

Certainly some people do care about the syntax of function definitions.

Personally, I prefer the "new-style" (actually several decades old)
definitions, mostly because they're consistent with "new-style"
declarations.
 
Kaz Kylheku

My onion is that no one gives a rat's behind about the syntax for how
functions are -defined- [i like old one better] but putting types where
they are -declared- is worth about five other features combined.

I mostly agree with this. If it weren't for the "default promotions"
applying to old-style definitions, I might well use them by preference.

Obsolescence of a feature can lead to a semantics change to the old syntax,
rather than removal.

I.e. it could simply be required, one day, that

void x(f)
float f;
{
}

now means exactly the same thing as:

void x(float f)
{
}

and the syntax is no longer obsolescent, just the old semantics.
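
A sketch of why the two forms do not currently mean the same thing (this is
the "default promotions" issue mentioned above); the function names are
illustrative. With the old-style definition the caller promotes a float
argument to double and the callee narrows it back, whereas the prototyped
definition receives a genuine float. That is also why declaring void x(float)
and then defining x old-style with "float f;" gives incompatible types.

#include <stdio.h>

/* old-style: the caller applies the default argument promotions,
 * so the value arrives as a double and is converted back to float */
void x_old(f)
float f;
{
    printf("old: %f\n", f);
}

/* prototyped: the caller passes an actual float */
void x_new(float f)
{
    printf("new: %f\n", f);
}

int main(void)
{
    x_old(1.5f);   /* no prototype in scope: 1.5f is promoted to double */
    x_new(1.5f);   /* prototype in scope: passed as float */
    return 0;
}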
 
Alan Mackenzie

Kaz Kylheku said:
It's no harder to extract the same information from
int foo(a, b)
char *a;
double b;
{
than from
int foo(char *a, double b)
{
which is what I think your point boils down to.

As I explained yesterday, it is harder for an editor (in particular,
Emacs). There, parsing is done backwards, finding the syntactic context
of a particular line. Old style definitions are a royal pain in the
arsenic. The type definitions in them are the one thing in C which isn't
context free. (C++ is much worse.)
 
Keith Thompson

Jonathan Leffler said:
They are still used in the GCC source code, probably to ensure that
GCC can be compiled by a compiler that does not recognize prototypes.
(It used to be the case that HP-UX came with such a compiler, used for
rebuilding the kernel.)

As of gcc 3.4:

* GCC now requires an ISO C90 (ANSI C89) C compiler to build. K&R C
compilers will not work.

(quoted from the gcc 4.6.0 NEWS file). gcc 3.4.0 was released April,
2004; the last 3.4 release, 3.4.6, was March, 2006. But that doesn't
necessarily mean there aren't still non-prototype declarations.

[...]
I still encounter them routinely, and curse the fact that no-one else
has bothered to convert them. (There are more or less complete sets
of headers with prototypes for the functions - so there is normally a
prototype in scope when the function definition is compiled or the function
is called, but not always.) And one or two benighted individuals have
produced new K&R functions for consistency with the rest of the
file. When I find out in time, I have a firm discussion with the
individual, who usually repents of their aberrant ways. But too often
I don't find out about it until some time later.

Maintaining existing coding style is *usually* a very good idea.
Prototypes are an unusual case where one style is clearly better than
another one.
 
88888 Dihedral

Jonathan Leffler said:
They are still used in the GCC source code, probably to ensure that
GCC can be compiled by a compiler that does not recognize prototypes.
(It used to be the case that HP-UX came with such a compiler, used for
rebuilding the kernel.)

As of gcc 3.4:

* GCC now requires an ISO C90 (ANSI C89) C compiler to build. K&R C
compilers will not work.

(quoted from the gcc 4.6.0 NEWS file). gcc 3.4.0 was released April,
2004; the last 3.4 release, 3.4.6, was March, 2006. But that doesn't
necessarily mean there aren't still non-prototype declarations.

[...]
I still encounter them routinely, and curse the fact that no-one else
has bothered to convert them. (There are more or less complete sets
of headers with prototypes for the functions - so there is normally a
prototype in scope when the function definition is compiled or the function
is called, but not always.) And one or two benighted individuals have
produced new K&R functions for consistency with the rest of the
file. When I find out in time, I have a firm discussion with the
individual, who usually repents of their aberrant ways. But too often
I don't find out about it until some time later.

Maintaining existing coding style is *usually* a very good idea.
Prototypes are an unusual case where one style is clearly better than
another one.


Unless no new CPU instruction sets ever emerge in the world, the old-style K&R
non-prototyped calling convention at the assembly level won't die.
 
luser- -droog

When I started using C in 1989 they were still quite commonly seen in the
code I was looking at, but that dropped significantly during the following
decade, and I have hardly ever seen code using them written in this
millennium. So my very personal guess is that most people have used the "new"
form since at least about 1995.

Mid-90s seems about right. I've got Johnson & Reichard, X Window
Applications Programming, 2nd ed., from 1992, which still has K&R function
declarations. But, of course, they'd been maintaining that code since
about 1985.
 
