Why doesn't strrstr() exist?

  • Thread starter Christopher Benson-Manica

Keith Thompson

Magnus Wibeck said:
The Ariane software module that caused the problem was written in
Ada. http://sunnyday.mit.edu/accidents/Ariane5accidentreport.html
Had it been written in C, the actual cause (integer overflow)
probably would not have caused an exception. I'm not saying that it
would have been better in C, but you *cannot* blame the C standard
for what happened there.

Nor can it be blamed on Ada. (Not that you did so, I just wanted to
clarify that point.)

The details are off-topic but easily Googlable.
 

Keith Thompson

Chris Torek said:
[off-topic drift, but I cannot resist...]

You feel that my choice of moniker reflects something about my level
of expertise? Note that "Default User" is NOT the default name in
XanaNews, my current newsreader.

I always figured it meant that you are known for requiring a
"default:" label in every switch(). :)

Or for using a "Default:" label, which is perfectly legal but not
terribly useful.
 

Magnus Wibeck

Chris Hills wrote:
[..]
>
> why not?

I find such a comparison void of meaning. Like comparing apples and anxiety.
A language standard is a passive item that describes one specific thing.
You cannot pay it money to do what you want.
A support department in a commercial company (should) bend over backwards
to help its customers get issues with the company's products sorted out.

I missed the point websnarf was making, which, if I understand it correctly,
is that the fact that there is (lots of) support for products that use C
somehow implies that C is an unsafe language.

If that is the point websnarf was trying to make, the comparison should
be the frequency of 3rd-party support contacts made regarding C-caused
problems compared to other languages. Obviously those are numbers that are darn near
impossible to gather.

I'm not getting into the "C is unsafe" discussion; I just saw a few
deductions about C and the "unsafeness" of it that were, as I see it,
flawed, and tried to address them.

/Magnus
 

Joe Wright

Default said:
Stephen Hildrey wrote:





You feel that my choice of moniker reflects something about my level
of expertise? Note that "Default User" is NOT the default name in
XanaNews, my current newsreader.



Brian

Nonsense. Among the several header lines of your message are..

From: "Default User" <[email protected]>
Newsgroups: comp.lang.c
Subject: Re: Why doesn't strrstr() exist?
Date: 25 Aug 2005 18:11:29 GMT
Lines: 15
Message-ID: <[email protected]>
User-Agent: XanaNews/1.16.3.1
 

websnarf

Hallvard said:
So it's partly their fault? What should they have done -
refrained from standardizing the already existing C language?

The ANSI C standard was good enough for 1989, when the computing
industry was still in its growing stages. It served the basic purpose
of getting everyone behind a common standard.

It's the standards that came *AFTER* that where the problem is. The
problem of "buffer overflows" and similar issues was well documented
and even then was making the news. And look at the near-universal
indifference to the C99 standard among compiler vendors. The point is that
the "coming together" has already been achieved -- the vendors have
already gotten the value of a unified standard out of the 1989
standard. The C99 standard doesn't solve any crucial problems of a
similar nature.

But suppose the C99 standard (or C94, or some future standard) included
numerous changes for the purpose of security that broke backwards
compatibility. Vendors concerned about backward compatibility would
just stick with the older standard (which is what they are doing right
now anyway), and if they felt security was more important, they would
move toward the new standard.

The point being that the real reason there has been so little C99
adoption is that there is little *value* in it. The foremost thing
it delivers is backwards compatibility -- but that's something the
compiler vendors *ALREADY HAVE* by sticking with the previous
standards.

Because C99 has so little value added over C89, there is no demand for
it. And it fundamentally means the language really only solves the
same problems that it did in 1989. Even for me, restrict was really
the only language feature I was remotely interested in, and <stdint.h>
the only other thing in the standard with any real value in it.
But since the vendors I use are not interested in implementing C99, I
have lived with "assume no aliasing" compiler switches and have
fashioned my very own stdint.h. It turns out that, in practice, this
completely covers my C99 needs -- and I'm sure it covers most people's
C99 needs. And this is just using 1989 C compiler technology.
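
To make the "roll your own" point concrete, the guts of such a header is
roughly the following -- just a sketch, assuming a typical ILP32 or LP64
platform (8-bit char, 16-bit short, 32-bit int, 64-bit long long, the
last being a common extension on pre-C99 compilers); the widths have to
be re-checked for every compiler you target:

    /* minimal home-grown <stdint.h> substitute (sketch only) */
    typedef signed char        int8_t;
    typedef unsigned char      uint8_t;
    typedef short              int16_t;
    typedef unsigned short     uint16_t;
    typedef int                int32_t;
    typedef unsigned int       uint32_t;
    typedef long long          int64_t;   /* compiler extension pre-C99 */
    typedef unsigned long long uint64_t;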

If the C99 standard had solved *important* problems that are plaguing
programmers today, then I think there would be more demand. You might
cause a fracture in the C community, but at least there would be some
degree of keeping up with the needs of the C community. And the reason
*WHY* such things should be solved in the C standard and not just in other
languages is that this is where the largest problems are, and where
the effect can be leveraged to the greatest degree.
That would not have helped: K&R C was already widely used, and
people were cooperating anyway to get some sort of portability out
of it.

Right. It would not have helped in 1989. But C99 doesn't help anyone
today. That's the key point.
Or should they have removed every undefined situation from the
language?

No, just the worst offenders.
[...] Bye bye free() and realloc() - require a garbage
collector instead. To catch all bad pointer usage, insert
type/range information in both pointers and data. Those two
changes alone in the standard would change the C runtime
implementation so much that it's practically another language.

Right. That's not what I am advocating.

Tell me, if you removed gets, strtok, and strn??, would you also have
practically another language?
 

Keith Thompson

Joe Wright said:
Default User wrote: [...]
You feel that my choice of moniker reflects something about my
level
of expertise? Note that "Default User" is NOT the default name in
XanaNews, my current newsreader.
Brian

Nonsense. Among the several header lines of your message are..

From: "Default User" <[email protected]>
Newsgroups: comp.lang.c
Subject: Re: Why doesn't strrstr() exist?
Date: 25 Aug 2005 18:11:29 GMT
Lines: 15
Message-ID: <[email protected]>
User-Agent: XanaNews/1.16.3.1

And how does this imply that his statement is nonsense?

If Mr. and Mrs. User named their son Default, I'd expect his articles
to have very similar headers if he used the same server and
newsreader.
 

Randy Howard

(e-mail address removed) wrote
(in article
Randy Howard wrote:

Right. If you're not with us, you are with the terrorists.

Excuse me?
Why does being a low-level language mean you have to present a programming
interface surrounded by landmines?

If you have access to any sequence of opcodes available on the
target processor, how can it not be?
Exposing a sufficiently low level
interface may require that you expose some dangerous semantics, but why
expose them up front right in the most natural paths of usage?

Do you feel that 'gets()' is part of the most natural path in C?
Ok, the halting problem means basically nobody guarantees anything
about computer programming.

Fair enough, but you're just dodging the underlying question.
But it's interesting that you bring up the question of assembly
language. If you peruse the x86 assembly USENET newsgroups, you will
see that many people are very interested in expanding the power and
syntax for assembly language (examples include HLA, RosAsm, and
others).

For a suitably generous definition of 'many', perhaps.
A recent post talked about writing a good string library for
assembly, and there was a strong endorsement for the length prefixed
style of strings, including one direct reference to Bstrlib as a design
worth following (not posted by me!).

I would have been shocked if you had not figured out a way to
bring your package up. :)
So, while assembly clearly isn't an inherently safe language, it seems
quite possible that some assembly efforts will have a much safer (and
much faster) string interface than C does.

Which does absolutely nothing to prevent the possibility of
developing insecure software in assembler. It may offer some
advantages for string handling, but that closes at best only one
of a thousand doors.
[...] If you want to argue that too many people
write code in C when their skill level is more appropriate to a
language with more seatbelts, I won't disagree. The trick is
deciding who gets to make the rules.

But I'm not arguing that either. I am saying C is to a large degree
just capriciously and unnecessarily unsafe (and slow, and powerless,
and unportable etc., etc).

Slow? Yes, I keep forgetting how much better performance one
achieves when using Ruby or Python. Yeah, right.

Powerless? How so? It seems to be the only language other than
assembler which has been used successfully for operating system
development.

Unportable? You have got to be kidding. I must be
hallucinating when I see my C source compiled and executing on
Windows, Linux, NetWare, OS X, Solaris, *bsd, and a host of
other UNIX-like platforms, on x86, x86-64, PPC, Sparc, etc.
Well, not exactly. If you're not using C or C++, then buffer overflows
usually at worst lead to a runtime exception; in C or C++, exploits are
typically designed to gain shell access in the context of the erroneous
program. It's like honey for bees -- people attack C/C++ programs
because they have this weakness. In other, safer programming languages,
even if you had a buffer overflow, a control-flow zombification of the
program is typically not going to be possible.

That is all true, and it does nothing to address the point that
C is still going to be used for a lot of development work. The
cost of the runtime error handling is nonzero. Sure, there are
a lot of applications today where they do not need the raw speed
and can afford to use something else. That is not always the
case. People are still writing a lot of inline assembly even
when approaching 4GHz clock speeds.
How about a simpler language that is more powerful, demonstrably faster,
more portable (dictionary definition), obviously safer and still just
as low level?

That would be nice.
Just take the C standard, deprecate the garbage, replace
a few things, genericize some of the APIs, well define some of the
scenarios which are currently described as undefined, make some of the
ambiguous syntaxes that lead to undefined behavior illegal, and you're
immediately there.

I don't immediately see how this will be demonstrably faster,
but you are free to invent such a language tomorrow afternoon.
Do it, back up your claims, and no doubt the world will beat a
path to your website. Right? "D" is already taken, what will
you call it?
Your problem is that you assume making C safer (or faster, or more
portable, or whatever) will take something useful away from C that it
currently has. Think about that for a minute. How is it possible that
your mind can be in that state?

It isn't possible. What is possible is for you to make gross
assumptions about what 'my problem' is based upon the post you are
replying to here. I do not assume that C can not be made safer.
What I said, since you seem to have missed it, is that the
authors of the C standard are not responsible for programmer
bugs.
But I was not advocating that. You want punishment -- so you
implicitly are *demanding* programmer perfection.

No, I am not. I do not demand that doctors are perfect, but I
expect them to be highly motivated to attempt to be perfect.
So, estimate the time taken to absorb this information per programmer,
multiply it by the average wage of that programmer, multiply that by
the number of programmers that follow it, and there you get the cost
of doing it correctly.

What cost? Some 'world-wide rolled-up cost'? For me, it cost
me almost nothing at all. I first discovered gets() was
problematic at least a decade ago, probably even earlier, but I
don't keep notes on such things. It hasn't cost me anything
since. If I hire a programmer, this has all been settled to my
satisfaction before they get an offer letter. It hasn't been a
problem and I do not expect it to be one in the future.
The standards body, just needs to remove it and those costs go away.

They do not. As we have already seen, it takes years, if not
decades for a compiler supporting a standard to land in
programmer hands. With the stunningly poor adoption of C99, we
could not possibly hope to own or obtain an open source C0x
compiler prior to 2020-something, if ever. In the mean time,
those that are serious solved the problem years ago.
You don't think people who move code around with calls
to gets() in it should remove them?

Of course I do. In fact, I say so, which you conveniently
quoted just below...
And an old million line program?

Didn't /you/ just say that they should be removed?
I think this process should be
automated. In fact, I think it should be automated in your compiler.
In fact I think your compiler should just reject these nonsensical
functions out of hand and issue errors complaining about them.

Make up your mind. Fixing them in the compiler, as I would
expect an 'automated' solution to do, and rejecting the
offending lines are completely different approaches.
Hey! I have an idea! Why not remove them from the standard?

Great idea. 15 years from now that will have some value.

A better idea. Patch gcc to bitch about them TODAY, regardless
of the standard.
Interesting -- because I do. You make gets a reserved word, not
redefinable by the preprocessor, and have it always lead to a syntax
error.

What part of 'people can still fire up an old compiler' did you
fail to read and/or understand?
This has value because, developers can claim to be "C 2010 compliant"
or whatever, and this can tell you that you know it doesn't have gets()
or any other wart that you decided to get rid of.

They could also simply claim "we are smarter than the average
bear, and we know better than to use any of the following offensive
legacy functions, such as gets(), ..."

To clarify, since it didn't soak in the first time, I am not
opposed to them being removed. I simply don't see this as a magic
bullet, and certainly not in the sense that it takes far too
long for the compilers to catch up with it. I would much rather
see compilers modified to deny gets() and its ilk by default,
and require a special command line option to bypass it, /if at
all/. However, the warning message should be far more useful
than
gets.c: 325: error: gets() has been deprecated.

That's just oh so useful, especially to newbies. I wouldn't
care if it dumped a page and a half of explanation, along with a
detailed example of how to replace such calls with something
safer. After all, good code doesn't have it in them anyway, and
it won't annoy anyone that is competent.
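
Something along these lines, say -- a sketch only, not the one true
replacement; it reads at most size-1 characters and strips the newline
that fgets() keeps and gets() did not:

    #include <stdio.h>
    #include <string.h>

    /* read one line safely into buf; returns 0 on EOF or error */
    static int read_line(char *buf, size_t size)
    {
        if (fgets(buf, size, stdin) == NULL)
            return 0;
        buf[strcspn(buf, "\n")] = '\0';  /* drop the newline fgets() keeps */
        return 1;
    }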
This would in turn
put pressure of the legacy code owners to remove the offending calls,
in an effort that's certainly no worse than the Y2K issue (without the
looming deadline hanging over their heads).

If, and only if, they use a compiler with such changes. We
still see posts on a regular basis with people using old 16-bit
Borland compilers to write new software.
Oh I see. So, which socialist totally unionized company do you work as
a programmer for? I'd like to apply!

I don't think you understood me. I know of no company that has
a policy for this. However, if I was working on something and
felt that something was being done that could be inherently
dangerous, and it was going to ship anyway, I would take some
form of legal action, if for no other reason than to be able to
disassociate myself from the impending lawsuits.

I would much rather go look for work than participate in
something that might wind up with people dying over the actions
of some meddling manager.
[...] If you
are being overworked, you can either keep doing it, or you can
quit, or you can convince your boss to lighten up.

Hmmm ... so you live in India?

Why would you think so?
I'm trying to guess where it is in this
day and age that you can just quit your job solely because you don't
like the pressures coming from management.

Where do you live? Because I am trying to guess where on the
planet you would /not/ have the right to quit your job.
Indentured servitude is not widely practiced anymore, AFAIK.
[...] ESPECIALLY in this case, the C standard folks are not to blame.

But if the same issue happens and you are using a safer language, the
same kinds of issues don't come up. Your code might be wrong, but it
won't allow buffer overflow exploits.

You can have 10 dozen other forms of security failure, that have
nothing to do with buffer overflows. It isn't a panacea. When
one form of attack is removed, another one shows up.

For example, the last straw that sent Microsoft Windows off my
network for eternity happened recently. A computer system
running XP, SP2, all the patches, automatic Windows updates
daily, virus software with automatic updates and real-time
protection, email-virus scanning software, two different brands
of spyware protection, also with automatic updates enabled, and
both a hardware firewall and software firewall installed, got
covered up in viruses after 2 hours of letting my kids use it to
go play some stupid online kids game on disney.com or
nickelodeon.com (not sure which, since they went to both, and I
didn't want to replicate it). Suddenly, when I come back to
look at it, it has 3 or 4 new taskbar icons showing downloads in
progress of I know not what, task manager shows a bunch of extra
processes that shouldn't be there, the registry run keys are
stuffed full of malware, and it's pushing stuff out the network
of I know not what. I pull the cable, start trying to delete
files, which Windows wants to tell me I don't have permission to
do, scanning, the browser cache directories are filled with .exe
and .dll files, it's out of control.

A few expletives later, and I was installing a new Linux distro
that I had been meaning to try out for a while.

I had done just about everything I could imagine to lock the
system down, and it still got out of control in 2 hours letting
a 12-yr-old browse a website and play some games.

Of course, if enough people do the same thing, the bad guys will
figure out how to do this on Linux boxes as well. But for now,
the OS X and Linux systems have been causing me (and the kids)
zero pain and I'm loving it.
That's a nice bubble you live in. Or is it just in your mind?

No, I'm just not a spineless jellyfish. It's rather
disappointing that it surprises you, it doesn't say much for
your own backbone that you would just roll over when faced with
this sort of thing.
Because it's not as structured, and that's simply not practical.
Doctors have training, internships, etc. Lawyers have to pass a bar
exam, etc. There's no such analogue for computer programmers.

Thank you. You get it now. That is exactly what is missing.
Because the most successful programmers are always ones that are
able to think outside the box,

Then they should have zero problems passing a rigorous training
program and examinations.
but the bar for average programmers is pretty low --

Fine. If you don't have your cert, you can be a 'nurse', you
can write scripts, or use uber-safe languages certified for
those not willing to prove themselves worthy through formal
certification.
but both can make a contribution, and neither can guarantee
perfect code.

And no doctor can guarantee that you won't die on the operating
table. But, they have to prove that they are competent anyway,
despite the lack of a guarantee of perfection. Would you like
it if they didn't have to do so?
Dennis Ritchie had no idea that NASA would put a priority inversion in
their pathfinder code.

Are you implying that Dennis Ritchie is responsible for some bad
code in the pathfinder project?
Linus Torvalds had no idea that the NSA would
take his code and use it for a security based platform.

Is there any evidence that the NSA chose his code because it was
not worth fooling with? What is your point? Oh, you're going
to tell us...
My point is
that programmers don't know what the liability of their code is,
because they are not always in control of when or where or for what it
might be used.

Wow, that is tortured at best. Presumably Ritchie is in your
list because of C or UNIX? How could he be 'liable' for an
application or driver written by somebody else 30 years later?

Are the contributors to gcc responsible for every bad piece of
software compiled with it?

If someone writes a denial-of-service attack program that sits
on a Linux host, is that Torvalds' fault? I've heard of people
trying to shift blame before, but not that far. Maybe you might
want to blame Linus' parents too, since if they hadn't conceived
him, Linux wouldn't be around for evil programmers to write code
upon. Furrfu.
The recent JPEG parsing buffer overflow exploit, for example, came from
failed sample code from the JPEG website itself. You think we should
hunt down Tom Lane and lynch him?

Nope. If you take sample code and don't investigate it fully
before putting it into production use, that's /your/ problem.
You think a doctor would take a sample of medicine he found
laying on a shelf in 7-11 and administer it to a patient in the
hopes that it would work? Downloading source off the web and
using it without reading and understanding it is similarly
irresponsible, although with perhaps less chance (although no
guarantee) of it killing someone.
You still don't get it. You, I or anyone you know, will produce errors
if pushed. There's no such thing as a 0 error rate for programming.

Then I do get it, because I agree with you. Let me know when I
can write a device driver in Python.
Just measuring first time compile error rates, myself, I score roughly
one syntax error per 300 lines of code. I take this as an indicator
for the likely number of hidden bugs I just don't know about in my
code. Unless my first-compile error rate was 0, I just can't have any
confidence that my hidden bug rate is 0 either.

Strange logic, or lack thereof. Having no first-compile errors
doesn't provide ANY confidence that you don't have hidden bugs.
Go measure your own first-compile error rate and tell me you are
confident in your own ability to avoid hidden bugs.

That would be pointless, since measuring first-compile error
rate proves zilch about overall bug rates. If you want to avoid
hidden bugs, you have to actively look for them, test for them,
and code explicitly to avoid them, regardless of how often your
compiler detects a problem.
If you still think you can achieve a 0 or near 0 hidden bug rate,
[snip, no sense following a false premise]
For a nuclear reactor, I would also include the requirement that they
use a safer programming language like Ada. Personally I would be
shocked to know that *ANY* nuclear reactor control mechanism was
written in C. Maybe a low level I/O driver library, that was
thoroughly vetted (because you probably can't do that in Ada), but
that's it.

Well gee, there you have it. It seems that there are some
places where C is almost unavoidable. What a shock. Who's
wearing those rose-colored glasses now?
[...] For
example, there are operations that have very low success rates,
yet there are doctors that specialize in them anyway, despite
the low odds.

Well, your analogy only makes some sense if you are talking about
surgeons in developing countries who simply don't have access to the
necessary anesthetic, support staff or even the proper education to do
the operation correctly. In those cases, there is little choice, so
you make do with what you have. But obviously it's a situation you just
want to move away from -- the way you solve it is to give them
access to safer and better ways to practice medicine.

You seem to ignore the /fact/ that even in the finest medical
facilities on the planet (argue where they are elsewhere) there
are medical operations that have very low success rates, yet
they are still attempted, usually because the alternative is
certain death. A 20% chance is better than zero.
So you want some people to stay away from C because the language is too
dangerous.

So are chainsaws, but I don't want chainsaws to be illegal, they
come in handy. So are steak knives, and despite them being illegal
on airplanes, leaving us stuck with plastic 'sporks' instead, I still
like them when cutting into a t-bone. You cannot eliminate all
risk.

Do you really think you can do anything to a language that
allows you to touch hardware that will prevent people from
misusing it? Not all development work is for use inside a VM or
other sandbox.
While I want the language be fixed so that most people
don't trigger the landmines in the language so easily.

I am not opposed to the language removing provably faulty
interfaces, but I do not want its capabilities removed in other
ways. Even so, there is no likelihood of any short-term
benefits, due to the propagation delay of standard changes into
compilers, and no proof that it will even be beneficial
longer-term.

It would probably be a better idea for you to finish your
completely new "better C compiler" (keeping to your string
library naming) and make it so popular that C withers on the
vine. It's been so successful for you already, replacing all
those evil null-terminated strings all over the globe, I quiver
in anticipation of your next earth-shattering achievement.
 

Randy Howard

(e-mail address removed) wrote
(in article
You are right, I cannot blame C for bugs that happen in other
languages. This is the most famous one from Ada.

You just got done telling me that Ada would avoid problems.
See, the thing is, with Ada bugs, you can clearly blame the programmer
for most kinds of failures.

Oh my, SURELY the Ada standard should not allow such things to
happen. Those thoughtless bastards, how could this be? ;-)
The programmer used priority based threading because that's what he had
available to him.

He used something that does not even exist in standard C, and
got bit in the ass. Gee, and to think that you want to hold the
standard committee (and judging by another post of yours Ritchie
himself) responsible when people do things like this. Wow.
Let's read on and see what sort of hole you choose to dig...
Suppose, however, that C had implemented co-routines

Suppose that you hadn't blamed standard C for something not
written in standard C. That one had a much higher chance of
being true until just recently.
Maybe the Pathfinder code would have more
coroutines, and fewer threads, and may have avoided the problem
altogether (I am not privy to their source, so I really don't know).

That didn't stop you from blaming it on standard C, why stop
now?
Coroutines are one of those "perfect compromises", because you can
easily specify a portable interface that is very likely to be widely
supportable, they are actually tremendously faster than threading in
many cases, and all without adding *any* undefined or
implementation-defined behavior scenarios (other than a potential
inability to allocate new stacks.)

How strange that they are so wildly popular, whereas threads are
never used. *cough*
Full-blown multithreading, such as
in POSIX, is notoriously platform-specific, and it should not surprise
anyone that only a few non-UNIX platforms support full-blown POSIX
threads.

That's interesting, because I have used the pthreads interfaces
for code on Windows (pthreads-win32), Linux, OS X, solaris, and
even Novell NetWare (libc, since they started supporting them
several years ago). I didn't realize they didn't work, because
for some strange reason, they do work for me. Maybe I'm just
lucky, or maybe you're too fond of spouting off about things you
have 'heard' but don't actually know to be true.

Have there been bugs in pthread libraries? Yes. Have there
been bugs in almost every library ever used in software
development? Yes. Were they impossible to fix? No.
This fact has been noticed and adopted by those languages
where serious development is happening (Lua, Perl, Python). I don't
know if the C standards committee would be open to this -- I highly
doubt it.

Feel free to propose a complete coroutine implementation.
 

Antoine Leca

Antoine Leca wrote:

Even then it's problematic, because the search would not respect
alignment with boundaries between character encodings.

Good point, you are quite right, and this is an often-overlooked problem.
It will only work with self-synchronizing encodings (UTF-8 comes to mind,
but the only others I know of are using SS2/SS3, the single shifts,
_without_ using LSx/SI/SO, the locking shifts, and they are NOT very common
;-)).
Quite a narrow application for a general library function.


Antoine
 

Antoine Leca

En said:
Remember that almost every virus, buffer overflow exploit, core
dump/GPF/etc is basically due to some undefined situation in the ANSI
C standard.

<OT>
The worst exploit I've seen so far was because a library dealing with
Unicode was not checking for malformed, overlong UTF-8 sequences, and
allowed walking through the filesystem, including in places where webusers
are not supposed to go. AFAIK, the library is written in C++ (it could
equally have been written in C; that won't change the point.)
And the exploit was successful because some key directories had bad default
permissions as factory setup.

Another quite successful one was based on a broken API for address books;
the API can be accessed from (not strictly conforming) C code, but that is
not how it is usually used. And the way the API is accessed through C
purposely avoids possible buffer overflows.

The most sticky virus I had to deal with was a bootsector virus. PC
bootsectors are not known to be written in C, rather in assembly language.


Granted, all these behaviours are _not_ defined by the ANSI C standard.
</OT>

Just because C is very much used means that statistically it will show up
more often in exploit or core dump or GPF cases. This only shows it is a
successful language; there might be reasons for that; in fact, the ANSI C
Standard is a big reason for its prolonged life as a successful (= widely
used) language: I mean, had it not happened, C would probably be superseded
nowadays (same for Fortran IV then F77; or x86/PC.)


Antoine
 

Michael Wojcik

Michael Wojcik wrote:



You have no idea whether it represents a consensus or not.

Yes I do (and it doesn't). It's been debated here and in other groups
many times, and it's clear to anyone reviewing those debates that no
consensus exists.
A "consensus" is not necessarily complete unanimity.

Thank you, but I am perfectly aware of that. Had I meant unanimity,
I would have written "unanimity".

--
Michael Wojcik (e-mail address removed)

Unlikely prediction o' the day:
Eventually, every programmer will have to write a Java or distributed
object program.
-- Orfali and Harkey, _Client / Server Programming with Java and CORBA_
 

Default User

Chris said:
[off-topic drift, but I cannot resist...]

You feel that my choice of moniker reflects something about my
level of expertise? Note that "Default User" is NOT the default
name in XanaNews, my current newsreader.

I always figured it meant that you are known for requiring a
"default:" label in every switch(). :)

Or that I just "use" a lot of them.



Brian
 

Default User

Joe said:
Default User wrote:
Nonsense. Among the several header lines of your message are..

Is it?
From: "Default User" <[email protected]>
Newsgroups: comp.lang.c
Subject: Re: Why doesn't strrstr() exist?
Date: 25 Aug 2005 18:11:29 GMT
Lines: 15
Message-ID: <[email protected]>
User-Agent: XanaNews/1.16.3.1

What do you think this says that contradicts my point? The header
reflects the current settings of my newsreader. That doesn't mean they
were the default settings.



Brian
 

Douglas A. Gwyn

Remember that almost every virus, buffer overflow exploit, core
dump/GPF/etc is basically due to some undefined situation in the ANSI C
standard.

That's misplaced blame. I use the same standard and don't have
such problems.
 

websnarf

Antoine said:
Paul Hsieh wrote:

<OT>
The worst exploit I've seen so far was because a library dealing with
Unicode was not checking for malformed, overlong UTF-8 sequences, and
allowed walking through the filesystem, including in places where webusers
are not supposed to go. AFAIK, the library is written in C++ (it could
equally have been written in C; that won't change the point.)
And the exploit was successful because some key directories had bad default
permissions as factory setup.

This is the worst? Are you sure silent zombification of your machine
isn't worse?

In any event, compare this to Java, where Unicode is actually the
standard encoding for string data. It's not really possible to have
"unicode parsing problems" in Java, since all this stuff has been
specified in the core of the language. Compare this to ANSI C, which
uses wchar, which literally doesn't *specify* anything useful. So
technically the only reason one is writing Unicode parsers in C is
because the standard doesn't give you one.
Another quite successful one was based on a broken API for address books;
the API can be accessed from (not strictly conforming) C code, but that is
not how it is usually used. And the way the API is accessed through C
purposely avoids possible buffer overflows.

The most sticky virus I had to deal with was a bootsector virus. PC
bootsectors are not known to be written in C, rather in assembly language.

So you've been in a time machine and just recently joined us in the
next millennium? Bootsector viruses are so 80s. Boot to a DOS disk and
type "fdisk /mbr" and usually you are set.
Granted, all these behaviours are _not_ defined by the ANSI C standard.
</OT>

Just because C is very much used means that statistically it will show up
more often in exploit or core dump or GPF cases.

Ok, so normalize the measures based on usages of the language. Do you
think C and C++ still won't be the worst by a country mile?
[...] This only shows it is a
successful language; there might be reasons for that; in fact, the ANSI C
Standard is a big reason for its prolonged life as a successful (= widely
used) language: I mean, had it not happened, C would probably be superseded
nowadays (same for Fortran IV then F77; or x86/PC.)

The ANSI *C89* standard is the reason for its long life and success. But
that ducks the point that it also is the fundamental source for these
problems, exploits, etc.
 

Old Wolf

In any event, compare this to Java, where Unicode is actually the
standard encoding for string data.

Unicode is a character set, not an encoding.
Its not really possible to have "unicode parsing problems" in Java,
since all this stuff has been specified in the core of the language.

AFAIK the language doesn't specify how to deal with Unicode
characters whose value is greater than 65,535.

Does it handle UTF-8, big-endian UCS-2, little-endian UCS-2,
b-e UTF16, l-e UTF16, and UCS-4 ? All of those occur in the
real world (unfortunately!)
 

websnarf

Randy said:
You just got done telling me that Ada would avoid problems.

It avoids buffer overflows, and other sorts of problems. It's not my
intention to defend Ada. Once again, I am not saying that either you
are with us or you are with the terrorists.

This was a point to demonstrate that programmers are not perfect, no
matter what you do. So this idea that you should just blame
programmers is just pointless.
Oh my, SURELY the Ada standard should not allow such things to
happen. Those thoughtless bastards, how could this be? ;-)


He used something that does not even exist in standard C, and
got bit in the ass. Gee, and to think that you want to hold the
standard committee (and judging by another post of yours Ritchie
himself) responsible when people do things like this.

Well, Ritchie, AFAIK, did not push for the standardization, or
recommend that everyone actually use C as a real application
development language. So I blame him for the very narrow problem of
making a language with lots of silly unnecessary problems, but not for
the fact that everyone decided to use it. The actual ANSI C committee
is different -- they knew exactly what role C was taking. They have
the ability to fix the warts in the language.
[...] Wow. Let's read on and see what sort of hole you choose to dig...

Or what words you will put in my mouth, or what false dichotomies you
will draw ...
Suppose that you hadn't blamed standard C for something not
written in standard C. That one had a much higher chance of
being true until just recently.


That didn't stop you from blaming it on standard C, why stop
now?

First of all, the standard doesn't *have* coroutines while other
languages do. And I never *did* blame the pathfinder bug on the C
standard. I see you have the CBFalconer disease of reading whatever
the hell you want from text that simply doesn't contain the content you
think it does.
How strange that they are so wildly popular, whereas threads are
never used. *cough*

Coroutines are not very widely *deployed*. So popularity is how you
judge the power and utility of a programming mechanism? Why don't you
try to add something substantive here rather than leading with ignorance?
Can you give a serious pro-con argument for full threads versus
coroutines? Because I can.
That's interesting, because I have used the pthreads interfaces
for code on Windows (pthreads-win32), Linux, OS X, solaris, and
even Novell NetWare (libc, since they started supporting them
several years ago).

You understand that those are all mostly UNIX, right? Even the Windows
thing is really an implementation or emulation of pthreads on top of
Windows multithreading. Show me pthreads in an RTOS.
[...] I didn't realize they didn't work, because
for some strange reason, they do work for me. Maybe I'm just
lucky, or maybe you're too fond of spouting off about things you
have 'heard' but don't actually know to be true.

Have there been bugs in pthread libraries? Yes. Have there
been bugs in almost every library ever used in software
development? Yes. Were they impossible to fix? No.

Right. And have they fixed the generic problem of race conditions?
Race conditions are just the multitasking equivalent of
buffer-overflows. Except, as you know, they are *much* harder to
debug, and you cannot use tools, compiler warnings or other simple
mechanisms to help you avoid them. This is the real benefit of
coroutines over full threading. You can't have race conditions using
coroutines.
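
For the record, even without language support you can fake the simplest
kind in portable C with the old switch-on-state trick (the protothreads
idea); the CO_* names here are just made up for illustration, and locals
do not survive a yield, which is exactly the wart a real language-level
coroutine facility would remove:

    #include <stdio.h>

    #define CO_BEGIN(state)  switch (state) { case 0:
    #define CO_YIELD(state)  do { (state) = __LINE__; return; \
                                  case __LINE__:; } while (0)
    #define CO_END           }

    /* yields 0, 1, 2, ... one value per call; state lives in the caller */
    static void counter(int *state, int *i, int *out)
    {
        CO_BEGIN(*state);
        for (*i = 0; ; (*i)++) {
            *out = *i;
            CO_YIELD(*state);
        }
        CO_END;
    }

    int main(void)
    {
        int state = 0, i = 0, out = 0, k;
        for (k = 0; k < 3; k++) {
            counter(&state, &i, &out);
            printf("%d\n", out);   /* prints 0, 1, 2 */
        }
        return 0;
    }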
Feel free to propose a complete coroutine implementation.

I would if I thought there was an audience for it. These things take
effort, and a brief perusal of comp.std.c leads me to believe that the
ANSI committee is extremely capricious.

Think about it. You want me to propose something actually useful,
powerful and which would improve the language to a committee that
continues to rubber stamp gets().

Is your real point that I am supposed to do this to waste my time and
energy, obviously get rejected because the ANSI C committee has no
interested in improving the language, and this will be proof that I am
wrong?

Tell me, when is the last time the C language committee considered a
change in the language that made it truly more powerful that wasn't
already implemented in many compilers as extensions? Can you give me
at least a plausibility argument that I wouldn't be wasting my time by
doing such a thing?
 

websnarf

Chris said:
(Again, quite off-topic, but ...)

[Ariane rocket example]

You are right, I cannot blame C for bugs that happen in other
languages. This is the most famous one from Ada. ...
See, the thing is, with Ada bugs, you can clearly blame the programmer
for most kinds of failures.

I am reminded of a line from a novel and movie:

"*We* fix the blame. *They* fix the problem. Their way's better."

So in this case, how do we "fix" the problem of buffer overflows in C
programs? Shall we teach every bad programmer how not to do it, and
zap them if they get it wrong (a la Randy Howard style)? Or do you
perform some C library modifications so that the problem is
substantially mitigated?
[Pathfinder example]
The programmer used priority based threading because that's what he had
available to him.

Actually, the Pathfinder used vxWorks, a system with which I am
now somewhat familiar. (Not that I know much about versions
predating 6.0, but this particular item has been this way "forever",
or long enough anyway.)

The vxWorks system offers "mutex semaphores" as one of its several
flavors of data-protection between threads. The mutex creation
call, semMCreate(), takes several flag parameters. One of these
flags controls "task" (thread, process, whatever moniker you prefer)
priority behavior when the task blocks on the mutex.

The programmer *chose* this behavior, because vxWorks does offer
priority inheritance. (Admittedly, vxWorks priority inheritance
has a flaw, but that is a different problem.)

Thus, your premise -- that the programmer used priority based
scheduling (without inheritance) that led to the priority inversion
problem "because that's what he had available" is incorrect: he
could have chosen to make all the threads the same priority, and/or
used priority inheritance, all with simple parameters to the various
calls (taskSpawn(), semMCreate(), and so on).
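
For concreteness, the choice looks something like this -- a sketch from
memory, so treat the exact option names as assumptions to be checked
against semLib.h:

    #include <vxWorks.h>
    #include <semLib.h>

    SEM_ID dataLock;

    void initLock(void)
    {
        /* priority-queued mutex, WITH priority inheritance enabled */
        dataLock = semMCreate(SEM_Q_PRIORITY | SEM_INVERSION_SAFE);
    }

    void touchSharedData(void)
    {
        semTake(dataLock, WAIT_FOREVER);
        /* ... update the shared structure ... */
        semGive(dataLock);
    }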

I'm confused as to how you know that the programmer who made the mistake
was in charge of what priority his task ran at. And in any event I
don't claim it was the programmer's (or designer's, or whoever's fault it
really was) *only* choice. It's just an example of a land mine being
there, and it being stepped on.

The reality is that vxWorks "solves" this problem and others by having a
very sophisticated built-in debugging environment. They just had to
look at the task list and see that the high priority task was blocked.
Coroutines are hardly perfect.

You, of course, missed the word that I typed immediately following the
word perfect.
[...] However, if you like them, I suggest you investigate the Icon
programming language, for instance.

I looked briefly at it. The impression I had is that it supported too
many modes of coroutines which made them unnecessarily complicated to
use (I may be thinking of another language.)

In any event, I have *studied* coroutines in university, and can
revisit them in far more mainstream languages like Python and Lua. My
understanding of them is not the point (I think I do understand them -- or
at least the "one-shot continuation" kind); my point is they should probably
be part of the C standard.
 

Chris Hills

Well, Ritchie, AFAIK, did not push for the standardization, or
recommend that everyone actually use C as a real application
development language. So I blame him for the very narrow problem of
making a language with lots of silly unnecessary problems, but not for
the fact that everyone decided to use it. The actual ANSI C committee
is different -- they knew exactly what role C was taking. They have
the ability to fix the warts in the language.


I would if I thought there was an audience for it. These things take
effort, and a brief perusal of comp.std.c leads me to believe that the
ANSI committee is extremely capricious.

It's NOT down to the ANSI committee..... it is down to WG14, an ISO
committee of which ANSI is but one part. Since 1990 C has been handled
by ISO as an international standard. There are committees from many
countries involved. ANSI gets one vote like all the rest, so don't blame
ANSI for all of it.
 

websnarf

Randy said:
Excuse me?

"False dichotomy". Look it up. I never mentioned high or low level
language, and don't consider it relevant to the discussion. It's a
false dichotomy because you immediately dismiss the possibility of a
safe low-level language.
If you have access to any sequence of opcodes available on the
target processor, how can it not be?

C gives you access to a sequence of opcodes in ways that other
languages do not? What exactly are you saying here? I don't
understand.
Do you feel that 'gets()' is part of the most natural path in C?

Yes of course! When people learn a new language they learn what it
*CAN* do before they learn what it should not do. It means anyone that
learns C first learns to use gets() before they learn not to use
gets().
Fair enough, but you're just dodging the underlying question.

I am dodging the false dichotomy. Yes. You are suggesting that making
C safer is equivalent to removing buffer overflows from assembly. The
two have nothing to do with each other.
For a suitably generous definition of 'many', perhaps.

Terse, HLA, Rosasm, LuxAsm -- this is all for *one* assembly language.
I would have been shocked if you had not figured out a way to
bring your package up. :)

Oh by the way there is a new version! It incorporates a new secure
non-data-leaking input function! Soon to reach 5000 downloads and
80000 webpage hits! Come join the string library revolution and visit:
http://bstring.sf.net/ to see all the tasty goodness!
Which does absolutely nothing to prevent the possibility of
developing insecure software in assembler. It may offer some
advantages for string handling, but that closes at best only one
of a thousand doors.

You mean it closes the most obvious and well trodden thousand doors out
of a million doors.

Assembly is not a real application development language no matter how
you slice it. So I would be loath to make any point about whether or
not you should expect applications to become safer because they are
written in assembly language using Bstrlib-like philosophies. But
maybe those guys would beg to differ -- who knows.

As I recall this was just a point about low level languages adopting
safer interfaces. Though in this case, the performance improvements
probably drive their interest in it.
[...] If you want to argue that too many people
write code in C when their skill level is more appropriate to a
language with more seatbelts, I won't disagree. The trick is
deciding who gets to make the rules.

But I'm not arguing that either. I am saying C is to a large degree
just capriciously and unnecessarily unsafe (and slow, and powerless,
and unportable etc., etc).

Slow? Yes, I keep forgetting how much better performance one
achieves when using Ruby or Python. Yeah, right.

I never put those languages up as alternatives for speed. The false
dichotomy yet again.
Powerless? How so?

No introspection capabilities. I cannot write truly general
autogenerated code from the preprocessor, so I don't get even the most
basic "fake introspection" that's should otherwise be so trivial to do.
No coroutines (Lua and Python have them) -- which truly closes doors
for certain kinds of programming (think parsers, simple incremental
chess program legal move generators, and so on). Multiple heaps with
a freeall(), so that you can write "garbage-collection style" programs,
without incurring the cost of garbage collection -- again there are
real applications where this kind of thing is *really* useful.
[...] It seems to be the only language other than
assembler which has been used successfully for operating system
development.

The power I am talking about is power to program. Not the power to
access the OS.
Unportable? You have got to be kidding. I must be
hallucinating when I see my C source compiled and executing on
Windows, Linux, NetWare, OS X, Solaris, *bsd, and a host of
other UNIX-like platforms, on x86, x86-64, PPC, Sparc, etc.

Right. Because you write every piece of C code that's ever been
written, right?
That is all true, and it does nothing to address the point that
C is still going to be used for a lot of development work. The
cost of the runtime error handling is nonzero. Sure, there are
a lot of applications today where they do not need the raw speed
and can afford to use something else. That is not always the
case. People are still writing a lot of inline assembly even
when approaching 4GHz clock speeds.

Ok, first of all runtime error handling is not the only path. In fact,
I don't recommend that as your sole approach. You always want error
detection to happen as early in the development process as possible,
and that means bringing errors to compile time. In this case, the most
obvious solution is to have better and safer APIs.

Second of all, remember, I *BEAT* the performance of C's strings across
the board on multiple platforms with a combination of run time and API
design in Bstrlib. It is a false idea that error checking always
costs performance. Performance is about design, not what you do about
safety.
That would be nice.


I don't immediately see how this will be demonstrably faster,
but you are free to invent such a language tomorrow afternoon.

Well, just as a demonstration candidate, we could take the C standard, add
in Bstrlib, remove the C string functions listed in the bsafe.c module,
remove gets and you are done (actually you could just remove the C
string functions listed as redundant in the documentation). Of course
there are many other simple changes, like abstracted heaps that include
a freeall() function, which I have demonstrated can also lead to
enormous performance improvements. This would immediately make the
language technically safer and faster.
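
To show what I mean by the freeall() part, here is a toy sketch -- my own
names, not anything from a standard or from Bstrlib, and a real version
would carve allocations out of large chunks and pad the header to the
strictest alignment, which is where the speed comes from:

    #include <stdlib.h>

    struct arena_block { struct arena_block *next; };
    struct arena       { struct arena_block *head; };

    void *arena_alloc(struct arena *a, size_t n)
    {
        struct arena_block *b = malloc(sizeof *b + n);
        if (b == NULL) return NULL;
        b->next = a->head;           /* chain the block into this heap */
        a->head = b;
        return b + 1;                /* memory just past the block header */
    }

    void arena_freeall(struct arena *a)  /* one call releases everything */
    {
        while (a->head != NULL) {
            struct arena_block *b = a->head;
            a->head = b->next;
            free(b);
        }
    }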
Do it, back up your claims, and no doubt the world will beat a
path to your website. Right?

Uhh ... actually no. People like my Bstrlib because it's *safe* and
*powerful*. They tend not to notice or realize they are getting a
major performance boost for free as well (they *would* notice if it was
slower, of course). But my optimization and low level web pages
actually do have quite a bit of traffic -- a lot more than my pages
critical of Apple or Microsoft, for example.

It's not hard to beat compiler performance, even based fundamentally on
weakness in the standard (I have a web page practically dedicated to
doing just that; it also gets a lot of traffic). But by itself, that's
insufficient to gain enough interest in building a language for
everyday use that people would be interested in.
[...] "D" is already taken, what will you call it?

How about "C"?
It isn't possible. What is possible is for you to make gross
assumptions about what 'my problem' is based up the post you are
replying to here. I do not assume that C can not be made safer.
What I said, since you seem to have missed it, is that the
authors of the C standard are not responsible for programmer
bugs.

Ok, well then we have an honest point of disagreement. I firmly
believe that the current scourge of bugs that lead to CERT advisories
will not ever be solved unless people abandon the current C and C++
languages. I think there is great consensus on this. The reason why I
blame the ANSI C committee is because, although they are active, they
are completely blind to this problem, and haven't given one iota of
consideration to it. Even though they clearly are in the *best*
position to do something about it. And it's them and only them -- the
only alternative is to abandon C (and C++), which is a very painful and
expensive solution; but you can see that people are doing exactly that.
Not a lot of Java in those CERT advisories.
No, I am not. I do not demand that doctors are perfect, but I
expect them to be highly motivated to attempt to be perfect.

Ok, you demand that they *try* to be perfect. I'm not advocating that
the language be perfect or *try* to be perfect. I only want it not to be
thoroughly incompetent.
What cost? Some 'world-wide rolled-up cost'? For me, it cost
me almost nothing at all. I first discovered gets() was
problematic at least a decade ago, probably even earlier, but I
don't keep notes on such things. It hasn't cost me anything
since.

And so are you saying it didn't cost you anything when you first
learned it? And that it won't cost the next generation of programmers,
or anyone else who learns C for the first time?
[...] If I hire a programmer, this has all been settled to my
satisfaction before they get an offer letter. It hasn't been a
problem and I do not expect it to be one in the future.

But the cost is there. So the cost is ongoing.
They do not. As we have already seen, it takes years, if not
decades for a compiler supporting a standard to land in
programmer hands. With the stunningly poor adoption of C99, we
could not possibly hope to own or obtain an open source C0x
compiler prior to 2020-something, if ever. In the mean time,
those that are serious solved the problem years ago.

C99 is not being adopted because there is no *demand* from the users or
development houses for it. If the standard had been less dramatic,
and solved more real world problems, like safety, for example, I am
sure that this would not be the case. You also ignore the fact that
the C++ folks typically pick up the changes in the C standard for their
own. So the effect of the standard actually *is* eventually
propagated.

The fact that it would take a long time for a gets() removal in the
standard to be propagated to compilers, I do not find to be a credible
argument.

Also note that C89 had very fast adoption. It took a long time for
near-perfect and pervasive adoption, but you had most vendors more than
90% of the way there within a very few years.
Of course I do. In fact, I say so, which you conveniently
quoted just below...

A compiler error telling the user that it's wrong (for new platform
compilers) is the best and simplest way to do this.
Didn't /you/ just say that they should be removed?

I am saying the process of manual removal, hoping that your programmers
are disciplined enough to do it, is not necessarily going to happen in
practice.
Make up your mind. Fixing them in the compiler, as I would
expect an 'automated' solution to do, and rejecting the
offending lines are completely different approaches.


Great idea. 15 years from now that will have some value.

Uh ... but you see that it's still better than nothing, right? You think
programming will suddenly stop in 15 years? Do you think there will be
fewer programmers *after* this 15-year mark than there have been before
it?

Or, like me, do you think C will just become COBOL in 15 years?
A better idea. Patch gcc to bitch about them TODAY, regardless
of the standard.

The GNU linker already does this. But it's perceived as
a warning. People do not always listen to warnings.
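(On a glibc-based system the link-time message is, give or take the exact
wording of your binutils/glibc version, something like:

    warning: the `gets' function is dangerous and should not be used.

and the build still succeeds.)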
What part of 'people can still fire up an old compiler' did you
fail to read and/or understand?

Use of old compilers is not the problem. The piles of CERT advisories
and news stories about exploits are generally directed at systems that
are constantly being updated with well supported compilers.
They could also simply claim "we are smarter than the average
bear, and we know better than to use any of the following offensive
legacy functions, such as gets(), ..."

But nobody would believe your claim. My claim could be audited, and a
company would actually worry about being sued for making a false claim
of the sort I am advocating unless it were true.
To clarify, since it didn't soak in the first time, I am not
opposed to them being removed. I simply don't see this as a magic
bullet, and certainly not in the sense that it takes far too
long for the compilers to catch up with it. I would much rather
see compilers modified to deny gets() and its ilk by default,
and require a special command line option to bypass it, /if at
all/. However, the warning message should be far more useful
than
gets.c: 325: error: gets() has been deprecated.

Did I misspeak and ask for deprecation? Or are you misrepresenting my
position as usual? I'm pretty sure I explicitly said "non-redefinable
in the preprocessor and always leads to an error" to specifically
prevent people from working around its removal.
That's just oh so useful, especially to newbies. I wouldn't
care if it dumped a page and a half of explanation, along with a
detailed example of how to replace such calls with something
safer. After all, good code doesn't have it in them anyway, and
it won't annoy anyone that is competent.

Well, then our positions are not so different, since my solution would
cause the developer to go to the manuals, which would hopefully explain
the situation in the way you would expect.
If, and only if, they use a compiler with such changes. We
still see posts on a regular basis with people using old 16-bit
Borland compilers to write new software.

.... And you think there will be lots of CERT advisories on such
products? Perhaps you could point me to a few examples of such
advisories which are new, but which use old compilers such as Borland
C.

We can't do anything about legacy compilers -- and we don't *NEED TO*.
That's not the point. The "software crisis" is directed at development
that usually uses fairly well maintained compilers.
I don't think you understood me. I know of no company that has
a policy for this. However, if I was working on something and
felt that something was being done that could be inherently
dangerous, and it was going to ship anyway, I would take some
form of legal action, if for no other reason than to be able to
disassociate myself from the impending lawsuits.

Ok ... that's interesting, but this is ridiculous. As I said above,
you do not write every piece of software in the world. And we are well
aware of about 10,000 programmers living in the Pacific Northwest who
we know do *NOT* share your attitude.

And your defence of the situation is that you assume every gainfully
employed programmer should be willing to quit the moment they see that
their process of programming is not likely to yield the highest
possible quality in software engineering.
I would much rather go look for work than participate in
something that might wind up with people dying over the actions
of some meddling manager.

That's nice for you. That's not going to be a choice for lots of other
people.
[...] If you
are being overworked, you can either keep doing it, or you can
quit, or you can convince your boss to lighten up.

Hmmm ... so you live in India?

Why would you think so?

Wild guess.
Where do you live? Because I am trying to guess where on the
planet you would /not/ have the right to quit your job.
Indentured servitude is not widely practiced anymore, AFAIK.

That isn't what I am saying. People's ability to quit or work at will
is often not related to things like programming philosophy or idealism
about their job. And software is and always will be created by
developers who have considerations other than the process of creating
perfect software.
[...] ESPECIALLY in this case, the C standard folks are not to blame.

But if the same issue happens and you are using a safer language, the
same kinds of issues don't come up. Your code might be wrong, but it
won't allow buffer overflow exploits.

You can have 10 dozen other forms of security failure, that have
nothing to do with buffer overflows.

I implore you -- read the CERT advisories. Buffer overflows are #1 by
a LARGE margin. It's gotten to the point where it's so embarrassing to
Microsoft that they now try to disguise the fact that they have buffer
overflows through convoluted language. (You can still figure it out
though, when they say things like "hostile input leading to running of
arbitrary code ...")
[...] It isn't a panacea. When one form of attack is removed, another
one shows up.

If you remove buffer overflows, it doesn't mean that other kinds of
bugs will suddenly increase in absolute occurrence. Unless you've got
your head in the sand, you've got to know that *SPECIFICALLY* buffer
overflows are *BY THEMSELVES* the biggest and most solvable, and
therefore most important safety problem in programming.
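
To make that concrete, the bug class in question almost always boils
down to a pattern like this deliberately broken sketch (the function
and input here are made up for illustration):

  /* Deliberately unsafe sketch of the classic stack buffer overflow:
   * strcpy() keeps writing past the end of 'name' whenever the input is
   * longer than 15 characters plus the terminator, clobbering whatever
   * the compiler placed after the array. */
  #include <stdio.h>
  #include <string.h>

  static void greet(const char *attacker_controlled)
  {
      char name[16];
      strcpy(name, attacker_controlled);   /* no length check: the bug */
      printf("hello, %s\n", name);
  }

  int main(void)
  {
      greet("this input is considerably longer than sixteen characters");
      return 0;
  }

A bounded copy (or a length-carrying string type) removes that entire
class of bug, which is exactly the "most solvable" part of the claim.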
For example, the last straw the sent Microsoft windows off my
network for eternity happened recently. A computer system
running XP, SP2, all the patches, automatic Windows updates
daily, virus software with automatic updates and real-time
protection, email-virus scanning software, two different brands
of spyware protection, also with automatic updates enabled, and
both a hardware firewall and software firewall installed, got
covered up in viruses after 2 hours of letting my kids use it to
go play some stupid online kids game on disney.com or
nickelodeon.com (not sure which, since they went to both, and I
didn't want to replicate it). Suddenly, when I come back to
look at it, it has 3 or 4 new taskbar icons showing downloads in
progress of I know not what, task manager shows a bunch of extra
processes that shouldn't be there, the registry run keys are
stuffed full of malware, and it's pushing stuff out the network
of I know not what. I pull the cable, start trying to delete
files, which Windows wants to tell me I don't have permission to
do, scanning, the browser cache directories are filled with .exe
and .dll files, it's out of control.

A few expletives later, and I was installing a new Linux distro
that I had been meaning to try out for a while.

I had done just about everything I could imagine to lock the
system down, and it still got out of control in 2 hours letting
a 12-yr-old browse a website and play some games.

Of course, if enough people do the same thing, the bad guys will
figure out how to do this on Linux boxes as well. But for now,
the OS X and Linux systems have been causing me (and the kids)
zero pain and I'm loving it.

I'm not sure how this is an argument that Buffer Overflows aren't the
worst safety problem in programming by a large margin.

None of those problems actually have anything to do with programmer
abilities, or language capabilities. They have to do with corporate
direction, mismanagement, and incompetent program architecture. That's
a completely separate issue.
[... analogy taken too far, as usual, snipped ...]
Dennis Ritchie had no idea that NASA would put a priority inversion in
their pathfinder code.

Are you implying that Dennis Ritchie is responsible for some bad
code in the pathfinder project?

Uh ... no *you* are. My point was that he *COULDN'T* be.

Just like I can't be responsible if some bank used an old version of
Bstrlib to input passwords, not realizing that longer passwords might be
leaked back to the heap, and some other flaw in their program which
exposed the heap caused some passwords to become visible.

Sometimes you are *NOT AWARE* of your liability, and you don't *KNOW*
the situations where your software might be used.
Is there any evidence that the NSA chose his code because it was
not worth fooling with?

What? They *DID* fool with his code -- they created something called
"Security Enhanced Linux" and suggested people look at it. As I
recall, the changes were a little too drastic, but there are
alternatives that people have been working on that provide similar
functionality that the main brain *is* adopting (and you cannot deny
this was motivated by the NSA's little project, which by itself has
some usage).

So the question is, does Linus himself become liable for the potential
security flaws or failures that such "security enhancements" might not
deliver? (Keeping in mind that Linus still does personally accept or
reject the changes proposed to the Linux kernel.)
[...] What is your point? Oh, you're going to tell us...
My point is
that programmers don't know what the liability of their code is,
because they are not always in control of when or where or for what it
might be used.

Wow, that is tortured at best. Presumably Ritchie is in your
list because of C or UNIX? How could he be 'liable' for an
application or driver written by somebody else 30 years later?

That was *my* point. Remember, you are claiming that you want to pin
responsibility and liability for code to people so that you can dish
out punishment to them. I see a direct line of responsibility from
weakness in the C library back to him (or maybe it was Thompson or
Kernighan). And remember, you want to punish people.
Are the contributors to gcc responsible for every bad piece of
software compiled with it?

Well no, but you can argue that they are responsible for the bugs they
introduce into their compilers. I've certainly stepped on a few of
them myself, for example. So if a bug in my software came down to a
bug in their compiler, do you punish me for not being aware of the bug,
or them for putting the bug in there in the first place?
If someone writes a denial-of-service attack program that sits
on a Linux host, is that Torvalds' fault? I've heard of people
trying to shift blame before, but not that far. Maybe you might
want to blame Linus' parents too, since if they hadn't conceived
him, Linux wouldn't be around for evil programmers to write code
upon. Furrfu.

Steve Gibson famously railed on Microsoft for enabling "raw sockets" in
Windows XP. This allows for easy DDOS attacks, once the machines have
been zombified. Microsoft marketing, just like you, of course
dismissed any possibility that they should accept any blame whatsoever.

With the latest service pack, the engineers took control (and
responsibility) and turned off raw sockets by default in Windows XP.
There *IS* a liability chain, and yes it *DOES* reach back that far,
even if marketing people try to convince you otherwise.
Nope. If you take sample code and don't investigate it fully
before putting it into production use, that's /your/ problem.

Oh I see. So you just want to punish IBM, Microsoft, Unisys, JASC
Software, Adobe, Apple, ... etc. NOBODY caught the bug for about *10
years*, dude. Everyone was using that sample code, including *myself*.
And it's quite likely it just traces back to Tom Lane, or someone that
was working with him.
[... more appeals to analogy removed ...]
You still don't get it. You, I or anyone you know, will produce errors
if pushed. There's no such thing as a 0 error rate for programming.

Then I do get it, because I agree with you. Let me know when I
can write a device driver in Python.

False dichotomy ...
Strange logic, or lack thereof. Having no first-compile errors
doesn't provide ANY confidence that you don't have hidden bugs.

Speaking of lack of logic ... it's the *REVERSE* that I am talking
about. It's because I *don't* have a 0 first-compile error rate that I
feel that my hidden error rate can't possibly be 0.
That would be pointless, since measuring first-compile error
rate proves zilch about overall bug rates. If you want to avoid
hidden bugs, you have to actively look for them, test for them,
and code explicitly to avoid them, regardless of how often your
compiler detects a problem.

You miss my argument. First-compile error rates are not a big deal --
the compiler catches them, you fix them. But they are indicative of
natural blind spots. The same thing must be true, to some degree or
another, of bugs which don't lead to compiler errors.

Testing and structured walkthroughs/inspections are just imperfect
processes for trying to find hidden bugs. Sure they reduce them, but
you can't believe that they would get all of them -- they don't! So in
the end you are still left with *some* bug rate. So write enough code
and you will produce an arbitrary number of hidden bugs.
Well gee, there you have it. It seems that there are some
places where C is almost unavoidable. What a shock. Who's
wearing those rose-colored glasses now?

None of those sentences have any connection to each other.

First of all, you missed the "maybe" in there. Assembly would be an
equally good choice, or enhanced versions of HLL compilers.
[... more analogy snipped ...]
Do you really think you can do anything to a language that
allows you to touch hardware that will prevent people from
misusing it?

When did I suggest or imply this?
[...] Not all development work is for use inside a VM or
other sandbox.

Again putting words in my mouth.
I am not opposed to the language removing provably faulty
interfaces, but I do not want its capabilities removed in other
ways. Even so, there is no likelihood of any short-term
benefits, due to the propagation delay of standard changes into
compilers, and no proof that it will even be beneficial
longer-term.

It would probably be a better idea for you to finish your
completely new "better C compiler" (keeping to your string
library naming) and make it so popular that C withers on the
vine.

When did I suggest that I was doing such a thing? Can you find the
relevant quote?
[...] It's been so successful for you already, replacing all
those evil null-terminated strings all over the globe, I quiver
in anticipation of your next earth-shattering achievement.

Actually, my strings are also NUL terminated. That's why people who
use it like it -- it's truly a no-lose scenario. You really have to try
using it to understand it. If my library isn't *more* popular, it's
probably just because I don't know how to advertise it. Or maybe it's
just one of those things that are hard to get people excited about. I
haven't received any negative feedback from anyone who's actually used
it -- just suggestions for improvements.
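
For anyone who has not looked at the library, a minimal sketch of
typical usage (assuming the core calls bfromcstr(), bcatcstr(),
blength(), bdata() and bdestroy()) looks like this; the bstring carries
an explicit length, yet bdata() still hands back an ordinary
NUL-terminated char * for existing C APIs:

  /* Minimal sketch, assuming the core Bstrlib calls bfromcstr(),
   * bcatcstr(), blength(), bdata() and bdestroy().  The bstring carries
   * an explicit length, so concatenation cannot overflow, yet bdata()
   * still returns an ordinary NUL-terminated char * for legacy APIs. */
  #include <stdio.h>
  #include "bstrlib.h"

  int main(void)
  {
      bstring b = bfromcstr("hello");
      bcatcstr(b, ", world");                      /* grows the buffer as needed */
      printf("%s (%d chars)\n", bdata(b), blength(b));
      bdestroy(b);
      return 0;
  }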
 
