Writing "absolutely" portable code

  • Thread starter ArifulHossain tuhin

Rui Maciel

ArifulHossain said:
Because of the battles I had with configure scripts. Sometimes it just
feels like it could be done rather easily with hand-tuned makefiles.

Yes, tiny projects which don't require extensive configuration checks tend
to be easier to maintain by writing a makefile and even test scripts
ourselves.

Nonetheless, a project only requires an automated build system if its size
makes it impractical to manage custom makefiles and scripts by hand.

One example: there was a small project I was involved in, which was
modifying a tiny "media relay" program. The project had only 10-15 C files,
and we needed to add some of our own. Because of its hideous build system,
and because it was not necessary to maintain this small project with
autotools, we ended up writing custom makefiles for the project.

That is strange. Were those files C source code files? And did anyone on
that team have any experience with the GNU build system, or was everyone
cutting their teeth on it?


Rui Maciel
 

Rui Maciel

ec429 wrote:

Don't rely on the sizes of integer types;
if you need an N-bit int, use intN_t.

I would go a bit further than that. It would be a good idea to define type
names for primitive data types in a separate header (config.h), either
through macro definitions or typedefs, and then set/update that definition
based on checks performed by the configuration script.

The GNU build system (i.e., autotools) was developed basically with this
in mind, through the use of a config.h.in file, and CMake supports this as
well.
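
A minimal sketch of what such a header might contain (the type names here
are hypothetical, not from any particular project):

/* config.h -- sketch of a configuration-selected header. With the
   GNU build system this file would be generated from a config.h.in
   template by the configure script. */
#ifndef CONFIG_H
#define CONFIG_H

#include <stdint.h>

/* Project-wide aliases for primitive types, chosen in one place based
   on configure-time checks rather than on scattered assumptions. */
typedef int32_t  proj_int32;
typedef uint64_t proj_uint64;

/* Feature macros the configure script would define or leave unset. */
/* #define HAVE_STRNDUP 1 */

#endif /* CONFIG_H */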

Hand-make a configure script that
tests for the things you need (by compiling and running tiny C programs)
and outputs that information into a file that gets included by your
Makefile (and defines some variable to hold some -D macro definitions
for the cpp).
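
(For concreteness, the kind of probe being described might look like the
following hypothetical check for strndup; if it compiles and links, the
script would emit -DHAVE_STRNDUP into the Makefile's macro definitions:)

#define _POSIX_C_SOURCE 200809L
#include <string.h>

/* probe_strndup.c -- tiny test program a hand-rolled configure script
   could try to compile and link; success means the C library provides
   strndup. */
int main(void)
{
    char *p = strndup("hello", 3);
    return p == 0;
}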

I don't believe this suggestion is very good. Any flexibility which might
be gained by developing our own shell scripts may require a significant
amount of maintenance and research once we need to support more than one
platform.
There were good reasons why smart people decided to invest their time
developing automated build systems in order to avoid hand-making configure
scripts.

Hand-roll your makefile too, using implicit rules (%, $<
and $@ are your friends), but if some targets don't support GNU make,
you may need to write a tool to generate an explicit makefile from your
implicit rules (such a tool would be easier to write in, say, perl, than
in C).

Again, this is the same problem I've previously pointed out. If someone
intends to avoid automated build systems when developing portable projects,
and instead opts to rely on all sorts of custom hand-written scripts, they
are digging themselves into a hole that may take a lot of work to escape.

Another aspect which is rarely taken into account by those who decide to
rely on a set of hand-crafted scripts to automate their build process is
familiarity. There is a wealth of information and discussion groups
dedicated to specific automated build systems, and a considerable number of
developers already have experience in writing and managing them. It is also
terribly easy for any developer who intends to use one to simply read up on
it and, in a matter of minutes, get it up and running. Therefore, even a
clueless newbie can read up on a standard, popular build system and
implement basic things, such as adding files to a project, without much
trouble.

The same doesn't necessarily apply to hand-crafted scripts. Once a project
grows past some point, its hand-crafted scripts tend to become, at least to
those who didn't develop them, a violent mix of arcane enchantments and
textual noise, which is practically impossible to understand, let alone
maintain and update. As those scripts represent a one-off attempt at
writing a build script, there won't be much help available to those who
intend to decipher them. So, although a specific mix of "let's hand-craft
everything" may serve a specific developer well on that specific occasion,
it may undermine anyone else's ability to contribute. And that is a
problem.


On the more general note of avoiding Autotools, I would note that all my
open-source projects (there are several) use hand-written Makefiles,
none has a configure script, and I have yet to encounter anything more
severe than a few #ifdef WINDOWS to handle cross-compilation to that
platform (and separate make rules for Windows targets). With a
cross-compiling gcc, even Cygwin is not necessary (although it would be
if you wanted to /compile/ on Windows).
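
(A sketch of the kind of conditional being described, with illustrative,
hypothetical details:)

/* Hypothetical example of an #ifdef WINDOWS portability block. */
#ifdef WINDOWS
#include <windows.h>
#define PATH_SEPARATOR '\\'
#else
#include <unistd.h>
#define PATH_SEPARATOR '/'
#endif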

Like any tool ever developed, there is a threshold that marks the start of
its usefulness. Or, better yet, the point where applying that tool makes
sense.

Considering build automation tools, their usefulness only kicks in when
they actually help manage the build process. In some tiny, one-off
projects, even writing a simple makefile ends up being too much work for
little to no reward.

So, just because some projects can be adequately managed by a set of hand-
crafted makefiles and shell scripts, it doesn't mean that build automation
tools are bad and should be avoided. They do have their purpose and they do
help a lot. You only need to understand where and when it makes sense to
rely on them, and in what cases it makes more sense to rely on other means.


Rui Maciel
 

Ben Bacarisse

superpollo said:
Eric Sosman wrote:

like what? the bike or the wheels?

How could it be like the bike? What would play the role of the training
wheels? What is it that is holding autoconf back and which needs to be
removed for it to achieve its full potential?

In my book, analogies are good when you are trying to explain something
complex (the universe is like a soap bubble...) but not very useful when
they are employed to say something simple (you'll be better off without
autoconf holding you back).
 

jacob navia

On 09/01/12 14:44, Ben Bacarisse wrote:
Some typos just get to the heart of the matter in a way that no other
writing can.

Look Ben, I am agnostic ok? :)

But Apple's machines are *nice* to use. I would love for linux machines
to be like that, but no, they never are.

For instance,

$ man gcc

works, you do NOT get a warning telling you that GNU doesn't like man
pages. The documentation is entirely rewritten.

And then ALL the details: makefiles do not require tabs and will accept
tabs or spaces... and you know what impressed me the most?

$ cat foo.c
#include <math.h>
#include <stdio.h>

int main(int argc,char *argv[]) { printf("%g\n",sin(argc)); }

$ gcc foo.c
$ ./a.out 1
0.909297
$

NO NEED FOR -lm!!!!!!! The libraries are included BY DEFAULT!!!!!!!!

How many times have we discussed that here?
And I was told by the linux people that "THAT IS THE WAY IT IS..."

Apple did it.

Linux people are very conservative, and their version of Unix is frozen
around 1980. Apple did a better Unix. And I am not speaking about the
wonderful and intuitive user interface, the great LOOKS, the package
management system that has hundreds of public domain applications
ported to Apple Unix, etc etc.

I am really satisfied with my Mac, sorry. It is EXACTLY what linux
could have been if they had worked on making a BETTER Unix instead of
producing several dozen different window management systems (equally
bad and equally awful) and several dozen equally bad IDEs.

They never think about the USER EXPERIENCE when designing their programs
(just look at gdb), something Apple has always had in mind.
 

Edward A. Falk

if you need an N-bit int, use intN_t. If you need, for instance,
strndup, have some #ifdef-ed code that provides a replacement if it's
not present in the standard library.

I got tired of managing the ifdefs, and realized that the very few
functions I needed them for, e.g. strndup(), basename(), etc. could
easily be written in a few lines of C. So I wrote my own StrnDup() and
BaseName() functions and now I don't worry about the different variants.

I know it seems stupid to replicate code that is in (some versions of)
the C library, but it turned out to be less work than dealing with all
the conditional compilation issues.
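
A replacement along those lines takes only a few lines of C; a minimal
sketch (the body here is an assumption, not Falk's actual code):

#include <stdlib.h>
#include <string.h>

/* Minimal strndup() replacement: duplicate at most n bytes of s,
   always NUL-terminating the result. Returns NULL if malloc fails. */
char *StrnDup(const char *s, size_t n)
{
    size_t len = 0;
    while (len < n && s[len] != '\0')
        len++;
    char *p = malloc(len + 1);
    if (p != NULL) {
        memcpy(p, s, len);
        p[len] = '\0';
    }
    return p;
}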

On the more general note of avoiding Autotools, I would note that all my
open-source projects (there are several) use hand-written Makefiles,
none has a configure script, and I have yet to encounter anything more
severe than a few #ifdef WINDOWS to handle cross-compilation to that
platform (and separate make rules for Windows targets).

Ditto.
 

ec429

jacob navia said:
And then ALL the details: makefiles do not require tabs and will accept
tabs or spaces...

Meaning that your developer on an Apple system writes non-conforming
makefiles and can't understand why no-one on other Unices ever seems to
use his software. The pitfalls of Postel's prescription, as HTML
demonstrated.

NO NEED FOR -lm!!!!!!! The libraries are included BY DEFAULT!!!!!!!!

Inherently brittle, as it won't canonically know about all libraries;
also, I bet it pisses off library developers when it links against the
old version instead of the new one they've just built.

Linux people are very conservative, and their version of Unix is frozen
around 1980. Apple did a better Unix. And I am not speaking about the
wonderful and intuitive user interface, the great LOOKS, the package
management system that has hundreds of public domain applications
ported to Apple Unix, etc etc.

Linux and GNU follow (largely) the POSIX standards. Standards are a
_good thing_. While every improvement is necessarily a change, not
every change is necessarily an improvement. When Linux does things the
1980 way, it's because the 1980 way is the Right Thing (tm). When the
1980 way is the Wrong Thing (tm), Linux changes it where compatibility
permits. (Interestingly, several of the Right Things turn out to have
come from Plan 9.)

Package management? .deb; end of.

I am really satisfied with my Mac, sorry. It is EXACTLY what linux
could have been if they had worked on making a BETTER Unix instead of
producing several dozen different window management systems (equally
bad and equally awful) and several dozen equally bad IDEs.

GNU did make a "better Unix". Have you read The Unix Haters' Handbook?
1980s-vintage Unix had some serious issues, mostly the result of
vendors' attempts at product differentiation for competitive advantage.
Maintaining source-level compatibility with the post-divestiture
clusterf**k guaranteed that compromises would have to be made. As older
software has been replaced with newer, portable code, GNU and Linux have
evolved to discard as much of that baggage as they dare.

The WMs are not "awful"; neither GNOME nor KDE is perfect, but that's
largely because they erred in the direction of imitating MacOS Classic.
Xfce, LXDE, fvwm2 and several others are clean, unobtrusive, and
lightweight.

IDEs are simply not necessary; if nano were mouse-aware I wouldn't even
use a GUI editor. Text editor, compiler, make, debugger, shell tools -
that's all the environment you need, and 'integration' is pointless when
everything is a stream of text (a powerful abstraction that Macs, with
their "resource fork", break).

They never think about the USER EXPERIENCE when designing their programs
(just look at gdb), something Apple has always had in mind.

Looking at gdb, my impression is that its UX is actually very good.
It's exactly the right way to build a debugger: its interface is
essentially a scripting language with a REPL, allowing you to control
your program and query its state in as natural a way as is possible for
compiled programs. Compare, incidentally, the debugging cycles of
interpreted languages: for instance, the Python debugger is the Python
interpreter; most JS debuggers are JS interpreters.

Summary: Steve Jobs is not, in fact, god. He was just a competent marketer.

-e
 

Seebs

Contrariwise, Android apps are more likely to be open source (since it's
feasible to develop for Android without intending to make money out of
the software), meaning that the security-conscious end-user can compile
them themselves, precluding the possibility of viruses (the end-user also
does not need an expensive Mac and SDK to do such compilation).

What are you talking about?

The SDK is free, last I checked; it's certainly not very expensive, and I
seem to recall that there's a Windows port. I don't know. A cheap Mac is
cheap and perfectly adequate for iOS development.

I've developed an iOS app and will likely develop more; I doubt it'll be
profitable, but I have fun working on it because the iOS API is a lot more
pleasant for me to use than any of the GUI options offered by the *nix side
of the world.

I would rather have a more open system, but a system designed coherently is
enough better than systems thrown together by people who didn't understand
the problem space that I'll put up with it.

You probably know the old saying, "you have to understand the rules before
you can break them." Android consists of people who did not understand why
iOS was the way it was systematically breaking all of its rules out of spite.

-s
 

ec429

You probably know the old saying, "you have to understand the rules before
you can break them." Android consists of people who did not understand why
iOS was the way it was systematically breaking all of its rules out of spite.

I could just as easily say that Apple consists of people who did not
understand why Unix was the way it was, systematically breaking all of
/its/ rules out of spite.
However, I do not attribute to spite that which can be adequately
explained by incompetence.
Unix' design /is/ coherent; it may have a few crocky warts for
hysterical raisins, but it has central, unifying principles - moreover,
it has the /right/ central unifying principles.
My thesis that iOS will wither and die, and Android will subsume, while
partly based on market data from ComScore via esr, is also significantly
based on the observed success of the design philosophy that has been
dubbed "Worse Is Better".
Those who do not learn from Unix are doomed to reinvent it, poorly.

A cheap Mac may be cheap. But I can develop for Windows on a Linux box,
or develop for Linux on a Windows box. If you ask me to buy your
hardware in order to develop for it, I ask you to stop wasting my time
and bug someone else.

I stand corrected on the SDK though; maybe I'm thinking of the App Store
costs, and ISTR that those are waived for free apps. But I don't know
the details.

Nonetheless, the impression I get is that the iOS ecosystem is not
conducive to the practices of open source and casual hacking which I
consider central to a strong software engineering tradition.

-e
 

Seebs

I could just as easily say that Apple consists of people who did not
understand why Unix was the way it was, systematically breaking all of
/its/ rules out of spite.

You could, except that OS X *is* Unix. So's iOS, under the hood.

I stand corrected on the SDK though; maybe I'm thinking of the App Store
costs, and ISTR that those are waived for free apps. But I don't know
the details.

It's cheap enough that I did it just so I could run an app I wanted to
write on my own phone, and I don't remember what it cost.

Nonetheless, the impression I get is that the iOS ecosystem is not
conducive to the practices of open source and casual hacking which I
consider central to a strong software engineering tradition.

I'd somewhat agree, but the huge overlap with OS X development (which is
free and has no particular requirements) leaves us with a pretty healthy
community.

And it is a LOT more pleasant to work with, IME.

-s
 

Keith Thompson

ArifulHossain tuhin said:
I've posted a question regarding autoconf earlier where several people
suggested avoiding "autoconf" altogether. I can generalize the
suggestion by saying "avoiding" the whole "Autotools"
chain. [...]
Any pointer about the development process will be very helpful. I
absolutely hate gnu autotools. If we can avoid it, it will be less
pain in the back.
[...]

One advantage of GNU autotools, from the perspective of a user
installing software from source, is that they impose a consistent set of
commands to build and install packages. A typical sequence is:

tar zxf foobar-0.1.tar.gz
cd foobar-0.1
./configure --prefix=/some/where/foo-0.1
make
make install

Packages with custom configuration, building, and installation scripts
often impose a substantial burden if they don't follow a conventional
sequence of commands. If they do, that's great, but I'd really rather
not have to manually edit some file (whose identity is documented in
README, or INSTALL, or is it installation-instructions.txt?) if I want
to install it in a non-default location.
 

Kaz Kylheku

ArifulHossain tuhin said:
I've posted a question regarding autoconf earlier where several people
suggested avoiding "autoconf" altogether. I can generalize the
suggestion by saying "avoiding" the whole "Autotools"
chain. [...]
Any pointer about the development process will be very helpful. I
absolutely hate gnu autotools. If we can avoid it, it will be less
pain in the back.
[...]

One advantage of GNU autotools, from the perspective of a user
installing software from source, is that they impose a consistent set of
commands to build and install packages.

This is not an advantage of autotools, because anyone can write a script called
"configure" by hand which imitates the conventions, and do a much
better job of it.

Furthermore, programs that use autotools do not really have consistent
conventions beyond the fact that there is a script called "configure".

tar zxf foobar-0.1.tar.gz
cd foobar-0.1
./configure --prefix=/some/where/foo-0.1
make

--prefix is pretty ubiquitous, but it rapidly breaks down from there.

make install

So now, what is the convention to put the program somewhere
other than in /some/where/foo-0.1?

For some programs it is this:

make DESTDIR=/package/dir install

(/package/dir gets combined with the --prefix so there will be
/package/dir/some/where/foo-0.1)

For some others, there is a configure argument, like

./configure --prefix=/foo --install-dir=/package/dir

Packages with custom configuration, building, and installation scripts
often impose a substantial burden if they don't follow a conventional
sequence of commands.

Nothing imposes a greater burden than a program which can't be built
for architecture X without executing architecture X code at build time.

I think many Linux distro maintainers would happily have every program follow
its own conventions, if they all could be cross-built without having to create
stupid little environments where stuff runs under chroot (because it wants to
wrongly access build-machine paths, not sticking to the sysroot) or even under
QEMU (because the build wants to run target architecture code, even when
cross-compiling).

Configure and build systems that work cleanly, do not pick up build-machine
pollution, and cross-compile without emulation are more important than
shitty, broken build systems that are vaguely compatible with each other in
syntactic trivialities like --path.

If they do, that's great, but I'd really rather
not have to manually edit some file (whose identity is documented in
README, or INSTALL, or is it installation-instructions.txt?) if I want
to install it in a non-default location.

If the program's build system doesn't suck, I will gladly RTFM inside it and
follow how it wants it done.
 

Rui Maciel

Kaz said:
This is not an advantage of autotools, because anyone can write a script
called "configure" by hand which imitates the conventions, and do a much
better job of it.

But does everyone do that? And how much of a hassle is it to maintain some
convoluted build script which you didn't write yourself and which wasn't
written with maintenance in mind?

Furthermore programs that use autotools do not really have consistent
conventions beyond that there is a script called "configure".

Could you please elaborate on that?


So now, what is the convention to put the program somewhere
other than in /some/where/foo-0.1?

For some programs it is this:

make DESTDIR=/package/dir install

(/package/dir gets combined with the --prefix so there will be
/package/dir/some/where/foo-0.1)

For some others, there is configure argument, like

./configure --prefix=/foo --install-dir=/package/dir

It really depends on what you define as being "the program". The configure
scripts generated by the GNU build system provide a set of options to
configure the directories which come in handy when installing a software
package. For example, the --bindir option is used to specify the directory
into which the user executables are copied, and there are options available
to set other target directories used to install system admin executables,
object code libraries, PDF documentation, etc.

Nothing imposes a greater burden than a program which can't be built
for architecture X without executing architecture X code at build time.

I don't see how that is a problem caused by the GNU build system, or any
other automated build system, for that matter. In fact, it is quite
possible to write a hand-crafted build script which relies on some feature
which isn't available on a given system.

I think many Linux distro maintainers would happily have every program
follow its own conventions, if they all could be cross-built without
having to create stupid little environments where stuff runs under chroot
(because it wants to
wrongly access build-machine paths, not sticking to the sysroot) or even
under QEMU (because the build wants to run target architecture code, even
when cross-compiling).

Configure and build systems that work cleanly, do not pick up build
machine pollution, and cross-compile without emulation are more improtant
than shitty, broken build systems that are vaguely compatible with each
other in the syntactic trivialities like --path.

Your complaint boils down to "a build script might be buggy". Hand-crafted
scripts aren't immune to bugs. They are also vulnerable to poor design
decisions and lack of insight from the developer.



Rui Maciel
 

ArifulHossain tuhin

Most are listed on my website -
Also see http://github.com/ec429

Thanks for the links. That's very helpful.
The biggest portability struggle I've had so far was with cwTBK, which
was not easy to port to Windows (largely because Windows has no
equivalent of ALSA's "arecord", nor a real command shell or terminal).

Most of what I know about project-architectural style, including how to
do Make right, I learned from ESR's brilliant "The Art of Unix
Programming" (http://catb.org/~esr/writings/taoup/). It's highly
recommended.

I have skimmed it in the past but never took the time to read it
thoroughly. Thank you for the suggestion; I should read it through. I like
Rob Pike's "Unix Programming Environment", but it's a bit outdated. The
principles are the same, I guess.
 

ArifulHossain tuhin

Yes, tiny projects which don't require extensive configuration checks tend
to be easier to maintain by writing a makefile and even test scripts
ourselves.

Nonetheless, a project only requires an automated build system if its size
makes it impractical to manage custom makefiles and scripts by hand.



That is strange. Were those files C source code files? And did anyone on
that team have any experience with the GNU build system, or was everyone
cutting their teeth on it?

Our team was (and still is) a little inexperienced with the GNU build
system. A lot of them used to develop in Java/C#, or in Visual Studio. Our
organization is trying to train us; hopefully we won't be new in the coming
months. But I still believe a moderately sized project can be done with
handwritten makefiles, because of the similarity between POSIX systems.
Yes, they require tweaks and won't be "automated", but autotools are no
different: most of the time they need tweaks too. The problem is that the
tweaks are not very understandable, because of the complexity associated
with autotools, whereas makefiles are rather straightforward.
 

Dirk Zabel

On 09.01.2012 23:50, Keith Thompson wrote:
ArifulHossain tuhin said:
I've posted a question regarding autoconf earlier where several people
suggested avoiding "autoconf" altogether. I can generalize the
suggestion by saying "avoiding" the whole "Autotools"
chain. [...]
Any pointer about the development process will be very helpful. I
absolutely hate gnu autotools. If we can avoid it, it will be less
pain in the back.
[...]

One advantage of GNU autotools, from the perspective of a user
installing software from source, is that they impose a consistent set of
commands to build and install packages. A typical sequence is:

tar zxf foobar-0.1.tar.gz
cd foobar-0.1
./configure --prefix=/some/where/foo-0.1

If it works, fine.
Unfortunately, this step very often does not work for me. I am regularly
trying to cross-compile for an embedded, ppc-based linux system, which
in theory should be possible with the configure-parameter

--host=ppc-XXX-linux

Very often I have to work a long time until configure works. And if it
works, there is no guarantee that the next step, i.e. "make", works. And if
that works, there might be subtle errors which show up later when the
program runs on the target platform. Sometimes I don't get configure to
work at all, or decide that it's not worth the time. For example, I once
tried to build apache for my target platform without success. Finally, I
decided to use another webserver (lighttpd), which might be more
appropriate anyway, but I wanted to give apache a try.
Just now I am in the process of cross-compiling PHP for my target. The
first issue was that the target has MSB byte order, which the configure
script did not care about. The actual sources had #ifdefs to take
byte order into account, so after patching the byte-order information
into configure, I could solve this. But I have yet to solve further
problems. Often the configure script tries to compile and run short
test programs in order to find something out. This of course always
fails if you try to cross-compile. Then I first have to see what
information the configure script tried to acquire and how I can provide
this information manually. No, I don't like this.
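
(One way around that particular class of probe: byte order can often be
determined at compile time, so nothing has to run on the build machine.
A hypothetical sketch, assuming a compiler such as GCC that predefines
__BYTE_ORDER__:)

/* Compile-time byte-order check; no test program needs to *run*,
   which is exactly what breaks cross-compilation. */
#if defined(__BYTE_ORDER__) && __BYTE_ORDER__ == __ORDER_BIG_ENDIAN__
#define TARGET_BIG_ENDIAN 1
#else
#define TARGET_BIG_ENDIAN 0
#endif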

So from these experiences, I would greatly prefer a manually written,
clean makefile together with some pre-written config.h where the
system-dependent definitions are visible. Btw, I don't think a manually
written configuration script would be better; I think it could be even
worse.

Regards
Dirk
 

Stephen Sprunk

Our team was (and still is) a little inexperienced with the GNU build
system. A lot of them used to develop in Java/C#, or in Visual Studio. Our
organization is trying to train us; hopefully we won't be new in the coming
months.

Hopefully that training is coming from someone with significant
experience. It can be a pretty steep learning curve.

But I still believe a moderately sized project can be done with handwritten
makefiles, because of the similarity between POSIX systems. Yes, they
require tweaks and won't be "automated".

That's the major feature of autotools: not needing to tweak makefiles
for each target platform.

For in-house apps that only have a few targets, learning autotools is
probably not worth the effort; for open-source apps that have to run on
dozens of targets, each with varying libraries and other apps installed,
and which may be cross-compiled or installed in different locations,
it's well worth it.

But autotools are no different. Most of the time, they need tweaks too.

If so, you're (probably) doing it wrong.

The problem is that the tweaks are not very understandable, because of the
complexity associated with autotools, whereas makefiles are rather
straightforward.

That's just a matter of what you're used to. If I had learned the
autotools system first, I'd probably find it easier than makefiles. The
build systems in the MS tools are baffling to me, but if that's what you
learned first, that's probably easier for you than makefiles.

S
 

David Dyer-Bennet

Richard said:
IDEs, well done, are a real boon. I can only assume you never used one
or worked with any considerably sized code base that allows context
help, auto class completions, refactoring, JIT debugging etc etc.

In theory, an IDE could be a good thing -- though it tends to lock you
into a set of tools rather than letting you choose individually, which
is a big problem.

I've never actually used one that was better than the non-IDE
environments I've programmed in, though. Visual Studio in particular is
a total disaster, especially for C and C++ code. We get random build
failures, crashes, it fails to find things it should find, Tool Tips
don't work, it misses breakpoints, and so forth -- and not just for me
(the non-Windows guy); the experts have the same problems and can't tell
me how to get around them.

I've worked on some fairly big projects, including a 5-million-line
thing while I was at Sun (it used Linux and GCC, in fact it had a custom
Linux distribution rolled into it).
 

David Dyer-Bennet

Richard said:
Not if it can be configured to use external tools.

Yes. In theory, it could be a good thing. But I've never actually
encountered one that actualized that possibility.

I think that's down to your setup. VS can be intrusive but millions use
it daily with a lot of success.

Nearly always for very small projects.

And just look at Apple and their IDE and the satisfaction ratings.

Never seen it; I've heard serious complaints about the lack of
alternatives.

I am NOT saying discrete tools are a bad thing but bringing them all
together in an INTEGRATED UI can be a good thing.

I use emacs as an IDE and its benefits outweigh the disadvantages: but
I don't try to convince myself it's good at debugging, completion, context
help etc. It's not.

The editor is the most important tool, so that's where I live, yes. But
the gdb integration IS pretty useful in my experience. Doesn't do class
browsing, that's the bit that looked hopeful in VS -- but it doesn't
deliver, it just doesn't work in C++. (Works pretty well for C#
though.)
 

Rui Maciel

jacob said:
and you know what impressed me the most?

$ cat foo.c
#include <math.h>
#include <stdio.h>

int main(int argc,char *argv[]) { printf("%g\n",sin(argc)); }

$ gcc foo.c
$ ./a.out 1
0.909297
$

NO NEED FOR -lm!!!!!!! The libraries are included BY DEFAULT!!!!!!!!

How many times have we discussed that here?
And I was told by the linux people that "THAT IS THE WAY IT IS..."

I've tested your example in Kubuntu 11.10, with GCC 4.6.1.

<test>
rui@Kubuntu:tmp$ cat main.c
#include <math.h>
#include <stdio.h>

int main(int argc,char *argv[]) { printf("%g\n",sin(argc)); }
rui@Kubuntu:tmp$ gcc main.c
/tmp/ccpoalMn.o: In function `main':
main.c:(.text+0x15): undefined reference to `sin'
collect2: ld returned 1 exit status
rui@Kubuntu:tmp$ gcc main.c -lm
rui@Kubuntu:tmp$
</test>


Rui Maciel
 
