No unanswered question

  • Thread starter Alf P. Steinbach /Usenet

Joshua Maurice

I tend to blame the company developing the tool in cases like these. I
know Rational Rose (not Rose Realtime, which I hear is different and
better), and I have always loathed it for its lack of support for sane
version control, parallel development etc.

It never occurred to me because I have never used it for code
generation, but I suppose that, in the same way, it lacks support for
building.

Well, our problem is that we develop the GUI and some newer components
in Java, but our old engine is in C++. Our engine solves a domain
specific problem with a domain specific language. This language is
represented by an object graph, whose representation is logically
coupled with the GUI of the end users. Our engine takes this graph,
picks it apart into separate tasks assignable to separate threads, and
begins processing. Implicit in this is that we want to be able to take
an object graph from Java, serialize to some XML format or some binary
format, and then deserialize to C++ to give to the engine, and vice
versa for debugging etc. This potentially requires arbitrary code
generation. At one point in time, we used several Rose model files to
describe the object graph of the domain specific language. We had a
custom in-house tool which converted this to C++ classes and Java
classes with the serialization code in place. I don't really know any
other sane way to handle this use case, the serialization of object
graphs between different languages such as C++ and Java.
Perhaps when you generate a lot of source files at random,
you should at the same time generate a Makefile fragment describing
the dependencies between them. Perhaps the code generator should
follow the same policy as a Makefile build and not touch a generated
header file unless it is actually changed.
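The first suggestion above can be sketched in a few lines of shell. This is a hypothetical illustration, not the tool from the thread: the `generated/` directory, the `Graph.cpp` stand-in file, and the fragment name `deps.mk` are all made up for the example. The idea is that after the generator runs, a small post-pass emits one compile rule per generated source into a fragment the top-level Makefile can `include`:

```shell
#!/bin/sh
set -e

gendir=generated
mkdir -p "$gendir"

# Demo stand-in for the code generator's output: one generated source file.
printf 'int main() { return 0; }\n' > "$gendir/Graph.cpp"

# Emit one explicit compile rule per generated .cpp into a fragment that
# the top-level Makefile can pick up with "include generated/deps.mk".
: > "$gendir/deps.mk"
for src in "$gendir"/*.cpp; do
    obj=${src%.cpp}.o
    printf '%s: %s\n\t$(CXX) -c -o $@ $<\n' "$obj" "$src" >> "$gendir/deps.mk"
done
```

This only describes the generated-to-object edges, of course; the link step and any header dependencies would still need their own rules.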

So I'm defining tools that break Make as bad tools ... which of course
doesn't help people who are stuck with them :-/

Maybe it would help somewhat to wrap the code generator in a shell
script which only touches the regenerated .cpp/.h files which have
actually changed (and removes all of them if the generation fails).
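A minimal sketch of such a wrapper, meant to run before make is invoked (not as a command inside a rule, for the reasons discussed below). The `generate` function here is a dummy standing in for the real code generator, and all file names are illustrative. Generating into a scratch directory first also means a failed generation never half-touches the real output:

```shell
#!/bin/sh
set -e

# Stand-in for the real code generator: writes its output into "$1".
generate() {
    mkdir -p "$1"
    printf 'class Graph {};\n' > "$1/Graph.h"
    printf '#include "Graph.h"\n' > "$1/Graph.cpp"
}

outdir=generated
tmpdir=$(mktemp -d)
trap 'rm -rf "$tmpdir"' EXIT

# Generate into a scratch directory first; "set -e" aborts here on
# failure, before any real output file is touched.
generate "$tmpdir"

# Copy over only the files whose content actually changed, so their
# timestamps move only when a recompile is genuinely needed.
mkdir -p "$outdir"
for f in "$tmpdir"/*; do
    base=$(basename "$f")
    if ! cmp -s "$f" "$outdir/$base"; then
        cp "$f" "$outdir/$base"
    fi
done
```

Run twice in a row, the second pass copies nothing, so make sees unchanged timestamps and skips the downstream compiles.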

The "conditional touching of files in a command of a rule" won't work,
at least not with GNU Make. The GNU Make mailing list has confirmed
that any file creation, deletion, or modification during phase 2 may
not be picked up. This has been my experience playing with it as well.
GNU Make effectively determines which portions of the graph are out of
date before running any command of any rule.

This is one facet of my major beef with Make: from a command, you
cannot conditionally choose to mark downstream nodes as up to date or
out of date. Depending on the kind of build step, this affects its
incremental "goodness", the ability to skip unnecessary build steps,
to varying degrees.

With the Rose generation, with GNU Make, when the code generation task
is out of date, you can mark all output files out of date. It'll
result in some additional C++ compilation - a better system could skip
more unnecessary work, but at least it's incrementally correct.

With Java compilation, you could make an incrementally correct build
system, but it would be a cascading rebuild without termination,
vastly inferior to the aforementioned system which can terminate the
rebuild early.
My assertion early in this thread was that such things (the "make
loopholes") can be avoided (don't use many include search paths)
and/or detected manually when they happen (rebuild from scratch when
files disappear from version control). I don't think I saw you
explaining why that isn't good enough, or did I miss that?

Well, a couple things.

First, some build steps, like javah, javac, the aforementioned Rose
compilation, unzipping, and others, produce output which is not
predictable without doing the actual build step. With file creation
and deletion, you need to check for:
1- Stale files - this might require some cleaning, and rerunning of
build steps downstream.
2- New files
2a- New files which hide old files on some search path. Relatively
unlikely for C++ depending on naming conventions (a lot more likely in
my company's product due to bad naming conventions and lots of include
path entries), but much more likely for Java.
2b- New files which require new build steps, or new nodes in the
dependency graph. For my Rose to C++ code generation, I do not know
what C++ files will come out of it until I actually do the code
generation. It will produce lots of .cpp files (which will not be
modified by hand). Each of these .cpp files needs to be compiled to
a .o file. I would like for this to be done in parallel, but to do
that I need to define new make rules, aka add nodes to the dependency
graph, which one really cannot do in GNU Make.

I know a little about GNU Make and how it can have makefiles as
targets: if it detects an out-of-date makefile, it will rebuild that
makefile (and all prereqs), and restart GNU Make from the
start with the new makefile. Has anyone ever used this? I admit that I
haven't played around with this fully, but my initial impression is
that it's basically unworkable for my problems. Restarting the whole
shebang after every such Rose code generation would result in a lot of
makefile parsing, easily adding minutes (or likely much more) to a
build time. Though, I admit I could be wrong here.
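For reference, the mechanism being described looks roughly like this (a sketch only; the file names and the generator script are illustrative). GNU Make treats included makefiles as targets: if `generated.mk` is out of date, it runs the rule, then re-executes itself from the top with the fresh fragment:

```make
# Top-level Makefile sketch. If generated.mk is out of date, GNU Make
# rebuilds it via this rule and then restarts itself, re-reading all
# makefiles -- which is exactly the re-parsing cost discussed above.
include generated.mk

generated.mk: model.rose
	./run-rose-codegen.sh $< > $@
```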

Finally, why punt? We're requiring that the developer be fully aware,
but I think that a lot of these rules, such as "do a full clean
build whenever a file is deleted", are easy to forget or accidentally
miss. I think this is a little different from "don't dereference a
null pointer" or similar arguments you can make. When we're writing
code, we're aware of the pointer and that it could be null. When we're
doing a build, we're busy thinking about code, not about whether the
entire codebase breaks some "build style" rule, or whether some file
has been deleted. It's not practical to check your email for
"incremental build breaking messages". It's inefficient, and error
prone. Moreover, it's fixable. It's quite doable to handle all of
this, and more, and to do it faster than GNU Make. There is no reason
to punt. The investment of time now to make a build system which can
handle it all will save lots of developer time later - for those
developers who:
- work on "the build from hell" (like me)
- or who forgot to check for a file deletion when they did a sync
- or who are working on a mixed code base with Java, C++, etc.
(like me).

Put another way, yes I recognize that the perfectly correct, academic
way is not the way to do things. For example, see my post here:
http://groups.google.com/group/comp.lang.c++.moderated/msg/dacba7e87ded4dd7

However, it seems to me that this is a clear win for investing.
The investment needs to be done once, by one guy, and everyone in the
entire C++, Java, and more, programming world can use it to save time.
Any time savings *at all* is easily worth it when we can amortize the
cost to one guy but claim savings from every developer everywhere.
Now, it's hard to make such an argument to management, mostly because
it's wrong. For management, you correctly need to show that it helps
the company, which is a bit harder to show, but I still think that
this is the case. (My management and peers disagree though.)
[About being able to specify other actions than compiling and linking]
But no build tool actually does this. At best, they just provide a
framework for the new compilation step. GNU Make just provides a
framework. The build tool which I'm writing just provides a framework.

Yes, but it seemed to me you considered *not* providing that. That's
why I pointed out that it's important to many of us.

It seems to me that you have a pretty narrow focus and don't want to
listen to objections a lot. Actually, I think that's fine. That's what
*I* do when I have an idea and want to summon the energy to do
something about it.

I'm not quite following. One second. I think I need to clarify. Make
does not do C++ compilation out of the box, nor any other random
possible build kind. You need to write some logic in makefile to
handle this new kind of build. My new tool will be effectively the
same: it won't handle any arbitrary build kind out of the box - it
won't be magic, but it will be simple and quick to extend it to handle
X new build kind, just like make. The difference is that I'm strictly
enforcing the separation of build-kind logic from the average
developer, who just instantiates an already defined macro. If need be,
the average developer can add a new macro, but it will not be in an
interpreted language à la make, so it will be much faster; the macro
definition cannot live in an arbitrary build script file à la make,
which will allow much easier auditing of incremental correctness; and
it will be harder for an unknowledgeable developer to break the build,
because the build system makes it exceptionally hard to do so. I think
an analogy which applies: "private, protected, public, const" are
technically unnecessary for perfect developers, but we recognize their
utility in protecting us from ourselves. (No, this is not a proof by
analogy. I'm just trying to explain my case.)
 

Joshua Maurice

On Sun, 2010-08-01, Joshua Maurice wrote:
<snip quoted discussion, repeated verbatim from above, about build tools providing a framework for new kinds of build steps>

Sorry, I just realized a much better way to put this:

This ties back to answer an earlier question: "why isn't imposing a
'build style' restriction, plus requiring clean builds on known
unhandled cases, enough?"

The tools are meant to aid us developers by automating the dependency
analysis. If a system can handle all deltas (aka full incremental
correctness), is easy to write, has low overhead, is just as
extensible, and does better at skipping unnecessary build steps,
then I see absolutely no reason to punt, yet you advocate punting.
Given these facts, the outcome seems clear. Only inertia and ignorance
are keeping us with the obviously inferior tool.

However, I have no problem with Keith and his build system. It's fully
incrementally correct, it's already written, it has low overhead
[supposedly; I'm still somewhat skeptical on this point], it's as
extensible as Keith needs, and it skips most of the unnecessary build
steps. It doesn't work for me and the general case because it's not as
extensible as I need to other kinds of build steps while maintaining
full incremental correctness, and it has been unacceptably slow in my
experience compared to alternatives.
 

Jorgen Grahn

<snip quoted discussion, repeated verbatim from above, of providing a framework for new build steps>

I'm not quite following. One second. I think I need to clarify. Make
does not do C++ compilation out of the box, nor any other random
possible build kind. You need to write some logic in makefile to
handle this new kind of build. My new tool will be effectively the
same: it won't handle any arbitrary build kind out of the box - it
won't be magic, but it will be simple and quick to extend it to handle
X new build kind, just like make.

OK, then I misunderstood. A simple and quick way to handle new
"compilers" and "languages" is all I ask for in a make replacement.

/Jorgen
 

Joshua Maurice

<snip comparison of my make clone and GNU Make 3.81>

So, my company's management decided that we're not in the business of
making and selling make clones (pardon the pun), and some saw that we
might benefit in the future if this situation is improved, so they let
me open source it. I need to work out the details still, but I hope to
publish the source to my make clone in the near future.

I think that management has said it prefers a license with copyleft
and with an explicit linking exception. Anyone have any preferences on
this? Anyone know any good articles I could peruse? My default
position for copyleft and explicit linking exception is GNU LGPL, but
I'm not particularly well educated on this.

Second, where would I post the code? Sourceforge I guess?
 

Joshua Maurice

<snip quoted text, repeated verbatim from my previous post, about open sourcing and licensing my make clone>

So, sorry to be possibly greatly off topic, but I figure it relates to
programming in C++ however tenuously, and it relates to this thread a
bit more strongly.

So, I just reviewed basic copyright law, and the common licenses in
use, including GNU GPL, GNU LGPL, Boost, BSD, and MIT.

As I see it, my basic options are:

1- Full viral copyleft which tries to assert that linking creates a
protected derivative work, ex: GNU GPL.
2- Full viral copyleft with an explicit linking exception, ex: GNU
LGPL.
3- Full viral on source only, aka an explicit binary exception, ex:
Boost, BSD, MIT licenses.

I wouldn't want someone to develop a plugin for my build framework,
then be allowed to distribute the build framework binaries but without
the source code to that very essential plugin. If I understand it
correctly, an explicit linking exception would allow such a thing,
thus I lean towards full GNU GPL. (I understand that the issue of
whether linking creates a protected derivative work has not been
decided by the courts, and there's lots of reasonable arguments on
both sides.)

However, I dislike the verbosity of the GNU GPL and the GNU LGPL. Is
all of that legally necessary? I would really prefer something shorter
and simpler which basically has the same essence, and causes as little
license conflict as possible.

Any comments?

I think my management is leaning towards GNU LGPL. I am not privy to
why. I just heard a one-off from a manager which I think included
"open source", "linking exception", and "not GNU GPL".
 

Öö Tiib

I wouldn't want someone to develop a plugin for my build framework,
then be allowed to distribute the build framework binaries but without
the source code to that very essential plugin. If I understand it
correctly, an explicit linking exception would allow such a thing,
thus I lean towards full GNU GPL. (I understand that the issue of
whether linking creates a protected derivative work has not been
decided by the courts, and there's lots of reasonable arguments on
both sides.)

Then use the GPL. As I understand it, this is a separate tool for
integrating a variety of other tools into the build process. The
integration happens without linking, and the tool will probably be
distributed separately from the compilers, linkers, and whatever other
tools it integrates, so I do not see why you need the GNU LGPL.
However, I dislike the verbosity of the GNU GPL and the GNU LGPL. Is
all of that legally necessary? I would really prefer something shorter
and simpler which basically has the same essence, and causes as little
license conflict as possible.

There are hundreds of countries on our planet. In some places parts of
the GPL text may be unnecessary, and in other places it may not be
sufficient to protect your work from usages that violate the license.
I think you do not need to put the whole license into each source
code file, so why does the verbosity worry you? Put a file with the
license into the distribution and refer to it from each source file.
If it is the GNU GPL of some version, then no one will ever read it
anyway; everybody already knows what it is about.
 

Joshua Maurice

In our company (producing commercial software) we are using multiple
external libraries for various peripheral tasks, both closed and open
source. However, if something is licensed GPL, it is automatically
excluded for us, we cannot use it however good it might be, and are bound
to reinvent wheels or use some other library. LGPL on the other hand is
OK.


For us it is the other way around. We are providing a large framework
which has its own extension libraries (effectively plugins). Now if one
plugin were somehow linked to some GPL code and loaded into our process,
it might arguably turn all of our large code base into GPL. I personally
would have nothing against this, but the management thinks otherwise. And
of course it would create a lot of problems with closed source libraries
we are using in other parts of the system.

As Öö Tiib put it else-thread, I think your complaints do not apply.
Presumably, you use GNU gcc or GNU Make on some platforms, and offhand
both are GNU GPL. However, that doesn't force your product built with
GNU gcc and GNU Make to be GNU GPL.
 

Ian Collins

As Öö Tiib put it else-thread, I think your complaints do not apply.
Presumably, you use GNU gcc or GNU Make on some platforms, and offhand
both are GNU GPL. However, that doesn't force your product built with
GNU gcc and GNU Make to be GNU GPL.

What you build with is irrelevant, it's what you link with that causes
license problems.
 

Francesco S. Carta

What you build with is irrelevant, it's what you link with that causes
license problems.

Once we are there, I'd like to ask a somewhat silly question, just for
confirmation.

Assume I have some GPL code and I make an executable out of it, I'm only
expected to provide the license and the source code of that binary, that
should not affect the licensing of any other executable that "uses" the
GPL executable by calling it and using its output, even if I happen to
ship all of it together in a single installer, is this interpretation
correct?

To put it in other words, take this as the content of an installer:
-----------------
gpl_program.exe
gpl_program.cpp
gpl_program.license
proprietary_program.exe (calls gpl_program.exe and depends on its output)
-----------------

The fact that I ship all the above with a single installer does not
affect the licensing of proprietary_program.exe, which can be left fully
copyrighted and closed source, am I right?

(I know this is not a lawyers' lounge, I'm just asking for some common
sense interpretations)
 

Öö Tiib

<snip quoted question, repeated verbatim from above, about shipping a GPL executable and a proprietary executable in one installer>

IANAL either. AFAIK it is sometimes done the way you describe. There
may be some fine nuances, of course, who knows. Isn't Apple's XCode
developer suite, for example, bundled with a (modified by Apple) GNU
Compiler Collection?
 

Bo Persson

Öö Tiib said:
IANAL neither. AFAIK it seems it is done how you describe it
sometimes. There may be some thin nuances of course, who knows.
Isn't Apple XCode developer suite for example bundled with
(modified by Apple) GNU Compiler Collection?

Yes, but we also notice that Apple stopped at gcc 4.2, which is
licensed under GPL v2.

This is not simple at all. Not being a lawyer, I would still make the
example above two separate installers.


Bo Persson
 

Francesco S. Carta

Yes, but we also notice that Apple stopped at gcc 4.2, which is
licensed under GPL v2.

This is not simple at all. Not being a lawyer, I would still make the
example above two separate installers.

Thank you both for your comments, this is a problem I'm going to face,
I'll speak about it with the people who will distribute my code
(assuming I'll be able to do what they ask me to). I think that having
such a "helper" program licensed under BSD instead of GPL would
simplify my case.
 

Joshua Maurice

As Öö Tiib put it else-thread, I think your complaints do not apply.
Presumably, you use GNU gcc or GNU Make on some platforms, and offhand
both are GNU GPL. However, that doesn't force your product built with
GNU gcc and GNU Make to be GNU GPL.

I was thinking about this more, and the GPL might preclude integrating
my new build system with something such as Eclipse (or anything else).
As such, I'm heavily leaning towards the LGPL.

In other news, my company's lawyers have given an ETA of a couple
weeks for figuring out the licensing stuff and giving me permission to
post the source code to my Make drop-in replacement, and the other
build system, my newer one which isn't a simple drop-in Make
replacement.
 
