Why is it important to have that *incremental* build when you have a
large commercial product? You want each module to carry a clear mark of
which build it belongs to anyway. You set up a build farm that does a
clean build for you each time. That is a lot simpler solution. What does
a computer cost? It perhaps depends on the continent, but you can likely
buy several for the cost of a single month of a good worker.
One may want an incremental build when developing something larger than
the average open-source project on a lone computer at home. Then he uses
incremental builds and still the majority of his time goes into building,
not developing or running tests. That is the market for a good
incremental build system.
First, let me note that the post to which you are replying only argued
that he basically did not test the incremental build system. I was
quite explicit about this. However, else-thread, I made arguments in
favor of incremental builds. Let me add a couple of new replies here.
Some companies don't have those build farms. For starters, they're
expensive. I also disagree that build farms are the "simpler"
solution. Maintaining a build farm is a huge PITA. Just maintaining
automated build machines for the dozen or so platforms my company
supports for emergency bug fixes, hot fixes, and daily ML builds takes
the full-time employment of 3+ people. The simpler solution is to have
the developer build everything himself. A lot fewer moving parts. A lot
fewer different versions running around. A lot fewer active Cruise
Control instances. Is building it all in one giant build more
sensible? Perhaps not; it depends on the situation, but a build farm
is definitely not simpler than building everything yourself in a giant
build.
Now, on to correct incremental builds. It's relatively clear to me that
this is more complex than a full clean build every time, aka not simpler.
Is it simpler or more complex than a build farm? I don't know. I would
think that a correct incremental build system is actually simpler than
a build farm. A build farm really is complex and takes effort to
maintain, whereas a correct incremental build system would only have to
be written once, or at least each different kind of task would only have
to be written once, with that cost amortized over all companies, while a
build farm would probably have a lot more per-company setup cost.
Why? On the contrary. Incremental builds are irrelevant. No one uses
them anyway.
The argument made was that he did not test the incremental build
system to any significant degree. He argued that he did test the
incremental build system with "continuous clean builds". He is in
error, and you are arguing a non sequitur.
Also, no one uses incremental builds? Why do we even use Make anymore
then? I suppose that Make has some nifty parallelization features, so
I guess it has some use if we ignore incremental.
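To be concrete about what that incremental core actually is, here is a
minimal sketch (Python, purely illustrative, not Make's real code):
rebuild a target only when it is missing or older than one of its
prerequisites. That check is exactly the feature that would be pointless
if "no one uses incremental".

# Minimal sketch of the incremental check at Make's core: rebuild a
# target only when it is missing or older than any of its prerequisites.
import os

def needs_rebuild(target, prerequisites):
    if not os.path.exists(target):
        return True
    target_mtime = os.path.getmtime(target)
    return any(os.path.getmtime(p) > target_mtime for p in prerequisites)

# e.g. needs_rebuild("hello.o", ["hello.cpp", "hello.h"])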
I must say this is somewhat surprising to hear in a C++ forum. I
expected most people to still be under the incremental spell.
Incremental really is not that hard to achieve. It's just that no one
has really tried, as far as I can tell. (Specifically under my
definition of "incremental correctness", which by definition includes
the associated build scripts as source, though I think that even
ignoring build script changes, all common, purportedly incremental
build systems are not incrementally correct under idiomatic usage.)
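To make that definition concrete, here is a rough sketch of what I mean
by treating the build scripts as source: a target counts as up to date
only if neither its inputs nor the command that builds it has changed, so
editing a compiler flag forces a rebuild just like editing a source file
would. The stamp file and helper names are made up for illustration; this
is not any real tool's API.

# Hypothetical sketch: a target is up to date only if the hash of its
# inputs *and* of the command that builds it matches what we recorded
# last time.
import hashlib, json, os, subprocess

STAMP_FILE = ".build_stamps.json"

def _digest(paths, command):
    h = hashlib.sha256()
    h.update(command.encode())            # the recipe itself is an input
    for p in sorted(paths):
        with open(p, "rb") as f:
            h.update(f.read())
    return h.hexdigest()

def build(target, inputs, command):
    stamps = {}
    if os.path.exists(STAMP_FILE):
        with open(STAMP_FILE) as f:
            stamps = json.load(f)
    digest = _digest(inputs, command)
    if stamps.get(target) == digest and os.path.exists(target):
        return                            # nothing relevant changed; skip
    subprocess.run(command, shell=True, check=True)
    stamps[target] = digest
    with open(STAMP_FILE, "w") as f:
        json.dump(stamps, f)

# e.g. build("hello", ["hello.cpp"], "g++ -O2 -o hello hello.cpp")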
Yes. So *do* *not* use incremental build systems and the sun is shining
once again.
Testing a clean build is a lot simpler. Did it build everything it was
meant to build? Yes? Success! No? Fail! That is it. Tested. The simple
part is over. Now you can run the automated tests (which presumably take
a lot more time than building) to see if all the modules that were built
(all modules of the full product) are good too.
Again, you are arguing a non sequitur. I would love to have a discussion
of whether we should have clean builds, but you reply as though I were
making the argument in that quote that we should have incremental build
systems. I was not. I was very clear and explicit in that post that I was
arguing solely that he did not test the incremental build system for
correctness in any significant way.
Again, to change topics to your new point: why should we have correct
incremental builds? Because they're faster, because splitting everything
into components might not make sense, and because componentizing might be
more costly to the company than a correct incremental build system,
especially when the cost of the incremental build system can be amortized
over all companies.
Think about it like this. It's all incremental. Splitting it up into
components is one kind of incremental: it's incremental at the
component level. However, the benefit of this can only go so far.
Eventually there would be too many different components, and we're
right in the situation described in Recursive Make Considered Harmful,
the situation without automated dependency analysis. Yes, we do need
to break it down at the component level at some point. It's not
practical to rebuild all of the Linux kernel whenever I compile a
Hello World! app, but nor is it practical to say componentization
solves all problems perfectly without the need for other solutions like
a parallel build, a distributed build, build farms, faster compilers,
pImpl, and/or incremental builds.
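For reference, the "automated dependency analysis" that paper talks about
is nothing exotic. A rough illustration (assuming g++ is on the PATH;
the helper name is mine) is just asking the compiler which headers a
translation unit actually pulls in, instead of maintaining that list by
hand:

# Sketch of automated dependency analysis: `g++ -MM` prints a make-style
# rule like "foo.o: foo.cpp foo.h bar.h".
import subprocess

def header_deps(cpp_file):
    out = subprocess.run(["g++", "-MM", cpp_file],
                         capture_output=True, text=True, check=True).stdout
    deps = out.replace("\\\n", " ").split(":", 1)[1].split()
    return deps  # the .cpp file itself plus every header it includes

# e.g. header_deps("widget.cpp") -> ["widget.cpp", "widget.h", ...]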
No one notices it because the build system does everything automatically.
It checks out the changed production branch from the repository, runs all
sorts of tools and tests, and also produces a web site about success and
the details and statistics (and changes in those) of the various tools
run on the code and on the freshly built modules. Building is a tiny bit
of its job. It can even blame exactly whose changeset in the repository
likely broke something. It can use some instant messaging system if the
team dislikes e-mails. Of course ... all such features of the build
system have to be tested too. If the team does not sell the build system,
then testing it is less mission critical. Let's say it blamed an
innocent ... so there is a defect and also an interested party (the
wrongly accused innocent) who wants it fixed.
Yes, a build system does everything "automatically": if you do a full
clean build every time, then it is all handled automatically. Well,
except it's slow. And if there are a lot of dependency components which
are frequently changing, and you have to manually get these
extra-project dependencies, then we're in Recursive Make Considered
Harmful territory. If instead you use some automated tool like Maven to
download dependencies, and you do a full clean build, aka a redownload,
of those every time, then it's really, really slow. (I'm in the
situation now at work where we use Maven to handle downloading a
bazillion different kinds of dependencies. As Maven has this nasty habit
of automatically downloading newer "versions" of the same snapshot
version, it's quite easy to get inconsistent versions of other in-house
components. It's quite inconvenient and annoying. I've managed to deal
with it, and to work around several bugs, to avoid this unfortunate
default. Did I mention I hate Maven as a build system?)
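A rough sketch of the kind of sanity check I mean follows. It merely
walks the local repository and flags when in-house snapshot artifacts
were resolved from different snapshot builds (different timestamp
suffixes). The group id is hypothetical, and the assumption that all
in-house artifacts should come from one snapshot build is mine, not
Maven's.

# Walk the local Maven repository and report mixed snapshot timestamps.
import os, re
from collections import defaultdict

SNAPSHOT_JAR = re.compile(r"-(\d{8}\.\d{6}-\d+)\.jar$")

def snapshot_stamps(repo_root, group_path):
    stamps = defaultdict(set)   # artifact dir -> snapshot timestamps seen
    base = os.path.join(repo_root, group_path)
    for dirpath, _dirs, files in os.walk(base):
        for name in files:
            m = SNAPSHOT_JAR.search(name)
            if m:
                stamps[dirpath].add(m.group(1))
    return stamps

stamps = snapshot_stamps(os.path.expanduser("~/.m2/repository"),
                         "com/example/inhouse")      # hypothetical group
if len({s for v in stamps.values() for s in v}) > 1:
    print("WARNING: mixed snapshot builds:", dict(stamps))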
Also, an automated build machine polling source control for checkins
can only tell you which checkin broke the automated build (and tests)
if your full clean build runs faster than the average checkin
interval. At my company, the build of the ~25,000-source-file project
can take 2-3 hours on some supported systems, and that's without any
tests. The basic test suite adds another 5-6 hours. As a rough guess, I
would imagine we have hundreds of checkins a day.
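Back-of-the-envelope, with assumed round numbers (the checkin rate and
working hours are my guesses, not measured figures):

build_hours = 2.5 + 5.5          # clean build + basic test suite (rough)
checkins_per_day = 200           # assumption; "hundreds of checkins a day"
working_hours_per_day = 10

checkins_per_cycle = checkins_per_day / working_hours_per_day * build_hours
print(f"~{checkins_per_cycle:.0f} checkins land during one build+test cycle")
# -> ~160 checkins per cycle: far too many to isolate the breaking change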
Even if we were to break up the stuff by team, with our level of
testing, I don't think we could meet this ideal of "the automated build
machine isolates the breaking checkin" without a build farm. Even then,
as all of this code is under active development, arguably a change to my
component should trigger tests of every component downstream, and as
inter-component interfaces change relatively often (though thankfully
the rate is slowing down), it might even require recompiles of things
downstream. Since the occasional downstream recompile is needed, the
only automated solution without incremental builds is to do full clean
rebuilds of the entire shebang.
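To illustrate the downstream point with a toy example (the component
names and graph are made up), a change to one component affects
everything reachable from it in the dependency graph, all of which needs
at least re-testing and sometimes recompiling:

from collections import deque

# edges: component -> components that depend on it (its downstream users)
downstream = {
    "core":    ["net", "storage"],
    "net":     ["server"],
    "storage": ["server"],
    "server":  [],
}

def affected_by(changed):
    seen, queue = set(), deque([changed])
    while queue:
        comp = queue.popleft()
        for user in downstream.get(comp, []):
            if user not in seen:
                seen.add(user)
                queue.append(user)
    return seen

print(affected_by("core"))   # {'net', 'storage', 'server'} need re-testing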
Yes, I know the canned answer is "fix your build process". That is
still no reason to settle for inferior tools (build systems); a correct
incremental build system would help even after "we did the right thing"
and componentized. Simply put, full clean rebuilds do not scale to the
size of my company's project, and I argue that incremental correctness
would be the cheapest way to solve all of the problems.
It's unsurprising that I get about as much support in the company as I
do here.
However, I do admit that it might be a bad business decision to do it
fully in-house at this point in time. As I emphasized else-thread, it
is only easily worth it when amortized over all companies, or when
done by someone under the GPL in their spare time for fun. Yet the only
people who really need it are the large companies, and any single one
of them has little incentive to do it themselves. It's most
unfortunate.