There are two aspects: (1) avoiding a regular edit-compile-link cycle;
(2) avoiding the set-up or run-time needed to get to a test-point.
(1) comes about because C isn't really designed to be compiled and linked
quickly, and there are some complicated (and therefore slow) tools around.
In today's tools this is often the case. But there is nothing inherent about
C that requires that it be a slow compile-and-link process. The linker is
not even needed; it's used as a standard component so that multiple
previously compiled objects can be combined without requiring that the
entire program be recompiled whenever one change is made. But that's a
design decision.
A rethinking of how that process could work (even with statically linked
objects) bypasses it. That's what Microsoft did with their incremental
linker (ilink.exe), which links changed components in atop the previously
constructed binary image.
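As a rough sketch of what that kind of cycle looks like with today's
Microsoft toolchain (a minimal illustration only; /ZI emits Edit-and-Continue
debug information, /INCREMENTAL tells link.exe to patch the existing image
rather than relink from scratch, and the file names are just placeholders):

    rem initial full build
    cl /c /ZI main.c util.c
    link /INCREMENTAL /DEBUG main.obj util.obj /OUT:app.exe

    rem after editing only util.c: recompile just that one file, and the
    rem incremental linker patches app.exe instead of rebuilding it
    cl /c /ZI util.c
    link /INCREMENTAL /DEBUG main.obj util.obj /OUT:app.exe
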
I acknowledge that (2) can sometimes be of benefit, although what can be
done is restricted. MSDN says this about E&C on C#: "/The rule of thumb for
making E&C changes is that you can change virtually anything within a method
body, so larger scope changes affecting public or method level code are not
supported./"
C# is a managed language running in a virtual machine (like Java). It has
limitations imposed upon it by that virtual machine.
I think that if compile-and-build was instantaneous, then you would have
far less need of such a feature (E&C). You don't always have an elaborate
set-up, or you can make other arrangements to take care of it (as I do).
Visual Studio 2003's "apply changes" is almost instantaneous (on the types
of programming I do, typically a main .exe, a few DLLs).
You might have become too dependent on this feature. Even ordinary debugging
should only be used for 'intractable' problems, it is sometimes said.
Computers compute. I believe that we should have immediate results from
our efforts. When we're coding, everything up to the point we are currently
coding should be available for us to see the results of, immediately. The
computer can spawn a run, execute until failure, and then pause there or
discard the computed results. This ability doesn't exist today in standard
tool chains, so it may be hard to visualize ... but there is nothing
preventing it from happening except the fact that it has never been coded
for.
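The mechanics aren't exotic. Here is a minimal POSIX-flavoured sketch of the
idea (illustrative only, not part of any existing toolchain;
run_up_to_test_point() is a hypothetical stand-in for whatever the programmer
is currently editing): the tool spawns a disposable child process, lets it
run to the point of interest or to a failure, and then either reports the
result or throws the child's state away, leaving the editing session
untouched.

    /* Minimal sketch: run the code under test in a disposable child
     * process, so its results can be inspected and then discarded
     * without disturbing the editing session. */
    #include <stdio.h>
    #include <stdlib.h>
    #include <sys/types.h>
    #include <sys/wait.h>
    #include <unistd.h>

    /* hypothetical stand-in for the code being edited */
    static int run_up_to_test_point(void)
    {
        int sum = 0;
        for (int i = 1; i <= 10; i++)   /* some computed data */
            sum += i;
        return sum;                     /* 55 */
    }

    int main(void)
    {
        pid_t child = fork();

        if (child == 0) {
            /* child: execute until the test point (or a crash) */
            exit(run_up_to_test_point());
        }

        int status = 0;
        waitpid(child, &status, 0);     /* parent: wait for the run */

        if (WIFSIGNALED(status))
            printf("run died on signal %d; its state is discarded\n",
                   WTERMSIG(status));
        else
            printf("run reached the test point with result %d\n",
                   WEXITSTATUS(status));

        return 0;                       /* the editing session goes on */
    }
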
In fact you might be using the wrong language, if you do a lot of
programming by trial and error. (Which is what I often do, but I know
that C isn't the best language to use that way: you spend time typing
loads of semicolons and punctuation, writing forward prototypes, creating
convoluted code to get around a missing syntax feature, only to tear it
all down five minutes later as you try something else!)
You have a few false premises. I don't do a lot of programming by trial and
error. I have discovered that it is much faster to code something and then
see immediate results from it, fixing whatever turns out to be wrong, than
to spend time in a static in-my-head environment trying to get it perfect
and flawless before I ever compile. The computer can run the code for me,
and it is very fast. With edit-and-continue, I can set a breakpoint at a
particular point in the code, step over my lines one-by-one, and see the
immediate effect on the computed data, so I don't need to spend as much
time writing and thinking through code. I can give it my best
consideration, then run it and fix it on-the-fly.
In my experience this model is notably faster than trying to get everything
perfect before ever compiling, because it takes a long time to think through
every scenario, and it does not take a long time to test it. And, as you
watch the changes come through line-by-line, it sparks other considerations
as you go, things the data makes you think of that you may not have thought
of spontaneously while you were in "coding the algorithm" mode.
These days, when I code something, I usually get it exactly right. I may
make the odd typing mistake, or what have you, but in my head the algorithm
was exactly what I needed the first time out. That doesn't change the fact
that it's still faster to do this inside the debugger, where I can directly
test my changes in real time, than to do it all in my head.
The computer is a tool. It should be helping people. If an environment
exists that gives immediate execution on data, plus the ability to reset
should you need to start back over ... it only makes sense to use it. And
I'm not the only person who agrees. The GCC folk agree. Apple agrees.
And Microsoft has agreed for a long time.
I don't think you realize what the project I'm currently working on
involves, BartC.
If you are interested, please spend some time considering it:
http://www.visual-freepro.org
There's a wiki, videos, and you can see all of the source code. In one
paragraph, this is what I'm doing:
Creating a new virtual machine, writing the compilers for it,
creating the debugger for it, creating the IDE for it, coding all
algorithms related to this from the ground up, and providing
facilities for a Visual FoxPro-compatible language called Visual
FreePro.
I've been on this project for 18 months and am pressing forward on it
daily. It is a tremendous amount of design, coding, and so on. I am
currently about 50,000 lines of code into it, and I have about another
50,000 lines to go before it's where I want it to be. After that, I will
be porting it to my own operating system on x86, then later to ARM, then
on to 64-bit x86 and 64-bit ARM.
My goal is to create new toolset alternatives to what currently exists,
ones which, from inception, employ the features I go on about. They exist
in part today in various toolsets ... but what I'm trying to create is a
new ecosystem, a new community, something I call "The Village Freedom
Project," whereby all people world-wide will have free access to these
tools should they wish to use them. And I am doing all of this upon
the name of Jesus Christ, giving back the best of the talents, skills, and
abilities He gave me, unto Him, and unto mankind.
Nobody has to use my offering. I am doing this for Him, and for all of
those people who will want to use it. I am doing this because I recognize
from where my skills originate, and who it was who gave them to me in the
first place, and I desire to pass along my skills to you, and others, so
that each of you may gain from that which He first gave me.
Best regards,
Rick C. Hodgin