Stewart Gordon wrote:
[ ... ]
> How many BASIC dialects were made to be compiled rather than
> interpreted?
The question wasn't about dialects, but about the language. The language
was originally designed to be compiled to native code, and for that
matter the first implementations DID compile to native code.
> And anyway, whether it uses GC on strings is surely
> implementation dependent.
It was _designed_ to use GC. You can argue that it might be possible to
implement it in other ways if you want, but then the same is true of
other things as well. There definitely HAS been at least one Lisp
interpreter that didn't use GC, but that doesn't change the original
design or intent.
> Two options come to mind:
> - use GC, copy on write (maybe this is one instance in which reference
> counting would be the more efficient option)
Reference counting isn't an alternative to garbage collection -- rather,
it's one technique for _implementing_ garbage collection. And copy on
write doesn't replace the reference counts; it depends on them to know
when a string is shared.
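
To make that concrete, here's a minimal sketch (in C++, purely for
illustration -- the CowString name and layout are hypothetical, not taken
from any actual BASIC runtime) of how copy on write leans on the
reference count: the count is what tells a write whether it can happen in
place or must copy first.

#include <cstddef>
#include <cstring>

// Hypothetical COW string (illustration only): assignment shares the
// buffer and bumps a count; mutation copies only if the buffer is shared.
class CowString {
    struct Rep {
        int refs;          // how many CowStrings share this buffer
        std::size_t len;
        char* data;
    };
    Rep* rep;

    void release() {
        if (--rep->refs == 0) {   // last owner frees the buffer
            delete[] rep->data;
            delete rep;
        }
    }

public:
    explicit CowString(const char* s) {
        rep = new Rep{1, std::strlen(s), nullptr};
        rep->data = new char[rep->len + 1];
        std::memcpy(rep->data, s, rep->len + 1);
    }
    CowString(const CowString& other) : rep(other.rep) { ++rep->refs; }
    CowString& operator=(const CowString& other) {
        ++other.rep->refs;        // bump first so self-assignment is safe
        release();
        rep = other.rep;
        return *this;
    }
    ~CowString() { release(); }

    // The reference count decides whether a write must copy first.
    void setChar(std::size_t i, char c) {
        if (rep->refs > 1) {      // shared: copy before writing
            Rep* fresh = new Rep{1, rep->len, new char[rep->len + 1]};
            std::memcpy(fresh->data, rep->data, rep->len + 1);
            --rep->refs;
            rep = fresh;
        }
        rep->data[i] = c;         // now unshared, safe to write in place
    }
    const char* c_str() const { return rep->data; }
};

Note how setChar consults the count before mutating: take the reference
count away and copy on write has no way of knowing whether the buffer is
shared at all.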
> - copy on assignment, deallocate when it goes out of scope (in some
> primitive BASICs that don't have procedures, this would mean deallocate
> never, or perhaps only when assigned the value "")
Given the memory constraints of the time (remember, BASIC has been
around since 1964), this simply wasn't a realistic option.
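
For contrast, a sketch of the copy-on-assignment option under the same
caveats (hypothetical EagerString, C++ just for illustration): every
assignment pays for a full deep copy, and the destructor frees the buffer
deterministically when the variable leaves scope.

#include <cstddef>
#include <cstring>

// Hypothetical copy-on-assignment string (illustration only): every
// assignment makes a full deep copy; the destructor frees the buffer
// deterministically at end of scope -- no GC involved.
class EagerString {
    std::size_t len;
    char* data;

    static char* dup(const char* s, std::size_t n) {
        char* p = new char[n + 1];
        std::memcpy(p, s, n + 1);
        return p;
    }

public:
    explicit EagerString(const char* s)
        : len(std::strlen(s)), data(dup(s, len)) {}
    EagerString(const EagerString& o) : len(o.len), data(dup(o.data, o.len)) {}
    EagerString& operator=(const EagerString& o) {
        char* copy = dup(o.data, o.len);  // copy first: self-assignment safe
        delete[] data;
        data = copy;
        len = o.len;
        return *this;
    }
    ~EagerString() { delete[] data; }     // freed at end of scope, not by GC

    const char* c_str() const { return data; }
};

Deterministic and GC-free, sure, but every A$ = B$ costs a complete copy
of B$ -- which is exactly the memory cost that wasn't realistic on the
machines of the period.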
The bottom line: BASIC strings were designed with GC in mind, and
virtually every BASIC implementation ever has used exactly that.
There may theoretically be other options, but none of that changes the
intent of the original design, nor the fact that the vast majority of
BASIC implementations _have_ used GC. It is true that there have been
quite a few BASIC interpreters, but it's also true that the language was
designed to be compiled, and many implementations have done exactly that.