Are you acknowledging that text mode is useful?
I can see how it can be in certain cases. On the whole, no. There are
simply too many variables in input files to rely upon something that may
or may not make your task easier. It's easier and better (from a debugging
and maintenance point of view) to account for the variables yourself. If
there is a later problem you can step through your code exactly and then
know explicitly what was causing the error.
If C had defined specified sizes for predefined types, then int would
probably be 16 bits and long would be 32 (the sizes they had in early
PDP-11 implementations).
That's no problem. I have created variables in my RDC which relate to
the bit size. Had these been introduced from the beginning, they would've
been straightforward to anyone approaching the language:
s = signed, u = unsigned, f = floating point
s8, u8
s16, u16
s32, u32
s64, u64
f32, f64
And with modern additions like the double-double and quad-double floating
point extensions using floating point hardware:
f128, f256
And with modern extensions like MPFR:
fN
And other types which can be added.
Or perhaps not. The first edition of K&R, the book that defined the
language in 1978, showed int with a size of 16 bits on the PDP-11, 36
bits on the Honeywell 6000, and 32 bits on the IBM 370 and Interdata
8/32. None of those platforms had a 64-bit integer type, because 64-bit
integer arithmetic was not supported on the hardware of the time.
Which of those choices would you want to impose on all C implementations
for all platforms?
I would NEVER have defined a type like CHAR or INT or LONG. It would ALWAYS
have been of the bit size and signedness. That way I would know exactly
what I was using no matter the platform.
On the other hand, if you want fixed-width types, you can use int8_t,
int16_t, int32_t, and int64_t, defined in <stdint.h>, which was added to
the language by the 1999 standard.
When was C created? These should've been added far earlier, as first-class
citizens of the language, native throughout.
I suggest you need to be more familiar with what's already been done.
Reinventing the wheel is fine, but you may find that someone has already
figured out how to make it round.
It's lame. Even the 1999 syntax int16_t is lame. I'm expected to type that
many characters in for every use. We are human beings. We deal with
particular things. And no matter what a compiler can understand, we read
source code. The requirement of having that kind of declaration is silly.
I even struggle with s8 being two characters while s16 and later are three.
I have tried various alternatives to make s8 three characters so they're
all of equal width.
The purpose of RDC is an integrated system which provides extensive GUI
support for developers. Some things have to be clunky in source code to
convey the intended meaning accurately ... however, they don't have to be
shown to the developer that way. The GUI will replace a lot of clunky code
requirements with more reader-friendly forms which convey the same meaning.
The VVM I'm building and its associated language, RDC, are part of an
exceedingly flexible and expandable platform. RDC is one language, as is
VXB++ (the xbase portion). They will exist initially, but there is nothing
to prevent a fully compliant C, C++, BASIC, or any other language from also
being added and used with the same tools, provided a few crucial components
are added to support it.
There are better ways to serve human beings than rigid mechanics which make
sense logically, but are completely non-intuitive for daily use.
This is all my opinion. Everyone else may disagree. But for me it's been
such a strong driving force that the last 20 years of my life have been
spent in pursuit of changing it. For the record, I've only made some
progress and have hit wall after wall after wall in trying to move forward,
the enemy of this world pounding on me, preventing me from proceeding because
my goals are to serve people, not money.
Best regards,
Rick C. Hodgin