As a concrete example of a situation where values are
represented without any individual "bit-artifacts" being
present, consider the data stream between two modems. If I
understand correctly (I'm not a modem maven), each change in
the signal encodes an entire group of bits: one phase shift
represents 000, another phase shift represents 010, and so
on. "Value bits" are transmitted, but no isolated "signal
bits" are discernible.
There are several different encoding mechanisms used by modems
designed for telephone lines (the term 'modem' covers a number of
other situations as well). The first standards, for 110 and 300 baud,
were straight single-frequency carriers with no phase encoding; 1200
baud was, if I recall correctly, dual-frequency but not phase-encoded.
All of the CCITT standard encodings from 2400 baud upwards rely upon
phase encoding. The proprietary Telebit protocols worked rather
differently (128 or 256 frequency channels, but individual signals
held for long periods).
It is not exactly accurate to say that -each- change in the signal
encodes a group of bits; rather, the signal is sampled at several
intervals along the cycle, and the details of the "phase breaking"
imposed on the base sine wave over that interval encode a cluster of
bits in a completely non-linear, non-binary manner [i.e., there is no
single aspect of the phase changes that encodes any one particular
bit].
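To make that concrete, here is a toy demodulator in Python. It assumes
a pure carrier whose phase is constant over each symbol interval,
which real modems cannot assume (they also track amplitude, timing,
and inter-symbol effects); the carrier frequency, sample rate, and bit
table are all invented for illustration:

    import math

    CARRIER_HZ = 1800.0           # invented; real standards vary
    SAMPLE_RATE = 9600.0
    SAMPLES_PER_SYMBOL = 32       # exactly 6 carrier cycles per symbol

    PHASES = [0, 45, 90, 135, 180, 225, 270, 315]
    BITS = ["000", "001", "011", "010", "110", "111", "101", "100"]

    def symbol_phase(samples):
        """Estimate one symbol's phase (degrees) by correlating the whole
        interval against reference cosine and sine waves (I/Q products)."""
        i = q = 0.0
        for n, s in enumerate(samples):
            t = n / SAMPLE_RATE
            i += s * math.cos(2 * math.pi * CARRIER_HZ * t)
            q -= s * math.sin(2 * math.pi * CARRIER_HZ * t)
        return math.degrees(math.atan2(q, i)) % 360

    def bits_for(samples):
        """Map the measured phase to a 3-bit group.  Note that no single
        sample or zero crossing corresponds to any one output bit."""
        deg = symbol_phase(samples)
        k = min(range(8), key=lambda j: min(abs(deg - PHASES[j]),
                                            360 - abs(deg - PHASES[j])))
        return BITS[k]

    # One symbol's worth of samples with a 135-degree phase offset:
    wave = [math.cos(2 * math.pi * CARRIER_HZ * n / SAMPLE_RATE
                     + math.radians(135))
            for n in range(SAMPLES_PER_SYMBOL)]
    print(bits_for(wave))   # -> "010"
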
Especially at the higher speeds, extra "logical" bits are encoded
into the signal, and the decoded group of bits is run through a
"constellation" corrector to find the most probable original bit
group. In theory the constellation corrector could map to a bit
grouping that has no obvious resemblance to the bits read off the
signal: given the non-linear encoding of bits, and allowing for some
slosh while the signal slews, the constellation correction could
decide that a different non-linear decoding should have been chosen.
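As a sketch of that "most probable" mapping, here is a toy corrector
in Python that simply snaps a noisy measured point to the nearest
valid point of a made-up 16-point constellation. Real modem correctors
(V.32 and later) use trellis coding and search across several symbols
rather than one point at a time:

    import itertools

    # Invented square constellation: each valid point owns a 4-bit group.
    LEVELS = [-3, -1, 1, 3]
    CONSTELLATION = {
        point: format(index, "04b")
        for index, point in enumerate(itertools.product(LEVELS, LEVELS))
    }

    def correct(i, q):
        """Return the bit group of the valid point nearest to (i, q).
        A badly sloshed sample can land nearer to some other valid point,
        yielding bits with no obvious resemblance to a naive readout."""
        nearest = min(CONSTELLATION,
                      key=lambda p: (p[0] - i) ** 2 + (p[1] - q) ** 2)
        return CONSTELLATION[nearest]

    print(correct(0.8, 1.3))    # snaps to (1, 1)
    print(correct(2.6, -3.4))   # snaps to (3, -3)
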
Another example of non-binary encoding is gigabit Ethernet, which
starts with the signaling mechanisms used for 100-megabit Ethernet,
raises the carrier frequency from 100 megahertz to 125 megahertz, and
then applies phase encoding and constellation correction that extract
10 data bits from every 16 signal values decoded out of the wave.
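For comparison, the 100-to-125 megahertz step comes from the 4B/5B
code that 100-megabit Ethernet already uses: every 4 data bits travel
as a 5-bit code group, so 100 Mbit/s of data needs 125 Mbaud on the
wire. A short Python sketch of that expansion (only a few entries of
the standard 4B/5B table are shown; gigabit's own code groups, the
10-from-16 extraction described above, are different and not
reproduced here):

    # 4B/5B: each 4-bit nibble travels as a 5-bit code group, which is
    # why 100 Mbit/s of data occupies 125 Mbaud (100 * 5/4 = 125).
    FOUR_B_FIVE_B = {             # a few entries of the standard table
        "0000": "11110",
        "0001": "01001",
        "0010": "10100",
        "1111": "11101",
    }

    def encode(nibbles):
        return [FOUR_B_FIVE_B[n] for n in nibbles]

    print(encode(["0000", "1111"]))   # 8 data bits -> 10 line bits
    print(100e6 * 5 / 4)              # 125000000.0 -- the 125 MHz above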