fermineutron
I am trying to write a function that will convert a large ASCII number, say 100 ASCII digits, to its binary representation. Every algorithm I try to think of seems backwards. In a normal pen-and-paper decimal-to-binary conversion, one starts by subtracting the largest power of 2 that fits into the given number, working from left to right and filling memory with 1s and 0s. In my case I can't really do that, because the largest power of 2 that would fit into the number I am trying to convert is too large even for a 64-bit int.
Any help, suggestions, or sample code would be greatly appreciated.
Let's assume I have an array of char type, every element of which holds the value of the next digit of the ASCII number.
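The closest I have gotten is repeated long division of the digit array by 2, collecting the remainders as bits; it produces the bits backwards (least significant first), so they have to be reversed at the end. Below is a rough sketch of that idea, assuming the digits are stored most significant first as values 0-9 rather than ASCII characters; the function names and the example input are just mine, and I haven't convinced myself this is the best way:

#include <stdio.h>

/* Divide the digit array (most significant digit first, values 0-9,
 * not ASCII characters) by 2 in place; return the remainder (0 or 1). */
static int div2(unsigned char *digits, size_t len)
{
    int carry = 0;
    for (size_t i = 0; i < len; i++) {
        int cur = carry * 10 + digits[i];
        digits[i] = (unsigned char)(cur / 2);
        carry = cur % 2;
    }
    return carry;
}

/* Return 1 once every digit is zero, i.e. the number is used up. */
static int is_zero(const unsigned char *digits, size_t len)
{
    for (size_t i = 0; i < len; i++)
        if (digits[i])
            return 0;
    return 1;
}

int main(void)
{
    /* Example input: the decimal number 1000000007 as digit values. */
    unsigned char digits[] = { 1, 0, 0, 0, 0, 0, 0, 0, 0, 7 };
    size_t len = sizeof digits;

    char bits[400];   /* 100 decimal digits need at most ~333 bits */
    int nbits = 0;

    /* Each division by 2 peels off the lowest remaining bit. */
    while (!is_zero(digits, len))
        bits[nbits++] = (char)('0' + div2(digits, len));

    if (nbits == 0)   /* the input was zero */
        bits[nbits++] = '0';

    /* Bits came out least significant first; print them reversed. */
    for (int i = nbits - 1; i >= 0; i--)
        putchar(bits[i]);
    putchar('\n');
    return 0;
}

Each pass over 100 digits is cheap, and at most about 333 passes are needed (100 decimal digits is roughly 333 bits), so even this naive version should be fast enough. Is this the right direction, or is there a cleaner way?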
Thanks ahead