I am typing the following code to convert a binary number to decimal.
It usually works fine with small numbers, but with some numbers it
reads strange values:
#include <stdio.h>
#include <stdlib.h>

int main()
{
    int bin, dec = 0, num = 1;

    scanf("%d", &bin);               // read the binary digits as a decimal int
    printf("%d\n", bin);             // look at the value it prints!!

    dec += (bin % 10) * 1;           // lowest digit, weight 1
    bin /= 10;
    while (bin > 1)
    {
        dec += (bin % 10) * num * 2; // next digit, weight num*2
        num *= 2;
        bin /= 10;
    }
    dec += bin * num * 2;            // the leading 1 digit

    printf("%d\n", dec);
    system("pause");
    return 0;
}
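(Tracing by hand, unless I'm slipping up somewhere: with input 1010, the
first step adds (1010 % 10) * 1 = 0, the loop then adds 1 * 1 * 2 = 2 and
0 * 2 * 2 = 0, and the last line adds the leading digit as 1 * 4 * 2 = 8,
so dec ends up 0 + 2 + 0 + 8 = 10, which is correct for binary 1010.)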
When I debug or print the value of bin with an input of 10011001100,
the values it prints are very different: 1421066508 for bin and 2028
for dec! What's the problem? Is it integer overflow, maybe?
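To check that suspicion (assuming int is 32 bits here, which I haven't
verified), the numbers do seem to line up: 10011001100 is larger than
INT_MAX (2147483647), and if the extra high bits just get discarded, the
value that survives is 10011001100 mod 2^32. A quick sketch:

#include <stdio.h>

int main(void)
{
    // Assumes long long is at least 64 bits (guaranteed since C99).
    // 10011001100 does not fit in a 32-bit int; if the conversion
    // wraps modulo 2^32, the stored value is exactly what I see:
    long long typed   = 10011001100LL;
    long long wrapped = typed % 4294967296LL;   // 4294967296 == 2^32
    printf("%lld\n", wrapped);                  // prints 1421066508
    return 0;
}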
_____ (also posted in comp.lang.c++)