How to convert an integer to an ASCII character?

akarui.tomodachi

What is the easiest way to convert an integer value to ASCII
character format?
I tried sprintf(). It works.
Is there any other way to do it?

Objective:
I'd like to convert an integer value of 3 and write it into a string buffer.

What I did:
.....
.....
char myStr[12];
int myInt = 3;
sprintf(myStr, "%d", myInt);
.....
.....

Please comment.
 
serrand

What is the easiest way to convert an integer value to ASCII
character format?
I tried sprintf(). It works.
Is there any other way to do it?

Objective:
I'd like to convert an integer value of 3 and write it into a string buffer.

What I did:
....
....
char myStr[12];
int myInt = 3;
sprintf(myStr, "%d", myInt);
....
....

Please comment.

#define DIGILEN log10 (MAX_INT) +2

char buf[DIGILEN];
sprintf(buf, "%d", int_var);

Xavier
 
serrand

serrand said:
What is the easiest way to convert an integer value to ASCII
character format?
I tried sprintf(). It works.
Is there any other way to do it?

Objective:
I'd like to convert an integer value of 3 and write it into a string buffer.

What I did:
....
....
char myStr[12];
int myInt = 3;
sprintf(myStr, "%d", myInt);
....
....

Please comment.

#define DIGILEN log10 (MAX_INT) +2

char buf[DIGILEN];
sprintf(buf, "%d", int_var);

Xavier

oops... sorry

#define DIGILEN (int)(log10 (MAX_INT) +3)

Your way seems to be the simplest...

sprintf is doing the same job as printf: whereas printf outputs to stdin,
sprintf outputs into its first argument, which has to be an allocated string.

Xavier
 
Lew Pitcher

int i = 3;
char c = i + '0';

???

int i = 300;
char c = i + '0' ; /* nope. not an ascii character */

- --
Lew Pitcher

Master Codewright & JOAT-in-training | GPG public key available on request
Registered Linux User #112576 (http://counter.li.org/)
Slackware - Because I know what I'm doing.
 
CBFalconer

What is the easiest way to convert an integer value to ASCII
character format?
I tried sprintf(). It works.
Is there any other way to do it?

Objective:
I'd like to convert an integer value of 3 and write it into a string buffer.

#include <stdio.h>

/* ---------------------- */

static void putdecimal(unsigned int v, char **s) {
    if (v / 10) putdecimal(v / 10, s);
    *(*s)++ = (v % 10) + '0';
    **s = '\0';
} /* putdecimal */

/* ---------------------- */

int main(void) {
    char a[80];
    char *t, *s = a;

    t = s; putdecimal( 0, &t); puts(s);
    t = s; putdecimal( 1, &t); puts(s);
    t = s; putdecimal(-1, &t); puts(s);
    t = s; putdecimal( 2, &t); puts(s);
    t = s; putdecimal(23, &t); puts(s);
    t = s; putdecimal(27, &t); puts(s);
    return 0;
} /* main */

 
Thad Smith

> What is the easiest way to convert an integer value to ASCII
> character format?

Lew said:
???

int i = 300;
char c = i + '0' ; /* nope. not an ascii character */

Interesting. Neither is the earlier code converting 3 guaranteed to
produce ASCII.
 
Kenny McCormack


???

int i = 300;
char c = i + '0' ; /* nope. not an ascii character */

The OP was asking how to do it with 3, not 300. You need to keep up.
 
Martin Jørgensen

Lew said:
???

int i = 300;
char c = i + '0' ; /* nope. not an ascii character */

It works for single digits, right?


Best regards / Med venlig hilsen
Martin Jørgensen
 
Lew Pitcher


stathis said:
Assuming ASCII it does.

Assuming any conforming C implementation, it does. The C standard guarantees it.

 
Ben Pfaff

Lew Pitcher said:
Assuming any conforming C implementation, it does. The C
standard guarantees it.

The C standard guarantees that decimal digits are sequential and
in the proper order. The C standard doesn't guarantee that the
execution character set is ASCII. The OP asked to convert an
integer value to *ASCII* character format specifically.

Here's a portable way to get a single ASCII digit: 48 + num.
 
stathis gotsis

Flash Gordon said:
Assuming an implementation that conforms to the C standard it does,
whether it is ASCII or not. It's one of the few things the C standard
guarantees about the execution character set.

I was not aware of that, thanks for the correction.
 
Thad Smith

Flash said:
Assuming an implementation that conforms to the C standard it does,
whether it is ASCII or not.

Check Ben's point elsewhere in the thread. The OP defined "works" as
producing ASCII. While the code in question produces the corresponding
digit character in the execution set, it only produces the correct ASCII
character if the execution set is ASCII.
 
Dave Thompson

serrand wrote:
#define DIGILEN log10 (MAX_INT) +2

char buf[DIGILEN];
sprintf(buf, "%d", int_var);
oops... sorry

#define DIGILEN (int)(log10 (MAX_INT) +3)
In C89 an array bound must be a constant expression, and no function
call, even to the standard library, qualifies. In C99 this is still
true for an object with static duration, but if your code snippet is
entirely within a function (not shown) and thus automatic, this is
legal, though rather inefficient. A C89-legal and (probably) much more
efficient method is to approximate the digits needed for the maximum
value that can be represented in the object size:
sizeof(int) * CHAR_BIT / 3 + slop_as_needed
(each decimal digit consumes about 3.32 bits, so bits/3 is a safe
overestimate of the digit count)
Your way seems to be the simplest...

sprintf is doing the same job as printf: whereas printf outputs to stdin

stdout. Frequently stdin, stdout, and stderr all refer to the/an
interactive terminal or console or window or whatever, but they need not.

sprintf outputs into its first argument, which has to be an allocated string

- David.Thompson1 at worldnet.att.net
 
