onkar said:
How many bytes will be allocated by the following code?
#include<stdio.h>
#define MAXROW 3
#define MAXCOL 4
int main(int argc, char **argv)
{
    int (*p)[MAXCOL];
    p = (int (*)[MAXCOL])malloc(sizeof(*p) * MAXROW);
    printf("%d\n", sizeof(*p));
    return 0;
}
The number of bytes allocated by a call to malloc() is simply the
value of the argument passed to malloc(). (That's assuming the
allocation succeeds; if it fails, malloc() returns a null pointer, and
you should *always* check for that.)
*p is of type int[MAXCOL], or int[4], so sizeof(*p) is 4*sizeof(int).
Multiplying by MAXROW, or 3, gives us 12*sizeof(int), so that's the
number of bytes allocated. (We don't know what sizeof(int) is on your
system.)
But there are some problems with your code.
Don't cast the result of malloc(). If you call malloc(), you must
have a "#include <stdlib.h>" to make its declaration visible.
(Casting the result can mask the error message triggered by calling
malloc() with no visible declaration. It's like cutting the wires to
the oil light on your car's dashboard rather than adding oil; either
solution will turn off the warning light, but only one actually fixes
the problem.)
printf's "%d" format expects an int argument. sizeof yields a result
of type size_t. To print a size_t value, you can convert it to some
type that printf knows about. For example:
printf("%lu\n", (unsigned long)sizeof *p);
C99 has "%zu", which accepts a size_t value directly, but not all
implementations yet support it.
You may be thinking that the value printed reflects the number of
bytes allocated. It doesn't. If p points to the first element of an
array, sizeof(*p) gives you just the size of that element, not the
size of the entire array.
Arrays and pointers are tricky, and pointers to arrays are seldom
useful. I suggest reading section 6 of the comp.lang.c FAQ,
<http://www.c-faq.com/>.