is it possible to cast a one dimensional array to a two dimensional array?

SamL

Is it possible to cast a one dimensional array to a two dimensional
array? For example, cast int a[100] to int a[10][10]. If that is not
possible, is there a way that allows me to use a[100] as b[10][10]? I
do not want to use it as a pointer to pointer, I need to use it as a
true two dimensional array. Thanks.
 
James Kuyper

SamL said:
Is it possible to cast a one dimensional array to a two dimensional
array? For example, cast int a[100] to int a[10][10]. If that is not
possible, is there a way that allows me to use a[100] as b[10][10]? I
do not want to use it as a pointer to pointer, I need to use it as a
true two dimensional array. Thanks.

You cannot cast arrays, you can only cast values of scalar types. A
pointer has a scalar type, so you can cast a pointer to one type into a
pointer to an array, which should do the job. Thus, you can do:

int (*b)[10] = (int(*)[10]) a;

Then b[row][col] == a[row*10+col], for valid values of row and col.

That looks like the cast is being applied to 'a', but in C an array name
gets automatically converted, in most cases, into a pointer to the first
element of the array, and that's exactly what happens here, so the
conversion is actually being performed on a pointer, not an array.

Technically, the standard says nothing about where 'b' points. However,
in practice it will almost certainly point at the same location in
memory, so this should work.
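
For illustration, a minimal sketch along these lines (the fill loop and
the indices printed are just examples; the array size matches the one in
the question):

#include <stdio.h>

int main(void)
{
    int a[100];

    /* Fill the flat array so that a[i] == i. */
    for (int i = 0; i < 100; i++)
        a[i] = i;

    /* View the same storage as 10 rows of 10 ints. */
    int (*b)[10] = (int (*)[10]) a;

    /* b[row][col] should match a[row*10 + col]. */
    printf("%d %d\n", b[3][7], a[3*10 + 7]);   /* expected: 37 37 */

    return 0;
}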

Since, in many ways, pointers function a lot like pointers, this may
meet your needs.
 
Spiros Bousbouras

SamL said:
Is it possible to cast a one dimensional array to a two dimensional
array? For example, cast int a[100] to int a[10][10]. If that is not
possible, is there a way that allows me to use a[100] as b[10][10]? I
do not want to use it as a pointer to pointer, I need to use it as a
true two dimensional array. Thanks.

You cannot cast arrays, you can only cast values of scalar types. A
pointer has a scalar type, so you can cast a pointer to one type into a
pointer to an array, which should do the job. Thus, you can do:

int (*b)[10] = (int(*)[10]) a;

Then b[row][col] == a[row*10+col], for valid values of row and col.

That looks like the cast is being applied to 'a', but in C an array name
gets automatically converted, in most cases, into a pointer to the first
element of the array, and that's exactly what happens here, so the
conversion is actually being performed on a pointer, not an array.

Technically, the standard says nothing about where 'b' points. However,
in practice it will almost certainly point at the same location in
memory, so this should work.

Don't you mean the standard says nothing about where b[i]
points for i>0? b[0] is safe but b[1] technically is UB, is it
not?
 
James Kuyper

Spiros said:
SamL said:
Is it possible to cast a one dimensional array to a two dimensional
array? For example, cast int a[100] to int a[10][10]. If that is not
possible, is there a way that allows me to use a[100] as b[10][10]? I
do not want to use it as a pointer to pointer, I need to use it as a
true two dimensional array. Thanks.
You cannot cast arrays, you can only cast values of scalar types. A
pointer has a scalar type, so you can cast a pointer to one type into a
pointer to an array, which should do the job. Thus, you can do:

int (*b)[10] = (int(*)[10]) a;

Then b[row][col] == a[row*10+col], for valid values of row and col.

That looks like the cast is being applied to 'a', but in C an array name
gets automatically converted, in most cases, into a pointer to the first
element of the array, and that's exactly what happens here, so the
conversion is actually being performed on a pointer, not an array.

Technically, the standard says nothing about where 'b' points. However,
in practice it will almost certainly point at the same location in
memory, so this should work.

Don't you mean the standard says nothing about where b[i]
points for i>0? ...


No, I meant what I said. The only relevant thing the standard says is "A
pointer to an object or incomplete type may be converted to a pointer to
a different object or incomplete type. If the resulting pointer is not
correctly aligned for the pointed-to type, the behavior is undefined.
Otherwise, when converted back again, the result shall compare equal to
the original pointer. When a pointer to an object is converted to a
pointer to a character type, the result points to the lowest addressed
byte of the object. Successive increments of the result, up to the size
of the object, yield pointers to the remaining bytes of the object."
(6.3.2.3p7).

Pay careful attention to that statement: it only specifies where the
resulting pointer points when it is converted to a pointer to a
character type. 'int[10]' isn't a character type.
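
For comparison, a rough sketch of the character-type case that the quoted
paragraph does specify (the variable names here are just for
illustration):

#include <stdio.h>

int main(void)
{
    int x = 1;
    unsigned char *p = (unsigned char *) &x;   /* lowest addressed byte of x */

    /* Successive increments, up to sizeof x, reach the remaining bytes. */
    for (size_t i = 0; i < sizeof x; i++)
        printf("%02x ", (unsigned) p[i]);
    putchar('\n');

    return 0;
}
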
... b[0] is safe but b[1] technically is UB, is it not?

The issue you're thinking about would apply only to conversion in the
opposite direction:

int c[10][10];
int *d = c[0];

Given the above code, d+i is undefined behavior for i<0 or i>10, while
d[i] has undefined behavior for i<0 or i>9. People often say that this
is only a problem if an implementation supports run-time array-bounds
checking using fat pointers, or some similar technique. However, it's
perfectly possible for this code to malfunction in a more conventional
context. Because the behavior is undefined, a compiler could produce
code optimized on the assumption that d[i] cannot alias any element of c[j]
for j!=0; optimizations that will produce unexpected results if it does
in fact alias any element of any other row of c. For instance, consider
the following code:

for(int i=20; i<30; i++)
    d[i] = 2*c[2][3];

Because d[i] has undefined behavior for i>9, an implementation is free
to ignore what the range of values of 'i' actually is, and to optimize
the above loop by moving the evaluation of c[2][3] out of the loop,
producing code equivalent to:

int temp = 2*c[2][3];
for(int i=20; i<30; i++)
    d[i] = temp;

Now, the original code might have been written by a developer who didn't
think about the aliasing, in which case the above optimization might
have been exactly what was intended; but that's just an (un)lucky
accident. However, that kind of optimization would not have been
permitted in code where the aliasing was legal.

Converting a pointer to the first element of a one-dimensional array of
int into a pointer to an array of 10 ints doesn't raise any of the same
issues that apply to c and d above.
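
To make the aliasing concrete, here is a rough sketch (illustrative only:
as discussed above, indexing d past the end of c[0] is exactly the
undefined behavior in question):

#include <stdio.h>

int main(void)
{
    int c[10][10] = {0};
    int *d = c[0];

    c[2][3] = 42;

    /* In the usual flat row-major layout, d[23] occupies the same storage
     * as c[2][3].  The standard only guarantees d[i] for 0 <= i <= 9,
     * so an optimizer may assume the two never alias. */
    printf("%d\n", d[2*10 + 3]);   /* usually prints 42, but technically UB */

    return 0;
}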
 
SamL

SamL said:
Is it possible to cast a one dimensional array to a two dimensional
array? For example, cast int a[100] to int a[10][10]. If that is not
possible, is there a way that allows me to use a[100] as b[10][10]? I
do not want to use it as a pointer to pointer, I need to use it as a
true two dimensional array. Thanks.

This may suit your needs:

---------------------------------------------------------------------------
#include <stdio.h>

int main(void)
{
    int a[9] = {1, 2, 3, 4, 5, 6, 7, 8, 9};
    int (*b)[3] = (int (*)[3]) a;    /* view a[9] as 3 rows of 3 ints */

    printf("Show me 4! %d\n", ((int (*)[3]) a)[1][0]);   /* a[1*3 + 0] */
    printf("Again! %d\n", b[1][0]);

    return 0;
}

Thanks a lot. It works.
 
David Thompson

You cannot cast arrays, you can only cast values of scalar types. A
pointer has a scalar type, so you can cast a pointer <snip>
int (*b)[10] = (int(*)[10]) a; // a is int[100]

Then b[row][col] == a[row*10+col], for valid values of row and col.

That looks like the cast is being applied to 'a', but in C an array name
gets automatically converted, in most cases, into a pointer <snip>
Since, in many ways, pointers function a lot like pointers, this may
meet your needs.

Obviously you meant pointers like arrays -- or arrays like pointers.
(Or both!)
 
