If a is an int array, (char *)&a[1] - (char *)&a[0] is equal to 4, while &a[1] - &a[0] is equal to 1. Why is that?
Pointer arithmetic operates in units of the size of the type being pointed to. This is because if I do this:
int array[10];
int *p = array;
p++;
I want p pointing at the second int, not some memory halfway in between two elements.
So &a[1] is four bytes away from &a[0] in memory, but &a[1] - &a[0] asks how many ints apart they are. When you cast both pointers to char *, you ask for the same distance in units of char instead.
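For example, here is a quick sketch showing both points; the printed addresses and the 4 are what you would typically see on a platform where sizeof(int) is 4, and %td is the printf conversion for ptrdiff_t, the type a pointer subtraction yields:

#include <stdio.h>

int main(void)
{
    int a[10];
    int *p = a;
    printf("p before: %p\n", (void *)p);
    p++;                                            /* moves forward by sizeof(int) bytes, not 1 */
    printf("p after:  %p\n", (void *)p);
    printf("%td\n", &a[1] - &a[0]);                 /* 1: how many ints apart  */
    printf("%td\n", (char *)&a[1] - (char *)&a[0]); /* 4: how many chars apart */
    return 0;
}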
When you do
&a[1] - &a[0]
since a is an int array, both operands already have type int *, i.e. the expression is
(int *)&a[1] - (int *)&a[0]
Since both are of type pointer to int, their difference is measured in ints, which gives 1.
But when you do
(char *)&a[1] - (char *)&a[0]
then, assuming int is 4 bytes on your compiler (char is always 1 byte), the difference is 4, since each element of a is an int occupying four bytes and the subtraction is now measured in chars.
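Here is a minimal check you can run yourself; it assumes nothing beyond standard C, and the byte distance will come out as whatever sizeof(int) is on your compiler:

#include <stddef.h>
#include <stdio.h>

int main(void)
{
    int a[10];
    ptrdiff_t elems = &a[1] - &a[0];                 /* distance measured in ints  */
    ptrdiff_t bytes = (char *)&a[1] - (char *)&a[0]; /* distance measured in chars */
    printf("elements: %td, bytes: %td, sizeof(int): %zu\n",
           elems, bytes, sizeof(int));
    return 0;
}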