From: Kevin Bowers
Newsgroups: cmu.cs.class.cs213
Subject: Pointer Arithmetic
Date: Mon, 27 Feb 2006 17:48:28 -0500

There has been a lot of confusion regarding pointer arithmetic. It was
obvious from the last quiz that most students do not understand the
material. Below is an explanation of how pointer arithmetic works,
specifically with reference to the last quiz. This was tested on one of
the fish machines (64 bit).

Take the following:

  int * A[2][3];   (initialized to actual values)

A is an array of 2 rows of 3 int pointers. In an expression, A decays
to a pointer to a row, of type int *(*)[3].

  sizeof(A) = 48 = 8 x 2 x 3   (8-byte pointers x 2 rows x 3 columns)

  A    = 0xe6b0
  A+1  = 0xe6c8 = 0xe6b0 + 0x18 = 0xe6b0 + 24 (the size of one row)
  A[1] = 0xe6c8

So A+1 and A[1] have the same value. However, their types do not
match: A+1 is a pointer to a whole row (int *(*)[3]), while A[1] decays
to a pointer to its first element (int **). If you try to compile
(A+1 == A[1]) you get a warning to that effect.

A[1] is an array of 3 int pointers; in an expression it decays to
int **.

  sizeof(A[1]) = 24 = 8 x 3   (8-byte pointers x 3 columns)

  A[1]+2     = 0xe6d8
  &(A[1][2]) = 0xe6d8

A[1]+2 = &(A[1][2]). Both have type int **, so checking that
A[1]+2 == &(A[1][2]) compiles without error. *(A[1]+2) == A[1][2]
also works.

When computing equivalence, it is important to look at both the type
and the computation.

  Things of type int *(*)[3] (pointer to a row):
    A, A+1, &(A[1])
  Things of type int ** (pointer to an element):
    A[1], A[1]+2, *(A+1), (int**)A+1, &(A[1][2])
  Things of type int * (an element):
    A[1][2], *(A[1]+2), **(A+1), ((int**)A)[5], *(((int**)A)+5)

Because A decays to a pointer to a row, adding to A steps through the
rows. Similarly, A[1] decays to a pointer to an element, so adding to
A[1] steps through the columns of that row. However, in both cases you
must be careful to match types:

  (int**)(A+1) == A[1] == *(A+1) == &A[1][0]

Similarly, if you first cast A and then add to it, you can treat the
matrix as a flat array:

  ((int**)A)+5 == A[1]+2 == &(A[1][2])
  *(((int**)A)+5) == *(A[1]+2) == A[1][2]

This works because a matrix allocated this way is stored sequentially
in memory, so memory looks like:

  -------------------------------------------------------------
  | A[0][0] | A[0][1] | A[0][2] | A[1][0] | A[1][1] | A[1][2] |
  -------------------------------------------------------------

Let's step through an example to make sure things are clear.

  (((int**)(A+1))+2) == &A[1][2]

Why is this true? Looking at the left side: first we add 1 to A, which
gives us the address of A[1][0], but with the wrong type (a pointer to
the whole row). We then cast it to the type of a pointer to an
element, which is int **. From there we can add 2, which moves us 2
elements forward in the row. This gives us the address of A[1][2],
also written &A[1][2]. Similarly, we can dereference the left side to
get the value in A[1][2], which is of type int *:

  *(((int**)(A+1))+2) == A[1][2]

We had to cast (A+1) before adding 2, or we would have gotten
(A+1)+2 = A+3, which is not only of the wrong type, but also points
where row A[3] would start, out of bounds of the array.

Hopefully this provides you with a little insight into pointer
arithmetic. Remember to check both the value of the pointer and its
type before determining equivalence. As always, feel free to email
your TA if there is something about pointer arithmetic, or any other
topic, which does not make sense.

Kevin
kbowers+cs213@cs.cmu.edu

"A bus stops at a bus station, a train stops at a train station, on my
desk I have a workstation..."
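
P.S. If you want to verify these relationships for yourself, below is
a small self-contained test program along these lines (a minimal
sketch: the backing array "data" and its layout are just for
illustration, any initialization will do). On a 64-bit machine the two
sizeof lines should print 48 and 24, and every comparison should
print 1.

  #include <stdio.h>

  int main(void) {
      int data[6];       /* something real for the pointers to point at */
      int *A[2][3];      /* 2 rows x 3 columns of int pointers */

      /* Initialize A to actual values, row by row. */
      for (int i = 0; i < 2; i++)
          for (int j = 0; j < 3; j++)
              A[i][j] = &data[i * 3 + j];

      printf("sizeof(A)    = %zu\n", sizeof(A));     /* 48 on 64-bit */
      printf("sizeof(A[1]) = %zu\n", sizeof(A[1]));  /* 24 on 64-bit */

      /* Same address; the cast matches the types, so no warning. */
      printf("%d\n", (int**)(A + 1) == A[1]);

      /* Stepping through the columns of a row. */
      printf("%d\n", A[1] + 2 == &A[1][2]);
      printf("%d\n", *(A[1] + 2) == A[1][2]);

      /* Treating the matrix as a flat array of 6 int pointers. */
      printf("%d\n", ((int**)A) + 5 == &A[1][2]);
      printf("%d\n", *(((int**)A) + 5) == A[1][2]);

      /* The worked example from above. */
      printf("%d\n", (((int**)(A + 1)) + 2) == &A[1][2]);
      printf("%d\n", *(((int**)(A + 1)) + 2) == A[1][2]);

      return 0;
  }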