I was going over this code and noticed a cast that looks weird:
oldidt = (unsigned long long *)(*(unsigned long*)(oldidtr+1));
To me it looks like the first cast affects the +1, so it will move 4 bytes (the size of a pointer), and the second cast is there so the result becomes an unsigned long long *. The star wrapped around the inner cast is a "dereference this memory" star.
So the +1 will still be a 4 byte jump. Couldn't it then just be written as oldidt = *(oldidt+1);? (Assuming the compiler doesn't complain and gives out an exec.)

The declaration of oldidt was:
static unsigned long long *oldidt;
I'm calling the casts "first" and "second" in order of invocation (the leftmost one is the second).
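Unpacked into separate steps, the expression reads like this. This is just a sketch, and it assumes oldidtr is an array of unsigned short, since its declaration isn't shown here:

unsigned short *p = oldidtr + 1;         /* the +1 uses oldidtr's own type, so it advances by sizeof(unsigned short), i.e. 2 bytes */
unsigned long v = *(unsigned long *)p;   /* first (inner) cast plus dereference: read the 4 bytes at that address as an integer (on a 32-bit target) */
oldidt = (unsigned long long *)v;        /* second (outer) cast: reinterpret that integer as a pointer */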
It looks like this code is using several unsigned short ints to store a pointer in two halves. This assumes that an unsigned short int is half as wide as a pointer. The code is extracting the stored pointer from two adjacent shorts, namely the second and third members of an array, and interpreting it as a pointer to an unsigned long long int.
So, the final cast is necessary to reinterpret an integer as a pointer, while the first (inner) cast serves to read a differently typed value from an existing variable, namely a long from a short (or rather, from two adjacent shorts).
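Here is a minimal user-space sketch of the same trick. Everything in it is made up for illustration: it mirrors the layout described above (a 32-bit value stored across the second and third elements of an unsigned short array) and assumes a target where both unsigned long and pointers are 4 bytes wide, as on 32-bit x86.

#include <stdio.h>
#include <string.h>
#include <stdint.h>

int main(void)
{
    unsigned short oldidtr[3] = {0, 0, 0};  /* slot 0: some other 16-bit field; slots 1 and 2: a 32-bit value */
    uint32_t stored = 0xDEADBEEFu;          /* stand-in for the pointer value */

    /* Store the value split across the two adjacent shorts. */
    memcpy(&oldidtr[1], &stored, sizeof stored);

    /* The pattern from the question: +1 skips the first short (2 bytes),
       then the inner cast reads the next 4 bytes as one integer. Note that
       this read is misaligned and violates strict aliasing; x86 tolerates
       it, but it is not portable C. */
    uint32_t extracted = *(uint32_t *)(oldidtr + 1);

    printf("stored    = 0x%08x\n", (unsigned)stored);
    printf("extracted = 0x%08x\n", (unsigned)extracted);  /* same value back */
    return 0;
}

As an aside, the identifiers oldidt and oldidtr hint that this is x86 interrupt-table code: the sidt instruction stores a 6-byte record consisting of a 16-bit limit followed by a 32-bit base address, so oldidtr+1 would skip the limit and land on the base. That is a guess from the names, not something the question states.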