I believe I'm misunderstanding something that should probably be simple.
I'm trying to accept arguments passed to a Python script. The kind of argument I'm expecting is something along the lines of "\xaa\xbb\xcc\xdd", and it keeps getting converted to binary (I think?) instead of letting me print it out exactly as it was passed.
How can I get this to do what I'm looking for? I'd ultimately like to take this and convert it to something like 0xddccbbaa, but I'd at least like to get the first step done by being able to interpret it.
For example, I don't want printing \x75 to print out a u. I want to be able to interpret \x75 as \x75. Is there any easy way to do this?
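Roughly what I'm working with, as a minimal sketch (assuming Python 3; script.py is just a placeholder name):

import sys

# Run as: python script.py "\xaa\xbb\xcc\xdd"
arg = sys.argv[1]
# Depending on how the shell quotes the argument, the \xNN escapes may arrive
# literally or already expanded into characters; I want to end up with the
# literal \xaa\xbb\xcc\xdd text either way.
print(arg)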
Some demonstrations with struct:
x = b"\xaa\xbb\xcc\xdd"
import struct
struct.unpack('I',x)
Out[3]: (3721182122,)
y = struct.unpack('I',x)
y[0]
Out[5]: 3721182122
hex(y[0])
Out[6]: '0xddccbbaa'
Essentially: treat the bytestring as a 4-byte unsigned integer ('I'). struct handles turning it into an int, and you can use hex() to get a hex string representation of it (or something like '{:x}'.format(y[0]), if you prefer). Note that plain 'I' uses the machine's native byte order, which is little-endian on most common platforms; use '<I' if you want little-endian explicitly.
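Putting that together with the command-line side of the question, here is a rough end-to-end sketch. It assumes Python 3 and that the argument reaches the script as the literal \xNN escape text, and it uses '<I' to force little-endian order explicitly rather than relying on the machine's native order:

import struct
import sys

# Example invocation (hypothetical): python script.py "\xaa\xbb\xcc\xdd"
literal = sys.argv[1]

# Turn the literal \xNN escape text into real bytes: 'unicode_escape'
# interprets each \xNN sequence, and 'latin-1' maps the resulting
# characters back onto single bytes.
raw = literal.encode('ascii').decode('unicode_escape').encode('latin-1')

# '<I' = 4-byte unsigned integer, little-endian, independent of platform.
(value,) = struct.unpack('<I', raw)
print(hex(value))    # prints 0xddccbbaa for the example input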