I have made a numpy array out of data from an image. I want to convert the numpy array into a one-dimensional one.
import numpy as np
import matplotlib.image as img

if __name__ == '__main__':
    # Read the image and keep only the first colour channel
    my_image = img.imread("zebra.jpg")[:, :, 0]
    width, height = my_image.shape
    my_image = np.array(my_image)
    img_buffer = my_image.copy()
    img_buffer.reshape(width * height)
    print(img_buffer.shape)
(The input, zebra.jpg, is a 128x128 image.)

However, this program prints (128, 128). I want img_buffer to be a one-dimensional array. How do I reshape it, and why won't numpy actually reshape the array into a one-dimensional one?
.reshape returns a new array, rather than reshaping in place.
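For example, assigning the result back (or using ravel/flatten) gives the one-dimensional array you want. A minimal sketch, with a zero array standing in for the real image data:

    import numpy as np

    my_image = np.zeros((128, 128))  # stand-in for the loaded image
    width, height = my_image.shape

    img_buffer = my_image.copy()
    img_buffer = img_buffer.reshape(width * height)  # keep the returned array
    print(img_buffer.shape)  # (16384,)

    # Equivalent one-liners:
    flat_view = my_image.ravel()    # returns a view where possible
    flat_copy = my_image.flatten()  # always returns a copy
    print(flat_view.shape, flat_copy.shape)  # (16384,) (16384,)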
By the way, you appear to be trying to get a bytestring of the image - you probably want to use my_image.tostring() instead.
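A short sketch of that, assuming you want the raw pixel bytes (in current NumPy, tostring is a deprecated alias for tobytes):

    import numpy as np

    my_image = np.zeros((128, 128), dtype=np.uint8)  # stand-in for the image
    raw = my_image.tobytes()  # same as the older my_image.tostring()
    print(len(raw))  # 16384 == width * height * itemsize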