I am loading an .npy file in Python 3.7. The output looks like this:
>>> import numpy as np
>>> dt = np.load('trajectories.npy')
>>> dt
array({'trajectories': array([[[729.78449821, 391.1702509 ],
        [912.41666667, 315.5       ],
        [832.0577381 , 325.83452381]],
       ...,
       [[852.92      , 174.16253968],
        [923.36053131, 347.92694497],
        [878.89942529, 323.26652299]]]),
       'video_path': 'myPath', 'frames_per_second': 28}, dtype=object)
Given that I am new to numpy ndarrays, the dt object looks like a dictionary to me. However, when I try indexing 'trajectories', I receive an error:
>>> dt['trajectories']
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
IndexError: only integers, slices (`:`), ellipsis (`...`), numpy.newaxis (`None`) and integer or boolean arrays are valid indices
>>> dt.get('trajectories')
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
AttributeError: 'numpy.ndarray' object has no attribute 'get'
When I treat it as an array:
>>> dt[0]
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
IndexError: too many indices for array
When I try converting the array to a tuple, I am told the array is 0-d.
What's going on?
The array that you loaded is actually a scalar: an array object with empty shape (a 0-d array) that wraps a single "non-array" value. In particular, it is a scalar with data type object containing a Python dict, which in turn contains a numeric NumPy array under the key 'trajectories'.
In many cases, NumPy scalars can be used interchangeably with the value they contain (e.g. scalar numbers can be used very much like regular Python numbers). With objects, however, it is more complicated, because the methods of the wrapped object are not exposed through the NumPy scalar. To "unpack" the scalar you can use the item method, which returns the "bare" inner value; after that you can use the object as usual. In your case, you can just do:
dt.item()['trajectories']
And that will give you the internal array in the dictionary.
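A minimal round trip illustrates how this situation arises and how to get out of it. The keys and numbers below just mirror the question's data and are otherwise made up; an in-memory buffer stands in for the 'trajectories.npy' file. Note that loading a pickled object array requires passing allow_pickle=True to np.load (the default became allow_pickle=False in NumPy 1.16.3):

```python
import io
import numpy as np

# A dict like the one in the question (values are hypothetical)
data = {
    'trajectories': np.array([[[729.78, 391.17],
                               [912.42, 315.50]],
                              [[852.92, 174.16],
                               [923.36, 347.93]]]),
    'video_path': 'myPath',
    'frames_per_second': 28,
}

# np.save wraps a non-array object in a 0-d array of dtype object.
# An in-memory buffer stands in for 'trajectories.npy' here.
buf = io.BytesIO()
np.save(buf, data)
buf.seek(0)

# Pickled object arrays need allow_pickle=True when loading.
dt = np.load(buf, allow_pickle=True)
print(dt.shape)  # () -- a 0-d array, which is why dt[0] fails
print(dt.dtype)  # object

# Unpack the scalar with .item() (indexing with an empty tuple,
# dt[()], works too), then use the dict normally.
trajectories = dt.item()['trajectories']
print(trajectories.shape)  # (2, 2, 2)
```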