 

Python - Best data structure for incredibly large matrix

I need to create about 2 million vectors with 1,000 slots each (each slot holds a single integer).

What would be the best data structure for working with this amount of data? It could be that I'm over-estimating the amount of processing/memory involved.

I need to iterate over a collection of files (about 34.5 GB in total) and update the vectors each time one of the 2 million items (each corresponding to a vector) is encountered on a line.

I could easily write code for this, but I know it wouldn't be efficient enough to handle this volume of data, which is why I'm asking you experts. :)

Best, Georgina



1 Answer

You might be memory bound on your machine. On mine, without even closing other running programs, the following wouldn't fit into memory:

import numpy

# 1,000,000 x 1,000 integers is already ~8 GB (dtype=int is typically
# 64-bit on a 64-bit platform); the full 2,000,000-row array from the
# question would need roughly twice that.
a = numpy.zeros((1000000, 1000), dtype=int)

But in general, if you can break the problem up so that you don't need the entire array in memory at once, or if most entries stay zero, I would go with numpy (scipy.sparse for the sparse representation).
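For instance, here is a minimal sketch of the sparse route with scipy.sparse; the item-to-row mapping and the line format are assumptions made purely for illustration:

import numpy as np
from scipy import sparse

n_items, n_slots = 2_000_000, 1000

# DOK (dict-of-keys) format is cheap to create and cheap to update
# one element at a time; convert to CSR at the end for fast
# arithmetic and row slicing.
counts = sparse.dok_matrix((n_items, n_slots), dtype=np.int64)

# Hypothetical item-to-row mapping, e.g. built up front from the
# known list of 2 million items.
row_of = {"some-item": 0}

def update(item, slot):
    counts[row_of[item], slot] += 1

update("some-item", 42)
counts = counts.tocsr()

This only pays off if the vectors really are mostly zeros; if every slot ends up populated, a sparse matrix costs more than a dense one.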

Also, you could store the data on disk in HDF5 with h5py or PyTables, or in netCDF4 with netcdf4-python, and then access only the portions you need.
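As a rough sketch of the on-disk route with h5py (the file name, chunk shape, and row index here are illustrative choices, not requirements):

import h5py

# A chunked, compressed dataset on disk; only the chunks you touch
# are read into memory.
with h5py.File("vectors.h5", "w") as f:
    dset = f.create_dataset("vectors", shape=(2_000_000, 1000),
                            dtype="int64", chunks=(128, 1000),
                            compression="gzip")

    # Update one vector: read the row, modify it, write it back.
    row = dset[12345, :]
    row[7] += 1
    dset[12345, :] = row

Reading and writing whole rows at a time keeps the access pattern aligned with the chunk layout, which matters a lot for HDF5 performance.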


JoshAdel


