One of the answers to this question says that the following is a good way to read a large binary file without reading the whole thing into memory first:
with open(image_filename, 'rb') as content:
    for line in content:
        # do anything you want
I thought the whole point of specifying 'rb' was that line endings are ignored, so how could for line in content work?
Is this the most "Pythonic" way to read a large binary file or is there a better way?
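For what it's worth, iterating a file opened in 'rb' mode does work: Python splits the stream on b'\n' bytes with no newline translation, so the "lines" can be arbitrarily long for non-text data. A minimal sketch showing this (the filename is just an illustration):

# Write a small binary file containing one b'\n' byte, then iterate it.
with open('demo.bin', 'wb') as f:
    f.write(b'abc\ndef\x00ghi')

with open('demo.bin', 'rb') as f:
    for line in f:
        print(line)  # prints b'abc\n', then b'def\x00ghi'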
I would write a simple helper function to read the file in chunks of whatever size you want:
def read_in_chunks(infile, chunk_size=1024):
    while True:
        chunk = infile.read(chunk_size)
        if chunk:
            yield chunk
        else:
            # The chunk was empty, which means we're at the end
            # of the file.
            return
Then use it as you would use for line in file, like so:
with open(fn, 'rb') as f:
    for chunk in read_in_chunks(f):
        # do your stuff on that chunk...
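As a usage sketch, the chunks can feed anything incremental; for example, hashing a large file without ever holding it all in memory (hashlib is in the standard library; the filename is hypothetical):

import hashlib

sha = hashlib.sha256()
with open('big_image.bin', 'rb') as f:
    for chunk in read_in_chunks(f):
        sha.update(chunk)  # only one chunk is in memory at a time
print(sha.hexdigest())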
BTW: I asked this question 5 years ago, and this is a variant of an answer from that time...
You can also do:

from functools import partial

with open(fn, 'rb') as f:
    for chunk in iter(partial(f.read, numBytes), b''):
        # do your stuff on that chunk...

Note that partial lives in functools, not collections, and in binary mode the sentinel must be b'', since read() returns empty bytes (not an empty str) at end of file.