I have the following snippet:
```ruby
tempfile = Tempfile.new(export_file.filename)
begin
  tempfile.write(contents)
  file_storage_service.store(export_file.filename, tempfile)
ensure
  tempfile.close!
end
```
And the store method is as follows:
```ruby
def store(filename, file)
  client = Aws::S3::Client.new(options)
  object = Aws::S3::Object.new(bucket_name, filename, client: client)
  object.upload_file(file)
end
```
My issue is that I get an Aws::S3::Errors::BadDigest error on one of my cloud machines, but locally this works as expected.
I believe the tempfile is unlinked while the store method is being called, causing AWS to compare two different digests, but I'm not sure about this. I have Ruby 2.1.6 on both machines, the local one running OS X and the cloud one Linux.
What can I do to fix this? And what's the cause of the problem?
P.S.: I've tried both close! and close on the tempfile, with the same results.
It seems that S3's upload_file expects a file handle with the cursor reset to the beginning. Calling tempfile.rewind just before calling store solves this issue.
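For context, here is a minimal sketch (with a hypothetical payload) of why the rewind matters: writing leaves the cursor at end-of-file, so anything that subsequently reads from the same handle (such as a digest computation) sees an empty stream until you rewind.

```ruby
require "tempfile"

tempfile = Tempfile.new("export")
tempfile.write("report data")

# The write leaves the cursor at EOF, so reading here returns "".
puts tempfile.read.inspect  # prints ""

# Rewinding moves the cursor back to the start, so the full
# contents are visible to whoever reads the handle next.
tempfile.rewind
puts tempfile.read.inspect  # prints "report data"

tempfile.close!
```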
OK folks, my problem went even deeper. With tempfile.rewind the error disappeared, but changes made directly to the file were not reflected in memory, so instead of rewind I ended up using:
```ruby
tempfile.close
tempfile.open
```
That fixed my problem completely!
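For reference, a sketch of that final pattern with hypothetical contents: close flushes any buffered writes to disk (without deleting the file), and Tempfile#open reopens the same path with a fresh handle positioned at the start.

```ruby
require "tempfile"

tempfile = Tempfile.new("export")
tempfile.write("final contents")

# close flushes the write buffer to disk; the file itself is kept.
# open reopens the same path with the cursor at the beginning.
tempfile.close
tempfile.open

puts tempfile.read  # prints "final contents"

tempfile.close!  # close and delete the tempfile
```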