
Large file upload to Amazon S3 failing after 30-second limit set by Heroku

I store my uploaded files on Amazon S3 (via the aws-s3 gem) with the following call:

AWS::S3::S3Object.store(params[:uploadfile].original_filename, open(params[:uploadfile]), 'mybucket', :access => :private, :content_type => params[:uploadfile].content_type)

I can upload files up to 30 MB without a problem. I have read in other posts that this could be because the file is being loaded into memory (which confuses me). The largest file I am going to upload is 40 MB; how can I achieve this without the upload failing?

My Chrome browser returns the following error:

Error 101 (net::ERR_CONNECTION_RESET): The connection was reset.

When I tried uploading from my development machine (localhost), I could upload large files of 80-100 MB; however, it does not work from Heroku, and I don't understand why, because I am uploading the files directly to S3.

Strangely, my uploads fail after 30 seconds, which is the timeout limit that Heroku sets; however, I do not receive any timeout or failed-upload error in the Heroku logs.

Thank you for your help.

Hishalv asked Dec 31 '25 11:12

1 Answer

After many months on this issue, I found a gem that works well by uploading directly to Amazon S3, without any complex Flash or JavaScript stuff. It also integrates with CarrierWave. The gem is called carrierwave_direct.
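For reference, the uploader-side change is small, as I recall it from the gem's README (the `VideoUploader` name is a placeholder of mine; check the carrierwave_direct README for the matching view and controller wiring):

```ruby
# Sketch only: assumes the carrierwave and carrierwave_direct gems are
# installed; "VideoUploader" is a hypothetical name for illustration.
class VideoUploader < CarrierWave::Uploader::Base
  include CarrierWaveDirect::Uploader  # switches the uploader to direct-to-S3 uploads
end
```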

It works without a problem; however, if you are using Rails 3.0.x, check out this page for a solution.

If you are using Rails 3.1.x, you are all set to go.
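To illustrate why direct upload sidesteps the dyno timeout: your server only signs a short-lived URL (fast), and the browser then sends the bytes straight to S3, so the long transfer never passes through Heroku. Below is a minimal sketch of S3's legacy query-string signing (Signature Version 2, the scheme of that era); the credentials and object key are made up for illustration:

```ruby
require "openssl"
require "base64"
require "cgi"

# Hypothetical credentials and object key, for illustration only.
access_key = "AKIAEXAMPLE"
secret_key = "secretexample"
bucket     = "mybucket"
key        = "uploads/bigfile.mov"
expires    = Time.now.to_i + 900  # URL valid for 15 minutes

# SigV2 query-string auth: sign
# "VERB\nContent-MD5\nContent-Type\nExpires\nResource" with the secret key.
string_to_sign = "PUT\n\n\n#{expires}\n/#{bucket}/#{key}"
signature = Base64.strict_encode64(
  OpenSSL::HMAC.digest("sha1", secret_key, string_to_sign)
)

url = "https://#{bucket}.s3.amazonaws.com/#{key}" \
      "?AWSAccessKeyId=#{access_key}" \
      "&Expires=#{expires}" \
      "&Signature=#{CGI.escape(signature)}"

puts url  # hand this URL to the browser; it PUTs the file straight to S3
```

The signing is pure local computation (an HMAC over a few header fields), which is why it completes in milliseconds regardless of file size.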

Hishalv answered Jan 02 '26 02:01


