Statement too large while pushing data into Redshift database from Python

I am pushing a pandas dataframe into a Redshift table and getting the following error:

cur.execute("INSERT INTO sir_main VALUES " + str(args_str))
psycopg2.ProgrammingError: Statement is too large. Statement Size: 58034743
bytes. Maximum Allowed: 16777216 bytes

It halts the execution. Is there any way to configure this limit while pushing into the database?
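(Editor's note: the 16 MB cap is on the size of a single SQL statement, so one workaround, not mentioned in the original post, is to split the rows across several smaller INSERTs. A minimal sketch of a byte-budgeted batcher, assuming each row is a plain tuple; the size estimate via `str()` is rough, not the exact bytes psycopg2 will send:)

```python
def chunk_rows(rows, max_bytes=16 * 1024 * 1024):
    """Yield batches of rows whose rendered VALUES text stays under max_bytes.

    The estimate uses str(tuple(row)), which approximates the literal that
    ends up in the INSERT statement; it is a heuristic, so leave headroom.
    """
    batch, size = [], 0
    for row in rows:
        rendered = len(str(tuple(row))) + 1  # +1 for the separating comma
        if batch and size + rendered > max_bytes:
            yield batch
            batch, size = [], 0
        batch.append(row)
        size += rendered
    if batch:
        yield batch
```

Each batch can then be sent as its own INSERT (or via `psycopg2.extras.execute_values`, which also pages rows for you with its `page_size` argument), so no single statement exceeds the limit.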

Neil asked Oct 19 '25 01:10


1 Answer

If you are loading more than a few hundred rows, you should save the dataframe as a flat file to S3 and load it into Redshift using COPY. https://docs.aws.amazon.com/redshift/latest/dg/r_COPY.html
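(Editor's note: a minimal sketch of that S3 + COPY flow, not from the original answer. The bucket name, key, and IAM role ARN are hypothetical placeholders, and the upload assumes `boto3` is installed and credentials are configured:)

```python
import io

def build_copy_sql(table, bucket, key, iam_role):
    """Compose a Redshift COPY statement for a CSV object in S3."""
    return (
        f"COPY {table} FROM 's3://{bucket}/{key}' "
        f"IAM_ROLE '{iam_role}' CSV"
    )

def copy_dataframe_to_redshift(df, conn, table, bucket, key, iam_role):
    """Upload df to S3 as CSV, then bulk-load it with a single COPY."""
    import boto3  # AWS SDK, used only for the upload step

    buf = io.StringIO()
    df.to_csv(buf, index=False, header=False)
    boto3.client("s3").put_object(Bucket=bucket, Key=key, Body=buf.getvalue())
    with conn.cursor() as cur:
        cur.execute(build_copy_sql(table, bucket, key, iam_role))
    conn.commit()
```

COPY loads the file server-side in parallel, so the 16 MB SQL statement limit never applies: the statement itself stays tiny no matter how large the data file is.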

Joe Harris answered Oct 21 '25 14:10


