I am moving data from a MySQL DB to a PostgreSQL DB using Python. My code looks like this:
import psycopg2
import MySQLdb

conn = psycopg2.connect("dbname='aaa' user='aaa' host='localhost' password='aaa'")
curp = conn.cursor()
db = MySQLdb.connect(host="127.0.0.1", user="root", passwd="root", unix_socket='/var/mysql/mysql.sock', port=3306)
cur = db.cursor()
cur.execute('select * from aaa_pledge')
rows = cur.fetchall()
for l in rows:
    print l
    for i in range(len(l)):
        print i, l[i]
    qry = "insert into aaa_pledge values ('%s','%s','%s','%s',%s,%s,'%s',%s,%s,%s,%s,'%s')" % (l[0], l[1], l[2], l[3], l[4], l[5], l[6], str(l[7]), l[8], l[9], l[10], l[11])
    print qry
    res = curp.execute(qry)
    print res
conn.commit()
curp.close()
cur.close()
The code runs fine for rows with valid dates and prints output like this:
('500055', 'NEERAJ SINGAL', datetime.date(2014, 12, 4), 'NEERAJ SINGAL', Decimal('57221712.00'), Decimal('25.2600000'), 'INVOCATION', datetime.date(2014, 12, 4), Decimal('0E-7'), Decimal('0E-7'), Decimal('0E-7'), '-')
0 500055
1 NEERAJ SINGAL
2 2014-12-04
3 NEERAJ SINGAL
4 57221712.00
5 25.2600000
6 INVOCATION
***7 2014-12-04***
8 0E-7
9 0E-7
10 0E-7
11 -
insert into aaa_pledge values ('500055','NEERAJ SINGAL','2014-12-04','NEERAJ SINGAL',57221712.00,25.2600000,'INVOCATION','2014-12-04',0E-7,0E-7,0E-7,'-')
But when the source row has NULLs, the tuple contains a None:
('500055', 'NEERAJ SINGAL', datetime.date(2014, 11, 26), 'NEERAJ SINGAL', Decimal('57394244.00'), Decimal('25.3300000'), 'INVOCATION', ***None,*** Decimal('0E-7'), Decimal('0E-7'), Decimal('0E-7'), '-')
0 500055
1 NEERAJ SINGAL
2 2014-11-26
3 NEERAJ SINGAL
4 57394244.00
5 25.3300000
6 INVOCATION
***7 None***
8 0E-7
9 0E-7
10 0E-7
11 -
insert into aaa_pledge values ('500055','NEERAJ SINGAL','2014-11-26','NEERAJ SINGAL',57394244.00,25.3300000,'INVOCATION','None',0E-7,0E-7,0E-7,'-')
Traceback (most recent call last):
File "pransfer_data.py", line 27, in <module>
res = curp.execute( qry)
psycopg2.DataError: invalid input syntax for type date: "None"
LINE 1: ...EERAJ SINGAL',57394244.00,25.3300000,'INVOCATION','None',0E-...
I read some notes about using %s rather than '%s', and also about using str(l[7]). However, in each case, if the code runs fine for valid dates it fails for NULLs, and vice versa. I am going to face this problem for other integer values as well.
Is there a generic way to handle this?
PS - I have a lot of date and numeric columns where this is going to be required. I will not be able to write an if/then/else for each of the columns.
Thanks, Manish
What you need to do is invoke the parameterized form of psycopg2's cursor.execute method: pass in the query (with %s placeholders, without interpolating values into it) as the first argument, and a tuple containing your parameter values as the second. Something like:
insertQuery = 'INSERT INTO foo VALUES (%s, %s, %s)'
cursor.execute(insertQuery, (1, 'Alice', datetime.date(2014, 12, 4)))
Consider the following silly test table:
rchang=> \d foo
Table "public.foo"
Column | Type | Modifiers
--------+-----------------------------+-----------
id | integer |
name | text |
ts | timestamp without time zone |
With the following Python code performing insertions:
insertQuery = 'INSERT INTO foo VALUES (%s, %s, %s)'
conn = psycopg2.connect(connString)
cursor = conn.cursor()
cursor.execute(insertQuery, (1, 'Alice', datetime.date(2014, 12, 4)))
cursor.execute(insertQuery, (2, 'Bob', None))
conn.commit()
conn.close()
Then we get the following result in the database:
rchang=> select * from foo;
id | name | ts
----+-------+---------------------
1 | Alice | 2014-12-04 00:00:00
2 | Bob |
(2 rows)
Note that when you use the parameterized version, None gets properly inserted as a NULL value in PostgreSQL.
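The same None-to-NULL adaptation can be demonstrated end to end with Python's stdlib sqlite3 module (used here only because it ships with Python and needs no server; note that sqlite3 uses ? placeholders where psycopg2 uses %s, but the parameter-passing principle is identical):

```python
import sqlite3

# In-memory database standing in for the real PostgreSQL target.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE foo (id INTEGER, name TEXT, ts TEXT)")

insert_query = "INSERT INTO foo VALUES (?, ?, ?)"  # sqlite3's placeholder is ?, not %s
cur.execute(insert_query, (1, "Alice", "2014-12-04"))
cur.execute(insert_query, (2, "Bob", None))  # None is bound as SQL NULL, not the string 'None'
conn.commit()

cur.execute("SELECT id, name, ts FROM foo ORDER BY id")
print(cur.fetchall())  # [(1, 'Alice', '2014-12-04'), (2, 'Bob', None)]
```

Because the driver binds the values itself, no quoting or type-by-type if/else logic is needed in your migration loop.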
You need a NULLIF and a CAST:
CREATE TABLE foo(bar date);
INSERT INTO foo(bar) VALUES('2015-01-01');
INSERT INTO foo(bar) VALUES('None'); -- fails
INSERT INTO foo(bar) VALUES(CAST(NULLIF('None', 'None') AS DATE)); -- works fine
INSERT INTO foo(bar) VALUES(CAST(NULLIF('2015-01-01', 'None') AS DATE)); -- works fine as well
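The NULLIF part of this trick can be verified quickly from Python with the stdlib sqlite3 module (sqlite3 supports NULLIF too; the CAST ... AS DATE part is PostgreSQL-specific and is omitted here):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# NULLIF(a, b) returns NULL when a equals b, otherwise it returns a.
cur.execute("SELECT NULLIF('None', 'None')")
print(cur.fetchone()[0])  # None: the sentinel string collapses to SQL NULL

cur.execute("SELECT NULLIF('2015-01-01', 'None')")
print(cur.fetchone()[0])  # '2015-01-01': real values pass through unchanged
```

This is a workaround for strings that have already been interpolated; the parameterized-query approach above avoids the problem at the source.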