
read_sql() from MySQL is extremely slow

I am using MySQL with pandas and SQLAlchemy, but reading is extremely slow. A simple query like the one below takes more than 11 minutes to complete on a table with 11 million rows. What could improve this performance? The table has no primary key and is indexed on only one column.

from sqlalchemy import create_engine
import pandas as pd

sql_engine_access = 'mysql+pymysql://root:[password]@localhost'
sql_engine = create_engine(sql_engine_access, echo=False)
script = 'select * from my_database.my_table'
df = pd.read_sql(script, con=sql_engine)
Roberto asked Oct 21 '25 22:10
1 Answer

You can try out our tool connectorx (pip install -U connectorx). It is implemented in Rust and targets improving the performance of pandas.read_sql. The API is basically the same as pandas. For your case, the code would look like this:

import connectorx as cx
conn_url = "mysql://root:[password]@localhost:port/my_database"
query = "select * from my_table"
df = cx.read_sql(conn_url, query)

If your query result has an evenly distributed numerical column, like ID, you can speed things up further by leveraging multiple cores like this:

df = cx.read_sql(conn_url, query, partition_on="ID", partition_num=4)

This splits the entire query into four smaller ones by filtering on the ID column, and connectorx runs them in parallel. Check out the project documentation for more usage and examples.
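To make the partitioning idea concrete, here is a minimal sketch in plain Python of the kind of range splitting described above. The helper name and its signature are illustrative assumptions for this answer, not connectorx internals:

```python
def partition_queries(query, column, col_min, col_max, partition_num):
    """Split a query into range-filtered sub-queries on a numeric column.

    Illustrative sketch only; connectorx performs this kind of split
    internally and runs the resulting queries in parallel.
    """
    step = (col_max - col_min + 1) // partition_num
    queries = []
    for i in range(partition_num):
        lo = col_min + i * step
        # The last partition absorbs any remainder of the range.
        hi = col_max if i == partition_num - 1 else lo + step - 1
        queries.append(f"{query} WHERE {column} >= {lo} AND {column} <= {hi}")
    return queries

# Example: split IDs 1..100 across four partitions.
subqueries = partition_queries("select * from my_table", "ID", 1, 100, 4)
for q in subqueries:
    print(q)
```

Because each sub-query scans a disjoint ID range, the four results can be fetched concurrently and concatenated into one DataFrame.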

Here is the benchmark result for loading 60M rows x 16 columns from MySQL into a pandas DataFrame using 4 cores (time and memory charts omitted).

Xiaoying Wang answered Oct 23 '25 12:10