
Convert MySQL query result to JSON using Python

Tags: python, json, mysql

I want to monitor a MySQL database to track its performance, so I executed the query show global status; This returns a list of variables like Uptime, Aborted_connects, etc.

But I want to export the result in JSON format. Can I achieve that using a Python script?

What I have done till now is,

  1. Exported the result of the query to a .csv file.
  2. Tried to convert that CSV file to JSON using the following Python code:
import csv
import json

csvfile = open('./tempp/op.csv', 'r')
jsonfile = open('./tempp/op.json', 'w')
fieldnames = ("Variable_name", "Value")
reader = csv.DictReader(csvfile, fieldnames)
for row in reader:
    json.dump(row, jsonfile)
    jsonfile.write('\n')

Issues:

  1. The above script does not give the desired result. It produces JSON like:
{"Value": null, "Variable_name": "Variable_name\tValue"}
{"Value": null, "Variable_name": "Aborted_clients\t0"}
{"Value": null, "Variable_name": "Aborted_connects\t7"}
{"Value": null, "Variable_name": "Binlog_cache_disk_use\t0"}
{"Value": null, "Variable_name": "Binlog_cache_use\t0"}
{"Value": null, "Variable_name": "Binlog_stmt_cache_disk_use\t0"}
{"Value": null, "Variable_name": "Binlog_stmt_cache_use\t0"}
  2. It also seems like a bad approach to first write the result to a file and then read it back. Is there a better way to convert the MySQL query result to a JSON object directly?
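For issue 2, one option is to query MySQL directly and serialize the rows, skipping the CSV file entirely. A minimal sketch: since SHOW GLOBAL STATUS returns (name, value) pairs, they map naturally onto a dict. The mysql-connector-python driver and the credentials in the comment are assumptions, not part of the question.

```python
import json

def status_to_json(rows):
    # SHOW GLOBAL STATUS returns (Variable_name, Value) pairs,
    # which map directly onto a single JSON object.
    return json.dumps(dict(rows), indent=4, sort_keys=True)

# Against a real server you would fetch the rows with a driver such as
# mysql-connector-python (assumed installed; credentials are placeholders):
#
#   import mysql.connector
#   conn = mysql.connector.connect(user='root', password='secret', host='localhost')
#   cur = conn.cursor()
#   cur.execute('SHOW GLOBAL STATUS')
#   print(status_to_json(cur.fetchall()))

sample = [('Aborted_clients', '0'), ('Aborted_connects', '7')]
print(status_to_json(sample))
```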

Edit based on answers: My op.csv file looks like this:

Variable_name   Value
Aborted_clients 0
Aborted_connects    7
Binlog_cache_disk_use   0
Binlog_cache_use    0
Binlog_stmt_cache_disk_use  0
Binlog_stmt_cache_use   0
Bytes_received  31075
Bytes_sent  1891186
Com_admin_commands  445
Com_assign_to_keycache  0
Com_alter_db    0
Com_alter_db_upgrade    0
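The file above is tab-separated, which explains issue 1: DictReader's default comma delimiter leaves the whole line in the first field, and passing fieldnames explicitly turns the header row into a data row. A minimal sketch of the fix, passing delimiter='\t' and letting DictReader read the header; an in-memory sample stands in for ./tempp/op.csv:

```python
import csv
import io
import json

# In-memory stand-in for the tab-separated ./tempp/op.csv export.
csvdata = io.StringIO("Variable_name\tValue\nAborted_clients\t0\nAborted_connects\t7\n")

# delimiter='\t' splits on tabs; with no fieldnames argument,
# DictReader takes the field names from the header row.
reader = csv.DictReader(csvdata, delimiter='\t')
rows = list(reader)
print(json.dumps(rows, indent=4))
```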
asked Apr 08 '26 12:04 by Prashant Prabhakar Singh

2 Answers

Give this a try, buddy:

import csv
import json

rows = []
with open('test.csv', 'r') as f:
    reader = csv.DictReader(f)
    for row in reader:
        rows += [row]

print(json.dumps(rows, sort_keys=True, indent=4, separators=(',', ': ')))

The extra arguments to json.dumps are just there so the output is formatted nicely.

So, for an input like the following:

Heading1,Heading2,Heading3
Row1Value1,Row1Value2,Row1Value3
Row2Value1,Row2Value2,Row2Value3
Row3Value1,Row3Value2,Row3Value3

You should get this:

[
    {
        "Heading1": "Row1Value1",
        "Heading2": "Row1Value2",
        "Heading3": "Row1Value3"
    },
    {
        "Heading1": "Row2Value1",
        "Heading2": "Row2Value2",
        "Heading3": "Row2Value3"
    },
    {
        "Heading1": "Row3Value1",
        "Heading2": "Row3Value2",
        "Heading3": "Row3Value3"
    }
]
answered Apr 11 '26 00:04 by initialed85


Pandas is a very flexible library for working with data. You can download it here: LINK

Once you have it:

import pandas as pd
from pandas.io import sql
import sqlite3

conn = sqlite3.connect('example.db')
e = pd.read_sql('select * from my_table;',conn)
b = e.to_json()

print(b)

I am working with sqlite3 here, but I guess it works the same for all SQL databases.

There is also the function pd.read_sql_table to read directly from a table; however, for that you would need SQLAlchemy.

answered Apr 11 '26 00:04 by Testing man


