Non-commercial, non-professional user, seeking advice for hobbyist project.
I'm trying to pull data from one Pi, and scroll it on a second Pi.
However, the terminal shows repeats of the data that was initially loaded, not the fresh data I can see in the database. (For info: the database is updated automatically every ten minutes. For tests like this, I run a manual update script to force new readings into the database.)
#!/usr/bin/env python
import signal
import time
import scrollphathd
from scrollphathd.fonts import font5x7
import mysql.connector
from mysql.connector import Error

con = mysql.connector.connect(host='192.168.###############')

str_len = 0
scroll_x = 0
timer = 2  # number of minutes for the display loop

while True:
    ### Create a cursor for each database element.
    ### Keep results in the buffer with buffered=True:
    curT = con.cursor(buffered=True)
    curY = con.cursor(buffered=True)
    curL = con.cursor(buffered=True)

    ### Use each cursor to read the required data from the database:
    curT.execute('SELECT Temp FROM readings ORDER BY Added DESC LIMIT 1')
    curY.execute('SELECT Yaxis FROM readings ORDER BY Added DESC LIMIT 1')
    curL.execute('SELECT Lux FROM readings ORDER BY Added DESC LIMIT 1')

    ### Get rid of the trailing comma from each SELECT result:
    resultT = [row[0] for row in curT.fetchall()]
    resultY = [row[0] for row in curY.fetchall()]
    resultL = [row[0] for row in curL.fetchall()]

    ### Not essential, but let's show the result
    ### of each SELECT query in the terminal:
    print(resultT)
    print(resultY)
    print(resultL)

    # set loop time in seconds
    start = time.time()
    end = start + (timer * 60)

    ### Set strings for display on the Scroll pHAT HD from the SELECT results:
    while time.time() < end:
        temperature = resultT[0]
        yaxis = resultY[0]
        lux = resultL[0]

        ### Dim down the Scroll pHAT HD and clear its buffer:
        scrollphathd.set_brightness(0.1)
        scrollphathd.clear()

        ### Comment out the line below to stop rotating the Scroll pHAT HD by 180 degrees (upside down):
        scrollphathd.rotate(degrees=180)

        ### Uncomment the line below to test all data on the Scroll pHAT HD in one go:
        # str_len = scrollphathd.write_string(" :-) %.1fC Y%i L%i " % (temperature, yaxis, lux), brightness=0.5)

        ### Check light levels and door angle (Yaxis) and report appropriately.
        ### Always show the temperature:
        if lux <= 100 and yaxis >= 3500:
            str_len = scrollphathd.write_string("Garage: light off & door closed. %.1fC Y%i " % (temperature, yaxis), x=0, y=0, font=font5x7)
        elif lux <= 100 and yaxis < 500:
            str_len = scrollphathd.write_string("Garage: Light off & door open. %.1fC " % (temperature), x=0, y=0, font=font5x7)
        elif lux > 100 and yaxis < 500:
            str_len = scrollphathd.write_string("Garage: Light on & door open. %.1fC " % (temperature), x=0, y=0, font=font5x7)
        elif lux > 100 and yaxis >= 3500:
            str_len = scrollphathd.write_string("Garage: Light on & door closed. %.1fC " % (temperature), x=0, y=0, font=font5x7)
        else:  # 500 <= yaxis < 3500, whatever the light level
            str_len = scrollphathd.write_string("Garage door ajar %.1fC " % (temperature), x=0, y=0, font=font5x7)

        scrollphathd.scroll_to(scroll_x, 0)
        scrollphathd.show()
        time.sleep(0.01)
        scroll_x += 1
        if scroll_x >= str_len:
            scroll_x = 0
What do I need to change to make the display show fresh data from the database, rather than repeatedly showing old, stale data?
Thank you.
EDIT: I do wonder if the problem is in the cursor buffers, and whether they need flushing somehow between each SELECT loop?
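The EDIT's hunch is close, but the snapshot belongs to the transaction rather than the cursor buffers: with InnoDB's default REPEATABLE READ isolation, and with autocommit disabled (mysql-connector-python's default), the first SELECT pins a consistent snapshot that every later SELECT on the same connection keeps re-reading. A minimal sketch of one common fix, calling commit() before each round of reads so the next SELECT starts a fresh snapshot (the fetch_latest helper is an illustrative name; the table and column names mirror the question, and the helper works against any DB-API connection):

```python
def fetch_latest(con, column):
    """Return the newest value of `column` from the readings table."""
    # End the previous transaction; the next SELECT then sees a fresh snapshot.
    con.commit()
    cur = con.cursor()
    cur.execute('SELECT %s FROM readings ORDER BY Added DESC LIMIT 1' % column)
    row = cur.fetchone()
    cur.close()
    return row[0] if row else None
```

Alternatively, pass autocommit=True to mysql.connector.connect(), so each SELECT runs in its own transaction and always sees current data.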
I ran into this same problem with MySQL from different notebooks running Python: I was seeing obvious cases where queries from one notebook were not seeing data being posted by the others.
This is a very simple solution, but I find it works well.
Just set up a fresh connection, process the SQL, and close the connection at the end. The code in Python looks like:
import mysql.connector

def do_SQL_Stuff(rds_Dict):
    # Get RDS connection
    cnx_GDAX = mysql.connector.connect(
        host=rds_Dict['host'],
        user=rds_Dict['user'],
        password=rds_Dict['password'],
        database=rds_Dict['database'])
    # Do SQL stuff here, collecting the results into r_Stuff
    # Close RDS connection
    cnx_GDAX.close()
    return r_Stuff
This works well in my code and completely removed the issue.
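If you take this reconnect-per-query route, it is worth guaranteeing that the connection closes even when the query raises. A sketch using contextlib.closing (query_latest and the connect-factory argument are illustrative names, not part of the answer above; the pattern works with any DB-API driver, mysql.connector included):

```python
from contextlib import closing

def query_latest(connect, sql):
    # Open a fresh connection so the query sees current data,
    # and guarantee both cursor and connection close, even on error.
    with closing(connect()) as con:
        with closing(con.cursor()) as cur:
            cur.execute(sql)
            return cur.fetchall()
```

For the question's setup this might be called as query_latest(lambda: mysql.connector.connect(**rds_Dict), 'SELECT Temp FROM readings ORDER BY Added DESC LIMIT 1') once per display loop.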