 

Serial data logger, parse on the fly

I am trying to slap together a Python data logger and am looking for some advice. Here is my situation. I don't want any user intervention. Just plug in and record. So far I am using the PySerial module and am able to record all serial data to a text file just fine. I read one line, store in a variable, then write that variable to a file in append mode.

Now I would like to transmit this data to the web for graphing. Here is where my headache comes in. Should I try to send this to a SQL server or just to a text file? I am using a 3G module, but speeds/bandwidth are limited. Data streams to the logger at 1 sample every 0.25 sec. Here is an example:

1 324 23454 2342 0 233 0 0 12223 66453 443 33 33 20 0 0 0 0

So I don't think I need to send EVERY sample, maybe just one line per second or every other second.
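For example, keeping only one sample out of every four (one line per second at 4 samples/sec) can be done with a simple counter. This is just an illustrative sketch, not part of the original question:

```python
SEND_EVERY = 4  # data arrives at 4 samples/sec; forward 1 per second

def downsample(lines, every=SEND_EVERY):
    """Yield every `every`-th line from an iterable of samples."""
    for i, line in enumerate(lines):
        if i % every == 0:
            yield line

samples = ["sample%d" % n for n in range(8)]
print(list(downsample(samples)))  # ['sample0', 'sample4']
```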

The logger will only run for about 5 days, and then all data will be recorded. Now my questions are: are there any recommendations on how/where I should send the data? SQL or a text file? And next, if I have this data online, is there an easy way to plot it in real time while it is streaming?

And for the icing on the cake, sometimes the raw data will be space delimited, sometimes tab delimited, and sometimes comma delimited.

Any input would be appreciated!

asked Dec 06 '25 06:12 by user1876087

1 Answer

OK, so you have a bunch of questions/issues here. I will try to address each of them:

Database

Keep each line in a table in a database. It will make your life much easier. I recommend using something that can handle big loads, like MySQL or Postgres.

I would suggest the following table design: line is your data, uploaded is a flag indicating whether or not this line has been uploaded to the server yet, and line_date records the exact date and time the reading was taken, which might be useful to capture. If your data lines are variable in length and you don't want to set a fixed limit, change the VARCHAR(200) to TEXT.

Keep this table on the client side and use the uploaded field as a flag for whether each line has reached the server. That way you won't lose data during network connectivity issues, and you can keep track of which lines are still pending an upload. You can then have one script that just inserts lines, and another script or thread that reads the not-yet-uploaded rows and uploads them every second or so. You can use pretty much the same table design on the server for simplicity.

CREATE TABLE data_lines (
    id INT NOT NULL AUTO_INCREMENT PRIMARY KEY,
    line VARCHAR(200),
    uploaded INT,
    line_date DATETIME
);
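The two-script setup described above can be sketched roughly like this. For the sake of a self-contained example this uses Python's built-in sqlite3 instead of MySQL, and `upload_fn` is a hypothetical stand-in for whatever actually pushes a line over the 3G link (e.g. an HTTP POST):

```python
import sqlite3
from datetime import datetime

conn = sqlite3.connect(":memory:")  # use a real file path on the logger
conn.execute("""CREATE TABLE data_lines (
    id INTEGER PRIMARY KEY AUTOINCREMENT,
    line TEXT,
    uploaded INTEGER DEFAULT 0,
    line_date TEXT)""")

def record_line(line):
    """Logger script: insert each reading with uploaded = 0."""
    conn.execute(
        "INSERT INTO data_lines (line, uploaded, line_date) VALUES (?, 0, ?)",
        (line, datetime.now().isoformat()))
    conn.commit()

def upload_pending(upload_fn):
    """Uploader script: push pending rows, flag them only on success."""
    rows = conn.execute(
        "SELECT id, line FROM data_lines WHERE uploaded = 0").fetchall()
    for row_id, line in rows:
        if upload_fn(line):  # hypothetical transfer over the 3G link
            conn.execute(
                "UPDATE data_lines SET uploaded = 1 WHERE id = ?", (row_id,))
    conn.commit()

record_line("1 324 23454 2342")
upload_pending(lambda line: True)  # pretend the upload succeeded
print(conn.execute("SELECT uploaded FROM data_lines").fetchone())  # (1,)
```

Because a row is only flagged after a successful upload, a dropped connection just leaves it pending for the next pass.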

Parsing lines

You mentioned the raw data will sometimes be space delimited, sometimes tab delimited, and sometimes comma delimited. This one line of code can handle all three cases:

>>> line = "1,2 3\t4"
>>> print(line.replace(',', ' ').split())
['1', '2', '3', '4']
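Building on that one-liner, a small helper (illustrative, not from the original answer) can normalize any of the three delimiters and convert the tokens to integers in one step:

```python
def parse_line(line):
    """Split on commas, tabs, or runs of spaces, returning ints.

    Replacing commas with spaces first means str.split() with no
    argument handles all remaining whitespace (spaces and tabs).
    """
    return [int(tok) for tok in line.replace(',', ' ').split()]

print(parse_line("1,2 3\t4"))  # [1, 2, 3, 4]
```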

Real-Time graphs

Flot is the way to go. It produces excellent web-based real-time graphs, and the project site includes live examples.

answered Dec 07 '25 21:12 by Marwan Alsabbagh

