
Node-SQLite's db.each uses a large amount of memory for broad queries

I am attempting to run the function listed below in Node on a VM with 3.5GB of memory. The query I am running returns ~2.25 million rows. It fails due to what appears to be a memory leak. If anyone has encountered this or has suggestions on how to handle the growing memory use during this call, I'd appreciate it.

var sqlite3 = require('sqlite3').verbose();
var db = new sqlite3.Database('./data.db'); // placeholder path to the SQLite file

db.each(query, function (error, data) {
    if (error) {
        console.log('Failed.', error);
    } else {
        console.log('Current perf: ', process.memoryUsage());
    }
    error = null; data = null;
}, function (error, responseLength) {
    console.log('done with all');
});

Each row looks like this:

2015-11-13T01:17:32Z|510|40.632087|-73.946855|315.47|2|20151112|9910715-SCPD5-SC_D5-Weekday-10-SDon|0|307840|73.51|5.53

The goal is to handle each row one at a time and write it to a CSV, line by line, via a stream. The idea was that this would avoid holding the entire query response in memory, but that does not appear to be happening with the code above.
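For reference, the CSV-writing side can go through a plain fs.createWriteStream rather than accumulating the file in memory. This is a minimal sketch, assuming a hypothetical output path (out.csv) and placeholder property names for the pipe-delimited columns shown above:

var fs = require('fs');

// Hypothetical output file; adjust the path as needed.
var out = fs.createWriteStream('out.csv');

// Format one row as a CSV line. The property names here are placeholders --
// node-sqlite3 returns an object keyed by the column names of the query.
function toCsvLine(row) {
    return [row.timestamp, row.vehicle_id, row.lat, row.lon, row.bearing].join(',') + '\n';
}

// write() returns false once the stream's internal buffer is full;
// the caller should then wait for 'drain' before writing more rows.
function writeRow(row, next) {
    if (out.write(toCsvLine(row))) {
        next();
    } else {
        out.once('drain', next);
    }
}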

asked by kuanb

1 Answer

I think you need to use a prepared statement and setImmediate(), like this:

var sqlite3 = require('sqlite3').verbose();

// Prepare the statement once; rows are then pulled one at a time with stmt.get().
var stmt = db.prepare(query);

doQuery();

function doQuery() {
    stmt.get(function (error, data) {
        if (error) {
            console.log(error);
            stmt.finalize();
        } else if (data) {
            // do something with data here.

            // Schedule the next fetch on a fresh call stack.
            setImmediate(doQuery);
        } else {
            // data is undefined: no more rows, release the statement.
            stmt.finalize();
        }
    });
}

Explanation:

1) A prepared statement makes the client retrieve each row separately, rather than trying to grab all the results at once. The data param will be undefined once the last row has been read.

2) setImmediate() schedules the next iteration on a fresh call stack, so the callback loop isn't called recursively, which could blow the call stack with a row count this high.
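Putting the two pieces together, the per-row callback can write each row to the CSV stream, respect backpressure, and finalize the statement once data comes back undefined. A rough sketch, assuming the hypothetical out stream and writeRow() helper from the question's goal above:

function exportRows() {
    stmt.get(function (error, data) {
        if (error) {
            console.log(error);
            stmt.finalize();
            out.end();
        } else if (data) {
            // Write the row, then schedule the next fetch on a fresh stack.
            writeRow(data, function () {
                setImmediate(exportRows);
            });
        } else {
            // undefined data means the statement is exhausted.
            stmt.finalize();
            out.end();
            console.log('done with all');
        }
    });
}

exportRows();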

answered by Russell Beattie