
Caching gzipped responses from Node/Express/Redis

My requirement is to generate daily reports that several users access. The data only changes every 24 hours. Currently I use Node to build the report from data stored in Redis and compress the response with Express's compress() middleware. Creating and zipping these responses takes about 40 seconds (the response is about 4 MB of JSON uncompressed). My preference would be to cache/store these responses either on the filesystem or in memory for a certain period of time. I don't want to implement Varnish or another reverse proxy just for this, so can anyone suggest another approach to effectively cache the responses? Thanks in advance.

asked Sep 06 '25 by cgarvey

1 Answer

I recently had to do something similar, where I received a large JSON payload from a MongoDB database intended to be sent to a user application.

I only needed to update it periodically, about every 10 minutes, but gzipping the response took serious time. So I periodically grab the data, gzip it, and store the result away, sending it in this example as the response to a GET request.

var zlib = require('zlib');
var Buffer = require('buffer').Buffer;
var express = require('express');
var app = express();

var yourData, cachedGzip;

// zlib performs gzip compression, passing the compressed payload to the callback asynchronously
zlib.gzip(new Buffer(yourData), function(err, data) {
    if (err) throw err;
    cachedGzip = data;
});

// Somewhere later in your app...
// Can now use the cached data for gzip responses, example route
app.get('/', function(req, res) {
  res.header('Content-Type', 'application/json');
  res.header('Content-Encoding', 'gzip');
  res.send(cachedGzip);
});
answered Sep 08 '25 by Daguava