What I'm trying to do is update this answer so it no longer depends on the JSONStream library, which isn't maintained anymore. There we have:
```javascript
Comment.find()
  .cursor()
  .pipe(JSONStream.stringify())
  .pipe(res.type('json'))
```
It's using the Mongoose .cursor(), which returns a Node.js-compatible readable stream, but I'm open to using the native mongo driver.
Now, my first question: is anybody still using Node.js streams, or are you nowadays supposed to use JavaScript iterators and generators?
If so, I think I'll be able to convert the cursor to an iterator and convert each chunk to JSON separately. (Recommendations for libraries that handle errors etc. are welcome, even if off-topic here and not the core of this question.)
But how do I make the iterator stream into the express.js result?
I couldn't find any documentation on that (nor, for that matter, on res being a writable stream, although piping to it works). Am I even on the right track with my thinking here?
Edit:
Meanwhile, I've done some more research and found a few libraries that offer map, filter, etc. for their streams. Are those recommended?

Edit: Added a custom stringifying step.
The res object in ExpressJS is a Writable subclassed from http.ServerResponse, so data can be piped into it.
I tend to hook up this data flow using Node.js' built-in support for converting an iterator to a readable stream, and stream.pipeline for error handling.

Note that converting the cursor to a readable is no longer necessary in Node.js v13+, as stream.pipeline now accepts async iterators in place of a stream.
Note that a separate stringify() step is redundant if it is possible to use Mongoose's lean() directly, as lean() will emit JSON data.
```javascript
import stream from "stream";
import util from "util";

async function handler(req, res, next) {
  try {
    // init the cursor
    const cursor = Comment.find().lean(); // "lean" will emit json data
    const readable = stream.Readable.from(cursor);
    // promisifying the pipeline will make it throw on errors
    await util.promisify(stream.pipeline)(readable, res.type('json'));
    next();
  }
  catch (error) {
    next(error);
  }
}
```
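As the note above mentions, on Node.js 13+ the Readable.from step can be dropped and the async iterable handed straight to stream.pipeline. A minimal self-contained sketch of just that wiring, with an async generator standing in for the Mongoose cursor and a PassThrough standing in for res (both are stand-ins for the demo, not part of the original code):

```javascript
import stream from "stream";
import util from "util";

// Stand-in for the Mongoose cursor: any async iterable works here.
async function* cursor() {
  yield "first\n";
  yield "second\n";
}

// PassThrough stands in for Express' res, which is also a writable stream.
const sink = new stream.PassThrough();
const chunks = [];
sink.on("data", (buf) => chunks.push(buf.toString()));

// On Node.js 13+, pipeline accepts the async iterable directly,
// no Readable.from needed.
await util.promisify(stream.pipeline)(cursor(), sink);
console.log(chunks.join(""));
```

Because the pipeline is promisified, any error raised by the iterable or the sink rejects the awaited promise, which is what makes the try/catch pattern in the handler above work.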
With custom stringifying in NodeJS v13+:
```javascript
import stream from "stream";
import util from "util";

async function handler(req, res, next) {
  try {
    // init the cursor
    const cursor = Comment.find().lean(); // "lean" will emit json data
    const readable = stream.Readable.from(cursor);
    // promisifying the pipeline will make it throw on errors
    await util.promisify(stream.pipeline)(
      readable,
      // Custom "stringifying" using an async generator
      async function* (source) {
        // Add some output before the result from mongodb. Typically, this could be
        // information about the number of results in a REST API.
        yield "Prepended";
        for await (const comment of source) {
          // Emit a "projection" of the data retrieved from MongoDB,
          // stringified so it can be written to the response
          yield JSON.stringify({
            text: comment.text,
            // Add a new property:
            newProperty: true
          });
        }
        // Add some final data to the response. In a REST API, this might be
        // the closing bracket of an array "]".
        yield "Appended";
      },
      // the stringified data is then piped to express' res object
      res.type('json')
    );
    next();
  }
  catch (error) {
    next(error);
  }
}
```
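The stringifying transform can be exercised without MongoDB or Express at all. The sketch below uses hypothetical stand-ins (an async generator for the lean() cursor, a PassThrough for res) and turns the prepend/append markers into a real JSON array, which is the typical REST API shape hinted at in the comments above:

```javascript
import stream from "stream";
import util from "util";

// Hypothetical stand-in for the lean() cursor.
async function* comments() {
  yield { text: "first" };
  yield { text: "second" };
}

// Same shape as the transform above: prepend, project + stringify, append.
async function* stringify(source) {
  yield "[";                  // prepended: opening bracket of the array
  let first = true;
  for await (const comment of source) {
    if (!first) yield ",";
    first = false;
    // Emit a "projection" of each document as a JSON string
    yield JSON.stringify({ text: comment.text, newProperty: true });
  }
  yield "]";                  // appended: closing bracket
}

// PassThrough stands in for res.type('json').
const sink = new stream.PassThrough();
const parts = [];
sink.on("data", (buf) => parts.push(buf.toString()));

await util.promisify(stream.pipeline)(comments(), stringify, sink);
console.log(parts.join(""));
// [{"text":"first","newProperty":true},{"text":"second","newProperty":true}]
```

Since the transform yields strings, the sink receives valid JSON text, so the client can parse the whole response as one array even though it was streamed piece by piece.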