Ondrej Tokar - 1 year ago
Node.js Question

What is the right approach to update many records in MongoDB using Mongoose

I am pulling some records from MongoDB using Mongoose, importing them into another system, and then I would like to set the status (a document attribute) of all these documents to "processed".
I could find this solution: Update multiple documents by id set. Mongoose

I was wondering if that is the right approach: to build up a criterion consisting of all the document ids and then perform the update. Please also take into account the fact that it is going to be many documents.

(What is the limit of the update query? I couldn't find it anywhere. Official documentation: http://mongoosejs.com/docs/2.7.x/docs/updating-documents.html)

Answer Source

The approach of building up a criterion consisting of all document ids and then performing the update is bound to cause issues. When you iterate over a list of documents, sending an update operation for each one, in Mongoose you run the risk of blowing up your server, especially when dealing with a large dataset, because you are not waiting for an asynchronous call to complete before moving on to the next iteration. You will essentially be building a "stack" of unresolved operations until this causes a problem - a stack overflow.

Take for example, supposing you had an array of document ids that you wanted to update the matching document on the status field:

    var processedIds = [ /* ... the ObjectIds of the imported documents ... */ ];

then for really small datasets you could use the forEach() method on the array to iterate it and update your collection:

    processedIds.forEach(function(id) {
        Model.update({ "_id": id }, { "$set": { "status": "processed" } }, callback);
    });

The above is okay for small datasets. However, this becomes an issue when you are faced with thousands or millions of documents to update as you will be making repeated server calls of asynchronous code within the loop.

An alternative would be to use something like async's eachLimit to iterate over the array, performing a MongoDB update operation for each item while never running more than x parallel updates at the same time.
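To illustrate the idea, here is a minimal, self-contained sketch of concurrency-limited iteration. The eachLimit() helper below is a hypothetical stand-in for async's eachLimit, and mockUpdate() stands in for the real Model.update() call so the example runs without a database:

```javascript
// A minimal sketch of concurrency-limited iteration in the spirit of
// async.eachLimit. eachLimit() and mockUpdate() are hypothetical stand-ins:
// in real code the worker would call Model.update() with a callback.
function eachLimit(items, limit, worker, done) {
    var index = 0, finished = 0, running = 0;
    function next() {
        // Launch workers until the concurrency limit or the end of the array
        while (running < limit && index < items.length) {
            running++;
            worker(items[index++], function () {
                running--;
                finished++;
                if (finished === items.length) { return done(); }
                next(); // a slot freed up, launch the next item
            });
        }
    }
    next();
}

// Simulated asynchronous update call standing in for Model.update(...)
var updated = [], inFlight = 0, maxInFlight = 0;
function mockUpdate(id, callback) {
    inFlight++;
    maxInFlight = Math.max(maxInFlight, inFlight);
    setImmediate(function () {
        inFlight--;
        updated.push(id);
        callback();
    });
}

eachLimit([1, 2, 3, 4, 5], 2, mockUpdate, function () {
    // prints "updated 5 docs (max 2 in flight)"
    console.log("updated " + updated.length + " docs (max " + maxInFlight + " in flight)");
});
```

This caps the number of unresolved operations at the limit (2 here), instead of letting them pile up as a plain forEach() would.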

The best approach, however, would be to use the bulk API, which is extremely efficient at processing updates in bulk. The performance difference versus calling the update operation on each and every one of the many documents is that, instead of sending an update request to the server on each iteration, the bulk API sends the requests in batches (once for every 1000 queued operations).

For Mongoose versions >=4.3.0 which support MongoDB Server 3.2.x, you can use bulkWrite() for updates. The following example shows how you can go about this:

    var bulkUpdateCallback = function(err, r) {
        console.log(r.matchedCount);
        console.log(r.modifiedCount);
    };

    // Initialise the bulk operations array
    var bulkOps = [],
        counter = 0;

    processedIds.forEach(function(id) {
        bulkOps.push({
            "updateOne": {
                "filter": { "_id": id },
                "update": { "$set": { "status": "processed" } }
            }
        });
        counter++;

        if (counter % 500 == 0) {
            // Get the underlying collection via the native node.js driver collection object
            Model.collection.bulkWrite(bulkOps, { "ordered": true, "w": 1 }, bulkUpdateCallback);
            bulkOps = []; // re-initialise for the next batch
        }
    });

    // Flush any remaining operations that didn't make a full batch of 500
    if (counter % 500 != 0) {
        Model.collection.bulkWrite(bulkOps, { "ordered": true, "w": 1 }, bulkUpdateCallback);
    }

For Mongoose versions ~3.8.8, ~3.8.22, and 4.x, which support MongoDB Server >=2.6.x, you could use the Bulk API as follows:

    var bulk = Model.collection.initializeOrderedBulkOp(),
        counter = 0;

    processedIds.forEach(function(id) {
        bulk.find({ "_id": id }).updateOne({
            "$set": { "status": "processed" }
        });
        counter++;

        if (counter % 500 == 0) {
            bulk.execute(function(err, result) {
                // do something with the result
                bulk = Model.collection.initializeOrderedBulkOp(); // re-initialise
                counter = 0;
            });
        }
    });

    // Catch any docs in the queue under or over the 500's
    if (counter > 0) {
        bulk.execute(function(err, result) {
            // do something with the result here
        });
    }