
Loopback: How to copy files from one container to another?

I'm using StrongLoop LoopBack for my backend API, and I'm using its storage component to upload and manage files. More specifically, I'm using the file system provider for the moment but will switch to AWS storage once in production.

The users on the platform can copy a project, and doing so should therefore also copy the files that have been uploaded and linked to it.

My question is: how can I copy files from one container to another?

Here's my code so far to create a new container:

    Container.createContainer({ name: projectId }, function (err, container) {
      if (err) {
        // If error, no docs have been added or created, so just skip
        console.log("Error creating a container when copying project");
      }
      // Here is where I need assistance
    });

It is possible to create a download stream and pipe it into an upload stream on the other container, but that would not be a good idea since it routes all of the data through your server and creates unnecessary traffic. Let your underlying storage environment do the job for you.
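For reference, here is a minimal sketch of that stream-piping approach, assuming Storage is the model attached to the storage connector and that it exposes downloadStream/uploadStream (as loopback-component-storage's StorageService does); the container and file names are placeholders:

    // Download from the source container and re-upload to the target container.
    // Every byte flows through the Node process, which is why this is discouraged.
    var readStream = Storage.downloadStream('sourceContainer', 'picture.png');
    var writeStream = Storage.uploadStream('targetContainer', 'picture.png');
    readStream.pipe(writeStream);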

On your local development environment you can use the fs module to work with files (a sketch follows below), but when you migrate your storage to S3 you should use the AWS SDK to copy files. You can create a remote method on your Storage model that takes a source bucket/folder and a destination bucket/folder and copies the files for you.
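For the local filesystem case, a container is just a directory under the storage root, so copying between containers is a plain file copy. A minimal sketch, assuming the filesystem provider is rooted at ./storage and that the container and file names are placeholders:

    var fs = require('fs');
    var path = require('path');

    var root = './storage'; // same root as the filesystem storage provider
    var source = path.join(root, 'sourceContainer', 'picture.png');
    var target = path.join(root, 'targetContainer', 'picture.png');

    // Copy the file on disk by piping a read stream into a write stream.
    fs.createReadStream(source).pipe(fs.createWriteStream(target));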

For the S3 case, the remote method could look something like this:

    Storage.remoteMethod('copyS3File', {
      accepts: [
        {arg: 'copySource', type: 'string'},
        {arg: 'destinationBucket', type: 'string'},
        {arg: 'destinationFileName', type: 'string'}],
      returns: {arg: 'success', type: 'string'}
    });

    Storage.copyS3File = function (copySource, destinationBucket, destinationFileName, cb) {
      var AWS = require('aws-sdk');
      // region is a client option; a default Bucket can be bound via params
      var s3 = new AWS.S3({region: 'yourregion', params: {Bucket: 'yourbucket'}});
      var params = {CopySource: copySource, Bucket: destinationBucket, Key: destinationFileName};
      s3.copyObject(params, function (err, success) {
        if (err) cb(err);          // an error occurred
        else cb(null, success);    // successful response
      });
    };
See "Calling the copyObject operation" in the AWS SDK for JavaScript documentation for S3.

Please note that this code is not tested or ready for production. It is here only to give you an idea of how to do it.

Also note that S3 is somewhat different from the LoopBack implementation of storage: S3 has a flat structure, but organizing files into folders is supported through key name prefixes.
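To illustrate, a "container" can be emulated as a key prefix. A minimal sketch, using the same placeholder bucket and region as above, that lists the objects under one such prefix:

    var AWS = require('aws-sdk');
    var s3 = new AWS.S3({region: 'yourregion'});

    // Objects whose keys start with 'projectA/' behave like the contents of a folder.
    s3.listObjectsV2({Bucket: 'yourbucket', Prefix: 'projectA/'}, function (err, data) {
      if (err) return console.error(err);
      data.Contents.forEach(function (obj) {
        console.log(obj.Key); // e.g. 'projectA/picture.png'
      });
    });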

Working with S3 folders

To make things clearer, you can retrieve the file list from your source container and loop through it, copying each file to the new destination with the remote method created above:

    Storage.getFiles('yourContainer', function (err, files) {
      if (err) return console.error(err);
      files.forEach(function (file) {
        // CopySource takes the 'bucket/key' form; reuse the file name as the destination key
        Storage.copyS3File(file.container + '/' + file.name, 'newContainer', file.name, function (err, success) {
          if (err) console.error('Error copying file', file.name, err);
        });
      });
    });