Hassan Baig - 3 months ago
Python Question

Uploading contents of Django FileField to a remote server

In a Django project of mine, users upload video files. Initially, I was uploading them directly to Azure Blob Storage (Azure's equivalent of Amazon S3). I.e., in models.py I had:

class Video(models.Model):
    video_file = models.FileField(upload_to=upload_path, storage=OverwriteStorage())


Where OverwriteStorage overrides Storage in django.core.files.storage, and essentially uploads the file onto Azure.
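As background for the question: upload_to can be a callable as well as a string. Django invokes it with the model instance and the original filename, and stores the file at the relative path it returns. A minimal sketch of such a callable (the path layout here is illustrative, not the author's actual upload_path):

```python
import os
import uuid

def upload_path(instance, filename):
    # Django calls this with the Video instance and the uploaded
    # file's original name; the returned path is relative to the
    # storage backend's root.
    ext = os.path.splitext(filename)[1]
    return os.path.join("videos", f"{uuid.uuid4().hex}{ext}")
```
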

Now I need to upload this file to a separate Linux server (not the one that serves my Django web application). On this separate server, I'll perform some operations on the video file (compression, format change), and then I'll upload it to Azure Storage as before.

My question is: given my goal, how do I change the way I'm uploading the file in models.py? An illustrative example would be nice. I'm thinking I'll need to change FileField.upload_to, but all the examples I've seen indicate it's only for defining a local filesystem path. Moreover, I don't want to let the user upload the content normally and then run a process to upload the file to another server. Doing it directly is my preference. Any ideas?

Answer

I've solved a similar issue with Amazon's S3, but the concept should be the same.

First, I use django-storages and, by default, upload my media files to S3 (django-storages also supports Azure). My team then set up an NFS share, mounted on our Django web servers, exported from the destination server we occasionally need to write user uploads to. Then we simply override the django-storages default by pointing upload_to at the local path that is the mount from the other server.

This answer has a quick example of how to set up an NFS share from one server on another: http://superuser.com/questions/300662/how-to-mount-a-folder-from-a-linux-machine-on-another-linux-machine

There are a few ways to skin the cat, but this one seemed easiest to our team. Good luck!
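Concretely, the "write to the mount" step can be sketched in models.py like this. This assumes the remote server's upload directory is NFS-mounted at /mnt/processing on the web server; the mount path and names here are hypothetical, not the team's actual configuration:

```python
from django.core.files.storage import FileSystemStorage
from django.db import models

# Hypothetical mount point: the processing server's upload directory,
# NFS-mounted on the Django web server.
nfs_storage = FileSystemStorage(location="/mnt/processing/uploads")

class Video(models.Model):
    # Django writes the uploaded file under the NFS mount, so it lands
    # directly on the remote server's filesystem with no second step.
    video_file = models.FileField(upload_to="videos/", storage=nfs_storage)
```

Because FileSystemStorage just does ordinary filesystem writes, the upload is transparent to Django; the remote server can then compress/transcode the file and push it to Azure.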
