Homunculus Reticulli - 4 months ago
Python Question

Setting up a server to host multiple domains using django, virtualenv, gunicorn and nginx

I am setting up a new server machine, which will host multiple django websites.

I must point out that I own (developed and am in absolute control of) all websites that will be run on the server.

I am pretty certain that ALL of the websites will be using the same version of:

  • django

  • gunicorn

  • nginx

  • PostgreSQL and psycopg2 (although some websites will be using geospatial and other extensions)

The only thing that I know will differ between the django applications are:

  • python modules used (which may have implications for version of python required)

I can understand using virtualenv to manage instances where a project has specific python modules (or even python version requirements), but it seems pretty wasteful to me (in terms of resources) for each project to have, via virtualenv, separate installations of django, nginx, gunicorn, etc.
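For reference, a per-project environment can be created with virtualenv (or the stdlib `venv` module, which works the same way); the paths and package names below are hypothetical:

```shell
# Create an isolated environment for one project (hypothetical path)
python3 -m venv /srv/envs/sitea

# Install that project's own copies of its Python dependencies into it;
# each project's environment can pin different versions independently
/srv/envs/sitea/bin/pip install django gunicorn psycopg2-binary
```

Note that nginx and PostgreSQL are not Python packages, so they are installed system-wide regardless; only the Python-level dependencies (django, gunicorn, psycopg2, etc.) live inside each virtualenv.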

My question then is this:

Is it 'acceptable' (or considered best practice in scenarios such as that outlined above) to globally install django, gunicorn, nginx, PostgreSQL and psycopg2, and simply use virtualenv to manage only the parts (e.g. python modules/versions) that differ between projects?

Note: In this scenario there'll be one nginx server handling multiple domains.
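For what it's worth, a single nginx instance routes multiple domains by giving each one its own `server` block that proxies to that project's gunicorn socket; a sketch following the pattern in the gunicorn deployment docs (domain names and socket paths are hypothetical, with one such pair of blocks repeated per domain):

```nginx
upstream sitea {
    # gunicorn for this project listens on its own unix socket
    server unix:/run/sitea.sock fail_timeout=0;
}

server {
    listen 80;
    server_name sitea.example.com;

    location / {
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_pass http://sitea;
    }
}
```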

Last but not least, is it possible to use virtualenv to manage different PostgreSQL extensions in different projects?


No. It would probably work, but it would be a bad idea.

Firstly, it's not clear what kind of "resources" you think would be wasted. The only relevant thing is disk space, and we're talking about a few megabytes only; not even worth thinking about.

Secondly, you'd make it impossible to upgrade any of them individually: for anything beyond a trivial upgrade, you'd need to test and release all the sites together, rather than upgrading just the one that needs it and deploying it on its own.
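To make the independent-upgrade point concrete: a common layout runs each project's gunicorn from that project's own virtualenv under a separate systemd unit, so any one site can be upgraded and restarted without touching the others. A hypothetical unit file (paths, user and project name are assumptions, not from the question):

```ini
[Unit]
Description=gunicorn daemon for sitea (hypothetical project)
After=network.target

[Service]
User=www-data
WorkingDirectory=/srv/apps/sitea
# gunicorn from this project's own virtualenv, not a global install
ExecStart=/srv/envs/sitea/bin/gunicorn sitea.wsgi:application --bind unix:/run/sitea.sock

[Install]
WantedBy=multi-user.target
```

Upgrading one site is then `pip install --upgrade ...` inside its virtualenv followed by restarting that one unit, while the others keep running untouched.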