My web app requires some initial database scripts to be executed during installation. These are just some CREATE TABLE scripts that need to be run. I'm working on dockerizing it.
I've been trying to use Docker 1.12's swarm mode to create multiple container replicas of my web app that share a single database (an Oracle 12c or MySQL instance).
The problem is that when I start multiple copies of the container, they all try to run the SQL scripts simultaneously and fail. I tried using Flyway to handle the database migrations, but to no avail.
What is the right way to handle this situation?
Generally, DB migrations should be versioned, with each new script executing only after all earlier ones have completed. Given that precondition, you can go one of two ways: use a tool that runs the pending migrations in order, skipping any that have already been applied, or let every script run on every deployment but write each one to check whether its work has already been done.
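A minimal sketch of the first approach (the `schema_version` table name and the inline script list are illustrative, not from any particular tool; sqlite3 stands in for your MySQL/Oracle driver):

```python
import sqlite3  # stand-in for your Oracle/MySQL driver

# Ordered migration scripts; in practice these would be files on disk.
MIGRATIONS = [
    ("V1", "CREATE TABLE thing (col1 INT NOT NULL, col2 INT NOT NULL, age INT NOT NULL)"),
    ("V2", "CREATE TABLE other (id INT NOT NULL)"),
]

def migrate(conn):
    cur = conn.cursor()
    # Bookkeeping table recording which versions have already run.
    cur.execute("CREATE TABLE IF NOT EXISTS schema_version (version TEXT PRIMARY KEY)")
    applied = {row[0] for row in cur.execute("SELECT version FROM schema_version")}
    for version, sql in MIGRATIONS:
        if version in applied:
            continue  # already applied on an earlier deployment
        cur.execute(sql)
        cur.execute("INSERT INTO schema_version (version) VALUES (?)", (version,))
    conn.commit()

conn = sqlite3.connect(":memory:")
migrate(conn)
migrate(conn)  # second run is a no-op: every version is already recorded
```

Note that with several replicas hitting a shared database at once, two of them could still race between the `SELECT` and the `CREATE`; real tools (Flyway included) take a lock or rely on the primary key on the version column to serialize this, which is why running the migration as a single one-off job before scaling the service is the simpler path.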
Otherwise, all application data is at risk of being wiped out on the next deployment. You should be able to deploy at any time while keeping and reusing the existing data (with Docker or without).
In your "simple" case a guarded CREATE statement would help (this syntax works on MySQL; Oracle 12c has no IF NOT EXISTS, so there you would check the data dictionary, e.g. user_tables, before creating):

CREATE TABLE IF NOT EXISTS thing (col1 INT NOT NULL, col2 INT NOT NULL, age INT NOT NULL);
Thanks @Drew for prompting me to put my comment into an answer.