Robert - 1 month ago
Bash Question

Is it possible to run several commands in the background, wait for all of them to finish, and fail the script if any command fails?

We have a CI script like this

# Make a build
...
# Upload to several different services:
curl "https://a.com"
curl "https://b.com"
curl "https://c.com"


Running curl in the background (curl "https://a.com" &) will save lots of time, but we want to fail the build (and log something sensible) if any upload fails. Is this possible?

Answer

Yes.

You can use the bash built-in wait to wait for one or more background processes to terminate. Given a PID as an argument, wait blocks until that process terminates and returns its exit status; with no arguments, it waits for all background processes to terminate.

Example:-

#!/bin/bash

sleep 3 &

wait "$!"     # $! holds the PID of the most recently started
              # background job; it can also be saved first, e.g. pid=$!

# Waits until the process 'sleep 3' is completed. Here the wait 
# on a single process is done by capturing its process id

echo "I am waking up"

sleep 4 &
sleep 5 &

wait          # Without a PID argument, plain 'wait' blocks until all
              # jobs started in the background have completed.

# (or) equivalently
# wait $(jobs -p)       # Wait on all background jobs by listing their PIDs

echo "I woke up again"
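If the script should react as soon as the first job finishes rather than waiting for all of them, newer bash (4.3+) also offers wait -n, which returns when the next background job terminates, with that job's exit status. A small sketch:

```shell
#!/bin/bash
sleep 1 &   # a job that succeeds after a second
false &     # a job that fails almost immediately

# wait -n returns as soon as one background job exits,
# with that job's status; here the failing job finishes first.
if ! wait -n; then
    echo "a background job failed" >&2
fi

wait   # still reap the remaining jobs
```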