Grant - 4 months ago
Linux Question

wget in bash script is only downloading a partial file?

I have a PHP script that triggers a remote bash script, which in turn triggers yet another remote bash script (all machines are talking to each other with SSH key pairs and working fine)...

PHP (server 1) --> BASH (server 2) --> BASH (server 3)

There are a whole bunch of functions in the first bash script which all work perfectly, variables are passed right the way through the process, and then at the end there is a command to run a further script on server 3:

ssh "/var/www/ $1";

This also works, as the script is triggered ($1 contains a filename, e.g. mymusic.mp3):

wget -o /var/www/vhosts/site1/httpdocs/audio/$1$1

This should be downloading a file at this stage, but for some reason it only downloads a partial file. The complete filesize should be around 115 MB, but every time it's run it will only download the first 10 or 20 KB. That number varies each time it's run; it's not constant.

I've tried using wget -b in this script to force it to download in the background, but nothing downloads if I do that.

Am I using wget the wrong way in this situation? As it's being triggered from a remote bash script, is there some extra security setting or command that I need to supply to make this work?

Many thanks!


Per comments, you identified that the output from your wget command contained the text that wget produces, rather than the content of the file in the URL you specified.

According to the wget man page:

   -o logfile
       Log all messages to logfile.  The messages are normally reported to
       standard error.

This isn't what you're looking for, so yes, this is a wget usage error.

Instead, you probably either want to use curl, whose lowercase -o option does what you'd expect (it names the output file), or switch to wget's -O (capital O) option. Note that wget by default names the files it saves after the last component of the URL; read the man page to see how -O's behaviour differs from this.
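To make the fix concrete, here is a sketch of the corrected command both ways. The URL is a placeholder (the question never shows the real source), and the destination path is taken from the question:

```shell
#!/bin/sh
# $1 contains a filename, e.g. mymusic.mp3 (as in the question)
FILE="$1"
# Hypothetical source URL -- substitute the real one from your script
URL="http://example.com/audio/$FILE"

# Wrong: lowercase -o tells wget where to write its *log messages*,
# not the downloaded content, which is why the file held wget's output.
# wget -o /var/www/vhosts/site1/httpdocs/audio/"$FILE" "$URL"

# Right: capital -O tells wget where to write the downloaded content.
wget -O /var/www/vhosts/site1/httpdocs/audio/"$FILE" "$URL"

# Equivalent with curl, where lowercase -o means "output file":
curl -o /var/www/vhosts/site1/httpdocs/audio/"$FILE" "$URL"
```

Quoting "$FILE" also guards against filenames with spaces, which matters since the name travels through two layers of ssh.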