Bash Question

How to clean a URL list and keep only the working URLs from the command line?

I have a list of URLs and I want to test them to keep only the ones which work.

So I would like to do:

cat URLs.txt | testURLs > GoodURLs.txt


Do you know how to do that? Maybe ping or wget could be useful?

Answer

I find "curl" to be usually better for this kind of checks than ping or wget. You could use something like this:

#!/bin/bash
file=$1
while IFS= read -r line
do
    # --fail makes curl exit non-zero on HTTP errors such as 404,
    # --silent hides the progress output, --output /dev/null discards the body
    if curl --fail --silent --output /dev/null "$line"
    then
        echo "$line"
    fi
done < "$file"
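
If you would rather keep the exact pipeline from your question, the same loop also works as a filter on standard input. This is a minimal sketch using the same curl flags as above:

#!/bin/bash
# Reads URLs on standard input and prints only the ones curl can fetch,
# so it works as a filter in a pipeline.
while IFS= read -r line
do
    if curl --fail --silent --output /dev/null "$line"
    then
        echo "$line"
    fi
done

With that variant you can run it exactly as you wanted:

cat URLs.txt | ./testURLs > GoodURLs.txt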

With the file-argument version the invocation changes a bit, but it's pretty similar to what you had in mind. Either way, don't forget to make the script executable with chmod.

chmod u+x testURLs
./testURLs URLs.txt > GoodURLs.txt
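
One design note: the loop checks URLs one at a time, so a long list can be slow. If your xargs supports -P (the GNU and BSD versions do), a rough parallel sketch of the same check, assuming the same URLs.txt and GoodURLs.txt file names, is:

# Run up to 4 curl checks at once; each URL is passed to sh as $0.
# Note that with parallel checks the output order is not preserved.
< URLs.txt xargs -n 1 -P 4 sh -c 'curl --fail --silent --output /dev/null "$0" && echo "$0"' > GoodURLs.txt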