Fractale - 3 years ago
Bash Question

How to filter a list of URLs and keep only the working ones from the command line?

I have a list of URLs and I want to test them to keep only the ones that work.

So I would like to do:

cat URLs.txt | testURLs > GoodURLs.txt

Do you know how to do that?
Maybe ping or wget could be useful?

Answer Source

I find "curl" usually better than ping or wget for this kind of check: ping only tests the host, not the URL, and curl's exit status tells you directly whether the fetch succeeded. You could use something like this:

#!/bin/bash
# Read each URL; print it only if curl can fetch it.
# --fail makes curl exit non-zero on HTTP errors such as 404.
while IFS= read -r line; do
    if curl --silent --fail "$line" >/dev/null 2>&1; then
        echo "$line"
    fi
done < "$1"    # the file of URLs is passed as the first argument

The invocation changes a bit (the file is passed as an argument instead of piped in), but it's pretty similar to what you had in mind. Don't forget to make the script executable with chmod:

chmod u+x testURLs
./testURLs URLs.txt > GoodURLs.txt
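If you prefer the exact pipeline from the question (cat URLs.txt | testURLs > GoodURLs.txt), the same loop can read from stdin instead of a file argument. Here is a minimal sketch; the function name test_urls, the --max-time value, and the file:// demo URLs are my own choices, not part of the original answer. The demo uses file:// URLs only so the sketch can run without network access; with real http:// URLs the behavior is the same.

```shell
#!/bin/sh
# test_urls: read URLs on stdin, print only those curl can fetch.
# --fail    -> exit non-zero on HTTP errors (e.g. 404, 500)
# --silent  -> hide the progress meter
# --max-time caps how long a slow or hanging host can stall the loop
test_urls() {
    while IFS= read -r url; do
        if curl --silent --fail --max-time 10 --output /dev/null "$url"; then
            printf '%s\n' "$url"
        fi
    done
}

# Demo with file:// URLs (assumes your curl build supports file://):
tmp=$(mktemp)
echo ok > "$tmp"
printf 'file://%s\nfile:///no/such/file.example\n' "$tmp" | test_urls
rm -f "$tmp"
```

With this variant, cat URLs.txt | ./testURLs > GoodURLs.txt works exactly as you wrote it.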