I have a bash script that uses wget, running on a 3G device. I want the script to pause until I get a response, so I have set up my code like this:
wget -t 0 -T 15 --retry-connrefused www.example.com
The problem is that when there is no connection at all (the 3G link drops for a few seconds), DNS can't resolve the host and wget stops retrying.
Is there a way to force wget to retry until the connection comes back? I know I can write a loop, but I want to know if there is a setting in wget I can use. If there is not, what is the best condition to build the loop on?
More details:
With the code I wrote, wget will retry, but if the device has no Internet connection at all (for example, if I pull the 3G dongle out of the device) it stops retrying and tells me it can't resolve the host address. It prints this after the 15 seconds defined by -T 15:
wget: unable to resolve host address
Note that -T also sets the read timeout: if wget receives no data from the server for longer than the timeout (15 seconds in the command above), it aborts and restarts the download. Without -T, the default read timeout is 900 seconds.
By default, wget retries 20 times, with the exception of fatal errors like "connection refused" or "not found" (404), which are not retried; -t 0 (or -t inf) makes it retry without limit.
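Recent versions of wget also document distinct exit statuses, so a script can tell a network/DNS failure (status 4) apart from a server error response (status 8) and only keep looping on the former. A minimal sketch of that mapping (the explain_status helper is my own; the codes come from the wget manual):

```shell
#!/bin/sh
# Map wget exit statuses to failure classes (codes as documented in the
# wget manual). A retry loop can branch on these instead of retrying blindly.
explain_status() {
    case "$1" in
        0) echo "success" ;;
        4) echo "network failure (includes DNS resolution errors)" ;;
        8) echo "server issued an error response (e.g. 404)" ;;
        *) echo "other error (code $1)" ;;
    esac
}

explain_status 4
```

For the question's scenario, status 4 is the one worth retrying forever; a persistent status 8 usually means the URL itself is wrong.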
Resuming a wget download is very straightforward: open a terminal in the directory you were downloading your file to and run wget with the -c flag to resume the download.
This loop should do this:
while true; do
    wget -T 15 -c http://example.com && break
done
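The same pattern can be wrapped in a small helper with a short pause between attempts, so the loop doesn't spin while the link is down. To keep the sketch runnable without a network, a hypothetical stub (flaky, which fails twice and then succeeds) stands in for the real wget call:

```shell
#!/bin/sh
# Generic retry loop: run the given command until it succeeds,
# sleeping briefly between attempts. In real use the command would be
# "wget -T 15 -c <url>"; here a stub simulates a flaky connection.
retry() {
    until "$@"; do
        sleep 1   # brief pause so the loop doesn't busy-spin while offline
    done
}

# Hypothetical stub: fails on the first two calls, succeeds on the third.
n=0
flaky() {
    n=$((n + 1))
    [ "$n" -ge 3 ]
}

retry flaky
echo "succeeded after $n attempts"
```

The sleep interval is my addition; pick something long enough that a dead dongle doesn't generate a tight loop of DNS failures.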
How this works:
1. While there is no connection, wget cannot resolve the host, so the command fails; the while loop does not break and keeps printing the error message.
2. When the connection comes back, wget resolves the host and starts getting the files.
3. If the connection drops mid-download, wget retries (per -t: 0 or inf for unlimited retries, or a limited value) until the 15-second timeout of -T 15 is reached. The wget command then fails and prints its error output, so the while loop still doesn't break and the cycle starts again.
4. Steps 1-3 repeat until the files are downloaded completely.
Because the wget command uses the -c (resume) option, every instance of wget starts downloading from where the previous one left off, and as soon as wget succeeds, the && break ends the loop.
Here is a script that you can use to resolve your problem:
FILENAME=$1
DOWNURL=$2
wget -O "$FILENAME" "$DOWNURL"
FILESIZE=$(stat -c%s "$FILENAME")
while [ "$FILESIZE" -lt 1000 ]; do
    sleep 3
    wget -O "$FILENAME" "$DOWNURL"
    FILESIZE=$(stat -c%s "$FILENAME")
done
The script forces the download to restart until the file reaches the expected size. The 1000 (bytes) could be changed to fit whatever size file you are downloading.
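The size check itself can be exercised without any network. A minimal sketch using a throwaway file (MINSIZE stands in for the 1000 threshold; note that stat -c%s is GNU stat, and defaulting to 0 covers the case where the file doesn't exist yet):

```shell
#!/bin/sh
# Exercise the size-threshold logic from the script above on a temp file.
MINSIZE=1000                                   # stand-in for the 1000 threshold
TMPFILE=$(mktemp)
printf 'hello' > "$TMPFILE"                    # 5 bytes, well under the threshold
# GNU stat; fall back to 0 if the file is missing so the comparison still works
FILESIZE=$(stat -c%s "$TMPFILE" 2>/dev/null || echo 0)
if [ "$FILESIZE" -lt "$MINSIZE" ]; then
    verdict="incomplete"
else
    verdict="complete"
fi
echo "$verdict: $FILESIZE bytes"
rm -f "$TMPFILE"
```

A more robust alternative to a fixed size threshold is checking wget's exit status (zero on success), which works even when you don't know the file size in advance.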