Can I use wget to check for a 404 without actually downloading the resource? If so, how? Thanks
The wget package comes pre-installed on most Linux distributions. To check whether it is installed on your system, open a terminal, type wget, and press Enter. If wget is installed, it will print wget: missing URL; otherwise the shell will report a "command not found" error.
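For scripts, a standard shell idiom (an addition here, not from the original text) tests for the binary directly:

    command -v wget >/dev/null 2>&1 || echo "wget is not installed"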
Downloading a file
To download a file with Wget, type wget followed by the URL of the file you wish to download. Wget will fetch the file at the given URL and save it in the current directory.
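A minimal example (the URL is a placeholder, not from the original):

    wget https://example.com/files/archive.tar.gz

This saves archive.tar.gz in the current directory.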
What is wget?
Wget is a free GNU command-line utility for downloading files from the internet. It retrieves files over the HTTP, HTTPS, and FTP protocols and is designed to work robustly over slow or unstable network connections.
There is the command-line parameter --spider exactly for this. In this mode, wget does not download the file; its exit status is zero if the resource was found and non-zero if it was not. Try this (in your favorite shell):

    wget -q --spider address
    echo $?
Or if you want full output, leave the -q off, so just:

    wget --spider address

The -nv option shows some output, but not as much as the default.
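Putting the pieces together, here is a small sketch of how the exit status can drive a script (the URL and messages are illustrative, not part of the original answer):

    #!/bin/sh
    # Check whether a resource exists without downloading it.
    url="https://example.com/some/page"   # placeholder URL

    if wget -q --spider "$url"; then
        echo "resource exists"
    else
        # wget exits non-zero here, e.g. on a 404 or a network failure
        echo "resource not found or unreachable"
    fi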
If you want to check quietly via $? without the hassle of grepping wget's output, you can use:

    wget -q "http://blah.meh.com/my/path" -O /dev/null

This works even on URLs with just a path, but it has the disadvantage that the resource is really downloaded, so it is not recommended when checking big files for existence.
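As a sketch of how that variant might be wrapped for reuse (url_ok is a hypothetical helper name, not from the original):

    # Hypothetical helper: succeeds (exit 0) only if the URL can be fetched.
    url_ok() {
        wget -q "$1" -O /dev/null
    }

    if url_ok "http://blah.meh.com/my/path"; then
        echo "resource exists"
    fi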