
How to download a batch of data files with the Linux command line?

For example, I want to download data from: http://nimbus.cos.uidaho.edu/DATA/OBS/

with the link:

http://nimbus.cos.uidaho.edu/DATA/OBS/pr_1979.nc

to

http://nimbus.cos.uidaho.edu/DATA/OBS/pr_2015.nc

How can I write a script to download all of them, with wget? And how can I loop over the links from 1979 to 2015?

breezeintopl asked Sep 05 '25


1 Answer

wget can take a file as input, containing one URL per line.

wget -ci url_file

-i : read URLs from the given input file
-c : continue (resume) partially downloaded files

So all you need to do is put the URLs in a file and use that file with wget.
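Since your URLs differ only in the year, the input file can itself be generated with a loop. A minimal sketch (the file name url_file is arbitrary):

```shell
# Build the URL list for pr_1979.nc .. pr_2015.nc, one URL per line
base="http://nimbus.cos.uidaho.edu/DATA/OBS"
for year in $(seq 1979 2015); do
    echo "${base}/pr_${year}.nc"
done > url_file

# Then download everything, resuming partial files on re-runs:
# wget -ci url_file

head -n 2 url_file   # quick sanity check of the generated list
```

Re-running the wget command is safe: -c skips files that are already complete and resumes interrupted ones.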

A simple loop like the one in Jeff Puckett II's answer is sufficient for your particular case, but if you run into more complex situations (arbitrary URLs that don't follow a pattern), this method may come in handy.
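That loop isn't reproduced here, but a direct-loop variant might look like the following sketch (shown as a dry run with echo so nothing is actually downloaded):

```shell
# Dry run: print the wget command for each year 1979..2015.
# Remove "echo" to perform the real downloads.
for year in $(seq 1979 2015); do
    echo wget -c "http://nimbus.cos.uidaho.edu/DATA/OBS/pr_${year}.nc"
done
```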

Jahid answered Sep 07 '25