Don’t you just love these little things, so impossible in the past, but which save so much time?
I remember times in the past when I wanted to download a whole sequence of files, of whatever type, saved in a known location but named in a numerical sequence. I’d resort to any of a number of methods: the simplest, dumbest technique of going to each URL by hand and manually downloading what was there; using cut and paste to load “components” of the URL into download managers; using Microsoft Excel to generate an incrementing URL list; and finally the bluntest weapon of them all, brute-force wget.
Isn’t it so much nicer to just be able to do this:
# wget.rb
1.upto(100) do |i|
  puts 'http://www.example.com/files/path/filename' + i.to_s + '.ext'
end
then ruby wget.rb > wget.txt
and finally point wget at the list with wget -i wget.txt.
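For what it’s worth, here is a sketch of the same generator wrapped in a small reusable method (the method name and arguments are mine, not from the post; the URL pattern is the example one above):

```ruby
# url_list.rb — a sketch of the same idea as a small helper.
# The name url_list and its parameters are hypothetical, for illustration.
def url_list(prefix, range, ext)
  range.map { |i| "#{prefix}#{i}#{ext}" }
end

# Print one URL per line, ready for wget -i:
puts url_list('http://www.example.com/files/path/filename', 1..100, '.ext')
```

Running ruby url_list.rb | wget -i - feeds the list straight to wget on stdin, which skips the temporary file entirely.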
FINISHED --03:42:35--
Downloaded: 317,666,256 bytes in 179 files
Start to finish: under 10 minutes.
December 3rd, 2006 at 11:04 am
You can do the same with curl without even having to write a script or use a temporary file. man curl for details.
December 3rd, 2006 at 11:17 am
Lol really?
I’ve always been prejudiced against curl for some reason, so I don’t know how to use it for anything beyond downloading wget!
I’ll check it out though, certainly sounds more efficient.
December 3rd, 2006 at 10:17 pm
Yes, it is more efficient. Off the top of my head, the command you would use would be:
curl -O 'http://www.example.com/files/path/filename[1-100].ext'
So instead of 10 minutes, it’s 10 seconds.
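If you want to sanity-check what a [1-100] range covers before fetching anything, a quick shell preview works; note this loop only mimics curl’s expansion, it is not curl’s own globbing:

```shell
# Preview the URLs a [1-100] glob covers without hitting the network.
# (seq stands in for curl's expansion here; the URL is the example one above.)
for i in $(seq 1 100); do
  echo "http://www.example.com/files/path/filename${i}.ext"
done
```

curl’s globbing also accepts zero-padded ranges like [001-100] when the filenames carry leading zeros.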