download multiple files from a site using a C-style for loop in bash
February 7th, 2007 mysurface Posted in Bash, for
Batch downloading with wget often no longer works, since a lot of websites now protect their directories from being browsed. But if you know the filenames, you can still do a batch download by writing a custom bash script.
Recently I discovered a site that lets you read manga online. The problem is that I have to browse it page by page by clicking the next link, and loading is extremely slow. I really can't enjoy reading when a page needs 5 to 10 seconds to load completely.
So I decided to download it all at once and read it later. I discovered that all the images actually sit in a folder named after the manga volume, and each manga page is a jpg image named 1.jpg through 96.jpg.
I fired up vim and wrote this down; it lets me download an entire volume at once.
#!/bin/bash
# Fetch pages 1.jpg through 96.jpg of volume 1.
# wget -c resumes a partial download instead of starting over.
for (( a=1; a<=96; a++ ))
do
    wget -c http://www.manga.com/images/qtvim/Vol_01/$a.jpg
done
I like the C-style for loop. Bear in mind that the first line is important: you have to declare bash instead of sh, because plain sh does not support the (( )) arithmetic loop, so #!/bin/sh will actually fail.
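If you are stuck with plain /bin/sh, a POSIX-compatible sketch of the same download (assuming the same URL as above) replaces the C-style loop with a while loop and arithmetic expansion:

#!/bin/sh
# POSIX sh has no C-style for loop; count with a while loop instead.
a=1
while [ "$a" -le 96 ]
do
    wget -c http://www.manga.com/images/qtvim/Vol_01/$a.jpg
    a=$((a + 1))
done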
May 23rd, 2007 at 3:11 am
seq is even nicer for this – you can say:
for a in `seq 1 96`
do
wget -c http://www.manga.com/images/qtvim/Vol_01/$a.jpg
done
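Another option in bash 3 and later (not mentioned in the comment above) is brace expansion, which needs neither seq nor the C-style loop; note the bounds must be literal numbers, not variables:

for a in {1..96}
do
    wget -c http://www.manga.com/images/qtvim/Vol_01/$a.jpg
done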
May 23rd, 2007 at 9:54 am
Interesting! Yes, this is neat :)
June 1st, 2007 at 6:36 am
Your tip works like a charm, yet I _accidentally_ voted with two stars when I meant five, sorry about that ;)
July 22nd, 2007 at 2:49 pm
Cool, found this when I googled, but this doesn't do exactly what I need. How would you download a bunch of files where part of the file name does not change, but it doesn't increase like in your example?
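One way to handle names that vary unpredictably (a sketch, with made-up filenames on a hypothetical server) is to list the varying parts explicitly and loop over them:

# Only the listed parts vary; the rest of the URL stays fixed.
for part in cover intro chapter_one appendix
do
    wget -c http://www.example.com/files/report_$part.pdf
done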
December 11th, 2007 at 5:08 am
Open the page in Firefox. Click Tools -> Page Info, then the Links tab. Select the links of the files you want, right-click and copy, then open a text editor and paste the links. Save as links.txt.
Open terminal and enter:
wget -c -i links.txt
All the files listed in links.txt will be downloaded to the current folder.
Alhamdulillah.
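Combining this with the original post, the links.txt file could also be generated by a loop rather than copied from the browser; a minimal sketch, assuming the same manga URL:

# Emit one URL per line into links.txt, then let wget walk the list.
for a in $(seq 1 96)
do
    echo "http://www.manga.com/images/qtvim/Vol_01/$a.jpg"
done > links.txt
wget -c -i links.txt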
December 13th, 2007 at 11:09 am
this is actually a handy piece of code for many jobs, thanks
for (( a=1; a<=96; a++ ))
April 9th, 2009 at 11:57 am
I noticed that this is not the first time you've written about this topic. Why have you chosen it again?
April 10th, 2009 at 10:12 am
If you look closely, the for loop part is different; the other post uses seq -f, which is more flexible.
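For readers who have not seen it, GNU seq's -f option applies a printf-style format to each number, which is handy when a site zero-pads its filenames; a sketch assuming pages named 01.jpg through 96.jpg:

# %02g pads every number to two digits: 01, 02, ..., 96
for a in $(seq -f "%02g" 1 96)
do
    wget -c http://www.manga.com/images/qtvim/Vol_01/$a.jpg
done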
July 14th, 2009 at 11:59 am
The wget -c -i method is the easiest and works best.
January 10th, 2011 at 12:18 pm
WOW I LOVE THIS!
You have just saved me hours of toil.
April 22nd, 2011 at 1:09 pm
Thank you for this post, very good information.