download multiple files from a site using for loop like c in bash

February 7th, 2007 mysurface Posted in Bash, for | Hits: 45314 | 11 Comments »

Batch downloading with wget often doesn't work anymore, as a lot of web sites now protect their directories from being browsed. But if you know the filenames of the files, you can still do a batch download by writing a custom bash script.

Recently I discovered a site which lets you read manga online. The problem is that I have to browse it page by page by clicking the next link, and loading is extremely slow. I really can't enjoy reading when a page needs 5 to 10 seconds to load completely.

So I decided to download it all at once and read it later. I discovered that all the images actually sit in a particular folder, named after the volume of the manga, and each manga page is a jpg image named 1.jpg – 96.jpg.

I fired up my vim editor and wrote this down; it lets me download an entire volume at once.


#!/bin/bash
# grab pages 1.jpg through 96.jpg; -c resumes interrupted downloads
for (( a=1; a<=96; a++ ))
do
    wget -c http://www.manga.com/images/qtvim/Vol_01/$a.jpg
done

I like the C-style for loop. Bear in mind that the first line is important: you have to declare bash instead of sh, because the (( )) arithmetic loop is a bash extension and #!/bin/sh will actually fail.
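Another bash-only way to get the same result, without the C-style loop, is brace expansion: generate the URL list first, then hand the whole list to wget. This is a minimal sketch using the same URL as the script above; note that {1..96} is also a bash feature, another reason the #!/bin/bash line matters.

```shell
#!/bin/bash
# Generate the 96 page URLs with brace expansion (bash-only)
# and save them one per line to links.txt.
for a in {1..96}; do
    echo "http://www.manga.com/images/qtvim/Vol_01/$a.jpg"
done > links.txt
```

A single `wget -c -i links.txt` then fetches the whole list, and -c lets you resume if the connection drops partway.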


11 Responses to “download multiple files from a site using for loop like c in bash”

  1. seq is even nicer for this – you can say:
    for a in `seq 1 96`
    do
    wget -c http://www.manga.com/images/qtvim/Vol_01/$a.jpg
    done

  2. Interesting! Yes, this is neat :)

  3. Your tip works like a charm, yet I _accidentally_ voted with two stars; that was meant to be 5 stars, sorry about that ;)

  4. Cool, found this when googling, but this doesn't do exactly what I need. How would you download a bunch of files where part of the file name does not change, but it doesn't increase like in your example?

  5. Rabiul Hassan Khan Says:

    Open the page in Firefox. Click Tools -> Page Info, then the Links tab. Select the links to the files, right-click and copy, then open a text editor and paste the links. Save as links.txt.

    Open terminal and enter:

    wget -c -i links.txt

    All the files listed in links.txt will be downloaded to the current folder.

    Alhamdulillah.

  6. this is actually a handy piece of code for many jobs, thanks

    for (( a=1; a<=96; a++ ))

  7. I noticed that this is not the first time you have written about this topic. Why have you chosen it again?

  8. If you look closely, the for loop part is different; the other post, which uses seq -f, is more flexible.

  9. The wget -c -i method is the easiest and works best.

  10. WOW I LOVE THIS!
    You have just saved me hours of toil.

  11. Thank you for this post, very good information.
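The seq variants mentioned in comments 1 and 8 can be sketched together. Plain seq prints 1 through 96 one per line; seq -f applies a printf-style format to each number, which handles zero-padded filenames. Both the URL and the zero-padded naming (01.jpg … 96.jpg) are assumptions for illustration:

```shell
#!/bin/bash
# seq -f "%02g" prints 01 02 ... 96 instead of 1 2 ... 96,
# for sites whose pages are zero-padded (an assumed naming scheme).
for a in $(seq -f "%02g" 1 96); do
    echo "http://www.manga.com/images/qtvim/Vol_01/$a.jpg"
done > padded_links.txt
# padded_links.txt can then be fed to: wget -c -i padded_links.txt
```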
