From the terminal: download multiple queued links with wget.


We often need to download several links from a web page. Whatever those links are, there are always a few ways to do it, some more practical than others. For example:

Downloading with Firefox's manager is relatively practical because it is simple, although you cannot minimize it to the tray. The good thing is that if you close Firefox it doesn't matter: the download manager keeps running anyway.

With Chromium it is a different story: the manager is a bit odd and, frankly, of little use, since you cannot close the browser, and that keeps consuming resources.

There are alternatives like Toucan and JDownloader, but I don't like either of them. JDownloader uses Java (I hate Java with all my being), and I have never managed to get Toucan working, so neither is viable for me, not to mention how heavy JDownloader is.

Anyway, many times what we want to download is something behind a link, and we also need to do it in a light, non-intrusive way, without windows on top or anything like that. Well, the solution, as always, is in the all-powerful terminal.

Do you remember wget? Well, with it we can do many things: download a single file from a page and be done, or download many links one after another, resuming the connection if it drops.

The thing is very simple; all we need is a plain, ordinary text editor, a terminal (preferably one that runs in the background, like Guake, JTerm or Yakuake) and to be a bit organized.

The steps:

  1. First, find the exact links to the content we want to download.
  2. Once located, copy each link into the text editor.
  3. Save the file as .txt in the folder where we want the content to be downloaded.
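As a sketch, the queue file is nothing special: one direct link per line. The file name and URLs below are invented for illustration:

```shell
# Hypothetical links file: one direct URL per line (the URLs are made up)
cat > archivo.txt <<'EOF'
http://example.com/descargas/archivo1.zip
http://example.com/descargas/archivo2.zip
EOF

# Quick sanity check: count the queued links
wc -l < archivo.txt   # prints 2
```

Any plain editor (gedit, Kate, nano) produces the same thing; wget doesn't actually care about the .txt extension, it just keeps things tidy.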

Then, to keep things in order, we move in the terminal to the folder where we want the content to be downloaded:

cd /home/usuario/carpeta-deseada/...

Once inside, we make sure the text file is in that same folder; if it isn't, we move it there (graphically or via the terminal). Once it is there, we run:

wget -c -i archivo.txt

Simple, isn't it? Who would have thought that one short command line would give you what huge programs like JDownloader offer? Still, let me explain what each part does:

  1. wget is the program that fetches the links and downloads the content.
  2. -c continues the download in case it gets interrupted.
  3. -i is what reads, so to speak, the links from the text file.
  4. archivo.txt... do I really have to explain this one?
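Put together, the whole recipe above can be sketched like this. The folder, file name and URLs are placeholders, and the extra --tries/--timeout flags are only there so the example gives up quickly on dead links:

```shell
# Example session; the folder, file name, and URLs are all placeholders
mkdir -p "$HOME/carpeta-deseada"
cd "$HOME/carpeta-deseada"

# The links file: one direct URL per line
cat > archivo.txt <<'EOF'
http://example.com/uno.zip
http://example.com/dos.zip
EOF

# -c resumes interrupted downloads, -i reads the URLs from the file
# (--tries/--timeout just keep this placeholder example from waiting long)
wget --tries=1 --timeout=10 -c -i archivo.txt
```

In real use the plain `wget -c -i archivo.txt` from the article is all you need.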

And well, that's it, it's really simple but brutally useful, at least for me.

I hope it serves you. Greetings.



25 comments

  1.   vicky said

    Can you do the same with axel?

    1.    dwarf said

      I have no idea; I only found out about this a day ago, hahaha

  2.   rogertux said

    Really useful!

  3.   nelson said

    This is super useful, this is how I download things on my server 😀

  4.   martin said

    With JDownloader I download from MediaFire. Can I do that with this?

    1.    AurosZx said

      Yes. You just have to find the direct link (right-click the download button > Copy link address). That is, you cannot use links like http://www.asdf.com/montondeletrasynumeros unless they end with the file name, for example http://www.asdf.com/loquesea/descarga/archivo.zip.

      1.    Sys said

        For that particular case I use Plowshare
        Download and upload files from Rapidshare, Fileserve and other file-sharing websites
        https://code.google.com/p/plowshare/

        1.    Sys said

          Note: Plowshare doesn't need any Java virtual machine or graphical interface or anything like that. It is very similar to Wget.

  5.   diego said

    Thanks, very practical; as you well say, brutally useful.

  6.   Pavloco said

    Very useful, excellent for downloading anime and things like that. Especially with how heavy Jdownloader is.

  7.   Jamin samuel said

    Aha, but nobody is asking the most important thing:

    Do you have to install wget, or does it already come installed on the system?

    1.    dwarf said

      XD it has come preinstalled for years

      1.    Jamin samuel said

        Phew, thanks brother 😉 .. I think JDownloader is done for xD, I'll give it a try and see how it goes ..

      2.    Jamin samuel said

        Ok I need a little help ..

        I opened the terminal and placed

        cd /home

        and I get this $ home

        I try to write my name and it does nothing, I try to write "personal folder" and it doesn't do anything either, I try to write "downloads" which is where I save the text file with the links to download and nothing ..

        conclusion I do not know how to navigate with the terminal through the folders ..

        They teach me? pliss xD

        1.    dwarf said

          The command would be cd /home/your-user/Downloads

          Your username is the one you use to log into the account when you turn on the PC. Do you know what it is?

          Another important detail is that names in the terminal must be written exactly as they appear in the directory: "downloads" is not the same as "Downloads", and "videos" is not the same as "Videos".
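As a sketch of the navigation being described here (the folder name is an example; on a Spanish-language system it might be "Descargas", and the mkdir is only there so the snippet runs anywhere):

```shell
# Hypothetical example: the folder is created here only so it exists
mkdir -p "$HOME/Downloads"
cd "$HOME/Downloads"   # case matters: "downloads" would be a different folder
pwd                    # prints the current folder, to confirm where you are
```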

  8.   sieg84 said

    What I love about KGet are its filters.

  9.   mortadelo_666 said

    +1 to this: "I hate Java with all my being"

    Toucan works for me without problems, but now I use torrents more than direct downloads.

    Thanks for the article

  10.   Hugo said

    I use this variant with the -bci parameter combination to run the download as a background task and resume it if anything fails, and I also combine it with the trickle command to do a little traffic shaping, so it doesn't eat all the link's capacity and I can keep browsing while downloading, for example:

    sudo trickled -d 10 -u 8 -t 2 -N 6 && cd /var/tmp && trickle wget -bci pendiente && tail -f wget-log

    That's one of the things I like the most about Linux, that it allows a lot of customization.
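For anyone puzzled by that one-liner, here is a breakdown; the comments are my reading of the trickle and wget man pages, and the command is only assembled and printed here, not executed:

```shell
# Hugo's one-liner assembled piece by piece. Nothing runs; it is only printed.
cmd='sudo trickled -d 10 -u 8 -t 2 -N 6'   # start the trickle daemon; -d/-u cap download/upload rates in KB/s
cmd="$cmd && cd /var/tmp"                   # work inside /var/tmp
cmd="$cmd && trickle wget -bci pendiente"   # wget under the shaper: -b background, -c resume, -i links file "pendiente"
cmd="$cmd && tail -f wget-log"              # follow the log that background wget writes
echo "$cmd"
```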

    1.    taregon said

      Strange command; I'll start picking it apart. oO

  11.   Ivan! said

    I don't quite understand this... whenever I have used wget, it downloads files with an .html extension... they are tiny and, of course, useless...

    1.    sieg84 said

      Just copy the link and paste it into the terminal to download it with wget; there is no great mystery to it.

  12.   sieg84 said

    A simple way:
    Midnight Commander
    In openSUSE it comes preinstalled; just check your distro's package manager...

  13.   Libertcharrua said

    Hello, I am finding it very useful thanks

  14.   Moses said

    Mmm, haha, I know this is a bit old, but... I follow the instructions and the only thing it downloads is an html...

  15.   maharba_1809 said

    Plowshare is awesome. On Ubuntu it is very easy to install. It works with Depositfiles, zshare, Mediafire. I don't think it handles Mega yet, but it will be a matter of time.