How to keep multiple machines updated at the same time

In situations where you manage multiple machines, Apt can be very helpful, since it makes updating and applying security patches on each machine very simple. The problem is that, with the traditional method, every time an update comes out you have to download a copy of all the new packages on each machine, which means a phenomenal consumption of both our bandwidth and that of the official servers. Fortunately, there is a method that lets us update one of the machines and, from there, update the rest of the machines on our network. Besides reducing costs and optimizing the use of our bandwidth, this method avoids downloading the same packages over and over for the different machines: they all install the packages through our "cache server".


When you are running the same distribution on different machines (at work, in computer labs, on server "farms", in clusters, or even on your small home network), it can be helpful to create a cache repository on your network. Once a package has been downloaded from an official repository, all the other machines can fetch it from that cache, which lives on a machine on your network that we will call the "server". In this way, updates downloaded by one machine can be installed on the others without having to download them again from the official repositories.

Let's first look at some "non-traditional" solutions that I DO NOT recommend, but that will surely come to mind when you think about how to solve this problem.

Share /var/cache/apt

When you install a package on a Debian distro (or one of its derivatives), a copy of the downloaded package is kept locally in the '/var/cache/apt/archives' directory. When a package is required, Apt first looks in this directory to see if there is a local copy (that is, a cache), thus avoiding an unnecessary download. As a result, many of you will surely have thought that a good way to solve the problem could be to pick one computer, designated as a kind of server, that would be updated from the official repositories and would share its '/var/cache/apt/archives' directory with the rest of the machines on the network. However, this method can lead to problems with file locking, since Apt locks its cache while it works and concurrent access from several machines can clash. In general, it is neither the most widely used nor the most convenient solution.
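Just to illustrate what this would look like in practice (and not as a recommendation), here is a minimal sketch using NFS. The network range, server IP and mount point are assumptions for the example, and Apt's locking issues remain unsolved:

# On the "server": export the cache directory in /etc/exports
#   (192.168.1.0/24 is a hypothetical LAN range)
/var/cache/apt/archives 192.168.1.0/24(rw,sync,no_subtree_check)

# On each client: mount the shared cache over its own cache directory
#   (192.168.1.10 is a hypothetical server IP)
sudo mount -t nfs 192.168.1.10:/var/cache/apt/archives /var/cache/apt/archives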

Move packages

Instead of sharing a common cache directory, another alternative is for each computer to use its own local cache directory while a script takes care of copying the packages from one machine to the others so that they all stay up to date. One tool for this task is 'apt-move', but I honestly do not recommend it because it is not transparent enough for the end user. In addition, it can mean a completely unnecessary use of disk space, since every package must be copied to each of the machines. A sketch of how that copying could look is shown below.
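If you were still tempted by this route, the copying itself does not even need a dedicated tool. Here is a sketch with rsync, where the host name 'client1' is only an example:

# Push the locally cached .deb files from the "server" to another machine's cache
rsync -av /var/cache/apt/archives/*.deb root@client1:/var/cache/apt/archives/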

Dedicated cache systems

The best solution to this problem is to use a dedicated cache system. In short, the idea is to keep a local copy of the packages downloaded from the official servers on one of the machines on your network, and then configure the rest of the machines so that, instead of fetching updates from the official servers, they fetch them from this local cache (or copy).

There are a variety of systems designed to work with Apt, including apt-cacher, apt-proxy, and apt-cached.

Here we are going to deal with apt-cacher, which is the easiest to use.

apt-cacher

Apt-cacher is quite different from other repository caching systems: it is not a stand-alone program but runs as a CGI script under Apache. This has several advantages. It keeps the tool small and simple yet powerful, and therefore more robust, because it does not need its own code to handle the protocols; and it is very flexible, because you can use Apache's access control mechanism if you want to limit which machines are allowed to use the cache.
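For example, access to the cache could be limited to your LAN with something like the following in the Apache configuration. This is only a sketch: the address range is an assumption, and the directives shown are the classic Apache 2.2 style (Apache 2.4 uses 'Require' instead):

<Location /apt-cacher>
    # Only machines in the (hypothetical) 192.168.1.0/24 network may use the cache
    Order deny,allow
    Deny from all
    Allow from 192.168.1.0/24
</Location>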

Apt-cacher only needs to be installed on one machine, the one that you decide should function as your local repository cache. Then, the rest of the computers on your network must be configured to request updates from the cache and not from the official servers.

Server configuration

To install it, just run:

sudo apt-get install apt-cacher

This package depends on apache, perl and wget, so they will be installed automatically if you don't already have them.

Once installed, it is recommended to restart Apache (on current Debian/Ubuntu systems the service is called 'apache2'; on very old installs it may simply be 'apache'):

sudo /etc/init.d/apache2 restart

Finally, all you have to do is adjust the default parameters of the script. Type in a terminal:

sudo gedit /etc/apt-cacher/apt-cacher.conf

In general, all the default values are fine, but it is recommended to adjust the following three:

admin_email = mymail@myserver
generate_reports = 1
expire_hours = 24

The first item is the email address to which the generated reports will be sent. The second is a Boolean variable that controls report generation (0 means no reports, 1 means reports are generated). The third and final item determines the number of hours apt-cacher waits before checking the official servers again for available updates.

If you are using a proxy, don't forget to add the following items:

http_proxy = proxy.example.com:8080
use_proxy = 1

To verify that it works, you can access your local cache at the URL http://server_name/apt-cacher/ and a page will appear showing the apt-cacher configuration. Remember that 'server_name' must be replaced with the IP of the machine that you designated as the 'server', that is, the machine holding the local package cache.
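You can also check it from the command line of any client (wget is already installed as a dependency of apt-cacher). As above, 'server_name' stands for the IP of your server:

# Should print the beginning of the apt-cacher information page if the cache is reachable
wget -qO- http://server_name/apt-cacher/ | head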

Client configuration

Now all you have to do is modify the sources.list on each client so that its requests go through the server. If the server's IP is 123.123.123.123, you have to add it to each line of sources.list; also make sure that all the clients point to the same server, otherwise the cache loses much of its purpose.

sudo gedit /etc/apt/sources.list
Note: Be careful! In Debian and its derivatives, 'sources.list' is stored in '/etc/apt'. However, in other distributions it might be stored in another path. If you can't find the file, you can always locate it by entering 'locate sources.list' in a terminal.

Once the file is open, and assuming the IP of our server is 123.123.123.123, all the lines should be modified following this pattern:

# Original
#deb http://ftp.us.debian.org/debian/ sid main contrib non-free
#deb-src http://ftp.us.debian.org/debian/ sid main contrib non-free

# Modified
deb http://123.123.123.123/apt-cacher/ftp.us.debian.org/debian/ sid main contrib non-free
deb-src http://123.123.123.123/apt-cacher/ftp.us.debian.org/debian/ sid main contrib non-free

As you can see, you have to insert the server's IP plus '/apt-cacher/' at the beginning of the URL; the rest of the original line stays the same.
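If you have many lines (or many client machines), you can do that prefixing in one pass. Here is a sketch using sed with the example IP from above; back up your sources.list first, since the expression assumes plain 'http://' lines:

# Keep a backup of the original file
sudo cp /etc/apt/sources.list /etc/apt/sources.list.bak
# Insert "123.123.123.123/apt-cacher/" right after "http://" on every deb / deb-src line
sudo sed -i -E 's@^(deb|deb-src) http://@\1 http://123.123.123.123/apt-cacher/@' /etc/apt/sources.list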

Traffic statistics

If you set 'generate_reports = 1' in the file 'apt-cacher.conf', apt-cacher will generate access statistics, which you can view at the URL '/apt-cacher/report' on your server.

If, for any reason, you want to generate the statistics ahead of the regular schedule, run the following command:

/usr/share/apt-cacher/apt-cacher-report.pl
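If you prefer the statistics to be refreshed on a regular basis, one option is to call that same script from cron. The schedule below is arbitrary, just a sketch of an entry in /etc/crontab:

# Regenerate the apt-cacher traffic report every day at 06:00
0 6 * * * root /usr/share/apt-cacher/apt-cacher-report.pl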

Comments

  1.   eduardo said

    Greetings, excellent contribution. My question is whether you know of any application that provides a centralized repository for applying patches to different distributions, that is, keeping several machines updated at the same time even though they run different distributions.

  2.   Let's use Linux said

    Hi Eduardo! Honestly, I think that would be very difficult. If you discover a way, please let me know.
    A big hug! Cheers! Paul.

  3.   stingy said

    I use Puppet to remotely manage my servers.

  4.   Let's use Linux said

    Yes. Thank you very much for the information. I already corrected it.
    Hug! Paul.

  5.   Geniutrixone said

    Hello,

    the tutorial is really good... just one note: in Debian Lenny the sources.list is in the path /etc/apt/

    regards

  6.   sepulvedamarcos said

    question….

    if I have a couple of machines with the same distro, but not with the same programs, how does it know what to download from the official repos? Does it download everything?

  7.   Let's use Linux said

    Your question is excellent. I figure the system should work just like regular Apt: if it does not find the package in the cache, it downloads it from the official repositories. In this case, one of the "client" machines tells the "server" that it needs an update, according to the list of updates held on the "server" of your network. To install that update, I figure it will first look for the package in the server cache. If it cannot find it, it downloads it from the official repositories, stores it on the server and, from there, it is installed on the machine that requested it. That package then remains available in the "server" cache so that other machines on your network can install it from there as well.

    Please feel free to write if I was not clear enough.

    A hug! Paul.

  8.   Mishudark said

    I think there is an ERROR... packages are not stored in /etc/apt... they actually stay in /var/cache/apt/archives

  9.   Let's use Linux said

    Honestly, I don't know.
    There is surely a way to do it. 🙁
    If you find out, let me know and I'll add it.
    Cheers! Paul.

  10.   Alvaro said

    No way to do this with dynamic IPs, right?

  11.   Marce said

    Apparently, in more recent distros you have to add the port (3142 by default) to the server's URL on the local network. It would look like this: http://mi_servidor:3142/apt-cacher

  12.   alfredo torrealba said

    I have Lubuntu 16.04. Has anyone done this on that system, and did it work for them? And what I wanted to ask is the following: if I set up this server and my other machines do not have the same programs, when one of my client machines requests the installation of a program that I already have on the server, will it be installed directly from the local server, or will the request go to the official repository server?