Why it is better to compile than to install from the repositories

In this little guide I am going to explain (and teach you) why it is better to compile a program (say Firefox, VLC, etc.) from its source code than to download it (from the Software Center, Yumex, Pacman, etc.) and install it.

First, let's go over the theory:

What is "compile"?

Compiling means transforming source code (code written in a certain programming language, say C, C++, etc.) into an executable program: a compiler translates the language the code was written in into assembler, and from there into the binary machine code that the processor runs. (It is sometimes confused with packaging, but packaging a program for distribution is a separate step.)
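
To make the idea concrete, here is a minimal sketch (hello.c is just a throwaway example): a one-line C program turned by the compiler into a native executable.

echo 'int main(void){return 0;}' > hello.c
gcc hello.c -o hello    # the compiler turns source code into a binary for this machine
./hello                 # run the freshly built executable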

Why is it better to "compile"?

First you have to know the following to understand why. Put crudely (simply, not very professionally), each family of processor (Pentium, Core, Atom, etc.) from each maker (Intel, AMD, ARM, etc.) has instructions (the low-level operations the chip understands, expressed in assembler) specific to its model (Core i7, Core i5, Atom x2, Phenom x8, ARM, etc.), and also general instructions that all of its kind share.

When you download a program that installs automatically from the repositories through the Software Center / apt-get / Yumex / Yum / Pacman / etc., it comes precompiled to run on every possible processor (Intel and AMD). Since it is a precompiled program, the instructions specific to your particular processor model are lost (consider that if a program like Firefox or Chrome, which have more than 7 or 8 million lines of code, had to include the specific instructions of every processor on the market, the amount of code would be so large that the program would no longer be efficient), leaving only the general instructions of its maker brand (Intel, AMD, ARM).

When you download, unzip and compile the source code of a program yourself, it is compiled with the specific instructions of YOUR processor (which does not mean it will not work on a machine with a different one, only that it will be optimized specifically and purely for your processor), thus unleashing all the power your processor is capable of giving thanks to its specific instructions.
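
If you are curious which specific instructions your machine exposes, two quick commands show them (a hedged aside; the exact output varies per CPU):

grep -m1 flags /proc/cpuinfo                      # instruction-set extensions (sse4_2, avx2, etc.)
gcc -march=native -Q --help=target | grep march   # what -march=native resolves to on this CPU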

In more technical detail, these specific instructions are closely tied to what is known as the chipset of your motherboard, which is the great headache for those of us with Intel when we want to upgrade the processor and the motherboard.

You would be surprised at the power that your AMD Athlon X2 or your Intel Core 2 Duo, i3, etc. in your old PC can still deliver. Now do you understand why there is so much talk in the Linux world about compiling the famous kernel (the heart of every operating system)? Imagine if you compiled a whole system (graphical environment (GNOME, KDE, etc.), kernel, commonly used programs (Firefox, VLC, Chrome, Wine, etc.)) especially for your PC: imagine the speed and the level of optimization you would have.

This principle of compiling to obtain code optimized especially for your machine is the one used by distros such as Gentoo and its derivatives (which I am not going to talk about now; I use a minimal Fedora on which I compile GNOME 3, the kernel and other programs), where the system, its updates and its programs are always compiled.
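
On Gentoo, for example, those machine-specific optimizations are declared once in /etc/portage/make.conf and applied to everything the system builds; a minimal sketch, with illustrative values:

# /etc/portage/make.conf (illustrative values, adjust to your machine)
CFLAGS="-march=native -O2 -pipe"   # generate code for this exact CPU
CXXFLAGS="${CFLAGS}"
MAKEOPTS="-j5"                     # parallel build jobs, often cores + 1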

Cons of compilation:

I already explained all the advantages, but like everything in the universe it has its downsides.

In the case of compiling they are:

  • The time it takes (Firefox with an i7 4790K (no overclock, since I'm very bad with voltages) takes 3 minutes; GNOME Shell (just the bar) together with Gnome-Control-Center took me about 2 minutes, both compiled at the same time on Fedora. On a machine with a less powerful processor, though, this time can be disproportionately long).
  • The processor runs at 100% with all its cores at maximum, so power consumption and heat skyrocket (take this into account if you overclock, or especially if it is a notebook), so it is wise to prepare a mate or a coffee for the occasion (see the sketch after this list for how to control the number of parallel jobs).
  • You may be missing a library (tool) that a program needs, which makes the compilation fail. In general, all distros have packages or sets of packages to avoid this (metapackages bundling the various libraries and tools commonly needed during the process).
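
About that second point: most make-based builds let you choose how many parallel jobs to run, which is what drives every core to 100%; a small sketch:

nproc               # shows how many cores this machine has
make -j"$(nproc)"   # use them all; pick a smaller number to keep the machine cooler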

How can I compile?

For Debian (Ubuntu, Mint, Elementary, etc. are all derivatives of it, so follow this):

Here I am talking about compiling a program for normal use, not a kernel.

aptitude install build-essential dh-make devscripts fakeroot debhelper debian-policy ccache dh-autoreconf autotools-dev

I included debian-policy, but if your distro is not Debian and it complains that no such package exists, just ignore it. I should clarify that I have not used these systems for a long time, so if a package is no longer in the repositories, do not worry.
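
A related tip, assuming your /etc/apt/sources.list has deb-src lines enabled: apt can fetch everything needed to build a specific repository package (vlc here is only an example):

sudo apt-get build-dep vlc    # installs the build dependencies of VLC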

For Fedora:

sudo yum -y install kernel-headers kernel-devel
sudo yum groupinstall "Development Tools"
sudo yum groupinstall "Development Libraries"

Here I have to apologize to those who use Arch (I do not know that distro well) and openSUSE, since I do not know those distros or the respective packages needed for a correct compilation (and I have not verified what is on the net, so for those two I don't know whether they work).

Now that you have all the necessary requirements, you only need to download the source code of the program you want to compile, unpack it from the terminal according to its extension (don't worry, I'll leave you the commands) and, once inside the folder (always in the terminal), do the following:

If the program lets you run a configuration step to choose components and other options:

./configure

Then you type:

make

And finally, to install the program on your Linux:

make install

All of this always as root (su in Fedora; sudo su in Ubuntu and its derivatives (Mint, Elementary OS, etc.)).
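
Putting the whole sequence together, a minimal sketch (program-1.0.tar.gz is a hypothetical name; see the command table below for other archive types):

tar xzvf program-1.0.tar.gz   # unpack the source
cd program-1.0
./configure                   # ./configure --help lists the available options
make
make install                  # as root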

Commands to unpack from the terminal (the file is unpacked into the folder where the archive is located):

.tar (tar)
Pack: tar cvf file.tar /file/*
Unpack: tar xvf file.tar
View contents: tar tvf file.tar

.tar.gz - .tar.z - .tgz (tar with gzip)
Pack and compress: tar czvf file.tar.gz /file/
Unpack and decompress: tar xzvf file.tar.gz
View contents (without extracting): tar tzvf file.tar.gz

.gz (gzip)
Compress: gzip -q file (compresses the file and renames it "file.gz")
Decompress: gzip -d file.gz (decompresses it and leaves it as "file")
Note: gzip only compresses files, not directories

.bz2 (bzip2)
Compress: bzip2 file (compresses the file and renames it "file.bz2")
Decompress: bzip2 -d file.bz2 or bunzip2 file.bz2 (decompresses it and leaves it as "file")
Note: bzip2 only compresses files, not directories

.tar.bz2 (tar with bzip2)
Compress: tar -c files | bzip2 > file.tar.bz2
Unpack: bzip2 -dc file.tar.bz2 | tar -xv, or tar jxvf file.tar.bz2 (recent versions of tar)
View contents: bzip2 -dc file.tar.bz2 | tar -tv

.zip (zip)
Compress: zip file.zip /my/files
Unzip: unzip file.zip
View contents: unzip -v file.zip

.rar (rar)
Compress: rar a file.rar /my/files
Unpack: unrar x file.rar
View contents: unrar v file.rar or unrar l file.rar

And that's all. Greetings from Buenos Aires, Argentina. Happy Holidays and Happy New Year! :)



  1.   Gonzalo said

    The problem with compiling is that it does not always work the first time and is more tedious

    1.    Cristian said

      The problem with compiling is that unless you have an old and limited PC, the improvements will not be noticeable ... well maybe on a computer with intensive use it is an option, but for most users it is just a tedious process.

      1.    Daniel said

        I think that's the heart of the matter. Is the performance improvement you notice from compiling packages significant enough to push the time and hassle of the task into the background?

      2.    joaco said

        Even so, if you have an i7, compiling is worthwhile, because it is faster and I reckon things run somewhat better. With a PC with an Intel Atom, though, it is not worth it, unless you really need the extra power that compiling gives, because compiling a program can take hours on a less powerful processor.

    2.    Avrah said

      I totally agree; it has happened to me to compile something and find out after a while that a library is missing, track it down and have to face the whole process again... It is rare for everything to work on the first try... xD

  2.   FerGe said

    Very interesting!

    If you compile a program, how do the updates work afterwards? Are they automatic or do we have to be aware of whether a new version has come out?

    1.    Antonio Campos said

      You have to update it manually, that is, by compiling the most recent version; that is another, let's say, "disadvantage", which also makes it somewhat tedious.

    2.    jlbaena said

      There are no automatic updates; in fact, Linux distributions, their different ways of packaging software and the corresponding package managers exist precisely to remove the inconvenience of recompiling for each new update (and of resolving dependencies).

      Greetings.

    3.    joaco said

      If you compile it by downloading the source code from some page, then yes, you have to do it manually, and learn how to install it, because not all programs are installed the same way.
      Now, if you have Gentoo or some distro with ports, then you do it from the repositories almost automatically.

    4.    Fermin said

      In Gentoo your package manager, Portage, takes care of updates and dependencies; I don't know about other distros. Of course, each update involves recompiling, obviously.

  3.   tanrax said

    There was a time when I compiled everything I could. Then I got tired, especially because of the time I had to dedicate to the machine working (45 min for the kernel, 10 min for Chromium...) and because of the time I spent fixing problems that arose along the way. Besides, I personally did not find any increase in performance; I had the feeling that everything was the same. For these reasons I now use everything precompiled; everything is instantaneous and without conflicts. Although I learned a lot in that period when I wanted to use Gentoo 🙂

  4.   Emmanuel said

    Also, and it is something I have seen little of, you can compile through systems like apt: add a build flag to apt-get source and voilà. Of course, before that you must install the packages needed to carry out the compilations, otherwise it does not work... although it is a more direct form of compiling, involving fewer steps, since only the first time do you spend it installing packages; afterwards it is just the resolved dependencies and the package itself.

    Greetings.

    1.    joaco said

      There is apt-build for that, although I think it doesn't compile the dependencies but installs their precompiled binaries.

  5.   xikufrancesc said

    From the first moment I saw the headline I couldn't help thinking the same thing, and after reading the entire excellent article the idea has been going around my head a thousand times: Gentoo... Gentoo, where are you?
    Compiling is wonderful, and being able to enjoy certain features and use them is priceless, but time and "current needs" are unforgiving, and they do not fit.
    Maybe we need something in between, where neither libraries nor the details of a version change waste so much time. We shall see what happens, or whether we really get down to compiling with the aptitude, urpmi and zypper we already have installed.

  6.   anonymous said

    3 minutes for Firefox!? ... Did you mean 30?

    That took a long time on my PC with an FX-8350 at 4.5 GHz; I use Gentoo.
    $ genlop -t firefox | tail -n3
    Sat Dec 6 20:00:00 2014 >>> www-client / firefox-34.0.5-r1
    merge time: 16 minutes and 35 seconds

    Those instructions specific to each processor are called mnemonics and are physically implemented inside the microprocessor; they are what make up the machine language. Therefore, compiling so that a program can run on many types of microprocessors necessarily means restricting yourself to the smallest set of mnemonics common to all those microprocessors... wasting real capacity of the most current and powerful ones.
    That is how companies and binary GNU/Linux distros do it.
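
    (An illustrative sketch of that trade-off; foo.c stands in for any source file:)

    gcc -O2 -march=x86-64 -c foo.c   # baseline: only instructions every 64-bit x86 CPU has
    gcc -O2 -march=native -c foo.c   # everything this particular CPU supports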

    1.    Shyancore said

      For me, with an Intel i7 4790K and 18 GB of RAM, it took what I said above.

      1.    anonymous said

        I understand that your chip is superior, but the difference is abysmal; it must really fly at that speed. Maybe it is something related to dependencies, or to USE flags, which are the equivalent of the configure options when compiling by hand.

      2.    Jhonny said

        A small detail you glossed over: 18 GB of RAM, apart from the i7. Not everyone has such a machine. You could run a benchmark so the difference becomes visible, because the theory is nice, but let's see whether it pays off.

      3.    Cristian said

        Another important detail: the processor is Intel, which therefore has the best floating-point performance regardless of the model, a very relevant feature for this type of process.

    2.    ezequiel said

      True, compiling is tedious. But you learn a lot wrestling with Makefiles, libraries, etc. It is something worth doing at least a couple of times. I use everything precompiled for the same reason Tanrax cited.

      Greetings from Argentina!

  7.   Erick carvajal said

    The problem I generally have when trying to compile brand-new versions of programs is always the dependencies; sometimes it is necessary to compile all of them first (to get to the latest versions) before you can even think about compiling what you actually want.

    PATH and FLAGS problems are the things that still keep me from wanting to compile everything (although I usually manage one way or another). One of the resources I usually consult in order to compile dependencies is this site - http://www.linuxfromscratch.org/ -

    #LinuxFromScratch is a project that provides "step-by-step" instructions for compiling the source code you need to use on a system... (98% of what I have needed to compile I have achieved by following its guides and learning little by little).

    As a plus, I think compiling a system from scratch would be interesting above all for development environments or servers, among other things that, let's say, "do not change as much" as a personal computer, on which we are constantly installing and changing everything (that is my point of view); besides, the modest performance gain matters a great deal in that kind of application.

    These are points that are talked about very little nowadays and that only the "scholars" handle, but it would be interesting to give these things the tutorials they need, so that every day we find more people contributing to the different communities they take part in, and GNU/Linux does not stall for lack of collaborators; even if until now "it has worked this way", it is not very healthy to have only end users.

  8.   Rabuda Eagle said

    Allow me a small addition. To obtain the advantages presented here, you must properly configure the well-known make.conf. The processor family and the compilation flags are indicated there. Likewise, you can specify there the number of cores to use during compilation. When you use all the cores of your processor, compilation time is drastically reduced.

    All the best

  9.   Sebastian said

    Very good article. I would also have liked an example, or, even better, a post on how to compile on Arch Linux or how to use the AUR. Happy New Year from Mendoza.

  10.   TheGuillox said

    A long time ago... I always compiled the kernel, but it is very tedious to have to wait 40 min :/ Anyway, I haven't compiled anything for a long time, except the video drivers (only for special configurations).

  11.   Alejandro said

    The article is very interesting, but no, sir, packaging and compiling are not the same thing ;)..

  12.   c4explosive said

    Very good post. I agree about compiling certain programs, although sometimes it is somewhat tedious how long the machine takes over the process. But apart from that, one learns a lot, especially when libraries or packages are missing.
    I think that for Arch Linux you need the following package in order to compile: base-devel
    pacman -S base-devel

  13.   rat kill said

    The info is very good, but the truth is that compiling is not necessary; if you are a standard user and you just want things to work, don't even touch it. Compiling is tedious: always, and I mean always, you are missing a library or you run into one problem or another. Tell me to compile a Minecraft server so that everything runs as well as possible and I'll take the time... but every time an update or a patch or whatever comes out, you start compiling all over again xd

    1.    kik1n said

      Exactly. Compiling is for very specific programs that you need to run optimally, because compiling everything, when, as you say, there are always updates, especially on rolling-release distros, is annoying. I would only recommend it for LTS kernels.

  14.   FedoraUser said

    Today almost all the processors people use support the same instructions, so compiling is only worthwhile for the kernel, for systems such as servers, and obviously when there are no precompiled packages; everything else is a waste of time.

  15.   John Mere said

    Good contribution, I'm going to give it a try and see how it goes; so far most of the time (almost always) I have installed from the repositories...
    A small observation: the rar command options take no hyphen, and bunzip2 only decompresses.

  16.   santiago said

    The most I ever compiled was a kernel for Debian Wheezy, and it took me about 2 hours (I have an AMD E-450, a 1.6 GHz dual-core CPU); that is precisely why I do not install Gentoo: compiling and downloading the whole system would take me about 18 hours, and that is if I run into no problems. It is true that compiling is better, but most of the time it takes too long, and I don't think it is worth it. You get a speed boost, but not a big one, and I don't think it justifies all the time invested. Although, if one day I have a PC with a processor as good as yours, I will try to install Gentoo.

  17.   vampire said

    People:

    Without any intention of starting a flame war or anything, Slackware users see it as natural to compile, generate the binary and install it with the relevant package manager (which obviously resolves dependencies: slapt-get, swaret, slackyd and/or several others), with everything optimized for our machine, as if it were nothing; it is hardly rocket science or quantum physics.

    Watching a DVD without stuttering on a P3 750 MHz with 192 MB of RAM is neither impossible nor difficult to achieve on Slackware. I can attest to it, and it is faster than compiling a Gentoo. But it's not the same; I also use Gentoo.

    The difference between hacker and consumer is that the consumer says "I wish it would work that way" and the hacker "I have a screwdriver and a few minutes" - Rael Dornfest

  18.   pepenrike said

    Is there really a noticeable performance improvement?
    With a latest-generation i7 and 18 GB of RAM, how do you even notice the difference between compiled packages and binaries?

    I have always had my doubts about the usefulness of compiling packages myself, but I think that in today's desktop environment it is very hard to sustain, especially because of the complexity of the dependencies, the continuous updates, and the enormous dependence on non-free sources, as in the case of proprietary drivers, which undoubtedly influence performance far more than any aspect that can be compiled...

    regards

    1.    Shyancore said

      Considering that for GNOME 3 I only compiled (I will name things roughly, since I do not remember the package names): the shell (the bar), gnome-control-center (complete, with its dependencies, etc.), the clock applet and about 2 or 3 dependencies for the shell to work. Obviously the shell required more dependencies for all its functions to work, but it led me to compile GDM among others; I fixed that by adjusting it with GConf once the shell was compiled.
      Now, when I log in (via terminal), the environment takes much less time to load than it did precompiled. Timing it roughly: precompiled, I think it took about 3 or 4 seconds to load the shell (plus about 5 during which the wallpaper is shown; I never understood why it took so long, I suspect the driver of the GT 630), whereas compiled, as soon as I enter the password, Xorg starts and the environment loads (with preload and prelink I made them much faster; I believe because they end up in the cache; https://www.google.com.ar/search?q=preload+y+prelink+fedora&ie=utf-8&oe=utf-8&gws_rd=cr&ei=iXaqVPykO4qYNpbTgdAP )

    2.    mario said

      The fact that the i7 has SSE4 and SSE3 instructions, which are ignored by the generic builds of various distros (Debian builds for 486, Ubuntu for 686), should give you an idea of how much hardware is wasted trying to span a 20-year-old processor (maybe thanks are due for supporting my old Pentium MMX). If you need "proprietary drivers", as you mentioned, the kernel provides the ability to load specific firmware at compile time. No more weird issues with xorg.

  19.   Fabian Alexis said

    Thanks for the info, it's always good to learn (or re-learn) (:

  20.   Javier said

    From Debian, gladly, to Gentoo 🙂
    http://crysol.org/es/node/699

  21.   yuan six said

    Another disadvantage is that compiling in the terminal is for users who already have some knowledge of Linux. Is there a graphical tool that manages the compilation, installation and updating of programs, but graphically?

    1.    mario said

      Calculate Linux does that; it is a Gentoo with graphical tools ready for compiling. On Phoronix they usually recommend it.

  22.   José said

    I am a Linux user; sometimes when I want to install a program from the repository, an old version of the program gets installed, simply because the newer ones were not compiled for the distro in question. I think that knowing how to compile is essential, all the more so when you use rare distros.

  23.   joan said

    Everything the post says is fine and I don't doubt it's true, but the performance difference between installing a binary package and compiling it yourself is imperceptible to a user.

    And the disadvantages of compiling are many, and they clearly are perceptible to the user. That is why I personally pass on compiling.

  24.   NauTiluS said

    Where I noticed the most performance when compiling the kernel was on a laptop with an AMD64 processor. The change between the stock kernel and the compiled one was brutal.

    Right now I have a stock kernel on my system because, as people say a lot around here, there was a time when I compiled almost everything, and I got tired.

    Right now I only compile a few vitally important programs, such as for running a small server or for playing with emulators. Not long ago I wrote a post on how to compile the MAME version. These programs do generally show it when they are optimized for your system.

    I just need to try that Gentoo distro and see how the performance goes.

  25.   NauTiluS said

    I forgot to add: for people who take a long time compiling the kernel, more than 30 minutes, there are several tricks to do it in less time.

    One of those tricks is to compile only the modules for your hardware: maybe 70 modules at most will show up, and if we add iptables support with all its requirements, I reckon it rises to about 300 modules. Come on, that is much better than compiling the 3000-odd modules currently built when the kernel modules are compiled as they come from the factory or, as they say, vanilla.

    The tool that will help you find out which modules the kernel currently recognizes on your system is "localmodconfig", or the script "streamline_config.pl" found inside the kernel source tree, under "scripts/kconfig/".

    Of course, make sure you have all your USB devices connected, since once the kernel recognizes all your modules it is just a matter of compiling.

    The kernel will be very light, you will feel a certain air of freshness in the system, and system startup and shutdown will also get faster.
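
    A minimal sketch of that workflow (the source path is an assumption; run it inside the kernel tree with your usual devices plugged in):

    cd /usr/src/linux
    make localmodconfig            # enable only the modules currently in use
    make -j"$(nproc)"
    make modules_install install   # as root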

    Greetings.

  26.   tabris said

    Life is not that easy! There are programs that use CMake or other build systems, and keeping everything updated and compiled takes time. And with a CPU like that, what difference is it really going to make to you?

  27.   Yoyo said

    The problem with compiling is that some of the programs we install that way cannot be uninstalled afterwards, or throw errors when we try, so we cannot remove them.

    1.    anonymous said

      You must keep the folder with the compiled sources; when you want to uninstall, all you have to do is go back to the source folder and, from a terminal as root, execute:

      # make uninstall

      Of course, packages compiled by hand are, by default in every serious distro, installed separately: that is, in /usr/local/bin, not in /usr/bin, where the distro's package manager puts them by default; that way nothing gets intertwined.

  28.   FreeBSDDICK. said

    The article raises several interesting things, but it sorely lacks quality in its terminology and logical structure.

    «...into an executable program for its operation by using the PROCESSOR to convert the language used to generate the code into binary and assembler. It is also often called packaging.»

    False. A compiler is what is actually used; it is in charge of translating the instructions of a given programming language into the corresponding assembly language, and then translating that into machine language.

    Assembly language is a set of mnemonics that reflects a group of instructions resident in the registers of the chip.

    "When you download, unzip and compile the source code of a program yourself, it is compiled with the specific instructions of YOUR processor"

    When compiling a program, it will simply be built with the instructions common to the architecture. It is up to each user to activate the corresponding compiler flags in order to optimize a program for a specific processor.

    Regarding what you say about compiling the kernel:
    When you compile the kernel you are looking to activate or deactivate features that may or may not be useful at a given moment, which will not necessarily be reflected in the size/speed ratio of the running system.

    When you refer to the following section:

    dh-make devscripts fakeroot debhelper debian-policy ccache dh-autoreconf autotools-dev build-dep

    These programs are not essential for compiling a program. As was hinted at the beginning, the number of programming languages means you cannot know for certain which tools you must have installed to compile programs on GNU/Linux; you only find that out by consulting the documentation of the program you want to build. The programs you mention are used to DEBIANIZE and package, in that format, a program that may or may not be compiled.

    There are other issues in the article that turn out to be somewhat ambiguous as posed. It would be difficult to address them all.

    I suggest the author review the article as far as possible, and I urge better quality control of the publications.

    1.    pepenrike said

      Man, it's not that bad either.

      The article is not for Science magazine; it is simply an introductory article, and I think that, in the terms in which it is written, it goes deep enough for a novice user to understand the key concepts.

      If we get academic, three quarters of what is published on the internet would be worth absolutely nothing.

      Let's not be so purist... it is impossible to agree 100% with an article, but we cannot continually judge "technical" quality as if we were examining a doctorate.

      My full support to the author of this article

  29.   Nonamed said

    Interesting article.

    It is always good for freedom lovers to use unar instead of rar, so as to unpack rars freely. ( https://packages.debian.org/jessie/unar )

  30.   Jumi said

    I've hit a snag with this issue... I started searching on Google but I can't find a tutorial for compiling Firefox under Ubuntu 14.04 amd64... failing that, tonight I'll build the kernel with the following tutorial: http://www.redeszone.net/2014/11/28/como-instalar-el-ultimo-kernel-de-linux-en-ubuntu-14-04-lts/

  31.   carlos ferra said

    Good article, I'm learning a lot. But I would use this only for some specific program that consumes a lot of resources, like video editors for example. Greetings.

  32.   babel said

    Between this article and the one about Gentoo they published a few days ago, I am tempted to install Gentoo on my PC. Many years ago I used Sabayon, which eased the entire installation process while keeping the essence, namely compiling from source. I honestly do not remember noticing any difference in the performance of my laptop with Sabayon versus Ubuntu, so I do not know whether it is worth all the work of wiping my Arch, which works very well, to install it. I'm not sure a few milliseconds per program are worth it.

    1.    anonymous said

      To the 4 PCs with Gentoo that I have installed and keep updated, add the notebook that had Arch Linux... systemd wore me out; I already had to use startx because after the last update both cores shot up to 85% usage while doing nothing. I looked into it, and it seems something changed in systemd that makes slim go crazy and eat the processor.
      Enough; Arch was enough... it lasted a long time, more than two years. Now I am installing Gentoo; I am going through the stage3 testing update, and tonight an Openbox with fries on the side will go in.

  33.   Leo said

    Good article; it makes me want to compile QupZilla, but with a Sempron it will take days... well, I know, not quite that long, but it still gives a bad feeling.

  34.   Manuel Aponte said

    Another disadvantage of compiling is that when there is an update you have to compile and install it again, which is a problem considering that some programs have short development cycles and issue updates frequently, every 2 or 3 months; with all this the casual user gets bored, and the constant user spends a great deal of time keeping the system up to date.

  35.   Manuel Aponte said

    I would like to know which applications are most worth compiling, according to their usefulness, update frequency and performance improvement.

  36.   Alex Pol said

    This is absurd; if you need to compile things yourself, you are using the wrong distribution. The only reason to compile is to add debugging options, slowing a program down in exchange for being better able to fix other people's bugs.
    Your system is not slow because it needs -O3; it is slow because some program is reading too much from disk or painting too much on the screen.

    My recommendation: instead of micro-optimizing our systems, let's work as a community to improve the software we all share.

  37.   Javier Fernández said

    You have not explained how to optimize the compilation; for example, in Gentoo, USE options are used to optimize the generated code, and you also have to indicate the processor, etc. How is that done in Ubuntu/Debian or Arch? Interesting article.

  38.   Jose Manuel said

    Hello!

    I have yet to read the comments below, so here goes a Linux newbie question:

    I use Fedora 20 and already have quite a few things installed, for example the Firefox browser. To compile it for my machine, can I just go ahead? That is, download the code and compile it, or do I first have to remove the program I already have installed and then compile the new one...

    The same with the Linux kernel and such...

    Maybe I am asking something absurd, but I already said I am quite a newbie to the serious Linux stuff lol

    Cheers!

    1.    Koprotk said

      I think it is not necessary for the kernel, but you must create an entry in GRUB for each kernel. With Firefox I do not know whether having two copies is advisable; personally I prefer to have just one kernel and just one Firefox.

  39.   st-avapxia said

    The only thing I have ever compiled was a development version of Musique. I really like that app; it was worth all the time the process took. For an end user like me, when I finished I felt fulfilled.

    Greetings, excellent blog.

  40.   eco-slacker said

    Hello, I use Slackware, and compiling applications is the most normal thing in the world there.
    You install the system from an already precompiled ISO, and the precompiled applications you can use from the official repository are few, although if you want you can download the source code of the system (and the original scripts with which the entire distro is compiled) and compile it yourself, which is how I imagine Gentoo works.
    However, the SlackBuilds project provides scripts (similar to the official ones) for many third-party applications: you download the source code of what you want to install and convert it into a tgz or txz package that is then installed with the distro's official package manager. The advantage is that you avoid using the configure, make, make install commands, and you can update, reinstall or remove the package like any other, very easily.
    The downside is that dependencies are not resolved automagically in Slackware as in other distros, so you have to compile the necessary dependencies first and the application you want to install last. The compiled programs I use include LibreOffice, Texmaker, Spyder, Qt5, QtCreator, VLC, Wine, GRASS and QGIS, among others. Depending on the application and its requirements, compiling and installing can take anywhere from 5 minutes to several hours, but if you want you can find and use a precompiled package to save yourself that time.
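
    A rough sketch of that flow (myapp and its version are hypothetical): with the SlackBuild script and the source tarball in the same directory, you run:

    ./myapp.SlackBuild                            # builds a package under /tmp, e.g. myapp-1.0-x86_64-1_SBo.tgz
    installpkg /tmp/myapp-1.0-x86_64-1_SBo.tgz    # install it with Slackware's package manager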
    I haven't had time to check whether there is much difference between compiled and precompiled packages, but my system is very stable. Then again, I think that at least on my laptop there is not much difference, because it is not that powerful; it has an i3 processor and 4 GB of RAM.
    Greetings and good luck compiling.

  41.   Koprotk said

    I am currently using Funtoo. To be honest, I don't see any performance difference between compiling a program and installing the precompiled one; I do it for purely educational purposes. But there is indeed a difference between compiling the kernel and not doing it. When I used Debian and wanted to compile something, I used the following sequence:

    ./configure
    make -j3 (number of cores + 1)
    alien

    I used alien because it creates a binary package from the compiled program, so you can install it on your system like any other binary package; then, if you want to uninstall it, you can simply use Synaptic or another package manager. That is the advantage of creating a package and installing it as such, instead of doing "make install".

    1.    yukiteru said

      I do see an improvement, at least with large and heavy packages; for example, LibreOffice on Funtoo takes much less time to load than on Debian. The same has happened to me with VLC, and with mpv playing FullHD, multi-audio MKV files the loading is much faster.

      Another one that has also shown the change is Firefox: on Debian, having 10 or 15 tabs open turns my PC into torture, but with Funtoo I have managed to have up to 30 open and it carries on as if nothing were happening, and RAM consumption is much lower, with less tendency to freeze on JS-heavy pages. I think it depends more on the context of which tasks and programs are being run.

  42.   Marco Sarmiento said

    The problem is that when we download everything precompiled we turn any Linux distro into a crude copy of Windows.

  43.   Fermin said

    More than a spectacular increase in performance, I see the advantage in being able to compile packages with only the components you want: for example, if you do not have a printer, you can indicate that packages should not be built with CUPS support (the packages that use CUPS, obviously; whether you compile Hunspell with or without CUPS makes no difference), just by indicating "-cups" (at least in Gentoo) in the make.conf file, where all package-building options are centralized. If you use KDE 5, or Plasma 5 as they call it now, you can specify the flags "-kde" and "-qt4", which were valid for KDE 4 but are unnecessary for KDE 5 and for applications ported to the new desktop; likewise "-gnome", "-gtk", and so on for any component you know you do not need. If for some reason a specific program does need, say, GTK, then in a file called package.use you can indicate that it does use GTK, with the same flag but without the minus sign; for example, for Pidgin: "net-im/pidgin gtk".
    In this way you get a system several hundred megabytes lighter, with smaller and more efficient binaries, since they carry no unnecessary code. I have gone from Ubuntu to Gentoo via openSUSE, Kubuntu, Debian, Arch, Chakra and KaOS, and Gentoo is the fastest system I have had, and I still have the same Core 2 Duo I had 7 years ago. Of course, I leave compilations for the night, because compiling Qt5, for example, takes several hours. If you set the "niceness" parameter for Portage in make.conf, you can install or update packages while you keep working on the machine and you hardly notice any slowdown, although the compile time obviously increases; but come on, by setting it to install or update when I go to dinner, and if necessary leaving it working overnight, my old computer works better than my girlfriend's i3 with Kubuntu.

    Another increasingly important aspect is that, when you compile from source files, the assurance that the package you are installing is the original one and has not been tampered with by third parties is almost total. I think Debian is implementing a build-verification system that will give a bit more assurance that the precompiled package you install really comes from the original source, but there will never be as much certainty as when the package has been compiled on your own machine with your own setup.
    In my opinion, with a modern processor (not an old clunker like mine, hehe) and, if we want to speed up the process, 8 GB of RAM so as to mount /var/tmp (the temporary folder Portage uses for compilation) in RAM, which will always be faster than a hard disk or an SSD, I do not see much sense in using precompiled packages today. If my computer takes about 40 minutes to compile Firefox, how long can it take on an i5 or i7 currently on the market, 5 minutes, maybe less? I am talking about the source-based firefox, not firefox-bin, which is a precompiled binary package that can be installed on Gentoo if your machine is very slow (several large packages are offered precompiled for that reason; it is not mandatory to compile everything). I cannot say for certain because my girlfriend will not let me fiddle with her computer, hehe, and mine runs so well that I feel no need to replace it; but if I am right, I think it is worth spending a few minutes compiling in order to have a system made to measure, better adjusted and adapted to our machine, without going all the way to Linux From Scratch, which I think is reserved for computer scientists or very advanced Linux connoisseurs.

    Greetings.

  44.   Duck said

    Very good!
    Just one thing: there is no such thing as an "AMD Atom x2",
    nor will there ever be; Atom is an Intel trademark.
    Regards