Squid 3.5.15 and squidGuard on CentOS 7 (HTTPS and ACLs)

Good, good. Here I bring you Squid 3.5 (stable) on CentOS. Oh wow! I was asked to talk about CentOS, my readers told me that Squid 3.5 no longer lets HTTPS slip through, and someone wrote me an email asking how to filter by groups and by content. So here is a walkthrough so you can see how I did it and do it yourself.

OK, first things first: which Squid version does CentOS ship? 3.3.8, a bit outdated, but it works. For those who like living on the current release, the first step is to add the Squid repository (yes, you could download the tar.gz and compile it, but we are not going to reinvent the wheel here, someone already packaged it as an RPM, hahaha). On Debian this version has a series of bugs, filtering among them, and you have to use the Stretch repositories.

As always, I am not telling you to install malware; this comes from the official Squid wiki, check HERE.

vi /etc/yum.repos.d/squid.repo

[squid]
name=Squid repo for CentOS Linux - $basearch
#IL mirror
baseurl=http://www1.ngtech.co.il/repo/centos/$releasever/$basearch/
#baseurl=http://www1.ngtech.co.il/repo/centos/7/$basearch/
failovermethod=priority
enabled=1
gpgcheck=0

yum update

yum install squid

Now, if you have read my other posts, configuring Squid will not be a problem. There is a summary HERE and the caching part HERE. Do some things change? Well, as with every Linux distribution, some files live here instead of there, but the settings are the same. And lest you call me a villain, this is the minimum you should put:

acl localnet src 172.16.0.0/21 #RFC1918 possible internal network

http_access allow localnet

http_port 172.16.5.110:3128
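If you want to see how those three lines sit in context, here is a sketch of a slightly fuller but still minimal squid.conf, assuming stock defaults; the subnet, listening IP and port are just the example values from this post, so adjust them to your network:

acl localnet src 172.16.0.0/21            # internal network allowed to use the proxy
acl SSL_ports port 443
acl Safe_ports port 80 21 443
http_access deny !Safe_ports              # refuse requests to unusual ports
http_access deny CONNECT !SSL_ports       # only allow CONNECT to 443
http_access allow localnet
http_access deny all                      # deny everything not explicitly allowed
http_port 172.16.5.110:3128               # address and port Squid listens on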

Run the following command to create your cache space:

squid -z

Then the next one, to verify that the configuration file is correct:

squid -k parse

Finally, we restart the service:

systemctl restart squid
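If you also want Squid to come up automatically at boot (an extra step, not strictly part of this walkthrough), enable the unit:

systemctl enable squid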

Now, how do we filter HTTP and HTTPS and create simple ACLs? The answer is SquidGuard. This fellow is a content filter and traffic redirector. There are plenty of people who prefer DansGuardian these days; that is not my case. Although squidGuard no longer has a dedicated upstream developer, it is maintained by the distributions themselves, and I have no idea whether any specific organization looks after the package, but the fact is that it works and keeps getting updated.

yum install squidGuard

As I said, with squidGuard you can block traffic through blacklists or allow it through whitelists.

You must add the following lines to squid.conf:

This indicates which program will be in charge of filtering the traffic, and where the binary and the configuration file are located:

url_rewrite_program /usr/bin/squidGuard -c /etc/squid/squidGuard.conf

This indicates the maximum number of redirector processes that will exist to serve requests (150), how many start together with Squid (120), how many are kept idle in reserve (1), and whether each one can handle more than one request at a time (0 = no concurrency):

url_rewrite_children 150 startup=120 idle=1 concurrency=0

With this, if there is not a single redirector available, the user will not be able to browse, which is ideal; we do not want anyone browsing unfiltered. When this happens Squid will log an error, and you should consider increasing the number of redirectors:

url_rewrite_bypass off
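After adding these three lines you can re-check the file and apply the change to the running Squid without a full restart, using the standard commands:

squid -k parse        # validate squid.conf again
squid -k reconfigure  # tell the running Squid to reload its configuration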

How do we create the ACLs and do the filtering?

The first thing is to download a complete blacklist to start from HERE (you can also create your own, and I will explain how). Unzip it into /var/squidGuard/; that is the default on CentOS, but on other distributions it is /var/lib/squidguard/.

tar -xvzf bigblacklist.tar.gz -C /var/squidGuard/

chown -R squid: /var/squidGuard/

Now you must edit /etc/squid/squidGuard.conf:

Declare where the lists are and where the logs will be saved.

dbhome /var/squidGuard/blacklists

logdir /var/log/squidGuard/
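Depending on the package, the log directory may not exist yet; if it is missing, create it and hand it to the squid user (an extra precaution, not in the original steps):

mkdir -p /var/log/squidGuard
chown -R squid: /var/log/squidGuard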

Now, squidGuard is driven by three tags: src, dest and acl.

Suppose I want to create a "limited" group. In src I declare the group itself and all the IPs that belong to it:

src limited {
ip 172.168.128.10 # pepito perez informatica
ip 172.168.128.13 # andrea perez informatica
ip 172.168.128.20 # carolina perez informatica
}
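You do not have to list hosts one by one: squidGuard's src blocks also accept subnets and ranges, so a hypothetical group covering a whole segment could look like this (the name and addresses here are just an illustration):

src full {
ip 172.16.5.0/24 # the whole admin subnet
}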

Now, creating and declaring lists is done with the dest tag. Important! You have to understand the different ways a page can be blocked:

  • domainlist, for example facebook.com: the entire domain is blocked.
  • urllist, for example facebook.com/juegos: only that URL is blocked; the rest of Facebook can still be browsed.
  • expressionlist, for example facebook: any page containing the word facebook is blocked, even a news site whose article merely mentions Facebook in its body.

dest porn {
domainlist porn/domains
urllist porn/urls
expressionlist porn/expressions
}

Now we declare who gets blocked or not, and what action to take when that happens. Everything starts with the acl tag; inside it there can be any number of groups. Continuing with the "limited" example, the pass directive lists the destinations: a list preceded by an exclamation mark (!) is denied, one without it is allowed, and since the line is evaluated left to right, a list placed earlier wins even if the same site also appears in a denied list.

In this example we have the limited group, where a whitelist comes first to allow certain traffic that might be blocked by the other blacklists (the ones marked with !). The line ends with the keyword any, to indicate that if the request did not match any list, the traffic is allowed; if it ended with none instead, that traffic would be denied. Finally, the redirect directive indicates what to do when a page is blocked; in this case we send the user to Google.

Here is an example. I have added many more blocked lists, but remember that each of them must be declared with a dest block (as sketched below), and not every list ships with a urllist, expressionlist or domainlist, so check carefully.
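For instance, the whitelist that appears in the pass line is a custom list, so it needs its own dest block and its own files under dbhome; a minimal sketch, assuming you maintain those files by hand:

dest whitelist {
domainlist whitelist/domains # one domain per line
urllist whitelist/urls # optional, one URL per line
}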

acl {
limited {
pass whitelist !porn !adult !sexuality !proxy !spyware !malware !hacking !mixed_adult !naturism !sect !marketingware !virusinfected !warez !weapons !hunting !updatesites !gambling !filehosting !filesharing !humor !ads !shopping !games !clothing !desktopsillies !sexualityeducation !violence !remote-control !jobsearch !cellphones !kidstimewasting !ecommerce !beerliquorsale !radio !socialnetworking !social_networks !instantmessaging !chat !audio-video !verisign !sports !sportnews !news !press !entertainment !mobile-phone !lingerie !magazines !manga !arjel !tobacco !frencheducation !celebrity !bitcoin !blog any
redirect http://google.com
} # end limited
} # end acl
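squidGuard also supports a default group inside the acl block (it would go just before the closing brace), acting as a catch-all for clients that do not match any src group; this is a sketch, not part of the original example:

default {
pass none # unknown clients get nothing
redirect http://google.com
}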

We compile all the lists:

squidGuard -b -C all
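If you run that as root, the freshly compiled .db files end up owned by root, so it does not hurt to hand the tree back to the squid user and then peek at the log declared in logdir (paths as configured above):

chown -R squid: /var/squidGuard/
tail /var/log/squidGuard/squidGuard.log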

If "squidGuard ready for requests" appears in the log, we are all set, and we finish by restarting Squid:

systemctl restart squid

Thanks for everything. I hope you keep writing comments, and keep an eye on all my posts.

