Reposted from Using Squid to Proxy SSL Sites (by Karim Elatov on Jan 5, 2019), with slight editing.
Squid
Squid is really flexible and allows many different approaches to proxying. From version 3.5 on, there is better support for SSL-Bumping, now called Peek and Splice. This allows Squid to look into the TLS handshake and generate dynamic certificates on the fly, so the browser doesn't throw any warnings (as long as the CA cert is trusted by the browser).
Generate a CA Certificate to be used by Squid
The process is described in detail in Dynamic SSL Certificate Generation, and a nice configuration example is available at SSL-Bump using an intermediate CA. So first let's generate the certificate files:
<> openssl req -new -newkey rsa:2048 -sha256 -days 365 -nodes -x509 -extensions v3_ca -keyout squid-ca-key.pem -out squid-ca-cert.pem
Then combine the files:
<> cat squid-ca-cert.pem squid-ca-key.pem >> squid-ca-cert-key.pem
Then move the file to a location squid can read:
<> sudo mkdir /etc/squid/certs
<> sudo mv squid-ca-cert-key.pem /etc/squid/certs/.
<> sudo chown squid:squid -R /etc/squid/certs
And you should be set with the install.
Configure Squid to Peek-N-Splice SSL Connections
. . . Details omitted. Check the original if you will. . .
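The peek-and-splice details are omitted here, so as a rough sketch only (the paths, port, and cache sizes are illustrative, directive names vary a bit between Squid versions, and this is not necessarily the original post's exact config), a minimal setup in squid.conf looks something like this:
# Listen on 3128 and bump TLS using the combined CA cert/key created above
http_port 3128 ssl-bump cert=/etc/squid/certs/squid-ca-cert-key.pem generate-host-certificates=on dynamic_cert_mem_cache_size=4MB
# Helper that generates the dynamic per-host certificates (ssl_crtd on Squid 3.5)
sslcrtd_program /usr/lib64/squid/ssl_crtd -s /var/lib/ssl_db -M 4MB
# Peek at the TLS ClientHello first, then bump (decrypt) everything else
acl step1 at_step SslBump1
ssl_bump peek step1
ssl_bump bump all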
To confirm the config is okay:
<> sudo squid -k parse
Now let's create the SSL database and make sure the squid user can access it:
<> sudo /usr/lib64/squid/ssl_crtd -c -s /var/lib/ssl_db
<> sudo chown squid:squid -R /var/lib/ssl_db
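Note that the helper's name and location depend on the Squid version and distro: on Squid 4 and later, ssl_crtd was renamed to security_file_certgen, so the equivalent steps would look roughly like this (the path is illustrative):
<> sudo /usr/lib64/squid/security_file_certgen -c -s /var/lib/ssl_db -M 4MB
<> sudo chown squid:squid -R /var/lib/ssl_db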
Then enable the service, start it, and confirm it's running:
<> sudo systemctl enable squid
<> sudo systemctl start squid
<> sudo systemctl status squid.service
Import the CA Certificate into the Browser for Squid
Now as a quick test we can use curl to confirm it's working. Without the CA, you will see the following warning:
<> curl --proxy http://192.168.1.100:3128 https://google.com
curl: (60) SSL certificate problem: self signed certificate in certificate chain
More details here: https://curl.haxx.se/docs/sslcerts.html
curl performs SSL certificate verification by default, using a "bundle"
of Certificate Authority (CA) public keys (CA certs). If the default
bundle file isn't adequate, you can specify an alternate file
using the --cacert option.
If this HTTPS server uses a certificate signed by a CA represented in
the bundle, the certificate verification probably failed due to a
problem with the certificate (it might be expired, or the name might
not match the domain name in the URL).
If you'd like to turn off curl's verification of the certificate, use
the -k (or --insecure) option.
HTTPS-proxy has similar options --proxy-cacert and --proxy-insecure.
So I copied the CA to the client machine, and then tried again:
<> curl --proxy http://192.168.1.100:3128 --cacert squid-ca-cert.pem https://google.com
<HTML><HEAD><meta http-equiv="content-type" content="text/html;charset=utf-8">
<TITLE>301 Moved</TITLE></HEAD><BODY>
<H1>301 Moved</H1>
The document has moved
<A HREF="https://www.google.com/">here</A>.
</BODY></HTML>
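Passing --cacert on every request gets old quickly. As an alternative, the CA could be added to the client's system trust store; for example, a sketch for a Debian/Ubuntu client (on RHEL/CentOS the equivalent tool is update-ca-trust):
<> sudo cp squid-ca-cert.pem /usr/local/share/ca-certificates/squid-ca.crt
<> sudo update-ca-certificates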
Now if we try with a browser, first specify the proxy server; on Linux we can start Chrome with the following parameter:
<> google-chrome-stable --proxy-server=192.168.1.100:3128
and by default you will see an SSL warning:
So in the address bar enter chrome://settings/certificates, then import the CA certificate under the Authorities section and make sure you choose this certificate to validate websites:
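If you prefer the command line, Chrome on Linux reads the NSS database in your home directory, so the same import can be done with certutil (from the libnss3-tools or nss-tools package); the "Squid CA" nickname is just an arbitrary label:
<> certutil -d sql:$HOME/.pki/nssdb -A -t "C,," -n "Squid CA" -i squid-ca-cert.pem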
Lastly confirm the certificate is imported:
Now if you go to any site, your browser will trust the Squid CA as Squid will generate a dynamic cert for that hostname:
Check out Squid Logs
After your browser is configured to use Squid as its proxy, you can check out the access logs to confirm it's proxying the connections:
<> tail -f /var/log/squid/access.log
1523141358.587 51 192.168.1.107 TAG_NONE/200 0 CONNECT clientservices.googleapis.com:443 - HIER_DIRECT/172.217.11.227 -
1523141358.587 47 192.168.1.107 TAG_NONE/200 0 CONNECT translate.googleapis.com:443 - HIER_DIRECT/172.217.11.234 -
1523141358.631 15 192.168.1.107 TCP_MISS/200 1563 GET https://translate.googleapis.com/translate_a/l? - HIER_DIRECT/172.217.11.234 application/json
Using a Proxy Auto-Config (PAC) File to Specify Proxy Settings
Most browsers support specifying a URL for a PAC file. A PAC file is a fancy JavaScript file which allows you to make additional choices about when you would like to use a proxy. For example, you can check the IP of the client and which URL the client is heading to, and then either go through the proxy or send the client directly to the destination URL. There are a bunch of good examples:
Borrowing most of the content from one of the above sites, here is what I ended up with:
. . .Details omitted. Check the original if you will. . .
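The exact PAC file is omitted above, but a rough sketch of the usual pattern from such examples looks something like this; the proxy address and local network are placeholders, not necessarily what the original post used:
function FindProxyForURL(url, host) {
    // Send plain hostnames and the local LAN straight to the destination
    if (isPlainHostName(host) || isInNet(host, "192.168.1.0", "255.255.255.0")) {
        return "DIRECT";
    }
    // Everything else goes through the Squid proxy, falling back to direct
    return "PROXY 192.168.1.100:3128; DIRECT";
}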
Now you just need to host that file on a webserver, and when starting the Chrome browser you can pass the location of the file; here is an example:
<> google-chrome-stable --proxy-pac-url=http://10.0.0.2/proxy.pac
Or enter it manually in the Chrome settings section.
Secure Proxy Connections
Most browsers don't support an HTTPS connection to a proxy server, so with Chrome you actually need to specify it with a PAC file or use a SOCKS proxy. From Encrypted browser-Squid connection:
While HTTPS design efforts were focused on end-to-end communication, it would also be nice to be able to encrypt the browser-to-proxy connection (without creating a CONNECT tunnel that blocks Squid from accessing and caching content). This would allow, for example, a secure use of remote proxies located across a possibly hostile network.
Squid can accept regular proxy traffic using https_port in the same way Squid does it using an http_port directive. Unfortunately, popular modern browsers do not permit configuration of TLS/SSL encrypted proxy connections. There are open bug reports against most of those browsers now, waiting for support to appear. If you have any interest, please assist browser teams with getting that to happen.
Meanwhile, tricks using stunnel or SSH tunnels are required to encrypt the browser-to-proxy connection before it leaves the client machine. These are somewhat heavy on the network and can be slow as a result.
The Chrome browser is able to connect to proxies over SSL connections if configured to use one in a PAC file or command line switch. GUI configuration appears not to be possible (yet).
The Firefox 33.0 browser is able to connect to proxies over TLS connections if configured to use one in a PAC file. GUI configuration appears not to be possible (yet)
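As a quick illustration of the SSH-tunnel workaround mentioned above, you could forward a local port to the remote proxy and point the browser at localhost instead; the hostname and ports here are placeholders:
<> ssh -N -L 3128:localhost:3128 user@proxy-host
<> google-chrome-stable --proxy-server=127.0.0.1:3128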
From Secure Web Proxy, here is a simple PAC file which specifies a secure proxy server:
function FindProxyForURL(url, host) { return "HTTPS secure-proxy.example.com:443"; }
It's good to know it's possible.
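For reference, on the Squid side the listener for that PAC file would be declared with https_port instead of http_port. A minimal sketch, assuming a server certificate issued for the proxy's own hostname (the paths are hypothetical, and on Squid 4+ the option is spelled tls-cert= rather than cert=):
https_port 443 cert=/etc/squid/certs/secure-proxy.example.com.crt key=/etc/squid/certs/secure-proxy.example.com.key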
Trying out WebSafety
There is a pretty cool product called WebSafety. It integrates really well with Squid to provide advanced web filtering options and also a nice UI to configure most of the Squid settings.
There is a community version, but it doesn't support web filtering. You can check out the differences at Community Version. It can show you current traffic and whether it blocked anything (if you configured it to do so):