Introduction#
Analytics are broadly used on most, if not all, websites we use every day. The most common service is Google Analytics. In this post I want to show how you can replace it with an open source alternative and what data I use to analyze this blog.
I will give you full transparency about what information I see.
Maybe you have noticed that there is no cookie banner on this blog. This is because Hugo generates a static website and all information I need is collected without cookies.
This means I cannot (and don’t want to) track you across multiple devices or from different networks (e.g. mobile and desktop).
What do I use?#
I looked at different alternatives like Matomo and Umami but decided to go with Plausible.
Plausible is built on the ClickHouse database and can be hosted as a Docker container.
Plausible#
Prerequisites#
- Your public DNS zone is set up to point to your server.
Installation#
I followed the installation guide, cloned the repository, and configured Plausible to work with my existing nginx reverse proxy, following this.
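The clone step from the guide looks roughly like this. Note the repository URL is an assumption based on Plausible's current self-hosting repository; check the installation guide for the exact name:

```shell
# Clone Plausible's self-hosting setup (repository URL is an assumption,
# verify against the official installation guide) and enter the directory
git clone https://github.com/plausible/community-edition plausible
cd plausible
```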
The Docker Compose file should not be edited directly; instead, create a compose.override.yml file to override the default settings.
My override yaml looks like this:
services:
  plausible:
    ports:
      - 127.0.0.1:8000:${HTTP_PORT}
It uses the HTTP_PORT environment variable, so I also need to set the environment variables by creating a .env file:
touch .env
echo "HTTP_PORT=8000" >> .env
echo "SECRET_KEY_BASE=$(openssl rand -base64 48)" >> .env
echo "BASE_URL=https://plausible.kohnkenet.de" >> .env
HTTP_PORT is the port nginx needs to forward traffic to. SECRET_KEY_BASE is a random string used for encryption. BASE_URL is the URL under which Plausible will be available, which points to nginx.
I use Let’s Encrypt’s certbot to generate my SSL certificates and let it partly manage my nginx configuration.
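A typical certbot invocation for this setup looks like the following (sketch using the domain from this post; certbot's nginx plugin inserts the "managed by Certbot" lines you see in the config later):

```shell
# Request a certificate for the Plausible domain and let certbot
# edit the existing nginx server block in place
sudo certbot --nginx -d plausible.kohnkenet.de
```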
Now we can start up the container:
sudo docker-compose up -d

Nginx configuration#
Next up is the nginx configuration. I first created only the HTTP server block, then generated the certificates with certbot, and finally added the forwarding configuration.
The complete config:
server {
    server_name plausible.kohnkenet.de;

    listen [::]:443 ssl; # managed by Certbot
    listen 443 ssl; # managed by Certbot
    ssl_certificate /etc/letsencrypt/live/plausible.kohnkenet.de/fullchain.pem; # managed by Certbot
    ssl_certificate_key /etc/letsencrypt/live/plausible.kohnkenet.de/privkey.pem; # managed by Certbot
    include /etc/letsencrypt/options-ssl-nginx.conf; # managed by Certbot
    ssl_dhparam /etc/letsencrypt/ssl-dhparams.pem; # managed by Certbot

    location / {
        proxy_pass http://127.0.0.1:8000;
        # proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    }

    location /live/websocket {
        proxy_pass http://127.0.0.1:8000;
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "Upgrade";
    }
}

server {
    if ($host = plausible.kohnkenet.de) {
        return 301 https://$host$request_uri;
    } # managed by Certbot

    listen 80;
    listen [::]:80;
    server_name plausible.kohnkenet.de;
    return 404; # managed by Certbot
}
Import the collection script#
Now the server is ready and reachable from your browser. Connect to it and create your user.
After you have created your user, you can select what kind of data you want to collect. I chose the following:
- Outbound links
- 404 errors
- Hashed page paths
This means I could track sponsored links if I ever decide to implement them, can find broken links, and can see which posts are the most interesting.

The tracking script needs to be placed in the <head> of your website.
My Hugo theme Blowfish has a so-called partial for this use case.
I just need to create the file /layouts/partials/extend-head.html and add the script from the setup.
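For illustration, the partial could look like this. Both the data-domain value and the script URL are assumptions based on the instance configured above; use the snippet Plausible shows you during setup:

```html
<!-- /layouts/partials/extend-head.html -->
<!-- Plausible tracking snippet; data-domain (the tracked site) and src
     (the self-hosted instance) are assumptions, copy yours from the setup -->
<script defer data-domain="kohnkenet.de" src="https://plausible.kohnkenet.de/js/script.js"></script>
```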
After I push it to the main branch, the data is collected.
See the data#
Now when I visit the site, I can see the data in the dashboard:


Related posts#
If you are interested in how I automated my blog setup using git, see this post.
