Sergio Talens-Oliag: New Blog Config
This post describes how I've set up this blog using hugo, asciidoctor and the papermod theme, how I publish it using nginx, how I've integrated the remark42 comment system and how I've automated its publication using gitea and json2file-go.

It is a long post, but I hope that at least parts of it can be interesting for some; feel free to ignore it if that is not your case.
Hugo Configuration

Theme settings

The site is using the PaperMod theme and as I'm using asciidoctor to publish my content I've adjusted the settings to improve how things are shown with it.

The current config.yml file is the one shown below (probably some of the settings are not required nor being used right now, but I'm including the current file, so this post will always have the latest version of it):
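As a reference, here is a sketch limited to the keys discussed in the notes below (the real file contains more settings; the nesting follows the Hugo and PaperMod documentation):

```yaml
# Sketch: only the keys discussed in the notes below
params:
  disableHLJS: true
  assets:
    disableHLJS: true
  ShowToc: true
  TocOpen: false
  profileMode:
    enabled: false

markup:
  asciidocExt:
    backend: html5s
    extensions:
      - asciidoctor-html5s
      - asciidoctor-diagram
    workingFolderCurrent: true
```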
Some notes about the settings:

- disableHLJS and assets.disableHLJS are set to true; we plan to use rouge on adoc files and the inclusion of the hljs assets adds styles that collide with the ones used by rouge.
- ShowToc is set to true and the TocOpen setting is set to false to make the ToC appear collapsed initially. My plan was to use the asciidoctor ToC, but after trying I believe that the theme one looks nice and I don't need to adjust styles, although it has some issues with the html5s processor (the admonition titles use <h6> and they are shown on the ToC, which is weird); to fix it I've copied the layouts/partials/toc.html to my site repository and replaced the range of headings to end at 5 instead of 6 (in fact 5 still seems a lot, but as I don't think I'll use that heading level on the posts it doesn't really matter).
- params.profileMode values are adjusted, but for now I've left it disabled setting params.profileMode.enabled to false, and I've set the homeInfoParams to show more or less the same content with the latest posts under it (I've added some styles to my custom.css style sheet to center the text and image of the first post to match the look and feel of the profile).
- On the asciidocExt section I've adjusted the backend to use html5s, I've added the asciidoctor-html5s and asciidoctor-diagram extensions to asciidoctor and set workingFolderCurrent to true to make asciidoctor-diagram work right (I haven't tested it yet).
Theme customisations

To write in asciidoctor using the html5s processor I've added some files to the assets/css/extended directory:
- As said before, I've added the file assets/css/extended/custom.css to make the homeInfoParams look like the profile page and I've also changed some theme styles a little bit to make things look better with the html5s output.
- I've also added the file assets/css/extended/adoc.css with some styles taken from the asciidoctor-default.css; see this blog post about the original file. Mine is the same after formatting it with css-beautify and editing it to use variables for the colors to support light and dark themes.
- The previous file uses variables from a partial copy of the theme-vars.css file that changes the highlighted code background color and adds the color definitions used by the admonitions.
- The previous styles use font-awesome, so I've downloaded its resources for version 4.7.0 (the one used by asciidoctor), storing the font-awesome.css in the assets/css/extended dir (that way it is merged with the rest of the .css files) and copying the fonts to the static/assets/fonts/ dir (they will be served directly):

FA_BASE_URL="https://cdnjs.cloudflare.com/ajax/libs/font-awesome/4.7.0"
curl "$FA_BASE_URL/css/font-awesome.css" \
  > assets/css/extended/font-awesome.css
for f in FontAwesome.otf fontawesome-webfont.eot \
    fontawesome-webfont.svg fontawesome-webfont.ttf \
    fontawesome-webfont.woff fontawesome-webfont.woff2; do
  curl "$FA_BASE_URL/fonts/$f" > "static/assets/fonts/$f"
done
- As already said, the default highlighter is disabled (the css it provided collides with the one used by rouge), so we need a css file to do the highlight styling; as rouge provides a way to export its themes, I've created the assets/css/extended/rouge.css file with the thankful_eyes theme:

rougify style thankful_eyes > assets/css/extended/rouge.css
- To support the use of the html5s backend with admonitions I've added a variation of the example found on this blog post to assets/js/adoc-admonitions.js and enabled its minified use on the layouts/partials/extend_footer.html file adding the following lines to it:

{{- $admonitions := slice (resources.Get "js/adoc-admonitions.js")
  | resources.Concat "assets/js/adoc-admonitions.js" | minify | fingerprint }}
<script defer crossorigin="anonymous" src="{{ $admonitions.RelPermalink }}"
  integrity="{{ $admonitions.Data.Integrity }}"></script>
Remark42 configuration

To integrate Remark42 with the PaperMod theme I've created the file layouts/partials/comments.html with the following content based on the remark42 documentation, including extra code to sync the dark/light setting with the one set on the site:
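A sketch of such a partial (the host and site_id values are placeholders; the loader function is the one from the remark42 documentation, with the theme option added to sync the initial dark/light setting):

```html
<div id="remark42"></div>
<script>
  var remark_config = {
    host: "https://blogops.mixinet.net/remark42", /* placeholder URL */
    site_id: "blogops",                           /* placeholder site id */
    /* sync the initial theme with the one set on the site */
    theme: document.body.className.includes("dark") ? "dark" : "light",
  };
</script>
<script>
  !function(e, n) {
    for (var o = 0; o < e.length; o++) {
      var r = n.createElement("script"),
        c = ".js",
        d = n.head || n.body;
      "noModule" in r ? (r.type = "module", c = ".mjs") : (r.async = !0),
        r.defer = !0,
        r.src = remark_config.host + "/web/" + e[o] + c,
        d.appendChild(r);
    }
  }(remark_config.components || ["embed"], document);
</script>
```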
In development I use it with anonymous comments enabled, but to avoid SPAM the production site uses social logins (for now I've only enabled GitHub & Google; if someone requests additional services I'll check them, but those were the easy ones for me initially).

To support theme switching with remark42 I've also added the following inside the layouts/partials/extend_footer.html file:
{{- if (not site.Params.disableThemeToggle) }}
<script>
  /* Function to change theme when the toggle button is pressed */
  document.getElementById("theme-toggle").addEventListener("click", () => {
    if (typeof window.REMARK42 != "undefined") {
      if (document.body.className.includes('dark')) {
        window.REMARK42.changeTheme('light');
      } else {
        window.REMARK42.changeTheme('dark');
      }
    }
  });
</script>
{{- end }}
With this code, if the theme-toggle button is pressed we change the remark42 theme before the PaperMod one (that's needed here only; on page loads the remark42 theme is synced with the main one using the code from the layouts/partials/comments.html shown earlier).
Development setup

To preview the site on my laptop I'm using docker-compose with the following configuration:
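A minimal sketch of what such a compose file can look like, based on the services described in this section (the sto/hugo-adoc image, an nginx container and the remark42 service; build paths, ports and mounts are assumptions):

```yaml
services:
  hugo:
    build: ./docker/hugo-adoc        # assumption: build context path
    image: sto/hugo-adoc
    environment:
      APP_UID: ${APP_UID}
      APP_GID: ${APP_GID}
    volumes:
      - .:/documents
    # dev arguments; in production the container is run without them
    command: server --bind 0.0.0.0 -D -F
  nginx:
    image: nginx:latest
    volumes:
      - ./nginx/default.conf:/etc/nginx/conf.d/default.conf:ro
    ports:
      - 1313:80
  remark42:
    build: ./docker/remark42          # assumption: build context path
    image: sto/remark42
    env_file:
      - .env
      - remark42/env.dev              # assumption: dev env file name
    volumes:
      - ./remark42/var:/srv/var
```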
To run it properly we have to create the .env file with the current user ID and GID on the variables APP_UID and APP_GID (if we don't do it the files can end up being owned by a user that is not the same as the one running the services):

$ printf "APP_UID=%s\nAPP_GID=%s\n" "$(id -u)" "$(id -g)" > .env
The Dockerfile used to generate the sto/hugo-adoc image is:
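A sketch of such a Dockerfile, following the description below (the base image and the libc6-compat package are the ones mentioned in the text; the hugo version and download URL are assumptions):

```dockerfile
# Base image with asciidoctor and its usual extensions already installed
FROM asciidoctor/docker-asciidoctor:latest
# hugo needs glibc compatibility on alpine-based images
RUN apk add --no-cache libc6-compat
# download the hugo binary from a github release (version is an example)
ARG HUGO_VERSION=0.101.0
RUN wget -q -O - \
    "https://github.com/gohugoio/hugo/releases/download/v${HUGO_VERSION}/hugo_extended_${HUGO_VERSION}_Linux-64bit.tar.gz" \
  | tar -xz -C /usr/local/bin hugo
WORKDIR /documents
ENTRYPOINT ["hugo"]
```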
If you review it you will see that I'm using the docker-asciidoctor image as the base; the idea is that this image has all I need to work with asciidoctor, and to use hugo I only need to download the binary from their latest release at GitHub (as we are using an image based on alpine we also need to install the libc6-compat package, but once that is done things are working fine for me so far).
The image does not launch the server by default because I don't want it to; in fact I use the same docker-compose.yml file to publish the site in production, simply calling the container without the arguments passed on the docker-compose.yml file (see later).

When running the containers with docker-compose up (or docker compose up if you have the docker-compose-plugin package installed) we also launch an nginx container and the remark42 service so we can test everything together.
The Dockerfile for the remark42 image is the original one with an updated version of the init.sh script.

The updated init.sh is similar to the original, but allows us to use an APP_GID variable and updates the /etc/group file of the container so the files get the right user and group (with the original script the group is always 1001).

The environment file used with remark42 for development is quite minimal:
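A sketch of what a minimal development environment file can contain (the variable names are from the remark42 documentation; the values are examples):

```
# remark42 development settings (example values)
REMARK_URL=http://localhost:1313/remark42
SITE=blogops
SECRET=some-random-secret
AUTH_ANON=true
DEBUG=true
```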
And the nginx/default.conf
file used to publish the service locally is simple
too:
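A sketch of such a default.conf, assuming the compose service names are hugo and remark42 (service names and ports are assumptions):

```nginx
server {
  listen 80;
  server_name localhost;
  # proxy the hugo development server
  location / {
    proxy_pass http://hugo:1313;
  }
  # serve remark42 under the /remark42/ prefix
  location /remark42/ {
    rewrite /remark42/(.*) /$1 break;
    proxy_pass http://remark42:8080/;
  }
}
```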
Production setup

The VM where I'm publishing the blog runs Debian GNU/Linux and uses binaries from local packages and applications packaged inside containers.

To run the containers I'm using docker-ce (I could have used podman instead, but I already had it installed on the machine, so I stayed with it).

The binaries used on this project are included on the following packages from the main Debian repository:

- git to clone & pull the repository,
- jq to parse json files from shell scripts,
- json2file-go to save the webhook messages to files,
- inotify-tools to detect when new files are stored by json2file-go and launch scripts to process them,
- nginx to publish the site using HTTPS and work as proxy for json2file-go and remark42 (I run it using a container),
- task-spooler to queue the scripts that update the deployment.

And I'm using docker and docker compose from the debian packages on the docker repository:

- docker-ce to run the containers,
- docker-compose-plugin to run docker compose (it is a plugin, so no - in the name).
Repository checkout

To manage the git repository I've created a deploy key, added it to gitea and cloned the project on the /srv/blogops path (that route is owned by a regular user that has permissions to run docker, as I said before).
Compiling the site with hugo

To compile the site we are using the docker-compose.yml file seen before; to be able to run it, first we build the container images and once we have them we launch hugo using docker compose run:
$ cd /srv/blogops
$ git pull
$ docker compose build
$ if [ -d "./public" ]; then rm -rf ./public; fi
$ docker compose run hugo --
The compilation leaves the static HTML on /srv/blogops/public
(we remove the
directory first because hugo
does not clean the destination folder as
jekyll
does).
The deploy script re-generates the site as described and moves the public
directory to its final place for publishing.
Running remark42 with docker

On the /srv/blogops/remark42 folder I have the following docker-compose.yml:
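A sketch of what that compose file can look like, based on the description below (the build path, image name and published port are assumptions; the env files are the ones the text mentions):

```yaml
services:
  remark42:
    build: ./docker                  # assumption: Dockerfile location
    image: sto/remark42
    env_file:
      - ../.env                      # provides APP_UID / APP_GID
      - ./env.prod                   # rest of the remark42 settings
    restart: always
    volumes:
      - ./var:/srv/var
    ports:
      - 127.0.0.1:8042:8080          # assumption: local port proxied by nginx
```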
The ../.env file is loaded to get the APP_UID and APP_GID variables that are used by my version of the init.sh script to adjust file permissions, and the env.prod file contains the rest of the settings for remark42, including the social network tokens (see the remark42 documentation for the available parameters; I don't include my configuration here because some of them are secrets).
Nginx configuration

The nginx configuration for the blogops.mixinet.net site is as simple as:
server {
  listen 443 ssl http2;
  server_name blogops.mixinet.net;
  ssl_certificate /etc/letsencrypt/live/blogops.mixinet.net/fullchain.pem;
  ssl_certificate_key /etc/letsencrypt/live/blogops.mixinet.net/privkey.pem;
  include /etc/letsencrypt/options-ssl-nginx.conf;
  ssl_dhparam /etc/letsencrypt/ssl-dhparams.pem;
  access_log /var/log/nginx/blogops.mixinet.net-443.access.log;
  error_log /var/log/nginx/blogops.mixinet.net-443.error.log;
  root /srv/blogops/nginx/public_html;
  location / {
    try_files $uri $uri/ =404;
  }
  include /srv/blogops/nginx/remark42.conf;
}
server {
  listen 80;
  listen [::]:80;
  server_name blogops.mixinet.net;
  access_log /var/log/nginx/blogops.mixinet.net-80.access.log;
  error_log /var/log/nginx/blogops.mixinet.net-80.error.log;
  if ($host = blogops.mixinet.net) {
    return 301 https://$host$request_uri;
  }
  return 404;
}
On this configuration the certificates are managed by certbot and the server root directory is on /srv/blogops/nginx/public_html and not on /srv/blogops/public; the reason for that is that I want to be able to compile without affecting the running site: the deployment script generates the site on /srv/blogops/public and if all works well we rename folders to do the switch, making the change feel almost atomic.
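The folder rename described above can be sketched like this (run against a scratch directory here; on the server the paths would be the /srv/blogops ones):

```shell
# Demo of the near-atomic folder swap using a scratch directory
BLOGOPS="$(mktemp -d)"
mkdir -p "$BLOGOPS/public" "$BLOGOPS/nginx/public_html"
echo "new" > "$BLOGOPS/public/index.html"
echo "old" > "$BLOGOPS/nginx/public_html/index.html"
# move the old root away, move the freshly built site in, clean up
mv "$BLOGOPS/nginx/public_html" "$BLOGOPS/nginx/public_html.old"
mv "$BLOGOPS/public" "$BLOGOPS/nginx/public_html"
rm -rf "$BLOGOPS/nginx/public_html.old"
```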
json2file-go configuration

As I have a working WireGuard VPN between the machine running gitea at my home and the VM where the blog is served, I'm going to configure json2file-go to listen for connections on a high port using a self-signed certificate and listening on IP addresses only reachable through the VPN.

To do it we create a systemd socket to run json2file-go and adjust its configuration to listen on a private IP (we use the FreeBind option on its definition to be able to launch the service even when the IP is not available, that is, when the VPN is down).
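A sketch of such a socket unit, using the VPN IP and port that appear later in the gitea section (the unit name is an assumption; FreeBind and ListenStream are the documented systemd.socket options):

```
# /etc/systemd/system/json2file-go.socket (sketch)
[Unit]
Description=json2file-go socket

[Socket]
# private VPN address; FreeBind lets us bind it even when the VPN is down
ListenStream=172.31.31.1:4443
FreeBind=true

[Install]
WantedBy=sockets.target
```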
The following script can be used to set up the json2file-go configuration:

Warning: The script uses mkcert to create the temporary certificates; to install the package on bullseye the backports repository must be available.
Gitea configuration

To make gitea use our json2file-go server we go to the project and enter into the hooks/gitea/new page; once there we create a new webhook of type gitea and set the target URL to https://172.31.31.1:4443/blogops and on the secret field we put the token generated with uuid by the setup script:

sed -n -e 's/blogops://p' /etc/json2file-go/dirlist
The rest of the settings can be left as they are:

- Trigger on: Push events
- Branch filter: *
Warning: We are using an internal IP and a self-signed certificate; that means that we have to review that the webhook section of the app.ini of our gitea server allows us to call the IP and skips the TLS verification (you can see the available options on the gitea documentation).

The [webhook] section of my server looks like this:

[webhook]
ALLOWED_HOST_LIST=private
SKIP_TLS_VERIFY=true
Once we have the webhook configured we can try it, and if it works our json2file server will store the file on the /srv/blogops/webhook/json2file/blogops/ folder.
The json2file spooler script

With the previous configuration our system is ready to receive webhook calls from gitea and store the messages on files, but we have to do something to process those files once they are saved in our machine.

An option could be to use a cronjob to look for new files, but we can do better on Linux using inotify: we will use the inotifywait command from inotify-tools to watch the json2file output directory and execute a script each time a new file is moved inside it or closed after writing (IN_CLOSE_WRITE and IN_MOVED_TO events).

To avoid concurrency problems we are going to use task-spooler to launch the scripts that process the webhooks using a queue of length 1, so they are executed one by one in FIFO order.

The spooler script is this:
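A sketch of what such a spooler can look like (the watched directory is the one from the gitea section; the processor script path is a placeholder):

```shell
#!/bin/sh
# Watch the json2file output dir and queue a processor run for each new file.
WEBHOOK_DIR="/srv/blogops/webhook/json2file/blogops"
# one execution slot => the queued jobs run one by one, in FIFO order
export TS_SLOTS="1"
inotifywait -m -q -e close_write -e moved_to --format '%w%f' "$WEBHOOK_DIR" |
  while read -r filename; do
    tsp /srv/blogops/bin/process-webhook.sh "$filename"
  done
```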
To run it as a daemon we install it as a systemd service
using the following
script:
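A sketch of the service unit that script can install (the unit name, user and script path are assumptions):

```
# /etc/systemd/system/json2file-spooler.service (sketch)
[Unit]
Description=json2file spooler
After=network.target

[Service]
Type=simple
User=blogops
ExecStart=/srv/blogops/bin/spooler.sh
Restart=always

[Install]
WantedBy=multi-user.target
```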
The gitea webhook processor

Finally, the script that processes the JSON files does the following:

- First, it checks if the repository and branch are right,
- Then, it fetches and checks out the commit referenced on the JSON file,
- Once the files are updated, it compiles the site using hugo with docker compose,
- If the compilation succeeds the script renames directories to swap the old version of the site with the new one.

If there is a failure the script aborts, but before doing it, or if the swap succeeded, the system sends an email to the configured address and/or the user that pushed updates to the repository with a log of what happened.

The current script is this one:
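A condensed sketch of those steps (the repository name, branch and error mailing are assumptions or omitted; the payload fields are the standard gitea push-event ones):

```shell
#!/bin/sh
# Process one gitea webhook JSON file (sketch; logging and mail omitted).
set -e
WEBHOOK_JSON="$1"
# 1. check that the push is for the right repository and branch
repo="$(jq -r '.repository.full_name' "$WEBHOOK_JSON")"
ref="$(jq -r '.ref' "$WEBHOOK_JSON")"
if [ "$repo" != "sto/blogops" ] || [ "$ref" != "refs/heads/main" ]; then
  exit 0
fi
# 2. fetch and check out the commit referenced on the JSON file
commit="$(jq -r '.after' "$WEBHOOK_JSON")"
cd /srv/blogops
git fetch origin
git checkout "$commit"
# 3. compile the site with hugo using docker compose
rm -rf ./public
docker compose build
docker compose run hugo --
# 4. swap the directories to publish the new version
mv nginx/public_html nginx/public_html.old
mv public nginx/public_html
rm -rf nginx/public_html.old
```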