english/index.wml -> /index.en.html (with a symlink from index.html to index.en.html) and french/index.wml -> /index.fr.html. In contrast, debianhugo uses en/_index.md -> /index.html and fr/_index.md -> /fr/index.html.
Apache's multilingual content negotiation checks for index.<user preferred lang code>.html in the current directory, which works well with webwml since all related translations are generated in the same directory. However, with debianhugo using subdirectories for languages other than English, we had to set up aliases in the front matter of every non-English page. For example, in fr/_index.md we added this to the front matter:
aliases:
- /index.fr.html
Apache first looks for /index.en.html. If it doesn't find it, it falls back to any other language-suffixed file, which can lead to unexpected behavior. For example, even with English set as the preferred language, accessing the site may serve /index.fr.html, which then redirects to /fr/index.html. This was a significant challenge, and you can see a demo of this hosted here.
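A quick way to watch the negotiation behaviour from the command line (the hostname here is a stand-in, not the actual demo URL) is to request the site with an explicit language preference and follow the redirect chain:

$ curl -sIL -H 'Accept-Language: fr' https://example.org/ \
    | grep -iE '^(HTTP|location|content-location)'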
If I were to start the project over, I would document every decision as I make it in the wiki, no matter how rough the documentation turns out. Waiting until the midpoint of the project to start documenting was not a good idea.
As I move into the second half of my internship, the goals we've set include improving our project wiki documentation and continuing the migration process while enhancing the user experience of complicated sections. I'm looking forward to making even more progress and sharing my journey with you all. Happy coding!
Courtesy of my CRANberries, there is also a diffstat report for this release. For questions, suggestions, or issues please use the [issue tracker][issue tickets] at the GitHub repo.

Changes in version 0.2.4 (2025-01-19)

- Use Rcpp::RawVector instead of std::vector<unsigned char>, saving an extra copy (Travers in #16)
- Several updates to README.md with R Journal paper, add badges, add Authors@R, add CITATION file, add repo info to DESCRIPTION
- Update continuous integration via r-ci
- Update to no longer require compilation standard
This post by Dirk Eddelbuettel originated on his Thinking inside the box blog. If you like this or other open-source work I do, you can now sponsor me at GitHub.
whois:
$ whois -r 2a0b:7140:1:1:5054:ff:fe66:85c5
% This is the RIPE Database query service.
% The objects are in RPSL format.
%
% The RIPE Database is subject to Terms and Conditions.
% See https://docs.db.ripe.net/terms-conditions.html
% Note: this output has been filtered.
% To receive output for a database update, use the "-B" flag.
% Information related to '2a0b:7140:1::/48'
% Abuse contact for '2a0b:7140:1::/48' is 'abuse@servinga.com'
inet6num: 2a0b:7140:1::/48
netname: EE-SERVINGA-2022083002
descr: servinga.com - Estonia
geoloc: 59.4424455 24.7442221
country: EE
org: ORG-SG262-RIPE
mnt-domains: HANNASKE-MNT
admin-c: CL8090-RIPE
tech-c: CL8090-RIPE
status: ASSIGNED
mnt-by: MNT-SERVINGA
created: 2020-02-18T11:12:49Z
last-modified: 2024-12-04T12:07:26Z
source: RIPE
% Information related to '2a0b:7140:1::/48AS207408'
route6: 2a0b:7140:1::/48
descr: servinga.com - Estonia
origin: AS207408
mnt-by: MNT-SERVINGA
created: 2020-02-18T11:18:11Z
last-modified: 2024-12-11T23:09:19Z
source: RIPE
% This query was served by the RIPE Database Query Service version 1.114 (SHETLAND)
The important bit here is this line:
origin: AS207408
which refers to Autonomous System 207408, owned by a hosting company in Germany called Servinga.
$ curl -sL https://ip.guide/as207408 | jq .routes.v4 >> servinga
$ curl -sL https://ip.guide/as207408 | jq .routes.v6 >> servinga
or a local database downloaded from IPtoASN.
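As a sketch of the local-database route, something like this pulls the ranges for an ASN out of the dump; the ip2asn-v4.tsv file name and its tab-separated range_start/range_end/AS_number column layout are assumptions about the IPtoASN download format, so check them against the actual file:

$ awk -F'\t' '$3 == "207408" { print $1 " - " $2 }' ip2asn-v4.tsv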
This is what I ended up with in the case of Servinga:
[
"45.11.183.0/24",
"80.77.25.0/24",
"194.76.227.0/24"
]
[
"2a0b:7140:1::/48"
]
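Since the servinga file now holds two JSON arrays (one per address family), a short jq one-liner can turn them straight into the Require not ip lines used in the Apache configuration below:

$ jq -r '.[] | "Require not ip " + .' servinga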
<Location /blog.cgi>
Include /etc/apache2/spammers.include
Options +ExecCGI
AddHandler cgi-script .cgi
</Location>
and then put the following in /etc/apache2/spammers.include:
<RequireAll>
Require all granted
# https://ipinfo.io/AS207408
Require not ip 45.11.183.0/24
Require not ip 80.77.25.0/24
Require not ip 194.76.227.0/24
Require not ip 2a0b:7140:1::/48
</RequireAll>
Finally, I can restart the website and commit my changes:
$ apache2ctl configtest && systemctl restart apache2.service
$ git commit -a -m "Ban all IP blocks from Servinga"
Go/go1.23.1 (amd64-linux) go-autorest/v14.2.1 Azure-SDK-For-Go/v68.0.0 storage/2021-09-01 microsoft.com/aks-operat azsdk-go-armcompute/v1.0.0 (go1.22.3; linux)

The main information is Azure-SDK-For-Go, which means the program making all these calls to the storage API is written in Go. All our services are written in TypeScript or Rust, so they are not suspect.
That leaves the controllers running in the kube-system namespace. I could not find anything suspect in the logs of these services.
At that point I was convinced that a component of the Kubernetes control plane was making all those calls. Unfortunately, AKS is managed by Microsoft and I don't have access to the control plane logs.
However, we realized that we had quite a lot of volumesnapshots that are created in our clusters using k8s-scheduled-volume-snapshotter: between preprod and prod, the number of snapshots was quite different.
We tried to get more information using the Azure console on our snapshot account, but it was also broken by the throttling issue.
We were so puzzled that we decided to try Léodagan's advice ("tout cramer pour repartir sur des bases saines", loosely translated as "burn everything down to start from scratch") and we destroyed our dev cluster piece by piece while checking if the throttling stopped.
First, we removed all our applications, no change.
Then, all ancillary components like rabbitmq and cert-manager were removed: no change.
Then, we tried to remove the namespace containing our applications. But we faced another issue: Kubernetes was unable to remove the namespace because it could not destroy some PVCs and volumesnapshots. That was actually good news, because it meant that we were close to the actual issue.
We managed to destroy the PVCs and volumesnapshots by removing their finalizers. Finalizers are markers that tell Kubernetes that something needs to be done before a resource can actually be deleted.
The finalizers were removed with a command like:
kubectl patch volumesnapshots "$volumesnapshot" \
    -p '{"metadata":{"finalizers":null}}' --type merge

Then, we got our first progress: the throttling and high call rate stopped on our dev cluster. To make sure that the snapshots were the issue, we re-installed the ancillary components and our applications. Everything was copacetic. So, the problem was indeed with the PVCs and snapshots.
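For the record, patching the snapshots one at a time gets tedious; a small loop in the same spirit handles them in bulk (the namespace is a placeholder):

for vs in $(kubectl get volumesnapshots -n my-namespace -o name); do
    kubectl patch "$vs" -n my-namespace --type merge \
        -p '{"metadata":{"finalizers":null}}'
done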
Even though we have backups outside of Azure, we weren't really thrilled about trying Léodagan's method on our prod cluster, so we looked for a better fix to try on our preprod cluster.
Poking around in the PVCs and volumesnapshots, I finally found this error message in the description of a volumesnapshotcontents:
Code="ShareSnapshotCountExceeded" Message="The total number of snapshots for the share is over the limit."

The number of snapshots found in our cluster was not that high, so I wanted to check the snapshots present in our storage account using the Azure console, which was still broken. Fortunately, the Azure CLI is able to retry HTTP calls when getting 429 errors. I managed to get a list of snapshots with:

az storage share list --account-name [redacted] --include-snapshots \
    | tee preprod-list.json
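With the listing saved, jq can count the snapshot entries offline; note that the snapshot field name is an assumption about the az output shape, so verify it against preprod-list.json first:

$ jq '[.[] | select(.snapshot != null)] | length' preprod-list.json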
k8s-scheduled-volume-snapshotter
creates new snapshots when it cannot list the old ones. So we had 4 new snapshots per day instead of one.
Since we had the chain of events, fixing the issue was not too difficult (but quite long):

- … k8s-scheduled-volume-snapshotter by disabling its cron job
- … az command and a Perl script (this step took several hours)
- … k8s-scheduled-volume-snapshotter
- … k8s-scheduled-volume-snapshotter.

Anyway, to avoid this problem in the future, we will:

- … ask the k8s-scheduled-volume-snapshotter author to better cope with throttling

(gitlab-ci
) to generate configuration files and code snippets because it uses the same syntax used by
helm (easier to use by other DevOps already familiar with the format) and the binary is small and
can be easily included into the docker images used by the pipeline jobs.
One interesting feature of the tmpl tool is that it can read values from command line arguments and from multiple files in different formats (YAML, JSON, TOML, etc.) and merge them into a single object that can be used to render the templates.
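As a rough sketch of how that can be used, the invocation below is hypothetical (the -f flag and file names are made up for illustration; check the tool's own help for the real options). The idea is that the values of both files get merged into one object before rendering a helm-style template:

# hypothetical invocation: merge two value files, then render the template
$ tmpl -f defaults.yaml -f overrides.json config.tmpl > config.yaml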
There are alternatives to the tmpl tool and I've looked at them (e.g. simple ones like go-template-cli or complex ones like gomplate), but I haven't found one that fits my needs.
For my next project I plan to evaluate a move to a different tool or template format, as tmpl is not being actively maintained (as I said, I'm using my own fork) and it is not included in existing GNU/Linux distributions (I packaged it for Debian and Alpine, but I don't want to maintain something like that without an active community, and I'm not interested in being the upstream myself, as I'm trying to move to Rust instead of Go as the compiled programming language for my projects).
fast_float library version to the current version 7.0.0, and updates a few packaging aspects.
Courtesy of my CRANberries, there is also a diffstat report for this release. For questions, suggestions, or issues please use the [issue tracker][issue tickets] at the GitHub repo.

Changes in version 0.0.5 (2025-01-15)
- No longer set a compilation standard
- Updates to continuous integration, badges, URLs, DESCRIPTION
- Update to fast_float 7.0.0
- Per CRAN Policy comment-out compiler 'diagnostic ignore' instances
This post by Dirk Eddelbuettel originated on his Thinking inside the box blog. If you like this or other open-source work I do, you can now sponsor me at GitHub.
checkbashisms script complained about.
The following section from the NEWS.Rd file has full details.
Thanks to my CRANberries, there is a diff to the previous release. The RProtoBuf page has copies of the (older) package vignette, the quick overview vignette, and the pre-print of our JSS paper. Questions, comments etc should go to the GitHub issue tracker off the GitHub repo.

Changes in RProtoBuf version 0.4.23 (2022-12-13)

- More robust tests using toTextFormat() (Xufei Tan in #99 addressing #98)
- Various standard packaging updates to CI and badges (Dirk)
- Improvements to string construction in error messages (Michael Chirico in #102 and #103)
- Accommodate ProtoBuf 26.x and later (Matteo Gianella in #104)
- Accommodate ProtoBuf 6.30.9 and later (Lev Kandel in #106)
- Correct bashism issues in configure.ac (Dirk)
This post by Dirk Eddelbuettel originated on his Thinking inside the box blog. If you like this or other open-source work I do, you can sponsor me at GitHub.
.Call(symbol) as well as for the url to the Rcpp book (which has remained unchanged for years) failing. My email reply was promptly dealt with under European morning hours, and by the time I got up the submission was in state 'waiting' over a single reverse-dependency failure, which is also spurious, appears on some systems and not others, and is also not new. Imagine that: nearly 3000 reverse dependencies and only one (spurious) change to worse. Solid testing seems to help. My thanks as always to the CRAN team for responding promptly.
This release continues with the six-months January-July cycle started with release 1.0.5 in July 2020. This time we also need a one-off hotfix release 1.0.13-1: we had (accidentally) conditioned an upcoming R change on 4.5.0, but it already came with 4.4.2 so we needed to adjust our code. As a reminder, we do of course make interim snapshot dev or rc releases available via the Rcpp drat repo as well as the r-universe page and repo, and strongly encourage their use and testing; I run my systems with these versions, which tend to work just as well, and are also fully tested against all reverse dependencies.
Rcpp has long established itself
as the most popular way of enhancing R with C or C++ code. Right now,
2977 packages on CRAN depend on
Rcpp for making analytical code go
faster and further. On CRAN, 13.6% of all packages depend (directly) on
Rcpp, and 60.8% of all compiled
packages do. From the cloud mirror of CRAN (which is but a subset of all
CRAN downloads), Rcpp has been
downloaded 93.7 million times. The two published papers (also included
in the package as preprint vignettes) have, respectively, 1947 (JSS, 2011) and 354 (TAS, 2018)
citations, while the book (Springer useR!,
2013) has another 676.
This release is primarily incremental as usual, generally preserving existing capabilities faithfully while smoothing out corners and / or extending slightly, sometimes in response to changing and tightened demands from CRAN or R standards. The move towards a more standardized approach for the C API of R once again led to a few changes; Kevin once again did most of these PRs. Other contributed PRs include Gábor permitting builds on yet another BSD variant, Simon Guest correcting sourceCpp() to work on read-only files, Marco Colombo correcting a (surprisingly large) number of vignette typos, Iñaki rebuilding some documentation files that tickled (false) alerts, and I took care of a number of other maintenance items along the way.
The full list below details all changes, their respective PRs and, if
applicable, issue tickets. Big thanks from all of us to all
contributors!
Thanks to my CRANberries, you can also look at a diff to the previous release. Questions, comments etc should go to the rcpp-devel mailing list off the R-Forge page. Bug reports are welcome at the GitHub issue tracker as well (where one can also search among open or closed issues).

Changes in Rcpp release version 1.0.14 (2025-01-11)

- Changes in Rcpp API:
  - Support for user-defined databases has been removed (Kevin in #1314 fixing #1313)
  - The SET_TYPEOF function and macro is no longer used (Kevin in #1315 fixing #1312)
  - An erroneous cast to int affecting large return objects has been removed (Dirk in #1335 fixing #1334)
  - Compilation on DragonFlyBSD is now supported (Gábor Csárdi in #1338)
  - Use read-only VECTOR_PTR and STRING_PTR only with R 4.5.0 or later (Kevin in #1342 fixing #1341)
- Changes in Rcpp Attributes:
- Changes in Rcpp Deployment:
  - One unit test for arm64 macOS has been adjusted; a macOS continuous integration runner was added (Dirk in #1324)
  - Authors@R is now used in DESCRIPTION as mandated by CRAN; the Rcpp.package.skeleton() function also creates it (Dirk in #1325 and #1327)
  - A single datetime format test has been adjusted to match a change in R-devel (Dirk in #1348 fixing #1347)
- Changes in Rcpp Documentation:
This post by Dirk Eddelbuettel originated on his Thinking inside the box blog. If you like this or other open-source work I do, you can sponsor me at GitHub.
On the ~/.config/sway/config file I tuned some things:

- The output configuration, applied when sway is started / reloaded (I adjusted my configuration with wdisplays and used shikanectl to save it).
- The xdg-desktop-portal-wlr service.
- A swayidle command to lock the screen after some time of inactivity.
- The es key map.
- grim and swappy to take screenshots.
- waybar as the environment bar.
- A script executed when sway is started (it uses swaymsg to execute background commands and the i3toolwait script to wait for the launched applications):

#!/bin/sh
# VARIABLES
CHROMIUM_LOCAL_STATE="$HOME/.config/google-chrome/Local State"
I3_TOOLWAIT="$HOME/.config/sway/scripts/i3-toolwait"
# Functions
chromium_profile_dir() {
  jq -r ".profile.info_cache | to_entries | map({(.value.name): .key}) | add | .\"$1\" // \"\"" "$CHROMIUM_LOCAL_STATE"
}
# MAIN
IGZ_PROFILE_DIR="$(chromium_profile_dir "sergio.talens@intelygenz.com")"
OURO_PROFILE_DIR="$(chromium_profile_dir "sergio.talens@nxr.global")"
PERSONAL_PROFILE_DIR="$(chromium_profile_dir "stalens@gmail.com")"
# Common programs
swaymsg "exec nextcloud --background"
swaymsg "exec nm-applet"
# Run spotify on the first workspace (it is mapped to the laptop screen)
swaymsg -q "workspace 1"
$I3_TOOLWAIT "spotify"
# Run tmux on the second workspace
swaymsg -q "workspace 2"
$I3_TOOLWAIT -- foot tmux a -dt sto
wp_num="3"
if [ "$OURO_PROFILE_DIR" ]; then
swaymsg -q "workspace $wp_num"
$I3_TOOLWAIT -m ouro-browser -- google-chrome --profile-directory="$OURO_PROFILE_DIR"
wp_num="$((wp_num+1))"
fi
if [ "$IGZ_PROFILE_DIR" ]; then
swaymsg -q "workspace $wp_num"
$I3_TOOLWAIT -m igz-browser -- google-chrome --profile-directory="$IGZ_PROFILE_DIR"
wp_num="$((wp_num+1))"
fi
if [ "$PERSONAL_PROFILE_DIR" ]; then
swaymsg -q "workspace $wp_num"
$I3_TOOLWAIT -m personal-browser -- google-chrome --profile-directory="$PERSONAL_PROFILE_DIR"
wp_num="$((wp_num+1))"
fi
# Open the browser without setting the profile directory if none was found
if [ "$wp_num" = "3" ]; then
swaymsg -q "workspace $wp_num"
$I3_TOOLWAIT google-chrome
wp_num="$((wp_num+1))"
fi
swaymsg -q "workspace $wp_num"
$I3_TOOLWAIT evolution
wp_num="$((wp_num+1))"
swaymsg -q "workspace $wp_num"
$I3_TOOLWAIT slack
wp_num="$((wp_num+1))"
# Open a private browser and a console in the last workspace
swaymsg -q "workspace $wp_num"
$I3_TOOLWAIT -- google-chrome --incognito
$I3_TOOLWAIT foot
# Go back to the second workspace for keepassxc
swaymsg "workspace 2"
$I3_TOOLWAIT keepassxc
vi/vim and emacs as my text editors (vi for plain text and emacs for programming and editing HTML/XML), but eventually I moved to vim as my main text editor and I've been using it since (well, I moved to neovim some time ago, although I kept my old vim configuration).
To be fair I'm not as expert as I could be with vim, but I'm productive with it and it has many plugins that make my life easier on my machines, while keeping my ability to edit text and configurations on any system that has a vi-compatible editor installed.
For work reasons I tried to use Visual Studio Code last year, but I've never really liked it and almost everything I do with it I can do with neovim (i.e. I even use copilot with it). Besides, I'm a heavy terminal user (I use tmux locally and via ssh) and I like to be able to use my text editor in my shell sessions, and code does not work like that.
The only annoying thing about vim/neovim is its configuration (well, the problem is that I have a very old one and probably should spend some time fixing and updating it), but, as I said, it's been working well for me for a long time, so I never really had the motivation to do it.
Anyway, after finishing my desktop tests I saw that I had had the Helix editor installed for some time but had never tried it, so I decided to give it a try and see if it could be a good replacement for neovim on my environments (the only drawback is that, as it is not vi-compatible, I would need to switch back to vi mode when working on remote systems, but I guess I could live with that).
I ran the helix tutorial and I liked it, so I decided to configure and install the Language Servers I can probably take advantage of in my daily work on my personal and work machines and see how it works.
# AWK
sudo npm i -g 'awk-language-server@>=0.5.2'
# BASH
sudo apt-get install shellcheck shfmt
sudo npm i -g bash-language-server
# C/C++
sudo apt-get install clangd
# CSS, HTML, ESLint, JSON, SCSS
sudo npm i -g vscode-langservers-extracted
# Docker
sudo npm install -g dockerfile-language-server-nodejs
# Docker compose
sudo npm install -g @microsoft/compose-language-service
# Helm
app="helm_ls_linux_amd64"
url="$(
  curl -s https://api.github.com/repos/mrjosh/helm-ls/releases/latest |
    jq -r ".assets[] | select(.name == \"$app\") | .browser_download_url"
)"
curl -L "$url" --output /tmp/helm_ls
sudo install /tmp/helm_ls /usr/local/bin
rm /tmp/helm_ls
# Markdown
app="marksman-linux-x64"
url="$(
  curl -s https://api.github.com/repos/artempyanykh/marksman/releases/latest |
    jq -r ".assets[] | select(.name == \"$app\") | .browser_download_url"
)"
curl -L "$url" --output /tmp/marksman
sudo install /tmp/marksman /usr/local/bin
rm /tmp/marksman
# Python
sudo npm i -g pyright
# Rust
rustup component add rust-analyzer
# SQL
sudo npm i -g sql-language-server
# Terraform
sudo apt-get install terraform-ls
# TOML
cargo install taplo-cli --locked --features lsp
# YAML
sudo npm install --global yaml-language-server
# JavaScript, TypeScript
sudo npm install -g typescript-language-server typescript
sudo npm install -g --save-dev --save-exact @biomejs/biome
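Before wiring these into an editor, a quick loop can confirm the binaries actually landed on PATH (the list below is just a subset; extend it as needed):

for ls_bin in bash-language-server clangd marksman pyright terraform-ls; do
  command -v "$ls_bin" >/dev/null || echo "missing: $ls_bin"
done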
The helix configuration is done on a couple of toml files that are placed on the ~/.config/helix directory; the config.toml file I used is this one:
theme = "solarized_light"
[editor]
line-number = "relative"
mouse = false
[editor.statusline]
left = ["mode", "spinner"]
center = ["file-name"]
right = ["diagnostics", "selections", "position", "file-encoding", "file-line-ending", "file-type"]
separator = " "
mode.normal = "NORMAL"
mode.insert = "INSERT"
mode.select = "SELECT"
[editor.cursor-shape]
insert = "bar"
normal = "block"
select = "underline"
[editor.file-picker]
hidden = false
[editor.whitespace]
render = "all"
[editor.indent-guides]
render = true
character = "▏" # Some characters that work well: "▏", "┆", "┊", "⸽"
skip-levels = 1
The language servers are configured on the languages.toml file:
[[language]]
name = "go"
auto-format = true
formatter = { command = "goimports" }
[[language]]
name = "javascript"
language-servers = [
"typescript-language-server", # optional
"vscode-eslint-language-server",
]
[language-server.rust-analyzer.config.check]
command = "clippy"
[language-server.sql-language-server]
command = "sql-language-server"
args = ["up", "--method", "stdio"]
[[language]]
name = "sql"
language-servers = [ "sql-language-server" ]
[[language]]
name = "hcl"
language-servers = [ "terraform-ls" ]
language-id = "terraform"
[[language]]
name = "tfvars"
language-servers = [ "terraform-ls" ]
language-id = "terraform-vars"
[language-server.terraform-ls]
command = "terraform-ls"
args = ["serve"]
[[language]]
name = "toml"
formatter = { command = "taplo", args = ["fmt", "-"] }
[[language]]
name = "typescript"
language-servers = [
"typescript-language-server",
"vscode-eslint-language-server",
]
I liked helix, and the most interesting thing for me was the easy configuration and the language server integrations; but as I am already comfortable with neovim and had just installed the language server support tools on my machines, I just need to configure them for neovim and I can keep using it for a while.
As I said, my configuration is old; to configure neovim I have the following init.vim file on my ~/.config/nvim folder:
set runtimepath^=~/.vim runtimepath+=~/.vim/after
let &packpath=&runtimepath
source ~/.vim/vimrc
" load lua configuration
lua require('config')
That sources my old vimrc (it is a little bit messy, but it works) and I use a lua configuration file for the language servers and some additional neovim plugins on the ~/.config/nvim/lua/config.lua file:
-- -----------------------
-- BEG: LSP Configurations
-- -----------------------
-- AWK (awk_ls)
require'lspconfig'.awk_ls.setup{}
-- Bash (bashls)
require'lspconfig'.bashls.setup{}
-- C/C++ (clangd)
require'lspconfig'.clangd.setup{}
-- CSS (cssls)
require'lspconfig'.cssls.setup{}
-- Docker (dockerls)
require'lspconfig'.dockerls.setup{}
-- Docker Compose
require'lspconfig'.docker_compose_language_service.setup{}
-- Golang (gopls)
require'lspconfig'.gopls.setup{}
-- Helm (helm_ls)
require'lspconfig'.helm_ls.setup{}
-- Markdown
require'lspconfig'.marksman.setup{}
-- Python (pyright)
require'lspconfig'.pyright.setup{}
-- Rust (rust-analyzer)
require'lspconfig'.rust_analyzer.setup{}
-- SQL (sqlls)
require'lspconfig'.sqlls.setup{}
-- Terraform (terraformls)
require'lspconfig'.terraformls.setup{}
-- TOML (taplo)
require'lspconfig'.taplo.setup{}
-- Typescript (ts_ls)
require'lspconfig'.ts_ls.setup{}
-- YAML (yamlls)
require'lspconfig'.yamlls.setup{
  settings = {
    yaml = {
      customTags = { "!reference sequence" }
    }
  }
}
-- -----------------------
-- END: LSP Configurations
-- -----------------------
-- ---------------------------------
-- BEG: Autocompletion configuration
-- ---------------------------------
-- Ref: https://github.com/neovim/nvim-lspconfig/wiki/Autocompletion
--
-- Pre requisites:
--
-- # Packer
-- git clone --depth 1 https://github.com/wbthomason/packer.nvim \
-- ~/.local/share/nvim/site/pack/packer/start/packer.nvim
--
-- # Start nvim and run :PackerSync or :PackerUpdate
-- ---------------------------------
local use = require('packer').use
require('packer').startup(function()
use 'wbthomason/packer.nvim' -- Packer, useful to avoid removing it with PackerSync / PackerUpdate
use 'neovim/nvim-lspconfig' -- Collection of configurations for built-in LSP client
use 'hrsh7th/nvim-cmp' -- Autocompletion plugin
use 'hrsh7th/cmp-nvim-lsp' -- LSP source for nvim-cmp
use 'saadparwaiz1/cmp_luasnip' -- Snippets source for nvim-cmp
use 'L3MON4D3/LuaSnip' -- Snippets plugin
end)
-- Add additional capabilities supported by nvim-cmp
local capabilities = require("cmp_nvim_lsp").default_capabilities()
local lspconfig = require('lspconfig')
-- Enable some language servers with the additional completion capabilities offered by nvim-cmp
local servers = { 'clangd', 'rust_analyzer', 'pyright', 'ts_ls' }
for _, lsp in ipairs(servers) do
  lspconfig[lsp].setup {
    -- on_attach = my_custom_on_attach,
    capabilities = capabilities,
  }
end
-- luasnip setup
local luasnip = require 'luasnip'
-- nvim-cmp setup
local cmp = require 'cmp'
cmp.setup {
  snippet = {
    expand = function(args)
      luasnip.lsp_expand(args.body)
    end,
  },
  mapping = cmp.mapping.preset.insert({
    ['<C-u>'] = cmp.mapping.scroll_docs(-4), -- Up
    ['<C-d>'] = cmp.mapping.scroll_docs(4), -- Down
    -- C-b (back) C-f (forward) for snippet placeholder navigation.
    ['<C-Space>'] = cmp.mapping.complete(),
    ['<CR>'] = cmp.mapping.confirm {
      behavior = cmp.ConfirmBehavior.Replace,
      select = true,
    },
    ['<Tab>'] = cmp.mapping(function(fallback)
      if cmp.visible() then
        cmp.select_next_item()
      elseif luasnip.expand_or_jumpable() then
        luasnip.expand_or_jump()
      else
        fallback()
      end
    end, { 'i', 's' }),
    ['<S-Tab>'] = cmp.mapping(function(fallback)
      if cmp.visible() then
        cmp.select_prev_item()
      elseif luasnip.jumpable(-1) then
        luasnip.jump(-1)
      else
        fallback()
      end
    end, { 'i', 's' }),
  }),
  sources = {
    { name = 'nvim_lsp' },
    { name = 'luasnip' },
  },
}
-- ---------------------------------
-- END: Autocompletion configuration
-- ---------------------------------
I will keep helix installed and try it again on some of my personal projects to see if I can get used to it, but for now I'll stay with neovim as my main text editor and learn the shortcuts to use it with the language servers.

R_NO_REMAP by conditioning on the release version. It turns out that this does not get set when the #define is used independently, so this needed a small refinement, which this version brings. No other changes were made.
The NEWS
extract follows and details the changes some
more.
Thanks to my CRANberries, you can also look at a diff to the previous release. Questions, comments etc should go to the rcpp-devel mailing list off the R-Forge page. Bug reports are welcome at the GitHub issue tracker as well (where one can also search among open or closed issues).

Changes in inline version 0.3.21 (2025-01-08)

- Refine use of Rf_warning in cfunction, setting -DR_NO_REMAP ourselves to get R-version-independent state
This post by Dirk Eddelbuettel originated on his Thinking inside the box blog. If you like this or other open-source work I do, you can now sponsor me at GitHub.
amd64 architecture, naturally, but it also is building Debian packages that are marked with the no-architecture label, all. The second builder is, however, only rebuilding the i386 architecture.
Both of these services were also switched to reproduce the Debian trixie distribution instead of unstable, which started with 43% of the archive rebuilt, with 79.3% reproduced successfully. This is very much a work in progress, and we'll start reproducing Debian unstable soon.
Our i386 hosts are very kindly sponsored by Infomaniak whilst the amd64 node is sponsored by OSUOSL; thank you! Indeed, we are looking for more workers for more Debian architectures; please contact us if you are able to help.
apt install debian-repro-status.
glibc
within openSUSE.
kpcyrd followed up on a post from September 2024 which mentioned their desire for someone to implement "a hashset of allowed module hashes that is generated during the kernel build and then embedded in the kernel image", thus enabling a deterministic and reproducible build. However, they are now reporting that "somebody implemented the hash-based allow list feature and submitted it to the Linux kernel mailing list". Like kpcyrd, we hope it gets merged.
First, we propose an automated approach for library reproducibility to enhance library security during the deployment phase. We then develop a scalable call graph generation technique to support various use cases, such as method-level vulnerability analysis and change impact analysis, which help mitigate security challenges within the ecosystem. Utilizing the generated call graphs, we explore the impact of libraries on their users. Finally, through empirical research and mining techniques, we investigate the current state of the Maven ecosystem, identify harmful practices, and propose recommendations to address them.

A PDF of Mehdi's entire thesis is available to download.
283 and 284 to Debian:

- … tests_quines.py::test_{differences,differences_deb} to simply use assert_diff and not mangle the test fixture. [ ]

0.11+nmu4 of the dh-buildinfo package. In this release, dh_buildinfo becomes a no-op, i.e. it no longer does anything beyond warning the developer that the dh-buildinfo package is now obsolete. In his upload, Santiago wrote that "We still want packages to drop their [dependency] on dh-buildinfo, but now they will immediately benefit from this change after a simple rebuild."
dpkg
.
debian-devel development mailing list on the topic of "Supporting alternative zlib implementations". In particular, Fay wrote about her results experimenting with whether zlib-ng produces identical results or not.
rust-rebuilderd-worker
, rust-derp
, rust-in-toto
and debian-repro-status
to Debian, which passed successfully through the so-called NEW queue.
debrebuild
component/script of the devscripts
package, including:
dh-r
package to report that the Recommends
and Suggests
fields are missing from rebuilt R packages. At the time of writing, this bug has no patch and needs some help to make over 350 binary packages reproducible.
-fobject-determinism, which enables "deterministic object code generation".
tests.reproducible-builds.org/archlinux now redirects to reproducible.archlinux.org instead. In fact, everything Arch-related has now been removed from the jenkins.debian.net.git repository, as those continuous integration tests have been disabled for some time.
0.7.29
was uploaded to Debian unstable by Vagrant Cascadian. It included contributions already covered in previous months as well as new ones from Rebecca N. Palmer, such as:
cargo-packaging/rusty_v8
, cockpit
, collectd
, deepin-daemon
, deepin-file-manager
, esbuild
, grpc
, hyperkitty
, icedtea-web
, java-atk-wrapper
, kdenetwork-filesharing
, kicad
, kompare
, librespeed-cli
, lincity-ng
, mraa
, ollama
, opa-fmgui
, opencryptoki
, opencryptoki
, openmpi4:gnu-hpc
, openwsman
, patterns-microos
, portmidi
, presage
, procps
, sad
, scons/nst
, sendmail
, static-initrd
, suse-hpc
, swtpm
, tiny
, vtk
, xdg-desktop-portal
and yast
.
- … pyorbital.
- … python-pbcore.
- … dictionaries-common.
- … i386.reproduce.debian.net rebuilder. [ ][ ][ ][ ][ ][ ]
- … i386.reproduce.debian.net run on a public port to allow external workers. [ ]
- … /api/v0/pkgs/list endpoint. [ ]
- … arch:any and arch:all on the amd64 architecture, but only arch:any on i386. [ ]
- … dstat on Jenkins nodes anymore as it's been removed from Debian trixie. [ ]
- … infom08-i386 node to become another rebuilder. [ ]
- … reproducible_pool_buildinfos.sh script. [ ]
- … installation-birthday everywhere. [ ]
- … Recommends by default on Jenkins nodes. [ ]
- … rebuilder_stats.py to rebuilderd_stats.py. [ ]
- … /etc/cron.d/ with the correct permissions. [ ]
- … .buildinfo on buildinfos.debian.net files. [ ][ ][ ]
- … rebuilder_stats.py scripts. [ ]
- … dpkg.selections file. [ ]

In addition, Roland Clobus updated the Jenkins log parser to parse warnings from diffoscope [ ] and Mattia Rizzolo banned a number of bots and crawlers from the service [ ][ ].
#reproducible-builds on irc.oftc.net.
rb-general@lists.reproducible-builds.org
python3-django-jsonfield in the code (it was superseded by a Django-native field). Thanks to Philipp Kern from the Debian System Administrators team, the upgrade happened on December 23rd.
Raphaël also improved distro-tracker to better deal with invalid Maintainer fields, which recently caused multiple issues in the regular data updates (#1089985, MR 105). While working on this, he filed #1089648 asking dpkg tools to error out early when maintainers make such mistakes.
Finally, he provided feedback on multiple issues and merge requests (MR 106, issues #21, #76, #77); there seems to be a surge of interest in distro-tracker lately. It would be nice if those new contributors could stick around and help out with the significant backlog of issues (in the Debian BTS, in Salsa).
build-essential, by Helmut Grohne
Building on the gcc-for-host work of last December, a notable patch turning build-essential Multi-Arch: same became feasible. Whilst the change is small, its implications and foundations are not. We still install crossbuild-essential-$ARCH for cross building and, due to a britney2 limitation, we cannot have it depend on the host's C library. As a result, there are workarounds in place for sbuild and pbuilder.
In turning build-essential Multi-Arch: same, we may actually express these dependencies directly as we install build-essential:$ARCH instead. The crossbuild-essential-$ARCH packages will continue to be available as transitional dummy packages.
bubblewrap, e2fsprogs, libvpd-2.2-3, and pam-tmpdir, and corresponded on related issues such as kexec-tools and live-build. The removal of the usrmerge package unfortunately broke debootstrap and was quickly reverted. Continued fallout is expected and will continue until trixie
is released.
… gnu-efi in the process.
… nproc is not a good place for this functionality.

Publisher: Erewhon
Copyright: November 2024
ISBN: 1-64566-099-0
Format: Kindle
Pages: 443
Know I adore you. Look out over the glow. The cities sundered, their machines inverted, mountains split and prairies blazing, that long foreseen Hereafter crowning fast. This calamity is a promise made to you. A prayer to you, and to your shadow which has become my second self, tucked behind my eye and growing in tandem with me, pressing outwards through the pupil, the smarter, truer, almost bursting reason for our wrath. Do not doubt me. Just look. Watch us rise as the sun comes up over the beauty. The future stains the bleakness so pink. When my violence subsides, we will have nothing, and be champions.

Marney Honeycutt is twelve years old, a factory worker, and lustertouched. She works in the Yann I. Chauncey Ichorite Foundry in Ignavia City, alongside her family and her best friend, shaping the magical metal ichorite into the valuable industrial products of a new age of commerce and industry. She is the oldest of the lustertouched, the children born to factory workers and poisoned by the metal. It has made her allergic, prone to fits at any contact with ichorite, but also able to exert a strange control over the metal if she's willing to pay the price of spasms and hallucinations for hours afterwards. As Metal from Heaven opens, the workers have declared a strike. Her older sister is the spokesperson, demanding shorter hours, safer working conditions, and an investigation into the health of the lustertouched children. Chauncey's response is to send enforcer snipers to kill the workers, including the entirety of her family.
The girl sang, "Unalone toward dawn we go, toward the glory of the new morning." An enforcer shot her in the belly, and when she did not fall, her head.

Marney survives, fleeing into the city, swearing an impossible personal revenge against Yann Chauncey. An act of charity gets her a ticket on a train into the countryside. The woman who bought her ticket is a bandit who is on the train to rob it. Marney's ability to control ichorite allows her to help the bandits in return, winning her a place with the Highwayman's Choir who have been preying on the shipments of the rich and powerful and then disappearing into the hills. The Choir's secret is that the agoraphobic and paranoid Baron of the Fingerbluffs is dead and has been for years. He was killed by his staff, Hereafterist idealists, who have turned his remote territory into an anarchist commune and haven for pirates and bandits. This becomes Marney's home and the Choir becomes her family, but she never forgets her oath of revenge or the childhood friend she left behind in the piles of bodies and to whom this story is narrated.

First, Clarke's writing is absolutely gorgeous.
We scaled the viny mountain jags at Montrose Barony's legal edge, the place where land was and wasn't Ignavia, Royston, and Drustland alike. There was a border but it was diffuse and hallucinatory, even more so than most. On legal papers and state maps there were harsh lines that squashed topography and sanded down the mountains into even hills in planter's rows, but here among the jutting rocks and craggy heather, the ground was lineless.

The rhythm of it, the grasp of contrast and metaphor, the word choice! That climactic word "lineless," with its echo of limitless. So good.

Second, this is the rarest of books: a political fantasy that takes class and religion seriously and uses them for more than plot drivers. This is not at all our world, and the technology level is somewhat ambiguous, but the parallels to the Gilded Age and Progressive Era are unmistakable. The Hereafterists that Marney joins are political anarchists, not in the sense of alternative governance structures and political theory sanitized for middle-class liberals, but in the sense of Emma Goldman and Peter Kropotkin. The society they have built in the Fingerbluffs is temporary, threatened, and contingent, but it is sincere and wildly popular among the people who already lived there. Even beyond politics, class is a tangible force in this book. Marney is a factory worker and the child of factory workers. She barely knows how to read and doesn't magically learn over the course of the book. She has friends who are clever in the sense rewarded by politics and nobility, who navigate bureaucracies and political nuance, but that is not Marney's world. When, towards the end of the book, she has to deal with a gathering of high-class women, the contrast is stark, and she navigates that gathering only by being entirely unexpected. Perhaps the best illustration of the subtlety of this is the terminology in the book for lesbian. Marney is a crawly, which is a slur thrown at people like her (and one of the rare fictional slurs that work exactly as the author intended) but is also simply what she calls herself. Whether or not it functions as a slur depends on context, and the context is never hard to understand. The high-class lesbians she meets later are Lunarists, and react to crawly as a vile and insulting word. They use language to separate themselves from both the insult and from the social class that uses it. Language is an indication of culture and manners and therefore of morality, unlike deeds, which admit endless justifications.
Conversation was fleeting. Perdita managed with whomever stood near her, chipper about every prettiness she saw, the flitting butterflies, the dappled light between the leaves, the lushness and the fragrance of untamed land, and her walking companions took turns sharing in her delight. It was infectious, how happy she was. She was going to slaughter millions. She was going to skip like this all the while.

The handling of religion is perhaps even better. Marney was raised a Tullian, which sits alongside two other fleshed-out fictional religions and sketches of several more. Tullians tend to be conservative and patriarchal, and Marney has a realistically complicated relationship with faith: sticking with some Tullian worship practices and gestures because they're part of who she is, feeling a kinship to other Tullians, discarding beliefs that don't fit her, and revising others. Every major religion has a Hereafterist spin or reinterpretation that upends or reverses the parts of the religion that were used to prop up the existing social order and brings it more in line with Hereafterist ideals. We see the Tullian Hereafterist variation in detail, and as someone who has studied a lot of methods of reinterpreting Christianity, I was impressed by how well Clarke invents both a belief system and its revisionist rewrite. This is exactly how religions work in human history, but one almost never sees this subtlety in fantasy novels.

Marney's allergy to ichorite causes her internal dialogue to dissolve into hallucinatory synesthesia when she's manipulating or exposed to it. Since that's most of the book, substantial portions read like drug trips with growing body horror. I normally hate this type of narration, so it's a sign of just how good Clarke's writing is that I tolerated it and even enjoyed parts. It helps that the descriptions are irreverent and often surprising, full of unexpected metaphors and sudden turns. It's very hard not to quote paragraph after paragraph of this book.

Clarke is also doing a lot with gender that I don't feel qualified to comment in detail on, but it would not surprise me to see this book in the Otherwise Award recommendation list. I can think of three significant male characters, all of whom are well-done, but every other major character is female by at least some gender definition. Within that group, though, is huge gender diversity of the complicated and personal type that doesn't force people into defined boxes. Marney's sexuality is similarly unclassified and sometimes surprising. My one complaint is that I thought the sex scenes (which, to warn, are often graphic) fell into the literary fiction trap of being described so closely and physically that it didn't feel like anyone involved was actually enjoying themselves. (This is almost certainly a matter of personal taste.)

I had absolutely no idea how Clarke was going to end this book, and the last couple of chapters caught me by surprise. I'm still not sure what I think about the climax. It's not the ending that I wanted, but one of the merits of this book is that it never did what I thought I wanted and yet made me enjoy the journey anyway. It is, at least, a genre ending, not a literary ending: The reader gets a full explanation of what is going on, and the setting is not static the way that it so often is in literary fiction. The characters can change the world, for good or for ill.
The story felt frustrating and incomplete when I first finished it, but I haven't stopped thinking about this book and I think I like the shape of it a bit more now. It was certainly unexpected, at least by me. Clarke names Dhalgren as one of their influences in the acknowledgments, and yes, Metal from Heaven is that kind of book. This is the first 2024 novel I've read that felt like the kind of book that should be on award shortlists. I'm not sure it was entirely successful, and there are parts of it that I didn't like or that weren't for me, but it's trying to do something different and challenging and uncomfortable, and I think it mostly worked. And the writing is so good.
She looked like a mythic princess from the old woodcuts, who ruled nature by force of goodness and faith and had no legal power.

Metal from Heaven is not going to be everyone's taste. If you do not like literary fantasy, there is a real chance that you will hate this. I am very glad that I read it, and also am going to take a significant break from difficult books before I tackle another one. But then I'm probably going to try the Scapegracers series, because Clarke is an author I want to follow.

Content notes: Explicit sex, including sadomasochistic sex. Political violence, mostly by authorities. Murdered children, some body horror, and a lot of serious injuries and death.

Rating: 8 out of 10
package Foo;
use v5.40;
use Moose;
has 'attribute' => (
is => 'ro',
isa => 'Str',
required => 1
);
sub say_something {
    my $self = shift;
    say "Hello there, our attribute is " . $self->attribute;
}
The above is a class that has a single attribute called attribute
.
To create an object, you use the Moose constructor on the class, and
pass it the attributes you want:
use v5.40;
use Foo;
my $foo = Foo->new(attribute => "foo");
$foo->say_something;
(output: Hello there, our attribute is foo
)
This creates a new object with the attribute attribute set to foo.
The attribute
accessor is a method generated by Moose, which functions
both as a getter and a setter (though in this particular case we made
the attribute "ro", meaning read-only, so while it can be set at object
creation time it cannot be changed by the setter anymore). So yay, an
object.
And it has methods, things that we set ourselves. Basic OO, all that.
One of the peculiarities of perl is its concept of "lists". Not to be
confused with the lists of python -- a concept that is called "arrays"
in perl and is somewhat different -- in perl, lists are enumerations of
values. They can be used as initializers for arrays or hashes, and they
are used as arguments to subroutines. Lists cannot be nested; whenever a
hash or array is passed in a list, the list is "flattened", that is, it
becomes one big list.
This means that the below script is functionally equivalent to the above
script that uses our "Foo" object:
use v5.40;
use Foo;
my %args;
$args{attribute} = "foo";
my $foo = Foo->new(%args);
$foo->say_something;
(output: Hello there, our attribute is foo
)
This creates a hash %args
wherein we set the attributes that we want
to pass to our constructor. We set one attribute in %args
, the one
called attribute
, and then use %args
and rely on list flattening to
create the object with the same attribute set (list flattening turns a
hash into a list of key-value pairs).
Perl also has a concept of "references". These are scalar values that
point to other values; the other value can be a hash, a list, or another
scalar. There is syntax to create a non-scalar value at assignment time,
called anonymous references, which is useful when one wants to remember
non-scoped values. By default, references are not flattened, and this
is what allows you to create multidimensional values in perl; however,
it is possible to request list flattening by dereferencing the
reference. The below example, again functionally equivalent to the
previous two examples, demonstrates this:
use v5.40;
use Foo;
my $args = {};
$args->{attribute} = "foo";
my $foo = Foo->new(%$args);
$foo->say_something;
(output: Hello there, our attribute is foo
)
This creates a scalar $args, which is a reference to an anonymous hash. Then, we set the key attribute of that anonymous hash to foo (note the use of the arrow operator here, which is used to indicate that we want to dereference a reference to a hash), and create the object using that reference, requesting hash dereferencing and flattening by using a double sigil, %$.
As a side note, objects in perl are references too, hence the fact that
we have to use the dereferencing arrow to access the attributes and
methods of Moose objects.
Moose attributes don't have to be strings or even simple scalars. They
can also be references to hashes or arrays, or even other objects:
package Bar;
use v5.40;
use Moose;
extends 'Foo';
has 'hash_attribute' => (
is => 'ro',
isa => 'HashRef[Str]',
predicate => 'has_hash_attribute',
);
has 'object_attribute' => (
is => 'ro',
isa => 'Foo',
predicate => 'has_object_attribute',
);
sub say_something {
    my $self = shift;
    if ($self->has_object_attribute) {
        $self->object_attribute->say_something;
    }
    $self->SUPER::say_something unless $self->has_hash_attribute;
    say "We have a hash attribute!" if $self->has_hash_attribute;
}
This creates a subclass of Foo
called Bar
that has a hash
attribute called hash_attribute
, and an object attribute called
object_attribute
. Both of them are references; one to a hash, the
other to an object. The hash ref is further limited in that it requires
that each value in the hash must be a string (this is optional but can
occasionally be useful), and the object ref in that it must refer to an
object of the class Foo
, or any of its subclasses.
The predicates
used here are extra subroutines that Moose provides if
you ask for them, and which allow you to see if an object's attribute
has a value or not.
The example script would use an object like this:
use v5.40;
use Bar;
my $foo = Foo->new(attribute => "foo");
my $bar = Bar->new(object_attribute => $foo, attribute => "bar");
$bar->say_something;
(output: Hello there, our attribute is foo
)
This example also shows object inheritance, and methods implemented in
child classes.
Okay, that's it for perl and Moose basics. On to...
In the above Bar package, we could use coercion to eliminate one object creation step from the creation of a Bar object:
package Bar;
use v5.40;
use Moose;
use Moose::Util::TypeConstraints;
extends "Foo";
coerce "Foo",
    from "HashRef",
    via { Foo->new(%$_) };
has 'hash_attribute' => (
is => 'ro',
isa => 'HashRef',
predicate => 'has_hash_attribute',
);
has 'object_attribute' => (
is => 'ro',
isa => 'Foo',
coerce => 1,
predicate => 'has_object_attribute',
);
sub say_something {
    my $self = shift;
    if ($self->has_object_attribute) {
        $self->object_attribute->say_something;
    }
    $self->SUPER::say_something unless $self->has_hash_attribute;
    say "We have a hash attribute!" if $self->has_hash_attribute;
}
Okay, let's unpack that a bit.
First, we add the Moose::Util::TypeConstraints
module to our package.
This is required to declare coercions.
Then, we declare a coercion to tell Moose how to convert a HashRef
to
a Foo
object: by using the Foo
constructor on a flattened list
created from the hashref that it is given.
Then, we update the definition of the object_attribute
to say that it
should use coercions. This is not the default, because going through the
list of coercions to find the right one has a performance penalty, so if
the coercion is not requested then we do not do it.
This allows us to simplify declarations. With the updated Bar
class,
we can simplify our example script to this:
use v5.40;
use Bar;
my $bar = Bar->new(attribute => "bar", object_attribute => { attribute => "foo" });
$bar->say_something;
(output: Hello there, our attribute is foo
)
Here, the coercion kicks in because the value object_attribute
, which
is supposed to be an object of class Foo
, is instead a hash ref.
Without the coercion, this would produce an error message saying that
the type of the object_attribute
attribute is not a Foo
object. With
the coercion, however, the value that we pass to object_attribute
is
passed to a Foo constructor using list flattening, and then the
resulting Foo
object is assigned to the object_attribute
attribute.
Coercion works for more complicated things, too; for instance, you can
use coercion to coerce an array of hashes into an array of objects, by
creating a subtype first:
package MyCoercions;
use v5.40;
use Moose;
use Moose::Util::TypeConstraints;
use Foo;
subtype "ArrayOfFoo", as "ArrayRef[Foo]";
subtype "ArrayOfHashes", as "ArrayRef[HashRef]";
coerce "ArrayOfFoo", from "ArrayOfHashes", via { [ map { Foo->create(%$_) } @{$_} ] };
Ick. That's a bit more complex.
What happens here is that we use the map
function to iterate over a
list of values.
The given list of values is @{$_}, which is perl for "dereference the default value as an array reference, and flatten the list of values in that array reference".
So the ArrayRef
of HashRef
s is dereferenced and flattened, and each
HashRef
in the ArrayRef is passed to the map
function.
The map function then takes each hash ref in turn and passes it to the
block of code that it is also given. In this case, that block is
Foo->create(%$_)
. In other words, we invoke the create
factory
method with the flattened hashref as an argument. This returns an object
of the correct implementation (assuming our hash ref has a type
attribute set), and with all attributes of their object set to the
correct value. That value is then returned from the block (this could be
made more explicit with a return
call, but that is optional, perl
defaults a return value to the rvalue of the last expression in a
block).
The map function then returns a list of all the created objects, which we capture in an anonymous array ref (the [] square brackets), i.e., an ArrayRef of Foo objects, satisfying the Moose requirement of ArrayRef[Foo].
Usually, I tend to put my coercions in a special-purpose package.
Although it is not strictly required by Moose, I find that it is
useful to do this, because Moose does not allow a coercion to be defined
if a coercion for the same type had already been done in a different
package. And while it is theoretically possible to make sure you only
ever declare a coercion once in your entire codebase, I find that doing
so is easier to remember if you put all your coercions in a specific
package.
Okay, now you understand Moose object coercion! On to...
my $module = "Foo";
eval "require $module";
This loads "Foo" at runtime. Obviously, the $module string could be a
computed value, it does not have to be hardcoded.
There are some obvious downsides to doing things this way, mostly in the
fact that a computed value can basically be anything and so without
proper checks this can quickly become an arbitrary code vulnerability.
As such, there are a number of distributions on
CPAN to help you with the low-level stuff of
figuring out what the possible modules are, and how to load them.
For the purposes of my script, I used
Module::Pluggable. Its API
is fairly simple and straightforward:
package Foo;
use v5.40;
use Moose;
use Module::Pluggable require => 1;
has 'attribute' => (
is => 'ro',
isa => 'Str',
);
has 'type' => (
is => 'ro',
isa => 'Str',
required => 1,
);
sub handles_type {
    return 0;
}

sub create {
    my $class = shift;
    my %data = @_;
    foreach my $impl ($class->plugins) {
        if ($impl->can("handles_type") && $impl->handles_type($data{type})) {
            return $impl->new(%data);
        }
    }
    die "could not find a plugin for type " . $data{type};
}

sub say_something {
    my $self = shift;
    say "Hello there, I am a " . $self->type;
}
The new concept here is the plugins class method, which is added by Module::Pluggable, and which searches perl's library paths for all modules that are in our namespace. The namespace is configurable, but by default it is the name of our module; so in the above example, if there were a package "Foo::Bar", it would be found by plugins.

The create factory method iterates over those plugins; when one has a handles_type class method that returns a truthy value for the type key in the hash that is passed to the create subroutine, the create subroutine creates a new object of that plugin class, with the passed key/value pairs used as attribute initializers.

Let's look at such a Foo::Bar package:
package Foo::Bar;
use v5.40;
use Moose;
extends 'Foo';
has 'type' => (
is => 'ro',
isa => 'Str',
required => 1,
);
has 'serves_drinks' => (
is => 'ro',
isa => 'Bool',
default => 0,
);
sub handles_type {
    my $class = shift;
    my $type = shift;
    return $type eq "bar";
}

sub say_something {
    my $self = shift;
    $self->SUPER::say_something;
    say "I serve drinks!" if $self->serves_drinks;
}
We can now indirectly use the Foo::Bar
package in our script:
use v5.40;
use Foo;
my $obj = Foo->create(type => "bar", serves_drinks => 1);
$obj->say_something;
output:
Hello there, I am a bar.
I serve drinks!
Okay, now you understand all the bits and pieces that are needed to
understand how I created the DSL engine. On to...
The create factory method in the last version of our Foo package allows us to decide at run time which module to instantiate an object of, and to load that module at run time.
We can use coercion and list flattening to turn a reference to a hash
into an object of the correct type.
We haven't looked yet at how to turn a JSON data structure into a hash,
but that bit is actually ridiculously trivial:
use JSON::MaybeXS;
my $data = decode_json($json_string);
Tada, now $data is a reference to a deserialized version of the JSON
string: if the JSON string contained an object, $data is a hashref; if
the JSON string contained an array, $data is an arrayref, etc.
So, in other words, to create an extensible JSON-based DSL that is implemented by Moose objects, all we need to do is create a system that:

- uses Module::Pluggable to find the available object classes, and
- uses the type attribute to figure out which object class to use to create the object.

For instance, the DSL input could be a JSON file like this:

{
    "description": "do stuff",
    "actions": [
        {
            "type": "bar",
            "serves_drinks": true
        },
        {
            "type": "bar",
            "serves_drinks": false
        }
    ]
}
... and then we could have a Moose object definition like this:
package MyDSL;
use v5.40;
use Moose;
use MyCoercions;
has "description" => (
is => 'ro',
isa => 'Str',
);
has 'actions' => (
is => 'ro',
isa => 'ArrayOfFoo',
coerce => 1,
required => 1,
);
sub say_something {
    my $self = shift;
    say "Hello there, I am described as " . $self->description . " and I am performing my actions: ";
    foreach my $action (@{$self->actions}) {
        $action->say_something;
    }
}
Now, we can write a script that loads this JSON file and create a new
object using the flattened arguments:
use v5.40;
use MyDSL;
use JSON::MaybeXS;
my $input_file_name = shift;
my $args = do {
    local $/ = undef;
    open my $input_fh, "<", $input_file_name or die "could not open file";
    <$input_fh>;
};
$args = decode_json($args);
my $dsl = MyDSL->new(%$args);
$dsl->say_something;
Output:
Hello there, I am described as do stuff and I am performing my actions:
Hello there, I am a bar
I serve drinks!
Hello there, I am a bar
In some more detail, this will:

- read the JSON file and decode it;
- pass the resulting hash, flattened, to the constructor of the MyDSL class;
- the MyDSL class then uses those arguments to set its attributes, using Moose coercion to convert the "actions" array of hashes into an array of Foo::Bar objects;
- call the say_something method on the MyDSL object.

To extend the DSL, we could now add a Foo::Quux class, making sure it has a method handles_type that returns a truthy value when called with quux as the argument, and install it into the perl library path. This is rather easy to do.
It can even be extended deeper, too; if the quux
type requires a list
of arguments rather than just a single argument, it could itself also
have an array attribute with relevant coercions. These coercions could
then be used to convert the list of arguments into an array of objects
of the correct type, using the same schema as above.
The actual DSL is of course somewhat more complex, and also actually
does something useful, in contrast to the DSL that we define here which just
says things.
Creating an object that actually performs some action when required is
left as an exercise to the reader.
/es/distrib to /distrib/index.es.html
/es/social_contract to /social_contract.es.html
/es/intro/about to /intro/about.es.html
/da to /index.da.html
RewriteCond %{REQUEST_URI} ^/([a-z]{2}(?:-[a-z]{2})?)/(.*)$
RewriteCond %{DOCUMENT_ROOT}/$2/index.%1.html -f
RewriteCond %{DOCUMENT_ROOT}/$1/$2 !-d
RewriteRule ^/([a-z]{2}(?:-[a-z]{2})?)/(.*)$ /$2/index.%1.html [last,redirect]

RewriteCond %{REQUEST_URI} ^/([a-z]{2}(?:-[a-z]{2})?)/(.*)$
RewriteCond %{DOCUMENT_ROOT}/$2.%1.html -f
RewriteCond %{DOCUMENT_ROOT}/$1/$2 !-d
RewriteRule ^/([a-z]{2}(?:-[a-z]{2})?)/(.*)$ /$2.%1.html [last,redirect]
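A quick sanity check for rules like these (the hostname is a placeholder) is to request one of the language-prefixed paths and inspect the redirect target:

$ curl -sI https://example.org/es/social_contract | grep -i '^location'
# expected: a Location header pointing at /social_contract.es.html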
Next.