
4 March 2024

Paulo Henrique de Lima Santana: Bits from FOSDEM 2023 and 2024

Link to the Portuguese version

Intro Since 2019, I have traveled to Brussels at the beginning of the year to join FOSDEM, considered the largest and most important Free Software event in Europe. The 2024 edition was the fourth in-person edition in a row that I joined (2021 and 2022 did not happen due to COVID-19), always with financial help from Debian, which kindly paid for my flight tickets after the Debian Leader approved my travel-sponsorship request. In 2020 I wrote several posts with a very complete report of the days I spent in Brussels. But in 2023 I didn't write anything, and because last year and this year I coordinated a room dedicated to translations of Free Software and Open Source projects, I'm going to take the opportunity to write about these two years and what my experience was like. After my first trip to FOSDEM, I started to think that I could participate more actively than as a regular attendee, so I wanted to propose a talk to one of the rooms. But then I thought that instead of proposing a talk, I could organize a room for talks :-) on the topic of translations, which is something I'm very interested in, since I've been helping to translate Debian into Portuguese for a few years.

Joining FOSDEM 2023 In the second half of 2022 I did some research and saw that there had never been a room dedicated to translations, so when the FOSDEM organization opened the call for room proposals (called a DevRoom) for the 2023 edition, I sent a proposal for a translation room and it was accepted! After the room was confirmed, the next step was for me, as room coordinator, to publicize the call for talk proposals. I spent a few weeks wondering whether I would receive a good number of proposals or whether it would be a failure. But to my happiness, I received eight proposals, and due to time constraints I had to select six of them for the room's schedule. FOSDEM 2023 took place from February 4th to 5th, and the translation devroom was scheduled for the afternoon of the second day. The talks held in the room are listed below, and for each of them you can watch the recorded video. On the first day of FOSDEM I was at the Debian stand selling the t-shirts that I had brought from Brazil. People from France were also there selling other products, and it was cool to interact with people who visited the booth to buy something and/or talk about Debian.
FOSDEM 2023 photos

Joining FOSDEM 2024 The 2023 result motivated me to propose the translation devroom again when the FOSDEM 2024 organization opened the call for rooms. I was waiting to find out whether the FOSDEM organization would accept a room on this topic for the second year in a row, and to my delight, my proposal was accepted again :-) This time I received 11 proposals! And again due to time constraints, I had to select six of them for the room's schedule grid. FOSDEM 2024 took place from February 3rd to 4th, and the translation devroom was scheduled for the second day again, but this time in the morning. The talks held in the room are listed below, and for each of them you can watch the recorded video. This time I didn't help at the Debian stand because I couldn't bring t-shirts from Brazil to sell. So I just stopped by and talked to some people who were there, including some DDs (Debian Developers). But I volunteered for a few hours to operate the streaming camera in one of the main rooms.
FOSDEM 2024 photos

Conclusion The topics of the talks in these two years were quite diverse, and all the lectures were really very good. Across the 12 talks we could see how translations happen in projects such as KDE, PostgreSQL, Debian and Mattermost. We had presentations of tools such as LibreTranslate and Weblate, as well as scripts, AI, and data models. And also reports on the work carried out by communities in Africa, China and Indonesia. The room was full for some talks and emptier for others, but I was very satisfied with the final result of these two years. I leave my special thanks to Jonathan Carter, the Debian Leader, who approved my flight ticket requests so that I could join FOSDEM 2023 and 2024. This help was essential to making my trips to Brussels possible, because flight tickets are not cheap at all. I would also like to thank my wife Jandira, who has been my travel partner :-) As there has been an increase in the number of proposals received, I believe that interest in the translations devroom is growing. So I intend to send the devroom proposal to FOSDEM 2025, and if it is accepted, wait for the future Debian Leader to approve helping me with the flight tickets again. We'll see.

3 March 2024

Dirk Eddelbuettel: RcppArmadillo 0.12.8.1.0 on CRAN: Upstream Fix, Interface Polish

Armadillo is a powerful and expressive C++ template library for linear algebra and scientific computing. It aims towards a good balance between speed and ease of use, has a syntax deliberately close to Matlab, and is useful for algorithm development directly in C++, or quick conversion of research code into production environments. RcppArmadillo integrates this library with the R environment and language and is widely used by (currently) 1130 other packages on CRAN, downloaded 32.8 million times (per the partial logs from the cloud mirrors of CRAN), and the CSDA paper (preprint / vignette) by Conrad and myself has been cited 578 times according to Google Scholar. This release brings a new upstream bugfix release, Armadillo 12.8.1, prepared by Conrad yesterday. It was delayed for a few hours as CRAN noticed an error in one package, which we all concluded was spurious as it could not be reproduced outside of the one run there. Following from the previous release, we also use the slightly faster Lighter header in the examples. And once it got to CRAN I also updated the Debian package. The set of changes since the last CRAN release follows.

Changes in RcppArmadillo version 0.12.8.1.0 (2024-03-02)
  • Upgraded to Armadillo release 12.8.1 (Cortisol Injector)
    • Workaround in norm() for yet another bug in macOS accelerate framework
  • Update README for RcppArmadillo usage counts
  • Update examples to use '#include <RcppArmadillo/Lighter>' for faster compilation excluding unused Rcpp features
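As a minimal, hedged illustration of the Lighter header mentioned in the last bullet, here is a sketch of an exported function that uses it in place of the usual RcppArmadillo.h include; the function name and body are purely illustrative and are not part of the release:

// Sketch only: the Lighter header pulls in a slimmer set of Rcpp features,
// which speeds up compilation; the exported function below is made up.
// [[Rcpp::depends(RcppArmadillo)]]
#include <RcppArmadillo/Lighter>

// [[Rcpp::export]]
arma::mat crossprod_arma(const arma::mat& X) {
    // computes t(X) %*% X with Armadillo
    return X.t() * X;
}

Compiled via Rcpp::sourceCpp() or inside a package, this should behave the same as with the full RcppArmadillo.h header, just with less to compile.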

Courtesy of my CRANberries, there is a diffstat report relative to the previous release. More detailed information is on the RcppArmadillo page. Questions, comments etc should go to the rcpp-devel mailing list off the Rcpp R-Forge page. If you like this or other open-source work I do, you can sponsor me at GitHub.

This post by Dirk Eddelbuettel originated on his Thinking inside the box blog. Please report excessive re-aggregation in third-party for-profit settings.

8 February 2024

Dirk Eddelbuettel: RcppArmadillo 0.12.8.0.0 on CRAN: New Upstream, Interface Polish

Armadillo is a powerful and expressive C++ template library for linear algebra and scientific computing. It aims towards a good balance between speed and ease of use, has a syntax deliberately close to Matlab, and is useful for algorithm development directly in C++, or quick conversion of research code into production environments. RcppArmadillo integrates this library with the R environment and language and is widely used by (currently) 1119 other packages on CRAN, downloaded 32.5 million times (per the partial logs from the cloud mirrors of CRAN), and the CSDA paper (preprint / vignette) by Conrad and myself has been cited 575 times according to Google Scholar. This release brings a new (stable) upstream (minor) release, Armadillo 12.8.0, prepared by Conrad two days ago. We, as usual, prepared a release candidate which we tested against the over 1100 CRAN packages using RcppArmadillo. This found no issues, which was confirmed by CRAN once we uploaded, and so it arrived as a new release today in a fully automated fashion. We also made a small change that had been prepared in GitHub issue #400: a few internal header files that were cluttering the top-level of the include directory have been moved to internal directories. The standard header is of course unaffected, and the set of full / light / lighter / lightest headers (matching what we did a while back in Rcpp) also continue to work as one expects. This change was also tested in a full reverse-dependency check in January but had not been released to CRAN yet. The set of changes since the last CRAN release follows.

Changes in RcppArmadillo version 0.12.8.0.0 (2024-02-06)
  • Upgraded to Armadillo release 12.8.0 (Cortisol Injector)
    • Faster detection of symmetric expressions by pinv() and rank()
    • Expanded shift() to handle sparse matrices
    • Expanded conv_to for more flexible conversions between sparse and dense matrices
    • Added cbrt()
    • More compact representation of integers when saving matrices in CSV format
  • Five non-user facing top-level include files have been removed (#432 closing #400 and building on #395 and #396)
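As a small, hedged illustration of the sparse/dense conversion mentioned in the list above, the sketch below converts a dense matrix to a sparse one via conv_to. Note that conv_to itself predates this release (12.8.0 only expanded its flexibility), and the function name is purely illustrative:

// Sketch only: dense-to-sparse conversion with Armadillo's conv_to.
// [[Rcpp::depends(RcppArmadillo)]]
#include <RcppArmadillo.h>

// [[Rcpp::export]]
arma::sp_mat dense_to_sparse(const arma::mat& D) {
    // conv_to converts between Armadillo matrix types; 12.8.0 expanded it
    // for more flexible conversions between sparse and dense matrices
    return arma::conv_to<arma::sp_mat>::from(D);
}

On the R side, the returned sparse matrix should arrive as a dgCMatrix object from the Matrix package.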

Courtesy of my CRANberries, there is a diffstat report relative to the previous release. More detailed information is on the RcppArmadillo page. Questions, comments etc should go to the rcpp-devel mailing list off the Rcpp R-Forge page. If you like this or other open-source work I do, you can sponsor me at GitHub.

This post by Dirk Eddelbuettel originated on his Thinking inside the box blog. Please report excessive re-aggregation in third-party for-profit settings.

16 January 2024

Jonathan Dowland: Two reissued Coil LPs

Happy 2024! DAIS have continued their programme of posthumous Coil remasters and re-issues. Constant Shallowness Leads To Evil was remastered by Josh Bonati in 2021 and re-released in 2022 in a dizzying array of different packaging variants. The original releases in 2000 had barely any artwork, and given that void I think Nathaniel Young has done a great job of creating something compelling.
Constant Shallowness Leads to Evil and Queens of the Circulating Library
A limited number of the original re-issue have special lenticular covers, although these were not sold by any distributors outside the US. I tried to find a copy on my trip to Portland in 2022, to no avail. Last year DAIS followed Constant with Queens Of The Circulating Library, same deal: limited lenticular covers, US only. Both are also available digital-only, e.g. on Bandcamp: Constant, Queens. The original, pre-remastered releases have been freely available on archive.org for a long time: Constant, Queens. Both of these releases feel to me like they were made available by the group somewhat as an afterthought, having been produced primarily as part of their live efforts. (I'm speculating freely here; it might not be true.) Live takes of some of this material exist in the form of Coil Presents Time Machines, which has not (yet) been reissued. In my opinion this is a really compelling recording. I vividly remember listening to this whilst trying to get an hour's rest in a hotel somewhere on a work trip. It took me to some strange places! I'll leave you with one of my favourite moments from "Colour Sound Oblivion", Coil's video collection of live backdrops. When this was performed live it was also called "Constant Shallowness Leads To Evil", although it's distinct from the material on the LP: also available on archive.org. A version of this Constant made it onto a Russian live bootleg, which is available on Spotify and Bandcamp, complete with some John Balance banter: "we only do this on religious holidays".

15 January 2024

Colin Watson: OpenUK New Year s Honours

Apparently I got an honour from OpenUK. There are a bunch of people I know on that list. Chris Lamb and Mark Brown are familiar names from Debian. Colin King and Jonathan Riddell are people I know from past work in Ubuntu. I've admired David MacIver's work on Hypothesis and Richard Hughes's work on firmware updates from afar. And there are a bunch of other excellent projects represented there: OpenStreetMap, Textualize, and my alma mater of Cambridge to name but a few. My friend Stuart Langridge wrote about being on a similar list a few years ago, and I can't do much better than to echo it: in particular he wrote about the way the open source development community is often at best unwelcoming to people who don't look like Stuart and I do. I can't tell a whole lot about demographic distribution just by looking at a list of names, but while these honours still seem to be skewed somewhat male, I'm fairly sure they're doing a lot better in terms of gender balance than my home project of Debian is, for one. I hope this is a sign of improvement for the future, and I'll do what I can to pay it forward.

10 January 2024

Dirk Eddelbuettel: BH 1.84.0-1 on CRAN: New Upstream

Boost is a very large and comprehensive set of (peer-reviewed) libraries for the C++ programming language, containing well over one hundred individual libraries. The BH package provides a sizeable subset of header-only libraries for (easier, no linking required) use by R. It is fairly widely used: the (partial) CRAN mirror logs (aggregated from the cloud mirrors) show over 35.7 million package downloads. Version 1.84.0 of Boost was released in December following the regular Boost release schedule of April, August and December releases. As the commits and changelog show, we packaged it almost immediately and started testing following our annual update cycle which strives to balance being close enough to upstream and not stressing CRAN and the user base too much. The reverse-depends check revealed five packages requiring changes or adjustments, which is a pretty good outcome given the over three hundred direct reverse dependencies. So we opened issue #100 to coordinate the changes over the winter break during which CRAN also closes (just as we did in previous years). Our sincere thanks to the two packages that already updated before, and to the one that updated today within hours (!!) of the BH upload it needed. There are very few actual changes. We honoured one request (in issue #97) to add Boost QVM, bringing quaternion support to R. No other new changes needed to be made. There are a number of changes I have to make each time in BH, and they are worth mentioning. Because CRAN cares about backwards compatibility and the ability to be used on minimal or older systems, we still adjust the filenames of a few files to fit a jurassic constraint of just over 100 characters per filepath present in some long-outdated versions of tar. Not a big deal. We also, and that is more controversial, silence a number of #pragma diagnostic messages for g++ and clang++ because CRAN insists on it. I have no choice in that matter. One warning we suppressed last year, but no longer do, concerns the C++14 standard that some Boost libraries now default to. Packages setting C++11 explicitly will likely get a note from CRAN asking to change this; in most cases the setting should be trivial to remove as we only had to opt into (then) newer standards under old compilers. These days newer defaults help; R itself now defaults to C++17.
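To illustrate how the header-only Boost libraries in BH are typically consumed from R, here is a minimal sketch using a long-standing Boost header via Rcpp; the particular library (Boost.LexicalCast) and the function name are chosen purely for illustration and have nothing to do with this release:

// Sketch only: pulling a header-only Boost library in via the BH package.
// [[Rcpp::depends(BH)]]
#include <Rcpp.h>
#include <boost/lexical_cast.hpp>
#include <string>

// [[Rcpp::export]]
std::string double_to_string(double x) {
    // lexical_cast is header-only, so LinkingTo: BH is all a package needs
    return boost::lexical_cast<std::string>(x);
}

Because nothing is linked against, a package using such a header only needs LinkingTo: BH in its DESCRIPTION.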

Changes in version 1.84.0-0 (2024-01-09)

Via my CRANberries, there is a diffstat report relative to the previous release. Comments and suggestions about BH are welcome via the issue tracker at the GitHub repo. If you like this or other open-source work I do, you can now sponsor me at GitHub.

This post by Dirk Eddelbuettel originated on his Thinking inside the box blog. Please report excessive re-aggregation in third-party for-profit settings.

29 December 2023

Russ Allbery: Review: The Afterward

Review: The Afterward, by E.K. Johnston
Publisher: Dutton Books
Copyright: February 2019
Printing: 2020
ISBN: 0-7352-3190-7
Format: Kindle
Pages: 339
The Afterward is a standalone young adult high fantasy with a substantial romance component. The title is not misspelled. Sir Erris and her six companions, matching the number of the new gods, were successful in their quest for the godsgem. They defeated the Old God and destroyed Him forever, freeing King Dorrenta from his ensorcellment, and returned in triumph to Cadrium to live happily ever after. Or so the story goes. Sir Erris and three of the companions are knights. Another companion is the best mage in the kingdom. Kalanthe Ironheart, who distracted the Old God at a critical moment and allowed Sir Erris to strike, is only an apprentice due to her age, but surely will become a great knight. And then there is Olsa Rhetsdaughter, the lowborn thief, now somewhat mockingly called Thief of the Realm for all the good that does her. The reward was enough for her to buy her freedom from the Thief's Court. It was not enough to pay for food after that, or enough for her to change her profession, and the Thief's Court no longer has any incentive to give her easy (or survivable) assignments. Kalanthe is in a considerably better position, but she still needs a good marriage. Her reward paid off half of her debt, which broadens her options, but she's still a debt-knight, liable for the full cost of her training once she reaches the age of nineteen. She's mostly made her peace with the decisions she made given her family's modest means, but marriages of that type are usually for heirs, and Kalanthe is not looking forward to bearing a child. Or, for that matter, sleeping with a man. Olsa and Kalanthe fell in love during the Quest. Given Kalanthe's debt and the way it must be paid, and her iron-willed determination to keep vows, neither of them expected their relationship to survive the end of the Quest. Both of them wish that it had. The hook is that this novel picks up after the epic fantasy quest is over and everyone went home. This is not an entirely correct synopsis; chapters of The Afterward alternate between "After" and "Before" (and one chapter delightfully titled "More or less the exact moment of"), and by the end of the book we get much of the story of the Quest. It's not told from the perspective of the lead heroes, though; it's told by following Kalanthe and Olsa, who would be firmly relegated to supporting characters in a typical high fantasy. And it's largely told through the lens of their romance. This is not the best fantasy novel I've read, but I had a fun time with it. I am now curious about the intended audience and marketing, though. It was published by a YA imprint, and both the ages of the main characters and the general theme of late teenagers trying to chart a course in an adult world match that niche. But it's also clearly intended for readers who have read enough epic fantasy quests that they will both be amused by the homage and not care that the story elides a lot of the typical details. Anyone who read David Eddings at an impressionable age will enjoy the way Johnston pokes gentle fun at The Belgariad (this book is dedicated to David and Leigh Eddings), but surely the typical reader of YA fantasy these days isn't also reading Eddings. I'm therefore not quite sure who this book was for, but apparently that group included me. Johnston thankfully is not on board with the less savory parts of Eddings's writing, as you might have guessed from the sapphic romance. 
There is no obnoxious gender essentialism here, although there do appear to be gender roles that I never quite figured out. Knights are referred to as sir, but all of the knights in this story are women. Men still seem to run a lot of things (kingdoms, estates, mage colleges), but apart from the mage, everyone on the Quest was female, and there seems to be an expectation that women go out into the world and have adventures while men stay home. I'm not sure if there was an underlying system that escaped me, or if Johnston just mixed things up for the hell of it. (If the latter, I approve.) This book does suffer a bit from addressing some current-day representation issues without managing to fold them naturally into the story or setting. One of the Quest knights is transgender, something that's revealed in an awkward couple of paragraphs and then never mentioned again. Two of the characters have a painfully earnest conversation about the word "bisexual," complete with a strained attempt at in-universe etymology. Racial diversity (Olsa is black, and Kalanthe is also not white) seemed to be handled a bit better, although I am not the reader to notice if the discussions of hair maintenance were similarly awkward. This is way better than no representation and default-white characters, to be clear, but it felt a bit shoehorned in at times and could have used some more polish. These are quibbles, though. Olsa was the heart of the book for me, and is exactly the sort of character I like to read about. Kalanthe is pure stubborn paladin, but I liked her more and more as the story continued. She provides a good counterbalance to Olsa's natural chaos. I do wish Olsa had more opportunities to show her own competence (she's not a very good thief, she's just the thief that Sir Erris happened to know), but the climax of the story was satisfying. My main grumble is that I badly wanted to dwell on the happily-ever-after for at least another chapter, ideally two. Johnston was done with the story before I was. The writing was serviceable but not great and there are some bits that I don't think would stand up to a strong poke, but the characters carried the story for me. Recommended if you'd like some sapphic romance and lightweight class analysis complicating your Eddings-style quest fantasy. Rating: 7 out of 10

22 December 2023

Russ Allbery: Review: Wintersmith

Review: Wintersmith, by Terry Pratchett
Series: Discworld #35
Publisher: Clarion Books
Copyright: 2006
Printing: 2007
ISBN: 0-06-089033-9
Format: Mass market
Pages: 450
Wintersmith is the 35th Discworld novel and the 3rd Tiffany Aching novel. You could probably start here, since understanding the backstory isn't vital for following the plot, but I'm not sure why you would. Tiffany is now training with Miss Treason, a 113-year-old witch who is quite different in her approach from Miss Level, Tiffany's mentor in A Hat Full of Sky. Miss Level was the unassuming and constantly helpful glue that held the neighborhood together. Miss Treason is the judge; her neighbors are scared of her and proud of being scared of her, since that means they have a proper witch who can see into their heads and sort out their problems. On the surface, they're quite different; part of the story of this book is Tiffany learning to see the similarities. First, though, Miss Treason rushes Tiffany to a strange midnight Morris Dance, without any explanation. The Morris Dance usually celebrates the coming of spring and is at the center of a village party, so Tiffany is quite confused by seeing it danced on a dark and windy night in late autumn. But there is a hole in the dance where the Fool normally is, and Tiffany can't keep herself from joining it. This proves to be a mistake. That space was left for someone very different from Tiffany, and now she's entangled herself in deep magic that she doesn't understand. This is another Pratchett novel where the main storyline didn't do much for me. All the trouble stems from Miss Treason being maddeningly opaque, and while she did warn Tiffany, she did so in that way that guarantees a protagonist of a middle-grade novel will ignore. The Wintersmith is a boring, one-note quasi-villain, and the plot mainly revolves around elemental powers being dumber than a sack of hammers. The one thing I will say about the main plot is that the magic Tiffany danced into is entangled with courtship and romance, Tiffany turns thirteen over the course of this book, and yet this is not weird and uncomfortable reading the way it would be in the hands of many other authors. Pratchett has a keen eye for the age range that he's targeting. The first awareness that there is such a thing as romance that might be relevant to oneself pairs nicely with the Wintersmith's utter confusion at how Tiffany's intrusion unbalanced his dance. This is a very specific age and experience that I think a lot of authors would shy away from, particularly with a female protagonist, and I thought Pratchett handled it adroitly. I personally found the Wintersmith's awkward courting tedious and annoying, but that's more about me than about the book. As with A Hat Full of Sky, though, everything other than the main plot was great. It is becoming obvious how much Tiffany and Granny Weatherwax have in common, and that Granny Weatherwax recognizes this and is training Tiffany herself. This is high-quality coming-of-age material, not in the traditional fantasy sense of chosen ones and map explorations, but in the sense of slowly-developing empathy and understanding of people who think differently than you do. Tiffany, like Granny Weatherwax, has very little patience with nonsense, and her irritation with stupidity is one of her best characteristics. But she's learning how to blunt it long enough to pay attention, and to understand how people she doesn't like can still be the right people for specific situations. I particularly loved how Granny carries on with a feud at the same time that Tiffany is learning to let go of one. 
It's not a contradiction or hypocrisy; it's a sign that Tiffany is entitled to her judgments and feelings, but has to learn how to keep them in their place and not let them take over. One of the great things about the Tiffany Aching books is that the villages are also characters. We don't see that much of the individual people, but one of the things Tiffany is learning is how to see the interpersonal dynamics and patterns of village life. Somehow the feelings of irritation and exasperation fade once you understand people's motives and see more sides to their character. There is a lot more Nanny Ogg in this book than there has been in the last few, and that reminded me of how much I love her character. She has a completely different approach than Granny Weatherwax, but it's just as effective in different ways. She's also the perfect witch to have around when you've stumbled into a stylized love story that you don't want to be a part of, and yet find oddly fascinating. It says something about the skill of Pratchett's characterization that I could enjoy a book this much while having no interest in the main plot. The Witches have always been great characters, but somehow they're even better when seen through Tiffany's perspective. Good stuff; if you liked any of the other Tiffany Aching books, you will like this as well. Followed by Making Money in publication order. The next Tiffany Aching novel is I Shall Wear Midnight. Rating: 8 out of 10

20 December 2023

Melissa Wen: The Rainbow Treasure Map Talk: Advanced color management on Linux with AMD/Steam Deck.

Last week marked a major milestone for me: the AMD driver-specific color management properties reached upstream linux-next! And to celebrate, I'm happy to share the slides and notes from my 2023 XDC talk, The Rainbow Treasure Map, along with the individual recording that just dropped last week on YouTube. Talk about happy coincidences!

Steam Deck Rainbow: Treasure Map & Magic Frogs While I may be bubbly and chatty in everyday life, the stage isn't exactly my comfort zone (hallway talks are more my speed). But the journey of developing the AMD color management properties was so full of discoveries that I simply had to share the experience. Witnessing the fantastic work of Jeremy and Joshua bring it all to life on the Steam Deck OLED was like uncovering magical ingredients and whipping up something truly enchanting. For XDC 2023, we split our Rainbow journey into two talks. My focus, The Rainbow Treasure Map, explored the new color features we added to the Linux kernel driver, diving deep into the hardware capabilities of AMD/Steam Deck. Joshua then followed with The Rainbow Frogs and showed the breathtaking color magic released on Gamescope thanks to the power unlocked by the kernel driver's Steam Deck color properties.

Packing a Rainbow into 15 Minutes I had so much to tell, but a half-slot talk meant crafting a concise presentation. To squeeze everything into 15 minutes (and calm my pre-talk jitters a bit!), I drafted and practiced those slides and notes countless times. So grab your map, and let's embark on the Rainbow journey together! Slide 1: The Rainbow Treasure Map - Advanced Color Management on Linux with AMD/SteamDeck Intro: Hi, I'm Melissa from Igalia and welcome to the Rainbow Treasure Map, a talk about advanced color management on Linux with AMD/SteamDeck. Slide 2: List useful links for this technical talk Useful links: First of all, if you are not used to the topic, you may find these links useful.
  1. XDC 2022 - I'm not an AMD expert, but - Melissa Wen
  2. XDC 2022 - Is HDR Harder? - Harry Wentland
  3. XDC 2022 Lightning - HDR Workshop Summary - Harry Wentland
  4. Color management and HDR documentation for FOSS graphics - Pekka Paalanen et al.
  5. Cinematic Color - 2012 SIGGRAPH course notes - Jeremy Selan
  6. AMD Driver-specific Properties for Color Management on Linux (Part 1) - Melissa Wen
Slide 3: Why do we need advanced color management on Linux? Context: When we talk about colors in the graphics chain, we should keep in mind that we have a wide variety of source content colorimetry, a variety of output display devices and also the internal processing. Users expect consistent color reproduction across all these devices. The userspace can use GPU-accelerated color management to get it. But this also requires an interface with display kernel drivers that is currently missing from the DRM/KMS framework. Slide 4: Describe our work on AMD driver-specific color properties Since April, I've been bothering the DRM community by sending patchsets from the work Joshua and I did to add driver-specific color properties to the AMD display driver. In parallel, discussions on defining a generic color management interface are still ongoing in the community. Moreover, we are still not clear about the diversity of color capabilities among hardware vendors. To bridge this gap, we defined a color pipeline for Gamescope that fits the latest versions of AMD hardware. It delivers advanced color management features for gamut mapping, HDR rendering, SDR on HDR, and HDR on SDR. Slide 5: Describe the AMD/SteamDeck - our hardware AMD/Steam Deck hardware: AMD frequently releases new GPU and APU generations. Each generation comes with a DCN version with display hardware improvements. Therefore, keep in mind that this work uses the AMD Steam Deck hardware and its kernel driver. The Steam Deck is an APU with a DCN3.01 display driver, a DCN3 family. It's important to have this information since newer AMD DCN drivers inherit implementations from previous families, but also each generation of AMD hardware may introduce new color capabilities. Therefore, I recommend that you familiarize yourself with the hardware you are working on. Slide 6: Diagram with the three layers of the AMD display driver on Linux The AMD display driver in the kernel space: It consists of three layers, (1) the DRM/KMS framework, (2) the AMD Display Manager, and (3) the AMD Display Core. We extended the color interface exposed to userspace by leveraging existing DRM resources and connecting them using driver-specific functions for color property management. Slide 7: Three-layers diagram highlighting AMD Display Manager, DM - the layer that connects DC and DRM Bridging DC color capabilities and the DRM API required significant changes in the color management of AMD Display Manager - the Linux-dependent part that connects the AMD DC interface to the DRM/KMS framework. Slide 8: Three-layers diagram highlighting AMD Display Core, DC - the shared code The AMD DC is the OS-agnostic layer. Its code is shared between platforms and DCN versions. Examining this part helps us understand the AMD color pipeline and hardware capabilities, since the machinery for hardware settings and resource management is already there. Slide 9: Diagram of the AMD Display Core Next architecture with main elements and data flow The newest architecture for AMD display hardware is the AMD Display Core Next. Slide 10: Diagram of the AMD Display Core Next where only DPP and MPC blocks are highlighted In this architecture, two blocks have the capability to manage colors:
  • Display Pipe and Plane (DPP) - for pre-blending adjustments;
  • Multiple Pipe/Plane Combined (MPC) - for post-blending color transformations.
Let's see what we have in the DRM API for pre-blending color management. Slide 11: Blank slide with no content, only the title 'Pre-blending: DRM plane' DRM plane color properties: This is the DRM color management API before blending. Nothing! Except two basic DRM plane properties: color_encoding and color_range for the input colorspace conversion, which is not covered by this work. Slide 12: Diagram with color capabilities and structures in AMD DC layer without any DRM plane color interface (before blending), only the DRM CRTC color interface for post blending In case you're not familiar with AMD shared code, what we need to do is basically draw a map and navigate there! We have some DRM color properties after blending, but nothing before blending yet. But much of the hardware programming was already implemented in the AMD DC layer, thanks to the shared code. Slide 13: Previous Diagram with a rectangle to highlight the empty space in the DRM plane interface that will be filled by AMD plane properties Still, both the DRM interface and its connection to the shared code were missing. That's when the search begins! Slide 14: Color Pipeline Diagram with the plane color interface filled by AMD plane properties but without connections to AMD DC resources AMD driver-specific color pipeline: Looking at the color capabilities of the hardware, we arrive at this initial set of properties. The path wasn't exactly like that. We had many iterations and discoveries until we reached this pipeline. Slide 15: Color Pipeline Diagram connecting AMD plane degamma properties, LUT and TF, to AMD DC resources The Plane Degamma is our first driver-specific property before blending. It's used to linearize the color space from encoded values to light linear values. Slide 16: Describe plane degamma properties and hardware capabilities We can use a pre-defined transfer function or a user lookup table (in short, LUT) to linearize the color space. Pre-defined transfer functions for plane degamma are hardcoded curves that go to a specific hardware block called DPP Degamma ROM. It supports the following transfer functions: sRGB EOTF, BT.709 inverse OETF, PQ EOTF, and pure power curves Gamma 2.2, Gamma 2.4 and Gamma 2.6. We also have a one-dimensional LUT. This 1D LUT has four thousand ninety-six (4096) entries, the usual 1D LUT size in the DRM/KMS. It's an array of drm_color_lut that goes to the DPP Gamma Correction block. Slide 17: Color Pipeline Diagram connecting AMD plane CTM property to AMD DC resources We now also have a color transformation matrix (CTM) for color space conversion. Slide 18: Describe plane CTM property and hardware capabilities It's a 3x4 matrix of fixed points that goes to the DPP Gamut Remap Block. Both pre- and post-blending matrices previously went through the same color block. We worked on detaching them to clear both paths. Now each CTM goes its own way. Slide 19: Color Pipeline Diagram connecting AMD plane HDR multiplier property to AMD DC resources Next, the HDR Multiplier. The HDR Multiplier is a factor applied to the color values of an image to increase their overall brightness. Slide 20: Describe plane HDR mult property and hardware capabilities This is useful for converting images from a standard dynamic range (SDR) to a high dynamic range (HDR). As it can range beyond [0.0, 1.0], subsequent transforms need to use the PQ(HDR) transfer functions. Slide 21: Color Pipeline Diagram connecting AMD plane shaper properties, LUT and TF, to AMD DC resources And we need a 3D LUT.
But a 3D LUT has a limited number of entries in each dimension, so we want to use it in a colorspace that is optimized for human vision. That means a non-linear space. To deliver it, userspace may need one 1D LUT before the 3D LUT to delinearize content and another one after it to linearize content again for blending. Slide 22: Describe plane shaper properties and hardware capabilities The pre-3D-LUT curve is called the Shaper curve. Unlike Degamma TF, there are no hardcoded curves for shaper TF, but we can use the AMD color module in the driver to build the following shaper curves from pre-defined coefficients. The color module combines the TF and the user LUT values into the LUT that goes to the DPP Shaper RAM block. Slide 23: Color Pipeline Diagram connecting AMD plane 3D LUT property to AMD DC resources Finally, our rockstar, the 3D LUT. The 3D LUT is perfect for complex color transformations and adjustments between color channels. Slide 24: Describe plane 3D LUT property and hardware capabilities The 3D LUT is also more complex to manage and requires more computational resources; as a consequence, its number of entries is usually limited. To overcome this restriction, the array contains samples of the approximated function, and values between samples are estimated by tetrahedral interpolation. AMD supports 17 and 9 as the size of a single dimension. Blue is the outermost dimension, red the innermost. Slide 25: Color Pipeline Diagram connecting AMD plane blend properties, LUT and TF, to AMD DC resources As mentioned, we need a post-3D-LUT curve to linearize the color space before blending. This is done by Blend TF and LUT. Slide 26: Describe plane blend properties and hardware capabilities Similar to shaper TF, there are no hardcoded curves for Blend TF. The pre-defined curves are the same as in the Degamma block, but calculated by the color module. The resulting LUT goes to the DPP Blend RAM block. Slide 27: Color Pipeline Diagram with all AMD plane color properties connected to AMD DC resources and links showing the conflict between plane and CRTC degamma Now we have everything connected before blending. As a conflict between plane and CRTC Degamma was inevitable, our approach doesn't accept that both are set at the same time. Slide 28: Color Pipeline Diagram connecting AMD CRTC gamma TF property to AMD DC resources We also optimized the conversion of the framebuffer to wire encoding by adding support for pre-defined CRTC Gamma TFs. Slide 29: Describe CRTC gamma TF property and hardware capabilities Again, there are no hardcoded curves, and the TF and LUT are combined by the AMD color module. The same types of shaper curves are supported. The resulting LUT goes to the MPC Gamma RAM block. Slide 30: Color Pipeline Diagram with all AMD driver-specific color properties connected to AMD DC resources Finally, we arrived at the final version of the DRM/AMD driver-specific color management pipeline. With this knowledge, you're ready to better enjoy the rainbow treasure of AMD display hardware and the world of graphics computing. Slide 31: SteamDeck/Gamescope Color Pipeline Diagram with rectangles labeling each block of the pipeline with the related AMD color property With this work, Gamescope/Steam Deck embraces the color capabilities of the AMD GPU. We highlight here how we map the Gamescope color pipeline to each AMD color block. Slide 32: Final slide. Thank you! Future work: The search for the rainbow treasure is not over! The Linux DRM subsystem contains many hidden treasures from different vendors.
We want more complex color transformations and adjustments available on Linux. We also want to expose all GPU color capabilities from all hardware vendors to the Linux userspace. Thanks to Joshua and Harry for this joint work, and to the Linux DRI community for all the feedback and reviews. The amazing part of this work comes in the next talk, with Joshua and The Rainbow Frogs! Any questions?
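As a purely illustrative sketch of how userspace might feed one of the 1D LUTs described above to the kernel, the snippet below builds a 4096-entry identity LUT and attaches it to a plane property through a DRM blob via libdrm. The property lookup is left to the caller because the driver-specific property names are not spelled out here; treat everything in it as an assumption to adapt, not as part of the talk or the patch series:

// Sketch only: a 4096-entry identity LUT handed to a driver-specific plane
// property as a DRM blob. The caller is assumed to have already resolved
// prop_id for the relevant LUT property (e.g. via drmModeObjectGetProperties);
// that property name is driver-specific and deliberately not hardcoded here.
// Build with the libdrm flags, e.g. $(pkg-config --cflags --libs libdrm).
#include <xf86drm.h>
#include <xf86drmMode.h>
#include <drm_mode.h>
#include <cstdint>
#include <vector>

bool set_plane_lut(int fd, uint32_t plane_id, uint32_t prop_id) {
    std::vector<drm_color_lut> lut(4096);
    for (size_t i = 0; i < lut.size(); ++i) {
        // identity ramp; a real compositor would encode its transfer function here
        const uint16_t v = static_cast<uint16_t>((i * 0xFFFFu) / (lut.size() - 1));
        lut[i].red = lut[i].green = lut[i].blue = v;
    }
    uint32_t blob_id = 0;
    if (drmModeCreatePropertyBlob(fd, lut.data(),
                                  lut.size() * sizeof(struct drm_color_lut),
                                  &blob_id) != 0)
        return false;
    const int ret = drmModeObjectSetProperty(fd, plane_id, DRM_MODE_OBJECT_PLANE,
                                             prop_id, blob_id);
    drmModeDestroyPropertyBlob(fd, blob_id);
    return ret == 0;
}

A real compositor such as Gamescope would more likely stage this through the atomic API (drmModeAtomicAddProperty and a single commit), but the blob-creation step is the same.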
References:
  1. Slides of the talk The Rainbow Treasure Map.
  2. Youtube video of the talk The Rainbow Treasure Map.
  3. Patch series for AMD driver-specific color management properties (upstream Linux 6.8v).
  4. SteamDeck/Gamescope color management pipeline
  5. XDC 2023 website.
  6. Igalia website.

13 December 2023

Melissa Wen: 15 Tips for Debugging Issues in the AMD Display Kernel Driver

A self-help guide for examining and debugging the AMD display driver within the Linux kernel/DRM subsystem. It's based on my experience as an external developer working on the driver, and is shared with the goal of helping others navigate the driver code. Acknowledgments: These tips were gathered thanks to the countless help received from AMD developers during the driver development process. The list below was obtained by examining open source code, reviewing public documentation, playing with tools, asking in public forums and also with the help of my former GSoC mentor, Rodrigo Siqueira.

Pre-Debugging Steps: Before diving into an issue, it's crucial to perform two essential steps: 1) Check the latest changes: Ensure you're working with the latest AMD driver modifications located in the amd-staging-drm-next branch maintained by Alex Deucher. You may also find bug fixes for newer kernel versions on branches that have the name pattern drm-fixes-<date>. 2) Examine the issue tracker: Confirm that your issue isn't already documented and addressed in the AMD display driver issue tracker. If you find a similar issue, you can team up with others and speed up the debugging process.

Understanding the issue: Do you really need to change this? Where should you start looking for changes? 3) Is the issue in the AMD kernel driver or in the userspace?: Identifying the source of the issue is essential regardless of the GPU vendor. Sometimes this can be challenging, so here are some helpful tips:
  • Record the screen: Capture the screen using a recording app while experiencing the issue. If the bug appears in the capture, it's likely a userspace issue, not the kernel display driver.
  • Analyze the dmesg log: Look for error messages related to the display driver in the dmesg log. If the error message appears before the message [drm] Display Core v..., it's not likely a display driver issue. If this message doesn't appear in your log, the display driver wasn't fully loaded and you will see a notification that something went wrong here.
4) AMD Display Manager vs. AMD Display Core: The AMD display driver consists of two components:
  • Display Manager (DM): This component interacts directly with the Linux DRM infrastructure. Occasionally, issues can arise from misinterpretations of DRM properties or features. If the issue doesn't occur on other platforms with the same AMD hardware - for example, only happens on Linux but not on Windows - it's more likely related to the AMD DM code.
  • Display Core (DC): This is the platform-agnostic part responsible for setting and programming hardware features. Modifications to the DC usually require validation on other platforms, like Windows, to avoid regressions.
5) Identify the DC HW family: Each AMD GPU has variations in its hardware architecture. Features and helpers differ between families, so determining the relevant code for your specific hardware is crucial.
  • Find GPU product information in Linux/AMD GPU documentation
  • Check the dmesg log for the Display Core version (since this commit in Linux kernel 6.3v). For example:
    • [drm] Display Core v3.2.241 initialized on DCN 2.1
    • [drm] Display Core v3.2.237 initialized on DCN 3.0.1

Investigating the relevant driver code: Keep unrelated driver code from affecting your investigation. 6) Narrow the code inspection down to one DC HW family: the relevant code resides in a directory named after the DC number. For example, the DCN 3.0.1 driver code is located at drivers/gpu/drm/amd/display/dc/dcn301. We all know that AMD's shared code is huge, and you can use these boundaries to rule out code unrelated to your issue. 7) Newer families may inherit code from older ones: you can find dcn301 using code from dcn30, dcn20, dcn10 files. It's crucial to verify which hooks and helpers your driver utilizes to investigate the right portion. You can leverage ftrace for supplemental validation. To give an example, it was useful when I was updating DCN3 color mapping to correctly use their new post-blending color capabilities, such as: Additionally, you can use two different HW families to compare behaviours. If you see the issue in one but not in the other, you can compare the code and understand what has changed and whether the implementation from a previous family doesn't fit the new HW resources or design well. You can also count on the help of the community on the Linux AMD issue tracker to validate your code on other hardware and/or systems. This approach helped me debug a 2-year-old issue where the cursor gamma adjustment was incorrect in DCN3 hardware, but worked correctly for the DCN2 family. I solved the issue in two steps, thanks to community feedback and validation: 8) Check the hardware capability screening in the driver: You can currently find a list of display hardware capabilities in the drivers/gpu/drm/amd/display/dc/dcn*/dcn*_resource.c file. More precisely, in the dcn*_resource_construct() function. Using DCN301 for illustration, here is the list of its hardware caps:
	/*************************************************
	 *  Resource + asic cap hardcoding               *
	 *************************************************/
	pool->base.underlay_pipe_index = NO_UNDERLAY_PIPE;
	pool->base.pipe_count = pool->base.res_cap->num_timing_generator;
	pool->base.mpcc_count = pool->base.res_cap->num_timing_generator;
	dc->caps.max_downscale_ratio = 600;
	dc->caps.i2c_speed_in_khz = 100;
	dc->caps.i2c_speed_in_khz_hdcp = 5; /*1.4 w/a enabled by default*/
	dc->caps.max_cursor_size = 256;
	dc->caps.min_horizontal_blanking_period = 80;
	dc->caps.dmdata_alloc_size = 2048;
	dc->caps.max_slave_planes = 2;
	dc->caps.max_slave_yuv_planes = 2;
	dc->caps.max_slave_rgb_planes = 2;
	dc->caps.is_apu = true;
	dc->caps.post_blend_color_processing = true;
	dc->caps.force_dp_tps4_for_cp2520 = true;
	dc->caps.extended_aux_timeout_support = true;
	dc->caps.dmcub_support = true;
	/* Color pipeline capabilities */
	dc->caps.color.dpp.dcn_arch = 1;
	dc->caps.color.dpp.input_lut_shared = 0;
	dc->caps.color.dpp.icsc = 1;
	dc->caps.color.dpp.dgam_ram = 0; // must use gamma_corr
	dc->caps.color.dpp.dgam_rom_caps.srgb = 1;
	dc->caps.color.dpp.dgam_rom_caps.bt2020 = 1;
	dc->caps.color.dpp.dgam_rom_caps.gamma2_2 = 1;
	dc->caps.color.dpp.dgam_rom_caps.pq = 1;
	dc->caps.color.dpp.dgam_rom_caps.hlg = 1;
	dc->caps.color.dpp.post_csc = 1;
	dc->caps.color.dpp.gamma_corr = 1;
	dc->caps.color.dpp.dgam_rom_for_yuv = 0;
	dc->caps.color.dpp.hw_3d_lut = 1;
	dc->caps.color.dpp.ogam_ram = 1;
	// no OGAM ROM on DCN301
	dc->caps.color.dpp.ogam_rom_caps.srgb = 0;
	dc->caps.color.dpp.ogam_rom_caps.bt2020 = 0;
	dc->caps.color.dpp.ogam_rom_caps.gamma2_2 = 0;
	dc->caps.color.dpp.ogam_rom_caps.pq = 0;
	dc->caps.color.dpp.ogam_rom_caps.hlg = 0;
	dc->caps.color.dpp.ocsc = 0;
	dc->caps.color.mpc.gamut_remap = 1;
	dc->caps.color.mpc.num_3dluts = pool->base.res_cap->num_mpc_3dlut; //2
	dc->caps.color.mpc.ogam_ram = 1;
	dc->caps.color.mpc.ogam_rom_caps.srgb = 0;
	dc->caps.color.mpc.ogam_rom_caps.bt2020 = 0;
	dc->caps.color.mpc.ogam_rom_caps.gamma2_2 = 0;
	dc->caps.color.mpc.ogam_rom_caps.pq = 0;
	dc->caps.color.mpc.ogam_rom_caps.hlg = 0;
	dc->caps.color.mpc.ocsc = 1;
	dc->caps.dp_hdmi21_pcon_support = true;
	/* read VBIOS LTTPR caps */
	if (ctx->dc_bios->funcs->get_lttpr_caps) {
		enum bp_result bp_query_result;
		uint8_t is_vbios_lttpr_enable = 0;
		bp_query_result = ctx->dc_bios->funcs->get_lttpr_caps(ctx->dc_bios, &is_vbios_lttpr_enable);
		dc->caps.vbios_lttpr_enable = (bp_query_result == BP_RESULT_OK) && !!is_vbios_lttpr_enable;
	}
	if (ctx->dc_bios->funcs->get_lttpr_interop) {
		enum bp_result bp_query_result;
		uint8_t is_vbios_interop_enabled = 0;
		bp_query_result = ctx->dc_bios->funcs->get_lttpr_interop(ctx->dc_bios, &is_vbios_interop_enabled);
		dc->caps.vbios_lttpr_aware = (bp_query_result == BP_RESULT_OK) && !!is_vbios_interop_enabled;
	}
Keep in mind that the documentation of color capabilities is available in the Linux kernel Documentation.

Understanding the development history: What has brought us to the current state? 9) Pinpoint relevant commits: Use git log and git blame to identify commits targeting the code section you're interested in. 10) Track regressions: If you're examining the amd-staging-drm-next branch, check for regressions between DC release versions. These are defined by DC_VER in the drivers/gpu/drm/amd/display/dc/dc.h file. Alternatively, find a commit with this format drm/amd/display: 3.2.221 that determines a display release. It's useful for bisecting. This information helps you understand how outdated your branch is and identify potential regressions. You can assume each DC_VER takes around one week to be bumped. Finally, check the testing log of each release in the report provided on the amd-gfx mailing list, such as this one (Tested-by: Daniel Wheeler):

Reducing the inspection area: Focus on what really matters. 11) Identify involved HW blocks: This helps isolate the issue. You can find more information about DCN HW blocks in the DCN Overview documentation. In summary:
  • Plane issues are closer to HUBP and DPP.
  • Blending/Stream issues are closer to MPC, OPP and OPTC. They are related to DRM CRTC subjects.
This information was useful when debugging a hardware rotation issue where the cursor plane got clipped off in the middle of the screen. Finally, the issue was addressed by two patches: 12) Issues around bandwidth (glitches) and clocks: May be affected by calculations done in these HW blocks and HW specific values. The recalculation equations are found in the DML folder. DML stands for Display Mode Library. It s in charge of all required configuration parameters supported by the hardware for multiple scenarios. See more in the AMD DC Overview kernel docs. It s a math library that optimally configures hardware to find the best balance between power efficiency and performance in a given scenario. Finding some clk variables that affect device behavior may be a sign of it. It s hard for a external developer to debug this part, since it involves information from HW specs and firmware programming that we don t have access. The best option is to provide all relevant debugging information you have and ask AMD developers to check the values from your suspicions.
  • Do a trick: If you suspect the power setup is degrading performance, try setting the amount of power supplied to the GPU to the maximum and see if it affects the system behavior with this command: sudo bash -c "echo high > /sys/class/drm/card0/device/power_dpm_force_performance_level"
I learned it when debugging glitches with hardware cursor rotation on Steam Deck. My first attempt was changing the clock calculation. In the end, Rodrigo Siqueira proposed the right solution targeting bandwidth in two steps:

Checking implicit programming and hardware limitations: Bring implicit programming to the level of consciousness and recognize hardware limitations. 13) Implicit update types: Check if the selected type for atomic update may affect your issue. The update type depends on the mode settings, since programming some modes demands more time for hardware processing. More details in the source code:
/* Surface update type is used by dc_update_surfaces_and_stream
 * The update type is determined at the very beginning of the function based
 * on parameters passed in and decides how much programming (or updating) is
 * going to be done during the call.
 *
 * UPDATE_TYPE_FAST is used for really fast updates that do not require much
 * logical calculations or hardware register programming. This update MUST be
 * ISR safe on windows. Currently fast update will only be used to flip surface
 * address.
 *
 * UPDATE_TYPE_MED is used for slower updates which require significant hw
 * re-programming however do not affect bandwidth consumption or clock
 * requirements. At present, this is the level at which front end updates
 * that do not require us to run bw_calcs happen. These are in/out transfer func
 * updates, viewport offset changes, recout size changes and pixel depth changes.
 * This update can be done at ISR, but we want to minimize how often this happens.
 *
 * UPDATE_TYPE_FULL is slow. Really slow. This requires us to recalculate our
 * bandwidth and clocks, possibly rearrange some pipes and reprogram anything front
 * end related. Any time viewport dimensions, recout dimensions, scaling ratios or
 * gamma need to be adjusted or pipe needs to be turned on (or disconnected) we do
 * a full update. This cannot be done at ISR level and should be a rare event.
 * Unless someone is stress testing mpo enter/exit, playing with colour or adjusting
 * underscan we don't expect to see this call at all.
 */
enum surface_update_type {
	UPDATE_TYPE_FAST, /* super fast, safe to execute in isr */
	UPDATE_TYPE_MED,  /* ISR safe, most of programming needed, no bw/clk change*/
	UPDATE_TYPE_FULL, /* may need to shuffle resources */
};

Using tools: Observe the current state, validate your findings, continue improvements. 14) Use AMD tools to check hardware state and driver programming: these help you understand your driver settings and check the behavior when changing those settings.
  • DC Visual confirmation: Check multiple planes and pipe split policy.
  • DTN logs: Check display hardware state, including rotation, size, format, underflow, blocks in use, color block values, etc.
  • UMR: Check ASIC info, register values, KMS state - links and elements (framebuffers, planes, CRTCs, connectors). Source: UMR project documentation
15) Use generic DRM/KMS tools:
  • IGT test tools: Use generic KMS tests or develop your own to isolate the issue in the kernel space. Compare results across different GPU vendors to understand their implementations and find potential solutions. Here AMD also has specific IGT tests for its GPUs that are expected to work without failures on any AMD GPU. You can check the results of HW-specific tests using different display hardware families, or you can compare expected differences between the generic workflow and the AMD workflow.
  • drm_info: This tool summarizes the current state of a display driver (capabilities, properties and formats) per element of the DRM/KMS workflow. Output can be helpful when reporting bugs.
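Alongside drm_info, it can help to see how little code the query side actually needs; the sketch below (not from the article, with /dev/dri/card0 assumed as the device node) lists every property each plane exposes using plain libdrm calls:

// Sketch only: enumerate planes and print the property names a driver exposes,
// similar in spirit to drm_info. Build with the libdrm include/lib flags,
// e.g. g++ list_props.cpp $(pkg-config --cflags --libs libdrm).
#include <xf86drm.h>
#include <xf86drmMode.h>
#include <fcntl.h>
#include <unistd.h>
#include <cstdint>
#include <cstdio>

int main() {
    int fd = open("/dev/dri/card0", O_RDWR | O_CLOEXEC);  // device node assumed
    if (fd < 0) return 1;
    drmSetClientCap(fd, DRM_CLIENT_CAP_UNIVERSAL_PLANES, 1);
    drmModePlaneResPtr planes = drmModeGetPlaneResources(fd);
    for (uint32_t i = 0; planes && i < planes->count_planes; ++i) {
        uint32_t plane_id = planes->planes[i];
        printf("plane %u:\n", plane_id);
        drmModeObjectPropertiesPtr props =
            drmModeObjectGetProperties(fd, plane_id, DRM_MODE_OBJECT_PLANE);
        for (uint32_t j = 0; props && j < props->count_props; ++j) {
            drmModePropertyPtr p = drmModeGetProperty(fd, props->props[j]);
            if (p) {
                printf("  %s\n", p->name);
                drmModeFreeProperty(p);
            }
        }
        drmModeFreeObjectProperties(props);
    }
    drmModeFreePlaneResources(planes);
    close(fd);
    return 0;
}

Comparing this output across kernels or hardware families is a quick way to confirm whether a property you expect is actually being exposed.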

Don't give up! Debugging issues in the AMD display driver can be challenging, but by following these tips and leveraging available resources, you can significantly improve your chances of success. Worth mentioning: This blog post builds upon my talk "I'm not an AMD expert, but", presented at the 2022 XDC. It shares guidelines that helped me debug AMD display issues as an external developer of the driver. Open Source Display Driver: The Linux kernel/AMD display driver is open source, allowing you to actively contribute by addressing issues listed in the official tracker. Tackling existing issues or resolving your own can be a rewarding learning experience and a valuable contribution to the community. Additionally, the tracker serves as a valuable resource for finding similar bugs, troubleshooting tips, and suggestions from AMD developers. Finally, it's a platform for seeking help when needed. Remember, contributing to the open source community through issue resolution and collaboration is mutually beneficial for everyone involved.

5 December 2023

Dirk Eddelbuettel: RcppArmadillo 0.12.6.6.1 on CRAN: No More Deprecation

Armadillo is a powerful and expressive C++ template library for linear algebra and scientific computing. It aims towards a good balance between speed and ease of use, has a syntax deliberately close to Matlab, and is useful for algorithm development directly in C++, or quick conversion of research code into production environments. RcppArmadillo integrates this library with the R environment and language and is widely used by (currently) 1126 other packages on CRAN, downloaded 31.7 million times (per the partial logs from the cloud mirrors of CRAN), and the CSDA paper (preprint / vignette) by Conrad and myself has been cited 569 times according to Google Scholar. This release ends the practice of asking Armadillo to suppress deprecation warnings. RcppArmadillo, as noted, has a large user base. Conrad sometimes made changes without too much of a heads-up, so at times it was opportune to not bring those warnings to dozens (or maybe hundreds) of packages at CRAN. Yet we need to balance this with the demonstrable need to call out older deprecated code use. So sixteen months ago, with GitHub issue #391, we started to alert the authors of the 30+ affected packages and supplied either pull requests or emailed patches to all. Eleven months ago, GitHub issue #402 was added for a second deprecation. And the time of making the switch has come. Release 0.12.6.6.1 no longer defines ARMA_IGNORE_DEPRECATED_MARKER. So among the over 1100 packages using RcppArmadillo at CRAN, a good dozen or so were flagged in the upload, but CRAN concurred and let the package migrate to CRAN. If you maintain an affected package, consider applying the patch or pull request now. A simple stop-gap measure also exists by adding -DARMA_IGNORE_DEPRECATED_MARKER to src/Makevars as either PKG_CPPFLAGS or PKG_CXXFLAGS to reactivate it. But a proper code update, which is generally simple, may be better. If you are unsure, do not hesitate to get in touch. The set of changes since the last CRAN release follows.

Changes in RcppArmadillo version 0.12.6.6.1 (2023-12-03)
  • Following the extended transition in #391 and #402, this release no longer sets ARMA_IGNORE_DEPRECATED_MARKER. Maintainers of affected packages have received pull requests or patches and can set -DARMA_IGNORE_DEPRECATED_MARKER as PKG_CPPFLAGS.

Courtesy of my CRANberries, there is a diffstat report relative to previous release. More detailed information is on the RcppArmadillo page. Questions, comments etc should go to the rcpp-devel mailing list off the Rcpp R-Forge page. If you like this or other open-source work I do, you can sponsor me at GitHub.

This post by Dirk Eddelbuettel originated on his Thinking inside the box blog. Please report excessive re-aggregation in third-party for-profit settings.

17 November 2023

Jonathan Dowland: HLedger, regex matches and field assignments

I've finally landed a patch/feature for HLedger I've been working on-and-off (mostly off) since around March. HLedger has a powerful CSV importer which you configure with a set of rules. Rules consist of conditional matchers (does field X in this CSV row match this regular expression?) and field assignments (set the resulting transaction's account to Y). motivating problem 1 Here's an example of one of my rules for handling credit card repayments. This rule is applied when I import a CSV for my current account, which pays the credit card:
if AMERICAN EXPRESS
    account2 liabilities:amex
This results in a ledger entry like the following
2023-10-31 AMERICAN EXPRESS
    assets:current           - 6.66
    liabilities:amex           6.66
My current account statements cover calendar months. My credit card period spans mid-month to mid-month. I pay it off by direct debit, which comes out after the credit card period, towards the very end of the calendar month. That transaction falls roughly halfway through the next credit card period. On my credit card statements, that repayment is "warped" to the start of the list of transactions, clearing the outstanding balance from the previous period. When I import my credit card data to HLedger, I want to compare the result against a PDF statement to make sure my ledger matches reality. The repayment "warping" makes this awkward, because it means the balance for roughly half the new transactions (those that fall before the real date of the repayment) doesn't match up. motivating problem 2 I start new ledger files each year. I need to import the closing balances from the previous year to the next, which I do by exporting the final balance from the previous year as CSV and importing that into the new ledgers in the usual way. Between 2022 and 2023 I changed the scheme I use for account names, so I need to translate between the old and the new in the opening balances. I couldn't think of a way of achieving this in the import rules (besides writing a bespoke rule for every possible old account name), so I abused another HLedger feature instead: HLedger aliases. For example, I added this alias in my family ledger file for 2023
alias /^family:(.*)/ = \1
These are ugly and I'd prefer to get rid of them. regex match groups A common feature of regular expressions is defining match groups which can be referenced elsewhere, such as on the far side of a substitution. I added match group support to HLedger's field assignments. addressing date warping Here's an updated version of the rule from the first motivating problem:
if AMERICAN EXPRESS
& %date (..)/(..)/(....)
    account2 liabilities:amex
    comment2 date:\3-\2-16
We now match on an extra date field, and surround the day/month/year components with parentheses to define match groups. We add a second field assignment too, setting the second posting's "comment" field to a string which, once the match groups are interpolated, instructs HLedger to do date warping (I wrote about this in date warping in HLedger). The new transaction looks like this:
2023-10-31 AMERICAN EXPRESS
    assets:current           - 6.66
    liabilities:amex           6.66 ; date:2023-10-16
getting rid of aliases In the second problem, I can strip off the unwanted account name prefixes at CSV import time, with rules like this
if %account2 ^family:(.*)$
    account2 \1
when? This stuff landed a week ago in early November, and is not yet in an HLedger release.

1 November 2023

Dirk Eddelbuettel: RcppArmadillo 0.12.6.6.0 on CRAN: Bugfix, Thread Throttling

armadillo image Armadillo is a powerful and expressive C++ template library for linear algebra and scientific computing. It aims towards a good balance between speed and ease of use, has a syntax deliberately close to Matlab, and is useful for algorithm development directly in C++, or quick conversion of research code into production environments. RcppArmadillo integrates this library with the R environment and language and is widely used by (currently) 1110 other packages on CRAN, downloaded 31.2 million times (per the partial logs from the cloud mirrors of CRAN), and the CSDA paper (preprint / vignette) by Conrad and myself has been cited 563 times according to Google Scholar. This release brings upstream bugfix releases 12.6.5 (sparse matrix corner case) and 12.6.6 with an ARPACK correction. Conrad released it this morning; I had been running reverse dependency checks anyway and knew we were in good shape, so for once I did not await a full run against the now over 1100 (!!) packages using RcppArmadillo. This release also contains a change I prepared on Sunday which helps with the much-criticized (and rightly so, I may add) insistence by CRAN on throttling. The motivation is understandable: CRAN tests many packages at once on beefy servers and can ill afford tests going off and requesting numerous cores. But rather than providing a global setting at their end, CRAN insists that each package (!!) deals with this. The recent traffic on the helpful-as-ever r-pkg-devel mailing list clearly shows that this confuses quite a few package developers. Some have admitted to simply turning examples and tests off: a net loss for all of us. Now, Armadillo defaults to using up to eight cores (which is enough to upset CRAN) when running with OpenMP (which is generally only on Linux, for reasons I would rather not get into). With this release I expose helper functions (from OpenMP) to limit this. I also set up an example package and repo RcppArmadilloOpenMPEx detailing this, and added a demonstration of how to use the new throttlers to the fastLm example. I hope this proves useful to users of the package. The set of changes since the last CRAN release follows.

Changes in RcppArmadillo version 0.12.6.6.0 (2023-10-31)
  • Upgraded to Armadillo release 12.6.6 (Cortisol Retox)
    • Fix eigs_sym(), eigs_gen() and svds() to generate deterministic results in ARPACK mode
  • Add helper functions to set and get the number of OpenMP threads
  • Store initial thread count at package load and use in thread-throttling helper (and resetter) suitable for CRAN constraints

Changes in RcppArmadillo version 0.12.6.5.0 (2023-10-14)
  • Upgraded to Armadillo release 12.6.5 (Cortisol Retox)
    • Fix for corner-case bug in handling sparse matrices with no non-zero elements

Courtesy of my CRANberries, there is a diffstat report relative to previous release. More detailed information is on the RcppArmadillo page. Questions, comments etc should go to the rcpp-devel mailing list off the Rcpp R-Forge page. If you like this or other open-source work I do, you can sponsor me at GitHub.

This post by Dirk Eddelbuettel originated on his Thinking inside the box blog. Please report excessive re-aggregation in third-party for-profit settings.

29 October 2023

Aigars Mahinovs: Figuring out finances part 4

At the end of the last part of this, we got a Home Assistant OS installation that contains a Firefly III instance, which in turn contains all the current financial information. Now I will try to connect the two. While it could be nice to create a fully-featured integration for Firefly III to Home Assistant to communicate all interesting values and events, I have an interest in programming more advanced data point calculations for my budget needs, so a less generic but more flexible approach is better for me. So I was quite interested when, among the addons in the Home Assistant Addon Store, I saw AppDaemon - a way to simply integrate arbitrary Python processing with Home Assistant. Let's see if that can do what I want. For a start, after reading the tutorial, I wanted to create a simple script that would use the Firefly III REST API to read the current balance of my main account and then send that to Home Assistant as a sensor value, which can then be displayed on a dashboard. As a quick try I modified the provided hello_world.py that is included in the default AppDaemon installation:
import requests
from datetime import datetime
import appdaemon.plugins.hass.hassapi as hass
app_token = "<FIREFLY_PERSONAL_ACCESS_TOKEN>"
firefly_url = "<FIREFLY_URL>"
class HelloWorld(hass.Hass):
    def initialize(self):
        self.run_every(self.set_asset, "now", 60 * 60)
    def set_asset(self, kwargs):
        ent = self.get_entity("sensor.firefly3_asset_sparkasse_main")
        if not ent.exists():
            ent.add(
                state=0.0,
                attributes={
                    "native_value": 0.0,
                    "native_unit_of_measurement": "EUR",
                    "state_class": "measurement",
                    "device_class": "monetary",
                    "current_balance_date": datetime.now(),
                })
        r = requests.get(
            firefly_url + "/api/v1/accounts?type=asset",
            headers={
                "Authorization": "Bearer " + app_token,
                "Accept": "application/vnd.api+json",
                "Content-Type": "application/json",
            })
        data = r.json()
        for account in data["data"]:
            if not "attributes" in account or "name" not in account["attributes"]:
                continue
            if account["attributes"]["name"] != "Sparkasse giro":
                continue
            self.log("Account :" + str(account["attributes"]))
            ent.set_state(
                state=account["attributes"]["current_balance"],
                attributes={
                    "native_value": account["attributes"]["current_balance"],
                    "current_balance_date": datetime.fromisoformat(account["attributes"]["current_balance_date"]),
                })
            self.log("Entity updated")
It uses a URL and a personal access token to access the Firefly III API, gets the asset accounts information, extracts the current balance and balance date of my main account, and then creates and/or updates a "sensor" value in Home Assistant. The sensor's metadata marks it as a monetary value and as a measurement, which makes Home Assistant track it in the database as a graphable, changing value. I modified the file using the File Editor addon to edit the /config/appdaemon/apps/hello.py file. Each time the file is saved it is reloaded, and logs can be seen in the AppDaemon Logs section - main_log for logging messages or error_log if there is a crash. It is useful to know that the requests library is included, but it is hard to see in the docs what else is included or whether there is an easy way to install extra Python packages. This is already a very nice basis for custom value insertion into Home Assistant - whatever you can extract or calculate with a Python script, you can also inject into Home Assistant. Even with this simple approach you can monitor balances, budgets, piggy banks, bill payment status and even sums of transactions in particular categories in a particular time window. Especially interesting data can be found in the insight section of the Firefly III API. The script above uses a trigger like self.run_every(self.set_asset, "now", 60 * 60) to simply run once per hour. The data in Firefly will not be updated too often anyway, at least not until we figure out how to make the bank connection run automatically without user interaction and without screwing up already existing transactions along the way. In theory the webhook API of Firefly III could be used to trigger the data update instantly when any transaction is created or updated. Possibly even using the Home Assistant webhook integration. Hmmm. Maybe. Who am I kidding? I am going to make that work, for sure! :D But first - how about figuring out the future? So what do I want to do? In short, I want to predict what the balance on my main account will be just before next month's salary comes in. To do this I would:
  • take the current balance of the main account
  • if this month's salary has not been paid out yet, then add it to the balance
  • deduct all still unpaid bills that are due between now and the target date
  • if the credit card account has not yet been reset to the main account, deduct the current amount on the cards
  • if the credit card account has been reset, but the amount has not yet been deducted from the main account, deduct the reset amount
To do that I need to use the Firefly API to read: current account info, the status of all bills including next due date and amount, transfer transactions between the credit cards and the main account, and something that would store the expected salary date and amount. Ideally I'd use a recurring transaction or an income bill for this, but Firefly is not really cooperating with that. The easiest would be just to hardcode that in the script itself. And this is what I have come up with so far. To make the development process easier, I separated out the params for the API key and salary info from the app params for the month to predict for, and I predict both this and next month's balances at the same time. I edited the script locally with Neovim and also ran it locally with a few mocks, uploading to Home Assistant via the SSH addon when the local executions looked good. So what's next? Well, I need to somehow automate the sync with the bank (if at all possible). And for sure take a regular database and config backup :D
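To give a flavour of the logic, here is a rough, illustrative sketch rather than the actual script described above: the credit card account name, the salary parameters and the bills helper are placeholders, and only the /api/v1/accounts call mirrors the sensor script shown earlier.
import datetime as dt
import requests
firefly_url = "<FIREFLY_URL>"
app_token = "<FIREFLY_PERSONAL_ACCESS_TOKEN>"
headers = {
    "Authorization": "Bearer " + app_token,
    "Accept": "application/vnd.api+json",
}
# Hardcoded salary parameters (placeholders), since Firefly III does not model incomes well
salary_day = 25
salary_amount = 1234.00
def asset_balance(name):
    # Same endpoint as the AppDaemon sensor script above
    r = requests.get(firefly_url + "/api/v1/accounts?type=asset", headers=headers)
    r.raise_for_status()
    for account in r.json()["data"]:
        if account["attributes"]["name"] == name:
            return float(account["attributes"]["current_balance"])
    raise ValueError("no asset account named " + name)
def unpaid_bills_until(target):
    # Placeholder: would sum the bills still due before the target date via the bills API
    return 0.0
def predict_balance(target):
    today = dt.date.today()
    balance = asset_balance("Sparkasse giro")
    # If this month's salary has not arrived yet but will before the target date, add it
    salary_date = today.replace(day=salary_day)
    if today <= salary_date <= target:
        balance += salary_amount
    # Deduct all still-unpaid bills due between now and the target date
    balance -= unpaid_bills_until(target)
    # Deduct whatever is outstanding on the credit card account (its balance is negative,
    # so adding it subtracts); the "reset but not yet deducted" case would need extra
    # bookkeeping and is left out of this sketch
    balance += asset_balance("Credit card")
    return balance
print(predict_balance(dt.date(2023, 11, 20)))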

19 October 2023

Russ Allbery: Review: The Cassini Division

Review: The Cassini Division, by Ken MacLeod
Series: Fall Revolution #3
Publisher: Tor
Copyright: 1998
Printing: August 2000
ISBN: 0-8125-6858-3
Format: Mass market
Pages: 305
The Cassini Division is the third book in the Fall Revolution series and a fairly direct sequel (albeit with different protagonists) to The Stone Canal. This is not a good place to start the series. It's impossible to talk about the plot of this book without discussing the future history of this series, which arguably includes some spoilers for The Star Fraction and The Stone Canal. I don't think the direction of history matters that much in enjoying the previous books, but read the first two books of the series before this review if you want to avoid all spoilers. When the Outwarders uploaded themselves and went fast, they did a lot of strange things: an interstellar probe contrary to all known laws of physics, the disassembly of Ganymede, and the Malley Mile, which plays a significant role in The Stone Canal. They also crashed the Earth. This was not entirely their fault. There were a lot of politics, religious fundamentalism, and plagues in play as well. But the storm of viruses broadcast from their transformed Jupiter shut down essentially all computing equipment on Earth, which set off much of the chaos. The results were catastrophic, and also politically transformative. Now, the Solar Union is a nearly unified anarchosocialist society, with only scattered enclaves of non-cooperators left outside that structure. Ellen May Ngewthu is a leader of the Cassini Division, the bulwark that stands between humans and the Outwarders. The Division ruthlessly destroys any remnant or probe that dares rise out of Jupiter's atmosphere, ensuring that the Outwarders, whatever they have become after untold generations of fast evolution, stay isolated to the one planet they have absorbed. The Division is very good at what they do. But there is a potential gap in that line of defense: there are fast folk in storage at the other end of the Malley Mile, on New Mars, and who knows what the deranged capitalists there will do or what forces they might unleash. The one person who knows a path through the Malley Mile isn't talking, so Ellen goes in search of the next best thing: the non-cooperator scientist Isambard Kingdom Malley. I am now thoroughly annoyed at how politics are handled in this series, and much less confused by the frequency with which MacLeod won Prometheus Awards from the Libertarian Futurist Society. Some of this is my own fault for having too high of hopes for political SF, but nothing in this series so far has convinced me that MacLeod is seriously engaging with political systems. Instead, the world-building to date makes the classic libertarian mistake of thinking societies will happily abandon stability and predictability in favor of their strange definition of freedom. The Solar Union is based on what Ellen calls the true knowledge, which is worth quoting in full so that you know what kind of politics we're talking about:
Life is a process of breaking down and using other matter, and if need be, other life. Therefore, life is aggression, and successful life is successful aggression. Life is the scum of matter, and people are the scum of life. There is nothing but matter, forces, space and time, which together make power. Nothing matters, except what matters to you. Might makes right, and power makes freedom. You are free to do whatever is in your power, and if you want to survive and thrive you had better do whatever is in your interests. If your interests conflict with those of others, let the others pit their power against yours, everyone for theirselves. If your interests coincide with those of others, let them work together with you, and against the rest. We are what we eat, and we eat everything. All that you really value, and the goodness and truth and beauty of life, have their roots in this apparently barren soil. This is the true knowledge. We had founded our idealism on the most nihilistic implications of science, our socialism on crass self-interest, our peace on our capacity for mutual destruction, and our liberty on determinism. We had replaced morality with convention, bravery with safety, frugality with plenty, philosophy with science, stoicism with anaesthetics and piety with immortality. The universal acid of the true knowledge had burned away a world of words, and exposed a universe of things. Things we could use.
This is certainly something that some people will believe, particularly cynical college students who love political theory, feeling smarter than other people, and calling their pet theories things like "the true knowledge." It is not even remotely believable as the governing philosophy of a solar confederation. The point of government for the average person in human society is to create and enforce predictable mutual rules that one can use as a basis for planning and habits, allowing you to not think about politics all the time. People who adore thinking about politics have great difficulty understanding how important it is to everyone else to have ignorable government. Constantly testing your power against other coalitions is a sport, not a governing philosophy. Given the implication that this testing is through violence or the threat of violence, it beggars belief that any large number of people would tolerate that type of instability for an extended period of time. Ellen is fully committed to the true knowledge. MacLeod likely is not; I don't think this represents the philosophy of the author. But the primary political conflict in this novel famous for being political science fiction is between the above variation of anarchy and an anarchocapitalist society, neither of which are believable as stable political systems for large numbers of people. This is a bit like seeking out a series because you were told it was about a great clash of European monarchies and discovering it was about a fight between Liberland and Sealand. It becomes hard to take the rest of the book seriously. I do realize that one point of political science fiction is to play with strange political ideas, similar to how science fiction plays with often-implausible science ideas. But those ideas need some contact with human nature. If you're going to tell me that the key to clawing society back from a world-wide catastrophic descent into chaos is to discard literally every social system used to create predictability and order, you had better be describing aliens, because that's not how humans work. The rest of the book is better. I am untangling a lot of backstory for the above synopsis, which in the book comes in dribs and drabs, but piecing that together is good fun. The plot is far more straightforward than the previous two books in the series: there is a clear enemy, a clear goal, and Ellen goes from point A to point B in a comprehensible way with enough twists to keep it interesting. The core moral conflict of the book is that Ellen is an anti-AI fanatic to the point that she considers anyone other than non-uploaded humans to be an existential threat. MacLeod gives the reader both reasons to believe Ellen is right and reasons to believe she's wrong, which maintains an interesting moral tension. One thing that MacLeod is very good at is what Bob Shaw called "wee thinky bits." I think my favorite in this book is the computer technology used by the Cassini Division, who have spent a century in close combat with inimical AI capable of infecting any digital computer system with tailored viruses. As a result, their computers are mechanical non-Von-Neumann machines, but mechanical with all the technology of a highly-advanced 24th century civilization with nanometer-scale manufacturing technology. It's a great mental image and a lot of fun to think about. 
This is the only science fiction novel that I can think of that has a hard-takeoff singularity that nonetheless is successfully resisted and fought to a stand-still by unmodified humanity. Most writers who were interested in the singularity idea treated it as either a near-total transformation leaving only remnants or as something that had to be stopped before it started. MacLeod realizes that there's no reason to believe a post-singularity form of life would be either uniform in intent or free from its own baffling sudden collapses and reversals, which can be exploited by humans. It makes for a much better story. The sociology of this book is difficult to swallow, but the characterization is significantly better than the previous books of the series and the plot is much tighter. I was too annoyed by the political science to fully enjoy it, but that may be partly the fault of my expectations coming in. If you like chewy, idea-filled science fiction with a lot of unexplained world-building that you have to puzzle out as you go, you may enjoy this, although unfortunately I think you need to read at least The Stone Canal first. The ending was a bit unsatisfying, but even that includes some neat science fiction ideas. Followed by The Sky Road, although I understand it is not a straightforward sequel. Rating: 6 out of 10

17 October 2023

Russ Allbery: Review: A Hat Full of Sky

Review: A Hat Full of Sky, by Terry Pratchett
Series: Discworld #32
Publisher: HarperTrophy
Copyright: 2004
Printing: 2005
ISBN: 0-06-058662-1
Format: Mass market
Pages: 407
A Hat Full of Sky is the 32nd Discworld novel and the second Tiffany Aching young adult novel. You should not start here, but you could start with The Wee Free Men. As with that book, some parts of the story carry more weight if you are already familiar with Granny Weatherwax. Tiffany is a witch, but she needs to be trained. This is normally done by apprenticeship, and in Tiffany's case it seemed wise to give her exposure to more types of witching. Thus, Tiffany, complete with new boots and a going-away present from the still-somewhat-annoying Roland, is off on an apprenticeship to the sensible Miss Level. (The new boots feel wrong and get swapped out for her concealed old boots at the first opportunity.) Unbeknownst to Tiffany, her precocious experiments with leaving her body as a convenient substitute for a mirror have attracted something very bad, something none of the witches are expecting. The Nac Mac Feegle know a hiver as soon as they feel it, but they have a new kelda now, and she's not sure she wants them racing off after their old kelda. Terry Pratchett is very good at a lot of things, but I don't think villains are one of his strengths. He manages an occasional memorable one (the Auditors, for example, at least before the whole chocolate thing), but I find most of them a bit boring. The hiver is one of the boring ones. It serves mostly as a concretized metaphor about the temptations of magical power, but those temptations felt so unlike the tendencies of Tiffany's personality that I didn't think the metaphor worked in the story. The interesting heart of this book to me is the conflict between Tiffany's impatience with nonsense and Miss Level's arguably excessive willingness to help everyone regardless of how demanding they get. There's something deeper in here about female socialization and how that interacts with Pratchett's conception of witches that got me thinking, although I don't think Pratchett landed the point with full force. Miss Level is clearly a good witch to her village and seems comfortable with how she lives her life, so perhaps they're not taking advantage of her, but she thoroughly slots herself into the helper role. If Tiffany attempted the same role, people would be taking advantage of her, because the role doesn't fit her. And yet, there's a lesson here she needs to learn about seeing other people as people, even if it wouldn't be healthy for her to move all the way to Miss Level's mindset. Tiffany is a precocious kid who is used to being underestimated, and who has reacted by becoming independent and somewhat judgmental. She's also had a taste of real magical power, which creates a risk of her getting too far into her own head. Miss Level is a fount of empathy and understanding for the normal people around her, which Tiffany resists and needed to learn. I think Granny Weatherwax is too much like Tiffany to teach her that. She also has no patience for fools, but she's older and wiser and knows Tiffany needs a push in that direction. Miss Level isn't a destination, but more of a counterbalance. That emotional journey, a conclusion that again focuses on the role of witches in questions of life and death, and Tiffany's fascinatingly spiky mutual respect with Granny Weatherwax were the best parts of this book for me. The middle section with the hiver was rather tedious and forgettable, and the Nac Mac Feegle were entertaining but not more than that. 
It felt like the story went in a few different directions and only some of them worked, in part because the villain intended to tie those pieces together was more of a force of nature than a piece of Tiffany's emotional puzzle. If the hiver had resonated with the darker parts of Tiffany's natural personality, the plot would have worked better. Pratchett was gesturing in that direction, but he never convinced me it was consistent with what we'd already seen of her. Like a lot of the Discworld novels, the good moments in A Hat Full of Sky are astonishing, but the plot is somewhat forgettable. It's still solidly entertaining, though, and if you enjoyed The Wee Free Men, I think this is slightly better. Followed by Going Postal in publication order. The next Tiffany Aching novel is Wintersmith. Rating: 8 out of 10

15 October 2023

Aigars Mahinovs: Figuring out finances part 2

A week ago I started to migrate my financial planning from a closed source system to a new system based on an open source, self-hosted solution. The main candidate is Firefly III - a relatively simple financial planner with a rather rich feature set and solid user and developer support. Starting it up with a Docker-Compose file was quite easy, following the official documentation. The same Compose file also managed the MySQL database, the importer app and a cron container for regular imports. The separate importer app allows both imports from CSV files and also from external bank account connection services. For whatever weird reason both of those services support exporting data from my regular bank accounts, but not from my credit cards at exactly the same bank. So for those cards I would need to periodically download CSV transaction exports and feed them into the importer. The combination of the cron container and the importer app allows for both of these functions. To do this you first do the import via the Web UI and configure all the mappings - configure file and date formats, map account names in the bank output to account names that you added in Firefly, and configure what the other columns mean and how other values map, like expense and income accounts. At the end of the process you can download a json file representing all the settings of the import you just did. Putting such a json config file in the input folder of the cron container and telling it to do an import (weekly, for example) will run that import periodically. Putting a credit card CSV file along with a config file for credit card import would also auto-import that. So far so good. However, when I tried importing exports from MoneyWiz, or even when I tried re-creating account history directly from the bank data, I hit a very annoying problem. Transfers. The thing is, detecting transfers between the real (asset) accounts that you are managing is really essential. For one, those are not real expenses and incomes, so failing to mark them as transfers would show weird income/expense numbers. But if you do detect them as transfers and correctly map the destination account, yet fail to match the two transactions with one another, you get a double transaction. This becomes really hard when the transactions happen on different dates and have different descriptions. So you get both a +1000 transaction "Credit card reset" on 14.09.2023 in the credit card account and a -1000$ transaction "Repayment of own credit card" on 24.09.2023 in the main account of the bank. Matching them to recognise that it is just one transfer and not two is ... non-trivial. The best solution I could come up with so far is to always map the "Opposing account name" for such transfers to a virtual "Transfers" asset account. That has the benefit of being able to correctly represent transfers that take several days to move between accounts and showing you how much money is still in transit at any point in time. So after I figured this out, the account balances finally started matching up with reality. Setting up spending categories required another change - the author of Firefly III does not like complexity, so it does not support nested categories. Categories can only be in a single, flat list. It is suggested that if you do need to track multiple categories and also their combination, then category groups might help you. I will survive for now by simplifying the categories that I do actually use. Might actually make the reports more usable.
A new feature for me will be the ability to use automation rules to assign transactions to categories based on their contents, like expense accounts or keywords in descriptions. Setting up regular bills (with matching rules to assign incoming transactions to those bills) is another feature that is very important to me. The feature itself works just fine in Firefly III, but it has two restrictions that the author of the software does not actually want to change. For one, it can not be used to track predictable incomes (like salary), presumably because it is only there for bills (subscriptions in the v3 UI). And for another, there is nothing in the base software that actually uses the data from bills to look forward. The author does not like to try to predict the future. Which for me is basically one of the two reasons to use this kind of software at all. I want to know - given all manually entered regular and 100% predictable expenses and incomes, what will be the state of my accounts at a particular date? It seems that to get at that I will need to write my own oracle script using the rich Firefly III API; a rough sketch of the bills part follows below. The last question I had for this week was - how will I enter a new cash purchase of potatoes at a farmers market into the system that sits on a private server in my home network? Turns out there are actually multiple Android apps for Firefly III that can easily be used for this, using robust OAuth or shared token authentication. Except the web UI needs to be exposed online for this to work (and ideally protected by HTTPS). Other users have also created Telegram bots that allow using chat messages to create transaction entries. This is a bit harder to use and more narrow in scope, but should be easier to set up. Will have to try both approaches. But before I really get going on this, I need to fix another thing that I have been postponing for a while: I will need to migrate my Home Assistant installation from a Docker container installation to a Home Assistant OS installation so that I can install Addons, including Firefly III, MySQL and Portainer, to have a bit more organised and less hand-knitted home server setup. Let's see how that goes next week :D
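As a very rough starting point for that oracle script, here is a minimal sketch that sums up the bills expected to fall due before a target date. The /api/v1/bills start/end query parameters and the amount_min, amount_max and pay_dates fields are assumptions about the API rather than verified code, so treat this as an outline only.
import datetime as dt
import requests
firefly_url = "<FIREFLY_URL>"
app_token = "<PERSONAL_ACCESS_TOKEN>"
def bills_due_before(target):
    # Ask Firefly III for bills, assuming the endpoint accepts a start/end window
    # and reports the expected pay dates that fall inside it
    r = requests.get(
        firefly_url + "/api/v1/bills",
        params={"start": dt.date.today().isoformat(), "end": target.isoformat()},
        headers={
            "Authorization": "Bearer " + app_token,
            "Accept": "application/vnd.api+json",
        },
    )
    r.raise_for_status()
    total = 0.0
    for bill in r.json()["data"]:
        attrs = bill["attributes"]
        if not attrs.get("active", True):
            continue
        # Use the midpoint of the allowed amount range as the expected value
        expected = (float(attrs["amount_min"]) + float(attrs["amount_max"])) / 2
        total += expected * len(attrs.get("pay_dates", []))
    return total
print(bills_due_before(dt.date(2023, 10, 20)))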

9 October 2023

Aigars Mahinovs: Figuring out finances part 1

I have been managing my finances and getting an overview of where I am financially and where I am going month-to-month for a few years already. That means that I already have a method of doing my finances and a method of thinking about them. So far this has been supported by a commercial tool called MoneyWiz. I was happy to pay them for the ability to have a solid product and be able to easily access my finances from my phone (to enter cash transactions directly in the field) and sync data across multiple devices with their cloud offering. However, MoneyWiz development stalled. So much so that they stopped updating their Android app and cancelled web version plans to just focus on the iPhone and macOS clients. And even there the development did not seem very successful over the last year. So I am cancelling their subscription, extracting all my data (as CSV) and looking for alternatives. So let's start with the basics - what do I want from finance software?
  • Open source and self-hosted
  • Accounts to represent real accounts in my bank, credit cards, cash account
  • Transactions to represent money moving in/out/between accounts
  • Tree of categories for transactions to categorise spending
  • Category reports on arbitrary time periods
  • Ability to see the expected balance on each account at any historic point in time from given transactions
  • Ability to do a corrective transaction - hmm, I don't know what happened, but right now I have X in cash
  • Adding past transactions before a corrective transaction should not change balance after correction
  • Ability to automatically import transactions from my bank
  • Ability to plan future, recurring transactions
  • Incoming transactions from bank import should be auto-matched with planned recurring transactions
  • Ability to see missed planned recurring transactions
  • Ability to see value deviations in planned recurring transactions
  • Ability to manually match a transaction with a planned recurring transaction for that month or skip a month
  • Ability to see a projected balance on my accounts at any future date given planned transactions
  • Credit card fully refilling its current (negative) balance to zero (from a given account) should be a plannable transaction
  • Accessible from multiple devices, like Linux, Android, iPhone, MacOS. Most likely that means web-based
  • Archiving accounts without losing their past transactions with active accounts
  • API to access data, so that I could wire that up to a Home Assistant dashboard
I don't care about currencies (thanks Euro!), localization, taxes and double-entry bookkeeping. I don't have much use for tags, budgets or savings targets. It could be useful to be able to track stock investment values (but I do have a pretty good one from the bank that does that anyway). And tracking loans (incoming and outgoing) could be a worthwhile thing as well, but that can also be simulated with separate accounts for each loan. Tracking refundable expenses (like medical expenses that insurances should refund later on) would also be nice. Could also be simulated by having a separate expense category (Medical - refundable) and putting the refund as a negative expense into that category. One weird feature that I really like is having transfer transactions with arbitrary incoming and outgoing dates. For some transactions it is a simple transfer delay - money leaves account A on Friday and only appears in the destination account B on Monday morning. However, this can also happen the other way around - a few credit cards I have seen work in a way that on the 14th of each month the balance of the credit card gets reset to zero and then a week later the negative balance that was on that credit card gets taken from the base account that the card is bound to. So the money leaves the base account a week later than it arrives in the credit card account. Financial systems really are magic sometimes :D Ideally that should be represented by a single transaction with two dates - one date when money leaves one account and another date when it arrives in a different account, so that the balances on both accounts in the days between would be correct (as in matching what the bank says they were on those dates). And it should automatch such transactions from bank data imports. And predict their value from the negative balance of the credit card. And a pony! But in the worst case this could be simulated using a separate "In transfer" account. When I am thinking about my finances, the key metric for me is - how much will I spend this month - both in fixed spending (rent, Internet and phone bills, health insurance, subscriptions, ...) and in variable spending (groceries, eating out, clothing, ...). In theory that should be trivial, but in practice a simple calendar month view will fail because a bunch of fixed expenses occur at the month end/start boundary and can happen either at the end of the previous month or at the start of the next month, depending on how the weekends happen to fall and how the systems of various providers decided to do direct debits this month. So the 1st of the month is not a good snapshotting time. Any date between the 6th (last possible day for monthly bills) and the 20th (earliest possible salary day) would be a better cut-off point. Looking at the balance of my main account at that point in time lets me know the difference between income and expenses for the past month. And that is the number I want the financial app to be able to predict as soon as possible. An additional complication is the way credit cards work here. The expenses booked to a credit card after the 14th of September will only appear in the main account on the 20th of October and thus will actually only count towards the expenses of October.
That makes it very confusing to see a larger overspend in the main account on the 6th of October and then use the categories to try to figure out where the money went, because all the spending that happened on the credit card after the 14th of September should not really be looked at in that context, but spending on the main account or cash account should be counted all the way to the 6th of October. :D I am not optimistic that any finance software would be able to deal with this mess correctly. So far I am tending towards Firefly III, which has most of what I need and looks great and well maintained, but has the tiny drawback of being written in PHP, which basically excludes me from the ability to contribute anything beyond ideas and feedback. :D I already did a quick test of Firefly via local Docker containers, so the next step would be to try to set it up as a production instance (using the same always-on mini PC that is running my Home Assistant instance), get the database up and working, get Firefly III itself up and running, get the account importer working for the bank connection, import historical transactions from there, set up my categories and recurring transactions, import past data dumped out of MoneyWiz and see if this works well and gives me the insights I need. If you know a better solution, please let me know on any social or email channel!
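As a footnote to that quick Docker test: a few lines of Python are enough to check that the API and a personal access token work before investing in the full setup. A minimal sketch, where the localhost URL and the token are placeholders and the token is assumed to be created from the profile page of the web UI:
import requests
firefly_url = "http://localhost"
app_token = "<PERSONAL_ACCESS_TOKEN>"
r = requests.get(
    firefly_url + "/api/v1/accounts?type=asset",
    headers={
        "Authorization": "Bearer " + app_token,
        "Accept": "application/vnd.api+json",
    },
)
r.raise_for_status()
for account in r.json()["data"]:
    attrs = account["attributes"]
    print(attrs["name"], attrs["current_balance"])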

6 October 2023

Russ Allbery: Review: The Far Reaches

Review: The Far Reaches, edited by John Joseph Adams
Publisher: Amazon Original Stories
Copyright: June 2023
ISBN: 1-6625-1572-3
ISBN: 1-6625-1622-3
ISBN: 1-6625-1503-0
ISBN: 1-6625-1567-7
ISBN: 1-6625-1678-9
ISBN: 1-6625-1533-2
Format: Kindle
Pages: 219
Amazon has been releasing anthologies of original short SFF with various guest editors, free for Amazon Prime members. I previously tried Black Stars (edited by Nisi Shawl and Latoya Peterson) and Forward (edited by Blake Crouch). Neither were that good, but the second was much worse than the first. Amazon recently released a new collection, this time edited by long-standing SFF anthology editor John Joseph Adams and featuring a new story by Ann Leckie, which sounded promising enough to give them another chance. The definition of insanity is doing the same thing over and over again and expecting different results. As with the previous anthologies, each story is available separately for purchase or Amazon Prime "borrowing" with separate ISBNs. The sidebar cover is for the first in the sequence. Unlike the previous collections, which were longer novelettes or novellas, my guess is all of these are in the novelette range. (I did not do a word count.) If you're considering this anthology, read the Okorafor story ("Just Out of Jupiter's Reach"), consider "How It Unfolds" by James S.A. Corey, and avoid the rest. "How It Unfolds" by James S.A. Corey: Humans have invented a new form of physics called "slow light," which can duplicate any object that is scanned. The energy expense is extremely high, so the result is not a post-scarcity paradise. What the technology does offer, however, is a possible route to interstellar colonization: duplicate a team of volunteers and a ship full of bootstrapping equipment, and send copies to a bunch of promising-looking exoplanets. One of them might succeed. The premise is interesting. The twists Corey adds on top are even better. What can be duplicated once can be duplicated again, perhaps with more information. This is a lovely science fiction idea story that unfortunately bogs down because the authors couldn't think of anywhere better to go with it than relationship drama. I found the focus annoying, but the ideas are still very neat. (7) "Void" by Veronica Roth: A maintenance worker on a slower-than-light passenger ship making the run between Sol and Centauri unexpectedly is called to handle a dead body. A passenger has been murdered, two days outside the Sol system. Ace is in no way qualified to investigate the murder, nor is it her job, but she's watched a lot of crime dramas and she has met the victim before. The temptation to start poking around is impossible to resist. It's been a long time since I've read a story built around the differing experiences of time for people who stay on planets and people who spend most of their time traveling at relativistic speeds. It's a bit of a retro idea from an earlier era of science fiction, but it's still a good story hook for a murder mystery. None of the characters are that memorable and Roth never got me fully invested in the story, but this was still a pleasant way to pass the time. (6) "Falling Bodies" by Rebecca Roanhorse: Ira is the adopted son of a Genteel senator. He was a social experiment in civilizing the humans: rescue a human orphan and give him the best of Genteel society to see if he could behave himself appropriately. The answer was no, which is how Ira finds himself on Long Reach Station with a parole officer and a schooling opportunity, hopefully far enough from his previous mistakes for a second chance. Everyone else seems to like Rebecca Roanhorse's writing better than I do, and this is no exception. 
Beneath the veneer of a coming-of-age story with a twist of political intrigue, this is brutal, depressing, and awful, with an ending that needs a lot of content warnings. I'm sorry that I read it. (3) "The Long Game" by Ann Leckie: The Imperial Radch trilogy are some of my favorite science fiction novels of all time, but I am finding Leckie's other work a bit hit and miss. I have yet to read a novel of hers that I didn't like, but the short fiction I've read leans more heavily into exploring weird and alien perspectives, which is not my favorite part of her work. This story is firmly in that category: the first-person protagonist is a small tentacled alien creature, a bit like a swamp-dwelling octopus. I think I see what Leckie is doing here: balancing cynicism and optimism, exploring how lifespans influence thinking and planning, and making some subtle points about colonialism. But as a reading experience, I didn't enjoy it. I never liked any of the characters, and the conclusion of the story is the unsettling sort of main-character optimism that seems rather less optimistic to the reader. (4) "Just Out of Jupiter's Reach" by Nnedi Okorafor: K rm n scientists have found a way to grow living ships that can achieve a symbiosis with a human pilot, but the requirements for that symbiosis are very strict and hard to predict. The result was a planet-wide search using genetic testing to find the rare and possibly nonexistent matches. They found seven people. The deal was simple: spend ten years in space, alone, in her ship. No contact with any other human except at the midpoint, when the seven ships were allowed to meet up for a week. Two million euros a year, for as long as she followed the rules, and the opportunity to be part of a great experiment, providing data that will hopefully lead to humans becoming a spacefaring species. The core of this story is told during the seven days in the middle of the mission, and thus centers on people unfamiliar with human contact trying to navigate social relationships after five years in symbiotic ships that reshape themselves to their whims and personalities. The ships themselves link so that the others can tour, which offers both a good opportunity for interesting description and a concretized metaphor about meeting other people. I adore symbiotic spaceships, so this story had me at the premise. The surface plot is very psychological, and I didn't entirely click with it, but the sense of wonder vibes beneath that surface were wonderful. It also feels fresh and new: I've seen most of the ideas before, but not presented or written this way, or approached from quite this angle. Definitely the best story of the anthology. (8) "Slow Time Between the Stars" by John Scalzi: This, on the other hand, was a complete waste of time, redeemed only by being the shortest "story" in the collection. "Story" is generous, since there's only one character and a very dry, linear plot that exists only to make a philosophical point. "Speculative essay" may be closer. The protagonist is the artificial intelligence responsible for Earth's greatest interstellar probe. It is packed with a repository of all of human knowledge and the raw material to create life. Its mission is to find an exoplanet capable of sustaining that life, and then recreate it and support it. The plot, such as it is, follows the AI's decision to abandon that mission and cut off contact with Earth, for reasons that it eventually explains. Every possible beat of this story hit me wrong. 
The sense of wonder attaches to the most prosaic things and skips over the moments that could have provoked real wonder. The AI is both unbelievable and irritating, with all of the smug self-confidence of an Internet reply guy. The prose is overwrought in all the wrong places ("the finger of God, offering the spark to animate the dirt of another world" would totally be this AI's profile quote under their forum avatar). The only thing I liked about the story is the ethical point that it slowly meanders into, which I think I might agree with and at least find plausible. But it's delivered by the sort of character I would actively leave rooms to avoid, in a style that's about as engrossing as a tax form. Avoid. (2) Rating: 5 out of 10

4 October 2023

Russ Allbery: Review: The Last Watch

Review: The Last Watch, by J.S. Dewes
Series: Divide #1
Publisher: Tor
Copyright: 2021
ISBN: 1-250-23634-7
Format: Kindle
Pages: 476
The Last Watch is the first book of a far-future science fiction duology. It was J.S. Dewes's first novel. The station of the SCS Argus is the literal edge of the universe: the Divide, beyond which there is nothing. Not simply an absence of stars, but a nothing from a deeper level of physics. The Argus is there to guard against a return of the Viators, the technologically superior alien race that nearly conquered humanity hundreds of years prior and has already returned once, apparently traveling along the Divide. Humanity believes the Viators have been wiped out, but they're not taking chances. It is not a sought-after assignment. The Sentinels are the dregs of the military: convicts, troublemakers, and misfits, banished to the literal edge of nowhere. Joining them at the start of this book is the merchant prince, cocky asshole, and exiled saboteur Cavalon Mercer. He doesn't know what to expect from either military service or service on the edge of the universe. He certainly did not expect the Argus to be commanded by Adequin Rake, a literal war hero and a far more effective leader than this post would seem to warrant. There are reasons why Rake is out on the edge of the universe, ones that she's not eager to talk about. They quickly become an afterthought when the Argus discovers that the Divide is approaching their position. The universe is collapsing, and the only people who know about it are people the System Collective would prefer to forget exist. Yes, the edge of the universe, not the edge of the galaxy. Yes, despite having two FTL mechanisms, this book has a scale problem that it never reconciles. And yes, the physics do not really make sense, although this is not the sort of book that tries to explain the science. The characters are too busy trying to survive to develop new foundational theories of physics. I was looking for more good military SF after enjoying Artifact Space so much (and still eagerly awaiting the sequel), so I picked this up. It has some of the same elements: the military as a place where you can make a fresh start with found family elements, the equalizing effects of military assignments, and the merits of good leadership. They're a bit disguised here, since this is a crew of often-hostile misfits under a lot of stress with a partly checked-out captain, but they do surface towards the end of the book. The strength of this book is the mystery of the contracting universe, which poses both an immediate threat to the ship and a longer-term potential threat to, well, everything. The first part of the book builds tension with the immediate threat, but the story comes into its own when the crew starts piecing together the connections between the Viators and the Divide while jury-rigging technology and making risky choices between a lot of bad options. This is the first half of a duology, so the mysteries are not resolved here, but they do reach a satisfying and tantalizing intermediate conclusion. The writing is serviceable and adequate, but it's a bit clunky in places. Dewes doesn't quite have the balance right between setting the emotional stakes and not letting the characters indulge in rumination. Rake is a good captain who is worn down and partly checked out, Mercer is scared and hiding it with arrogance and will do well when given the right sort of attention, and all of this is reasonably obvious early on and didn't need as many of the book's pages as it gets.
I could have done without the romantic subplot, which I thought was an unnecessary distraction from the plot and turned into a lot of tedious angst, but I suspect I was not the target audience. (Writers, please remember that people can still care about each other and be highly motivated by fear for each other without being romantic partners.) I would not call this a great book. The characters are not going to surprise you that much, and it's a bit long for the amount of plot that it delivers. If you are the sort of person who nit-picks the physics of SF novels and gets annoyed at writers who don't understand how big the universe is, you will have to take a deep breath and hold on to your suspension of disbelief. But Dewes does a good job with ratcheting up the tension and conveying an atmosphere of mysterious things happening at the edge of nowhere, while still keeping it in the genre of mysterious technology and mind-bogglingly huge physical phenomena rather than space horror. If you've been looking for that sort of book, this will do. I was hooked and will definitely read the sequel. Followed by The Exiled Fleet. Rating: 7 out of 10
