Search Results: "rudi"

2 April 2024

Dirk Eddelbuettel: ulid 0.3.1 on CRAN: New Maintainer, Some Polish

Happy to share that ulid is now (back) on CRAN. It provides universally unique identifiers that are lexicographically sortable, which improves over the better-known uuid generators. ulid is a neat little package put together by Bob Rudis a few years ago. It had recently drifted off CRAN, so I offered to brush it up and re-submit it. And as tooted earlier today, it took just over an hour to finish that (after the lead-up work I had done, including prior email with CRAN in the loop, the repo transfer from Bob's to my ulid repo, plus of course a wee bit of actual maintenance; see below for more). The NEWS entry follows, along with a short usage sketch after it.

Changes in version 0.3.1 (2024-04-02)
  • New Maintainer
  • Deleted several repository files no longer used or needed
  • Added .editorconfig, ChangeLog and cleanup
  • Converted NEWS.md to NEWS.Rd
  • Simplified R/ directory to one source file
  • Simplified src/ removing redundant Makevars
  • Added ulid() alias
  • Updated / edited roxygen and README.md documentation
  • Removed vignette which was identical to README.md
  • Switched continuous integration to GitHub Actions
  • Placed upstream (header-only) library into src/ulid/
  • Renamed single interface file to src/wrapper
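
A minimal usage sketch (assuming the new ulid() alias takes the same n argument as ULIDgenerate(), and that unmarshal() still decodes the timestamp component as in Bob's original package):

library(ulid)

## generate two ULIDs via the new ulid() alias
ids <- ulid(2)
ids

## ULIDs sort lexicographically in creation order, the key advantage over UUIDs
all(sort(ids) == ids)

## decode the timestamp and randomness components
unmarshal(ids)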

If you like this or other open-source work I do, you can sponsor me at GitHub.

This post by Dirk Eddelbuettel originated on his Thinking inside the box blog. Please report excessive re-aggregation in third-party for-profit settings.

4 March 2024

Dirk Eddelbuettel: tinythemes 0.0.2 at CRAN: Maintenance

A first maintenance release of the still fairly new package tinythemes arrived on CRAN today. tinythemes provides the theme_ipsum_rc() function from hrbrthemes by Bob Rudis in a zero (added) dependency way. A simple example (also available as a demo inside the package) contrasts the default style (on the left) with the one added by this package (on the right); a minimal sketch follows the NEWS entry below. This version mostly just updates to the newest release of ggplot2, as one must, and takes advantage of Bob's update to hrbrthemes yesterday. The full set of changes since the initial CRAN release follows.

Changes in tinythemes version 0.0.2 (2024-03-04)
  • Added continuous integration action based on r2u
  • Added demo/ directory and a README.md
  • Minor edits to help page content
  • Synchronised with ggplot2 3.5.0 via hrbrthemes
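
For the curious, a minimal sketch of the kind of contrast the demo draws (the data and labels here are illustrative, not the demo's exact code):

library(ggplot2)
library(tinythemes)    # provides theme_ipsum_rc() with zero added dependencies

p <- ggplot(mtcars, aes(x = mpg, y = wt)) +
    geom_point() +
    labs(title = "Fuel efficiency versus weight", subtitle = "mtcars")

p                      # default ggplot2 theme (the 'left' panel)
p + theme_ipsum_rc()   # the hrbrthemes look (the 'right' panel)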

Courtesy of my CRANberries, there is a diffstat report relative to previous release. More detailed information is on the repo where comments and suggestions are welcome. If you like this or other open-source work I do, you can sponsor me at GitHub.

This post by Dirk Eddelbuettel originated on his Thinking inside the box blog. Please report excessive re-aggregation in third-party for-profit settings.

30 December 2023

Russ Allbery: Review: The Hound of Justice

Review: The Hound of Justice, by Claire O'Dell
Series: Janet Watson Chronicles #2
Publisher: Harper Voyager
Copyright: July 2019
ISBN: 0-06-269938-5
Format: Kindle
Pages: 325
The Hound of Justice is a near-future thriller novel with Sherlock Holmes references. It is a direct sequel to A Study in Honor. This series is best read in order.

Janet Watson is in a much better place than she was in the first book. She has proper physical therapy, a new arm, and a surgeon's job waiting for her as soon as she can master its features. A chance meeting due to an Inauguration Day terrorist attack may even develop into something more. She just needs to get back into the operating room and then she'll feel like her life is back on track. Sara Holmes, on the other hand, is restless, bored, and manic, rudely intruding on Watson's date. Then she disappears, upending Watson's living arrangements. She's on the trail of something. When mysterious destructible notes start appearing in Watson's books, it's clear that she wants help.

The structure of this book didn't really work for me. The first third or so is a slice-of-life account of Watson's attempt to resume her career as a surgeon against a backdrop of ongoing depressing politics. This part sounds like the least interesting, but I was thoroughly engrossed. Watson is easy to care about, hospital politics are strangely interesting, and while the romance never quite clicked for me, it had potential. I was hoping for another book like A Study in Honor, where Watson's life and Holmes's investigations entwine and run in parallel. That was not to be.

The middle third of the book pulls Watson away to Georgia and a complicated mix of family obligations and spy-novel machinations. If this had involved Sara's fae strangeness, verbal sparring, and odd tokens of appreciation, maybe it would have worked, but Sara Holmes is entirely off-camera. Watson is instead dealing with a minor supporting character from the first book, who drags her through disguises, vehicle changes, and border stops in a way that felt excessive and weirdly out of place. (Other reviews say that this character is the Mycroft Holmes equivalent; the first initial of Micha's name fits, but nothing else does so far as I can tell.)

Then the last third of the novel turns into a heist. I like a heist novel as much as the next person, but a good heist story needs a team with chemistry and interplay, and I didn't know any of these people. There was way too little Sara Holmes, too much of Watson being out of her element in a rather generic way, and too many steps that Watson is led through without giving the reader a chance to enjoy the competence of the team. It felt jarring and disconnected, like Watson got pulled out of one story and dropped into an entirely different story without proper groundwork.

The Hound of Justice still has its moments. Watson is a great character and I'm still fully invested in her life. She was pulled into this mission because she's the person Holmes knows with the appropriate skills, and when she finally gets a chance to put those skills to use, it's quite satisfying. But, alas, the magic of A Study in Honor simply isn't here, in part because Sara Holmes is missing for most of the book and her replacements and stand-ins are nowhere near as intriguing. The villain's plan seems wildly impractical and highly likely to be detected, and although I can come up with some explanations to salvage it, those don't appear in the book. And, as in the first book, the villain seems very one-dimensional and simplistic. This is certainly not a villain worthy of Holmes.
Fittingly, given the political movements O'Dell is commenting on, a lot of this book is about racial politics. O'Dell contrasts the microaggressions and more subtle dangers for Watson as a black woman in Washington, D.C., with the more explicit and active racism of the other places to which she travels over the course of the story. She's trying very hard to give the reader a feeling for what it's like to be black in the United States. I don't have any specific complaints about this, and I'm glad she's attempting it, but I came away from this book with a nagging feeling that Watson's reactions were a tiny bit off. It felt like a white person writing about racism rather than a black person writing about racism: nothing is entirely incorrect, but the emotional beats aren't quite where black authors would put them. I could be completely wrong about this, and am certainly much less qualified to comment than O'Dell is, but there were enough places that landed slightly wrong that I wanted to note it.

I would still recommend A Study in Honor, but I'm not sure I can recommend this book. This is one of those series where the things that I enjoyed the most about the first book weren't what the author wanted to focus on in subsequent books. I would read more about the day-to-day of Watson's life, and I would certainly read more of Holmes and Watson sparring and circling and trying to understand each other. I'm less interested in somewhat generic thrillers with implausible plots and Sherlock Holmes references. At the moment, this is academic, since The Hound of Justice is the last book of the series so far.

Rating: 6 out of 10

19 December 2023

Dirk Eddelbuettel: tinythemes 0.0.1 at CRAN: New Package

Delighted to announce a new package that arrived on CRAN today: tinythemes. It repackages the theme_ipsum_rc() function by Bob Rudis from his hrbrthemes package in a zero (added) dependency way. A simple example (also available as a demo inside the package in the next update) contrasts the default style (on the left) with the one added by this package (on the right). The GitHub repo also shows this little example: the total dependencies of hrbrthemes over what ggplot2 installs:
> db <- tools::CRAN_package_db()
> deps <- tools::package_dependencies(c("ggplot2", "hrbrthemes"), recursive=TRUE, db=db)
> Filter(\(x) x != "ggplot2", setdiff(deps[[2]], deps[[1]]))
 [1] "extrafont"         "knitr"             "rmarkdown"         "htmltools"        
 [5] "tools"             "gdtools"           "extrafontdb"       "Rttf2pt1"         
 [9] "Rcpp"              "systemfonts"       "gfonts"            "curl"             
[13] "fontquiver"        "base64enc"         "digest"            "ellipsis"         
[17] "fastmap"           "evaluate"          "highr"             "xfun"             
[21] "yaml"              "bslib"             "fontawesome"       "jquerylib"        
[25] "jsonlite"          "stringr"           "tinytex"           "cachem"           
[29] "memoise"           "mime"              "sass"              "fontBitstreamVera"
[33] "fontLiberation"    "shiny"             "crul"              "crayon"           
[37] "stringi"           "cpp11"             "urltools"          "httpcode"         
[41] "fs"                "rappdirs"          "httpuv"            "xtable"           
[45] "sourcetools"       "later"             "promises"          "commonmark"       
[49] "triebeard"        
>
Comments and suggestions are welcome at the GitHub repo. If you like this or other open-source work I do, you can sponsor me at GitHub.

This post by Dirk Eddelbuettel originated on his Thinking inside the box blog. Please report excessive re-aggregation in third-party for-profit settings.

3 October 2023

Bastian Blank: Introducing uploads to Debian by git tag

Several years ago, several people proposed a mechanism to upload packages to Debian just by doing "git tag" and "git push". Two not-too-long discussions on debian-devel (first and second) did not end with an agreement on how this mechanism could work.1 The central problem was the ability to properly trace uploads back to the ones who authorised them. Now, several years later and after re-reading those e-mail threads, I was again stopped at the question: can we do this? Yes, it would not be just "git tag", but could we get close enough? I have some rudimentary code ready to actually do uploads from the CI. However, the list of caveats is currently pretty long. Yes, it works in principle. It is still disabled here, because in practice it does not yet work.

Problems with this setup

So what are the problems? It requires the git tags to include both signed files needed for a successful source upload. This is solved by a new tool that could be a git sub-command. It just creates the source package, signs it, and adds the signed files describing the upload (the .dsc and .changes files) to the tag to be pushed. The CI then extracts the signed files from the tag message and does its work as normal.

It requires a sufficiently reproducible build for source packages. Right now it is only known to work with the special 3.0 (gitarchive) source format, but even that requires the latest version of this format. No idea if it is possible to use others, like 3.0 (quilt), for this purpose.

The shared GitLab runners provided by Salsa do not allow ftp access to the outside. But Debian still uses ftp to do uploads. At least if you don't want to share your ssh key, which can't be restricted to uploads only, so ssh would not work either. And as the current host for those builds, the Google Cloud Platform, does not provide connection tracking support for ftp, there is no easy way to allow that without just allowing everything. So we currently have no way to actually perform uploads from this platform.

Further work

As this is code running in a CI under the control of the developer, we can easily support other workflows. Some teams use workflows that tag after acceptance into the Debian archive. Or they don't use tags at all. With some other markers, like variables or branch names, this support can be expanded easily.

Unrelated to the task here, we might want to think about tying the .changes files for uploads to the target archive. As this code makes all of them readily available in the form of tag messages, replaying them into possible other archives might be more of a concern now.

Conclusion

So to come back to the question: yes, we can. We can prepare uploads using our CI in a way that they would be accepted into the Debian archive. It just needs some more work on infrastructure.

  1. Here I have to admit that, after reading it again, I'm not really proud of my own behaviour and have to apologise.

27 June 2023

Matt Brown: Designing a PCBA friendly CO2 monitor

co2mon.nz currently uses monitors based on Oliver Seiler's open source design, which I am personally building. This post describes my exploration of how to achieve production of a CO2 monitor that could enable the growth of co2mon.nz.

Goals

Primarily I want to design a CO2 monitor which allows the majority of the production process to be outsourced. In particular, the PCB should be able to be assembled in an automated fashion (PCBA). As a secondary goal, I'd like to improve the aesthetics of the monitor while retaining the unique feature of displaying a clear visual indication of the current ventilation level through coloured lights. Overall, I'll consider the project successful if I can achieve a visually attractive CO2 monitor which takes me less than 10 minutes per monitor to assemble/box/ship and whose production cost has the potential to be lower than that of the current model.

PCB

Schematic

The existing CO2 monitor design provides a solid foundation but relies upon the ESP32 Devkit board, which is intended for evaluation purposes and is not well suited to automated assembly. Replacing this devkit board with the underlying ESP32 module is the major change needed to enable PCBA production, which then also requires moving the supporting electronics from the devkit board directly onto the primary PCB. The basic ESP32 chipset used in the devkit boards is no longer available as a discrete module suitable for placement directly onto a PCB, which means the board will also have to be updated to use a more modern variant of the ESP32 chipset that is in active production, such as the ESP32-S3. The ESP32-S3-WROOM-1-N4 module is a very close match to the original devkit and will be suitable for this project. In addition to the change of ESP module, I made the following other changes to the components in use:
  • Added an additional temperature/humidity sensor (SHT30). The current monitors take temperature/humidity measurements from the SCD40 chipset. These are primarily intended to help in the calculation of CO2 levels and rely on an offset being subtracted to account for the heat generated by the electronic components themselves. I've found their accuracy to be OK, but not perfect. The SHT30 is a cheap part, so adding it in the hope of improved temperature/humidity measurement is an easy choice.
  • Swapped to USB-C instead of USB-B for the power connector. USB-C is much more common than USB-B, and is also smaller and not as tall off the board, which provides more flexibility in the case design.
With the major components selected, the key task is to draw the schematic diagram describing how they electrically connect to each other, which includes all the supporting electronics (e.g. resistors, capacitors, etc.) needed.

Schematic

I started out trying to use the EasyEDA/OSHWLab ecosystem, thinking the tight integration with JLCPCB's assembly services would be a benefit, but the web interface was too clunky and limiting and I quickly got frustrated. KiCad proved to be a much more pleasant and capable tool for the job. The reference design in the ESP32 datasheet (p28) and the USB-C power supply examples from blnlabs were particularly helpful, alongside the KiCad documentation and the example of the existing monitor, in completing this step.

Layout

The next step is to physically lay out where each component from the schematic will sit on the PCB itself. Obviously this requires first determining the overall size, shape and outline of the board, and it needs to occur in iteration with the intended design of the overall monitor, including the case, to ensure components like switches and USB sockets line up correctly. In addition to the requirements around the look and function of the case, the components themselves also have considerations that must be taken into account, including:
  • For best WiFi reception, the ESP32 antenna should be at the top of the monitor and should not have PCB underneath it, or for a specified distance either side of it.
  • The SHT30 temperature sensor should be as far from any heat generating components (e.g. the ESP32, BME680 and SCD40 modules) as possible and also considering that any generated heat will rise, as low on the monitor as possible.
  • The sensors measuring the air (SCD40, BME680 and SHT30) must have good exposure to the air outside the case.
PCB

Taking all of these factors into account, I ended up with a square PCB containing a cutout in the top right so that the ESP32 antenna can sit within the overall square outline while still meeting its design requirements. The SCD40 and BME680 sit in the top left corner, near the edges for good airflow and far away from the SHT30 temperature sensor in the bottom left corner. The LEDs I placed in a horizontal row across the center of the board, the LCD in the bottom right, a push button on the right-hand side, and the USB-C socket in the center at the bottom. Once the components are placed, the next big task is to route the traces (aka wires) between the components on the board such that all the required electrical connections are made without any unintended connections (aka shorts) being created. This is a fun constraint-solving/optimisation challenge, and it takes on an almost artistic aspect, with other PCB designers often having strong opinions on which layout is best. The majority of the traces and routing for this board were able to be placed on the top layer of the PCB, but I also made use of the back layer for a few traces to help avoid conflicts and deal with places where different traces needed to cross each other. It's easy to see how this step would be much more challenging and time consuming on a larger and more complex PCB design. The final touches were to add some debugging breakouts for the serial and JTAG ports on the ESP32-S3, plus a logo and various other helpful text on the silkscreen layer that will be printed on the PCB so it looks nice.

Production

For assembly of the PCB, I went with JLCPCB, based out of China. The trickiest part of the process was component selection and ensuring that the parts I had planned in the schematic were available. JLCPCB, in conjunction with lcsc.com, provides a basic and an extended part library. If you use only basic parts you get quicker and cheaper assembly, while using extended parts bumps your order into a longer process with a small fee charged for each component on the board. Initially I spent a lot of time selecting components (particularly LEDs and switches) that were in the basic library, before realising that the ESP32 modules are only available in the extended library! I think the lesson is that unless you're building the most trivial PCB with only passive components, you will almost certainly end up in the advanced assembly process anyway, so trying to stay within the basic parts library is not worth the time.

Unfortunately the SCD40 sensor, the most crucial part of the monitor, is not stocked at all by JLCPCB/LCSC! To work around this, JLCPCB will maintain a personal component library for you when you ship components to them for use in future orders. Given the extra logistical time and hassle of having to do this, combined with having a number of SCD40 components already on hand, I decided to have the boards assembled without this component populated for the initial prototype run. This also had the benefit of lowering the risk if something went wrong, as the cost of the SCD40 is greater than the cost of the PCB and all the other components combined!

I found the kicad-jlcpcb-tools plugin for KiCad invaluable for keeping track of which part from lcsc.com I was planning to use for each component and for generating the necessary output files for JLCPCB. The plugin allows you to store these mappings in your actual schematic, which is very handy. The search interface it provides is fairly clunky, and I found it was often easier to search for the part I needed on lcsc.com and then just copy the part number across into the plugin's search box rather than trying to search by name or component type. The LCD screen is the remaining component which is not easily assembled onto the PCB directly, but as you'll see next, this actually turned out to be OK, as integrating the screen directly into the case makes the final assembly process smoother.

fabricated PCBs

The final surprise in the assembly process was the concept of edge rails: additional PCB material that is needed on either side of the board to help with feeding it through the assembly machine in the correct position. These can be added automatically by JLCPCB and have to be snapped off after the completed boards are received. I hadn't heard about these before, and I was a little worried that they'd interfere or get in the way of either the antenna cut-out at the top of the board, or the switch on the right-hand side, as it overhangs the edge so it can sit flush with the case. In the end there was no issue with the edge rails. The switch was placed hanging over them without issue, and snapping them off once the boards arrived was a trivial 30s job, using a vice to hold the edge rail and then gently tipping the board over until it snapped off. The interface between the board and the rails, while solid looking, has obviously been scored or perforated in some way during the production process, so the edge breaks cleanly and smoothly. Magic!
The process was amazingly quick, with the completed PCBs (pictured above) arriving within 7 days of the order being placed, and looking amazing.

Case

Design

I mocked up a very simple prototype of the case in FreeCAD during the PCB design process to help position and align the placement of the screen, switch and USB socket on the PCB, as all three of these components interface directly with the edges of the case. Initially this design was similar to the current monitor design, where the PCB (with lights and screen attached) sits in the bottom of the case, which has walls containing grilles for airflow, and then a separate transparent perspex sheet is screwed onto the top to complete the enclosure. As part of the aesthetic improvements for the new monitor I wanted to move away from a transparent front panel to something opaque but still translucent enough to allow the colour of the lights to show through. Without a transparent front panel the LCD also needs to be mounted directly into the case itself.

The first few prototype iterations followed the design of the original CO2 monitor with a flat front panel that attaches to the rest of the case containing the PCB, but the new requirement to also attach the LCD to the front panel proved to make this unworkable. To stay in place the LCD has to be pushed onto mounting poles containing a catch mechanism, which requires a moderate amount of force, and applying that force to the LCD board when it is already connected to the PCB is essentially impossible.

case with lcd attached

As a result I ended up completely flipping the design such that the front panel is a single piece of plastic that also encompasses the walls of the case and contains appropriate mounting stakes for both the screen and the main PCB. Getting to this design hugely simplified the assembly process. Starting with an empty case lying face down on a bench, the LCD screen is pushed onto the mounting poles and sits flush with the cover of the case, easily achieved without the main PCB yet in place.

case with pcb in place

Next, the main PCB is gently lowered into the case facing downwards; it sits on the mounting pole in each corner with the pins for the LCD just protruding through the appropriate holes in the PCB, ready to be quickly soldered into place (this took significant iteration and tuning of dimensions/positioning to achieve!). Finally, a back panel can be attached which holds the PCB in place and uses cantilever snap joints to click onto the rest of the case. Overall the design is a huge improvement over the previous case, which required screws and spacers to position the PCB and cover relative to the rest of the case, with the spacers and screws being particularly fiddly to work with.

The major concern I had with the new design was that the mount attaching the monitor to the wall has moved from being attached to the main case and components directly to needing to be on the removable back panel: if the clips holding this panel to the case fail, the core part of the monitor will fall off the wall, which would not be good. To guard against this I've doubled the size and number of clips at the top of the case (which bears the weight), and the result seems very robust in my testing. To completely assemble a monitor, including the soldering step, takes me about 2-3 minutes individually, and it would be even quicker if working in batches.

Production

Given the number of design/testing iterations required to fine-tune the case, I chose not to outsource case production for now and used my 3D printer to produce them. I've successfully used JLCPCB's 3D printing service for the previous case design, so I'm confident that getting sufficient cases printed by JLCPCB or another supplier will not be an issue now that the design is finalised.

completed monitor

I tried a variety of filament colours, but settled on a transparent filament which, once combined in the necessary layers to form the case, is not actually transparent like perspex, but provides a nice translucent medium which achieves the goal of having the light colour visible without exposing all of the circuit board detail. There's room for future improvement in the positioning of the LEDs on the circuit board to provide a more even distribution of light across the case, but overall I really like the way the completed monitor ends up looking.

Evaluation

Building this monitor has been a really fun project, both in seeing something progress from an idea, to plans on a screen, to a nice physical thing on my wall, but also in learning and developing a bunch of new skills in PCB design, assembly and 3D design.

completed monitor

The goal of having a CO2 monitor for which I can outsource the vast majority of production is as close to being met as I think is possible without undertaking the final proof of placing a large order. I've satisfied myself that each step is feasible and that the final assembly process is quick, easy and well below the level of effort and time it was taking me to produce the original monitors. Cost-wise it's also a huge win, primarily in terms of the time taken, but also in the raw components: currently the five prototypes I ordered and built are on par with the component cost of the original CO2 monitor, but this will drop further with larger orders due to price breaks and amortisation of the setup and shipping expenses across more monitors.

This project has also given me a much better appreciation for how much I'm only just scratching the surface of the potential complexities and challenges in producing a hardware product of this type. I'm reasonably confident I could successfully produce a few hundred and maybe even a few thousand monitors using this approach, but it's also clear that getting beyond that point would be a whole further level of effort and learning. Hardware is hard work. That's not news to anyone, including me, but there is something to be said for experiencing the process first-hand to make the reality of what's required real.

The PCB and case designs are both shared and can be found at https://github.com/co2monnz/co2monitor-pcb and https://github.com/co2monnz/cad; feedback and suggestions are welcome!

13 February 2023

Vincent Bernat: Building a SQL-like language to filter flows

Akvorado collects network flows using IPFIX or sFlow. It stores them in a ClickHouse database. A web console allows a user to query the data and plot some graphs. A nice aspect of this console is how we can filter flows with a SQL-like language:
Filter editor in Akvorado console
Often, web interfaces expose a query builder to build such filters. I think combining a SQL-like language with an editor supporting completion, syntax highlighting, and linting is a better approach.1 The language parser is built with pigeon (Go) from a parsing expression grammar or PEG. The editor component is CodeMirror (TypeScript).

Language parser

PEG grammars are relatively recent2 and are an alternative to context-free grammars. They are easier to write and they can generate better error messages. Python switched from an LL(1)-based parser to a PEG-based parser in Python 3.9. pigeon generates a parser for Go. A grammar is a set of rules. Each rule is an identifier, with an optional user-friendly label for error messages, an expression, and an action in Go to be executed on match. You can find the complete grammar in parser.peg. Here is a simplified rule:
ConditionIPExpr "condition on IP" ←
  column:("ExporterAddress"i { return "ExporterAddress", nil }
        / "SrcAddr"i { return "SrcAddr", nil }
        / "DstAddr"i { return "DstAddr", nil }) _
  operator:("=" / "!=") _
  ip:IP {
    return fmt.Sprintf("%s %s IPv6StringToNum(%s)",
      toString(column), toString(operator), quote(ip)), nil
  }
The rule identifier is ConditionIPExpr. It case-insensitively matches ExporterAddress, SrcAddr, or DstAddr. The action for each case returns the proper case for the column name. That's what is stored in the column variable. Then, it matches one of the possible operators. As there is no code block, it stores the matched string directly in the operator variable. Then, it tries to match the IP rule, which is defined elsewhere in the grammar. If it succeeds, it stores the result of the match in the ip variable and executes the final action. The action turns the column, operator, and IP into a proper expression for ClickHouse. For example, if we have ExporterAddress = 203.0.113.15, we get ExporterAddress = IPv6StringToNum('203.0.113.15'). The IP rule uses a rudimentary regular expression but checks if the matched address is correct in the action block, thanks to netip.ParseAddr():
IP "IP address"   [0-9A-Fa-f:.]+  
  ip, err := netip.ParseAddr(string(c.text))
  if err != nil  
    return "", errors.New("expecting an IP address")
   
  return ip.String(), nil
 
Our parser safely turns the filter into a WHERE clause accepted by ClickHouse:3
WHERE InIfBoundary = 'external' 
AND ExporterRegion = 'france' 
AND InIfConnectivity = 'transit' 
AND SrcAS = 15169 
AND DstAddr BETWEEN toIPv6('2a01:e0f:ffff::') 
                AND toIPv6('2a01:e0f:ffff:ffff:ffff:ffff:ffff:ffff')
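
For completeness, here is roughly how the pigeon-generated parser gets invoked to produce such a string. This is a sketch under assumptions: pigeon emits a package-level Parse() function, but the import path and package name here are illustrative, not Akvorado's actual layout.

package main

import (
	"fmt"
	"log"

	filter "example.com/akvorado/filter" // illustrative import of the generated parser
)

func main() {
	// Parse is the entry point pigeon generates from parser.peg.
	out, err := filter.Parse("filter", []byte("ExporterAddress = 203.0.113.15"))
	if err != nil {
		log.Fatal(err)
	}
	// The grammar actions return a ClickHouse-ready expression:
	//   ExporterAddress = IPv6StringToNum('203.0.113.15')
	fmt.Println(out.(string))
}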

Integration in CodeMirror

CodeMirror is a versatile code editor that can be easily integrated into JavaScript projects. In Akvorado, the Vue.js component InputFilter uses CodeMirror as its foundation and leverages features such as syntax highlighting, linting, and completion. The source code for these capabilities can be found in the codemirror/lang-filter/ directory.

Syntax highlighting

The PEG grammar for Go cannot be utilized directly4 and the requirements for parsers for editors are distinct: they should be error-tolerant and operate incrementally, as code is typically updated character by character. CodeMirror offers a solution through its own parser generator, Lezer. We don't need this additional parser to fully understand the filter language. Only the basic structure is needed: column names, comparison and logic operators, quoted and unquoted values. The grammar is therefore quite short and does not need to be updated often:
@top Filter {
  expression
}
expression {
 Not expression |
 "(" expression ")" |
 "(" expression ")" And expression |
 "(" expression ")" Or expression |
 comparisonExpression And expression |
 comparisonExpression Or expression |
 comparisonExpression
}
comparisonExpression {
 Column Operator Value
}
Value {
  String | Literal | ValueLParen ListOfValues ValueRParen
}
ListOfValues {
  ListOfValues ValueComma (String | Literal) |
  String | Literal
}

// […]
@tokens {
  // […]
  Column { std.asciiLetter (std.asciiLetter | std.digit)* }
  Operator { $[a-zA-Z!=><]+ }
  String {
    '"' (![\\\n"] | "\\" _)* '"'? |
    "'" (![\\\n'] | "\\" _)* "'"?
  }
  Literal { (std.digit | std.asciiLetter | $[.:/])+ }
  // […]
}
The expression SrcAS = 12322 AND (DstAS = 1299 OR SrcAS = 29447) is parsed to:
Filter(Column, Operator, Value(Literal),
  And, Column, Operator, Value(Literal),
  Or, Column, Operator, Value(Literal))
The last step is to teach CodeMirror how to map each token to a highlighting tag:
export const FilterLanguage = LRLanguage.define({
  parser: parser.configure({
    props: [
      styleTags({
        Column: t.propertyName,
        String: t.string,
        Literal: t.literal,
        LineComment: t.lineComment,
        BlockComment: t.blockComment,
        Or: t.logicOperator,
        And: t.logicOperator,
        Not: t.logicOperator,
        Operator: t.compareOperator,
        "( )": t.paren,
      }),
    ],
  }),
});
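
To give an idea of how such a language definition plugs into an editor, here is a wiring sketch. The import path for FilterLanguage and the plain LanguageSupport wrapper are assumptions for illustration; Akvorado's actual InputFilter component wires in linting and completion as well.

import { EditorView, basicSetup } from "codemirror";
import { LanguageSupport } from "@codemirror/language";
import { FilterLanguage } from "./codemirror/lang-filter"; // illustrative path

// Create an editor whose content is highlighted via the Lezer grammar above.
const view = new EditorView({
  doc: "InIfBoundary = external AND SrcAS = 15169",
  extensions: [basicSetup, new LanguageSupport(FilterLanguage)],
  parent: document.body,
});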

Linting

We offload linting to the original parser in Go. The /api/v0/console/filter/validate endpoint accepts a filter and returns a JSON structure with the errors that were found:
{
  "message": "at line 1, position 12: string literal not terminated",
  "errors": [{
    "line":    1,
    "column":  12,
    "offset":  11,
    "message": "string literal not terminated"
  }]
}
The linter source for CodeMirror queries the API and turns each error into a diagnostic.
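
A sketch of what that glue can look like, assuming the endpoint above; the request shape and the exact field mapping are illustrative:

import { linter, Diagnostic } from "@codemirror/lint";

// Ask the Go parser to validate the filter and surface its errors inline.
const filterLinter = linter(async (view): Promise<Diagnostic[]> => {
  const response = await fetch("/api/v0/console/filter/validate", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ filter: view.state.doc.toString() }),
  });
  const result = await response.json();
  return (result.errors ?? []).map((err: any): Diagnostic => ({
    from: err.offset, // 0-based offset as reported by the parser
    to: Math.min(err.offset + 1, view.state.doc.length),
    severity: "error",
    message: err.message,
  }));
});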

Completion

The completion system takes a hybrid approach. It splits the work between the frontend and the backend to offer useful suggestions for completing filters. The frontend uses the parser built with Lezer to determine the context of the completion: do we complete a column name, an operator, or a value? It also extracts the column name if we are completing something else. It forwards the result to the backend through the /api/v0/console/filter/complete endpoint. Walking the syntax tree was not as easy as I thought, but unit tests helped a lot. The backend uses the parser generated by pigeon to complete a column name or a comparison operator. For values, the completions are either static or extracted from the ClickHouse database. A user can complete an AS number from an organization name thanks to the following snippet:
results := []struct {
  Label  string `ch:"label"`
  Detail string `ch:"detail"`
}{}
columnName := "DstAS"
sqlQuery := fmt.Sprintf(`
 SELECT concat('AS', toString(%s)) AS label, dictGet('asns', 'name', %s) AS detail
 FROM flows
 WHERE TimeReceived > date_sub(minute, 1, now())
 AND detail != ''
 AND positionCaseInsensitive(detail, $1) >= 1
 GROUP BY label, detail
 ORDER BY COUNT(*) DESC
 LIMIT 20
`, columnName, columnName)
if err := conn.Select(ctx, &results, sqlQuery, input.Prefix); err != nil {
  c.r.Err(err).Msg("unable to query database")
  break
}
for _, result := range results {
  completions = append(completions, filterCompletion{
    Label:  result.Label,
    Detail: result.Detail,
    Quoted: false,
  })
}
In my opinion, the completion system is a major factor in making the filter editor an efficient way to select flows. While a query builder may have been more beginner-friendly, the completion system's ease of use and functionality make it more enjoyable to use once you become familiar with it.

  1. Moreover, building a query builder did not seem like a fun task for me.
  2. They were introduced in 2004 in Parsing Expression Grammars: A Recognition-Based Syntactic Foundation. LR parsers were introduced in 1965, LALR parsers in 1969, and LL parsers in the 1970s. Yacc, a popular parser generator, was written in 1975.
  3. The parser returns a string. It does not generate an intermediate AST. This makes it simpler and it currently fits our needs.
  4. It could be manually translated to JavaScript with PEG.js.

2 February 2023

Matt Brown: 2023 Writing Plan

To achieve my goal of publishing one high-quality piece of writing per week this year, I've put together a draft writing plan and a few organisational notes. Please let me know what you think: what's missing? What would you like to read more/less of from me? I aim for each piece of writing to generate discussion, inspire further writing, and raise my visibility and profile with potential customers and peers. Some of the writing will be opinion, but I expect the majority of it will take a learning-by-teaching approach, aiming to explain and present useful information to the reader while helping me learn more!

Topic Backlog

The majority of my writing is going to fit into 4 series, allowing me to plan out a set of posts and a narrative rather than having to come up with something novel to write about every week. To start with, for Feb, my aim is to get an initial post in each series out the door. Long-term, it's likely that the order of posts will reflect my work focus (e.g. if I'm spending a few weeks deep-diving into a particular product idea then expect more writing on that), but I will try and maintain some variety across the different series as well. This backlog will be maintained as a living page at https://www.mattb.nz/w/queue.

Thoughts on SRE

This series of posts will be pitched primarily at potential consulting customers who want to understand how I approach the development and operations of distributed software systems. Initial topics to cover include:
  • What is SRE? My philosophy on how it relates to DevOps, Platform Engineering and various other hot terms.
  • How SRE scales up and down in size.
  • My approach to managing oncall responsibilities, toil and operational work.
  • How to grow an SRE team, including the common futility of "SRE transformations".
  • Learning from incidents, postmortems, incident response, etc.
Business plan drafts

I have an ever-growing list of potential software opportunities and products which I think would be fun to build, but which generally don't ever leave my head due to a lack of time to develop the idea, or being unable to convince myself that there's a viable business case or market for it. I'd like to start sharing some very rudimentary business plan sketches for some of these ideas as a way of getting some feedback on my assessment of their potential. Whether that's confirmation that it's not worth pursuing, an expression of interest in the product, or potential partnership/collaboration opportunities, anything is better than the idea just sitting in my head. Initial ideas include:
  • Business oriented Mastodon hosting.
  • PDF e-signing - e.g. a Docusign competitor, but with a local twist through RealMe or driver's license validation.
  • A framework to enable simple, performant per-tenant at-rest encryption for SaaS products - stop the data leaks.
Product development updates

For any product ideas that show merit and develop into a project, and particularly for the existing product ideas I've already committed to exploring, I plan to document my product investigation and market research findings as a way of structuring and driving my learning in the space. To start with, this will involve:
  • A series of explanatory posts diving into how NZ's electricity system works, with a particular focus on how the operational data that will be critical to managing a more dynamic grid flows (or doesn't flow!) today, and what opportunities or needs exist for generating, managing or distributing data that might be solvable with a software system I could build.
  • A series of product reviews and deep dives into existing farm management software and platforms in use by NZ farmers today, looking at the functionality they provide, how they integrate, and generally testing the anecdotal feedback I have to date that they're clunky, hard to use and not well integrated.
  • For co2mon.nz the focus will be less on market research and more on exploring potential distribution channels (e.g. direct advertising vs partnership with air conditioning suppliers) and pricing models (e.g. buy vs rent).
Debugging walk-throughs

Being able to debug and fix a system that you're not intimately familiar with is a valuable skill and something that I've always enjoyed, but it's also a skill that I observe many engineers are uncomfortable with. There's a set of techniques and processes that I've honed and developed over the years for doing this which I think make the task of debugging an unfamiliar system more approachable. The idea is that each post will take a problem or situation I've encountered, from the initial symptom or problem report, and walk through the process of how to narrow down and identify the trigger or root cause of the behaviour, discussing the techniques used and their pros and cons along the way. In addition to learning about the process of debugging itself, the aim is to illustrate lessons that can be applied when designing and building software systems that facilitate and improve our experiences in the operational stage of a system's lifecycle, where debugging takes place.

Miscellaneous topics

In addition to the regular series above, stand-alone posts on other topics may include:
  • The pros/cons I see of bootstrapping a business vs taking VC or other funding.
  • Thoughts on remote work and hiring staff.
  • AI - a confessional on how I didn't think it would progress in my lifetime, but maybe I was wrong.
  • Reflections on 15 years at Google and thoughts on subsequent events since my departure.
  • AWS vs GCP. Fight! Or with less click-bait, a level-headed comparison of the pros/cons I see in each platform.

Logistics

Discussion and comments

A large part of my motivation for writing regularly is to seek feedback and generate discussion on these topics. Typically this is done by including comment functionality within the website itself. I've decided not to do this: on-site commenting creates extra infrastructure to maintain, and limits the visibility and breadth of discussion to existing readers and followers. To provide opportunities for comment and feedback, I plan to share and post notifications and summarised snippets of selected posts to various social media platforms. Links to these social media posts will be added to each piece of writing to provide a path for readers to engage and discuss further, while enabling the discussion and visibility of the post to grow and extend beyond my direct followers and subscribers. My current thinking is that I'll distribute via the following platforms:
  • Mastodon @matt@mastodon.nz - every post.
  • Twitter @xleem - selected posts. I'm trying to reduce Twitter usage in favour of Mastodon, but there's no denying that it's still where a significant number of people and discussions are happening.
  • LinkedIn - probably primarily for posts in the business plan series, and notable milestones in the product development process.
In all cases, my aim will be to post a short teaser or summary paragraph that poses a question or relays an interesting fact, to give some immediate value and a signal to readers as to whether they want to click through, rather than simply spamming links into the feed.

Feedback

In addition to social media discussion, I also plan to add a direct feedback path, particularly for readers who don't have the time or inclination to participate in written discussion, by adding a simple thumbs up/thumbs down feedback widget to the bottom of each post, including those delivered via RSS and email.

Organisation

To enable subscription to subsets of my writing (particularly for places like Planet Debian, etc., where the more business-focused content is likely to be off-topic), I plan to place each post into a set of categories:
  • Business
  • Technology
  • General
In addition to the categories, I'll also use more free-form tags to group writing with linked themes or that falls within one of the series described above.

30 December 2022

Chris Lamb: Favourite books of 2022: Non-fiction

In my three most recent posts, I went over the memoirs and biographies, classics and fiction books that I enjoyed the most in 2022. But in the last of my book-related posts for 2022, I'll be going over my favourite works of non-fiction. Books that just missed the cut here include Adam Hochschild's King Leopold's Ghost (1998) on the role of Leopold II of Belgium in the Congo Free State, Johann Hari's Stolen Focus (2022) (a personal memoir on how technology is increasingly fragmenting our attention), Amia Srinivasan's The Right to Sex (2021) (a misleadingly named set of philosophic essays on feminism), Dana Heller et al.'s The Selling of 9/11: How a National Tragedy Became a Commodity (2005), John Berger's mindbending Ways of Seeing (1972) and Louise Richardson's What Terrorists Want (2006).

The Great War and Modern Memory (1975)
Wartime: Understanding and Behavior in the Second World War (1989)
Paul Fussell

Rather than describe the battles, weapons, geopolitics or big personalities of the two World Wars, Paul Fussell's The Great War and Modern Memory & Wartime are focused instead on how the two wars have been remembered by their everyday participants. Drawing on the memoirs and memories of soldiers and civilians along with a brief comparison with the actual events that shaped them, Fussell's two books are a compassionate, insightful and moving piece of analysis. Fussell primarily sets himself against the admixture of nostalgia and trauma that obscures the origins and unimaginable experience of participating in these wars; two wars that were, in his view, a "perceptual and rhetorical scandal from which total recovery is unlikely." He takes particular aim at the dishonesty of hindsight:
For the past fifty years, the Allied war has been sanitised and romanticised almost beyond recognition by the sentimental, the loony patriotic, the ignorant and the bloodthirsty. I have tried to balance the scales. [And] in unbombed America especially, the meaning of the war [seems] inaccessible.
The author does not engage in any of the customary rose-tinted view of war, yet he remains understanding and compassionate towards those who try to locate a reason within what was quite often senseless barbarism. If anything, his despondency and pessimism about the Second World War (the war that Fussell himself fought in) shines through quite acutely, and this is especially the case in what he chooses to quote from others:
"It was common [ ] throughout the [Okinawa] campaign for replacements to get hit before we even knew their names. They came up confused, frightened, and hopeful, got wounded or killed, and went right back to the rear on the route by which they had come, shocked, bleeding, or stiff. They were forlorn figures coming up to the meat grinder and going right back out of it like homeless waifs, unknown and faceless to us, like unread books on a shelf."
It would take a rather heartless reader to fail to be sobered by this final simile, and an even colder one to view Fussell's citation of such an emotive anecdote to be manipulative. Still, stories and cruel ironies like this one infuse this often-angry book, but it is not without astute and shrewd analysis as well, especially on the many qualitative differences between the two conflicts that simply cannot be captured by facts and figures alone. For example:
A measure of the psychological distance of the Second [World] War from the First is the rarity, in 1914-1918, of drinking and drunkenness poems.
Indeed so. In fact, what makes Fussell's project so compelling and perhaps even unique is that he uses these non-quantitative measures to try and take stock of what happened. After all, this was a war conducted by humans, not the abstract school of statistics. And what is the value of a list of armaments destroyed by such-and-such a regiment when compared with truly consequential insights into both how the war affected, say, the psychology of postwar literature ("Prolonged trench warfare, whether enacted or remembered, fosters paranoid melodrama, which I take to be a primary mode in modern writing."), the specific words adopted by combatants ("It is a truism of military propaganda that monosyllabic enemies are easier to despise than others"), as well as the very grammar of interaction:
The Field Service Post Card [in WW1] has the honour of being the first widespread exemplar of that kind of document which uniquely characterises the modern world: the "Form". [And] as the first widely known example of dehumanised, automated communication, the post card popularised a mode of rhetoric indispensable to the conduct of later wars fought by great faceless conscripted armies.
And this wouldn't be a book review without argument-ending observations that:
Indicative of the German wartime conception [of victory] would be Hitler and Speer's elaborate plans for the ultimate reconstruction of Berlin, which made no provision for a library.
Our myths about the two world wars possess an undisputed power, in part because they contain an essential truth: the atrocities committed by Germany and its allies were not merely extreme or revolting, but their full dimensions (embodied in the Holocaust and the Holodomor) remain essentially inaccessible within our current ideological framework. Yet the two wars are better understood as an abyss in which we were all dragged into the depths of moral depravity, rather than a battle pitched by the forces of light against the forces of darkness. Fussell is one of the few observers who can truly accept and understand this truth and is still able to speak to us cogently on the topic from the vantage point of experience. The Second World War, which looms so large in our contemporary understanding of the modern world (see below), may have been necessary and unavoidable, but Fussell convinces his reader that it was morally complicated "beyond the power of any literary or philosophic analysis to suggest," and that the only way to maintain a naïve belief in the myth that these wars were a Manichaean fight between good and evil is to overlook reality. There are many texts on the two World Wars that can either stir the intellect or move the emotions, but Fussell's two books do both. A uniquely perceptive and intelligent commentary; outstanding.

Longitude (1995)
Dava Sobel

Since Man first decided to sail the oceans, knowing one's location has always been critical. Yet doing so reliably used to be a serious problem: if you didn't know where you were, you were far more likely to die and/or lose your valuable cargo. But whilst finding one's latitude (i.e. your north-south position) had effectively been solved by the beginning of the 17th century, finding one's (east-west) longitude was far from trustworthy in comparison. This book, first published in 1995, is therefore something of an anachronism. As in, we readily use the GPS facilities of our phones today without hesitation, so we find it difficult to imagine a reality in which knowing something as fundamental as your own location is essentially unthinkable.

It became clear in the 18th century, though, that in order to accurately determine one's longitude, what you actually needed was an accurate clock. In Longitude, therefore, we read of the remarkable story of John Harrison and his quest to create a timepiece that would not only keep time during a long sea voyage but would survive the rough ocean conditions as well. Self-educated and a carpenter by trade, Harrison made a number of important breakthroughs in keeping accurate time at sea, and Longitude describes his novel breakthroughs in a way that is both engaging and without talking down to the reader. Still, this book covers much more than that, including the development of accurate longitude going hand-in-hand with advancements in cartography as well as in scientific experiments to determine the speed of light: experiments that led to the formulation of quantum mechanics. It also outlines the work being done by Harrison's competitors. 'Competitors' is indeed the correct word here, as Parliament offered a huge prize to whoever could create such a device, and the ramifications of this tremendous financial incentive are an essential part of this story.

For the most part, though, Longitude sticks to the story of Harrison and his evolving obsession with creating the perfect timepiece. Indeed, one reason that Longitude is so resonant with readers is that many of the tropes of the archetypical 'English inventor' are embedded within Harrison himself. That is to say, here is a self-made man pushing against the establishment of the time, with his groundbreaking ideas being underappreciated in his life, or dishonestly purloined by his intellectual inferiors. At the level of allegory, then, I am minded to interpret this portrait of Harrison as a symbolic distillation of postwar Britain: a nation acutely embarrassed by the loss of the Empire that is now repositioning itself as a resourceful but plucky underdog; a country that, with a combination of the brains of boffins and a healthy dose of charisma and PR, can still keep up with the big boys. (It is this same search for postimperial meaning I find in the fiction of John le Carré, and, far more famously, in the James Bond franchise.) All of this is left to the reader, of course, as what makes Longitude singularly compelling is its gentle manner and tone. Indeed, at times it was as if the doyenne of sci-fi Ursula K. Le Guin had a sideline in popular non-fiction. I realise it's a mark of critical distinction to downgrade the importance of popular science in favour of erudite academic texts, but Longitude is ample evidence that so-called 'pop' science need not be patronising or reductive at all.

Closed Chambers: The Rise, Fall, and Future of the Modern Supreme Court (1998)
Edward Lazarus

After the landmark decision by the U.S. Supreme Court in Dobbs v. Jackson Women's Health Organization that ended the Constitutional right to abortion conferred by Roe v. Wade, I prioritised a few books in the queue about the judicial branch of the United States. One of these books was Closed Chambers, which attempts to assay, according to its subtitle, "The Rise, Fall and Future of the Modern Supreme Court".

This book is not merely a learned guide to the history and functioning of the Court (although it is completely creditable in this respect); it's actually an 'insider' view of the workings of the institution, as Lazarus was a clerk for Justice Harry Blackmun during the October term of 1988. Lazarus has therefore combined his experience as a clerk and his personal reflections (along with a substantial body of subsequent research) in order to communicate the collapse in comity between the Justices. Part of this book is therefore a pure history of the Court, detailing its important nineteenth-century judgements (such as Dred Scott, which ruled that the Constitution did not consider Blacks to be citizens; and Plessy v. Ferguson, which failed to find protection in the Constitution against racial segregation laws), as well as many twentieth-century cases that touch on the rather technical principle of substantive due process.

Other layers of Lazarus' book are explicitly opinionated, however, and they capture the author's assessment of the Court's actions in the past and present [1998] day. Given the role in which he served at the Court, particular attention is given by Lazarus to the function of its clerks. These are revealed as being far more than the mere amanuenses they were hitherto believed to be. Indeed, the book is potentially unique in its claim that the clerks have played a pivotal role in the deliberations, machinations and eventual rulings of the Court. By implication, then, the clerks have played a crucial role in the internal controversies that surround many of the high-profile Supreme Court decisions; decisions that, to the outsider at least, are presented as disinterested interpretations of the Constitution of the United States. This is of especial importance given that, to Lazarus, "for all the attention we now pay to it, the Court remains shrouded in confusion and misunderstanding."

Throughout his book, Lazarus complicates the commonplace view that the Court is divided into two simple right vs. left political factions, and instead documents an ever-evolving series of loosely held but strongly felt cabals, quid pro quo exchanges, outright equivocation and pure personal prejudices. (The age and concomitant illnesses of the Justices also appear to have a not insignificant effect on the Court's rulings as well.) In other words, Closed Chambers is not a book that will be read in a typical civics class in America, and the only time the book resorts to the customary breathless rhetoric about the US federal government is in its opening chapter:
The Court itself, a Greek-style temple commanding the crest of Capitol Hill, loomed above them in the dim light of the storm. Set atop a broad marble plaza and thirty-six steps, the Court stands in splendid isolation appropriate to its place at the pinnacle of the national judiciary, one of the three independent and "coequal" branches of American government. Once dubbed the Ivory Tower by architecture critics, the Court has a Corinthian colonnade and massive twenty-foot-high bronze doors that guard the single most powerful judicial institution in the Western world. Lights still shone in several offices to the right of the Court's entrance, and [...]
Et cetera, et cetera. But, of course, this encomium to the inherent 'nobility' of the Supreme Court is quickly revealed to be a narrative foil, as Lazarus soon razes this dangerously naïve conception to the ground:
[The] institution is [now] broken into unyielding factions that have largely given up on a meaningful exchange of their respective views or, for that matter, a meaningful explication or defense of their own views. It is of Justices who in many important cases resort to transparently deceitful and hypocritical arguments and factual distortions as they discard judicial philosophy and consistent interpretation in favor of bottom-line results. This is a Court so badly splintered, yet so intent on lawmaking, that shifting 5-4 majorities, or even mere pluralities, rewrite whole swaths of constitutional law on the authority of a single, often idiosyncratic vote. It is also a Court where Justices yield great and excessive power to immature, ideologically driven clerks, who in turn use that power to manipulate their bosses and the institution they ostensibly serve.
Lazarus does not put forward a single, overarching thesis, but in the final chapters, he does suggest a potential future for the Court:
In the short run, the cure for what ails the Court lies solely with the Justices. It is their duty, under the shield of life tenure, to recognize the pathologies affecting their work and to restore the vitality of American constitutionalism. Ultimately, though, the long-term health of the Court depends on our own resolve, on whom [we] select to join that institution.
Back in 1998, Lazarus might have had room for this qualified optimism. But from the vantage point of 2022, it appears that the "resolve" of the United States citizenry was not muscular enough to meet his challenge. After all, Lazarus was writing before Bush v. Gore in 2000, which arrogated to the judicial branch the ability to decide a presidential election; before the disillusionment that followed the Senate's refusal to consider Barack Obama's nominated replacement for Scalia; and before many other missteps at the Court as well. All of which have now been compounded by the Trump administration's appointment of three Republican-friendly justices to the Court, including the hypocritical appointment of Justice Barrett a mere 38 days before the 2020 election. And, of course, the leaking and ruling in Dobbs v. Jackson, the true extent of which has not yet been felt.

None of this is Lazarus' fault, of course, but the Court's recent decisions (as well as the liberal hagiographies of 'RBG') must perforce affect one's reading of the concluding chapters. The other slight defect of Closed Chambers is that, whilst it often implies the importance of the federal and state courts within the judiciary, it only briefly positions the Supreme Court's decisions in relation to what was happening in the House, Senate and White House at the time. This seems increasingly relevant as time goes on: after all, it seems fairly clear even to this Brit that relying on an activist Supreme Court to enact progressive laws must be interpreted as a failure of the legislative branch to overcome the perennial problems of the filibuster, culture wars and partisan bickering.

Nevertheless, Lazarus' book is in equal parts ambitious, opinionated, scholarly and, dare I admit it, wonderfully gossipy. By juxtaposing history, memoir, and analysis, Closed Chambers combines an exacting evaluation of the Court's decisions with a lively portrait of the intellectual and emotional intensity that has grown within the Supreme Court's pseudo-monastic environment, all while it struggles with the most impactful legal issues of the day. This book is an excellent and well-written achievement that will likely never be repeated, and a must-read for anyone interested in this ever more important branch of the US government.

Crashed: How a Decade of Financial Crises Changed the World (2018)
Shutdown: How Covid Shook the World's Economy (2021) Adam Tooze

The economic historian Adam Tooze has often been labelled an unlikely celebrity, but in the fourteen years since the global financial crisis of 2008, a growing audience has been looking for answers about the various failures of the modern economy. Tooze, a professor of history at New York's Columbia University, has written much that is penetrating and thought-provoking on this topic, and as a result, he has generated something of a cult following amongst economists, historians and the online left.

I actually read two Tooze books this year. The first, Crashed (2018), catalogues the scale of government intervention required to prop up global finance after the 2008 financial crisis, and it characterises the different ways that countries around the world failed to rise to the situation, such as doing far too little, or taking action far too late. The connections between the high-risk subprime loans, the credit default swaps and the resulting liquidity crisis in the US in late 2008 are fairly well known today, in part thanks to films such as Adam McKay's 2015 The Big Short and much-improved economic literacy in media reportage. But Crashed makes the implicit claim that, whilst the specific and structural origins of the 2008 crisis are worth scrutinising in exacting detail, it is the reaction of states in the months and years after the crash that has been overlooked as a result. After all, this is a reaction that has not only shaped a new economic order, it has created one that does not fit any conventional idea about the way the world 'ought' to be run. Tooze connects the original American banking crisis to the (multiple) European debt crises, and both to a larger crisis of liberalism. Indeed, Tooze somehow manages to cover all these topics and more, weaving in Trump, Brexit and Russia's 2014 annexation of Crimea, as well as the evolving role of China in the post-2008 economic order.

Where Crashed focused on the constellation of consequences that followed the events of 2008, Shutdown is a clear and comprehensive account of the way the world responded to the economic impact of Covid-19. The figures are often jaw-dropping: soon after the disease spread around the world, 95% of the world's economies contracted simultaneously, and at one point, the global economy shrank by approximately 20%. Tooze's keen and sobering analysis of what happened is made all the more remarkable by the fact that it came out whilst the pandemic was still unfolding. In fact, this leads quickly to one of the book's few flaws: by being published so quickly, Shutdown prematurely over-praises China's 'zero Covid' policy, and these remarks will make a reader today squirm in their chair. Still, despite the regularity of these references (after all, mentioning China is very useful when one is directly comparing economic figures in early 2021, for example), these are actually minor blemishes on the book's overall thesis. That is to say, Shutdown is not merely a retelling of what happened in such-and-such a country during the pandemic; it offers in effect a prediction about what might be coming next. Whilst the economic response to Covid averted what could easily have been another Great Depression (and thus showed that policymakers had learned some lessons from 2008), it did so only by truly discarding the economic rule book.
The by-product of inverting this set of written and unwritten conventions that have governed the world for the past 50 years, this 'Washington consensus' if you will, has yet to be fully felt. Of course, there are many parallels between these two books by Tooze. Both the liquidity crisis outlined in Crashed and the economic response to Covid in Shutdown exposed the fact that one of the central tenets of the modern economy (i.e. that financial markets can be trusted to regulate themselves) was entirely untrue, and likely had been false from the very beginning. And whilst Adam Tooze does not offer a singular piercing insight (conveying a sense of rigorous mastery instead), he may as well be asking whether we're simply going to lurch along from one crisis to the next, relying on the technocrats in power to fix problems when everything blows up again. The answer may very well be yes.

Looking for the Good War: American Amnesia and the Violent Pursuit of Happiness (2021) Elizabeth D. Samet

Elizabeth D. Samet's Looking for the Good War answers the following question: what would be the result if you asked a professor of English to disentangle the complex mythology we have about WW2 in the context of the recent US exit from Afghanistan? Samet's book acts as a twenty-first-century update of a kind to Paul Fussell's two books (reviewed above), as well as a deeper meditation on the idea that each new war is seen through the lens of the previous one. Indeed, like The Great War and Modern Memory (1975) and Wartime (1989), Samet's book is a perceptive work of demystification, but whilst Fussell seems to have been inspired by his own traumatic war experience, Samet is informed not only by her teaching of West Point military cadets but also by the physical and ontological wars that have occurred during her own lifetime. A more scholarly and dispassionate text is the result of Samet's relative distance from armed combat, but that doesn't mean Looking for the Good War lacks energy or inspiration. Samet shares John Adams' belief that no political project can entirely shed the innate corruptions of power and ambition, and so it is crucial to analyse and re-analyse the role of WW2 in contemporary American life. She is surely correct that the Second World War has been universally elevated as a special, 'good' war. Even those with exceptionally giddy minds seem to treat WW2 as hallowed:
It is nevertheless telling that one of the few occasions to which Trump responded with any kind of restraint while he was in office was the 75th anniversary of D-Day in 2019.
What is the source of this restraint, and what has nurtured its growth in the eight decades since WW2 began? Samet posits several reasons, including the fact that almost all of the media about the Second World War is not only suffused with symbolism and nostalgia but, less obviously, has been made by people who have no experience of the events that they depict. Take Stephen Ambrose, the author behind Steven Spielberg's Band of Brothers miniseries: "I was 10 years old when the war ended," Samet quotes Ambrose as saying. "I thought the returning veterans were giants who had saved the world from barbarism. I still think so. I remain a hero worshiper." If Looking for the Good War has a primary thesis, then, it is that childhood hero worship is no basis for a system of government, let alone a crusading foreign policy. There is a straight line (to quote this book's subtitle) from the "American Amnesia" that obscures the reality of war to the "Violent Pursuit of Happiness."

Samet's book doesn't merely provide a modern appendix to Fussell's two works, however, as it adds further layers and dimensions he overlooked. For example, Samet provides some excellent insight on the role of Western, gangster and superhero movies, and she is especially good when looking at noir films as a kind of kaleidoscopic response to the Second World War:
Noir is a world ruled by bad decisions but also by bad timing. Chance, which plays such a pivotal role in war, bleeds into this world, too.
Samet rightfully weaves the role of women into the narrative as well. Women in film noir are often celebrated as 'independent' and sassy, correctly reflecting the newfound independence they gained during WW2. But these 'liberated' roles are not exactly a ringing endorsement of that independence: the 'femme fatale', the 'tart' and so on reflect a kind of conditional freedom permitted to women by a post-war society still wedded to an outmoded honour culture. In effect, far from being novel and subversive, these roles for women actually underwrote the ambient cultural disapproval of women's presence in the workforce. Samet later connects this highly conditional independence with the liberation of Afghan women, which:
is inarguably one of the more palatable outcomes of our invasion, and the protection of women's rights has been invoked on the right and the left as an argument for staying the course in Afghanistan. How easily consequence is becoming justification. How flattering it will be one day to reimagine it as original objective.
Samet keeps a predominantly US angle throughout as well, for she ends her book with a chapter on the pseudohistorical Lost Cause of the Civil War. The legacy of the Civil War is still visible in the physical phenomena of Confederate statues, but it also exists in deep-rooted racial injustice that has been shrouded in euphemism and other psychological devices for over 150 years. Samet believes that a key part of what drives the American mythology about the Second World War is the way in which it subconsciously cleanses the horrors of brother-on-brother murder that were seen in the Civil War. This is a book that is not only of interest to historians of the Second World War; it is a work for anyone who wishes to understand almost any American historical event, social issue, politician or movie that has appeared since the end of WW2. That is, for better or worse, everyone on earth.

30 August 2022

John Goerzen: The PC & Internet Revolution in Rural America

Inspired by several others (such as Alex Schroeder's post and Szczeżuja's prompt), as well as a desire to get this down for my kids, I figure it's time to write a bit about living through the PC and Internet revolution where I did: outside a tiny town in rural Kansas. And, as I've been back in that same area for the past 15 years, I reflect some on the challenges that continue to play out. Although the stories from the others were primarily about getting online, I want to start by setting some background. Those of you that didn't grow up in the same era as I did probably never realized that a typical business PC setup might cost $10,000 in today's dollars, for instance. So let me start with the background.

Nothing was easy

This story begins in the 1980s. Somewhere around my Kindergarten year of school, around 1985, my parents bought a TRS-80 Color Computer 2 (aka CoCo II). It had 64K of RAM and used a TV for display and sound. This got you the computer and nothing more: no disk drive, and no joysticks (required by a number of games). So whenever the system powered down, or it hung and you had to power cycle it (a frequent event), you'd lose whatever you were doing and would have to re-enter the program, literally by typing it in. The floppy drive for the CoCo II cost more than the computer, and it was quite common for people to buy the computer first and then the floppy drive later, once they'd saved up the money for it.

I particularly want to mention that computers then didn't come with a modem. That would be like buying a laptop or a tablet without wifi today. A modem, which I'll talk about in a bit, was another expensive accessory. To cobble together a system in the 80s that was capable of talking to others, with persistent storage (floppy, or hard drive), screen, keyboard, and modem, would be quite expensive. Adjusted for inflation, if you're talking a PC-style device (a clone of the IBM PC that ran DOS), this would easily be more expensive than the Macbook Pros of today. Few people back in the 80s had a computer at home. And the portion of those that had even the capability to get online in a meaningful way was even smaller.

Eventually my parents bought a PC clone with 640K RAM and dual floppy drives. This was primarily used for my mom's work, but I did my best to take it over whenever possible. It ran DOS and, despite its monochrome screen, was generally a more capable machine than the CoCo II. For instance, it supported lowercase. (I'm not even kidding; the CoCo II pretty much didn't.) A while later, they purchased a 32MB hard drive for it. What luxury!

Just getting a machine to work wasn't easy. Say you'd bought a PC, and then bought a hard drive and a modem. You didn't just plug in the hard drive and have it work. You would have to fight it every step of the way. The BIOS and DOS partition tables of the day used a cylinder/head/sector method of addressing the drive, and various parts of those addresses had too few bits to work with the big drives of the day, those above 20MB. So you would have to lie to the BIOS and fdisk in various ways, and sort of work out how to do it for each drive. For each peripheral (serial port, sound card in later years, etc.), you'd have to set jumpers for DMA and IRQs, hoping not to conflict with anything already in the system. Perhaps you can now start to see why USB and PCI were so welcomed.
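To make that addressing squeeze concrete, here is a small illustrative calculation in Python. The figures are the commonly cited BIOS INT 13h, ATA, and early DOS ceilings of the era, offered as an illustration rather than a record of any specific machine:

    # Illustrative arithmetic for the CHS limits described above; the
    # figures are the commonly cited ceilings of the era, not a record
    # of any particular drive or BIOS.
    SECTOR = 512  # bytes per sector on PC hard drives

    # BIOS INT 13h allotted 10 bits for cylinder, 8 for head, and 6 for
    # sector (sectors count from 1, so 63 usable values).
    bios = 1024 * 256 * 63 * SECTOR

    # The ATA drive interface allowed only 16 heads, and the two schemes
    # were applied together, so the usable intersection was far smaller:
    both = 1024 * 16 * 63 * SECTOR

    # Early DOS counted a partition's sectors in a 16-bit field:
    dos = 2**16 * SECTOR

    print(f"BIOS ceiling:        {bios / 2**30:.1f} GiB")  # ~7.9 GiB
    print(f"BIOS+ATA ceiling:    {both / 2**20:.0f} MiB")  # ~504 MiB
    print(f"Early DOS partition: {dos / 2**20:.0f} MiB")   # 32 MiB

That 32MiB DOS partition cap is why drives "above 20MB" were already trouble, and why lying to the BIOS and fdisk became a rite of passage.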

Sharing and finding resources

Despite the two computers in our home, it wasn't as if software written on one machine just ran on another. A lot of software for PC clones assumed a CGA color display. The monochrome HGC in our PC wasn't particularly compatible. You could find a TSR program to emulate the CGA on the HGC, but it wasn't particularly stable, and there's only so much you can do when a program that assumes color is displayed on a monitor that can only show black, dark amber, or light amber. So I'd periodically get to use other computers, most commonly at an office in the evening when it wasn't being used.

There were some local computer clubs that my dad took me to periodically. Software was swapped back then: disks copied, shareware exchanged, and so forth. For me, at least, there was no 'online' to download software from, and selling software over the Internet wasn't a thing at all.

Three Different Worlds

There were sort of three different worlds of computing experience in the 80s:
  1. Home users. Initially using a wide variety of machines from Apple, Commodore, Tandy/RadioShack, etc., but eventually coming to be mostly dominated by IBM PC clones
  2. Small and mid-sized business users. Some of them had larger minicomputers or small mainframes, but most that I had contact with by the early 90s were standardized on DOS-based PCs. More advanced ones had a network running Netware, most commonly. Networking hardware and software was generally too expensive for home users to use in the early days.
  3. Universities and large institutions. These are the places that had the mainframes, the earliest implementations of TCP/IP, the earliest users of UUCP, and so forth.
The difference between the home computing experience and the large institution experience was vast. Not only in terms of dollars (the large institution hardware could easily cost anywhere from tens of thousands to millions of dollars) but also in terms of sheer resources required (large rooms, enormous power circuits, support staff, etc). Nothing was in common between them: not operating systems, not software, not experience. I was never much aware of the third category until the differences started to collapse in the mid-90s, and even then I was only exposed to it once the collapse was well underway.

You might say to me, "Well, Google certainly isn't running what I'm running at home!" And, yes, of course, it's different. But fundamentally, most large datacenters are running on x86_64 hardware, with Linux as the operating system, and a TCP/IP network. It's a different scale, obviously, but at a fundamental level, the hardware and operating system stack are pretty similar to what you can readily run at home. Back in the 80s and 90s, this wasn't the case. TCP/IP wasn't even available for DOS or Windows until much later, and when it was, it was a clunky beast that was difficult to set up and use.

One of the things Kevin Driscoll highlights in his book Modem World (see my short post about it) is that the history of the Internet we usually receive is focused on case 3: the large institutions. In reality, the Internet was and is literally a network of networks. Gateways to and from the Internet existed from all three kinds of users for years, and while TCP/IP ultimately won the battle of the internetworking protocol, the other two streams of users also shaped the Internet as we now know it. Like many, I had no access to the large institution networks, but as I've been reflecting on my experiences, I've found a new appreciation for the way that those of us who grew up with primarily home PCs also shaped the evolution of today's online world.

An Era of Scarcity

I should take a moment to comment on the cost of software back then. A newspaper article from 1985 comments that WordPerfect, then the most powerful word processing program, sold for $495 (or $219 if you could score a mail order discount). That's $1360/$600 in 2022 money. Other popular software, such as Lotus 1-2-3, was up there as well. If you were to buy a new PC clone in the mid to late 80s, it would often cost $2000 in 1980s dollars. Now add a printer: a low-end dot matrix for $300, or a laser for $1500 or even more. A modem: another $300. So the basic system would be $3600, or $9900 in 2022 dollars. If you wanted a nice printer, you're now pushing well over $10,000 in 2022 dollars. You start to see one barrier here, and also why things like shareware and piracy (if it was indeed even recognized as such) were common in those days.

So you can see: going from a home computer setup (TRS-80, Commodore C64, Apple ][, etc) to a business-class PC setup was an order of magnitude increase in cost. From there to the high-end minis/mainframes was another order of magnitude (at least!) increase. Eventually there was price pressure on the higher end and things all got better, which is probably why the non-DOS PCs lasted until the early 90s.
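Those conversions are easy enough to sanity-check. A quick sketch: the ~2.75 multiplier below is simply what the $495-to-$1360 figure above implies, an approximation rather than an official CPI series:

    # Rough inflation check on the prices above. The multiplier is just
    # what the article's own $495 -> $1360 conversion implies; it is an
    # approximation, not an official CPI figure.
    MULTIPLIER_1985_TO_2022 = 2.75

    prices_1985 = {
        "WordPerfect": 495,
        "WordPerfect (mail order)": 219,
        "PC clone": 2000,
        "basic system": 3600,
    }
    for item, usd in prices_1985.items():
        in_2022 = usd * MULTIPLIER_1985_TO_2022
        print(f"{item}: ${usd} in 1985 ~= ${in_2022:,.0f} in 2022")

Run it and the output approximately matches the rounded figures quoted above.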

Increasing Capabilities

My first exposure to computers in school was in the 4th grade, when I would have been about 9. There was a single Apple ][ machine in that room. I primarily remember playing Oregon Trail on it. The next year, the school added a computer lab. Remember, this is a small rural area, so each graduating class might have about 25 people in it; this lab was shared by everyone in the K-8 building. It was full of some flavor of IBM PS/2 machines running DOS and Netware. There was a dedicated computer teacher too, though I think she was a regular teacher that was given somewhat minimal training on computers. We were going to learn typing that year, but I did so well on the very first typing program that we soon worked out that I could do programming instead. I started going to school early (these machines were far more powerful than the XT at home) and worked on programming projects there.

Eventually my parents bought me a Gateway 486SX/25 with a VGA monitor and hard drive. Wow! This was a whole different world. It may have come with Windows 3.0 or 3.1 on it, but I mainly remember running OS/2 on that machine. More on that below.

Programming

That CoCo II came with a BASIC interpreter in ROM. It came with a large manual, which served as a BASIC tutorial as well. The BASIC interpreter was also the shell, so literally you could not use the computer without at least a bit of BASIC. Once I had access to a DOS machine, it also had a BASIC interpreter: GW-BASIC. There was a fair bit of software written in BASIC at the time, but most of the more advanced software wasn't. I wondered how these .EXE and .COM programs were written. I could find vague references to DEBUG.EXE, assemblers, and such. But it wasn't until I got a copy of Turbo Pascal that I was able to do that sort of thing myself. Eventually I got Borland C++ and taught myself C as well. A few years later, I wanted to try writing GUI programs for Windows, and bought Watcom C++, which was much cheaper than the competition and could target Windows, DOS (and I think even OS/2). Notice that, aside from BASIC, none of this was free, and none of it was bundled. You couldn't just download a C compiler, or a Python interpreter, or whatnot back then. You had to pay for the ability to write any kind of serious code on the computer you already owned.

The Microsoft Domination

Microsoft came to dominate the PC landscape, and then even the computing landscape as a whole. IBM very quickly lost control over the hardware side of PCs as Compaq and others made clones, but Microsoft has managed, in varying degrees even to this day, to keep a stranglehold on the software, and especially the operating system, side. Yes, there was occasional talk of things like DR-DOS, but by and large the dominant platform came to be the PC, and if you had a PC, you ran DOS (and later Windows) from Microsoft. For awhile, it looked like IBM was going to challenge Microsoft on the operating system front; they had OS/2, and when I switched to it sometime around the version 2.1 era in 1993, it was unquestionably more advanced technically than the consumer-grade Windows from Microsoft at the time. It had Internet support baked in, could run most DOS and Windows programs, and had introduced a replacement for the by-then terrible FAT filesystem: HPFS, in 1988. Microsoft wouldn't introduce a better filesystem for its consumer operating systems until Windows XP in 2001, 13 years later. But more on that story later.

Free Software, Shareware, and Commercial Software

I've covered the high cost of software already. Obviously $500 software wasn't going to sell in the home market. So what did we have? Mainly, these things:
  1. Public domain software. It was free to use, and if implemented in BASIC, probably had source code with it too.
  2. Shareware
  3. Commercial software (some of it from small publishers was a lot cheaper than $500)
Let's talk about shareware. The idea with shareware was that a company would release a useful program, sometimes limited. You were encouraged to 'register', or pay for, it if you liked it and used it. And, regardless of whether you registered it or not, you were told: please copy! Sometimes shareware was fully functional, and registering it got you nothing more than printed manuals and an easy conscience (guilt trips for not registering weren't necessarily very subtle). Sometimes unregistered shareware would have a nag screen: a delay of a few seconds while they told you to register. Sometimes they'd be limited in some way; you'd get more features if you registered. With games, it was popular to have a trilogy, and release the first episode (inevitably ending with a cliffhanger) as shareware, with the subsequent episodes requiring registration. In any event, a lot of software people used in the 80s and 90s was shareware. Also pirated commercial software, though in the earlier days of computing, I think some people didn't even know the difference.

Notice what's missing: Free Software / FLOSS in the Richard Stallman sense of the word. Stallman lived in the big institution world (after all, he worked at MIT) and what he was doing with the Free Software Foundation and GNU project, beginning in 1983, never really filtered into the DOS/Windows world at the time. I had no awareness of it even existing until into the 90s, when I first started getting some hints of it as a port of gcc became available for OS/2. The Internet was what really brought this home, but I'm getting ahead of myself. I want to say again: FLOSS never really entered the DOS and Windows 3.x ecosystems. You'd see it make a few inroads here and there in later versions of Windows, and more so now that Microsoft has been sort of forced to accept it, but still, reflect on its legacy. What is the software market like in Windows compared to Linux, even today? Now it is, finally, time to talk about connectivity!

Getting On-Line

What does it even mean to 'get on line'? Certainly not connecting to a wifi access point. The answer is, unsurprisingly, complex. But for everyone except the large institutional users, it begins with a telephone.

The telephone system

By the 80s, there was one communication network that already reached into nearly every home in America: the phone system. Virtually every household (note I don't say every person) was uniquely identified by a 10-digit phone number. You could, at least in theory, call up virtually any other phone in the country and be connected in less than a minute.

But I've got to talk about cost. The way things worked in the USA, you paid a monthly fee for a phone line. Included in that monthly fee was unlimited local calling. What is a 'local call'? That was an extremely complex question. Generally it meant, roughly, calling within your city. But of course, as you deal with things like suburbs and cities growing into each other (eg, the Dallas-Ft. Worth metroplex), things got complicated fast. But let's just say for simplicity you could call others in your city. What about calling people not in your city? That was 'long distance', and you paid (often hugely) by the minute for it. Long distance rates were difficult to figure out, but were generally most expensive during business hours and cheapest at night or on weekends. Prices eventually started to come down when competition was introduced for long distance carriers, but even then you often were stuck with a single carrier for long distance calls outside your city but within your state. Anyhow, let's just leave it at this: local calls were virtually free, and long distance calls were extremely expensive.

Getting a modem

I remember getting a modem that ran at either 1200bps or 2400bps. Either way, quite slow; you could often read even plain text faster than the modem could display it. But what was a modem? A modem hooked up to a computer with a serial cable, and to the phone system. By the time I got one, modems could automatically dial and answer. You would send a command like ATDT5551212 and it would dial 555-1212. Modems had speakers, because often things wouldn't work right, and the telephone system was oriented around speech, so you could hear what was happening. You'd hear it wait for dial tone, then dial, then hopefully the remote end would ring, a modem there would answer, you'd hear the screeching of a handshake, and eventually your terminal would say CONNECT 2400. Now your computer was bridged to the other; anything going out your serial port was encoded as sound by your modem and decoded at the other end, and vice-versa. But what, exactly, was the other end? It might have been another person at their computer. Turn on local echo, and you could see what they typed. Maybe you'd send files to each other. But in my case, the answer was different: PC Magazine.
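Driving a modem from code looked roughly like this. A minimal sketch, assuming the third-party pyserial library, a modem on the first serial port, and a placeholder phone number; in practice, sessions of the day were driven by terminal programs rather than scripts:

    # Minimal sketch of dialing a Hayes-compatible modem. Assumes the
    # third-party pyserial library; the port name and phone number are
    # placeholders, not anything specific from the story above.
    import serial

    ser = serial.Serial("/dev/ttyS0", 2400, timeout=60)

    ser.write(b"ATZ\r")           # reset the modem to a known state
    print(ser.readline())         # expect b"OK"

    ser.write(b"ATDT5551212\r")   # dial 555-1212, tone dialing
    # The modem goes off-hook, waits for dial tone, dials, and (with
    # luck) reports the negotiated speed:
    print(ser.readline())         # hopefully b"CONNECT 2400"

    # From here, bytes written to ser come out of the remote serial
    # port, and vice versa; the modems turn them into sound in between.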

PC Magazine and CompuServe

Starting around 1986 (so I would have been about 6 years old), I got to read PC Magazine. My dad would bring home copies that were being discarded at his office for me to read, and I think eventually bought me a subscription directly. This was not just a standard magazine; it ran something like 350-400 pages an issue, and came out every other week. This thing was a monster. It had reviews of hardware and software, descriptions of upcoming technologies, pages and pages of ads (that often had some degree of being informative to them). And they had sections on programming. Many issues would talk about BASIC or Pascal programming, and there'd be a utility in most issues.

What do I mean by a 'utility in most issues'? Did they include a floppy disk with software? No, of course not. There was a literal program listing printed in the magazine. If you wanted the utility, you had to type it in. And a lot of them were written in assembler, so you had to have an assembler. An assembler, of course, was not free, and I didn't have one. Or maybe they wrote it in Microsoft C, and I had Borland C, and (of course) they weren't compatible. Sometimes they would list the program sort of in binary: line after line of a BASIC program, with lines like 64, 193, 253, 0, 53, 0, 87 that you would type in for hours, hopefully correctly. Running the BASIC program would, if you got it correct, emit a .COM file that you could then run. They did have a rudimentary checksum system built in, but it wasn't even a CRC, so something like swapping two numbers you'd never notice, except when the program would mysteriously hang. (A loose sketch of how such loaders worked follows at the end of this section.)

Eventually they teamed up with CompuServe to offer a limited slice of CompuServe for the purpose of downloading PC Magazine utilities. This was called PC MagNet. I am foggy on the details, but I believe that for a time you could connect to the limited PC MagNet part of CompuServe for free (after the cost of the long-distance call, that is) rather than paying for CompuServe itself (because, OF COURSE, that also charged you by the minute). So in the early days, I would get special permission from my parents to place a long distance call, and after some nerve-wracking minutes in which we were aware every minute was racking up charges, I could navigate the menus, download what I wanted, and log off immediately.

I still, incidentally, mourn what PC Magazine became. As with computing generally, it followed the mass market. It lost its deep technical chops, cut its programming columns, stopped talking about things like how SCSI worked, and so forth. By the time it stopped printing in 2009, it was no longer a square-bound 400-page behemoth, but rather looked more like a copy of Newsweek, but with less depth.
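As promised, here is a loose Python rendering of what those 'binary' type-in listings did. The byte values (a tiny but real DOS program that prints 'A' and exits) and the plain additive checksum are invented for illustration; the magazine's actual scheme differed, though, as noted above, it wasn't a CRC either:

    # A loose rendering of a magazine "type-in" loader: typed-in numbers
    # become an executable .COM file. The bytes and the additive
    # checksum are invented for illustration.
    DATA = [
        [180, 2, 178, 65, 205, 33],  # MOV AH,2 / MOV DL,'A' / INT 21h
        [205, 32],                   # INT 20h: return to DOS
    ]
    CHECKSUMS = [663, 237]           # printed beside each listing line

    program = bytearray()
    for line, want in zip(DATA, CHECKSUMS):
        # A plain sum: swap two numbers on a line and it still
        # "verifies", which is exactly the failure mode described above.
        if sum(line) != want:
            raise SystemExit("checksum mismatch: re-check your typing")
        program.extend(line)

    with open("UTIL.COM", "wb") as out:
        out.write(program)           # in principle, a runnable DOS .COM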

Continuing with CompuServe

CompuServe was a much larger service than just PC MagNet. Eventually, our family got a subscription. It was still an expensive and scarce resource; I'd call it only after hours, when the long-distance rates were cheapest. Everyone had a numerical username separated by commas; mine was 71510,1421. CompuServe had forums, and files. Eventually I would use TapCIS to queue up things I wanted to do offline, to minimize phone usage online. CompuServe eventually added a gateway to the Internet. For the sum of somewhere around $1 a message, you could send or receive an email from someone with an Internet email address! I remember the thrill of one time, as a kid of probably 11 years, sending a message to one of the editors of PC Magazine and getting a kind, if brief, reply back! But inevitably I had...

The Godzilla Phone Bill

Yes, one month I became lax in tracking my time online. I ran up my parents' phone bill. I don't remember how high, but I remember it was hundreds of dollars, a hefty sum at the time. As I watched Jason Scott's BBS Documentary, I realized how common an experience this was. I think this was the end of CompuServe for me for awhile.

Toll-Free Numbers

I lived near a town with a population of 500. Not even IN town, but near town. The calling area included another town with a population of maybe 1500, so all told, there were maybe 2000 people total that I could talk to with a local call, though far fewer numbers, because remember, telephones were allocated by the household. There were, as far as I know, zero modems that were a local call (aside from one that belonged to a friend I met in around 1992). So basically everything was long-distance.

But there was a special feature of the telephone network: toll-free numbers. Normally when calling long-distance, you, the caller, paid the bill. But with a toll-free number, beginning with 1-800, the recipient paid the bill. These numbers almost inevitably belonged to corporations that wanted to make it easy for people to call. Sales and ordering lines, for instance. Some of these companies started to set up modems on toll-free numbers. There were few of these, but they existed, so of course I had to try them!

One of them was a company called PennyWise that sold office supplies. They had a toll-free line you could call with a modem to order stuff. Yes, online ordering before the web! I loved office supplies. And, because I lived far from a big city, if the local K-Mart didn't have it, I probably couldn't get it. Of course, the interface was entirely text, but you could search for products and place orders with the modem. I had loads of fun exploring the system, and actually ordered things from them, and probably actually saved money doing so. With the first order they shipped a monster full-color catalog. That thing must have been 500 pages, like the Sears catalogs of the day. Every item had a part number, which streamlined ordering through the modem.

Inbound FAXes

By the 90s, a number of modems became able to send and receive FAXes as well. For those that don't know, a FAX machine was essentially a special modem. It would scan a page and digitally transmit it over the phone system, where it would (at least in the early days) be printed out in real time, because the machines didn't have the memory to store an entire page as an image. Eventually, PC modems integrated FAX capabilities.

There still wasn't anything useful I could do locally, but there were ways I could get other companies to FAX something to me. I remember two of them. One was from US Robotics. They had an 'on demand' FAX system. You'd call up a toll-free number, which was an automated IVR system. You could navigate through it and select various documents of interest to you: spec sheets and the like. You'd key in your FAX number, hang up, and US Robotics would call YOU and FAX you the documents you wanted. Yes! I was talking to a computer (of sorts) at no cost to me!

The New York Times also ran a service for awhile called TimesFax. Every day, they would FAX out a page or two of summaries of the day's top stories. This was pretty cool in an era in which I had no other way to access anything from the New York Times. I managed to sign up for TimesFax (I have no idea how, anymore) and for awhile I would get a daily FAX of their top stories. When my family got its first laser printer, I could then even print these FAXes complete with the gothic New York Times masthead. Wow! (OK, so technically I could print them on a dot-matrix printer also, but graphics on a 9-pin dot matrix is a kind of pain that is a whole other article.)

My own phone line

Remember how I discussed that phone lines were allocated per household? This was a problem for a lot of reasons:
  1. Anybody that tried to call my family while I was using my modem would get a busy signal (unable to complete the call)
  2. If anybody in the house picked up the phone while I was using it, that would degrade the quality of the ongoing call and either mess up or disconnect the call in progress. In many cases, that could cancel a file transfer (which wasn't necessarily easy or possible to resume), prompting howls of annoyance from me.
  3. Generally we all had to work around each other
So eventually I found various small jobs and used the money I made to pay for my own phone line and my own long distance costs. Eventually I upgraded to a 28.8Kbps US Robotics Courier modem, even! Yes, you heard it right: I got a job and a bank account so I could have a phone line and a faster modem. Uh, isn't that why every teenager gets a job? Now my local friend and I could call each other freely, at least on my end (I can't remember if he had his own phone line too). We could exchange files using HS/Link, which had the added benefit of allowing split-screen chat even while a file transfer was in progress. I'm sure we spent hours chatting to each other keyboard-to-keyboard while sharing files with each other.

Technology in Schools

By this point in the story, we're in the late 80s and early 90s. I'm still using PC-style OSs at home; OS/2 in the later years of this period, DOS or maybe a bit of Windows in the earlier years. I mentioned that they let me work on programming at school starting in 5th grade. It was soon apparent that I knew more about computers than anybody on staff, and I started getting pulled out of class to help teachers or administrators with vexing school problems. This continued until I graduated from high school, incidentally often to my enjoyment, and to the annoyance of one particular teacher who, I must say, I was fine with annoying in this way.

That's not to say that there was institutional support for what I was doing. It was, after all, a small school. Larger schools might have introduced BASIC or maybe Logo in high school. But I had already taught myself BASIC, Pascal, and C by the time I was somewhere around 12 years old. So I wouldn't have had any use for that anyhow.

There were programming contests occasionally held in the area. Schools would send teams. My school didn't really send anybody, but I went as an individual. One of them was run by a local college (but for jr. high or high school students). Years later, I met one of the professors that ran it. He remembered me, and that day, better than I did. The programming contest had problems one could solve in BASIC or Logo. I knew nothing about what to expect going into it, but I had lugged my computer and screen along, and asked him, "Can I write my solutions in C?" He was, apparently, stunned, but said sure, go for it. I took first place that day, leading to some rather confused teams from much larger schools.

The Netware network that the school had was, as these generally were, itself isolated. There was no link to the Internet or anything like it. Several schools across three local counties eventually invested in a fiber-optic network linking them together. This built a larger, but still closed, network. Its primary purpose was to allow students to be exposed to a wider variety of classes at high schools. Participating schools had an 'ITV room', outfitted with cameras and mics. So students at any school could take classes offered over ITV at other schools. For instance, only my school taught German classes, so people at any of those participating schools could take German. It was an early Zoom room. But alongside the TV signal, there was enough bandwidth to run some Netware frames. By about 1995 or so, this let one of the schools purchase some CD-ROM software that was made available on a file server and could be accessed by any participating school. Nice! But Netware was mainly about file and printer sharing; there wasn't even a facility like email, at least not on our deployment.

BBSs

My last hop before the Internet was the BBS. A BBS was a computer program, usually run by a hobbyist like me, on a computer with a modem connected. Callers would call it up, and they'd interact with the BBS. Most BBSs had discussion groups like forums, and file areas. Some also had games. I, of course, continued to have that most vexing of problems: they were all long-distance. There were some ways to help with that, chiefly QWK and BlueWave. These, somewhat like TapCIS in the CompuServe days, let me download new message posts for reading offline, and queue up my own messages to send later. QWK and BlueWave didn't help with file downloading, though.

BBSs get networked

BBSs were an interesting thing. You'd call up one, and inevitably somewhere in the file area would be a BBS list. Download the BBS list and you've suddenly got a list of phone numbers to try calling. All of them were long distance, of course. You'd try calling them at random and have a success rate of maybe 20%. The other 80% would be defunct; you might get the dreaded "this number is no longer in service" or the even more dreaded angry human answering the phone (and of course a modem can't talk to a human, so they'd just get silence for probably the nth time that week). The phone company cared nothing about BBSs and recycled their numbers just as fast as any others.

To talk to various people, or participate in certain discussion groups, you'd have to call specific BBSs. That's annoying enough in the general case, but even more so for someone paying long distance for it all, because it takes a few minutes to establish a connection to a BBS: handshaking, logging in, menu navigation, etc.

But BBSs started talking to each other. The earliest successful such effort was FidoNet, and for the duration of the BBS era, it remained by far the largest. FidoNet was analogous to the UUCP that the institutional users had, but ran on the much cheaper PC hardware. Basically, BBSs that participated in FidoNet would relay email, forum posts, and files between themselves overnight. Eventually, as with UUCP, by hopping through this network, messages could reach around the globe, and forums could have worldwide participation asynchronously, long before they could link to each other directly via the Internet. It was almost entirely volunteer-run.
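The overnight relay idea is simple enough to sketch. A toy model in Python; the topology and the one-hop-per-night rule are invented for illustration (real FidoNet routing, with its zones, hubs, and nodelists, was considerably more elaborate):

    # Toy model of overnight store-and-forward relaying in the spirit
    # of FidoNet. The topology and one-hop-per-night rule are invented
    # for illustration only.
    LINKS = {                      # who exchanges mail with whom nightly
        "1:100/1": ["1:100/2"],
        "1:100/2": ["1:100/1", "2:200/1"],
        "2:200/1": ["1:100/2", "2:200/5"],
        "2:200/5": ["2:200/1"],
    }

    def nights_to_deliver(src, dst):
        """Count the nightly hops for a message to travel src -> dst."""
        holding, seen, nights = {src}, {src}, 0
        while dst not in holding:
            holding = {nxt for node in holding for nxt in LINKS[node]
                       if nxt not in seen}
            if not holding:
                raise LookupError("no route to destination")
            seen |= holding
            nights += 1
        return nights

    print(nights_to_deliver("1:100/1", "2:200/5"))  # -> 3 nights

A message could therefore cross the network in a few nights of cheap off-peak calls, which is precisely what made worldwide asynchronous forums affordable on hobbyist hardware.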

Running my own BBS

At age 13, I eventually chose to set up my own BBS. It ran on my single phone line, so of course when I was dialing up something else, nobody could dial up me. Not that this was a huge problem; in my town of 500, I probably had a good 1 or 2 regular callers in the beginning.

In the PC era, there was a big difference between a server and a client. Server-class software was expensive and rare. Maybe in later years you had an email client, but an email server would be completely unavailable to you as a home user. But with a BBS, I could effectively run a server. I even ran serial lines in our house so that the BBS could be connected to from other rooms! Since I was running OS/2, the BBS didn't tie up the computer; I could continue using it for other things.

FidoNet had an Internet email gateway. This one, unlike CompuServe's, was free. Once I had a BBS on FidoNet, you could reach me from the Internet using the FidoNet address. This didn't support attachments, but then email of the day didn't really, either. Various others outside Kansas ran FidoNet distribution points. I believe one of them was mgmtsys; my memory is quite vague, but I think they offered a direct gateway and I would call them to pick up Internet mail via FidoNet protocols, but I'm not at all certain of this.

Pros and Cons of the Non-Microsoft World

As mentioned, Microsoft was and is the dominant operating system vendor for PCs. But I left that world in 1993, and here, nearly 30 years later, have never really returned. I got an operating system with more technical capabilities than the DOS and Windows of the day, but the tradeoff was a much smaller software ecosystem. OS/2 could run DOS programs, but it ran OS/2 programs a lot better. So if I were to run a BBS, I wanted one that had a native OS/2 version, limiting me to a small fraction of available BBS server software. On the other hand, as a fully 32-bit operating system, there started to be OS/2 ports of certain software with a Unix heritage; most notably for me at the time, gcc. At some point, I eventually came across the RMS essays and started to be hooked.

Internet: The Hunt Begins

I certainly was aware that the Internet was out there and interesting. But the first problem was: how the heck do I get connected to the Internet?

Computer labs

There was one place that tended to have Internet access: colleges and universities. In 7th grade, I participated in a program that resulted in me being invited to visit Duke University, and in 8th grade, I participated in National History Day, resulting in a trip to visit the University of Maryland. I probably sought out computer labs at both of those. My most distinct memory was finding my way into a computer lab at one of those universities, and it was full of NeXT workstations. I had never seen or used a NeXT before, and had no idea how to operate it. I had brought a box of floppy disks, unaware that the DOS disks probably weren't compatible with NeXT.

Closer to home, a small college had a computer lab that I could also visit. I would go there in summer, or when it wasn't in use, with my stack of floppies. I remember downloading disk images of FLOSS operating systems: FreeBSD, Slackware, or Debian, at the time. The hash marks from the DOS-based FTP client would creep across the screen as the 1.44MB disk images slowly downloaded. telnet was also available on those machines, so I could telnet to things like public-access Archie servers and libraries, though not Gopher. Still, FTP and telnet access opened up a lot, and I learned quite a bit in those years.

Continuing the Journey

At some point, I got a copy of the Whole Internet User's Guide and Catalog, published in 1994. I still have it. If I hadn't already figured it out by then, I certainly became aware from it that Unix was the dominant operating system on the Internet. The examples in Whole Internet covered FTP, telnet, gopher, all assuming the user had somehow got to a Unix prompt. The web was introduced about 300 pages in; clearly viewed as something that wasn't page 1 material. And it covered the command-line www client before introducing the graphical Mosaic. Even then, though, the book highlighted Mosaic's utility as a front-end for Gopher and FTP, and even the ability to launch telnet sessions by clicking on links. But having a copy of the book didn't equate to having any way to run Mosaic. The machines in the computer lab I mentioned above all ran DOS and were incapable of running a graphical browser. I had no SLIP or PPP (both ways to run Internet traffic over a modem) connectivity at home. In short, the Web was something for the large institutional users at the time.

CD-ROMs

As CD-ROMs came out, with their huge (for the day) 650MB capacity, various companies started collecting software that could be downloaded from the Internet and selling it on CD-ROM. The two most popular ones were Walnut Creek CD-ROM and Infomagic. One could buy extensive shareware and gaming collections, and then even entire Linux and BSD distributions. Although not exactly an Internet service per se, it was a way of bringing what might ordinarily only be accessible to institutional users into the home computer realm.

Free Software Jumps In

As I mentioned, by the mid 90s, I had come across RMS's writings about free software, most probably his 1992 essay Why Software Should Be Free. (Please note, this is not a commentary on the more recently revealed issues surrounding RMS, but rather his writings and work as I encountered them in the 90s.) The notion of a Free operating system, not just in cost but in openness, was incredibly appealing. Not only could I tinker with it to a much greater extent, due to having source for everything, but it included so much software that I'd otherwise have to pay for. Compilers! Interpreters! Editors! Terminal emulators! And, especially, server software of all sorts. There'd be no way I could afford or run Netware, but with a Free Unixy operating system, I could do all that. My interest was obviously piqued. Add to that the fact that I could actually participate and contribute, and I was about to become hooked on something that I've stayed hooked on for decades.

But then the question was: which Free operating system? Eventually I chose FreeBSD to begin with; that would have been sometime in 1995. I don't recall the exact reasons for that. I remember downloading Slackware install floppies, and probably the fact that Debian wasn't yet at 1.0 scared me off for a time. FreeBSD's fantastic Handbook, far better than anything I could find for Linux at the time, was no doubt also a factor.

The de Raadt Factor

Why not NetBSD or OpenBSD? The short answer is Theo de Raadt. Somewhere in this time, when I was somewhere between 14 and 16 years old, I asked some questions comparing NetBSD to the other two free BSDs. This was on a NetBSD mailing list, but for some reason Theo saw it and got a flame war going, which CC'd me. Now keep in mind that even if NetBSD had a web presence at the time, it would have been minimal, and I would have (not all that unusually for the time) had no way to access it. I was certainly not aware of the, shall we say, acrimony between Theo and NetBSD. While I had certainly seen an online flamewar before, this took on a different and more disturbing tone; months later, Theo randomly emailed me under the subject "SLIME" saying that I was, well, "SLIME". I seem to recall periodic emails from him thereafter reminding me that he hates me and that he had blocked me. (Disclaimer: I have poor email archives from this period, so the full details are lost to me, but I believe I am accurately conveying these events from over 25 years ago.)

This was a surprise, and an unpleasant one. I was trying to learn, and while it is possible I didn't understand some aspect or other of netiquette (or Theo's personal hatred of NetBSD) at the time, still that is not a reason to flame a 16-year-old (though he would have had no way to know my age). This didn't leave any kind of scar, but it did leave a lasting impression; to this day, I am particularly concerned with how FLOSS projects handle poisonous people. Debian, for instance, has come a long way in this over the years, and even Linus Torvalds has turned over a new leaf. I don't know if Theo has. In any case, I didn't use NetBSD then. I did try it periodically in the years since, but never found it compelling enough to justify a large switch from Debian. I never tried OpenBSD, for various reasons, but one of them was that I didn't want to join a community that tolerates behavior such as Theo's from its leader.

Moving to FreeBSD

Moving from OS/2 to FreeBSD was final. That is, I didn't have enough hard drive space to keep both. I also didn't have the backup capacity to back up OS/2 completely. My BBS, which ran Virtual BBS (and at some point also AdeptXBBS), was deleted and reincarnated in a different form. My BBS was a member of both FidoNet and VirtualNet; the latter was specific to VBBS, and had to be dropped. I believe I may have also had to drop the FidoNet link for a time. This was the biggest change of computing in my life to that point. The earlier experiences hadn't literally destroyed what came before: OS/2 could still run my DOS programs, its command shell was quite DOS-like, and it ran Windows programs. Now I was going to throw all that away and leap into the unknown. I wish I had saved a copy of my BBS; I would love to see the messages I exchanged back then, or see its menu screens again. I have little memory of what it looked like. But other than that, I have no regrets. Pursuing Free, Unixy operating systems brought me a lot of enjoyment and a good career.

That's not to say it was easy. All the problems of not being in the Microsoft ecosystem were magnified under FreeBSD and Linux. In a day before EDID, monitor timings had to be calculated manually, and you risked destroying your monitor if you got them wrong. Word processing and spreadsheet software was pretty much not there for FreeBSD or Linux at the time; I was therefore forced to learn LaTeX, and actually appreciated that. Software like PageMaker or CorelDraw was certainly nowhere to be found for those free operating systems either. But I got a ton of new capabilities. I mentioned that the BBS didn't shut down, and indeed it didn't. I ran what was surely a supremely unique oddity: a free, dialin Unix shell server in the middle of a small town in Kansas. I'm sure I provided things such as pine for email, and some help text and maybe even printouts for how to use it. The set of callers slowly grew over that time period, in fact. And then I got UUCP.

Enter UUCP

Even throughout all this, there was no local Internet provider, and things were still long distance. I had Internet email access via assorted strange routes, but they were all strange. And I wanted access to Usenet. In 1995, it happened: the local ISP I mentioned offered UUCP access. Though I couldn't afford the dialup shell (or later, SLIP/PPP) that they offered, due to long-distance costs, UUCP's very efficient batched processes looked doable. I believe I established that link when I was 15, so in 1995.

I worked to register my domain, complete.org, as well. At the time, the process was a bit lengthy and involved downloading a text file form, filling it out in a precise way, sending it to InterNIC, and probably mailing them a check. Well, I did that, and in September of 1995, complete.org became mine. I set up sendmail on my local system, as well as INN to handle the limited Usenet newsfeed I requested from the ISP. I even ran Majordomo to host some mailing lists, including some that were surprisingly high-traffic for a few-times-a-day, long-distance, modem UUCP link!

The modem client programs for FreeBSD were somewhat less advanced than those for OS/2, but I believe I wound up using Minicom or Seyon to continue to dial out to BBSs and, I believe, to continue to use Learning Link. So all the while I was setting up my local BBS, I continued to have access to the text Internet, consisting chiefly of Gopher for me.

Switching to Debian
I switched to Debian sometime in 1995 or 1996, and have been using Debian as my primary OS ever since. I continued to offer shell access, but added the WorldVU Atlantis menuing BBS system. This provided a return to a more BBS-like interface (by default; shell was still an option), as well as some BBS door games such as LoRD and TradeWars 2002 running under DOS emulation. I also continued to run INN, and ran ifgate to allow FidoNet echomail to be presented in INN as Usenet-like newsgroups, and netmail to be gated to Unix email. This worked pretty well. The BBS continued to grow in these days, peaking at about two dozen total user accounts, and maybe a dozen regular users.

Dial-up access availability
I believe it was in 1996 that dial-up PPP access finally became available in my small town. What a thrill! FINALLY! I could now FTP, use Gopher, telnet, and the web, all from home. Of course, it was at modem speeds, but still. (Strangely, I have a memory of accessing the Web using WebExplorer from OS/2. I don't know exactly why; it's possible that by this time I had upgraded to a 486 DX2/66 and was able to reinstall OS/2 on the old 25MHz 486, or maybe something was wrong with the timeline from my memories from 25 years ago above. Or perhaps I made the occasional long-distance call somewhere before I ditched OS/2.) Gopher sites still existed at this point, and I could access them using Netscape Navigator, which likely became my standard Gopher client at that point. I don't recall using the UMN text-mode gopher client locally at that time, though it's certainly possible I did.

The city
Starting when I was 15, I took computer science classes at Wichita State University. The first one was a class in the summer of 1995 on C++. I remember being worried about whether I was good enough for it; I was, after all, just past my HS freshman year and had never taken the prerequisite C class. I loved it and got an A! By 1996, I was taking more classes. In 1996 or 1997 I stayed in Wichita during the day due to having more than one class. So, what would I do then but enjoy the computer lab? The CS dept. had two of them: one that had NCD X terminals connected to a pair of SunOS servers, and another one running Windows. I spent most of the time in the Unix lab with the NCDs; I'd use Netscape or pine, write code, enjoy the University's fast Internet connection, and so forth. In 1997 I graduated from high school, and that summer I moved to Wichita to attend college. As was so often the case, I shut down the BBS at that time. It would be 5 years until I again dealt with Internet at home in a rural community. By the time I moved to my apartment in Wichita, I had stopped using OS/2 entirely. I have no memory of ever having OS/2 there. Along the way, I had bought a Pentium 166, and then the most expensive piece of computing equipment I have ever owned: a DEC Alpha, which, of course, ran Linux.

ISDN
I must have used dialup PPP for a time, but I eventually got a job working for the ISP I had used for UUCP, and then PPP. While there, I got a 128Kbps ISDN line installed in my apartment, and they gave me a discount on the service for it. That was around 3x the speed of a modem, and crucially it was always on and gave me a public IP. No longer did I have to use UUCP; now I got to host my own things! By at least 1998, I was running a web server on www.complete.org, and I had an FTP server going as well.

Even Bigger Cities
In 1999 I moved to Dallas, and there got my first broadband connection: an ADSL link at, I think, 1.5Mbps! Now that was something! But it had some reliability problems. I eventually put together a server and had it hosted at an acquaintance's place who had SDSL in his apartment. Within a couple of years, I had switched to various kinds of proper hosting for it, but that is a whole other article. In Indianapolis, I got a cable modem for the first time, with even higher speeds but prohibitions on running servers on it. Yuck.

Challenges
Being non-Microsoft continued to have challenges. Until the advent of Firefox, a web browser was one of the biggest. While Netscape supported Linux on i386, it didn't support Linux on Alpha. I hobbled along with various attempts at emulators, old versions of Mosaic, and so forth. And, until StarOffice was open-sourced as OpenOffice, reading Microsoft file formats was also a challenge, though WordPerfect was briefly available for Linux. Over the years, I have become used to the Linux ecosystem. Perhaps I use Gimp instead of Photoshop, and digikam instead of, well, whatever somebody would use on Windows. But I get ZFS, and containers, and so much that isn't available there. Yes, I know Apple never went away and is a thing, but for most of the time period I discuss in this article, at least after the rise of DOS, it was niche compared to the PC market.

Back to Kansas
In 2002, I moved back to Kansas, to a rural home near a different small town in the county next to where I grew up. Out there, it was back to dialup at home, though I had faster access at work. I didn't much care for this, and thus began a 20+-year effort to get broadband in the country. At first, I got a wireless link, which worked well enough in the winter, but had serious problems in the summer when the trees leafed out. Eventually DSL became available locally; it was highly unreliable, but still, it was something. Then I moved back to the community I grew up in, a few miles from my childhood home. Again I got DSL, and it was a bit better. But after some years, being at the end of the DSL run meant I had poor speeds and reliability problems. I eventually switched to various wireless ISPs, which continues to the present day; while people in cities can get Gbps service, I can get, at best, about 50Mbps. Long-distance fees are gone, but the speed disparity remains.

Concluding Reflections
I am glad I grew up where I did; the strong community has a lot of advantages I don't have room to discuss here. In a number of very real senses, having no local services made things a lot more difficult than they otherwise would have been. However, perhaps I could say that I also learned a lot through the need to come up with inventive solutions to those challenges. To this day, I think a lot about computing in remote environments: partially because I live in one, and partially because I enjoy visiting places that are remote enough that they have no Internet, phone, or cell service whatsoever. I have written articles like Tools for Communicating Offline and in Difficult Circumstances based on my own personal experience. I instinctively think about making protocols robust in the face of various kinds of connectivity failures because I experience various kinds of connectivity failures myself.

(Almost) Everything Lives On
In 2002, Gopher turned 10 years old. It had probably been about 9 or 10 years since I had first used Gopher, which was the first way I got on the live Internet from my house. It was hard to believe. By that point, I had an always-on Internet link at home and at work. I had my Alpha, and probably also at least PCMCIA Ethernet for a laptop (many laptops had modems by the 90s also). Despite its popularity in the early 90s, Gopher was mostly forgotten less than 10 years after it came on the scene and started to unify the Internet. And it was at that moment that I decided to try to resurrect it. The University of Minnesota finally released it under an Open Source license. I wrote the first new gopher server in years, pygopherd, and introduced gopher to Debian. Gopher lives on; there are now quite a few Gopher clients and servers out there, newly started post-2002. The Gemini protocol can be thought of as something akin to Gopher 2.0, and it too has a small but blossoming ecosystem. Archie, the old FTP search tool, is dead, though. Same for WAIS and a number of the other pre-web search tools. But still, even FTP lives on today. And BBSs? Well, they didn't go away either. Jason Scott's fabulous BBS documentary looks back at the history of the BBS, while Back to the BBS from last year talks about the modern BBS scene. FidoNet somehow is still alive and kicking. UUCP still has its place and has inspired a whole string of successors. Some, like NNCP, are clearly direct descendants of UUCP. Filespooler lives in that ecosystem, and you can even see UUCP concepts in projects as far afield as Syncthing and Meshtastic. Usenet still exists, and you can now run Usenet over NNCP just as I ran Usenet over UUCP back in the day (which you can still do as well). Telnet, of course, has been largely supplanted by ssh, but the concept is more popular now than ever, as Linux has made ssh available on everything from the Raspberry Pi to Android. And I still run a Gopher server, looking pretty much like it did in 2002. This post also has a permanent home on my website, where it may be periodically updated.

15 August 2022

Jonathan Dowland: Temperature monitoring

Xiaomi Mijia Temperature Sensor
I've been having some temperature problems in my house, so I wanted to set up some thermometers which I could read from a computer, to look at trends. I bought a pack of three cheap Xiaomi IoT thermometers. There's some official Xiaomi tooling to access them from smartphones and suchlike, but I wanted something more open. The thermometers have some rudimentary security on them to try and ensure you use the official tooling. This is pretty weak, and the open-source Home Assistant (HA) has support for querying them. I wasn't already running HA, and it looked like more than I needed right now.

gathering
A friend told me that it was trivial to write custom firmware to the devices. It's so easy you can do it from a web-based flasher: in fact, anyone in range can. There's a family of custom firmwares out there, and most move the sensor readings into the BTLE announce packets, meaning you can scrape the temperature by simply reading and decoding the announcement packets: no need to handshake at all, and certainly no need to navigate Xiaomi's weird security. This is the one I used. I hacked up a Python script to read the values with the help of this convenience library [1]. A sketch of the idea follows.
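For flavour, here's a minimal sketch of that kind of passive scrape using the bleak Python library; not my actual script. It assumes the ATC-style custom firmware layout (readings broadcast as Environmental Sensing service data in each advertisement); the UUID and byte offsets are assumptions about that firmware family, not anything Xiaomi documents.

    # Sketch: passively decode thermometer readings from BTLE advertisements.
    # Assumes ATC-style firmware: service data under the Environmental Sensing
    # UUID, laid out as 6-byte MAC, int16 temperature (units of 0.1 C),
    # uint8 humidity (%), uint8 battery (%), big-endian.
    import asyncio
    import struct

    from bleak import BleakScanner

    ENV_SENSING = "0000181a-0000-1000-8000-00805f9b34fb"

    def on_advert(device, adv):
        payload = adv.service_data.get(ENV_SENSING)
        if payload is None or len(payload) < 10:
            return  # not one of our thermometers
        temp, humidity, battery = struct.unpack_from(">hBB", payload, 6)
        print(f"{device.address} {temp / 10:.1f}C {humidity}%RH {battery}%")

    async def main():
        scanner = BleakScanner(detection_callback=on_advert)
        await scanner.start()
        await asyncio.sleep(60)  # listen for a minute, then stop
        await scanner.stop()

    asyncio.run(main())

Note that no pairing or handshake happens anywhere in there: the scanner only listens.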
Next, I needed to set up somewhere to write the values.

reporting
The study is thankfully cooler today
It's been long enough since I last looked at something like this that the best-in-class software was things like Multi Router Traffic Grapher and rrdtool, or things that build on top of them, like Munin. The world seems to have moved on (rightly or wrongly) with a cornucopia of options like Prometheus, Grafana, Graphite/Carbon, InfluxDB, statsd, etc. I ruled most of these out as being too complex for what I want to do, and got something working with Graphite (front-end) and Carbon (back-end). I was surprised that this wasn't packaged in Debian, and opted to try the Docker container. This works, although even that is more complex than I need: it's got graphite and carbon, but also nginx and statsd too; I'm submitting directly to carbon, side-stepping statsd entirely (see the sketch after the next-steps notes below). So as I refine what I'm doing I might possibly strip that back.

next steps
I might add more sensors in my house! My scripts also need a lot of tidying up. But I think it would be useful to add some external temperature data, such as something from a weather service. I am also considering pulling in some of the sensor data from the Newcastle University Urban Observatory, which is something I looked at a while ago for my PhD but didn't ultimately end up using. There are several temperature sensors nearby, but they seem to operate relatively sporadically. There's a load of other interesting sensors in my vicinity, such as air quality monitors. I'm currently ignoring the humidity data from the sensors, but I should collect that too. It would be useful to mark relevant "events", too: does switching my desktop PC, or printer, etc. on or off correlate to a jump in temperature?
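In case it's useful, this is roughly what "submitting directly to carbon" amounts to: Carbon's plaintext protocol takes one "metric value timestamp" line per reading on TCP port 2003. The host and metric path below are made-up examples.

    # Sketch: push one reading into Carbon over its plaintext protocol.
    import socket
    import time

    def send_to_carbon(metric, value, host="localhost", port=2003):
        line = f"{metric} {value} {int(time.time())}\n"
        with socket.create_connection((host, port)) as sock:
            sock.sendall(line.encode("ascii"))

    send_to_carbon("house.study.temperature", 21.4)

Because the protocol is just lines over a socket, no client library is needed.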

  [1] I might get rid of that in the future as I refine my approach.

10 August 2022

Shirish Agarwal: Mum, Samsung Galaxy M-52

Mum
I dunno where to start. While I'm not supposed to announce it, mum left this earth a month ago (thirteen days when I started to write this blog post). I am still in part denial, part shock, and morose. Of all the seasons in a year, the rainy season used to be my favorite; now, will I ever be able to look at it and feel anything other than the emptiness that this season has given me? In some senses, it is and was very ironic: when she became ill about a year ago, I had promised myself I would be by her side for 5-6 years, not go anywhere, neither Hillhacks nor Debconf nor any meetup, and I was ok with that. Now that she's no more, I have no clue why I am living. What is the purpose, the utility? When she was alive, the utility was understandable. We had an unspoken agreement: I would look after her, and she was supposed to look after me. A part of me self-blames, as I am sure I have done thousands of things wrong; otherwise, the deal was that she was going to be here for another decade. But now that she has left not even halfway, I dunno what to do. I don't have someone to fight with anymore. It's mostly a robotic existence atm. I try to distract myself via movies, web series, the web, books, etc., whatever can take my mind off it. From the day she died to date, I have a lower back pain which acts as a reminder. It's been a month; I eat, drink, and am surviving, but still feel empty. I do things suggested by extended family, but within there is no feeling, just emptiness :(. I have no clue if things will get better, or even if I want the change. I clearly have no idea, so let me share a little about what I know.

Samsung Galaxy M-52G
Just a couple of days before she died, part of our extended family had come over, and she chose that opportunity to gift me a Samsung Galaxy M-52G, even though my birthday was 3 months away. Ironically, the day after I purchased it, one of the resellers of the phone cut the price from INR 28k to 20k. Had I waited a day more, I could have saved another 8k/-, but what's done is done. To my mind, the phone is middling yet solid. I have dropped the phone accidentally at times, but there is not a single scratch or anything like that. One can look at the specs in greater detail on fccd.io. As I shared, before the recent price drop it was a mid-range phone, so I am gonna review it on that basis. One of the first things I did was to buy a plastic cover as well as a cover shield, even though the original one is meant to work for a year or more. This was simply for added protection, and it has served me to date. Even with the additional weight, I can easily use it with one hand. It only becomes problematic when using chat apps such as WhatsApp, Telegram, Quicksy and a few others, where it comes with the Samsung keyboard in its divided/split layout. The A.I. for guessing words and sentences is spot-on when you are writing in English, but if you try a mixture of Hinglish (Hindi and English), it becomes a bit of a nightmare. Trying to teach the A.I. new words is something of a task. I wish there was an interface in which I could train the A.I. so it could serve Hinglish words also. I do think it does, but it's too rudimentary as it is to be of any use, at least where it is now.

WiFi Direct
While my previous phone did use Wi-Fi Direct, that ancient Android version wasn't wedded to Wi-Fi Direct the way this one is. You have essentially two ways to connect to any system outside: one is through Wi-Fi Direct, and the more expensive way is through mobile data. One of the strange things I found, quite a number of times, is that the Wifi would lose its pairings. Before we get into it, Wikipedia has a good explanation of what Wifi Direct is all about. Apparently, either my phone or my modem loses the pairing; which of the two is the culprit, I really don't know. There are two apps from the Play Store that do help in figuring out what the issue is (although they are limited in what info they give out, but still good). The first one is Wifi Signal Meter and the other one is WifiAnalyzer (open-source). I have found that pairing done through Wifi Signal Meter works better than through Google's own implementation, which feels lacking. The whole universe of Android seems to be built on apps and games, and many of these can be bought for money, but many can also be played using a combination of micro-transactions and ads. For many a game, you cannot play for more than 5 minutes before you either see an ad or wait for something like 2-3 hrs. before you can attempt it again. Hogwarts Mystery, for example, is one of those; another would be Explore Lands. While Hogwarts Mystery is more about the lore created by J.K. Rowling, and you can really get into the thick of things if you know the lore, Explore Lands is more about the exploration of areas. In both games, you are basically looking to gain energy over a period of time, which requires either money or viewing ads or a combination of both. Sadly, most ads, and even Google, don't seem to have caught on to the fact that I'm deaf, so most ads do not have subtitles and are more often than not useless to me. I have also found that many games share screenshots or videos that have nothing to do with how the game actually is, so there is quite a bit of misleading going on. I did read that Android has been having issues with connecting with developers after their app is in the Play Store. Most apps ask for and require a whole lot of permissions that aren't needed by that app.

F-Droid
I think Pirate Praveen had introduced me to F-Droid, and a whole lot has happened in F-Droid since: lots more apps, games, etc., and the look of F-Droid has been pared back. In fact, I found Neo Store to be a better skin through which to see F-Droid. I have yet to explore more of F-Droid, and to spend some time on it, before sharing any recommendations. I do find that many FOSS apps do need to work on how we communicate with our users. For example, one app that Praveen had shared with me recently was Quicksy. And while it is better, it uses a double negative when asking permission whether or not it should use more of the phone's resources. It is an example of the sort of language that we need to be aware of and do better on. I know this post is more about mobile than the desktop, but that is where I'm living currently.

16 January 2022

Chris Lamb: Favourite films of 2021

In my four most recent posts, I went over the memoirs and biographies, the non-fiction, the fiction and the 'classic' novels that I enjoyed reading the most in 2021. But in the very last of my 2021 roundup posts, I'll be going over some of my favourite movies. (Saying that, these are perhaps less my 'favourite films' than the ones worth remarking on; after all, nobody needs to hear that The Godfather is a good movie.) It's probably helpful to remark that I took a self-directed course in film history in 2021, based around the first volume of Roger Ebert's The Great Movies. This collection of 100-odd movie essays aims to make a tour of the landmarks of the first century of cinema, and I watched all but a handful before the year was out. I am slowly making my way through volume two in 2022. This tome was tremendously useful, and not simply due to the background context that Ebert added to each film: it also brought me into contact with films I would hardly have come across through other means. Would I have ever discovered the sly comedy of Trouble in Paradise (1932) or the touching proto-realism of L'Atalante (1934) any other way? It also helped me to 'get around' to watching films I may have put off watching forever; the influential Battleship Potemkin (1925), for instance, and the ur-epic Lawrence of Arabia (1962) spring to mind here. Choosing a 'worst' film is perhaps more difficult than choosing the best. There are first those that left me completely dry (Ready or Not, Written on the Wind, etc.), and those that were simply poorly executed. And there are those that failed to meet their own high opinions of themselves, such as the 'made for Reddit' Tenet (2020) or the inscrutable Vanilla Sky (2001), the latter being an almost perfect example of late-20th century cultural exhaustion. But I must save my most severe judgement for those films where I took a visceral dislike to how their subjects were portrayed. The sexually problematic Sixteen Candles (1984) and the pseudo-Catholic vigilantism of The Boondock Saints (1999) both spring to mind here, the latter of which combines so many things I dislike into such a short running time that I'd need an entire essay to adequately express how much I disliked it.

Dogtooth (2009) A father, a mother, a brother and two sisters live in a large and affluent house behind a very high wall and an always-locked gate. Only the father ever leaves the property, driving to the factory that he happens to own. Dogtooth goes far beyond any allusion to Josef Fritzl's cellar, though, as the children's education is a grotesque parody of home-schooling. Here, the parents deliberately teach their children the wrong meaning of words (e.g. a yellow flower is called a 'zombie'), all of which renders the outside world utterly meaningless and unreadable, and completely mystifying its very existence. It is this creepy strangeness within a 'regular' family unit in Dogtooth that is both socially and epistemically horrific, and I'll say nothing here of its sexual elements as well. Despite its cold, inscrutable and deadpan surreality, Dogtooth invites all manner of potential interpretations. Is this film about the artificiality of the nuclear family that the West insists is the benchmark of normality? Or is it, as I prefer to believe, something more visceral altogether: an allegory for the various forms of ontological violence wrought by fascism, as well a sobering nod towards some of fascism's inherent appeals? (Perhaps it is both. In 1972, French poststructuralists Gilles and F lix Guattari wrote Anti-Oedipus, which plays with the idea of the family unit as a metaphor for the authoritarian state.) The Greek-language Dogtooth, elegantly shot, thankfully provides no easy answers.

Holy Motors (2012) There is an infamous scene in Un Chien Andalou, the 1929 film collaboration between Luis Bu uel and famed artist Salvador Dal . A young woman is cornered in her own apartment by a threatening man, and she reaches for a tennis racquet in self-defence. But the man suddenly picks up two nearby ropes and drags into the frame two large grand pianos... each leaden with a dead donkey, a stone tablet, a pumpkin and a bewildered priest. This bizarre sketch serves as a better introduction to Leos Carax's Holy Motors than any elementary outline of its plot, which ostensibly follows 24 hours in the life of a man who must play a number of extremely diverse roles around Paris... all for no apparent reason. (And is he even a man?) Surrealism as an art movement gets a pretty bad wrap these days, and perhaps justifiably so. But Holy Motors and Un Chien Andalou serve as a good reminder that surrealism can be, well, 'good, actually'. And if not quite high art, Holy Motors at least demonstrates that surrealism can still unnerving and hilariously funny. Indeed, recalling the whimsy of the plot to a close friend, the tears of laughter came unbidden to my eyes once again. ("And then the limousines...!") Still, it is unclear how Holy Motors truly refreshes surrealism for the twenty-first century. Surrealism was, in part, a reaction to the mechanical and unfeeling brutality of World War I and ultimately sought to release the creative potential of the unconscious mind. Holy Motors cannot be responding to another continental conflagration, and so it appears to me to be some kind of commentary on the roles we exhibit in an era of 'post-postmodernity': a sketch on our age of performative authenticity, perhaps, or an idle doodle on the function and psychosocial function of work. Or perhaps not. After all, this film was produced in a time that offers the near-universal availability of mind-altering substances, and this certainly changes the context in which this film was both created. And, how can I put it, was intended to be watched.

Manchester by the Sea (2016) An absolutely devastating portrayal of a character who is unable to forgive himself and is hesitant to engage with anyone ever again. It features a near-ideal balance between portraying unrecoverable anguish and tender warmth, and is paradoxically grandiose in its subtle intimacy. The mechanics of life led me to watch this lying on a bed in a chain hotel by Heathrow Airport, and if this colourless circumstance blunted the film's emotional impact on me, I am probably thankful for it. Indeed, I find myself reduced in this review to fatuously recalling my favourite interactions instead of providing any real commentary. You could write a whole essay about one particular incident: its surfaces, subtexts and angles... all despite nothing of any substance ever being communicated. Truly stunning.

McCabe & Mrs. Miller (1971) Roger Ebert called this movie one of the saddest films I have ever seen, filled with a yearning for love and home that will not ever come. But whilst it is difficult to disagree with his sentiment, Ebert's choice of sad is somehow not quite the right word. Indeed, I've long regretted that our dictionaries don't have more nuanced blends of tragedy and sadness; perhaps the Ancient Greeks can loan us some. Nevertheless, the plot of this film is of a gambler and a prostitute who become business partners in a new and remote mining town called Presbyterian Church. However, as their town and enterprise booms, it comes to the attention of a large mining corporation who want to bully or buy their way into the action. What makes this film stand out is not the plot itself, however, but its mood and tone the town and its inhabitants seem to be thrown together out of raw lumber, covered alternatively in mud or frozen ice, and their days (and their personalities) are both short and dark in equal measure. As a brief aside, if you haven't seen a Roger Altman film before, this has all the trappings of being a good introduction. As Ebert went on to observe: This is not the kind of movie where the characters are introduced. They are all already here. Furthermore, we can see some of Altman's trademark conversations that overlap, a superb handling of ensemble casts, and a quietly subversive view of the tyranny of 'genre'... and the latter in a time when the appetite for revisionist portrays of the West was not very strong. All of these 'Altmanian' trademarks can be ordered in much stronger measures in his later films: in particular, his comedy-drama Nashville (1975) has 24 main characters, and my jejune interpretation of Gosford Park (2001) is that it is purposefully designed to poke fun those who take a reductionist view of 'genre', or at least on the audience's expectations. (In this case, an Edwardian-era English murder mystery in the style of Agatha Christie, but where no real murder or detection really takes place.) On the other hand, McCabe & Mrs. Miller is actually a poor introduction to Altman. The story is told in a suitable deliberate and slow tempo, and the two stars of the film are shown thoroughly defrocked of any 'star status', in both the visual and moral dimensions. All of these traits are, however, this film's strength, adding up to a credible, fascinating and riveting portrayal of the old West.

Detour (1945) Detour was filmed in less than a week, and it's difficult to decide out of the actors and the screenplay which is its weakest point.... Yet it still somehow seemed to drag me in. The plot revolves around luckless Al who is hitchhiking to California. Al gets a lift from a man called Haskell who quickly falls down dead from a heart attack. Al quickly buries the body and takes Haskell's money, car and identification, believing that the police will believe Al murdered him. An unstable element is soon introduced in the guise of Vera, who, through a set of coincidences that stretches credulity, knows that this 'new' Haskell (ie. Al pretending to be him) is not who he seems. Vera then attaches herself to Al in order to blackmail him, and the world starts to spin out of his control. It must be understood that none of this is executed very well. Rather, what makes Detour so interesting to watch is that its 'errors' lend a distinctively creepy and unnatural hue to the film. Indeed, in the early twentieth century, Sigmund Freud used the word unheimlich to describe the experience of something that is not simply mysterious, but something creepy in a strangely familiar way. This is almost the perfect description of watching Detour its eerie nature means that we are not only frequently second-guessed about where the film is going, but are often uncertain whether we are watching the usual objective perspective offered by cinema. In particular, are all the ham-fisted segues, stilted dialogue and inscrutable character motivations actually a product of Al inventing a story for the viewer? Did he murder Haskell after all, despite the film 'showing' us that Haskell died of natural causes? In other words, are we watching what Al wants us to believe? Regardless of the answers to these questions, the film succeeds precisely because of its accidental or inadvertent choices, so it is an implicit reminder that seeking the director's original intention in any piece of art is a complete mirage. Detour is certainly not a good film, but it just might be a great one. (It is a short film too, and, out of copyright, it is available online for free.)

Safe (1995) Safe is a subtly disturbing film about an upper-middle-class housewife who begins to complain about vague symptoms of illness. Initially claiming that she doesn't feel right, Carol starts to have unexplained headaches, a dry cough and nosebleeds, and eventually begins to have trouble breathing. Carol's family doctor treats her concerns with little care, and suggests to her husband that she sees a psychiatrist. Yet Carol's episodes soon escalate. For example, as a 'homemaker' and with nothing else to occupy her, Carol's orders a new couch for a party. But when the store delivers the wrong one (although it is not altogether clear that they did), Carol has a near breakdown. Unsure where to turn, an 'allergist' tells Carol she has "Environmental Illness," and so Carol eventually checks herself into a new-age commune filled with alternative therapies. On the surface, Safe is thus a film about the increasing about of pesticides and chemicals in our lives, something that was clearly felt far more viscerally in the 1990s. But it is also a film about how lack of genuine healthcare for women must be seen as a critical factor in the rise of crank medicine. (Indeed, it made for something of an uncomfortable watch during the coronavirus lockdown.) More interestingly, however, Safe gently-yet-critically examines the psychosocial causes that may be aggravating Carol's illnesses, including her vacant marriage, her hollow friends and the 'empty calorie' stimulus of suburbia. None of this should be especially new to anyone: the gendered Victorian term 'hysterical' is often all but spoken throughout this film, and perhaps from the very invention of modern medicine, women's symptoms have often regularly minimised or outright dismissed. (Hilary Mantel's 2003 memoir, Giving Up the Ghost is especially harrowing on this.) As I opened this review, the film is subtle in its messaging. Just to take one example from many, the sound of the cars is always just a fraction too loud: there's a scene where a group is eating dinner with a road in the background, and the total effect can be seen as representing the toxic fumes of modernity invading our social lives and health. I won't spoiler the conclusion of this quietly devasting film, but don't expect a happy ending.

The Driver (1978) Critics grossly misunderstood The Driver when it was first released. They interpreted the cold and unemotional affect of the characters with the lack of developmental depth, instead of representing their dissociation from the society around them. This reading was encouraged by the fact that the principal actors aren't given real names and are instead known simply by their archetypes instead: 'The Driver', 'The Detective', 'The Player' and so on. This sort of quasi-Jungian erudition is common in many crime films today (Reservoir Dogs, Kill Bill, Layer Cake, Fight Club), so the critics' misconceptions were entirely reasonable in 1978. The plot of The Driver involves the eponymous Driver, a noted getaway driver for robberies in Los Angeles. His exceptional talent has far prevented him from being captured thus far, so the Detective attempts to catch the Driver by pardoning another gang if they help convict the Driver via a set-up robbery. To give himself an edge, however, The Driver seeks help from the femme fatale 'Player' in order to mislead the Detective. If this all sounds eerily familiar, you would not be far wrong. The film was essentially remade by Nicolas Winding Refn as Drive (2011) and in Edgar Wright's 2017 Baby Driver. Yet The Driver offers something that these neon-noir variants do not. In particular, the car chases around Los Angeles are some of the most captivating I've seen: they aren't thrilling in the sense of tyre squeals, explosions and flying boxes, but rather the vehicles come across like wild animals hunting one another. This feels especially so when the police are hunting The Driver, which feels less like a low-stakes game of cat and mouse than a pack of feral animals working together a gang who will tear apart their prey if they find him. In contrast to the undercar neon glow of the Fast & Furious franchise, the urban realism backdrop of the The Driver's LA metropolis contributes to a sincere feeling of artistic fidelity as well. To be sure, most of this is present in the truly-excellent Drive, where the chase scenes do really communicate a credible sense of stakes. But the substitution of The Driver's grit with Drive's soft neon tilts it slightly towards that common affliction of crime movies: style over substance. Nevertheless, I can highly recommend watching The Driver and Drive together, as it can tell you a lot about the disconnected socioeconomic practices of the 1980s compared to the 2010s. More than that, however, the pseudo-1980s synthwave soundtrack of Drive captures something crucial to analysing the world of today. In particular, these 'sounds from the past filtered through the present' bring to mind the increasing role of nostalgia for lost futures in the culture of today, where temporality and pop culture references are almost-exclusively citational and commemorational.

The Souvenir (2019) The ostensible outline of this quietly understated film follows a shy but ambitious film student who falls into an emotionally fraught relationship with a charismatic but untrustworthy older man. But that doesn't quite cover the plot at all, for not only is The Souvenir a film about a young artist who is inspired, derailed and ultimately strengthened by a toxic relationship, it is also partly a coming-of-age drama, a subtle portrait of class and, finally, a film about the making of a film. Still, one of the geniuses of this truly heartbreaking movie is that none of these many elements crowds out the other. It never, ever feels rushed. Indeed, there are many scenes where the camera simply 'sits there' and quietly observes what is going on. Other films might smother themselves through references to 18th-century oil paintings, but The Souvenir somehow evades this too. And there's a certain ring of credibility to the story as well, no doubt in part due to the fact it is based on director Joanna Hogg's own experiences at film school. A beautifully observed and multi-layered film; I'll be happy if the sequel is one-half as good.

The Wrestler (2008) Randy 'The Ram' Robinson is long past his prime, but he is still rarin' to go in the local pro-wrestling circuit. Yet after a brutal beating that seriously threatens his health, Randy hangs up his tights and pursues a serious relationship... and even tries to reconnect with his estranged daughter. But Randy can't resist the lure of the ring, and readies himself for a comeback. The stage is thus set for Darren Aronofsky's The Wrestler, which is essentially about what drives Randy back to the ring. To be sure, Randy derives much of his money from wrestling as well as his 'fitness', self-image, self-esteem and self-worth. Oh, it's no use insisting that wrestling is fake, for the sport is, needless to say, Randy's identity; it's not for nothing that this film is called The Wrestler. In a number of ways, The Sound of Metal (2019) is both a reaction to (and a quiet remake of) The Wrestler, if only because both movies utilise 'cool' professions to explore such questions of identity. But perhaps simply when The Wrestler was produced makes it the superior film. Indeed, the role of time feels very important for the Wrestler. In the first instance, time is clearly taking its toll on Randy's body, but I felt it more strongly in the sense this was very much a pre-2008 film, released on the cliff-edge of the global financial crisis, and the concomitant precarity of the 2010s. Indeed, it is curious to consider that you couldn't make The Wrestler today, although not because the relationship to work has changed in any fundamentalway. (Indeed, isn't it somewhat depressing the realise that, since the start of the pandemic and the 'work from home' trend to one side, we now require even more people to wreck their bodies and mental health to cover their bills?) No, what I mean to say here is that, post-2016, you cannot portray wrestling on-screen without, how can I put it, unwelcome connotations. All of which then reminds me of Minari's notorious red hat... But I digress. The Wrestler is a grittily stark darkly humorous look into the life of a desperate man and a sorrowful world, all through one tragic profession.

Thief (1981) Frank is an expert professional safecracker and specialises in high-profile diamond heists. He plans to use his ill-gotten gains to retire from crime and build a life for himself with a wife and kids, so he signs on with a top gangster for one last big score. This, of course, could be the plot to any number of heist movies, but Thief does something different. Similar to The Wrestler and The Driver (see above) and a number of other films that I watched this year, Thief seems to be saying about our relationship to work and family in modernity and postmodernity. Indeed, the 'heist film', we are told, is an understudied genre, but part of the pleasure of watching these films is said to arise from how they portray our desired relationship to work. In particular, Frank's desire to pull off that last big job feels less about the money it would bring him, but a displacement from (or proxy for) fulfilling some deep-down desire to have a family or indeed any relationship at all. Because in theory, of course, Frank could enter into a fulfilling long-term relationship right away, without stealing millions of dollars in diamonds... but that's kinda the entire point: Frank needing just one more theft is an excuse to not pursue a relationship and put it off indefinitely in favour of 'work'. (And being Federal crimes, it also means Frank cannot put down meaningful roots in a community.) All this is communicated extremely subtly in the justly-lauded lowkey diner scene, by far the best scene in the movie. The visual aesthetic of Thief is as if you set The Warriors (1979) in a similarly-filthy Chicago, with the Xenophon-inspired plot of The Warriors replaced with an almost deliberate lack of plot development... and the allure of The Warriors' fantastical criminal gangs (with their alluringly well-defined social identities) substituted by a bunch of amoral individuals with no solidarity beyond the immediate moment. A tale of our time, perhaps. I should warn you that the ending of Thief is famously weak, but this is a gritty, intelligent and strangely credible heist movie before you get there.

Uncut Gems (2019) The most exhausting film I've seen in years; the cinematic equivalent of four cups of double espresso, I didn't even bother even trying to sleep after downing Uncut Gems late one night. Directed by the two Safdie Brothers, it often felt like I was watching two films that had been made at the same time. (Or do I mean two films at 2X speed?) No, whatever clumsy metaphor you choose to adopt, the unavoidable effect of this film's finely-tuned chaos is an uncompromising and anxiety-inducing piece of cinema. The plot follows Howard as a man lost to his countless vices mostly gambling with a significant side hustle in adultery, but you get the distinct impression he would be happy with anything that will give him another high. A true junkie's junkie, you might say. You know right from the beginning it's going to end in some kind of disaster, the only question remaining is precisely how and what. Portrayed by an (almost unrecognisable) Adam Sandler, there's an uncanny sense of distance in the emotional chasm between 'Sandler-as-junkie' and 'Sandler-as-regular-star-of-goofy-comedies'. Yet instead of being distracting and reducing the film's affect, this possibly-deliberate intertextuality somehow adds to the masterfully-controlled mayhem. My heart races just at the memory. Oof.

Woman in the Dunes (1964) I ended up watching three films that feature sand this year: Denis Villeneuve's Dune (2021), Lawrence of Arabia (1962) and Woman in the Dunes. But it is this last 1964 film by Hiroshi Teshigahara that will stick in my mind in the years to come. Sure, there is none of the Medician intrigue of Dune or the Super Panavision-70 of Lawrence of Arabia (or its quasi-orientalist score, itself likely stolen from Anton Bruckner's 6th Symphony), but Woman in the Dunes doesn't have to assert its confidence so boldly, and it reveals the enormity of its plot slowly and deliberately instead. Woman in the Dunes never rushes to get to the film's central dilemma, and it uncovers its terror in little hints and insights, all whilst establishing the daily rhythm of life. Woman in the Dunes has something of the uncanny horror as Dogtooth (see above), as well as its broad range of potential interpretations. Both films permit a wide array of readings, without resorting to being deliberately obscurantist or being just plain random it is perhaps this reason why I enjoyed them so much. It is true that asking 'So what does the sand mean?' sounds tediously sophomoric shorn of any context, but it somehow applies to this thoughtfully self-contained piece of cinema.

A Quiet Place (2018) Although A Quiet Place was not actually one of the best films I saw this year, I'm including it here as it is certainly one of the better 'mainstream' Hollywood franchises I came across. Not only is the film very ably constructed and engages on a visceral level, I should point out that it is rare that I can empathise with the peril of conventional horror movies (and perhaps prefer to focus on its cultural and political aesthetics), but I did here. The conceit of this particular post-apocalyptic world is that a family is forced to live in almost complete silence while hiding from creatures that hunt by sound alone. Still, A Quiet Place engages on an intellectual level too, and this probably works in tandem with the pure 'horrorific' elements and make it stick into your mind. In particular, and to my mind at least, A Quiet Place a deeply American conservative film below the surface: it exalts the family structure and a certain kind of sacrifice for your family. (The music often had a passacaglia-like strain too, forming a tombeau for America.) Moreover, you survive in this dystopia by staying quiet that is to say, by staying stoic suggesting that in the wake of any conflict that might beset the world, the best thing to do is to keep quiet. Even communicating with your loved ones can be deadly to both of you, so not emote, acquiesce quietly to your fate, and don't, whatever you do, speak up. (Or join a union.) I could go on, but The Quiet Place is more than this. It's taut and brief, and despite cinema being an increasingly visual medium, it encourages its audience to develop a new relationship with sound.

1 January 2022

Chris Lamb: Favourite books of 2021: Classics

In my three most recent posts, I went over the memoirs and biographies, the non-fiction and the fiction I enjoyed in 2021. But in the last of my 2021 book-related posts, I'll be going over my favourite classics. Of course, the difference between regular fiction and a 'classic' is an ambiguous, arbitrary and often-meaningless distinction: after all, what does it matter if Hemingway's The Old Man and the Sea (from 1951) is a classic or not? The term also smuggles in some of the ethnocentric gatekeeping encapsulated in the term 'Western canon' too. Nevertheless, the label of 'classic' has some utility for me in that it splits up the vast amount of fiction I read in two... Books that just missed the cut here include: Oscar Wilde's The Picture of Dorian Gray (moody and hilarious, but I cannot bring myself to include it due to the egregious antisemitism); Tolstoy's The Kreutzer Sonata (so angry! so funny!); and finally Notes from Underground by Fyodor Dostoevsky. Of significant note, though, would be the ghostly The Turn of the Screw by Henry James.

Heart of Darkness (1899) Joseph Conrad Heart of Darkness tells the story of Charles Marlow, a sailor who accepts an assignment from a Belgian trading company as a ferry-boat captain in the African interior, and the novella is widely regarded as a critique of European colonial rule in Africa. Loosely remade by Francis Ford Coppola as Apocalypse Now (1979), I started this book with the distinct possibility that this superb film adaptation would, for a rare treat, be 'better than the book'. However, Conrad demolished this idea of mine within two chapters, yet also elevated the film to a new level as well. This was chiefly due to how observant Conrad was of the universals that make up human nature. Some of his insight pertains to the barbarism of the colonialists, of course, but Conrad applies his shrewd acuity to the at the smaller level as well. Some of these quotes are justly famous: Ah! but it was something to have at least a choice of nightmares, for example, as well as the reference to a fastidiously turned-out colonial administrator who, with unimaginable horrors occurring mere yards from his tent, we learn he was devoted to his books, which were in applepie order . (It seems to me to be deliberately unclear whether his devotion arises from gross inhumanity, utter denial or some combination of the two.) Oh, and there's a favourite moment of mine when a character remarks that It was very fine for a time, but after a bit I did get tired of resting. Tired of resting! Yes, it's difficult to now say something original about a many-layered classic such as this, especially one that has analysed from so many angles already; from a literary perspective at first, of course, but much later from a critical postcolonial perspective, such as in Chinua Achebe's noted 1975 lecture, An Image of Africa. Indeed, the history of criticism in the twentieth century of Heart of Darkness must surely parallel the social and political developments in the Western world. (On a highly related note, the much-cited non-fiction book King Leopold's Ghost is on my reading list for 2022.) I will therefore limit myself to saying that the boat physically falling apart as it journeys deeper into the Congo may be intended to represent that our idea of 'Western civilisation' ceases to function, both morally as well as physically, in this remote environment. And, whilst I'm probably not the first to notice the potential ambiguity, when Marlow lies to Kurtz's 'Intended [wife]' in the closing section in order to save her from being exposed to the truth about Kurtz (surely a metaphor about the ignorance of the West whilst also possibly incorporating some comment on gender?), the Intended replies: I knew it. For me, though, it is not beyond doubt that what the Intended 'knows' is that she knew that Marlow would lie to her: in other words, that the alleged ignorance of everyday folk in the colonial homeland is studied and deliberate. Compact and fairly easy-to-read, it is clear that Heart of Darkness rewards even the most rudimentary analysis.

Rebecca (1938) Daphne du Maurier Daphne du Maurier creates in Rebecca a credible and suffocating atmosphere in the shape of Manderley, a grand English mansion owned by aristocratic widower Maxim de Winter. Our unnamed narrator (a young woman seemingly na ve in the ways of the world) meets Max in Monte Carlo, and she soon becomes the second Mrs. de Winter. The tale takes a turn to the 'gothic', though, when it becomes apparent that the unemotional Max, as well as potentially Manderley itself, appears to be haunted by the memory of his late first wife, the titular Rebecca. Still, Rebecca is less of a story about supernatural ghosts than one about the things that can haunt our minds. For Max, this might be something around guilt; for our narrator, the class-centered fear that she will never fit in. Besides, Rebecca doesn't need an actual ghost when you have Manderley's overbearing housekeeper, Mrs Danvers, surely one of the creepiest characters in all of fiction. Either way, the conflict of a kind between the fears of the protagonists means that they never really connect with each other. The most obvious criticism of Rebecca is that the main character is unreasonably weak and cannot quite think or function on her own. (Isn't it curious that the trait of the male 'everyman' is a kind of physical clumsiness yet the female equivalent is shorthanded by being slightly slow?) But the na vete of Rebecca's narrator makes her easier to relate to in a way, and it also makes the reader far more capable of empathising with her embarrassment. This is demonstrated best whilst she, in one of the best evocations of this particular anxiety I have yet come across, is gingerly creeping around Manderlay and trying to avoid running into the butler. A surprise of sorts comes in the latter stages of the book, and this particular twist brings us into contact with a female character who is anything but 'credulous'. This revelation might even change your idea of who the main character of this book really is too. (Speaking of amateur literary criticism, I have many fan theories about Rebecca, including that Maxim de Winter's estate manager, Frank Crawley, is actually having an affair with Max, and also that Maxim may have a lot more involvement in Mrs Danvers final act that he lets on.) An easily accessible novel (with a great-but-not-perfect 1940 adaptation by Alfred Hitchcock, Rebecca is a real indulgence.

A Clockwork Orange (1962) Anthony Burgess One of Stanley Kubrick's most prominent tricks was to use different visual languages in order to prevent the audience from immediately grasping the underlying story. In his 1975 Barry Lyndon, for instance, the intentionally sluggish pacing and elusive characters require significant digestion to fathom and appreciate, and the luminous and quasi-Renaissance splendour of the cinematography does its part to constantly distract the viewer from the film's greater meaning. This is very much the case in Kubrick's A Clockwork Orange as well whilst it ostensibly appears to be about a Saturnalia of violence, the 'greater meaning' of A Clockwork Orange pertains to the Christian conception of free will; admittedly, a much drier idea to bother making a film around. This is all made much clearer when reading Anthony Burgess' 1962 original novel. Alex became a 'true Christian' through the experimental rehabilitation process, and even offers to literally turn the other cheek at one point. But as Alex had no choice to do so (and can no longer choose to commit violence), he is incapable of making a free moral choice. Thus, is he really a Man? Yet whilst the book's central concern is our conception of free will in modern societies, it also appears to be a repudiation of two conservative principles. Firstly, A Clockwork Orange demolishes the idea that 'high art' leads to morally virtuous citizens. After all, if you can do a bit of the old ultra-violence whilst listening to the glorious 9th by old Ludvig van, then so much for the oft-repeated claims that culture makes you better as a person. (This, at least, I already knew from personal experience.) The other repudiation in A Clockwork Orange is in regard to the pervasive idea that the countryside is a refuge from crime and sin. By contrast, we see the gang commit their most horrific violence in rural areas, and, later, Alex is taken to the countryside by his former droogs for a savage beating. Although this doesn't seem to quite fit the novel, this was actually an important point for Burgess to include: otherwise his book could easily be read as a commentary on the corrupting influence of urban spaces, rather than of modernity itself. The language of this book cannot escape comment here. Alex narrates most of the book in a language called Nadsat, a fractured slang constructed by Burgess based on Russian and Cockney rhyming slang. (The language is strange for only a few pages, I promise. And note that 'Alex' is a very common Russian name.) Using Nadsat has the effect of making the book feel distinctly alien, but it also prevents it from prematurely aging too. Indeed, it comes as bit of a shock to realise that A Clockwork Orange was published 1962, the same year as The Beatles' released their first single, Love Me Do. I could probably say a whole lot more about this thoroughly engrossing book and its movie adaptation (eg. the meta-textual line in Kubrick's version: It's funny how the colours of the real world only seem really real when you watch them on a screen... appears verbatim in the textual original), but I'll leave it there. The book of A Clockwork Orange is not only worth the investment in the language, but is, again, somehow better than the film.

The Great Gatsby (1925) F. Scott Fitzgerald I'm actually being a little deceitful by including this book here: I cannot really say that The Great Gatsby was a 'favourite' read of the year, but its literary merit is so undeniable (and my respect for Fitzgerald's achievement is deep enough) that the experience was one of those pleasures you feel at seeing anything done well. Here you have a book so rich in symbolic meaning that you could easily confuse the experience with drinking Coke syrup undiluted. And a text that has made the difficulty and complexity of reading character a prominent theme of the novel, as well as a technical concern of the book itself. Yet at all times you have in your mind that The Great Gatsby is first and foremost a book about a man writing a book, and, therefore, about the construction of stories and myths. What is the myth being constructed in Gatsby? The usual answer today is that the book is really about the moral virtues of America. Or, rather, the lack thereof. Indeed, as James Boice wrote in 2016:
Could Wilson have killed Gatsby any other way? Could he have ran him over, or poisoned him, or attacked him with a knife? Not at all: this is an American story, the quintessential one, so Gatsby could have only died the quintessential American death.
The quintessential American death is, of course, being killed with a gun. Whatever your own analysis, The Great Gatsby is not only magnificently written, but it is captivating to the point where references intrude many months later. For instance, when reading something about Disney's 'princess culture', I was reminded of when Daisy says of her daughter: "I hope she'll be a fool; that's the best thing a girl can be in this world, a beautiful little fool." Or the billboard with the eyes of 'Doctor T. J. Eckleburg'. Or the fact that the books in Gatsby's library have never been read (so what is 'Owl Eyes' doing there during the party?!). And the only plain room in Gatsby's great house is his bedroom... Okay, fine, I must have been deluding myself: I love this novel.

10 August 2021

Shirish Agarwal: BBI, IP report, State Borders and Civil Aviation I

"If I have seen further, it is by standing on the shoulders of Giants." Isaac Newton, 1675 (although it should really be credited to the 12th-century Bernard of Chartres). You will come to know why I have shared this; it probably belongs at the beginning of Civil Aviation history itself.

Comments on the BBI court case which happened in Kenya, and the subsequent appeal. I am not going to share much about the coverage of the BBI appeal, as Gautam Bhatia has shared his observations quite eloquently, both on the initial case and on the subsequent appeal, which lasted 5 days in Kenya and was shown all around the world thanks to YouTube. One of the interesting points which stuck with me was that in Kenya, sign language is one of the official languages. In fact, I was able to read quite a bit about the various sign languages that exist in Kenya. It boggles the mind that some countries give importance to such things even though they are not as rich or as developed as what we call developed economies. I might give this more space and depth later, as it does carry some important judicial jurisprudence which is, and will be, felt around the world. How India reacts, or doesn't, is probably another matter altogether. But yes, it needs its own space, maybe after some more time. Report on the Standing Committee on IP Regulation in India and its false promises. Again, I do not want to take much time sharing details about what the report contains, as the report can be found here. I have uploaded it on WordPress, in case of an issue. An observation on the same subject can be found here. At least to me, and probably to those who have been following the IP space (whether using/working on free software or on IP itself), the issues shared have been known since 1994. And the report benefits the industry rather than the country. This way, the rent-seekers and monopolists win. There is ample literature showing how rich countries had weak IP regulation for decades and even centuries, till it became advantageous for them to have strong IP. One can look at the history of Europe and the United States for this. We can also look at the history of our neighbor China, which for the last 5 decades has used some provisions of IP and disregarded many others. But these words are of no use, as the policies made and shared are by the rich, for the rich.

Fighting between two State Borders Ironically, or because of it, two BJP-ruled states, Assam and Mizoram, fought between themselves, in which 6 policemen died. While the history of the two states is complicated, it becomes a bit more complicated when one goes back into Assam and ULFA history and comes to know that ULFA could not have become that powerful unless the Marwaris, people of my clan, had given generous donations to them. They thought it was a good investment, which later turned out to be untrue. Those who think ULFA has declined, or whatever, still don't have answers to this or this. Interestingly, both Chief Ministers approached the Home Minister (Mr. Amit Shah) of the BJP. Mr. Shah was supposed to be the Chanakya, but in many instances, including this one, he decided to stay away. His statement was on the lines of 'you guys figure it out yourselves'. There is a poem that was shared by the late poet Rahat Indori. I am sharing the same below as an image and will attempt a rough translation.
kisi ke baap ka hindustan todi hain Rahat Indori
Poets, whether in India or elsewhere, are known to speak truth to power and to be a bit of a rebel. This poem by Rahat Indori is provocatively titled 'Kisi ke baap ka Hindustan todi hai'. It challenges the majoritarian idea that Hindustan/India belongs only to the majoritarian religion. He also challenges, as well as asserts, that every Indian citizen, regardless of whatever his or her religion might be, is an Indian and can claim India as his home. While the whole poem is compelling in itself, for me what hits home is the second stanza:

'Lagegi Aag to aayege ghar kayi zad me, Yaha pe sirf hamara makan todi hai'. The meaning is simple yet subtle: he uses Aag, or Fire, as a symbol of hate, sharing that if hate spreads, it won't be his home alone that will be torched. If one wants to literally understand what he meant, I present to you the cult Russian movie No Escapes, or Ogon as it is known in Russian. If one were to decipher why the Russian film doesn't talk about climate change, one has to view it from the prism of what their leader Vladimir Putin has said and done over the years. As can be seen even there, the situation is far more complex than one imagines. It is interesting to note that he denied climate change was man-made till as late as last year, and was on the side of Trump throughout his presidency. This was in 2017, as well as perhaps this. Interestingly, there was a change in tenor and note just a couple of weeks back, but that could be mere politicking or much more. Statements that are not backed by legislation and application are usually just a whitewash. We would have to wait and see what concrete steps are taken by Putin, the Kremlin, and their Duma before saying either way.

Civil Aviation and the broad structure Civil Aviation is a large topic, and I would not be able to do justice to it all in one article/blog post. So, for example, I will not be getting into aircraft (Boeing, Airbus, Comac, etc.) or the new electric aircraft, as that would just make the blog post long. I will also not be talking about cargo, or visas, or many such topics, as all of them need their own space. So this will be much more limited to airports, and to some extent airlines, as one cannot survive without the other. The primary reason for doing this is that there is, and has been, a lot of myth-making in India about Civil Aviation in general, whether it has to do with Civil Aviation history or with whatever passes for policy in India.

A little early history Man has always looked at the stars and envisaged himself or herself as a bird, flying with gay abandon. In fact, there have been many paintings and sculptures that imagined how we would fly. The rudimentary steam engine itself was invented around 82 BCE. An early attempt to fly was made by a monk, Brother Eilmer of Malmesbury, around 1010 AD, long after the birth of that rudimentary steam engine. The most famous of all would be Leonardo da Vinci, for his amazing sketches of flying machines around 1493. Cyrano de Bergerac apparently wrote two books, both sadly published after his death. Interestingly, you can find both the books and the gentleman in the Project Gutenberg archives. How much of M. Cyrano's exploits were his own and how much was embellished by M. Curtis, maybe a friend, a lover, who knows; but it does give the air of the swashbuckling adventurer of the time, which many men aspired to. So, why not an author?

L'Autre Monde: ou les États et Empires de la Lune (Comical History of the States and Empires of the Moon) and Les États et Empires du Soleil (The States and Empires of the Sun): these two French books apparently had a lot of references to flying machines. Both were authored by Cyrano de Bergerac, and both were sadly published after his death, one apparently in 1656 and the other a couple of years later. By the 17th century, while it had become easy to know and measure latitude, measuring longitude was a problem. In fact, it can be argued, probably successfully, that India wouldn't have come under British rule, or the UK wouldn't have become a naval superpower, if it hadn't solved the longitude problem. Over the years, the British Royal Navy suffered many blows; one of the most famous, or infamous, among them was the Scilly naval disaster of 1707, which led to the death of 2,000-odd Royal Navy personnel and led Queen Anne, who was ruling over England at that time, to pass via Parliament the Longitude Act, which was basically an open competition for anybody to fix the problem, carrying prize money of £20,000. While nobody could claim the whole prize, many did get smaller amounts depending upon their achievements. The one who came nearest was John Harrison, who made the first sea-watch; with modifications over the years it became miniaturized to a pocket-sized marine chronometer, although I doubt the ones used today look anything like those of that era. But if that had not been invented, we surely would have been freed long ago, the assumption being that the East India Company would have dashed onto rocks so many times that the whole exercise would have been futile. The downside is that the maritime trade routes being used today, and the commerce, would not exist; neither would aircraft or space travel for that matter, or at the very least they would have been delayed by who knows how many years or decades. If one wants to read about the longitude problem, one can get the famous book 'Longitude'.

Many mythologies, including Indian and Arabian tales, feature the flying carpet, which would let its passengers go from one place to the next. Then there is also the mention of the Pushpak Vimana in ancient texts, but those secrets remain secrets. Think how much foreign exchange India could make by both using it and exporting it worldwide. There are many who believe in it, but sadly, the ones who know the secret don't seem to want India's progress. Just think of the carbon credits that India could have, which by itself would make India a superpower. And I'm being serious.

Western Ideas and Implementation. Even in the late 19th and early 20th centuries, there were many machines designed for controlled flight, but it was only the Wright Flyer that was able to demonstrate controlled flight, in 1903. The ones who came pretty close to what the Wrights achieved were Cayley and Langley, and the Wrights studied what these pioneers had done. They looked at what Otto Lilienthal had done, as he had done a lot of hang-gliding and had put a lot of literature into the public domain.

Furthermore, they also consulted Octave Chanute. The whole system and history of it is a bit complicated, but it does give a window into what happened then. So it won't be wrong to say that whatever the Wright Brothers accomplished would probably not have been possible, or would have taken years or maybe even decades longer, if that literature, those experiments, drawings, etc. had not been available in the commons. So, while they did their own experimentation, they also looked at what other people were doing and had done, which was in the public domain/commons.

They also did a lot of testing, which gave them new insights. Even the propulsion system they used in the 1903 flight followed the engine design of Nicolaus Otto. In fact, the aircraft would not have been born if the Chinese had not invented kites in the early sixth century A.D. One also has to credit Isaac Newton for the three laws of motion, without which none of the above could have happened. What is credited to the Wright brothers is not just that they made the Kitty Hawk flyer; they also made it commercial, as they sold it, and variations of the design, to the American military, and also set up a pilot school where pilots were trained for warfighting. 119-odd pilots came out of that school. The Wrights thought that air supremacy would end the war early, but this turned out to be a false hope.

Competition and Those Magnificent Men and their flying machines One of the first competitions to unlock creativity was the English Channel crossing prize offered by the Daily Mail. It was successfully claimed by the Frenchman Louis Blériot; you can read his account here. There were quite a few competitions before World War 1 broke out. There is a beautiful, humorous movie that dedicates itself to imagining how things might have gone at that time. In fact, there have been two movies: this one, and an earlier movie called Sky Riders, which made many a youth dream. The other movie sadly is not yet in the public domain, and when it will be, nobody knows, but if you see it, or even read about it, it gives you goosebumps.

World War 1 and Improvements to Aircraft World War 1 is remembered as the Great War or, in an attempt at irony, the War to end all wars. It destroyed a lot of people and property, and in fact laid the foundation of World War 2. At the same time, if World War 1 hadn't happened, airpower and plane technology would have taken decades longer to develop. Even medicine and medical techniques advanced revolutionarily due to World War 1. In order to be brief, I am not sharing much about World War 1 itself, otherwise that would become its own blog post; and while it had its heroes and villains, the who, when, and why could perhaps be tackled another time.

The Guggenheim Family and the birth of Civil Aviation If one has to credit one family for the birth of Civil Aviation, it has to be the Guggenheim family. Again, I would not like to dwell much, as much of their contribution has already been noted here. There are still quite a few things that need to be said and pointed out. First and foremost is the fact that they put lessons about flying into the syllabus from grade school to college and beyond, whereas in the Indian schooling system there is nothing like that to date. Here in India, even in engineering courses, you don't get much of this, unless and until you go for professional aviation or aeronautical courses; and most of those courses cost a bomb, so only the very rich or the very determined (with loans) go for them, at least that's what my friends have shared. And there is no guarantee you will get a job after that, especially in today's climate. Their funds, grants, and prizes were given to various people so that improvements could be made to United States Civil Aviation. This, as shared in the report/blog post linked above, was in response to what the younger brother saw as Europe having a large advantage in both military and civil aviation. They also made several grants to several universities, which would not only do notable work during the family's lifetime but carry on the legacy, researching different aspects of aircraft. One point that should be noted is that Europe was far ahead of the U.S. even then, which prompted the younger son. There had already been talk of civil/civilian flights on European routes, although much different from what either of us can imagine today. Even with everything that the U.S. had going for her, and still has, Europe is the one which has better airports, better facilities, better everything, even today. If you look at lists of airports by value for money or facilities, you will find many airports from Europe, some from Asia, and only a few from the U.S., even though Americans are some of the most frequent users of the service. But that debate and those arguments I will have to leave for perhaps the next blog post, as there is still a lot to be covered between the 1930s, the 1950s, and today. The Guggenheim archives do a fantastic job of sharing part of the story till the 1950s, but there is also quite a bit that they don't. I will probably start from there in the next blog post and carry on ahead. Lastly, before I wind up, I have to share why I felt the need to write, capture, and share this part of aviation history. The plain and simple reason is that many of the people I meet, whether on the web, on Twitter, or even in real life, are just unaware of how this whole thing came about. The unawareness among my fellow brothers and sisters is just shocking, overwhelming. By sharing these articles, I would at least be able to guide them, or at least let them know how it all came to be and where things are going, and not leave them so clueless. Till later.

5 December 2020

Joachim Breitner: Named goals in Coq

TL;DR: Some mildly tricky tricks allow you to select subgoals by name, like in Isabelle. This leads to more readable proofs, and proofs that are more robust against changes.

The need for case names Consider the following Coq proof, right out of the respected Software Foundations book, chapter 'Inductively Defined Propositions' (and slightly simplified):
Lemma star_app: forall T (s1 s2 : list T) (re : reg_exp T),
  s1 =~ Star re ->
  s2 =~ Star re ->
  s1 ++ s2 =~ Star re.
Proof.
  intros T s1 s2 re H1.
  remember (Star re) as re'.
   generalize dependent s2.
  induction H1.
  - (* MEmpty *) discriminate.
  - (* MChar *) discriminate.
  - (* MApp *) discriminate.
  - (* MUnionL *) discriminate.
  - (* MUnionR *) discriminate.
  - (* MStar0 *)
    injection Heqre' as Heqre''. intros s H. apply H.
  - (* MStarApp *)
    injection Heqre' as Heqre''.
    intros s3 H1. rewrite <- app_assoc.
    apply MStarApp.
    + apply H1_.
    + apply IHexp_match2.
      * rewrite Heqre''. reflexivity.
      * apply H1.
Qed.
This is a very typical example of a proof by induction over an inductively defined relation (in this case, the relation s =~ re, which indicates that the string s matches the regular expression re): after some preparation, the invocation of a tactic like induction (it could also be inversion or others) creates a number of new goals, one for each rule of the relation. Then the modern(!) Coq user uses proof bullets to focus the goals, one after another, in order, to solve them. We can see from this example that proof bullets (which were a big improvement) are not quite satisfactory, as the author of this proof felt the need to annotate each subproof with the case it corresponds to. But these are mere comments! This is unsatisfactory, as there is nothing keeping them correct as my proof changes. And anyone who came to Coq from Isabelle is just flabbergasted that the state of the art in Coq is still at that level. So, wouldn't it be great if
  • I could focus goals by their name (e.g. MEmpty)
  • Focus them in any order (so that refactoring of my data types does not break my proofs)
  • This even worked when simultaneously inducting over or inverting multiple relations?

Simulating named goals in Coq Well, with a few tricks we can! It just takes three simple steps:
  1. I load a small file full of hacks (explained below):
    Require Import NamedCases.
  2. This allows me to prefix the type of the constructors of the inductive relations (i.e. the rules of the relation) with case <name>, as follows:
    Reserved Notation "s =~ re" (at level 80).
    Inductive exp_match {T} : list T -> reg_exp T -> Prop :=
      | MEmpty:
        case empty,
        [] =~ EmptyStr
      | MChar:
        case char,
        forall x,
        [x] =~ (Char x)
      | MApp:
        case app,
        forall s1 re1 s2 re2
        (H1 : s1 =~ re1)
        (H2 : s2 =~ re2),
        (s1 ++ s2) =~ (App re1 re2)
      | MUnionL:
        case union_l,
        forall s1 re1 re2
        (H1 : s1 =~ re1),
        s1 =~ (Union re1 re2)
      | MUnionR:
        case union_r,
        forall re1 s2 re2
        (H2 : s2 =~ re2),
        s2 =~ (Union re1 re2)
      | MStar0:
        case star0,
        forall re,
        [] =~ (Star re)
      | MStarApp:
        case star_app,
        forall s1 s2 re
        (H1 : s1 =~ re)
        (H2 : s2 =~ (Star re)),
        (s1 ++ s2) =~ (Star re)
      where "s =~ re" := (exp_match s re).
    This is the definition from the Software Foundations book; only the case <name>, lines are added.
  3. Now I can change the proof: after the induction tactic I use ; name_cases to name the cases, and then I can use the relatively new named goal focusing feature ([name]: { ... }, which I suggested two years ago, and which was implemented by Théo Zimmermann) to focus a specific case:
    Lemma star_app2: forall T (s1 s2 : list T) (re : reg_exp T),
      s1 =~ Star re ->
      s2 =~ Star re ->
      s1 ++ s2 =~ Star re.
    Proof.
      intros T s1 s2 re H1.
      remember (Star re) as re'.
       generalize dependent s2.
      induction H1; name_cases.
      [empty]: {
        discriminate.
      }
      [char]: {
        discriminate.
      }
      [app]: {
        discriminate.
      }
      [unionL]: {
        discriminate.
      }
      [unionR]: {
        discriminate.
      }
      [star0]: {
        injection Heqre' as Heqre''. intros s H. apply H.
      }
      [starApp]: {
        injection Heqre' as Heqre''.
        intros s3 H1. rewrite <- app_assoc.
        apply MStarApp; clear_names.
        + apply H1_.
        + apply IHexp_match2.
          * rewrite Heqre''. reflexivity.
          * apply H1.
      }
    Qed.
    The comments have turned into actual names!
Goal focusing is now just much more reliable. For example, let me try to discharge the boring cases by using ; try discriminate after the induction. This means there are suddenly only two goals left. With bullets, the first bullet would now suddenly focus a different goal, and my proof script would likely fail somewhere. With named goals, I get a helpful error message ('No such goal: empty') and I can't even enter the first subproof. So let me delete these goals and, just to show off, focus the remaining goals in the wrong order:
Lemma star_app: forall T (s1 s2 : list T) (re : reg_exp T),
  s1 =~ Star re ->
  s2 =~ Star re ->
  s1 ++ s2 =~ Star re.
Proof.
  intros T s1 s2 re H1.
  remember (Star re) as re'.
   generalize dependent s2.
  induction H1; name_cases; try discriminate.
  [starApp]: {
    injection Heqre' as Heqre''.
    intros s3 H1. rewrite <- app_assoc.
    apply MStarApp; clear_names.
    + apply H1_.
    + apply IHexp_match2.
      * rewrite Heqre''. reflexivity.
      * apply H1.
  }
  [star0]: {
    injection Heqre' as Heqre''. intros s H. apply H.
  }
Qed.
I did not explain the clear_names tactic yet that I had to use after apply MStarApp. What I am showing here is a hack, and it shows: these case names are extra hypotheses (of a trivially true type CaseName), and this tactic gets rid of them.

Even better than Isabelle The above even works when there are multiple inductions/inversions involved. To give an example, let's introduce another inductive predicate, and begin proving a lemma that one can do by induction on one predicate and inversion on the other.
Inductive Palindrome {T} : list T -> Prop :=
  | EmptyPalin:
    case emptyP,
    Palindrome []
  | SingletonPalin:
    case singletonP,
    forall x,
    Palindrome [x]
  | GrowPalin:
    case growP,
    forall l x,
    Palindrome l -> Palindrome ([x] ++ l ++ [x])
.
Lemma palindrome_star_of_two:
  forall T (s : list T) x y,
  Palindrome s -> s =~ Star (App (Char x) (Char y)) ->
  s = [] \/ x = y.
Proof.
  intros T s x y HPalin HRe.
  induction HPalin; inversion HRe; name_cases.
At this point we have four open goals. Can you tell why there are four, and which ones they are? Luckily, Show Existentials will tell us their names (and Coq 8.13 will just show them in the proof state window). These names not only help us focus the goal in a reliable way; they even assist us in understanding what goal we have to deal with. For example, the last goal arose from the GrowP rule and the StarApp rule. Neat!
  [emptyP_star0]: {
    left. reflexivity.
  }
  [emptyP_starApp]: {
    left. reflexivity.
  }
  [singletonP_starApp]: {
    inversion HRe; clear HRe.
    inversion H5; clear H5.
    inversion H10; clear H10.
    inversion H11; clear H11.
    subst.
    inversion H3.
  }
  [growP_starApp]: {
    admit. (* I did not complete the proof *)
  }

Under the hood (the hack) The above actually works, but it works with a hack, and I wish that Coq's induction tactic would just do the right thing here. I hope this will eventually be the case, and this section will then be a mere curiosity; but until then, it may actually be helpful.
  1. The trick[1] is to leave markers in the proof state. We use a dedicated type CaseName, and a hypothesis name : CaseName indicates that this goal arose from that case.
    Inductive CaseName := CaseNameI.
    With this, we can write StarApp: forall (starApp : CaseName), to name this case.
  2. For a little bit of polish, we can introduce notation for that:
    Notation "'case' x , t" := (forall {x : CaseName}, t) (at level 200).
    This allows us to use the syntax you saw above.
  3. To easily discharge these CaseName assumptions when using the introduction forms, I wrote these tactics:
    Ltac clear_names := try exact CaseNameI.
    Ltac named_constructor  := constructor; [ exact CaseNameI | idtac .. ].
    Ltac named_econstructor := econstructor; [ exact CaseNameI | idtac .. ].
  4. Finally I wrote the tactic name_cases. What it does is relatively simple:
    For each subgoal: Take all hypotheses of type CaseName, concatenate their names, change the name of the current goal to that, and clear these hypotheses
    Unfortunately, my grasp of Coq tactic programming is rather rudimentary, and the only way I managed to solve this is a horrible monster of Ltac2, including hand-rolled imperative implementations of string concatenation and a little nested Ltac1 tactic within. You have been warned:
    From Ltac2 Require Import Ltac2.
    From Ltac2 Require Option.
    Set Default Proof Mode "Classic".
    Ltac name_cases := ltac2:(
      (* Horribly manual string manipulations. Does this mean I should
         go to the Ocaml level?
      *)
      let strcpy s1 s2 o :=
        let rec go := fun n =>
          match Int.lt n (String.length s2) with
            | true => String.set s1 (Int.add o n) (String.get s2 n); go (Int.add n 1)
            | false => ()
          end
        in go 0
      in
      let concat := fun s1 s2 =>
        let l := Int.add (Int.add (String.length s1) (String.length s2)) 1 in
        let s := String.make l (Char.of_int 95) in
        strcpy s s1 0;
        strcpy s s2 (Int.add (String.length s1) 1);
        s
      in
      Control.enter (let rec go () :=
        lazy_match! goal with
          | [ h1 : CaseName, h2 : CaseName |- _ ] =>
          (* Multiple case names? Combine them (and recurse) *)
          let h := concat (Ident.to_string h1) (Ident.to_string h2) in
          Std.clear [h1; h2];
          let h := Option.get (Ident.of_string h) in
          assert ($h := CaseNameI);
          go ()
          | [ _ : CaseName |- _ ] =>
          (* A single case name? Set current goal name accordingly. *)
          ltac1:(
            (* How to do this in ltac2? *)
            lazymatch goal with
              [ H : CaseName |- _ ] => refine ?[H]; clear H
            end
          )
          | [ |- _ ] =>
          Control.backtrack_tactic_failure "Did not find any CaseName hypotheses"
        end
      in go)
    ).
    If someone feels like re-implementing this in a way that is more presentable, I'll be happy to update this code (and learn something along the way).
That's all it takes!

Conclusion With some mild hackery, we can get named cases in Coq, which I find indispensable in managing larger, evolving proofs. I wonder why this style is not more common, and not available out of the box; maybe it is just not well known enough? Maybe this blog post can help a bit here.

  1. I have vague recollections of seeing at least parts of this trick on some blog some years ago, and I do not claim originality here.

21 November 2020

Michael Stapelberg: Debian Code Search: positional index, TurboPFor-compressed

See the Conclusion for a summary if you're impatient :-)

Motivation Over the last few months, I have been developing a new index format for Debian Code Search. This required a lot of careful refactoring, re-implementation, debug tool creation and debugging. Multiple factors motivated my work on a new index format:
  1. The existing index format has a 2G size limit, into which we have bumped a few times, requiring manual intervention to keep the system running.
  2. Debugging the existing system required creating ad-hoc debugging tools, which made debugging sessions unnecessarily lengthy and painful.
  3. I wanted to check whether switching to a different integer compression format would improve performance (it does not).
  4. I wanted to check whether storing positions with the posting lists would improve performance of identifier queries (= queries which are not using any regular expression features), which make up 78.2% of all Debian Code Search queries (it does).
I figured building a new index from scratch was the easiest approach, compared to refactoring the existing index to increase the size limit (point 1). I also figured it would be a good idea to develop the debugging tool in lock step with the index format, so that I can be sure the tool works and is useful (point 2).

Integer compression: TurboPFor As a quick refresher, search engines typically store document IDs (representing source code files, in our case) in an ordered list (a 'posting list'). It usually makes sense to apply at least a rudimentary level of compression: our existing system used variable integer encoding. TurboPFor, the self-proclaimed 'Fastest Integer Compression' library, combines an advanced on-disk format with a carefully tuned SIMD implementation to reach better speeds (in micro benchmarks) at less disk usage than Russ Cox's varint implementation in github.com/google/codesearch. If you are curious about its inner workings, check out my 'TurboPFor: an analysis'. Applied to the Debian Code Search index, TurboPFor indeed compresses integers better:

Disk space
  8.9G  codesearch varint index
  5.5G  TurboPFor index

Switching to TurboPFor (via cgo) for storing and reading the index results in a slight speed-up of a dcs replay benchmark, which is more pronounced the more i/o is required.

Query speed (regexp, cold page cache)
  18s  codesearch varint index
  14s  TurboPFor index (cgo)

Query speed (regexp, warm page cache)
  15s  codesearch varint index
  14s  TurboPFor index (cgo)

Overall, TurboPFor is an all-around improvement in efficiency, albeit with a high cost in implementation complexity.
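For readers who want to see the baseline concretely, here is a minimal sketch (in Go, with names of my own choosing; this is illustrative code, not actual Debian Code Search or TurboPFor code) of the delta + varint scheme the old index used: posting lists are sorted, so storing the gaps between successive document IDs keeps the numbers small, and variable-length integer encoding then spends fewer bytes on small numbers.

package main

import (
	"encoding/binary"
	"fmt"
)

// encodePostings delta-encodes a sorted posting list and writes each gap
// as a variable-length integer (small gaps cost fewer bytes).
func encodePostings(docids []uint32) []byte {
	buf := make([]byte, 0, len(docids)*binary.MaxVarintLen32)
	var prev uint32
	for _, id := range docids {
		buf = binary.AppendUvarint(buf, uint64(id-prev))
		prev = id
	}
	return buf
}

// decodePostings reverses encodePostings.
func decodePostings(buf []byte) []uint32 {
	var out []uint32
	var prev uint32
	for len(buf) > 0 {
		gap, n := binary.Uvarint(buf)
		buf = buf[n:]
		prev += uint32(gap)
		out = append(out, prev)
	}
	return out
}

func main() {
	ids := []uint32{3, 7, 8, 1000, 1001}
	enc := encodePostings(ids)
	fmt.Printf("%d docids encoded into %d bytes\n", len(ids), len(enc))
	fmt.Println(decodePostings(enc)) // [3 7 8 1000 1001]
}

Roughly speaking, TurboPFor replaces the per-integer varint step with block-wise bit-packing plus SIMD decoding, which is where both the space and speed gains come from.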

Positional index: trade more disk for faster queries This section builds on the previous section: all figures come from the TurboPFor index, which can optionally support positions. Conceptually, we're going from:
type docid uint32
type index map[trigram][]docid
to:
type occurrence struct {
    doc docid
    pos uint32 // byte offset in doc
}
type index map[trigram][]occurrence
The resulting index consumes more disk space, but can be queried faster:
  1. We can do fewer queries: instead of reading all the posting lists for all the trigrams, we can read the posting lists for the query's first and last trigram only.
    This is one of the tricks described in the paper AS-Index: A Structure For String Search Using n-grams and Algebraic Signatures (PDF), and goes a long way without incurring the complexity, computational cost and additional disk usage of calculating algebraic signatures.
  2. Verifying that the delta between the last and first position matches the length of the query term significantly reduces the number of files to read (lower false positive rate); a sketch of this check follows right after this list.
  3. The matching phase is quicker: instead of locating the query term in the file, we only need to compare a few bytes at a known offset for equality.
  4. More data is read sequentially (from the index), which is faster.
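To make point 2 concrete, here is a hypothetical sketch of that filter in Go. The types mirror the occurrence snippet above; the function name and the merge strategy are my own illustration, not the actual Debian Code Search implementation. It assumes both posting lists are sorted by (doc, pos).

package main

import "fmt"

type docid uint32

type occurrence struct {
	doc docid
	pos uint32 // byte offset in doc
}

// candidates reads only the posting lists of the query's first and last
// trigram, and keeps an occurrence of the first trigram only if the same
// document contains the last trigram exactly len(query)-3 bytes later.
func candidates(query string, index map[string][]occurrence) []occurrence {
	if len(query) < 3 {
		return nil // a trigram index needs at least one full trigram
	}
	first := index[query[:3]]
	last := index[query[len(query)-3:]]
	want := uint32(len(query) - 3)

	var out []occurrence
	j := 0
	for _, f := range first {
		target := occurrence{doc: f.doc, pos: f.pos + want}
		// Advance through the (sorted) second list until we reach target.
		for j < len(last) && (last[j].doc < target.doc ||
			(last[j].doc == target.doc && last[j].pos < target.pos)) {
			j++
		}
		if j < len(last) && last[j] == target {
			out = append(out, f)
		}
	}
	return out
}

func main() {
	idx := map[string][]occurrence{
		"han": {{doc: 1, pos: 10}, {doc: 2, pos: 0}},
		"dle": {{doc: 1, pos: 13}, {doc: 2, pos: 7}},
	}
	// For "handle", "dle" must start exactly 3 bytes after "han".
	fmt.Println(candidates("handle", idx)) // [{1 10}]; doc 2 is filtered out
}

Anything that survives this filter still has to be verified against the actual file, but as points 2 and 3 note, far fewer files qualify and the verification is a cheap comparison at a known offset.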

Disk space A positional index consumes significantly more disk space, but not so much as to pose a challenge: a Hetzner EX61-NVME dedicated server (€64/month) provides 1 TB worth of fast NVMe flash storage.
  6.5G  non-positional
  123G  positional
   93G  positional (posrel)

The idea behind the positional index (posrel) is to not store a (doc,pos) tuple on disk, but to store positions, accompanied by a stream of doc/pos relationship bits: 1 means this position belongs to the next document, 0 means this position belongs to the current document. This is an easy way of saving some space without modifying the TurboPFor on-disk format: the posrel technique reduces the index size from 123G to 93G, i.e. by roughly a quarter. With the increase in size, the Linux page cache hit ratio will be lower for the positional index, i.e. more data will need to be fetched from disk for querying the index. As long as the disk can deliver data as fast as you can decompress posting lists, this only translates into one disk seek's worth of additional latency. This is the case with modern NVMe disks that deliver thousands of MB/s, e.g. the Samsung 960 Pro (used in Hetzner's aforementioned EX61-NVME server). The values were measured by running dcs du -h /srv/dcs/shard*/full without and with the -pos argument.
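As a toy illustration of the posrel idea (my reconstruction from the prose above; the real on-disk format is TurboPFor-compressed and certainly differs in detail), here is how such a bit stream could be decoded back into (doc, pos) tuples:

package main

import "fmt"

type docid uint32

type occurrence struct {
	doc docid
	pos uint32
}

// decodePosrel pairs a flat position stream with one relationship bit per
// position: true ("1") means the position belongs to the next document in
// docs, false ("0") means it belongs to the current one. This sketch
// assumes the first position carries a false bit for the first document.
func decodePosrel(docs []docid, positions []uint32, rel []bool) []occurrence {
	out := make([]occurrence, 0, len(positions))
	d := 0
	for i, pos := range positions {
		if rel[i] {
			d++ // switch to the next document
		}
		out = append(out, occurrence{doc: docs[d], pos: pos})
	}
	return out
}

func main() {
	docs := []docid{10, 42}
	positions := []uint32{5, 9, 3}
	rel := []bool{false, false, true} // the third position starts doc 42
	fmt.Println(decodePosrel(docs, positions, rel))
	// [{10 5} {10 9} {42 3}]
}

The savings come from not repeating the document ID for every position: one bit per position replaces a full (doc, pos) tuple.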

Bytes read A positional index requires fewer queries: reading only the first and last trigram's posting lists and positions is sufficient to achieve a lower (!) false positive rate than evaluating all trigrams' posting lists in a non-positional index. As a consequence, fewer files need to be read, resulting in fewer bytes to read from disk overall. As an additional bonus, in a positional index more data is read sequentially (from the index), which is faster than random i/o, regardless of the underlying disk.
  regexp queries:      1.2G (index) + 19.8G (files) = 21.0G
  identifier queries:  4.2G (index) + 10.8G (files) = 15.0G

The values were measured by running iostat -d 25 just before running bench.zsh on an otherwise idle system.

Query speed Even though the positional index is larger and requires more data to be read at query time (see above), thanks to the C TurboPFor library, the 2 queries on a positional index are roughly as fast as the n queries on a non-positional index (roughly 4s instead of 3s). This is more than made up for by the combined i/o + matching stage, which shrinks from 18.5s (7.1s i/o + 11.4s matching) to 1.3s.
  regexp queries:      3.3s (index) + 7.1s (i/o) + 11.4s (matching) = 21.8s
  identifier queries:  3.92s (index) + 1.3s (i/o + matching) = 5.22s

Note that identifier query i/o was sped up not just by needing to read fewer bytes, but also by only having to verify bytes at a known offset instead of needing to locate the identifier within the file.

Conclusion The new index format is overall slightly more efficient. This disk space efficiency allows us to introduce a positional index section for the first time. Most Debian Code Search queries are identifier queries (78.2%) and will be answered much quicker by leveraging the positions. Bottom line: it is beneficial to use a positional index on disk over a non-positional index in RAM.

23 March 2020

Dima Kogan: org-babel for documentation

So I just gave a talk at SCaLE 18x about numpysane and gnuplotlib, two libraries I wrote to make using numpy bearable. With these two, it's actually quite nice! Prior to the talk I overhauled the documentation for both these projects. The gnuplotlib docs now have a tutorial/gallery page, which is interesting enough to write about. Check it out! Mostly it is a sequence of code snippets and the plots they produce. Clearly you want the plots in the documentation to correspond to the code, so you want something to actually run each code snippet to produce each plot. Automatically. I don't want to maintain these manually, and periodically discover that the code doesn't make the plot I claim it does, or worse: that the code barfs. This is vaguely what Jupyter notebooks do, but they're ridiculous, so I'm doing something better: the tutorial is an .org document, each snippet is an org-babel block, and re-evaluating the document regenerates every plot. That's it. The git repo is hosted by github, which has a rudimentary renderer for .org documents. I'm committing the .svg files, so that's enough to get rendered documentation that looks nice. Note that the usual workflow is to use org to export to html, but here I'm outsourcing that job to github; I just make the .svg files, and that's enough. Look at the link again: gnuplotlib tutorial/gallery. This is just a .org file committed to the git repo. github is doing its normal org->html thing to display this file. This has drawbacks too: github is ignoring the :noexport: tag on the init section at the end of the file, so it's actually showing all the emacs lisp goop that makes this work (described below!). It's at the end, so I guess this is good enough. Those of us that use org-mode would be completely unsurprised to hear that the talk is also written as an .org document. And the slides that show gnuplotlib plots use the same org-babel system to render the plots. It's all oh-so-nice. As with anything as flexible as org-babel, it's easy to get into a situation where you're bending it to serve a not-quite-intended purpose. But since this all lives in emacs, you can make it do whatever you want with a bit of emacs lisp. I ended up advising a few things (mailing list post here). And I stumbled on an (arguable) bug in emacs that needed working around (mailing list post here). I'll summarize both here.

Handling large Local Variables blocks
The advice I ended up with was longer than emacs expected, which made emacs not evaluate it when loading the buffer. As I discovered (see the mailing list post), the loading code looks for the string Local Variables in the last 3000 bytes of the buffer only, and I exceeded that. Stefan Monnier suggested a workaround in this post. Instead of the normal Local Variables block at the end:
Local Variables:
eval: (progn ... ...
             ... ...
             LONG chunk of emacs-lisp
      )
End:
I do this:
(progn ;;local-config
   lisp lisp lisp
   as long as I want
)
Local Variables:
eval: (progn (re-search-backward "^(progn ;;local-config") (eval (read (current-buffer))))
End:
So emacs sees a small chunk of code that searches backwards through the buffer (as far back as needed) for the real lisp to evaluate. As an aside, this blog is also an .org document, and the lisp snippets above are org-babel blocks that I'm not evaluating. The exporter knows to respect the emacs-lisp syntax highlighting, however.

Advice
OK, so what was all the stuff I needed to tell org-babel to do specially here? First off, org needed to be able to communicate to the Python session the name of the file to write the plot to. I do this by making the whole plist for this org-babel snippet available to python:
;; THIS advice makes all the org-babel parameters available to python in the
;; _org_babel_params dict. I care about _org_babel_params['_file'] specifically,
;; but everything is available
(defun dima-org-babel-python-var-to-python (var)
  "Convert an elisp value to a python variable.
  Like the original, but supports (a . b) cells and symbols
"
  (if (listp var)
      (if (listp (cdr var))
          (concat "[" (mapconcat #'org-babel-python-var-to-python var ", ") "]")
        (format "\"\"\"%s\"\"\"" var))
    (if (symbolp var)
        (format "\"\"\"%s\"\"\"" var)
      (if (eq var 'hline)
          org-babel-python-hline-to
        (format
         (if (and (stringp var) (string-match "[\n\r]" var)) "\"\"%S\"\"" "%S")
         (if (stringp var) (substring-no-properties var) var))))))
(defun dima-alist-to-python-dict (alist)
  "Generates a string defining a python dict from the given alist"
  (let ((keyvalue-list
         (mapcar (lambda (x)
                   (format "%s = %s, "
                           (replace-regexp-in-string
                            "[^a-zA-Z0-9_]" "_"
                            (symbol-name (car x)))
                           (dima-org-babel-python-var-to-python (cdr x))))
                 alist)))
    (concat
     "dict( "
     (apply 'concat keyvalue-list)
     ")")))
(defun dima-org-babel-python-pass-all-params (f params)
  (cons
   (concat
    "_org_babel_params = "
    (dima-alist-to-python-dict params))
   (funcall f params)))
(unless
    (advice-member-p
     #'dima-org-babel-python-pass-all-params
     #'org-babel-variable-assignments:python)
  (advice-add
   #'org-babel-variable-assignments:python
   :around #'dima-org-babel-python-pass-all-params))
So if there's a :file plist key, the python code can grab that, and write the plot to that filename. But I don't really want to specify an output file for every single org-babel snippet. All I really care about is that each plot gets a unique filename. So I omit the :file key entirely, and use this advice to generate one for me:
;; This sets a default :file tag, set to a unique filename. I want each demo to
;; produce an image, but I don't care what it is called. I omit the :file tag
;; completely, and this advice takes care of it
(defun dima-org-babel-python-unique-plot-filename
    (f &optional arg info params)
  (funcall f arg info
           (cons (cons ':file
                       (format "guide-%d.svg"
                               (condition-case nil
                                   (setq dima-unique-plot-number (1+ dima-unique-plot-number))
                                 (error (setq dima-unique-plot-number 0)))))
                 params)))
(unless
    (advice-member-p
     #'dima-org-babel-python-unique-plot-filename
     #'org-babel-execute-src-block)
  (advice-add
   #'org-babel-execute-src-block
   :around #'dima-org-babel-python-unique-plot-filename))
This uses the dima-unique-plot-number integer to keep track of each plot. I increment this with each plot. Getting closer. It isn't strictly required, but it'd be nice if each plot had the same output filename each time I generated it. So I want to reset the plot number to 0 each time:
;; If I'm regenerating ALL the plots, I start counting the plots from 0
(defun dima-reset-unique-plot-number
    (&rest args)
    (setq dima-unique-plot-number 0))
(unless
    (advice-member-p
     #'dima-reset-unique-plot-number
     #'org-babel-execute-buffer)
  (advice-add
   #'org-babel-execute-buffer
   :after #'dima-reset-unique-plot-number))
Finally, I want to lie to the user a little bit. The code I'm actually executing writes each plot to an .svg. But the code I'd like the user to see should use the default output: an interactive, graphical window. I do that by tweaking the python session to tell the gnuplotlib object to write to .svg files from org by default, instead of using the graphical terminal:
;; I'm using github to display guide.org, so I'm not using the "normal" org
;; exporter. I want the demo text to not contain the hardcopy= tags, but clearly
;; I need the hardcopy tag when generating the plots. I add some python to
;; override gnuplotlib.plot() to add the hardcopy tag somewhere where the reader
;; won't see it. But where to put this python override code? If I put it into an
;; org-babel block, it will be rendered, and the :export tags will be ignored,
;; since github doesn't respect those (probably). So I put the extra stuff into
;; an advice. Whew.
(defun dima-org-babel-python-set-demo-output (f body params)
  (with-temp-buffer
    (insert body)
    (beginning-of-buffer)
    (when (search-forward "import gnuplotlib as gp" nil t)
      (end-of-line)
      (insert
       "\n"
       "if not hasattr(gp.gnuplotlib, 'orig_init'):\n"
       "    gp.gnuplotlib.orig_init = gp.gnuplotlib.__init__\n"
       "gp.gnuplotlib.__init__ = lambda self, *args, **kwargs: gp.gnuplotlib.orig_init(self, *args, hardcopy=_org_babel_params['_file'] if 'file' in _org_babel_params['_result_params'] else None, **kwargs)\n"))
    (setq body (buffer-substring-no-properties (point-min) (point-max))))
  (funcall f body params))
(unless
    (advice-member-p
     #'dima-org-babel-python-set-demo-output
     #'org-babel-execute:python)
  (advice-add
   #'org-babel-execute:python
   :around #'dima-org-babel-python-set-demo-output))
)
And that's it. The advice in the talk is slightly different, in uninteresting ways. Some of this should be upstreamed to org-babel somehow. It's not entirely clear which part, but I'll cross that bridge when I get to it.

10 August 2017

Thorsten Glaser: New mksh and jupp releases, mksh FAQ, jupprc for JOE 4.4; MuseScore

mksh R56 was released with experimental fixes for the 'history no longer persisted when HISTFILE near-full' and 'interactive shell cannot wait on coprocess by PID' issues (I hope they do not introduce any regressions) and is otherwise a bugfix release. You might wish to know that the $EDITOR selection mechanism in dot.mkshrc changed. Some more alias characters are allowed again, and POSIX character classes (for ASCII, and EBCDIC, only) appeared by popular vote. mksh now has a FAQ; enjoy. Do feel free to contribute (answers, too, of course). The jupp text editor has also received a new release; aside from being much smaller, updated (mksh too, btw) to Unicode 10, and containing some segfault fixes, it features falling back to using /dev/tty if stdin or stdout is not a terminal (for use on GNU with find | xargs jupp, since they don't have our xargs(1) -o option yet), a new command to exit nonzero (sometimes, utilities invoking the generic visual editor need this), and 'presentation mode'. Presentation mode, crediting Natureshadow, is basically putting your slides as (UTF-8, with fancy stuff inside) plaintext files into one directory, with sorting names (so e.g. zero-padded slide numbers as filenames), and presenting them with jupp * in a fullscreen xterm. You'd hit F6 to switch to one-file view first, then present by using F8 to go forward (F7 to go backward), and, for demonstrations, F9 to pipe the entire slide through an external command (could be just sh), offering the previous one as default. Simple yet powerful; I imagine Sven Guckes would love it, were he not such a vim user. The new release is offered as a source tarball (as usual) and in distribution packages, but also, again, as a Win32 version in a PKZIP archive (right-click on setup.inf and hit Install to install it). Note that this comes with its own (thankfully local) version of the Cygwin32 library (compatible down to Windows 95, apparently), so if you have Cygwin installed yourself you're better off compiling it there and using your own version instead. I've also released a new DOS version of jupp 2.8 with no code patches but an updated jupprc; the binary (self-extracting LHarc archive) this time comes with all resource files, not just jupp's. Today, the jupprc drop-in file for JOE 3.7 got a matching update (and some fixes for bugs discovered during that), and I added a new one for JOE 4.4 (the former being in Debian wheezy, the latter in jessie, stretch and buster/sid). It's a bit rudimentary (the new shell window functionality is absent) but, mostly, gives the desired jupp feeling, more so than just using stock jstar would. Also worth mentioning: CVS's ability to commit to multiple branches of a file at the same time, therefore grouping the commit (by commitid at least; unsure if cvsps et al. can be persuaded to recognise it). If you don't know what cvs(GNU) is: it is a proper (although not distributed) version control system, and the best for centralised tasks. (For decentral tasks, abusing git as a pseudo-VCS has won by popularity vote; take this as a comparison.) If desired, I can make these new versions available in my WTF APT repository on request. (Debian buster/sid users: please change https to http there; the site is only available with TLSv1.0, as it doesn't require bank-level security.) I'd welcome it very much if people using an OS which does not yet carry either of them would package them there.
Message me when one more is added, too. In unrelated news, I uploaded MuseScore 2.1 to Debian unstable, mostly because the maintainers are busy (though I could co-maintain it if needed; I'd just need help with the C++ and CMake details). A bonus side effect is that I can now build 2.2~ test versions with patches of mine added, which I plan to produce to fix some issues (and submit upstream). (read more…)

1 July 2017

Daniel Silverstone: F/LOSS activity, June 2017

It seems to be becoming popular to send a mail each month detailing your free software work for that month. I have been slowly ramping my F/LOSS activity back up, after years away while I worked on completing my degree. My final exam for that was in June 2017, and as such I am now in a position to try and get on with more F/LOSS work. My focus, as you might expect, has been on Gitano, which reached 1.0 in time for Stretch's release and which is now heading gently toward a 1.1 release, which we have timed for Debconf 2017. My friend and colleague Richard has been working hard on Gitano and related components during this time too, and I hope that Debconf will be an opportunity for him to meet many of my Debian friends. But enough of that; back to the F/LOSS. We've been running Gitano developer days roughly monthly since March of 2017, and the June developer day was attended by myself, Richard Maw, and Richard Ipsum. You are invited to read the wiki page for the developer day if you want to know exactly what we got up to, but a summary of my involvement that day is: Other than that, related to Gitano during June I: My non-Gitano F/LOSS related work in June has been entirely centred around the support I provide to the Lua community in the form of the Lua mailing list and website. The host on which it's run is ailing, and I've had to spend time trying to improve and replace that. Hopefully I'll have more to say next month. Perhaps by doing this reporting I'll get more F/LOSS done. Of course, July's report will be sent out while I'm in Montréal for debconf 2017 (or at least for debcamp at that point), so hopefully more to say anyway.
