Search Results: "spectra"

24 January 2024

Louis-Philippe Véronneau: Montreal Subway Foot Traffic Data, 2023 edition

For the fifth year in a row, I've asked Société de Transport de Montréal, Montreal's transit agency, for the foot traffic data of Montreal's subway. By clicking on a subway station, you'll be redirected to a graph of the station's foot traffic. Licences

27 May 2023

Dirk Eddelbuettel: RcppArmadillo 0.12.4.0.0 on CRAN: New Upstream Minor

Armadillo is a powerful and expressive C++ template library for linear algebra and scientific computing. It aims towards a good balance between speed and ease of use, has a syntax deliberately close to Matlab, and is useful for algorithm development directly in C++, or quick conversion of research code into production environments. RcppArmadillo integrates this library with the R environment and language and is widely used by (currently) 1074 other packages on CRAN, downloaded 29.3 million times (per the partial logs from the cloud mirrors of CRAN), and the CSDA paper (preprint / vignette) by Conrad and myself has been cited 535 times according to Google Scholar. This release brings a new upstream release 12.4.0 made by Conrad a day or so ago. I prepared the usual release candidate, tested on the over 1000 reverse depends (which sadly takes almost a day on old hardware), found no issues and sent it to CRAN. Where it got tested again and was once again auto-processed smoothly by CRAN within a few hours on a Friday night, which is just marvelous. So this time I tweeted about it too. The release actually has a relatively small set of changes as a second follow-up release in the 12.* series.

Changes in RcppArmadillo version 0.12.4.0.0 (2023-05-26)
  • Upgraded to Armadillo release 12.4.0 (Cortisol Profusion Redux)
    • Added norm2est() for finding fast estimates of matrix 2-norm (spectral norm)
    • Added vecnorm() for obtaining the vector norm of each row or column of a matrix
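To illustrate the two new functions, here is a minimal standalone C++ sketch that uses Armadillo directly rather than through R/RcppArmadillo; it assumes Armadillo 12.4.0 or later is installed, and the matrix contents are just arbitrary random data:

#include <armadillo>
#include <cstdio>

int main()
{
    arma::mat X(100, 50, arma::fill::randu);

    // fast estimate of the matrix 2-norm (spectral norm), compared to the exact value
    double est   = arma::norm2est(X);
    double exact = arma::norm(X, 2);          // largest singular value

    // vector norm of each column (dim = 0) and of each row (dim = 1)
    arma::rowvec col_norms = arma::vecnorm(X, 2, 0);
    arma::vec    row_norms = arma::vecnorm(X, 2, 1);

    std::printf("norm2est: %g   exact 2-norm: %g\n", est, exact);
    std::printf("first column norm: %g   first row norm: %g\n",
                col_norms(0), row_norms(0));
    return 0;
}

Built with something like g++ demo.cpp -o demo -larmadillo, the estimate and the exact spectral norm should closely agree, and the two vecnorm() calls give the per-column and per-row norms respectively.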

Courtesy of my CRANberries, there is a diffstat report relative to the previous release. More detailed information is on the RcppArmadillo page. Questions, comments etc should go to the rcpp-devel mailing list off the R-Forge page. If you like my open-source work, you may consider sponsoring me at GitHub.

This post by Dirk Eddelbuettel originated on his Thinking inside the box blog. Please report excessive re-aggregation in third-party for-profit settings.

30 April 2023

Russell Coker: Links April 2023

Cory Doctorow has an insightful article Gig Work is the Opposite of Steampunk [1] about the horrors that companies like Amazon are forcing on their employees. Valerie Aurora and Leigh Honeywell wrote an insightful article about the al Capone theory of sexual harassment [2]. Why people who sexually harass others usually perform other anti-social activity that is also easier to prosecute. The IEEE has an interesting article about using ML for parts of the CPU design process, both the technical issues and the controversy about competing ideas which is probably caused by sexism [3]. Love and taxes are forever my heart is a line from an anime dating sim game that prepares US taxes [4]. Unfortunately it was removed from Steam. The existence of the game is a weird social commentary, and removing the game because you can't have an anime hottie do taxes is bizarre but also understandable given liability issues. There's no mention in the review of whether male hotties are available for people who prefer dating guys. As an aside, my accountant looks like he is allergic to exercise. The Killdozer Book web site (which has an invalid SSL certificate so you have to click on advanced in Chrome to get to the content) has an insightful article debunking some of the stories about the Killdozer [5]. He wasn't some sort of hero of freedom, he was just a jerk who reneged on a deal hoping to get more money, thought that laws shouldn't apply to him, and killed himself because of it. Apparently some big tech companies are knowingly hiring people to not work, unlike the usual large corporate case of unknowingly hiring people to not work [6]. Silicon Valley is a good TV show, and it's apparently realistic. Ron Garrett wrote an insightful blog post about theoretical attacks on Bitcoin and how Bitcoin could be used [7]. The conclusion is not good for Bitcoin. Compiler Explorer is a program that shows how various C++ compilers produce assembly code for various architectures; this site hosts the main active instance [8]. There are other instances; here is an instance that produces code for the Ruzzian Elbrus architecture [9]. The Elbrus Wikipedia page is interesting [10]. Apparently the Ruzzians don't want this information to be spread, LOL. The Smithsonian Magazine has an interesting article about pet parrots being taught to video call each other [11]. Apparently parrots are social animals and can develop psychological problems if kept alone, so the video calls can be good for them. Also the owners had to monitor the chats to ensure that they played nicely together, just like play-dates for kids! Phoronix has an amusing article about the drama regarding the AMD Spectral Chicken bit in the Linux kernel source [12]. This page listing bad free software licenses is amusing [13]. The ACS has an interesting article about how Samsung fakes photos of the moon and presumably could fake other photos of notable objects that don't change [14]. The way that they proved the forgery was interesting.

26 January 2023

Louis-Philippe Véronneau: Montreal Subway Foot Traffic Data, 2022 edition

For the fourth year in a row, I've asked Société de Transport de Montréal, Montreal's transit agency, for the foot traffic data of Montreal's subway. By clicking on a subway station, you'll be redirected to a graph of the station's foot traffic. Licences

21 January 2022

Louis-Philippe Véronneau: Montreal Subway Foot Traffic Data, 2021 edition

For the third time now, I've asked Société de Transport de Montréal, Montreal's transit agency, for the foot traffic data of Montreal's subway. I think this has become an annual thing now :) The original blog post and the 2019-2020 edition can be read here: By clicking on a subway station, you'll be redirected to a graph of the station's foot traffic. Licences

3 January 2022

Thorsten Alteholz: My Debian Activities in December 2021

FTP master This month I accepted 412 and rejected 44 packages. The overall number of packages that got accepted was 423. Debian LTS This was the ninetieth month in which I did some work for the Debian LTS initiative, started by Raphael Hertzog at Freexian. This month my all in all workload has been 40h. During that time I did LTS and normal security uploads of: I also started to work on libarchive. Further, I worked on packages in NEW on security-master. In order to process such packages faster, I added a notification when work arrived there. Last but not least I did some days of frontdesk duties. Debian ELTS This month was the forty-second ELTS month. During my allocated time I uploaded: Last but not least I did some days of frontdesk duties. Debian Astro Related to my previous article about fun with telescopes, I uploaded new versions or did source uploads for: Besides the indi-stuff I also uploaded: Other stuff I celebrated Christmas :-).

29 December 2021

Thorsten Alteholz: Fun with Telescopes

Recently I purchased a small telescope to look at solar spots. When choosing a mount, I checked whether it can be controlled with OSS. In this context INDI is mentioned everywhere and my desired mount was supported. indi and kstars are already part of Debian, so apt install, selecting my mount, ... oh, wait, the menu shows it, but I can not select it. Ok, that was the time when I learned about the difference between indi and indi-3rdparty. The indi package just contains a few drivers; others are available from a different repository. This split was done either due to different release cycles of the drivers, different OSS licenses, or binary blobs without source being part of some drivers. Fine, there are already packages of the 3rdparty-repository available from an Ubuntu PPA, so it should be no problem to add them to Debian as well. Some manufacturers freely distribute at least the specification of their API so that others are able to write the corresponding software. Some manufacturers even write their own driver. Examples are: A minor part actually does not have binary blobs but distributes the sources of their software. Unfortunately they have licenses that are not compatible with DFSG and those packages still need to go to non-free. Examples are: But there also seem to exist lots of manufacturers of astronomical accessories, especially cameras, that just distribute some binary blobs to talk to their hardware. This is sad, but at the moment it is just the way it is, and such packages need to go to non-free. Luckily their blobs are accompanied by corresponding licenses. At least those manufacturers understand how software licenses work, and packaging their SDK is just straightforward. Examples are: However, when looking at the license information of some Ubuntu packages, several of them were distributed under a CC license. This is not a common license for software, so I wanted to get confirmation whether this information is correct. Unfortunately, most of these manufacturers don't want to disclose their licenses. For whatever reason they distribute their tarballs without any hint, and emails to their support channels are just ignored. Examples of such bad behaviour are: However, the best answer comes from the Levenhuk support. My question about the license of their SDK was answered by:
I am afraid we cannot disclose any further information except the software file that is available on our website.
So strictly speaking nobody is allowed to use their software. I wonder whether such competence also becomes visible in their products. I will never really know, as there are more than enough OSS-friendly manufacturers available. Anyway, most of the indi-3rdparty drivers are now available and I got lots of suggestions about hardware I need to buy in the future :-).

30 May 2021

Vincent Fourmond: QSoas quiz #2: averaging several Y values for the same X value

This second quiz may sound like the first one, but in fact, the approach used is completely different. The point is to gather some elementary statistics from a series of experiments performed under different conditions, but with several repeats at the same conditions.
Quiz You are given a file (which you can download there) that contains a series of pH value data: the X column is the pH, the Y column the result of the experiment at the given pH (let's say the measure of the catalytic rate of an enzyme). Your task is to take this data and produce a single dataset which contains, for each pH value, the pH, the average of the results at that pH, and the standard deviation. The result should be identical to the following file, and should look like this:
There are several ways to do this, but all ways must rely on stats, and the more natural way in QSoas is to take advantage of split-on-values, which is a very powerful command but somewhat hard to master, which is the point of this Quiz.
By the way, the data file is purely synthetic; if you look in the GitHub repository, you'll see how it was generated.
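Without giving away the QSoas solution, here is a minimal C++ sketch of the statistics the quiz asks for: group the rows by identical X (pH) values, then compute the mean and the standard deviation of Y within each group. The file name and the plain two-column numeric format are assumptions for illustration:

#include <cmath>
#include <cstdio>
#include <fstream>
#include <map>
#include <vector>

int main()
{
    std::ifstream in("data.dat");   // hypothetical input file: pH in column 1, result in column 2
    std::map<double, std::vector<double>> groups;
    double x, y;
    while (in >> x >> y)
        groups[x].push_back(y);     // gather all Y values measured at the same pH

    for (const auto& [ph, ys] : groups)
    {
        double mean = 0;
        for (double v : ys) mean += v;
        mean /= ys.size();

        double var = 0;
        for (double v : ys) var += (v - mean) * (v - mean);
        double sd = std::sqrt(var / ys.size());      // population standard deviation

        std::printf("%g\t%g\t%g\n", ph, mean, sd);   // pH, average, standard deviation
    }
    return 0;
}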

About QSoas QSoas is a powerful open source data analysis program that focuses on flexibility and powerful fitting capacities. It is released under the GNU General Public License. It is described in Fourmond, Anal. Chem., 2016, 88 (10), pp 5050-5052. Current version is 3.0. You can download its source code there (or clone from the GitHub repository) and compile it yourself, or buy precompiled versions for MacOS and Windows there.

4 May 2021

Erich Schubert: Machine Learning Lecture Recordings

I have uploaded most of my Machine Learning lecture to YouTube. The slides are in English, but the audio is in German. Some very basic contents (e.g., a demo of standard k-means clustering) were left out from this advanced class, and instead only a link to recordings from an earlier class were given. In this class, I wanted to focus on the improved (accelerated) algorithms instead. These are not included here (yet). I believe there are some contents covered in this class you will find nowhere else (yet). The first unit is pretty long (I did not split it further yet). The later units are shorter recordings. ML F1: Principles in Machine Learning ML F2/F3: Correlation does not Imply Causation & Multiple Testing Problem ML F4: Overfitting Überanpassung ML F5: Fluch der Dimensionalität Curse of Dimensionality ML F6: Intrinsische Dimensionalität Intrinsic Dimensionality ML F7: Distanzfunktionen und Ähnlichkeitsfunktionen ML L1: Einführung in die Klassifikation ML L2: Evaluation und Wahl von Klassifikatoren ML L3: Bayes-Klassifikatoren ML L4: Nächste-Nachbarn Klassifikation ML L5: Nächste Nachbarn und Kerndichteschätzung ML L6: Lernen von Entscheidungsbäumen ML L7: Splitkriterien bei Entscheidungsbäumen ML L8: Ensembles und Meta-Learning: Random Forests und Gradient Boosting ML L9: Support Vector Machinen - Motivation ML L10: Affine Hyperebenen und Skalarprodukte Geometrie für SVMs ML L11: Maximum Margin Hyperplane die breitest mögliche Straße ML L12: Training Support Vector Machines ML L13: Non-linear SVM and the Kernel Trick ML L14: SVM Extensions and Conclusions ML L15: Motivation of Neural Networks ML L16: Threshold Logic Units ML L17: General Artificial Neural Networks ML L18: Learning Neural Networks with Backpropagation ML L19: Deep Neural Networks ML L20: Convolutional Neural Networks ML L21: Recurrent Neural Networks and LSTM ML L22: Conclusion Classification ML U1: Einleitung Clusteranalyse ML U2: Hierarchisches Clustering ML U3: Accelerating HAC mit Anderberg's Algorithmus ML U4: k-Means Clustering ML U5: Accelerating k-Means Clustering ML U6: Limitations of k-Means Clustering ML U7: Extensions of k-Means Clustering ML U8: Partitioning Around Medoids (k-Medoids) ML U9: Gaussian Mixture Modeling (EM Clustering) ML U10: Gaussian Mixture Modeling Demo ML U11: BIRCH and BETULA Clustering ML U12: Motivation Density-Based Clustering (DBSCAN) ML U13: Density-reachable and density-connected (DBSCAN Clustering) ML U14: DBSCAN Clustering ML U15: Parameterization of DBSCAN ML U16: Extensions and Variations of DBSCAN Clustering ML U17: OPTICS Clustering ML U18: Cluster Extraction from OPTICS Plots ML U19: Understanding the OPTICS Cluster Order ML U20: Spectral Clustering ML U21: Biclustering and Subspace Clustering ML U22: Further Clustering Approaches

11 March 2021

Vincent Fourmond: All tips and tricks about QSoas

I've decided to post regular summaries of all the articles written here about QSoas; this is the first post of this kind. All the articles related to QSoas can be found here also. The articles written here can be separated into several categories. Tutorials to analyze real data These are posts about how to reproduce the data analysis of published articles, including links to the original data so you can fully reproduce our results. These posts all have the label tutorial. All about fits QSoas has a particularly powerful interface for non-linear least square minimisations (fits): Meta-data Meta data describe the conditions in which experiments were performed. Quizzes and their solutions Quizzes are small problems that take some skill to solve; they can teach you a lot about how to work with QSoas. Other tips and tricks Release announcements These generally have a lot of information about the possibilities in QSoas:
About QSoas QSoas is a powerful open source data analysis program that focuses on flexibility and powerful fitting capacities. It is released under the GNU General Public License. It is described in Fourmond, Anal. Chem., 2016, 88 (10), pp 5050-5052. Current version is 3.0. You can download its source code there (or clone from the GitHub repository) and compile it yourself, or buy precompiled versions for MacOS and Windows there.

23 November 2020

Vincent Fourmond: QSoas tips and tricks: using meta-data, first level

By essence, QSoas works with \(y = f(x)\) datasets. However, in practice, when working with experimental data (or data generated from simulations), one often has more than one experimental parameter (\(x\)). For instance, one could record series of spectra (\(A = f(\lambda)\)) for different pH values, so that the absorbance is in fact a function of both the pH and \(\lambda\). QSoas has different ways to deal with such situations, and we'll describe one today, using meta-data. Setting meta-data Meta-data are simply series of name/value pairs attached to a dataset. They can be numbers, dates or just text. Some of these are automatically detected from certain types of data files (but that is the topic for another day). The simplest way to set meta-data is to use the set-meta command:
QSoas> set-meta pH 7.5
This command sets the meta-data pH to the value 7.5. Keep in mind that QSoas does not know anything about the meaning of the meta-data[1]. It can keep track of the meta-data you give, and manipulate them, but it will not interpret them for you. You can set several meta-data by repeating calls to set-meta, and you can display the meta-data attached to a dataset using the command show. Here is an example:
QSoas> generate-buffer 0 10
QSoas> set-meta pH 7.5
QSoas> set-meta sample "My sample"
QSoas> show 0
Dataset generated.dat: 2 cols, 1000 rows, 1 segments, #0
Flags: 
Meta-data:	pH =	 7.5	sample =	 My sample
Note here the use of quotes around My sample since there is a space inside the value. Using meta-data There are many ways to use meta-data in QSoas. In this post, we will discuss just one: using meta-data in the output file. The output file can collect data from several commands, like peak data, statistics and so on. For instance, each time the command 1 is run, a line with the information about the largest peak of the current dataset is written to the output file. It is possible to automatically add meta-data to those lines by using the /meta= option of the output command. Just listing the names of the meta-data will add them to each line of the output file. As a full example, we'll see how one can take advantage of meta-data to determine how the position of the peak of the function \(x^2 \exp (-a\,x)\) depends on \(a\). For that, we first create a script that generates the function for a certain value of \(a\), sets the meta-data a to the corresponding value, and finds the peak. Let's call this file do-one.cmds (all the script files can be found in the GitHub repository):
generate-buffer 0 20 x**2*exp(-x*${1})
set-meta a ${1}
1 
This script takes a single argument, the value of \(a\), generates the appropriate dataset, sets the meta-data a and writes the data about the largest (and only in this case) peak to the output file. Let's now run this script with 1 as an argument:
QSoas> @ do-one.cmds 1
This command generates a file out.dat containing the following data:
## buffer       what    x       y       index   width   left_width      right_width     area
generated.dat   max     2.002002002     0.541340590883  100     3.4034034034    1.24124124124   2.16216216216   1.99999908761
This gives various information about the peak found: the name of the dataset it was found in, whether it's a maximum or minimum, the x and y positions of the peak, the index in the file, the widths of the peak, and its area. We are interested here mainly in the x position. Then, we just run this script for several values of \(a\) using run-for-each, in particular with the option /range-type=lin, which makes it interpret values like 0.5..5:80 as 80 values evenly spread between 0.5 and 5. The script is called run-all.cmds:
output peaks.dat /overwrite=true /meta=a
run-for-each do-one.cmds /range-type=lin 0.5..5:80
V all /style=red-to-blue
The first line sets up the output to the output file peaks.dat. The option /meta=a makes sure the meta a is added to each line of the output file, and /overwrite=true makes sure the file is overwritten just before the first data is written to it, in order to avoid accumulating the results of different runs of the script. The last line just displays all the curves with a color gradient. It looks like this:
Running this script (with @ run-all.cmds) creates a new file peaks.dat, whose first line looks like this:
## buffer       what    x       y       index   width   left_width      right_width     area    a
The column x (the 3rd) contains the position of the peaks, and the column a (the 10th) contains the meta a (this column wasn't present in the output we described above, because we had not yet used the output /meta=a command). Therefore, to load the peak position as a function of a, one just has to run:
QSoas> load peaks.dat /columns=10,3
This looks like this:
Et voilà ! To train further, you can:
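As a sanity check on the numbers above, the position of the peak can also be derived analytically:
$$\frac{\mathrm{d}}{\mathrm{d}x}\left(x^2 e^{-a x}\right) = \left(2x - a x^2\right)e^{-a x} = 0 \quad\Longrightarrow\quad x = \frac{2}{a}$$
For \(a = 1\) this gives \(x = 2\), matching the \(x \approx 2.002\) reported in out.dat above (the small offset comes from the finite spacing of the points in the generated buffer); the peak positions collected in peaks.dat should likewise follow the \(2/a\) law.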

[1] this is not exactly true. For instance, some commands like unwrap interpret the sr meta-data as a voltammetric scan rate if it is present. But this is the exception. About QSoas QSoas is a powerful open source data analysis program that focuses on flexibility and powerful fitting capacities. It is released under the GNU General Public License. It is described in Fourmond, Anal. Chem., 2016, 88 (10), pp 5050-5052. Current version is 2.2. You can download its source code there (or clone from the GitHub repository) and compile it yourself, or buy precompiled versions for MacOS and Windows there.

11 November 2020

Vincent Fourmond: Solution for QSoas quiz #1: averaging spectra

This post describes the solution to Quiz #1, based on the files found there. The point is to produce both the average and the standard deviation of a series of spectra. Below is how the final averaged spectrum should look:
I will present here two different solutions. Solution 1: using the definition of standard deviation There is a simple solution using the definition of the standard deviation: $$\sigma_y = \sqrt{\langle y^2\rangle - \langle y\rangle^2}$$ in which \(\langle y^2\rangle\) is the average of \(y^2\) (and so on). So the simplest solution is to construct datasets with an additional column that contains \(y^2\), average these columns, and then apply the above formula to the averaged columns. For that, we first need a companion script that loads a single data file and adds a column with \(y^2\). Let's call this script load-one.cmds:
load ${1}
apply-formula y2=y**2 /extra-columns=1
flag /flags=processed
When this script is run with the name of a spectrum file as argument, it loads it (replaces ${1} by the first argument, the file name), adds a column y2 containing the square of the y column, and flags it with the processed flag. This is not absolutely necessary, but it makes it much easier to refer to all the spectra once they are processed. Then, to process all the spectra, one just has to run the following commands:
run-for-each load-one.cmds Spectrum-1.dat Spectrum-2.dat Spectrum-3.dat
average flagged:processed
apply-formula y2=(y2-y**2)**0.5
dataset-options /yerrors=y2
The run-for-each command runs the load-one.cmds script for all the spectra (one could also have used Spectrum-*.dat to not have to give all the file names). Then, the average command averages the values of the columns over all the datasets. To be clear, it finds all the values that have the same X (or very close X values) and averages them, column by column. The result of this command is therefore a dataset with the average of the original \(y\) data as y column and the average of the original \(y^2\) data as y2 column. So now, the only thing left to do is to use the above equation, which is done by the apply-formula line. The last command, dataset-options, is not absolutely necessary but it signals to QSoas that the standard error of the y column should be found in the y2 column. This is now available as the script method-one.cmds in the git repository.
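For reference, the formula used above is just the expanded form of the usual (population) standard deviation, which is why averaging the \(y\) and \(y^2\) columns separately is enough:
$$\sigma_y^2 = \left\langle \left(y - \langle y\rangle\right)^2 \right\rangle = \langle y^2\rangle - 2\,\langle y\rangle\,\langle y\rangle + \langle y\rangle^2 = \langle y^2\rangle - \langle y\rangle^2$$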

Solution 2: use QSoas's knowledge of standard deviation The other method is a little more involved but it demonstrates a good approach to problem solving with QSoas. The starting point is that, in apply-formula, the value $stats.y_stddev corresponds to the standard deviation of the whole y column... Loading the spectra yields just a series of x,y datasets. We can contract them into a single dataset with one x column and several y columns:
load Spectrum-*.dat /flags=spectra
contract flagged:spectra
After these commands, the current dataset contains data in the form of:
lambda1	a1_1	a1_2	a1_3
lambda2	a2_1	a2_2	a2_3
...
in which the ai_1 come from the first file, ai_2 the second and so on. We need to use transpose to transform that dataset into:
0	a1_1	a2_1	...
1	a1_2	a2_2	...
2	a1_3	a2_3	...
In this dataset, the values of the absorbance at a given wavelength in each dataset are now stored in columns. The next step is just to use expand to obtain a series of datasets with the same x column and a single y column (each corresponding to a different wavelength in the original data). The game is now to replace these datasets with something that looks like:
0	a_average
1	a_stddev
For that, one takes advantage of the $stats.y_average and $stats.y_stddev values in apply-formula, together with the i special variable that represents the index of the point:
apply-formula "if i == 0; then y=$stats.y_average; end; if i == 1; then y=$stats.y_stddev; end"
strip-if i>1
Then, all that is left is to apply this to all the datasets created by expand, which can be done simply using run-for-datasets, and then we reverse the splitting using contract and transpose ! In summary, this looks like this. We need two files. The first, process-one.cmds, contains the following code:
apply-formula "if i == 0; then y=$stats.y_average; end; if i == 1; then y=$stats.y_stddev; end"
strip-if i>1
flag /flags=processed
The main file, method-two.cmds looks like this:
load Spectrum-*.dat /flags=spectra
contract flagged:spectra
transpose
expand /flags=tmp
run-for-datasets process-one.cmds flagged:tmp
contract flagged:processed
transpose
dataset-options /yerrors=y2
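For comparison, the same per-wavelength statistics can be sketched outside QSoas with a small Armadillo C++ program. This is only an illustration under assumptions: the spectra share one wavelength grid and are already arranged as the columns of a matrix (the layout contract builds), and the file name and wavelength range are made up:

#include <armadillo>

int main()
{
    // rows = wavelengths, columns = spectra
    arma::mat A(1000, 3, arma::fill::randu);                    // placeholder data

    arma::vec lambda = arma::linspace(400.0, 800.0, A.n_rows);  // hypothetical wavelength axis

    // statistics across the spectra, i.e. along each row
    arma::vec avg = arma::mean(A, 1);      // average of each row
    arma::vec sd  = arma::stddev(A, 1, 1); // population std dev of each row, i.e. sqrt(<y^2> - <y>^2)

    // three columns: wavelength, average, standard deviation
    arma::mat result = arma::join_rows(arma::join_rows(lambda, avg), sd);
    result.save("average-sketch.dat", arma::raw_ascii);
    return 0;
}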
Note that some of the QSoas code above can be greatly simplified using new features present in the upcoming 3.0 version, but that is the topic for another post. About QSoas QSoas is a powerful open source data analysis program that focuses on flexibility and powerful fitting capacities. It is released under the GNU General Public License. It is described in Fourmond, Anal. Chem., 2016, 88 (10), pp 5050-5052. Current version is 2.2. You can download its source code and compile it yourself or buy precompiled versions for MacOS and Windows there.

7 October 2020

Vincent Fourmond: QSoas quiz #1 : averaging spectra

Here is the first QSoas quiz ! I recently measured several identical spectra in a row to evaluate the noise of the setup, and so I wanted to average all the spectra and also determine the standard deviation in the absorbances. Averaging the spectra can simply be done taking advantage of the average command:
QSoas> load Spectrum*.dat /flags=spectra
QSoas> average flagged:spectra
However, average does not provide a way to compute standard deviations; it just takes the average of all columns but the X column. I wanted to add this feature, but I realized there are already at least two distinct ways to do that...

Quiz Your task is to determine the average and standard deviation of the three spectra located there (Spectrum-1.dat, Spectrum-2.dat and Spectrum-3.dat). There are at least two ways: To help you, I've added the result in Average.dat. The figure below shows a zoom on the data superimposed on the average (bonus points to find how to display this light red area that corresponds to the standard deviation !).
I will post the answer later. In the meantime, feel free to post your own solutions or attempts, hacks, and so on !

About QSoas QSoas is a powerful open source data analysis program that focuses on flexibility and powerful fitting capacities. It is released under the GNU General Public License. It is described in Fourmond, Anal. Chem., 2016, 88 (10), pp 5050-5052. Current version is 2.2. You can download its source code or buy precompiled versions for MacOS and Windows there.

20 June 2020

Dima Kogan: OpenCV C API transition. A rant.

I just went through a debugging exercise that was so ridiculous, I just had to write it up. Some of this probably should go into a bug report instead of a rant, but I'm tired. And clearly I don't care anymore. OK, so I'm doing computer vision work. OpenCV has been providing basic functions in this area, so I have been using them for a while. Just for really, really basic stuff, like projection. The C API was kinda weird, and their error handling is a bit ridiculous (if you give it arguments it doesn't like, it asserts!), but it has been working fine for a while. At some point (around OpenCV 3.0) somebody over there decided that they didn't like their C API, and that this was now a C++ library. Except the docs still documented the C API, and the website said it supported C, and the code wasn't actually removed. They just kinda stopped testing it and thinking about it. So it would mostly continue to work, except some poor saps would see weird failures; like this and this, for instance. OpenCV 3.2 was the last version where it was mostly possible to keep using the old C code, even when compiling without optimizations. So I was doing that for years. So now, in 2020, Debian is finally shipping a version of OpenCV that definitively does not work with the old code, so I had to do something. Over time I stopped using everything about OpenCV, except a few cvProjectPoints2() calls. So I decided to just write a small C++ shim to call the new version of that function, expose that with extern "C" to the rest of my world, and I'd be done. And normally I would be, but this is OpenCV we're talking about. I wrote the shim, and it didn't work. The code built and ran, but the results were wrong. After some pointless debugging, I boiled the problem down to this test program:
#include <opencv2/calib3d.hpp>
#include <stdio.h>
int main(void)
{
    double fx = 1000.0;
    double fy = 1000.0;
    double cx = 1000.0;
    double cy = 1000.0;
    double _camera_matrix[] =
        { fx,  0, cx,
          0,  fy, cy,
          0,   0,  1 };
    cv::Mat camera_matrix(3,3, CV_64FC1, _camera_matrix);
    double pp[3] = { 1., 2., 10. };
    double qq[2] = { 444, 555 };
    int N=1;
    cv::Mat object_points(N,3, CV_64FC1, pp);
    cv::Mat image_points (N,2, CV_64FC1, qq);
    // rvec,tvec
    double _zero3[3] = {};
    cv::Mat zero3(1,3,CV_64FC1, _zero3);
    cv::projectPoints( object_points,
                       zero3,zero3,
                       camera_matrix,
                       cv::noArray(),
                       image_points,
                       cv::noArray(), 0.0);
    fprintf(stderr, "manually-projected no-distortion: %f %f\n",
            pp[0]/pp[2] * fx + cx,
            pp[1]/pp[2] * fy + cy);
    fprintf(stderr, "opencv says: %f %f\n", qq[0], qq[1]);
    return 0;
}
This is as trivial as it gets. I project one point through a pinhole camera, and print out the right answer (that I can easily compute, since this is trivial), and what OpenCV reports:
$ g++ -I/usr/include/opencv4 -o tst tst.cc -lopencv_calib3d -lopencv_core && ./tst
manually-projected no-distortion: 1100.000000 1200.000000
opencv says: 444.000000 555.000000
Well that's no good. The answer is wrong, but it looks like it didn't even write anything into the output array. Since this is supposed to be a thin shim to C code, I want this thing to be filling in C arrays, which is what I'm doing here:
double qq[2] = { 444, 555 };
int N=1;
cv::Mat image_points (N,2, CV_64FC1, qq);
This is how the C API has worked forever, and their C++ API works the same way, I thought. Nothing barfed, not at build time, or run time. Fine. So I went to figure this out. In the true spirit of C++, the new API is inscrutable. I'm passing in cv::Mat, but the API wants cv::InputArray for some arguments and cv::OutputArray for others. Clearly cv::Mat can be coerced into either of those types (and that's what you're supposed to do), but the details are not meant to be understood. You can read the snazzy C++-style documentation. Clicking on "OutputArray" in the doxygen gets you here. Then I guess you're supposed to click on "_OutputArray", and you get here. Understand what's going on now? Me neither. Stepping through the code revealed the problem. cv::projectPoints() looks like this:
void cv::projectPoints( InputArray _opoints,
                        InputArray _rvec,
                        InputArray _tvec,
                        InputArray _cameraMatrix,
                        InputArray _distCoeffs,
                        OutputArray _ipoints,
                        OutputArray _jacobian,
                        double aspectRatio )
{
    ....
    _ipoints.create(npoints, 1, CV_MAKETYPE(depth, 2), -1, true);
    ....
I.e. they're allocating a new data buffer for the output, and giving it back to me via the OutputArray object. This object already had a buffer, and that's where I was expecting the output to go. Instead it went to the brand-new buffer I didn't want. Issues: Well that's just super. I can call the C++ function, copy the data into the place it's supposed to go to, and then deallocate the extra buffer. Or I can pull out the meat of the function I want into my project, and then I can drop the OpenCV dependency entirely. Clearly that's the way to go. So I go poking back into their code to grab what I need, and here's what I see:
static void cvProjectPoints2Internal( const CvMat* objectPoints,
                  const CvMat* r_vec,
                  const CvMat* t_vec,
                  const CvMat* A,
                  const CvMat* distCoeffs,
                  CvMat* imagePoints, CvMat* dpdr CV_DEFAULT(NULL),
                  CvMat* dpdt CV_DEFAULT(NULL), CvMat* dpdf CV_DEFAULT(NULL),
                  CvMat* dpdc CV_DEFAULT(NULL), CvMat* dpdk CV_DEFAULT(NULL),
                  CvMat* dpdo CV_DEFAULT(NULL),
                  double aspectRatio CV_DEFAULT(0) )
{
...
}
Looks familiar? It should. Because this is the original C-API function they replaced. So in their quest to move to C++, they left the original code intact, C API and everything, un-exposed it so you couldn't call it anymore, and made a new, shitty C++ wrapper for people to call instead. CvMat is still there. I have no words. Yes, this is a massive library, and maybe other parts of it indeed did make some sort of non-token transition, but this thing is ridiculous. In the end, here's the function I ended up with (licensed as OpenCV; see the comment)
// The implementation of project_opencv is based on opencv. The sources have
// been heavily modified, but the opencv logic remains. This function is a
// cut-down cvProjectPoints2Internal() to keep only the functionality I want and
// to use my interfaces. Putting this here allows me to drop the C dependency on
// opencv. Which is a good thing, since opencv dropped their C API
//
// from opencv-4.2.0+dfsg/modules/calib3d/src/calibration.cpp
//
// Copyright (C) 2000-2008, Intel Corporation, all rights reserved.
// Copyright (C) 2009, Willow Garage Inc., all rights reserved.
// Third party copyrights are property of their respective owners.
//
// Redistribution and use in source and binary forms, with or without modification,
// are permitted provided that the following conditions are met:
//
//   * Redistribution's of source code must retain the above copyright notice,
//     this list of conditions and the following disclaimer.
//
//   * Redistribution's in binary form must reproduce the above copyright notice,
//     this list of conditions and the following disclaimer in the documentation
//     and/or other materials provided with the distribution.
//
//   * The name of the copyright holders may not be used to endorse or promote products
//     derived from this software without specific prior written permission.
//
// This software is provided by the copyright holders and contributors "as is" and
// any express or implied warranties, including, but not limited to, the implied
// warranties of merchantability and fitness for a particular purpose are disclaimed.
// In no event shall the Intel Corporation or contributors be liable for any direct,
// indirect, incidental, special, exemplary, or consequential damages
// (including, but not limited to, procurement of substitute goods or services;
// loss of use, data, or profits; or business interruption) however caused
// and on any theory of liability, whether in contract, strict liability,
// or tort (including negligence or otherwise) arising in any way out of
// the use of this software, even if advised of the possibility of such damage.
typedef union
{
    struct
    {
        double x,y;
    };
    double xy[2];
} point2_t;
typedef union
{
    struct
    {
        double x,y,z;
    };
    double xyz[3];
} point3_t;
void project_opencv( // outputs
                     point2_t* q,
                     point3_t* dq_dp,               // may be NULL
                     double* dq_dintrinsics_nocore, // may be NULL
                     // inputs
                     const point3_t* p,
                     int N,
                     const double* intrinsics,
                     int Nintrinsics)
{
    const double fx = intrinsics[0];
    const double fy = intrinsics[1];
    const double cx = intrinsics[2];
    const double cy = intrinsics[3];
    double k[12] = {};
    for(int i=0; i<Nintrinsics-4; i++)
        k[i] = intrinsics[i+4];
    for( int i = 0; i < N; i++ )
    {
        double z_recip = 1./p[i].z;
        double x = p[i].x * z_recip;
        double y = p[i].y * z_recip;
        double r2      = x*x + y*y;
        double r4      = r2*r2;
        double r6      = r4*r2;
        double a1      = 2*x*y;
        double a2      = r2 + 2*x*x;
        double a3      = r2 + 2*y*y;
        double cdist   = 1 + k[0]*r2 + k[1]*r4 + k[4]*r6;
        double icdist2 = 1./(1 + k[5]*r2 + k[6]*r4 + k[7]*r6);
        double xd      = x*cdist*icdist2 + k[2]*a1 + k[3]*a2 + k[8]*r2+k[9]*r4;
        double yd      = y*cdist*icdist2 + k[2]*a3 + k[3]*a1 + k[10]*r2+k[11]*r4;
        q[i].x = xd*fx + cx;
        q[i].y = yd*fy + cy;
        if( dq_dp )
        {
            double dx_dp[] = { z_recip, 0,       -x*z_recip };
            double dy_dp[] = { 0,       z_recip, -y*z_recip };
            for( int j = 0; j < 3; j++ )
            {
                double dr2_dp = 2*x*dx_dp[j] + 2*y*dy_dp[j];
                double dcdist_dp = k[0]*dr2_dp + 2*k[1]*r2*dr2_dp + 3*k[4]*r4*dr2_dp;
                double dicdist2_dp = -icdist2*icdist2*(k[5]*dr2_dp + 2*k[6]*r2*dr2_dp + 3*k[7]*r4*dr2_dp);
                double da1_dp = 2*(x*dy_dp[j] + y*dx_dp[j]);
                double dmx_dp = (dx_dp[j]*cdist*icdist2 + x*dcdist_dp*icdist2 + x*cdist*dicdist2_dp +
                                k[2]*da1_dp + k[3]*(dr2_dp + 4*x*dx_dp[j]) + k[8]*dr2_dp + 2*r2*k[9]*dr2_dp);
                double dmy_dp = (dy_dp[j]*cdist*icdist2 + y*dcdist_dp*icdist2 + y*cdist*dicdist2_dp +
                                k[2]*(dr2_dp + 4*y*dy_dp[j]) + k[3]*da1_dp + k[10]*dr2_dp + 2*r2*k[11]*dr2_dp);
                dq_dp[i*2 + 0].xyz[j] = fx*dmx_dp;
                dq_dp[i*2 + 1].xyz[j] = fy*dmy_dp;
            }
        }
        if( dq_dintrinsics_nocore )
        {
            dq_dintrinsics_nocore[(Nintrinsics-4)*(2*i + 0) + 0] = fx*x*icdist2*r2;
            dq_dintrinsics_nocore[(Nintrinsics-4)*(2*i + 1) + 0] = fy*(y*icdist2*r2);
            dq_dintrinsics_nocore[(Nintrinsics-4)*(2*i + 0) + 1] = fx*x*icdist2*r4;
            dq_dintrinsics_nocore[(Nintrinsics-4)*(2*i + 1) + 1] = fy*y*icdist2*r4;
            if( Nintrinsics-4 > 2 )
            {
                dq_dintrinsics_nocore[(Nintrinsics-4)*(2*i + 0) + 2] = fx*a1;
                dq_dintrinsics_nocore[(Nintrinsics-4)*(2*i + 1) + 2] = fy*a3;
                dq_dintrinsics_nocore[(Nintrinsics-4)*(2*i + 0) + 3] = fx*a2;
                dq_dintrinsics_nocore[(Nintrinsics-4)*(2*i + 1) + 3] = fy*a1;
                if( Nintrinsics-4 > 4 )
                {
                    dq_dintrinsics_nocore[(Nintrinsics-4)*(2*i + 0) + 4] = fx*x*icdist2*r6;
                    dq_dintrinsics_nocore[(Nintrinsics-4)*(2*i + 1) + 4] = fy*y*icdist2*r6;
                    if( Nintrinsics-4 > 5 )
                    {
                        dq_dintrinsics_nocore[(Nintrinsics-4)*(2*i + 0) + 5] = fx*x*cdist*(-icdist2)*icdist2*r2;
                        dq_dintrinsics_nocore[(Nintrinsics-4)*(2*i + 1) + 5] = fy*y*cdist*(-icdist2)*icdist2*r2;
                        dq_dintrinsics_nocore[(Nintrinsics-4)*(2*i + 0) + 6] = fx*x*cdist*(-icdist2)*icdist2*r4;
                        dq_dintrinsics_nocore[(Nintrinsics-4)*(2*i + 1) + 6] = fy*y*cdist*(-icdist2)*icdist2*r4;
                        dq_dintrinsics_nocore[(Nintrinsics-4)*(2*i + 0) + 7] = fx*x*cdist*(-icdist2)*icdist2*r6;
                        dq_dintrinsics_nocore[(Nintrinsics-4)*(2*i + 1) + 7] = fy*y*cdist*(-icdist2)*icdist2*r6;
                        if( Nintrinsics-4 > 8 )
                        {
                            dq_dintrinsics_nocore[(Nintrinsics-4)*(2*i + 0) + 8] = fx*r2; //s1
                            dq_dintrinsics_nocore[(Nintrinsics-4)*(2*i + 1) + 8] = fy*0; //s1
                            dq_dintrinsics_nocore[(Nintrinsics-4)*(2*i + 0) + 9] = fx*r4; //s2
                            dq_dintrinsics_nocore[(Nintrinsics-4)*(2*i + 1) + 9] = fy*0; //s2
                            dq_dintrinsics_nocore[(Nintrinsics-4)*(2*i + 0) + 10] = fx*0;//s3
                            dq_dintrinsics_nocore[(Nintrinsics-4)*(2*i + 1) + 10] = fy*r2; //s3
                            dq_dintrinsics_nocore[(Nintrinsics-4)*(2*i + 0) + 11] = fx*0;//s4
                            dq_dintrinsics_nocore[(Nintrinsics-4)*(2*i + 1) + 11] = fy*r4; //s4
                        }
                    }
                }
            }
        }
    }
}
This does only the stuff I need: projection only (no geometric transformation), and gradients with respect to the point coordinates and distortions only. Gradients with respect to fxy and cxy are trivial, and I don't bother reporting them. So now I don't compile or link against OpenCV; my code builds and runs on Debian/sid and (surprisingly) it runs much faster than before. Apparently there was a lot of pointless overhead happening. Alright. Rant over.
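For completeness, here is a minimal sketch of the other workaround mentioned above (call the C++ cv::projectPoints(), then copy the result back into the caller's C buffer and let the extra allocation go). It is an illustration only, not the author's shim; the function name, argument layout, and the choice of a zero rvec/tvec with no distortion are assumptions for the example:

#include <opencv2/calib3d.hpp>
#include <cstring>

extern "C"
void project_points_shim(double*       out_xy,            // 2*N doubles, caller-owned
                         const double* points,            // 3*N doubles (x,y,z per point)
                         int           N,
                         const double* camera_matrix_3x3) // row-major 3x3
{
    cv::Mat object_points(N, 3, CV_64FC1, const_cast<double*>(points));
    cv::Mat camera_matrix(3, 3, CV_64FC1, const_cast<double*>(camera_matrix_3x3));

    double zero3[3] = {};
    cv::Mat rvec(1, 3, CV_64FC1, zero3);
    cv::Mat tvec(1, 3, CV_64FC1, zero3);

    // Let OpenCV allocate its own output buffer (the behaviour the post complains about) ...
    cv::Mat image_points;
    cv::projectPoints(object_points, rvec, tvec, camera_matrix,
                      cv::noArray(), image_points);

    // ... then copy the N x 1, 2-channel result back into the C array.
    cv::Mat flat = image_points.reshape(1, N);   // view the same data as N x 2, 1 channel
    CV_Assert(flat.isContinuous() && flat.type() == CV_64FC1);
    std::memcpy(out_xy, flat.ptr<double>(0), sizeof(double) * 2 * N);
}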

31 October 2017

Norbert Preining: Debian/TeX Live 2017.20171031-1

Halloween is here, time to upload a new set of scary packages of TeX Live. About a month has passed, so there is the usual big stream of updates. There was actually an intermediate release to get out some urgent fixes, but I never reported the news here. So here are the accumulated changes and updates. My favorite this time is wallcalendar, a great class to design all kinds of calendars; it looks really well done. I immediately will start putting one together. On the font side there is the new addition coelacanth. To quote from the README: Coelacanth is inspired by the classic Centaur type design of Bruce Rogers, described by some as the most beautiful typeface ever designed. It aims to be a professional quality type family for general book typesetting. And indeed it is beautiful! Another noteworthy addition is the Spark font that allows creating sparklines in the running text with LaTeX. Enjoy. New packages algobox, amscls-doc, beilstein, bib2gls, coelacanth, crossreftools, dejavu-otf, dijkstra, ducksay, dynkin-diagrams, eqnnumwarn, fetchcls, fixjfm, glossaries-finnish, hagenberg-thesis, hecthese, ifxptex, isopt, istgame, ku-template, limecv, mensa-tex, musicography, na-position, notestex, outlining, pdfreview, spark-otf, spark-otf-fonts, theatre, unitn-bimrep, upzhkinsoku, wallcalendar, xltabular. Updated packages acmart, amsmath, animate, arabluatex, arara, babel, babel-french, bangorexam, baskervillef, beebe, biblatex-philosophy, biblatex-source-division, bibletext, bidi, bxjaprnind, bxjscls, bxpapersize, bytefield, classicthesis, cochineal, complexity, cooking-units, curves, datetime2-german, dccpaper, doclicense, docsurvey, eledmac, epstopdf, eqparbox, esami, etoc, fbb, fei, fithesis, fmtcount, fnspe, fonts-tlwg, fontspec, genealogytree, glossaries, glossaries-extra, hecthese, hepthesis, hvfloat, ifplatform, ifptex, inconsolata, jfmutil, jsclasses, ketcindy, knowledge, koma-script, l3build, l3experimental, l3kernel, l3packages, langsci, latex2man, latexbug, lato, leadsheets, libertinust1math, listofitems, luatexja, luatexko, luatodonotes, lwarp, markdown, mcf2graph, media9, newtx, novel, numspell, ocgx2, overpic, philokalia, phonenumbers, platex, poemscol, pst-exa, pst-geometrictools, pst-ovl, pst-plot, pst-pulley, pst-tools, pst-vehicle, pst2pdf, pstool, pstricks, pstricks-add, pxchfon, pxjahyper, quran, randomlist, rec-thy, reledmac, robustindex, scratch, skrapport, spectralsequences, tcolorbox, tetex, tex4ht, texcount, texdoc, tikzducks, tikzsymbols, toptesi, translation-biblatex-de, unicode-math, updmap-map, uplatex, widetable, xcharter, xepersian, xetexko, xetexref, xsim, zhlipsum.

26 September 2017

Norbert Preining: Debian/TeX Live 2017.20170926-1

A full month or more has passed since the last upload of TeX Live, so it was high time to prepare a new package. Nothing spectacular here, I have to say: two small bugs fixed and the usual long list of updates and new packages. From the new packages I found fontloader-luaotfload an interesting project. Loading fonts via lua code in luatex is by now standard, and this package allows for experiments with newer/alternative font loaders. Another very interesting new-comer is pdfreview which lets you set pages of another PDF on a lined background and add notes to it, good for reviewing. Enjoy. New packages abnt, algobox, beilstein, bib2gls, cheatsheet, coelacanth, dijkstra, dynkin-diagrams, endofproofwd, fetchcls, fixjfm, fontloader-luaotfload, forms16be, hithesis, ifxptex, komacv-rg, ku-template, latex-refsheet, limecv, mensa-tex, multilang, na-box, notes-tex, octave, pdfreview, pst-poker, theatre, upzhkinsoku, witharrows. Updated packages 2up, acmart, acro, amsmath, animate, babel, babel-french, babel-hungarian, bangorcsthesis, beamer, beebe, biblatex-gost, biblatex-philosophy, biblatex-source-division, bibletext, bidi, bpchem, bxjaprnind, bxjscls, bytefield, checkcites, chemmacros, chet, chickenize, complexity, curves, cweb, datetime2-german, e-french, epstopdf, eqparbox, esami, etoc, fbb, fithesis, fmtcount, fnspe, fontspec, genealogytree, glossaries, glossaries-extra, hvfloat, ifptex, invoice2, jfmutil, jlreq, jsclasses, koma-script, l3build, l3experimental, l3kernel, l3packages, latexindent, libertinust1math, luatexja, lwarp, markdown, mcf2graph, media9, nddiss, newpx, newtx, novel, numspell, ocgx2, philokalia, phfqit, placeat, platex, poemscol, powerdot, pst-barcode, pst-cie, pst-exa, pst-fit, pst-func, pst-geometrictools, pst-ode, pst-plot, pst-pulley, pst-solarsystem, pst-solides3d, pst-tools, pst-vehicle, pst2pdf, pstricks, pstricks-add, ptex-base, ptex-fonts, pxchfon, quran, randomlist, reledmac, robustindex, scratch, skrapport, spectralsequences, tcolorbox, tetex, tex4ht, texcount, texdef, texinfo, texlive-docindex, texlive-scripts, tikzducks, tikzsymbols, tocloft, translations, updmap-map, uplatex, widetable, xepersian, xetexref, xint, xsim, zhlipsum.

20 June 2017

Norbert Preining: TeX Live 2017 hits Debian/unstable

Yesterday I uploaded the first packages of TeX Live 2017 to Debian/unstable, meaning that the new release cycle has started. Debian/stretch was released over the weekend, and this opened up unstable for new developments. The upload comprised the following packages: asymptote, cm-super, context, context-modules, texlive-base, texlive-bin, texlive-extra, texlive-lang, texworks, xindy.
I already mentioned the following changes in a previous post: The last two changes are described together with other news (easy TEXMF tree management) in the TeX Live release post. These changes more or less sum up the new infrastructure developments in TeX Live 2017. Since the last release to unstable (which happened on 2017-01-23) about half a year of package updates have accumulated; below is an approximate list of updates (not split into new/updated, though). Enjoy the brave new world of TeX Live 2017, and please report bugs to the BTS! Updated/new packages:
academicons, achemso, acmart, acro, actuarialangle, actuarialsymbol, adobemapping, alkalami, amiri, animate, aomart, apa6, apxproof, arabluatex, archaeologie, arsclassica, autoaligne, autobreak, autosp, axodraw2, babel, babel-azerbaijani, babel-english, babel-french, babel-indonesian, babel-japanese, babel-malay, babel-ukrainian, bangorexam, baskervaldx, baskervillef, bchart, beamer, beamerswitch, bgteubner, biblatex-abnt, biblatex-anonymous, biblatex-archaeology, biblatex-arthistory-bonn, biblatex-bookinother, biblatex-caspervector, biblatex-cheatsheet, biblatex-chem, biblatex-chicago, biblatex-claves, biblatex-enc, biblatex-fiwi, biblatex-gb7714-2015, biblatex-gost, biblatex-ieee, biblatex-iso690, biblatex-manuscripts-philology, biblatex-morenames, biblatex-nature, biblatex-opcit-booktitle, biblatex-oxref, biblatex-philosophy, biblatex-publist, biblatex-shortfields, biblatex-subseries, bibtexperllibs, bidi, biochemistry-colors, bookcover, boondox, bredzenie, breqn, bxbase, bxcalc, bxdvidriver, bxjalipsum, bxjaprnind, bxjscls, bxnewfont, bxorigcapt, bxpapersize, bxpdfver, cabin, callouts, chemfig, chemformula, chemmacros, chemschemex, childdoc, circuitikz, cje, cjhebrew, cjk-gs-integrate, cmpj, cochineal, combofont, context, conv-xkv, correctmathalign, covington, cquthesis, crimson, crossrefware, csbulletin, csplain, csquotes, css-colors, cstldoc, ctex, currency, cweb, datetime2-french, datetime2-german, datetime2-romanian, datetime2-ukrainian, dehyph-exptl, disser, docsurvey, dox, draftfigure, drawmatrix, dtk, dviinfox, easyformat, ebproof, elements, endheads, enotez, eqnalign, erewhon, eulerpx, expex, exsheets, factura, facture, fancyhdr, fbb, fei, fetamont, fibeamer, fithesis, fixme, fmtcount, fnspe, fontmfizz, fontools, fonts-churchslavonic, fontspec, footnotehyper, forest, gandhi, genealogytree, glossaries, glossaries-extra, gofonts, gotoh, graphics, graphics-def, graphics-pln, grayhints, gregoriotex, gtrlib-largetrees, gzt, halloweenmath, handout, hang, heuristica, hlist, hobby, hvfloat, hyperref, hyperxmp, ifptex, ijsra, japanese-otf-uptex, jlreq, jmlr, jsclasses, jslectureplanner, karnaugh-map, keyfloat, knowledge, komacv, koma-script, kotex-oblivoir, l3, l3build, ladder, langsci, latex, latex2e, latex2man, latex3, latexbug, latexindent, latexmk, latex-mr, leaflet, leipzig, libertine, libertinegc, libertinus, libertinust1math, lion-msc, lni, longdivision, lshort-chinese, ltb2bib, lualatex-math, lualibs, luamesh, luamplib, luaotfload, luapackageloader, luatexja, luatexko, lwarp, make4ht, marginnote, markdown, mathalfa, mathpunctspace, mathtools, mcexam, mcf2graph, media9, minidocument, modular, montserrat, morewrites, mpostinl, mptrees, mucproc, musixtex, mwcls, mweights, nameauth, newpx, newtx, newtxtt, nfssext-cfr, nlctdoc, novel, numspell, nwejm, oberdiek, ocgx2, oplotsymbl, optidef, oscola, overlays, pagecolor, pdflatexpicscale, pdfpages, pdfx, perfectcut, pgfplots, phonenumbers, phonrule, pkuthss, platex, platex-tools, polski, preview, program, proofread, prooftrees, pst-3dplot, pst-barcode, pst-eucl, pst-func, pst-ode, pst-pdf, pst-plot, pstricks, pstricks-add, pst-solides3d, pst-spinner, pst-tools, pst-tree, pst-vehicle, ptex2pdf, ptex-base, ptex-fontmaps, pxbase, pxchfon, pxrubrica, pythonhighlight, quran, ran_toks, reledmac, repere, resphilosophica, revquantum, rputover, rubik, rutitlepage, sansmathfonts, scratch, seealso, sesstime, siunitx, skdoc, songs, spectralsequences, stackengine, stage, sttools, studenthandouts, svg, tcolorbox, tex4ebook, tex4ht, texosquery, 
texproposal, thaienum, thalie, thesis-ekf, thuthesis, tikz-kalender, tikzmark, tikz-optics, tikz-palattice, tikzpeople, tikzsymbols, titlepic, tl17, tqft, tracklang, tudscr, tugboat-plain, turabian-formatting, txuprcal, typoaid, udesoftec, uhhassignment, ukrainian, ulthese, unamthesis, unfonts-core, unfonts-extra, unicode-math, uplatex, upmethodology, uptex-base, urcls, variablelm, varsfromjobname, visualtikz, xassoccnt, xcharter, xcntperchap, xecjk, xepersian, xetexko, xevlna, xgreek, xsavebox, xsim, ycbook.

29 April 2017

Russ Allbery: Review: Neverness

Review: Neverness, by David Zindell
Publisher: Bantam Spectra
Copyright: May 1988
Printing: July 1989
ISBN: 0-553-27903-3
Format: Mass market
Pages: 552
Mallory Ringess is a Pilot, one of the people who can guide a lightship through interstellar space from inside the dark cocoon and biotech interface that allows visualization of the mathematics of interstellar travel. At the start of the book, he's young, arrogant, impulsive, and has a deeply unhealthy relationship with Leopold Soli, the Lord Pilot and supposedly his uncle by marriage (although they share a remarkable physical resemblance). An encounter with his uncle in a bar provokes a rash promise, and Ringess finds himself promising to attempt to map the Solid State Entity in search of the Elder Eddas, a secret of life from the mythical Ieldra that might lead to mankind's immortality. The opening of Neverness is Ringess's initial voyage and brash search, in which he proves to be a capable mathematician who can navigate a region of space twisted and deformed by becoming part of a transcendent machine intelligence. The knowledge he comes away with, though, is scarcely more coherent than the hints Soli relates at the start of the story: the secret of mankind is somehow hidden in its deepest past. That, in turn, provokes a deeply bizarre trip into the ice surrounding his home city of Neverness to attempt to steal biological material from people who have recreated themselves as Neanderthals. Beyond that point, I would say that things get even weirder, but weird still implies some emotional connection with the story. I think a more accurate description is that the book gets more incoherently mystical, more hopelessly pretentious, and more depressingly enthralled by childish drama. It's the sort of thing that one writes if one is convinced that the Oedipal complex is the height of subtle characterization. I loathed this book. I started loathing this book partway through Ringess's trip through the Solid State Entity, when Zindell's prose reached for transcendent complexity, tripped over its own shoelaces, and fell headlong into overwrought babbling. I continued reading every page because there's a perverse pleasure in hate-reading a book one dislikes this intensely, and because I wanted to write a review on the firm foundation of having endured the entire experience. The paperback edition I have has a pull quote from Orson Scott Card on the cover, which includes the phrase "excellent hard science fiction." I'm not sure what book Card read, because if this is hard science fiction, Lord of the Rings is paranormal romance. Even putting aside the idea that one travels through interstellar space by proving mathematical theorems in artificially dilated time (I don't think Zindell really understands what a proof is or why you write one), there's the whole business with stopping time with one's mind, reading other people's minds, and remembering one's own DNA. The technology, such as it is, makes considerably less sense than Star Wars. The hard SF requirement to keep technology consistent with extrapolated science is nowhere to be found here. The back-cover quote from the St. Louis Post-Dispatch is a bit more on-target: "Reminiscent of Gene Wolfe's New Sun novels... really comes to life among the intrigues of Neverness." This is indeed reminiscent of Gene Wolfe, in that it wouldn't surprise me at all if Zindell fell in love with the sense of antiquity, strangeness, and hints of understood technology that Wolfe successfully creates and attempted to emulate Wolfe in his first novel. Sadly, Zindell isn't Wolfe. 
Almost no one is, which is why attempting to emulate the extremely difficult feat Wolfe pulls off in the Book of the New Sun in your first novel is not a good idea. The results aren't pretty. There is something to be said for resplendent descriptions, rich with detail and ornamental prose. That something is "please use sparingly and with an eye to the emotional swings of the novel." Wolfe does not try to write most of a novel that way, which is what makes those moments of description so effective. Wolfe is also much better at making his mysteries and allusions subtle and unobtrusive, rather than having the first-person protagonist beat the reader over the head with them for pages at a time. This is a case where showing is probably better than telling. Let me quote a bit of description from the start of the book:
She shimmers, my city, she shimmers. She is said to be the most beautiful of all the cities of the Civilized Worlds, more beautiful even than Parpallaix or the cathedral cities of Vesper. To the west, pushing into the green sea like a huge, jewel-studded sleeve of city, the fragile obsidian cloisters and hospices of the Farsider's Quarter gleamed like black glass mirrors. Straight ahead as we skated, I saw the frothy churn of the Sound and the whitecaps of breakers crashing against the cliffs of North Beach and above the entire city, veined with purple and glazed with snow and ice, Waaskel and Attakel rose up like vast pyramids against the sky. Beneath the half-ring of extinct volcanoes (Urkel, I should mention, is the southernmost peak, and though less magnificent than the others, it has a conical symmetry that some find pleasing) the towers and spires of the Academy scattered the dazzling false winter light so that the whole of the Old City sparkled.
That's less than half of that paragraph, and the entire book is written like that, even in the middle of conversations. Endless, constant words piled on words about absolutely everything, whether important or not, whether emotionally significant or not. And much of it isn't even description, but philosophical ponderings that are desperately trying to seem profound. Here's another bit:
Although I knew I had never seen her before, I felt as if I had known her all my life. I was instantly in love with her, not, of course, as one loves another human being, but as a wanderer might love a new ocean or a gorgeous snowy peak he has glimpsed for the first time. I was practically struck dumb by her calmness and her beauty, so I said the first stupid thing which came to mind. "Welcome to Neverness," I told her.
Now, I should be fair: some people like this kind of description, or at least have more tolerance for it than I do. But that brings me to the second problem: there isn't a single truly likable character in this entire novel. Ringess, the person telling us this whole story, is a spoiled man-child, the sort of deeply immature and insecure person who attempts to compensate through bluster, impetuousness, and refusing to ever admit that he made a mistake or needed to learn something. He spends a good portion of the book, particularly the deeply bizarre and off-putting sections with the fake Neanderthals, attempting to act out some sort of stereotyped toxic masculinity and wallowing in negative emotions. Soli is an arrogant, abusive asshole from start to finish. Katherine, Ringess's love interest, is a seer who has had her eyes removed to see the future (I cannot express how disturbing I found Zindell's descriptions of this), has bizarre and weirdly sexualized reactions to the future she never explains, and leaves off the ends of all of her sentences, which might be the most pointlessly irritating dialogue quirk I've seen in a novel. And Ringess's mother is a man-hating feminist from a separatist culture who turns into a master manipulator (I'm starting to see why Card liked this book). I at least really wanted to like Bardo, Ringess's closest friend, who has a sort of crude loyalty and unwillingness to get pulled too deep into the philosophical quicksand lurking underneath everything in this novel. Alas, Zindell insists on constantly describing Bardo's odious eating, belching, and sexual habits every time he's on the page, thus reducing him to the disgusting buffoon who gets drunk a lot and has irritating verbal tics. About the only person I could stand by the end of the book was Justine, who at least seems vaguely sensible (and who leaves the person who abuses her), but she's too much of a non-entity to carry sustained interest. (There is potential here for a deeply scathing and vicious retelling of this story from Justine's point of view, focusing on the ways she was belittled, abused, and ignored, but I think Zindell was entirely unaware of why that would be so effective.) Oh, and there's lots of gore and horrific injury and lovingly-described torture, because of course there is. And that brings me back to the second half of that St. Louis Post-Dispatch review quote: "... really comes to life among the intrigues of Neverness." I would love to know what was hiding behind the ellipses in this pull quote, because this half-sentence is not wrong. Insofar as Neverness has any real appeal, it's in the intrigues of the city of Neverness and in the political structure that rules it. What this quote omits is that these intrigues start around page 317, more than halfway through the novel. That's about the point where faux-Wolfe starts mixing with late-career Frank Herbert and we get poet-assassins, some revelations about the leader of the Pilot culture, and some more concrete explanations of what this mess of a book is about. Unfortunately, you have to read through the huge and essentially meaningless Neanderthal scenes to get there, scenes that have essentially nothing to do with the interesting content of this book. (Everything that motivates them turns out to be completely irrelevant to the plot and useless for the characters.) The last 40% of the book is almost passable, and characters I cared about might have even made it enjoyable. 
Still, a couple of remaining problems detract heavily, chief among them that the story's great revelation has no connection to, well, anything else in the story. We learn at the very start of the novel that the stars of the Vild are mysteriously exploding, and much of the novel is driven by the search for an explanation and a solution. The characters do find an explanation, but not through any investigation. Ringess is simply told what is happening, in a wad of exposition, as a reward for something else entirely. It's weirdly disconnected from and irrelevant to everything else in the story. (There are some faint connections to the odd technological rules that the Pilot society lives under, but Zindell doesn't even draw attention to those.) The political intrigue in Neverness is similar: it appears out of nowhere more than halfway through the book, with no dramatic foundation for the motives of the person who has been keeping most of the secrets. And the final climax of the political machinations involves a bunch of mystical nonsense masquerading as science, and more of the Neanderthal bullshit that ruins the first half of the book. This is a thoroughly bad book: poorly plotted, poorly written, clotted and pretentious in style, and full of sociopaths and emotionally stunted children. I read the whole thing because I'm immensely stubborn and make poor life choices, but I was saying the eight deadly words ("I don't care what happens to these people") by a hundred pages in. Don't emulate my bad decisions. (Somehow, this novel was shortlisted for the Arthur C. Clarke award in 1990. What on earth could they possibly have been thinking?) Neverness is a stand-alone novel, but the ending sets up a subsequent trilogy that I have no intention of reading. Followed by The Broken God. Rating: 2 out of 10

7 November 2016

Reproducible builds folks: Reproducible Builds: week 80 in Stretch cycle

What happened in the Reproducible Builds effort between Sunday October 30 and Saturday November 5 2016: upcoming events; reproducible work in other projects; bugs filed; reviews of unreproducible packages; weekly QA work; diffoscope development; buildinfo.debian.net development; tests.reproducible-builds.org and reproducible Debian work; and miscellanea.

81 package reviews have been added, 14 have been updated and 43 have been removed this week, adding to our knowledge about identified issues. 3 issue types have been updated, 1 issue type has been removed and 1 issue type has been updated. During reproducibility testing, some FTBFS bugs were detected and reported.

Also, with thanks to Profitbricks for sponsoring the "hardware" resources, Holger created a 13-core machine with 24GB RAM and 100GB of SSD-based storage so that Ximin can do further tests and development on GCC and other software on a fast machine. This week's edition was written by Chris Lamb, Ximin Luo, Vagrant Cascadian and Holger Levsen, and reviewed by a bunch of Reproducible Builds folks on IRC.

2 November 2016

Reproducible builds folks: Reproducible Builds: week 79 in Stretch cycle

What happened in the Reproducible Builds effort between Sunday October 23 and Saturday October 29 2016:

Upcoming events: The second Reproducible Builds World Summit will be held from December 13th-15th in Berlin! See the link for more details. Other events: Introduction to Reproducible Builds - Vagrant Cascadian will be presenting at the SeaGL.org conference in Seattle, USA on November 12th, 2016; and the Reproducible Debian Hackathon - a small hackathon organized in Boston, USA on December 3rd and 4th. If you are interested in attending, contact Valerie Young - spectranaut in the #debian-reproducible IRC channel on irc.oftc.net.

IRC meeting: The next IRC meeting will be held on 2016-11-01 at 18:00 UTC. The meeting after that will be held on 2016-11-15 at 18:00 UTC.

Reproducible work in other projects: Ximin Luo has had his fix to bug 77985 accepted into GCC. This is needed to be able to write test cases for patches to make GCC produce debugging symbols that are reproducible regardless of the build path. There was continued discussion on the mailing list regarding our build path proposals; it has now been decided to use an environment variable SOURCE_PREFIX_MAP instead of the older proposal SOURCE_ROOT_DIR. This would be similar to GCC's existing -fdebug-prefix-map option, which allows for better disambiguation between paths from different packages (see the sketch below). mandoc's makewhatis is now reproducible; it is used by all the BSDs, including FreeBSD, as well as Alpine Linux and Void Linux.

Packages were reviewed and fixed, and bugs were filed, by Chris Lamb and Reiner Herrmann. Reviews of unreproducible packages: 145 package reviews have been added, 608 have been updated and 94 have been removed this week, adding to our knowledge about identified issues; 3 issue types have been updated. During reproducibility testing, some FTBFS bugs were detected and reported. There was also work on tests.reproducible-builds.org (Debian and general), diffoscope development and miscellanea.

This week's edition was written by Ximin Luo, Chris Lamb and Holger Levsen and reviewed by a bunch of Reproducible Builds folks on IRC.
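To make the build-path discussion above a little more concrete, here is a minimal sketch of what prefix mapping does for debug information. The file name, the /usr/src/hello canonical path, and the mapping value shown in the comments are illustrative assumptions; only -fdebug-prefix-map=OLD=NEW is an existing GCC option, and the SOURCE_PREFIX_MAP environment-variable interface was still only a proposal at the time of this report.

```c
/* hello.c -- a toy program for demonstrating build-path prefix mapping.
 *
 * Building this with -g from two different checkout directories normally
 * embeds each absolute build path in the DWARF debug info, so the two
 * binaries differ even though the source is identical.  GCC's existing
 * option rewrites the recorded path at compile time, for example:
 *
 *   gcc -g -fdebug-prefix-map=$PWD=/usr/src/hello -o hello hello.c
 *
 * so both builds record the same canonical location (/usr/src/hello is an
 * arbitrary name chosen for this sketch).  The SOURCE_PREFIX_MAP variable
 * discussed above is a proposal to pass the same OLD=NEW mapping through
 * the environment rather than per-compiler flags; its exact syntax was
 * not yet settled when this report was written.
 */
#include <stdio.h>

int main(void)
{
    printf("hello, reproducible world\n");
    return 0;
}
```

Comparing two such builds, one made with the mapping and one without, using diffoscope is an easy way to see exactly where the build path leaks into the debug sections.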
