Search Results: "igor"

14 October 2013

Charles Plessy: Update of EMBOSS explorer in Wheezy.

EMBOSS explorer was broken in Debian 7 (Wheezy) because of an incompatibility with EMBOSS 6.4. The package was repaired in the second point update (7.2). Development and maintenance of EMBOSS explorer stopped years ago; if a new serious bug surfaces, we may need to remove the package rather than repair it. Consequently, do not hesitate to suggest an alternative to us or, if you are a developer who needs EMBOSS explorer, to see how you can reinvigorate the project (currently hosted on SourceForge).

29 September 2013

Dirk Eddelbuettel: Rcpp 0.10.5

A new version of Rcpp is now on the CRAN network for GNU R; binaries for Debian have been uploaded as well. Once more, this release brings a large number of exciting changes to Rcpp. Some concern usability, some bring new features, some increase performance; see below for the detailed list. We have now released three updates on a quarterly cycle; if we keep this up, the next version ought to be ready at the end of December. As in the past, we tested the release rather rigorously by checking against all packages I could (relatively easily) build on my server: this time it successfully passed R CMD check for all 107 packages I can build locally out of a total of 136 packages (two others failed: one for an error in Makevars, and one for the need of an X11 server during tests; this may get addressed in the test script next time). As all of these 107 packages passed, we do not expect any issues with dependent packages. Should there be issues, we would appreciate a note, preferably with reproducible code, to the rcpp-devel mailing list. The complete NEWS entry for 0.10.5 is below; more details are in the ChangeLog file in the package and on the Rcpp Changelog page.
Changes in Rcpp version 0.10.5 (2013-09-28)
  • Changes in R code:
    • New R function demangle that calls the DEMANGLE macro.
    • New R function sizeof to query the byte size of a type. This returns an object of S3 class bytes that has a print method showing bytes and bits.
  • Changes in Rcpp API:
    • Add defined(__sun) to lists of operating systems to test for when checking for lack of backtrace() needed for stack traces.
    • as<T*>, as<const T*>, as<T&> and as<const T&> are now supported, when T is a class exposed by modules, i.e. with RCPP_EXPOSED_CLASS
    • DoubleVector has been added as an alias to NumericVector
    • New template function is<T> to identify if an R object can be seen as a T. For example is<DataFrame>(x). This is a building block for more expressive dispatch in various places (modules and attributes functions).
    • wrap can now handle more types, i.e. types that iterate over std::pair<const KEY, VALUE> where KEY can be converted to a String and VALUE is either a primitive type (int, double) or a type that wraps. Examples (see also the sketch after this list):
      • std::map<int, double> : we can make a String from an int, and double is primitive
      • boost::unordered_map<double, std::vector<double> >: we can make a String from a double and std::vector<double> can wrap itself
      Other examples of this are included at the end of the wrap unit test file (runit.wrap.R and wrap.cpp).
    • wrap now handles containers of classes handled by modules. e.g. if you expose a class Foo via modules, then you can wrap vector<Foo>, ... An example is included in the wrap unit test file.
    • RcppLdFlags(), often used in Makevars files of packages using Rcpp, is now exported from the package namespace.
  • Changes in Attributes:
    • Objects exported by a module (i.e. by a RCPP_MODULE call in a file that is processed by sourceCpp) are now directly available in the environment. We used to make the module object available, which was less useful.
    • A plugin for openmp has been added to support use of OpenMP.
    • Rcpp::export now takes advantage of the more flexible as<>, handling constness and referenceness of the input types. For users, it means that for the parameters of functions exported by modules, we can now use references, pointers and const versions of them. The Module.cpp file has an example.
    • No longer call non-exported functions from the tools package
    • No longer search the inline package as a fallback when loading plugins for the Rcpp::plugins attribute.
  • Changes in Modules:
    • We can now expose functions and methods that take T& or const T& as arguments. In these situations objects are no longer copied as they used to be.
  • Changes in sugar:
    • is_na supports classes DatetimeVector and DateVector
  • Changes in Rcpp documentation:
    • The vignettes have been moved from inst/doc/ to the vignettes directory which is now preferred.
    • The appearance of the vignettes has been refreshed by switching to the Bitstream Charter font and the microtype package.
  • Deprecation of RCPP_FUNCTION_*:
    • The macros from the preprocessor_generated.h file have been deprecated. They are still available, but they print a message in addition to their expected behavior.
    • The macros will be permanently removed in the first Rcpp release after July 2014.
    • Users of these macros should start replacing them with more up-to-date code, such as using 'Rcpp attributes' or 'Rcpp modules'.
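To make two of the items above concrete, here is a minimal sourceCpp-style sketch (the function names are invented for illustration) exercising the extended wrap and the new is<T> described in this list:
#include <Rcpp.h>
#include <map>
using namespace Rcpp;

// wrap() now handles containers iterating over std::pair<const KEY, VALUE>:
// the int keys become element names, the double values a numeric vector.
// [[Rcpp::export]]
SEXP wrap_map_demo() {
    std::map<int, double> m;
    m[1] = 1.5;
    m[2] = 2.5;
    return wrap(m);
}

// is<T>() tests whether an R object can be seen as a T.
// [[Rcpp::export]]
bool looks_like_data_frame(SEXP x) {
    return is<DataFrame>(x);
}
After sourceCpp() compiles such a file, wrap_map_demo() returns a named numeric vector, and looks_like_data_frame(mtcars) returns TRUE.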
Thanks to CRANberries, you can also look at a diff to the previous release 0.10.4. As always, even fuller details are on the Rcpp Changelog page and the Rcpp page, which also leads to the downloads, the browseable doxygen docs and zip files of doxygen output for the standard formats. A local directory has source and documentation too. Questions, comments etc. should go to the rcpp-devel mailing list off the R-Forge page.

Joachim Breitner: Heidelberg Laureates Forum 2013

During the last week I was attending the first Heidelberg Laureates Forum as one of the lucky 200 accepted young scientists. The HLF is a (from now on hopefully yearly) event that brings together Fields Medalists, Abel Prize laureates and Turing Award winners with young scientists (undergraduates, Ph.D. students and postdocs) from both fields in the city of Heidelberg. The extremely well organized week consisted of lectures from the laureates, some workshops held by postdocs, excursions and plenty of good food. Videos of the lectures are available (but don't work on Linux, at least not for me), and I have shot a few pictures of the event as well. I believe that my favourite lectures were Michael Atiyah's "Advice to a Young Mathematician", Vladimir Voevodsky's "Univalent Foundations of Mathematics", William Morton Kahan's "Desperately Needed Remedies for the Undebuggability of Large-Scale Floating-Point Computations in Science and Engineering" and Alan Kay's "Putting Turing to Work".

Where are all the functional programmers? During that event, one gets to talk to many other math and computer science researchers; sometimes just "Where are you from?" and "What do you do?", sometimes long discussions. Unfortunately, I hardly found one who is into functional programming language research. Is that only because the event was parallel to ICFP (which I really would have liked to attend as well), or is functional programming really just about 1% of all computer science?

What is a proof? My other research interest lies in interactive theorem proving, especially using Isabelle. Of course that is a topic that one can discuss with almost everyone at that event, including the mathematicians. The reactions were rather mixed: On the one end of the spectrum, some mathematicians seriously doubted that they would ever trust a computer to check proofs and that it would ever be efficient enough to use. Others would not mind having a button that tells them whether their paper written in LaTeX is correct, but were not keen to invest time or thought into making the proof readable by the computer. And then there were some (but very few!) who had not heard of theorem proving before and were very excited by the prospect of being able to obtain certainty about their proofs immediately and without having to bother other scientists with it. During the mathematicians' panel discussion, where I posed the question "Do you see value in, or even a need for, machine-checked proofs in mathematics?", Efim Zelmanov (Fields Medal 1994) said "a proof is what other mathematicians see as a proof". I found this attitude a bit surprising: for me, a proof has always been a rigorous derivation within a formal system (say, ZFC set theory), and what we write in papers is a (less formal) description of the actual proof, whose existence we believe in. Therefore I was very happy to see Vladimir Voevodsky give a very committed talk about Univalent Foundations and how using that as the language for mathematics will allow more of mathematics to be cast in a formal, machine-checked form. I got the chance to discuss this with him in person, as I wanted to hear his opinion on Isabelle, and especially on the usefulness of the style of structured proof that Isar provides, which is closer to the style of proofs that mathematicians use in papers.
He said that he enjoyed writing his proofs in the style required in Type Theory and in Coq, and that maybe mathematicians should and will adjust to the language of the system, while I believe that a structured proof language like Isar, independent of the underlying logic (HOL in this case, which is insufficient to form a base for all of abstract mathematics), is a very useful feature and that proof assistants should adjust to the mathematicians.
We also briefly discussed my idea of working with theorem provers already with motivated students in high schools, e.g. in math clubs, and found that simple proofs about arithmetic of natural numbers could be feasible here, without being too trivial; a small illustration follows at the end of this post. All in all a very rewarding and special week, and I can only recommend trying to attend one of the next forums, if possible.
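Purely as an illustration of the kind of exercise meant above (a minimal sketch in Lean 4 rather than Isabelle; the theorem name is arbitrary), here is a proof that addition on the natural numbers is commutative, by induction:
-- Commutativity of addition on the natural numbers, by induction
-- on the second argument (Lean 4 syntax).
theorem add_comm' (m n : Nat) : m + n = n + m := by
  induction n with
  | zero => simp                 -- base case: m + 0 = 0 + m
  | succ k ih =>                 -- inductive step, hypothesis ih : m + k = k + m
    rw [Nat.add_succ, ih, Nat.succ_add]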

25 June 2013

Dirk Eddelbuettel: Rcpp 0.10.4

A new version of Rcpp is now on the CRAN network for GNU R; binaries for Debian have been uploaded as well. This release brings a fairly large number of fixes and improvements across a number of Rcpp features; see below for the detailed list. We are also announcing with this release that we plan to phase out the RCPP_FUNCTION_* macros. Not only have they been superseded by Rcpp Modules and Rcpp Attributes (each of which has its own pdf vignette in the Rcpp package), but they also appear to be at best lightly used. We are, for example, not aware of any CRAN packages deploying them. To provide a smooth transition, we are aiming to keep them around for another twelve months, but plan to remove them with the first release after that time window has passed. As before, we tested the release rather rigorously by checking against all packages I could (relatively easily) build on my server: this time it covered 91 of the 124 CRAN packages depending on Rcpp. As all of these 91 packages passed their checks, we do not expect any issues with dependent packages. The complete NEWS entry for 0.10.4 is below; more details are in the ChangeLog file in the package and on the Rcpp Changelog page.
Changes in Rcpp version 0.10.4 (2013-06-23)
  • Changes in R code: None beyond those detailed for Rcpp Attributes
  • Changes in Rcpp attributes:
    • Fixed problem whereby the interaction between the gc and the RNGScope destructor could cause a crash.
    • Don't include package header file in generated C++ interface header files.
    • Lookup plugins in inline package if they aren't found within the Rcpp package.
    • Disallow compilation for files that don't have extensions supported by R CMD SHLIB
  • Changes in Rcpp API:
    • The DataFrame::create set of functions has been reworked to just use List::create and feed to the DataFrame constructor
    • The operator-() semantics for Date and Datetime are now more in line with standard C++ behaviour; with thanks to Robin Girard for the report.
    • RNGScope counter now uses unsigned long rather than int.
    • Vector<*>::erase(iterator, iterator) was fixed. Now it does not remove the element pointed to by last (similar to what is done on STL types and what was intended initially). Reported on Rcpp-devel by Toni Giorgino.
    • Added equality operator between elements of CharacterVectors.
  • Changes in Rcpp sugar:
  • Changes in Rcpp build tools:
    • Fix by Martyn Plummer for Solaris in handling of SingleLogicalResult.
    • The src/Makevars file can now optionally override the path for /usr/bin/install_name_tool which is used on OS X.
    • Vignettes are trying harder not to be built in parallel.
  • Changes in Rcpp documentation:
    • Updated the bibliography in Rcpp.bib (which is also sourced by packages using Rcpp).
    • Updated the THANKS file.
  • Planned Deprecation of RCPP_FUNCTION_*:
    • The set of macros RCPP_FUNCTION_ etc ... from the preprocessor_generated.h file will be deprecated in the next version of Rcpp, i.e. they will still be available but will generate some warning in addition to their expected behavior.
    • In the first release that is at least 12 months after this announcement, the macros will be removed from Rcpp.
    • Users of these macros (if there are any) should start replacing them with more up-to-date code, such as using Rcpp attributes or Rcpp modules (see the sketch after this list).
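To illustrate the suggested migration (a sketch only: add2 is a made-up example, and the old macro call is paraphrased rather than quoted from preprocessor_generated.h), a function written with the deprecated macros can usually be restated as a plain C++ function carrying an export attribute:
#include <Rcpp.h>

// Previously, a macro such as RCPP_FUNCTION_2 from preprocessor_generated.h
// generated the R-callable wrapper around a two-argument function body.
// With Rcpp attributes, an ordinary C++ function is simply marked for export;
// compileAttributes() or sourceCpp() then generate the glue code.

// [[Rcpp::export]]
int add2(int a, int b) {
    return a + b;
}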
Thanks to CRANberries, you can also look at a diff to the previous release 0.10.3. As always, even fuller details are on the Rcpp Changelog page and the Rcpp page, which also leads to the downloads, the browseable doxygen docs and zip files of doxygen output for the standard formats. A local directory has source and documentation too. Questions, comments etc. should go to the rcpp-devel mailing list off the R-Forge page.

1 June 2013

Daniel Pocock: Democracy down-under is not democracy

This weekend we have the AGM of a local community organisation. Like many organisations I am involved in, there will be a democratic process of decision making, a board election and a low-budget social activity afterwards. We take it for granted that democracy is a good thing. In my own country, Australia, people are supposedly happier than they've ever been; no doubt people will suggest that our democracy is part of the recipe for success. While this example is from Australia, it could well happen anywhere. While everybody was slapping themselves on the back about our officially confirmed contentment, the politicians tried to slip something under the radar. With elections expected in September, the press exposed a secret deal between the two biggest political parties, raiding the public piggy bank for $60 million to prop up their campaign accounts. There is even a leaked copy of a letter confirming the deal. It was to be voted through parliament in a stitch-up within 48 hours of the 'happiness' announcement. Why would these two big political parties engage in such a gross conspiracy? Weren't they already content with their whopping big pay increases that put our Prime Minister on a bigger salary than the US President or the UK Prime Minister? Well, you don't have to look hard to find out what this special funding was all about: Not long ago, the post office at the University of Melbourne where Wikileaks operates a post-office box was mysteriously shut down. While that may seem like it could just be a coincidence on its own, it's worth considering in the wider context: the Wikileaks party is one of the most widely recognised names in Australian politics right now. The party's leader, like the new pope, is seen as somebody who puts his principles ahead of his own comfort, living a humble life in exile while our politicians romp around with prostitutes paid for with stolen money. Whatever you think of Wikileaks or Mr Assange's private life, they are not the only example here. There are other democratic movements in our country that are equally frightening for those who are currently drunk on power. One of the independent MPs holding the balance of power is a former Lieutenant Colonel in Australia's spy agency who was ridiculed by a prior Government and forced out of his job when he exposed the sham of the Iraq war. Neither of the major political parties wants to continue being held accountable by someone who has shown such strong principles against their campaign of death and deception. That $60 million welfare payment to big political parties was intended to be something akin to a weapon of mass destruction, obliterating independent representatives from the parliament. More recently, the same independent MP has been equally vigorous in his campaign to break the scourge of our mafia-style gambling industry with its cosy links to Australian politicians. Now it is starting to become obvious just how scary democracy can be. Motivated by the spectacle of a few independents holding our incumbent politicians to account, other Australians have also volunteered to get in on the act and try their hand at running the country.
One of Australia's richest men, Clive Palmer, has found that even with his enormous wealth (and having started planning more than a year before the election), his plans to form a political party are dampened by the fact that it can't be registered with a proper name to be listed on the ballot paper: all the candidates have to be listed under his name, Palmer, or their own names, barely distinguishable from the independent candidates. This discriminatory approach to the creation of political parties clearly favours the two big incumbent groups. Now it is a lot clearer why the existing politicians needed an extra $60 million war chest: like Lance Armstrong's illegal doping program, it was intended to keep them ahead of the pack. It all goes to show that people should not take democracy for granted: constant vigilance and involvement are needed to hold leaders to account or replace them when they deviate.

30 May 2013

Daniel Pocock: Free JavaScript, Debian and Drupal

Hot on the heels of my announcement about the benefits of combining free JavaScript on Debian with Drupal's libraries module, the FSF has launched a high-profile campaign directed at public web sites using non-free JavaScript. It's excellent to see that the rigorous policies used by Debian and the Drupal project, such as the Debian Free Software Guidelines and Drupal's use of the GPL, have provided a turn-key solution that web publishers can go to in order to give the best possible experience to their end users.

23 March 2013

Dirk Eddelbuettel: Rcpp 0.10.3

A new release, 0.10.3, of Rcpp is now on CRAN and in Debian. This is the fourth release in the 0.10.* series, and it further extends and solidifies the excellent Rcpp attributes. A few other bugs were fixed as well, and support for wide character strings has been added. We once again tested this fairly rigorously by checking against 86 of the 100 CRAN packages depending on Rcpp. All of these passed, so we do not expect any issues with dependent packages, but one never knows. The complete NEWS entry for 0.10.3 is below; more details are in the ChangeLog file in the package and on the Rcpp Changelog page.
Changes in Rcpp version 0.10.3 (2013-03-23)
  • Changes in R code:
    • Prevent build failures on Windows when Rcpp is installed in a library path with spaces (transform paths in the same manner that R does before passing them to the build system).
  • Changes in Rcpp attributes:
    • Rcpp modules can now be used with sourceCpp
    • Standalone roxygen chunks (e.g. to document a class) are now transposed into RcppExports.R
    • Added Rcpp::plugins attribute for binding directly to inline plugins. Plugins can be registered using the new registerPlugin function.
    • Added built-in cpp11 plugin for specifying the use of C++11 in a translation unit (see the sketch after this list)
    • Merge existing values of build related environment variables for sourceCpp
    • Add global package include file to RcppExports.cpp if it exists
    • Stop with an error if the file name passed to sourceCpp has spaces in it
    • Return invisibly from void functions
    • Ensure that line comments invalidate block comments when parsing for attributes
    • Eliminated spurious empty hello world function definition in Rcpp.package.skeleton
  • Changes in Rcpp API:
    • The very central use of the R API functions R_PreserveObject and R_ReleaseObject has been replaced by a new system based on the functions Rcpp_PreserveObject, Rcpp_ReleaseObject and Rcpp_ReplaceObject, which shows better performance and is implemented using a generic vector treated as a stack instead of a pairlist as in the R implementation. However, as this preserve / release code is still a little rough at the edges, a new #define is used (in config.h) to disable it for now.
    • Platform-dependent code in Timer.cpp now recognises a few more BSD variants thanks to contributed defined() test suggestions
    • Support for wide character strings has been added throughout the API. In particular String, CharacterVector, wrap and as are aware of wide character strings
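As a minimal sketch of these attribute features (the file and function names are invented for illustration), a standalone C++ file can request C++11 through the new cpp11 plugin and still be compiled directly with sourceCpp:
// [[Rcpp::plugins(cpp11)]]
#include <Rcpp.h>

// [[Rcpp::export]]
double sum_vec(Rcpp::NumericVector x) {
    double total = 0;
    for (auto v : x)    // C++11 range-based for, enabled by the plugin
        total += v;
    return total;
}
From R, sourceCpp("sum_vec.cpp") compiles the file and exposes sum_vec(); custom plugins of the same shape can be added with the new registerPlugin() function.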
Thanks to CRANberries, you can also look at a diff to the previous release 0.10.2. As always, even fuller details are on the Rcpp Changelog page and the Rcpp page, which also leads to the downloads, the browseable doxygen docs and zip files of doxygen output for the standard formats. A local directory has source and documentation too. Questions, comments etc. should go to the rcpp-devel mailing list off the R-Forge page.

22 February 2013

Richard Hartmann: Finland I

Finland. Helsinki, Lahti, streets. Arriving at Helsinki airport, we filed a claim with Lufthansa as a hard-shell suitcase had a splintered corner. We were surprised that so many Finns arrived from Munich with skis; more on that later. We picked up our car and started on our way towards Koli; driving with a top speed of 100 km/h and often being limited to 80 km/h or even 60 km/h is... unusual... Finnish police/authorities seem to be obsessed with enforcing those speed limits, as there are a lot of speed cameras along the way. Finnish people seem to be similarly obsessed with slot machines; there is an incredible number of them at gas stations and a constant stream of people playing them. From an outsider's perspective, it's weird that a country so strict about one form of addiction, alcohol, working against it vigorously by means of taxes, would allow another form of addiction, gambling, to run as freely, and thus allow so many slot machines. Speaking of taxes on alcohol: a single 0.33 l bottle of beer is more expensive in a Finnish supermarket than 0.5 l of beer in a German restaurant. Which also explains why supermarkets tend to have a rather large section with relatively cheap alcohol-free beer. Anyway, coming back to streets: Highway intersections don't have continuous on/off ramps from which you change from one highway to another; you drive off of the highway, stop at a traffic light, and then continue onto the other highway. Weird system, but given the amount of traffic we witnessed, it's probably Good Enough (tm). Stopping for a short time in Lahti simply because it's apparently famous for winter sports competitions, we arrived at Future Freetime in Koli national park after about five to six gruelling hours of net driving through somewhat bad weather and behind slow drivers.

Koli. Hiking up to Ukko-Koli and its sister peaks proved to be rather exhausting, as we kept breaking through the snow cover to our knees and sometimes even our hips. Once we were up there, we realized that even though you couldn't see it in between the trees, there was fog all over the plains, so we couldn't see anything. Still, it was a nice hike, even if somewhat short. Note to self: Even when a trail is marked locally, if OpenStreetMap does not know about it... don't walk along it. Especially not when the going's rough already. And if there's a sign suggesting you wear snow shoes... wear snow shoes. Returning to Koli Hotel and the museum next to it, we walked over to the ski slope. The highest peak within Koli, Ukko-Koli, is 347 meters high, and the local ski slope starts a good way below that. This would explain why a lot of Finns came back from Munich with their skis... Afterwards, we rented a snow mobile, without guide or supervision, and drove from Loma-Koli over lake Pielinen towards Purnuniemi and in a large circle down towards lake Ryynäskylä, where we turned around and went back the same way. If we thought Finnish streets don't have a lot of signs, we soon realized that snow mobile tracks have even fewer. There are at most two or three signs pointing you in the right direction, but on the plus side, there are no posted speed limits for snow mobiles, either. In somewhat related news, snow mobiles can go at least 95 km/h. At that point, the scratched and dirty visor of your rental helmet will keep slamming down, forcing you to take one hand off the handle and thus stop accelerating to maintain stability. To round off the day, we heated up the sauna built into our little wooden hut.
Running outside three times to rub myself down with snow from head to toe, I almost slipped and fell while standing still. When your feet are too hot for the snowy ground, you'll start to melt your own little pools of slippery water/snow mush within seconds. File that one under "I would never have guessed unless I had experienced it myself".

Generic. The MarkDown source of this blog post is not even 5 kiB in size; even in a worst-case scenario, pushing this to my ikiwiki instance via git will eat up less than 10 kiB of mobile data. Which is good, because I have 78 MiB of international data left on this plan. This is also the reason why there are no links in this blog post: I am writing everything offline and don't want to search for the correct URLs to link to. I really wish EU regulators would start to tackle data roaming now that SMS and voice calls are being forced down into somewhat sane pricing regions by regulations. PS:
-rw-r--r-- 1 richih richih 4.6K Feb 11 22:55 11-Finland-I.mdwn
[...]
Writing objects: 100% (7/7), 2.79 KiB, done.

15 February 2013

Francesca Ciceri: The DPL game

In his latest bits from the DPL, Stefano wrote:
I'd like to respond (also) here to inquiries I'm receiving these days: I will not run again as DPL. So you have about 20 days to mob\^Wconvince other DDs to run, or decide to run yourself. Do not wait for the very last minute, as that makes for lousy campaigns.
Ladies and gentlemen, I am pleased to present you... THE DPL GAME
GOALS:
The goal of the game is to let people know you think they'd be nice DPLs.
The point is not to pressure them, but to let them know they're awesome and make them at least consider the idea of running for DPL. The winners are those who have at least one of their Fantastic Four running for DPL. Bonus points if one of them ends up being the next DPL.
RULES:
Name three persons (plus a reserve, just in case) you'd like to see as candidates for DPL. Publicly list them (on your blog or on identi.ca using the hashtag #DPLgame) or at least let them know that you'd like to have them as a candidate for DPL (via private mail).
You may want to add a couple of lines explaining the rationale for your choices.
AGE:
0-99
NUMBER OF PLAYERS:
The more the merrier.
Some suggestions on how to play:
First think of the qualities a DPL needs to do, in your opinion, a good job. Then look around you: the people you work with, the people you see interacting on mailing lists, etc. There must be someone with those qualities.
Here are my Fantastic Four (in rigorous alphabetical order): In my opinion, they all more or less have: enthusiasm, a general understanding of dynamics inside the project and of various technical sides of the project itself, the ability to delegate and coordinate with different people (inside and outside the project), good communication skills, and some diplomacy and ability in de-escalating conflicts. These are people I have worked with or observed working and discussing on mailing lists, and I think they'd do a good job. But -hey!- we are almost a thousand developers, and you cannot possibly know everyone or observe all the people who work in the various teams. This is why you should pick your four names!

6 February 2013

Biella Coleman: Edward Tufte was a phreak

It has been so very long since I have left a trace here. I guess moving to two new countries (Canada and Quebec), starting a new job, working on Anonymous, and finishing my first book was a bit much. I miss this space, not so much because what I write here is any good, but it is a handy way for me to keep track of time and what I do and even think. My life feels like a blur at times, and hopefully here I can see its rhythms and changes a little more clearly if I occasionally jot things down. So I thought it would be nice to start with something that I found surprising: famed information designer Edward Tufte, a professor emeritus at Yale, was a phone phreak (and there is a stellar new book on the topic by former phreak Phil Lapsley). He spoke about his technological exploration during a sad event, a memorial service in NYC which I attended for the hacker and activist Aaron Swartz. I had my wonderful RA transcribe the speech, so here it is [we may not have the right spelling for some of the individuals, so please let us know of any mistakes]:
Edward Tufte's Speech From Aaron Swartz's Memorial
Speech starts 41:00 [video cuts out in beginning]
We would then meet over the years for a long talk every now and then, and my responsibility was to provide him with a reading list, a reading list for life, and then about two years ago Quinn had Aaron come to Connecticut and he told me about the four and a half million downloads of scholarly articles, and my first question is, "Why isn't MIT celebrating this?"
[Video cuts out again]
Obviously helpful in my career there, he then became president of the Mellon Foundation, he then retired from the Mellon Foundation, but he was asked by the Mellon Foundation to handle the problem of JSTOR and Aaron. So I wrote Bill Bullen (sp?) an email about it. I said first that Aaron was a treasure, and then I told a personal story about how I had done some illegal hacking and been caught at it and what happened. In 1962, my housemate and I invented the first blue box. That's a device that allows for free, undetectable, unbillable long distance telephone calls. And we got this up and played around with it, and the end of our research came when we concluded what was the longest long distance call ever made, which was from Palo Alto to New York, time-of-day, via Hawaii. Well, during our experimentation, AT&T, on the second day it turned out, had tapped our phone, and uh, but it wasn't until about 6 months later when I got a call from the gentleman, AJ Dodge, senior security person at AT&T, and I said, "I know what you're calling about." And so we met, and he said, "You what you are doing is a crime that would..." you know, all that. But I knew it wasn't serious, because he actually cared about the kind of engineering stuff, and complained that the tone signals we were generating were not the standard, because they record them and play them back in the network to see what numbers we were that you were trying to reach, but they couldn't break through the noise of our signal. The upshot of it was that, uh, oh, and he asked why we went off the air after about 3 months, because this was to make long distance telephone calls for free, and I said this was because we regarded it as an engineering problem and we had made the longest long distance call, and so that was it. So the deal was, as I explained in my email to Bill Bullen, that we wouldn't try to sell this and we were told, I was told, that crime syndicates would pay a great deal for this, we wouldn't do any more of it, and that we would turn our equipment over to AT&T. And so they got a complete vacuum tube isolator kit for making long distance phone calls. But I was grateful to AJ Dodge and, I must say, AT&T, that they decided not to wreck my life. And so I told Bill Bullen that he had a great opportunity here, to not wreck somebody's life, and of course he thankfully did the right thing.
Aaron's unique quality was that he was marvelously and vigorously different. There is a scarcity of that. Perhaps we can all be a little more different too.
Thank you very much.

21 December 2012

Dirk Eddelbuettel: Rcpp 0.10.2

Release 0.10.2 of Rcpp provides the second update to the 0.10.* series, and has arrived on CRAN and in Debian. It brings another great set of enhancements and extensions, building on the recent 0.10.0 and 0.10.1 releases. The new Rcpp attributes were rewritten to not require Rcpp modules (as we encountered an issue with exceptions on Windows when built this way), code was reorganized to significantly accelerate compilation, and a couple of new things such as more Rcpp sugar goodies, a new timer class, and a new string class were added. See below for full details. We also tested this fairly rigorously by checking about two thirds of the over 90 CRAN packages depending on Rcpp (the remainder required even more package installs, which we did not do as this was already taking about 12 total CPU hours to test). We are quite confident that no changes are required (besides one in our own RcppClassic package, which we will update). The complete NEWS entry for 0.10.2 is below; more details are in the ChangeLog file in the package and on the Rcpp Changelog page.
Changes in Rcpp version 0.10.2 (2012-12-21)
  • Changes in Rcpp API:
    • Source and header files were reorganized and consolidated so that compile times are now significantly lower
    • Added additional check in Rstreambuf deletion
    • Added support for clang++ when using libc++, and for icpc in std=c++11 mode, thanks to a patch by Yan Zhou
    • New class Rcpp::String to facilitate working with a single element of a character vector
    • New utility class sugar::IndexHash inspired by Simon Urbanek's fastmatch package
    • Implementation of the equality operator between two Rcomplex
    • RNGScope now has an internal counter that enables it to be safely used multiple times in the same stack frame.
    • New class Rcpp::Timer for benchmarking (see the sketch after this list)
  • Changes in Rcpp sugar:
    • More efficient version of match based on IndexHash
    • More efficient version of unique based on IndexHash
    • More efficient version of in based on IndexHash
    • More efficient version of duplicated based on IndexHash
    • More efficient version of self_match based on IndexHash
    • New function collapse that implements paste(., collapse = "")
  • Changes in Rcpp attributes:
    • Use code generation rather than modules to implement sourceCpp and compileAttributes (eliminates problem with exceptions not being able to cross shared library boundaries on Windows)
    • Exported functions now automatically establish an RNGScope
    • Functions exported by sourceCpp now directly reference the external function pointer rather than rely on dynlib lookup
    • On Windows, Rtools is automatically added to the PATH during sourceCpp compilations
    • Diagnostics are printed to the console if sourceCpp fails and C++ development tools are not installed
    • A warning is printed when compileAttributes detects Rcpp::depends attributes in source files that are not matched by Depends/LinkingTo entries in the package DESCRIPTION
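As a small sketch of the new benchmarking class (the function name and the busy-loop are invented for the example), Rcpp::Timer records named steps and converts to a numeric vector of elapsed nanoseconds:
#include <Rcpp.h>
using namespace Rcpp;

// [[Rcpp::export]]
NumericVector timer_demo() {
    Timer timer;                    // new in 0.10.2
    timer.step("start");            // record a named time point
    double s = 0;
    for (int i = 0; i < 1000000; i++) s += i;
    timer.step("loop");             // record another time point
    NumericVector res(timer);       // elapsed nanoseconds at each step
    return res;
}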
Thanks to CRANberries, you can also look at a diff to the previous release 0.10.1. As always, even fuller details are on the Rcpp Changelog page and the Rcpp page, which also leads to the downloads, the browseable doxygen docs and zip files of doxygen output for the standard formats. A local directory has source and documentation too. Questions, comments etc. should go to the rcpp-devel mailing list off the R-Forge page.

Daniel Kahn Gillmor: libasound2-plugins is a resource hog!

I run mpd on debian on "igor", an NSLU2 -- a very low-power ~266MHz armel machine, with no FPU and a scanty 32MiB of RAM. This serves nicely to feed my stereo with music that is controllable from anywhere on my LAN. When playing music and talking to a single mpd client, the machine is about 50% idle. However, during a recent upgrade, something wanted to pull in pulseaudio, which in turn wanted to pull in libasound2-plugins, and i distractedly (foolishly) let it. With that package installed, after an mpd restart, the CPU was completely thrashed (100% utilization) and music only played in stutters of 1 second interrupted by a couple seconds of silence. igor was unusable for its intended purpose. Getting rid of pulseaudio was my first attempt to fix the stuttering, but the problem remained even after pulse was all gone and mpd was restarted. Then i did a little search of which packages had been freshly installed in the recent run:
grep ' install .* <none> ' /var/log/dpkg.log
and used that to pick out the offending package. After purging libasound2-plugins and restarting mpd, igor is back in action. Lesson learned: on low-overhead machines, don't allow apt to install recommends!
echo 'APT::Install-Recommends "0";' >> /etc/apt/apt.conf
And it should go without saying, but sometimes i get sloppy: i need to pay closer attention during an "apt-get dist-upgrade".
Tags: alsa, apt, low-power, mpd

9 November 2012

Gunnar Wolf: Road trip to ECSL 2012 in Guatemala

Encuentro Centroamericano de Software Libre! Guatemala! During a national (for us) holiday, so it's easy to go without missing too much work time! How would I miss the opportunity? Several years ago, I started playing with the idea of having a road trip. Probably this was first prompted by the UK crew and the Three Intrepid Motorcycle Riders arriving by land to DebConf 9, I don't know. Fact is, I wanted to go to DebConf10 in New York by land, as well as to DebConf12 in Nicaragua. Mostly due to a lack of time, I didn't, although we did start making some longish trips. Of course, my desire to show Regina what Mexico is like also helped! So, up until a week ago, our (according to my standards) long-distance driving experience included:
  • México – Guanajuato – Puerto Vallarta – Guanajuato – México, in early November 2011, for the Festival de Software Libre and with Regina and our Blender friends Octavio and Claudia. Totalling almost 1900 km, mostly consisting of wide toll highway.
  • México – Xilitla – San Luis Potosí – México, in April 2012, just for fun and for a nice short vacation, alone with Regina. Totalling almost 1200 km, but through the Sierra Gorda de Querétaro, a very tough stretch of about 250 km which we did at about 50 km/h on average. Beautiful route for sure! We didn't originally intend to go through San Luis Potosí, and it does not appear to make much sense, as it adds ~350 km to the total, but it was even quicker than going back by the same route and, according to those who know, even faster than our planned route via Tamazunchale and Ixmiquilpan!
  • México – San Luis Potosí – Zacatecas – Aguascalientes – Guanajuato – México, in May 2012, for the Congreso Internacional de Software Libre, again with Octavio and Claudia. Totalling 1250 km, and following very good roads, although most of them were toll-free.
But there is always a certain halo over crossing a border, maybe more so in countries as large as Mexico. We convinced Pooka and Moni and, granted with some apprehension, as we knew of some important security risks in the more rural areas we wanted to go through, we decided to go to Guatemala. And although we wanted to go with a bit more time, Real Life took its toll: we could not take more time than the intersection of what our respective jobs offered. So, here goes a short(?) recap of our six-day, 3200 km trip. Of course, we have a map detailing this.

México – Veracruz. I came to my office early on Wednesday (31-oct), and left with Regina around 10AM towards Veracruz. We agreed to meet there with Moni and Pooka, who would take the night bus, and continue together. Crossing Mexico City proved to be the longest obstacle. We arrived in Veracruz already past 3PM, and spent a nice evening walking around the center and port of the city. Veracruz port can still be seen as part of central Mexico; I knew the road quite well.

Veracruz – San Andrés Tuxtla – Catemaco – San Cristóbal de las Casas. We met with our friends at the iconic Gran Café de la Parroquia at 6:30AM. Had a nice breakfast with coffee, and by 7:30 we were heading south-west. The reason to have a road trip was to get to know the route, to enjoy the countryside. So, given we "only" had to make 650 km this day, we took the non-toll road: a narrow path stretching along the coastal plains of Veracruz, until Acayucan. Doing so, we also saved some money, as the equivalent toll road is around MX$300 (~US$25)! Veracruz is a hot state. We ended up all sweaty and tired by 19:00, when we reached San Cristóbal. We had agreed not to drive at night, due to security issues, but fortunately there was quite a bit of traffic both ways between Tuxtla Gutiérrez (the Chiapas state capital, around 1 hour from San Cristóbal, where darkness caught us) and our destination, so we carried on. Now, San Cristóbal is a high city, almost as high as Mexico City (2100 m), and being more humid, it was quite chilly. We went for a walk, and were convinced that at a later time we had to stay for several days there. The city is beautiful, the region is breathtaking, there are a lot of great handicrafts as well, and it's overall very cheap. Really lovely place.

San Cristóbal de las Casas – Cd. Cuauhtémoc – La Mesilla – Guatemala. Once again, this day started early. We woke up ready to leave at 7AM, and not earlier because the hotel's parking didn't open earlier. After a very quick visit to San Cristóbal's downtown, to take some photos that didn't come out right the night before, we took the road to Comitán, stopping just for some tamales de bola y chipilín for breakfast. Central Chiapas is almost continuously populated, differing from most of my experience in Mexico. It is all humid, and has some very beautiful landscapes. We passed Comitán, which is a much larger city than we expected, went downhill after La Trinitaria, crossed a plain, and continued until hills started taking over again. We stopped in a very chaotic, dirty place: just across the border, where Ciudad Cuauhtémoc becomes La Mesilla. This border was basically what we expected: there is no half-official place to exchange money, so we had to buy quetzales from somebody who offered them on the street, at MX$2 per Q1 (where the real exchange rate should be around 1.50 to 1). While on the road, I had been half-looking for exchange posts in Comitán and onwards, and found none (and being a festive day, they would probably have been closed anyway).
But we were expecting this, after all, and exchanged just the bare minimum: MX$600 (US$50), which by magic became Q300 (US$40). The border procedure consists of:
  • Spraying the car against diseases (which has a cost of Q18)
  • Each of us has to go through migration. Note, in case you cross this border: We didn't expressly cross Mexican migration, so officially there was no record of us going out. Be sure to go through migration to avoid problems at re-entry!
    Migration has no cost.
  • Customs. As we were entering by car, I had to purchase a permit for circulation. I don't remember the exact quote, but it was around Q150, and the permit is valid for 90 days.
  • That's it! Welcome to Guatemala!
La Mesilla is in Guatemala's Huehuetenango Department, and of all the departments we crossed on the way to Guatemala City (Huehuetenango, Quetzaltenango, Totonicapán, Sololá, Chimaltenango, Sacatepéquez and Guatemala), it is the largest one. Huehuetenango is home to the Cuchumatanes mountain ridge. We found beautiful, really steep, really fertile mountains. It is plainly amazing: slopes of over 60°, quite often under agricultural use, even at their steepest points! The CA-1 highway is, in general, in very good shape. There are however many (many, many) speed bumps (topes in Mexican terminology, túmulos in Guatemalan), at least a couple at every village we crossed, not always painted. The road is narrow and quite winding; it follows river streams for most of the way. We feared it would be in much worse shape, from what we had heard, but along the whole way we found only three points where the road was unusable due to landslides, and an alternative road was always in place when we needed it. After Totonicapán, the narrow road becomes a wide (four-lane) highway. Don't let that fool you! It still goes through the center of every village along the road, so it's really not meant for speeding. Also, even though the pavement is in very good condition, it is really steep quite often. It is not the easiest road to drive, but it's (again) by far not as bad as we expected. We arrived in Guatemala City as dusk was falling, and got promptly lost. Guatemala has a very strange organization scheme: the city is divided into several zones, laid out in a swirl-like fashion. East-west roads are called Calle and north-south roads are called Avenida (except for zona 4, I think, where they are diagonal and some are Rutas while the others are Vías; I won't go into too much detail). Thing is, many people told us it's a foolproof design, and people from different countries understand the system perfectly. We didn't... At least not when we arrived. We got quite lost, and it took us around one hour to reach our hotel, at almost 19:00, almost 12 hours since we had left San Cristóbal. We went for a quick dinner, and then waited for our friends to arrive after the first day of work of ECSL, which we missed completely. And, of course, we were quite tired, so we didn't stay up much longer.

Antigua Guatemala. On Saturday, ECSL's activities started after 14:00, so we almost-kidnapped Wences, the local organization lead, and had him show us around Antigua Guatemala. Antigua was the capital of Guatemala until an earthquake destroyed it in the 1770s; the capital was moved to present-day Guatemala City, but Antigua was never completely abandoned. Today it is a world heritage site and a beautiful city, where we could/should have stayed for several days. But we were there for the conference, so we were in Antigua just a couple of hours, and headed back to Guatemala. Word of caution: going from Guatemala to Antigua, we went down the steepest road I have ever driven. Again, a real four-lane highway... but quite scary! The main focus of this post is to give some road-trip advice to potential readers... so, this time around, I won't give much detail regarding ECSL. It was quite interesting and we had some very good discussions... but it would take me too much space to talk about it!

The road back: Guatemala – Tecún Umán; Cd. Hidalgo – Arriaga. So, about the road back: yes, we had just spent three days getting to Guatemala City. We were there only for ~36 hours. And... we needed to be here by Tuesday morning no matter what.
So, on Sunday at noon we said goodbye to our good friends at ECSL and started the long way back. To get to know more of Guatemala, we went back by the CA-2 highway, which goes via the coastal plains: not close to the Pacific ocean, which we didn't get to see at all, but not through the mountains either. To get to CA-2, we took CA-9 from Guatemala. If I am not mistaken, this is the only toll road in Guatemala (at least, the only one we used, and we used some pretty good highways!). It is not expensive; I don't remember exactly, but it must have been around Q20 (US$3). We went south past Palín and on to CA-2, just outside Escuintla city, and headed west. Through all of Escuintla and Suchitepéquez it is again a four-lane highway; somewhere in Retalhuleu it becomes a two-lane highway. We were strongly advised not to take this road at night because, as the population density is significantly lower than along CA-1, it can get lonely at times, and there are several reports of robberies. We did find the place much less populated, but saw nothing suspicious in any way. Something important: there are almost no speed bumps on CA-2! The terrain stayed quite flat and easy as we crossed Quetzaltenango, and only in San Marcos did we find some interesting hills, and a very strong rain that would intermittently accompany us for the rest of the ride. So, we finally arrived at the border city of Tecún Umán at around 16:30, approximately four hours after leaving the capital. The Tecún Umán and Cd. Hidalgo cities and border crossing are completely different from the disorderly and dirty Cd. Cuauhtémoc and La Mesilla ones. The city of Tecún Umán could be just a nice town anywhere in the country; it does not feel aggressive like most border cities I have seen on our continent. We stopped to eat at El Pollo Campero and headed to the border. On the Mexican side, we also saw a very well consolidated, big and orderly migration area. The migration officers were very kind and helpful. As we had left via Cd. Cuauhtémoc without Regina getting an exit stamp, she was technically illegally out of the country (as she is not a national... they didn't care about the rest of us). The paperwork to fix this was easy, simple and straightforward. We only paid for the fumigation again (MX$60, US$5) and were allowed to leave. Anyway, we crossed the border. There is a ~30 km narrow road between Cd. Hidalgo and Tapachula, but starting in Tapachula we went northwards on a very good, four-lane and very straight highway. Even though we had agreed not to drive at night... well, we were quite hurried and still too far from Mexico City, so we decided to push on for three more hours, following the coastline until the city of Arriaga, almost at the border between Chiapas and Oaxaca. We found a little hotel to sleep a few hours and collapsed. Word of warning: this road (from Tapachula to Arriaga) is also known for robberies. We saw only one suspicious thing: two guys pushing up their motorcycle, from which they had apparently fallen. We didn't stop, as they looked healthy and not much in need of help, but talked about it later on: even though this was at night, they were not moving as if they had just crashed; nothing was scratched, neither the motorcycle nor their clothes. That might have been an attempt to mug us (or whoever stopped). This highway is very lonely, and the two directions are separated by a wall of vegetation, so nobody would have seen us had we stopped for some minutes. Be aware if you use this road!
The trip comes to an end: Arriaga – Niltepec – Istmo – Córdoba – México. The next (and last, finally!) day, we left at 6:30AM. After driving somewhat over an hour, we arrived at Niltepec, where a group of taxi drivers had the highway closed as a protest against their local government's tolerance of mototaxis. We evaluated going back to Arriaga and continuing via the Tuxtla Gutiérrez highway, but that would have been too long. We had a nice breakfast of tlayudas (which resulted in Pooka getting an allergic reaction shortly afterwards) and, talking with people here and there, were told about an alternative route along an agricultural road that goes around the blockade. So, we took this road the best way we could, and after probably an hour of driving at 20 km/h, finally came back to the main road. We planned on crossing the isthmus using the Acayucan–Juchitán road. We were amazed at the La Ventosa ("the windy") area, where we crossed a huge wind-power plant for electricity generation, so of course we got our good share of photos. From then onwards, not much more is worth mentioning. We crossed the isthmus via a quite secondary road in not too good shape (although there is a lot of machinery around, and the road will most likely improve in the next few months/years), then took the toll freeway through Veracruz to Córdoba. We stopped for a (delicious and reinvigorating!) cup of coffee in Hotel Zeballos, where Agustín de Iturbide signed with Viceroy Juan O'Donojú the treaty that granted Mexico its independence. Traveller, beware: when crossing between Puebla and Veracruz, there is a steep slope of almost 1000 m where you will almost always (except close to noon) find very thick fog; taking the highway from Córdoba, this is in the region known as Cumbres de Maltrata. We had the usual fog, and just as we left it, a thin but constant rain that stayed with us until we got home. We crossed Puebla state without further incident, and arrived at Pooka and Moni's house by 22:00. Less than one hour later, Regina and I arrived home as well. This was four days ago... and I have finally finished writing it all down ;-) Hope you find this useful, or if not, at least entertaining! If you read this post on my blog, you will find many pictures taken along the trip below (well, if you are reading the right page, not the general blog index...). If you are reading from a planet or other syndication service... well, come to the blog!

Dreamhost woes. Oh, and... yes, it sometimes happens: my blog is hosted at Dreamhost. This means that usually it works correctly... but sometimes, especially when many people request many nontrivial pages, it just gives an error. If you get an error, reload once or twice... or for as long as your patience holds ;-)

11 August 2012

Russ Allbery: Review: Design Patterns

Review: Design Patterns, by Erich Gamma, et al.
Author: Erich Gamma
Author: Richard Helm
Author: Ralph Johnson
Author: John Vlissides
Publisher: Addison-Wesley
Copyright: 1995
Printing: September 1999
ISBN: 0-201-63361-2
Format: Hardcover
Pages: 374
Design Patterns: Elements of Reusable Object-Oriented Software by the so-called "Gang of Four" (Gamma, Helm, Johnson, and Vlissides) is one of the best-known books ever written about software design, and one of the most widely cited. The language introduced here, including the names of specific design patterns, is still in widespread use in the software field, particularly with object-oriented languages. I've had a copy for years, on the grounds that it's one of those books one should have a copy of, but only recently got around to reading it. The goal of this book is to identify patterns of design that are widely used, and widely useful, for designing object-oriented software. It's specific to the object-oriented model; while some of the patterns could be repurposed for writing OO-style programs in non-OO languages, they are about inheritance, encapsulation, and data hiding and make deep use of the facilities of object-oriented design. The patterns are very general, aiming for a description that's more general than any specific domain. They're also high-level, describing techniques and methods for constructing a software system, not algorithms. You couldn't encapsulate the ideas here in a library and just use them; they're ideas about program structure that could be applied to any program with the relevant problem. With the benefit of seventeen years of hindsight, I think the primary impact of this book has been on communication within the field. The ideas in here are not new to this book. Every pattern in Design Patterns was already in use in the industry before it was published; the goal was taxonomy, not innovation. One would not come to Design Patterns to learn how to program, although most introductory texts on object-oriented programming now borrow much of the pattern terminology. Rather, Design Patterns is as influential as it is because it introduced a shared terminology and a rigor around that terminology, allowing writers and programmers to put a name to specific program structures and thus talk about them more clearly. This also allows one to take a step back and see a particular structure in multiple programs, compare and contrast how it's used, and draw some general conclusions about where it would be useful. I have the feeling that the authors originally hoped the book would serve as a toolbox, but I think it's instead become more of a dictionary. The pattern names standardized here are widely used even by people who have never read this book, but I doubt many people regularly turn to this book for ideas for how to structure programs. Design Patterns is divided into two parts: a general introduction to and definition of a software pattern followed by a case study, and then a catalog of patterns. The catalog is divided into creational patterns (patterns for creating objects), structural patterns (patterns for composing objects into larger structures), and behavioral patterns (patterns for interactions between objects). Each pattern in turn follows a very rigid presentation structure consisting of the pattern name and high-level classification, its basic intent, other common names, a scenario that motivates the pattern, comments on the applicability of the pattern, the structure and classes or objects that make up the pattern, how those participants collaborate, how the pattern achieves its goals, comments on implementation issues, sample code, known uses of the pattern in real-world software, and related patterns.
As with a dictionary, the authors go to great lengths to keep the structure, terminology, and graphical representations uniform throughout, and the cross-referencing is comprehensive (to the point of mild redundancy).

As for the patterns themselves, their success, both as terminology and as useful design elements, varies. Some have become part of the core lexicon of object-oriented programming (Factory Method, Builder, Singleton), sometimes to the point of becoming syntactic elements in modern OO languages (Iterator). These are terms that working programmers use daily. Others aren't quite as widespread, but are immediately recognizable as part of the core toolkit of object-oriented programming (Adapter, Decorator, Proxy, Observer, Strategy, Template Method). In some cases, the technique remains widespread but the name hasn't caught on (Command, for example, which will be immediately familiar but which I rarely hear called by that name outside of specific uses inside UI toolkits, probably because the term is so ambiguous). Other patterns are abstract enough that it felt like a bit of a reach to assign a name to them (Bridge, Composite, Facade), and I don't think use of those names is common, but the entries are still useful for definitional clarity and for comparing similar approaches with different implications. Only one pattern (Interpreter) struck me as insufficiently generic to warrant recording in a catalog of this type.

So far, so good, but the obvious question arises: if you've not already read this book, should you read it? I think the answer is debatable. The largest problem with Design Patterns is that it's old. It came late enough in the development of object-oriented programming that it does capture much of the foundation, but OO design has continued to change and grow, and some patterns either were developed subsequently or have since become far more important. For example, Model-View-Controller is conspicuous by its absence, mentioned only in passing in the discussion of the Observer pattern; any pattern catalog written today would discuss it extensively. Similarly absent are Inversion of Control and Data Access Object, which are much more central to the day-to-day world of the modern programmer than, say, Memento or Visitor. One could easily go on: Lazy Initialization, Mock Object, Null Object... everyone will have their own list.

A related problem is that all the implementation examples are shown in either C++ or Smalltalk (occasionally both). Those were probably the best languages to use at the time, but it's doubtful a modern rewrite would choose them. The Smalltalk, in particular, I found nearly incomprehensible to the uninitiated, to the point where I ignored the code and read only the surrounding English description. C++ fares better, but Design Patterns occasionally drifts off into tedious discussions of how to work around C++'s limitations in ways that are irrelevant to the pattern itself and would not be necessary in, say, Java or Python. (This is ameliorated by the fact that C++, unlike Smalltalk, is still in widespread use, so those discussions remain moderately helpful for some readers.)

Design Patterns is not, therefore, a very good source for a working knowledge of the most common patterns in use today. It has also become somewhat obsolete via its own success: the concept of a design pattern has become so popular that nearly all introductory texts include at least a basic discussion of design patterns and an introduction to the most notable and useful patterns.
I think that's a more comfortable and more efficient way to pick up the basics than reading through this book, which is somewhat dense and which expects from the reader a reasonably good working knowledge of object-oriented programming. And, once you have the basics, MVC, DAO, and similar design patterns are probably more important than the more subtle design patterns presented here.

That said, I think the rigor of description and the comparisons and discussions here still have some value. Design Patterns encourages the reader to look at patterns from a higher-level perspective, to think about meta-patterns such as the balance between data hiding and access, or between structure designed for the present purpose and structure that's adaptable to future needs. It's also mildly interesting from a historical standpoint: one can see the inspiration for future language designers in how problems are described here, and see how many of the implementation issues and negative consequences have been corrected or simplified by richer language designs.

Overall, I would hesitate to recommend buying this book today, particularly at new textbook prices. But if you're a working object-oriented software designer or programmer, I think it's worth checking out from a library (and, thanks to its influence, any library with a decent software design section will almost certainly have a copy). Read the overall discussion, skim the catalog, and read the discussion of the patterns that catch your interest. It may help provide some additional structure and perspective to how you think about OO design. Rating: 6 out of 10

26 June 2012

Russ Allbery: Review: Quiet

Review: Quiet, by Susan Cain
Publisher: Crown
Copyright: 2012
ISBN: 0-307-45220-4
Format: Kindle
Pages: 263
I've always been an introvert. This is something that surprises some people when they first meet me, since they equate introversion with shyness, and I'm not at all shy. It surprises others because I'm quite vocal and outspoken in meetings, but introversion also doesn't imply a lack of self-confidence. I can run meetings, give presentations, and argue my perspective in front of just about anyone, but I don't like parties, I crave time by myself, and I could happily go for weeks without seeing another human being. I'm an introvert because I find people draining rather than invigorating, written communication far easier and more comfortable than spoken, and superficial social contact more irritating and frustrating than enjoyable.

If you think that means there may be something wrong with me, or that I would be happier if "drawn out of my shell," I wish you would read this book. But I suspect its core audience will be people like me: those who are tired of being pushed to conform with extrovert beliefs about social interaction, those who are deeply disgusted by the word "antisocial" or feel pangs of irrational guilt when hearing it, or those who just want to read an examination of interpersonal interactions that, for once, is written by and about people who like quiet and solitude just like they do.

I first encountered Susan Cain via her TED talk, which I think is both the best possible summary of this book and the best advertisement for it. If you've not already seen it, watch it; it's one of the best TED talks I've seen, good enough that I've watched it three times. If you then want more of the same, buy Quiet.

Quiet has, I think, three messages. First, it's a tour of the science: what is introversion and extroversion? Is there evidence that these are real physiological differences? (Spoiler: yes.) What do we know about introversion? How do we know those things: what experiments have been done, and what methods have been used? Here, it's a good general introduction, although Cain is careful to point out that it only scratches the surface and there's much more scientific depth. For example, she touches on the connections between introversion and sensitivity to stimulus and points out that they're two separate, if related, categorizations, but doesn't have the space here to clarify the distinctions and tease them apart. But she lays a reasonable foundation, particularly in defense of introversion as a natural, physiologically grounded, scientifically analyzable, common, and healthy way of interacting with the world. (For those who are curious about the distinctions between introversion and sensitivity, and the argument that most of what Cain says here about introversion is actually about sensitivity, see the blog post by Elaine Aron.)

The second message, the one that resonated with me the most, was Cain's passionate defense of introversion. Business culture (at least in the United States, which is what both Cain and I know) is strongly biased towards extroversion; at least faking extroversion seems to be required for some career advancement. Extrovert culture dominates politics and most public discourse. It's common to find people who consider introversion, particularly in children, to be a sign of unhappiness, poor social adjustment, psychological problems, or other issues that should be "cured" or changed. Cain's gentle but firm passion in defense of introversion is a breath of fresh air.
She attacks open plan offices, the current obsession with group learning and social school settings, and the modern group-think bias towards collaboration over solitude and concentration, and she does that with a combination of polite frustration and the conclusions of multiple studies. Introverts will be cheering as she constructs solid arguments and musters evidence against things that we've always found miserable and then been told we were wrong, short-sighted, or socially inept for finding miserable. I am so utterly on her side in this argument that I have no way of knowing how persuasive it will be, but it's lovely just to hear someone put into words what I feel.

This defense does skew the book. Quiet is not, and does not purport to be, an even-handed presentation of introversion and extroversion. It's written proudly and unabashedly from the introvert's point of view. I'm fine with that: I, like Cain, think the US is saturated in extrovert perspectives and extrovert advice, particularly in the business world, and could use some balancing by activism from the other perspective. But be aware that this is not the book to look to for an objective study of all angles of the introvert/extrovert dichotomy, and I'm not sure her descriptions of extroversion are entirely fair or analogous to those of introversion. The extroversion described here seems somewhat extreme to me. I'm dubious that many extroverts would recognize themselves in it, which partly undermines the argument.

The third message of the book, once Cain has won the introvert's heart, is some advice on how to be a proud introvert, to make the space and find the quiet that one desires, and to balance that against places where one may want and need to act like an extrovert for a while. Cain thankfully does not try to make this too much of the book, nor does she hold up any particular approach as The Answer. All the answers are going to be individual. But she does offer some food for thought, particularly around how to be conscious of and make deliberate choices about one's energy expenditures and one's recharge space. She also captures beautifully something that I've not seen explained this well before: the relief that an introvert can feel in the company of an extrovert who helps navigate social situations, make conversation, and keep discussions going until they can reach the depth and comfort level where the introvert can engage.

I wish anyone in a position of authority over social situations would read this book, or at least watch the TED talk and be aware of the issues. Particularly managers, since (at least in my relatively limited experience) workplace culture is so far skewed towards extroversion that it can be toxic to introverts. Many of the techniques used by extrovert managers, and the goals and advice they give their employees, are simply wrong for introverts, and even damaging. Cain speaks well to the difficulties of empathy between people with very different interaction preferences, such as the problems with extroverts trying to "draw out" introverts who have hit social overload (or sensitive people who have hit stimulus overload). She also discusses something that I'd previously not thought about, namely how the pressure towards extroversion leads people to act extroverted even when naturally introverted, and how it's therefore very difficult to tell from behavior (and sometimes even to tell internally!) what one's natural interaction style is.
But mostly I recommend this book if you're an introvert, if the TED talk linked above speaks to you. Even if we can't convince the world to respect introversion more, or at least stop treating it as abnormal, it's a lovely feeling to read a book from someone who gets it. Who understands. Who fills a book with great stories about introverts and how they construct their worlds and create quiet space in which to be themselves. Rating: 9 out of 10

7 May 2012

Lars Wirzenius: Quality of discussion in free software development

The Online Photographer has a meta-article on some discussion in the photography world. Summary: someone wrote an opinion piece on one site, and people on the discussion forum of another site got his name wrong, possibly repeatedly. And the quality of the discussion went down from there.

The quality of discourse in free software development is frequently of some concern. Debian has a reputation for hosting, er, particularly vigorous discussions. That reputation is not unwarranted, but, I think, we've improved a lot since 2005. The problem is hardly restricted to Debian, however.

How can we improve this? I don't know. As a community, I'm not even sure we agree on what the problems are. Here's my list. Insults, personal attacks, and other such outrageously bad behavior are uncommon. They cross the line so clearly that they become easy to deal with; I don't think handling them needs much attention.

What can we do about this? I'm not sure. I have, for the time being, abandoned Debian mailing lists as a way to influence what goes on in the project, but that's just a way for me to clear some space in my head and time in my day to actually do things. My pet hypothetical solution of the day is that mailing lists might raise the quality of the debates by limiting the number of messages written by each person per day in each thread. This might, I think, induce people to write with more thought and put more effort into making each message count.
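Mechanically, such a cap would be trivial. As a thought experiment only, here is a minimal sketch of per-person, per-thread daily throttling; the limit of three messages is an invented number, and none of this is something the post actually proposes implementing:

    from collections import defaultdict
    from datetime import date

    DAILY_LIMIT = 3  # invented cap; the post doesn't suggest a number

    # (sender, thread, day) -> messages accepted so far
    counts = defaultdict(int)

    def accept(sender, thread, today=None):
        """Return True if the message may go out now, False if it must wait."""
        key = (sender, thread, today or date.today())
        if counts[key] >= DAILY_LIMIT:
            return False
        counts[key] += 1
        return True

A list server could hold rejected messages in a queue until the next day rather than bouncing them, which would preserve the incentive to make each message count.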

9 February 2012

Matthew Garrett: Is GPL usage really declining?

Matthew Aslett wrote about how the proportion of projects released under GPL-like licenses appears to be declining, at least as far as various sets of figures go. But what does that actually mean? In absolute terms, GPL use has increased - any change isn't down to GPL projects transitioning over to liberal licenses. But an increasing number of new projects are being released under liberal licenses. Why is that?

The figures from Black Duck aren't a great help here, because they tell us very little about the software they're looking at. FLOSSmole is rather more interesting. I pulled the license figures from a few sites and found the following proportion of GPLed projects:

RubyForge: ~30%
Google Code: ~50%
Launchpad: ~70%

I've left the numbers rough because there are various uncertainties - should proprietary licenses be included in the numbers, is CC Sharealike enough like the GPL to count it there, that kind of thing. But what's clear is that these three sites have massively different levels of GPL use, and it's not hard to imagine why: they all attract different types of developer. The RubyForge figures are obviously going to be heavily influenced by Ruby developers, and that (handwavily) implies more of a bias towards web developers than in the general developer population. Launchpad, on the other hand, is going to have a closer association with people from an Ubuntu background - it's probably more representative of Linux developers. Google Code? The 50% figure is the closest to the 56.8% figure that Black Duck give, so it's probably representative of the more general development community.
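For anyone who wants to reproduce this kind of tally, a rough sketch of the approach against a FLOSSmole-style dump follows. The file layout and column name are assumptions, since each forge's export differs, and the GPL matching here is deliberately crude, with all the classification uncertainties noted above:

    import csv
    from collections import Counter

    # Hypothetical FLOSSmole-style export: one row per project, with a
    # "license" column. Real dumps differ per forge; adjust to taste.
    GPL_LIKE = ("GPL", "LGPL", "AGPL")

    def gpl_share(path):
        tally = Counter()
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                lic = row["license"].strip()
                kind = "gpl" if any(g in lic for g in GPL_LIKE) else "other"
                tally[kind] += 1
        total = tally["gpl"] + tally["other"]
        return tally["gpl"] / total if total else 0.0

    # e.g. print(f"{gpl_share('rubyforge_projects.csv'):.0%}")  # ~30% above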

The impression gained from this is that the probability of you using one of the GPL licenses is influenced by the community that you're part of. And it's not a huge leap to believe that an increasing number of developers are targeting the web, and the web development community has never been especially attached to the GPL. It's not hard to see why - the benefits of the GPL vanish pretty much entirely when you're never actually obliged to distribute the code, and while Affero attempts to compensate for that, it also constrains your UI and deployment model. No matter how strong a believer in copyleft you are, the web makes it difficult for users to take any advantage of the freedoms you'd want to offer. It's easier not to bother.
So it's pretty unsurprising that an increase in web development would be associated with a decrease in the proportion of projects licensed under the GPL.

This obviously isn't a rigorous analysis. I have very little hard evidence to back up my assumptions. But nor does anyone who claims that the change is because the FSF alienated the community during GPLv3 development. I'd be fascinated to see someone spend some time comparing project type with license use and trying to come up with a more convincing argument.

(Raw data from FLOSSmole: Howison, J., Conklin, M., & Crowston, K. (2006). FLOSSmole: A collaborative repository for FLOSS research data and analyses. International Journal of Information Technology and Web Engineering, 1(3), 17-26.)


3 January 2012

Matthew Garrett: TVs are all awful

A discussion a couple of days ago about DPI detection (which is best summarised by this and this and I am not having this discussion again) made me remember a chain of other awful things about consumer displays and EDID and there not being enough gin in the world, and reading various bits of the internet and Wikipedia seemed to indicate that almost everybody who's written about this has issues with either (a) technology or (b) English, so I might as well write something.

The first problem is unique (I hope) to 720p LCD TVs. 720p is an HD broadcast standard that's defined as having a resolution of 1280x720. A 720p TV is able to display that image without any downscaling. So, naively, you'd expect them to have 1280x720 displays. Now obviously I wouldn't bother mentioning this unless there was some kind of hilarious insanity involved, so you'll be entirely unsurprised when I tell you that most actually have 1366x768 displays. So your 720p content has to be upscaled to fill the screen anyway, but given that you'd have to do the same for displaying 720p content on a 1920x1080 device this isn't the worst thing ever in the world. No, it's more subtle than that.

EDID is a standard for a blob of data that allows a display device to express its capabilities to a video source in order to ensure that an appropriate mode is negotiated. It allows resolutions to be expressed in a bunch of ways - you can set a bunch of bits to indicate which standard modes you support (1366x768 is not one of these standard modes), you can express the standard timing resolution (the horizontal resolution divided by 8, followed by an aspect ratio) and you can express a detailed timing block (a full description of a supported resolution).

1366/8 = 170.75. Hm.

Ok, so 1366x768 can't be expressed in the standard timing resolution block. The closest you can provide for the horizontal resolution is either 1360 or 1368. You also can't supply a vertical resolution - all you can do is say that it's a 16:9 mode. For 1360, that ends up being 765. For 1368, that ends up being 769.
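The arithmetic is easy to check. Here is a small sketch of my own (in the real encoding the stored byte is (horizontal/8) - 31, but the multiple-of-8 constraint is the same) of what the standard timing block can and can't express:

    # The horizontal size must be a multiple of 8; the vertical size is
    # implied by the declared aspect ratio rather than stored directly.

    def nearest_standard_modes(hres, aspect=(16, 9)):
        if hres % 8 == 0:
            candidates = [hres]
        else:
            candidates = [hres - hres % 8, hres + 8 - hres % 8]
        return [(h, h * aspect[1] // aspect[0]) for h in candidates]

    print(nearest_standard_modes(1366))  # [(1360, 765), (1368, 769)]
    print(nearest_standard_modes(1280))  # [(1280, 720)]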

It's ok, though, because you can just put this in the detailed timing block, except it turns out that basically no TVs do, probably because the people making them are the ones who've taken all the gin.

So what we end up with is a bunch of hardware that people assume is 1280x720, but is actually 1366x768, except it's telling your computer that it's either 1360x765 or 1368x769. And you're probably running an OS that's doing sub-pixel anti-aliasing, which requires that the hardware be able to address the pixels directly, which is obviously difficult if you think the screen is one size when it's actually another. Thankfully Linux takes care of you here, and this code makes everything ok. Phew, eh?

But ha ha, no, it's worse than that. And the rest applies to 1080p ones as well.

Back in the old days, when TV signals were analogue and got turned into a picture by a bunch of magnets waving a beam of electrons about all over the place, it was impossible to guarantee that all TV sets were adjusted correctly, and so you couldn't assume that the edges of a picture would actually be visible to the viewer. In order to put text on screen without risking bits of it being lost, you had to steer clear of the edges. Over time this became roughly standardised, and the areas of the signal that weren't expected to be displayed were called overscan. Now, of course, we're in a mostly digital world and such things could be ignored, except that when digital TVs first appeared they were mostly used to watch analogue signals, so they still needed to overscan because otherwise you'd have the titles floating weirdly in the middle of the screen rather than towards the edges. And because it's never possible to kill technology that's escaped into the wild, we're stuck with it.

tl;dr - Your 1920x1080 TV takes a 1920x1080 signal, chops the edges off it and then stretches the rest to fit the screen because of decisions made in the 1930s.

So you plug your computer into a TV and even though you know what the resolution really is you still don't get to address the individual pixels. Even worse, the edges of your screen are missing.

The best thing about overscan is that it's not rigorously standardised - different broadcast bodies have different recommendations, but you're then still at the mercy of what your TV vendor decided to implement. So what usually happens is that graphics vendors have some way in their drivers to compensate for overscan, which involves you manually setting the degree of overscan that your TV provides. This works very simply - you take your 1920x1080 framebuffer and draw different sized black borders until the edge of your desktop lines up with the edge of your TV. The best bit about this is that while you're still scanning out a 1920x1080 mode, your desktop has now shrunk to something more like 1728x972 and your TV is then scaling it back up to 1920x1080. Once again, you lose.
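To put numbers on that: with a hypothetical (but typical) 10% overscan compensation, the usable desktop works out like so:

    def effective_desktop(width, height, overscan_fraction):
        """Visible area left after drawing black borders so that the
        desktop edge lines up with the edge of an overscanning TV."""
        return (round(width * (1 - overscan_fraction)),
                round(height * (1 - overscan_fraction)))

    # Still scanning out a 1920x1080 mode, but the desktop has shrunk:
    print(effective_desktop(1920, 1080, 0.10))  # (1728, 972)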

The HDMI spec actually defines an extension block for EDID that indicates whether the display will overscan, but doesn't provide any way to work out how much it'll overscan. We haven't seen many of those in the wild. It's also possible to send an HDMI information frame that indicates whether or not the video source is expecting to be overscanned, but (a) we don't do that and (b) it'll probably be ignored even if we did, because who ever tests this stuff? The HDMI spec also says that the default behaviour for 1920x1080 (but not 1366x768) should be to assume overscan. Charming.

The best thing about all of this is that the same TV will often have different behaviour depending on whether you connect via DVI or HDMI, but some TVs will still overscan DVI. Some TVs have options in the menu to disable overscan and others don't. Some monitors will overscan if you feed them an HD resolution over HDMI, so if you have HD content and don't want to lose the edges then your hardware needs to scale it down and let the display scale it back up again. It's all awful. I recommend you drink until everything's already blurry and then none of this will matter.


28 November 2011

Dirk Eddelbuettel: A Story of Life and Death. On CRAN. With Packages.

The Comprehensive R Archive Network, or CRAN for short, has been a major driver in the success and rapid proliferation of the R statistical language and environment. CRAN currently hosts around 3400 packages, and is growing at a rapid rate. Not too long ago, John Fox gave a keynote lecture at the annual R conference and provided a lot of quantitative insight into R and CRAN, including an estimate of an incredible growth rate of 40% as a near-perfect straight line on a log-scale chart! So CRAN does in fact grow exponentially. (His talk morphed into this paper in the R Journal; see figure 3 for this chart.) The success of CRAN is due to a lot of hard work by the CRAN maintainers, led for many years, and still today, by Kurt Hornik, whose dedication is unparalleled. Even at the current growth rate of several packages a day, all submissions are still rigorously quality-controlled using the strong testing features available in the R system.

And for all its successes, and without trying to sound ungrateful, there have always been some things missing at CRAN. It has always been difficult to keep a handle on the rapidly growing archive. Task Views for particular fields, edited by volunteers with specific domain knowledge (including yours truly), help somewhat, but still cannot keep up with the flow. What is missing are regular updates on packages. What is also missing is a better review and voting system (and while Hadley Wickham mentored a Google Summer of Code student to write CRANtastic, it seems fair to say that this subproject didn't exactly take off either).

Following useR! 2007 in Ames, I decided to do something, and noodled over a first design on the drive back to Chicago. A weekend of hacking led to CRANberries. CRANberries uses existing R functions to learn which packages are available right now, and compares that to data stored in a local SQLite database. This is enough to learn two things. First, which new packages were added since the last run. That is very useful information, and it feeds a website with blog subscriptions (for the technically minded: an RSS feed, at this URL). Second, it can also compare current version numbers with the most recent stored version number, and thereby learn about updated packages. This too is useful, and also feeds a website and RSS stream (at this URL; there is also a combined one for new and updated packages). CRANberries writes out little summaries for both new packages (essentially copying what the DESCRIPTION file contains) and updated packages (a quick diffstat summary). A static blog compiler munges this into static html pages which I serve from here, and creates the RSS feed data at the same time.

All this has been operating since 2007. Google Reader tells me the RSS feed averages around 137 posts per week, and has about 160 subscribers. It does feed to Planet R, which itself redistributes, so it is hard to estimate the absolute number of readers. My weblogs also indicate a steady number of visits to the html versions.

The most recent innovation was to add tweeting earlier in 2011 under the @CRANberriesFeed Twitter handle. After all, the best way to address information overload and too many posts in our RSS readers surely is to ... just generate more information and add some Twitter noise. So CRANberries now tweets a message for each new package, and a summary message for each set of new packages (or several if the total length exceeds the 140 character limit). As of today, we have sent 1723 tweets to what are currently 171 subscribers.
Tweets for updated packages were added a few months later. Which leads us to today's innovation. One feature which has truly been missing from CRAN is updates about withdrawn packages. Packages can be withdrawn for a number of reasons. Back in the day, CRAN carried so-called bundles, which carried several packages inside. Examples were VR and gregmisc. Both had long been split into their component packages, making VR and gregmisc part of the set of packages no longer on the top page of CRAN, but only in its archive section. Other examples are packages such as Design, which its author Frank Harrell renamed to rms to match the title of the book covering its methodology. And then there are of course packages for which the maintainer disappeared, or lost interest, or was unable to keep up with the quality requirements imposed by CRAN. All these packages are of course still in the Archive section of CRAN. But how many packages did disappear? Well, compared to the information accumulated by CRANberries over the years, as of today a staggering 282 packages have been withdrawn for various reasons. And I, at least, would like to know more regularly when this happens, if only so I have a chance to see whether the retired package is one of the 120+ packages I still look after for Debian (as happened recently with two Rmetrics packages).

So starting with the next scheduled run, CRANberries will also report removed packages, in its own subtree of the website and its own RSS feed (which should appear at this URL). I made the required code changes (all of about two dozen lines), and did some light testing. To not overwhelm us all with line noise while we catch up to the current steady state of packages, I have (temporarily) lowered the frequency with which CRANberries is called by cron. I also put a cap on the number of removed packages that are reported in each run. As always with new code, there may be a bug or two, but I will try to catch up in due course.

I hope this is of interest and use to others. If so, please use the RSS feeds in your RSS readers, and subscribe to @CRANberriesFeed. And keep using CRAN, and let's all say thanks to Kurt, Stefan, Uwe, and everybody who is working on CRAN (or has been in the past).
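The bookkeeping described above is simple to picture. Here is a minimal sketch of the same new/updated/removed classification; CRANberries itself is R code against its own schema, so the Python, the table name, and the column names here are all invented for illustration:

    import sqlite3

    def classify(conn, available):
        """Compare currently available packages (a name -> version dict)
        against the previously stored state in SQLite."""
        stored = dict(conn.execute("SELECT name, version FROM packages"))
        new     = [p for p in available if p not in stored]
        updated = [p for p in available
                   if p in stored and available[p] != stored[p]]
        removed = [p for p in stored if p not in available]
        return new, updated, removed

Each run would then report the three lists, update the stored table to the current state, and generate the feed entries from the differences.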

17 November 2011

Raphaël Hertzog: People Behind Debian: Mark Shuttleworth, Ubuntu's founder

I probably don't have to present Mark Shuttleworth: he was already a Debian developer when he became a millionaire after having sold Thawte to Verisign in 1999. Then in 2002 he became the first African (and first Debian developer) in space. 2 years later, he found another grandiose project to pursue: bring the Microsoft monopoly to an end with a new alternative operating system named Ubuntu (see bug #1). I met Mark during Debconf 6 in Oaxtepec (Mexico), where we were both trying to find ways to enhance the collaboration between Debian and Ubuntu. The least I can say is that Mark is opinionated, but any leader usually is, and in particular the self-appointed ones! :-) Read on to discover his view on the Ubuntu-Debian relationship and much more.

Raphael: Who are you?

Mark: At heart I'm an explorer, inventor and strategist. Change in technology, society and business is what fascinates me, and I devote almost all of my time and wealth to the catalysis of change in a direction that I hope improves society and the environment. I'm 38, and studied information systems and finance at the University of Cape Town. My heart's home is Cape Town, and I've lived there and in Star City and in London; now I live in the Isle of Man with my girlfriend Claire and 14 precocious ducks.

I joined Debian in around 1995 because I was helping to set up web servers for as many groups as possible, and I thought Debian's approach to packaging was very sensible but there was no package for Apache. In those days, the NM process was a little easier ;-)

Raphael: What was your initial motivation when you decided to create Ubuntu 7 years ago?

Mark: Ubuntu is designed to fulfill a dream of change; a belief that the potential of free software was to have a profound impact on the economics of software as well as its technology. It's obvious that the technology world is enormously influenced by Linux, GNU and the free software ecosystem, but the economics of software are still essentially unchanged.

Before Ubuntu, we had a two-tier world of Linux: the community world (Debian, Fedora, Arch, Gentoo), where you support yourself, and the restricted, commercial world of RHEL and SLES/SLED. While the community distributions are wonderful in many regards, they don't and can't meet the needs of the whole of society; one can't find them pre-installed, one can't get certified and build a career around them, one can't expect a school to deploy at scale a platform which is not blessed by a wide range of institutions. And the community distributions cannot create the institutions that would fix that.

Ubuntu brings those two worlds together, into one whole, with a commercial-grade release (inheriting the goodness of Debian) that is freely available but also backed by an institution.

The key to that dream is economics, and as always, a change in economics; it was clear to me that the flow of money around personal software would change from licensing ("buying Windows") to services ("paying for your Ubuntu ONE storage"). If that change was coming, then there might be room for a truly free, free software distribution, with an institution that could make all the commitments needed to match the commercial Linux world. And that would be the achievement of a lifetime. So I decided to dedicate a chunk of my lifetime to the attempt, and found a number of wonderful people who shared that vision to help with the attempt.
It made sense to me to include Debian in that vision; I knew it well as both a user and insider, and believed that it would always be the most rigorous of the community distributions. I share Debian's values and those values are compatible with those we set for Ubuntu.
Debian would always be the most rigorous of the community distributions.
Debian on its own, as an institution, could not be a partner for industry or enterprise. The bits are brilliant, but the design of an institution for independence implies making it difficult to be a decisive counterparty, or contractual provider. It would be essentially impossible to achieve the goals of pre-installation, certification and support for third-party hardware and software inside an institution that is designed for neutrality, impartiality and independence. However, two complementary institutions could cover both sides of this coin. So Ubuntu is the second half of a complete Debian-Ubuntu ecosystem. Debian's strengths complement Ubuntu's. Ubuntu can achieve things that Debian cannot (not because its members are not capable, but because the institution has chosen other priorities), and conversely, Debian delivers things which Ubuntu cannot, not because its members are not capable, but because it chooses other priorities as an institution. Many people are starting to understand this: Ubuntu is Debian's arrow, Debian is Ubuntu's bow. Neither instrument is particularly useful on its own, except in a museum of anthropology ;)
Ubuntu is Debian's arrow, Debian is Ubuntu's bow.
So the worst and most frustrating attitude comes from those who think Debian and Ubuntu compete. If you care about Debian, and want it to compete on every level with Ubuntu, you are going to be rather miserable; you will want Debian to lose some of its best qualities and change some of its most important practices. However, if you see the Ubuntu-Debian ecosystem as a coherent whole, you will celebrate the strengths and accomplishments of both, and more importantly, work to make Debian a better Debian and Ubuntu a better Ubuntu, as opposed to wishing Ubuntu was more like Debian and vice versa.

Raphael: The Ubuntu-Debian relationship was rather hectic at the start; it took several years to mature. If you had to start over, would you do some things differently?

Mark: Yes, there are lessons learned, but none of them are fundamental. Some of the tension was based on human factors that cannot really be altered: some of the harshest DD critics of Canonical and Ubuntu are folk who applied for but were not selected for positions at Canonical. I can't change that, and wouldn't change that, and would understand the consequences are, emotionally, what they are.

Nevertheless, it would have been good to be wiser about the way people would react to some approaches. We famously went to DebConf 5 in Porto Alegre and hacked in a room at the conference. It had an open door, and many people popped a head in, but I think the not-a-cabal collection of people in there was intimidating, and the story became one of exclusion. If we'd wanted to be exclusive, we would have gone somewhere else! So I would have worked harder to make that clear at the time if I'd known how many times that story would be used to paint Canonical in a bad light.

As for engagement with Debian, I think the situation is one of highs and lows. As a high, it is generally possible to collaborate with any given maintainer in Debian on a problem in which there is mutual interest. There are exceptions, but those exceptions are as problematic within Debian as between Debian and outsiders. As a low, it is impossible to collaborate with Debian as an institution, because of the design of the institution.
It is generally possible to collaborate with any given maintainer [...] [but] it is impossible to collaborate with Debian as an institution.
In order to collaborate, two parties must make and keep commitments. So while one Debian developer and one Ubuntu developer can make personal commitments to each other, Debian cannot make commitments to Ubuntu, because there is no person or body that can make such commitments on behalf of the institution, on any sort of agile basis. A GR is not agile ;-). I don't say this as a critique of Debian; remember, I think Debian has made some very important choices, one of those is the complete independence of its developers, which means they are under no obligation to follow a decision made by anyone else.

It's also important to understand the difference between collaboration and teamwork. When two people have exactly the same goal and produce the same output, that's just teamwork. When two people have different goals and produce different products, but still find ways to improve one another's product, that's collaboration. So in order to have great collaboration between Ubuntu and Debian, we need to start with mutual recognition of the value and importance of the differences in our approach. When someone criticises Ubuntu because it exists, or because it does not do things the same way as Debian, or because it does not structure every process with the primary goal of improving Debian, it's sad. The differences between us are valuable: Ubuntu can take Debian places Debian cannot go, and Debian's debianness brings a whole raft of goodness for Ubuntu.

Raphael: What's the biggest problem of Debian?

Mark: Internal tension about the vision and goals of Debian makes it difficult to create a harmonious environment, which is compounded by an unwillingness to censure destructive behaviour. Does Debian measure its success by the number of installs? The number of maintainers? The number of flamewars? The number of packages? The number of messages to mailing lists? The quality of Debian Policy? The quality of packages? The freshness of packages? The length and quality of maintenance of releases? The frequency or infrequency of releases? The breadth of derivatives? Many of these metrics are in direct tension with one another; as a consequence, the fact that different DDs prioritise all of these (and other goals) differently makes for interesting debate. The sort of debate that goes on and on, because there is no way to choose between the goals when everyone has different ones. You know the sort of debate I mean :-)

Raphael: Do you think that the Debian community has improved in the last 7 years? If yes, do you think that the coopetition with Ubuntu partly explains it?

Mark: Yes, I think some of the areas that concern me have improved. Much of this is to do with time giving people the opportunity to consider a thought from different perspectives, perhaps with the benefit of maturity. Time also allows ideas to flow, and of course introduces new people into the mix. There are plenty of DDs now who became DDs after Ubuntu existed, so it's not as if this new supernova has suddenly gone off in their galactic neighbourhood. And many of them became DDs because of Ubuntu. So at least from the perspective of the Ubuntu-Debian relationship, things are much healthier.

We could do much better. Now that we are on track for four consecutive Ubuntu LTS releases, on a two-year cadence, it's clear we could collaborate beautifully if we shared a freeze date. Canonical offered to help with Squeeze on that basis, but institutional commitment phobia reared its head and scotched it.
And with the proposal to put Debian's first planned freeze exactly in the middle of Ubuntu's LTS cycle, our alignment in interests will be at a minimum, not a maximum. Pure <facepalm />.

Raphael: What would you suggest to people (like me) who do not feel like joining Canonical and would like to be paid to work on improving Debian?

Mark: We share the problem; I would like to be paid to work on improving Ubuntu, but that's also a long-term dream ;-)

Raphael: What about using the earnings of the dormant Ubuntu Foundation to fund some Debian projects?

Mark: The Foundation is there in the event of Canonical's failure, to ensure that commitments, like LTS maintenance, are met. It will hopefully be dormant for good ;-)

Raphael: The crowdfunding campaign for the Debian Administrator's Handbook is still going on, and I briefly envisioned the possibility of creating the Ubuntu Administrator's Handbook. What do you think of this project?

Mark: Crowdfunding is a great match for free software and open content, so I hope this works out very well for you. I also think you'd find a bigger market for an Ubuntu book, not because Ubuntu is any more important than Debian but because it is likely to appeal to people who are more inclined to buy or download a book than to dive into the source. Again, this is about understanding the difference in audiences, not judging the projects or the products.

Raphael: Is there someone in Debian that you admire for their contributions?

Mark: Zack is the best DPL since 1995; it's an impossible job which he handles with grace and distinction. I hope praise from me doesn't tarnish his reputation in the project!
Thank you to Mark for the time spent answering my questions. I hope you enjoyed reading his answers as much as I did.

Subscribe to my newsletter to get my monthly summary of the Debian/Ubuntu news and to not miss further interviews. You can also follow along on Identi.ca, Google+, Twitter and Facebook.

