DebConf team: Wrapping up DebConf14 (Posted by Paul Wise, Donald Norwood)

ADT_REBOOT_MARK=mymarker

The new "Reboot during a test" section in README.package-tests explains this in detail with an example.
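A minimal sketch of such a restart-capable test, assuming the ADT_REBOOT_MARK variable and the /tmp/autopkgtest-reboot helper described in README.package-tests (the -x guard just lets the sketch degrade gracefully outside a real testbed):

```shell
#!/bin/sh
# First run: ADT_REBOOT_MARK is empty, so we prepare and request a
# reboot. After the reboot, autopkgtest re-runs this script with
# ADT_REBOOT_MARK set to the marker we passed.
set -e

run_phase() {
    case "${ADT_REBOOT_MARK:-}" in
        "")
            echo "phase 1: setting up before reboot"
            # Ask the testbed to reboot and resume with our marker
            if [ -x /tmp/autopkgtest-reboot ]; then
                /tmp/autopkgtest-reboot mymarker
            fi
            ;;
        mymarker)
            echo "phase 2: verifying state after reboot"
            ;;
    esac
}

run_phase
```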
Implicit test metadata for similar packages
The Debian pkg-perl team recently discussed how to add package tests to the ~3,000 Perl packages. For most of these the test metadata looks pretty much the same, so they created a new pkg-perl-autopkgtest package which centralizes the logic. autopkgtest 3.5 now supports an implicit debian/tests/control file to avoid having to modify several thousand packages with exactly the same file.
An initial run already looked quite promising: 65% of the packages pass their tests. There will be a few iterations to identify common failures and fix those in pkg-perl-autopkgtest and autopkgtest itself now.
There is still some discussion about how implicit test control files go together with the DEP-8 specification, as other runners like sadt do not support them yet. Most probably we'll declare those packages XS-Testsuite: autopkgtest-pkg-perl instead of the usual autopkgtest.
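For illustration, such a package would carry a single field in its debian/control instead of a per-package test suite (the package name here is hypothetical; the XS- prefix is used with dpkg versions that do not yet know the Testsuite field natively):

```
Source: libfoo-perl
Section: perl
Priority: optional
XS-Testsuite: autopkgtest-pkg-perl
```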
In the same vein, Debian's Ruby maintainer (Antonio Terceiro) added implicit test control support for Ruby packages. We haven't done a mass test run with those yet, but their structure will probably look very similar.
autopkgtest support. It will be expanded in the future to run tests on other suites and architectures.
gem2deb-test-runner was split out of gem2deb, so that autopkgtest tests can be run against any Ruby package that has tests, by running gem2deb-test-runner --autopkgtest. gem2deb-test-runner will do the right thing and make sure that the tests don't use code from the source package, but instead run against the installed package.
Then, right after my talk I was glad to discover that the Perl team is also working on a similar tool that will automate running tests for their packages against the installed package. We agreed that they will send me a whitelist of packages in which we could just call that tool and have it do The Right Thing.
We might be talking here about getting autopkgtest support (and consequently continuous integration) for free for a large share of the archive: packages declaring a Testsuite: autopkgtest field in the Sources file could be assumed to have autopkgtest support by calling the right tool (gem2deb-test-runner for Ruby, or the Perl team's new tool for Perl packages), with the autopkgtest test runner assuming a corresponding, implicit debian/tests/control when it does not exist in those packages.
Other points that came up included the amqp branch against the code in the master branch, a debci enqueue command that can be used to force test runs for packages given on the command line, and $HOME not being set by default when the user is not root. We discussed a few implementation options, and while I don't have a solution yet, we have a better understanding of the potential pitfalls.
The Ruby BoF session on Friday produced a few interesting discussions. Some take-away points include, but are not limited to:
rails-4.1 branch in the upstream Subversion repository as source.
I am a little nervous about using an upstream snapshot, though. According to the roadmap of the project (http://www.redmine.org/projects/redmine/roadmap), the only purpose of the 3.0 release will be to upgrade to Rails 4, but before that happens there should be a 2.6.0 release, which is also not released yet. 3.0 should be equivalent to that 2.6.0 version both feature-wise and, especially, bug-wise. The only problem is that we don't know what that 2.6.0 looks like yet. According to the roadmap it seems there is not much left in terms of features for 2.6.0, though.
The updated package is not in unstable yet, but will be soon. It needs more testing, and a good update to the documentation. Those interested in helping to test Redmine on jessie before the freeze, please get in touch with me.
Noosfero
I gave a lightning talk on Noosfero, a platform for social networking websites that I am upstream for. It is a Rails application licensed under the AGPLv3, and there are packages for wheezy. You can check out the slides I used. Video recording is not available yet, but should be soon.
That's it. I am looking forward to DebConf 15 at Heidelberg. :-)
ruby package. And that dependency graph is very small. Looking at the dependency graph for, say, the rails package will make your eyes bleed. I tried it here, and GraphViz needed a PNG image of 7653×10003 pixels to draw it. It ain't pretty.
Installing rails on a clean Debian system will pull in another 109 packages as part of the dependency chain. Again, as new versions of those packages are uploaded to the archive, there is a probability that a backwards-incompatible change, or even a bug fix which was being worked around, might make some functionality in rails stop working. Even if that probability is low for each package in the dependency chain, with enough packages the probability of any of them causing problems for rails is quite high.
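As a back-of-the-envelope illustration (the 1% failure probability is my made-up number, not from the post): if each of n dependencies independently causes a problem with probability p, the chance that at least one of them does is 1 - (1-p)^n.

```shell
#!/bin/sh
# Probability that at least one of n dependencies misbehaves,
# assuming each one fails independently with probability p.
p=0.01
for n in 10 109 264 1311; do
    awk -v p="$p" -v n="$n" \
        'BEGIN { printf "n=%4d  P(at least one)=%.2f\n", n, 1 - (1 - p)^n }'
done
```

With p = 1%, the 109 packages pulled in by rails already give roughly a two-in-three chance of some problem, and the 1311 dependencies of gnome make it a near certainty.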
And still the rails dependency chain is not that big. libreoffice will pull in another 264 packages. gnome will pull in 1311 dependencies, and kde-full 1320 (!).
With a system this big, problems will arise, and that's a fact of life. As
developers, what we can do is try to spot these problems as early as possible,
and fix them in time to make a solid release with the high quality Debian is
known for.
While automated testing is not the proverbial Silver Bullet of Software
Engineering, it is an effective way of finding regressions.
Back in 2006, Ian Jackson started the development of
autopkgtest as a tool to test Debian
packages in their installed form (as opposed to testing packages using their
source tree).
In 2011, the autopkgtest test suite format was proposed as a standard for the
Debian project, in what we now know as the
DEP-8 specification.
Since then, some maintainers such as myself started experimenting with DEP-8
tests in their packages. There was an expectation in the air that someday,
someone would run those tests for the entire archive, and that would be a
precious source of QA information.
During the holiday break last year, I decided to give it a shot. I initially
called the codebase dep8. Later I renamed it to debci, since it could
potentially also run other types of test suites in the future. Since
early January, ci.debian.net has been running an instance of debci for the Debian project.
The Debian Continuous Integration will trigger tests at most 4 times a day, 3
hours after each dinstall run. It will update a local APT cache and look for
packages that declare a DEP-8 test suite. Each package with a test suite will
then have its test suite executed if there was any change in its dependency chain
since the last test run. Test results are published at
ci.debian.net every hour, and at the end of each batch
a global status is updated.
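The "dependency chain changed" check can be approximated by recording a fingerprint of the versions in a package's dependency closure and re-running tests only when that fingerprint changes. This is just my sketch of the idea, not debci's actual code; in the real system the name=version pairs would come from the APT cache rather than being hard-coded:

```shell
#!/bin/sh
# Re-run a package's tests only when its dependency state changes.
set -e

statedir=$(mktemp -d)

dep_fingerprint() {
    # Stand-in for querying the APT cache: a sorted list of
    # name=version pairs for the package's dependency closure.
    printf 'libfoo=1.0\nlibbar=2.3\n' | sort | sha1sum | cut -d' ' -f1
}

needs_test_run() {
    pkg=$1
    new=$(dep_fingerprint "$pkg")
    old=$(cat "$statedir/$pkg" 2>/dev/null || true)
    if [ "$new" != "$old" ]; then
        echo "$new" > "$statedir/$pkg"
        return 0  # fingerprint changed: run the tests
    fi
    return 1      # unchanged: skip this package
}

needs_test_run rails && echo "running tests for rails"
needs_test_run rails || echo "skipping rails: no dependency changes"
```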
Maintainers can subscribe to a per package Atom feed to keep up with their
package test results. People interested in the overall status can subscribe to
a global Atom feed of events.
Since the introduction of Debian CI in mid-January 2014, we have seen an amazing increase in
the number of packages with test suites. We had a little less than 200 packages
with test suites back then, against around 350 now (early June 2014). The ratio
of packages passing their test suite has also improved a lot, going
from less than 50% to more than 75%.
debci is under active development, and you can expect to see a constant flux of improvements. In particular, I would like to mention a few people who are making amazing contributions to the project:
This thesis proposes a theory to characterize structural complexity in software systems. This theory aims to identify (i) the contribution of several factors to the structural complexity variation and (ii) the effects of structural complexity in software projects. Possible factors in the structural complexity variation include: human factors, such as general experience of the developers and their familiarity with the different parts of the system; factors related to the changes performed on the system, such as size variation and change diffusion; and organizational factors, such as the maturity of the software development process and the communication structure of the project. Effects of structural complexity include higher effort, and consequently higher cost, in software comprehension and maintenance activities. To test the validity of the proposed theory, four empirical studies were performed, mining data from free software project repositories. We analyzed historical data from changes performed in 13 systems from different application domains and written in different programming languages. The results of these studies indicated that all the factors studied influenced the structural complexity variation significantly in at least one of the projects, but different projects were influenced by different sets of factors. The models obtained were capable of describing up to 93% of the structural complexity variation in the projects analyzed.

Keywords: Structural Complexity, Software Maintenance, Human Factors in Software Engineering, Mining Software Repositories, Theories in Software Engineering, Empirical Software Engineering, Free/Open Source Software Projects.

Those who read Portuguese can check out the actual thesis text as a PDF file. Most of the studies discussed in the thesis are presented in English in papers I have published during the last years. My defense is going to be on March 23rd.
If you happen to be in Salvador on that day, please feel cordially invited.
#!/bin/sh -e
for attr in $(seq 0 1); do
  for fg in $(seq 30 37); do
    for bg in $(seq 40 47); do
      printf "\033[${attr};${bg};${fg}m${attr};${fg};${bg}\033[m "
    done
    echo
  done
done
Is there a package in Debian that already does that? Would people find it useful to have this packaged?
update: it turns out you can find some similar stuff on google images. It was a quick and fun hack, though.
update 2: Replacing echo -n with printf makes the script work independently of whether /bin/sh is bash or dash. Thanks to cocci for pointing that out.
Before calling exec(), you fork() a process in the background that will wait for the current process id to disappear from the process list, and then do whatever you want to do.
A simple proof of concept I wrote is composed of two bash programs: wrapper and real.
real is really simple: it just waits a few seconds and then prints its process id to the console:
#!/bin/bash
sleep 5
echo $BASHPID
wrapper is the program that handles the situation we want to exercise: it replaces itself with real, but still has the chance to do something after real finishes. In this case, wrapper notifies the user that real finished.
#!/bin/bash
echo $BASHPID
real_program_pid=$BASHPID
(
  while ps -p "$real_program_pid" >/dev/null; do
    sleep 0.1s
  done
  notify-send 'real program finished'
) &
exec ./real
One nice property that wrapper explores is that when exec() starts real, it really replaces wrapper, and therefore has the same process id (in this case accessible in bash through the $BASHPID variable). Because of this, the background process that wrapper starts just before the exec() call already knows which process it has to watch for.
The actual code for waiting is not optimal, though. I cannot use waitpid() (the wait builtin in bash), since real is not a child process of wrapper. I went with a brute-force approach here, and I am pretty sure there is a cheaper way to wait for a random PID without a busy loop (but that wasn't the point here).
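One cheaper option I know of, assuming GNU coreutils, is tail's --pid flag: tail --pid=PID -f /dev/null returns once the given process dies, and it works for arbitrary PIDs, not only children. It still polls internally, but in a single long-running process instead of spawning ps repeatedly:

```shell
#!/bin/sh
# Wait for an arbitrary PID without a shell busy loop
# (requires GNU coreutils tail).
sleep 2 &
pid=$!

tail --pid="$pid" -f /dev/null   # blocks until $pid exits
echo "process $pid finished"
```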
update: I am aware of the classic fork()/exec() pattern. My Very Good Reasons include the fact that I can't control the flow: I am writing a plugin for a program that calls its plugins in sequence, and after that, calls exec(), but my plugin is interested in doing some work after exec() finishes.
single-debian-patch option to debian/source/options, so that a single Debian patch is generated, and included a patch header that points people interested in the individual changes to the public Git repository where they were originally done.
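For reference, this setup amounts to two small files in the source package (the header text below is my own illustration of such a pointer, not the exact wording used):

```
# debian/source/options
single-debian-patch
```

plus a debian/source/patch-header whose contents dpkg-source puts on top of the generated patch, e.g.:

```
This patch bundles all Debian changes to the upstream source.
For the individual changes, with their rationale, see the public
Git repository where they were originally committed.
```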
My reasoning for doing so was the following: most upstream developers will hardly care enough to come and check the patches applied against their source in Debian, so it's not so important to have a clean source package with separated and explained patches. But then there are the people who will actually care about the patches: other distribution developers. Not imposing a specific VCS on them to review the patches applied in Debian is a nice thing to do.
Then I wrote a script called git-export-debian-patches (download, manpage), which was partly inspired by David Bremner's script. It exports all commits in the Debian packaging branch that do not touch files under debian/ and were not applied upstream to debian/patches. The script also creates an appropriate debian/patches/series files. The script is even smart enough to detect patches that were later reverted in the Debian branch and exclude them (and the commit that reverted them) from the patch list.
The advantage I see over gbp-pq is that I don't need to rebase (and thus lose history) to have a clean set of patches. The advantage over the gitpkg quilt-patches-deb-export-hook hook is that I don't need to explicitly say which ranges I want: every change that is merged in master, was not applied upstream and was not reverted gets listed as a patch. To be honest I don't have any experience with either gbp-pq or gitpkg and these advantages were based on what I read, so please leave a (nice ;-)) comment if I said something stupid.
I am looking forward to receiving feedback about the tool, especially about potential corner cases in which it would break. For now I have tested it in a package with simple changes against the upstream source, and it seems fine.
$ for i in $(seq -f %03g 1 100); do convert -scale 640x480 -quality $i /path/to/original.jpg $i.jpg; echo $i; done

Then I generated a data file by calculating the size of each file with du and piping the results through sed and awk:
$ du -b [0-9]*.jpg | sed 's/\.jpg//' | awk '{ print $2 " " $1 }'

The generated data file looks like this, with JPEG quality in the first column and file size in bytes in the second column:
001 20380
002 20383
003 20634
004 21106
[...]

Regarding file size, it seems that between quality 1 and 50, file size grows sublinearly with quality. Beyond that, the curve reaches an inflection point and grows in a way that looks, if not exponential, at least polynomial.
$ R

R version 2.13.1 (2011-07-08)
Copyright (C) 2011 The R Foundation for Statistical Computing
ISBN 3-900051-07-0
Platform: i486-pc-linux-gnu (32-bit)

R is free software and comes with ABSOLUTELY NO WARRANTY.
You are welcome to redistribute it under certain conditions.
Type 'license()' or 'licence()' for distribution details.

R is a collaborative project with many contributors.
Type 'contributors()' for more information and
'citation()' on how to cite R or R packages in publications.

Type 'demo()' for some demos, 'help()' for on-line help, or
'help.start()' for an HTML browser interface to help.
Type 'q()' to quit R.

> png()
> data <- read.table('points.dat')
> quality <- data[[1]]
> quality
  [1]  1  2  3  4  5  6  7  8  9 10 11 12 13 14 15 16 17 18
 [19] 19 20 21 22 23 24 25 26 27 28 29 30 31 32 33 34 35 36
 [37] 37 38 39 40 41 42 43 44 45 46 47 48 49 50 51 52 53 54
 [55] 55 56 57 58 59 60 61 62 63 64 65 66 67 68 69 70 71 72
 [73] 73 74 75 76 77 78 79 80 81 82 83 84 85 86 87 88 89 90
 [91] 91 92 93 94 95 96 97 98 99 100
> filesize <- data[[2]]
> filesize
  [1]  20380  20383  20634  21106  21551  22012  22469  22878  23323  23715
 [11]  24103  24494  24952  25327  25725  26127  26507  26886  27216  27550
 [21]  27917  28288  28627  28945  29271  29583  29919  30280  30516  30813
 [31]  31099  31367  31679  31873  32232  32538  32704  33072  33324  33443
 [41]  33860  34055  34253  34633  34804  35074  35216  35491  35871  35935
 [51]  36030  36443  36743  36898  37120  37382  37726  38077  38307  38581
 [61]  39002  39270  39700  39962  40388  40762  41086  41629  42062  42544
 [71]  43048  43392  44062  44824  45023  45682  46532  47347  47833  48701
 [81]  49612  50423  51694  52637  53635  55243  56340  58304  59709  62162
 [91]  64207  66273  70073  74617  79917  86745  94950 105680 128158 145937
> plot(quality, filesize, xlab = 'JPEG Quality', ylab = 'File size')
>
Save workspace image? [y/n/c]: y

Looking at the actual generated thumbnails, somewhere after quality > 60 I stopped noticing the difference between increasing quality factors.
Settling on a default quality of 75 seems to be good enough: the resulting static HTML album generated from a folder with 82 pictures dropped from 12MB with the default ImageMagick quality factor to 6MB with quality 75, with very little perceivable loss of image quality.
# We like our code without pic on x86, thank you
[i386]: shlib-with-non-pic-code
If you had not guessed it, we use the same format as is used in the Build-Depends field (except for the lack of wildcard support). So you should be familiar with it. Next.