Search Results: "sez"

22 November 2013

Vincent Sanders: Error analysis is the sweet spot for improvement

Although Don Norman was discussing designers' attitudes to user errors, I assert the same is true of programmers when we use static program analysis tools.

The errors, or rather defects in the jargon, that a static analysis tool produces can be considered low-cost, well-formed bug reports available very early in the development process.

They are low cost because a machine finds them without a user or fellow developer wasting time doing so. They are well formed because the machine can describe exactly the chain of logical deductions that leads to the defect.
Introduction
Static analysis is, in general terms, the use of a computer to examine a program for logic errors, beyond those of pure syntax, before it is executed. Examining a running program for defects is known as dynamic program analysis and, while a powerful tool in its own right, is not the topic of discussion here.

This analysis has historically been confined to compiled languages as their compilers already had the Abstract Syntax Tree (AST) of the code available for analysis. As an example, the C language (released in 1972) gained the lint tool (released in 1979), based on the PCC compiler.

Practical early compilers (I am generalising here; the 1970s were a time of white-hot innovation in computing and examples of just about any innovation in the field could probably be found) were pretty primitive and produced executables which were inferior to hand-written assembler. Due to practical constraints the progress of optimising compilers was not as rapid as might be desired, so static analysis was largely carried out as an external process.

Before progressing I ought to explain why I have just mixed the concepts of an optimising compiler and static analysis. The act of optimisation within such compilers requires program analysis, from which they can generate the defect reports we all know and love as compiler warnings. This also explains why many warnings only appear at higher optimisation levels, where deeper analysis is required.

The attentive reader may now enquire why we need external analysis tools when our compilers already perform the task. The answer stems from the fact that a compiler is trying to reconcile many desirable traits, chief among them the time taken to compile, the cost of implementing the compiler, and the quality of the generated output.

The slow progress in creating optimising compilers initially centred on the problem of producing compiled output quickly enough to allow a practical edit-compile-run-debug cycle, although more recently the issues have moved towards compiler implementation costs.

Because output generation time is still a significant factor, compilers limit the level of static analysis performed to that strictly required to produce good output. In standard operation, optimising compilers do not perform the extended analysis necessary to find all the defects that might be detectable.

An example: compiling one 200,000 line C program with the clang (v3.3) compiler, producing x86 binaries at optimisation level 2, takes 70 seconds, but running the clang-based scan-build static analysis tool on the same code took 517 seconds, more than seven times as long.
Using static analysis
As already described, warnings are a by-product of an optimising compiler's analysis, and most good programmers will endeavour to remove all warnings from a project. Thus almost all programmers are already using static analysis to some degree.

The external analysis tools available can produce many more defect reports than the compiler alone, as long as the developer is prepared to wait for the output. Because of this delay, static analysis is often done outside the usual development cycle and is often integrated into a project's Continuous Integration (CI) system.

The resulting defects are usually presented as annotated source code with a numbered list of the logical steps showing how the defect can arise. For example, the steps might highlight where a line of code allocates memory from the heap and then an exit path where no reference to the allocated memory is kept, resulting in a resource leak.
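
As an illustration of the pattern just described (a hypothetical C fragment, not taken from any real defect report), the early return below leaks the buffer allocated a few lines earlier; an analyser would list the allocation and the exit path as numbered steps:

#include <stdio.h>
#include <stdlib.h>

/* Hypothetical example of the defect class described above: the early
 * return when fopen() fails leaves the function without freeing buf,
 * so an analyser reports a memory leak on that path. */
static int read_header(const char *path)
{
    char *buf = malloc(128);          /* step 1: heap allocation */
    if (buf == NULL)
        return -1;

    FILE *f = fopen(path, "rb");
    if (f == NULL)
        return -1;                    /* step 2: exit path, buf is leaked */

    size_t n = fread(buf, 1, 128, f);
    fclose(f);
    free(buf);
    return (int)n;
}

int main(void)
{
    return read_header("/etc/hostname") < 0 ? 1 : 0;
}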

Once the analysis has been performed and a list of defects generated, the main problem with this technology rears its ugly head: so-called "false positives". The analysis is fundamentally an undecidable problem (it is a variation of the halting problem) and relies on algorithms that generate approximate solutions. Because of this, some of the identified defects are erroneous.

The level of erroneous defect reports varies depending on the codebase being analysed and on how good the analysis tool is. It is not uncommon to see false positive rates, even with the best tools, in excess of 10%.

Good tools allow for this and provide ways to supply additional context through model files or hints in the source code to suppress the incorrect defect reports. This is analogous to using asserts to explicitly constrain variable values or a type cast to suppress a type warning.
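
A minimal sketch of the source-hint idea (the exact mechanism is tool specific; this hypothetical fragment simply uses a standard C assert to state a constraint the analyser could not otherwise infer):

#include <assert.h>
#include <stddef.h>

/* Hypothetical helper: callers guarantee idx < len, but an analyser that
 * cannot see every call site may report a possible out-of-bounds read.
 * The assert states the constraint explicitly, for both human readers
 * and the analysis tool. */
static int lookup(const int *table, size_t len, size_t idx)
{
    assert(table != NULL && idx < len);
    return table[idx];
}

int main(void)
{
    int table[4] = { 1, 2, 3, 4 };
    return lookup(table, 4, 2) == 3 ? 0 : 1;
}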

Even once the false positives have been dealt with, there remains the problem of defects which, while theoretically possible, take so many steps to achieve that their probability is remote at best. These defects are often better categorised as a missing constraint, and the better analysis tools generate fewer of them than the more naive implementations.

Another issue is that defects often cluster in a small number of modules within a program, generally where the developers already know the code is of poor quality, and so the reports add little useful knowledge about the project.

As with all code quality tools, static analysis can be helpful but is not a panacea: code may be completely defect free yet still fail to function correctly.
Defect Density
A term that is often used as a metric for code quality is the defect density. This is nothing more than the ratio of defects to thousands of lines of code (KLOC); e.g. a defect density of 0.9 means that approximately one defect is found in every 1,100 lines of code.
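
Purely as an arithmetic illustration, using the NetSurf figures quoted later in this article (107 reported defects against roughly 200,000 lines of code) and ignoring the question of false positives discussed above:

    107 defects / 200 KLOC = 0.535, i.e. a defect density of roughly 0.5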

The often-quoted industry average defect density value is 1. As with all software metrics this can be a useful indicator, but it should not be used without understanding.

The value is affected by improvements in the tool as well as by how lines of code are counted, so it is exceptionally susceptible to gaming, and long-term trends must be treated with scepticism.
Practical examples
I have integrated two distinct static analysis tools into the development workflow for the NetSurf project, which I shall present as case studies. These examples cover a good open source solution and a commercial offering, highlighting the issues with each.

Several other solutions, both open source and commercial, exist; many were examined and discarded as either impractical or less useful than those selected. However, the investigation was not comprehensive and only considered what was practical for the project at the time.
clang
The clang project is a frontend to the LLVM project providing an optimising compiler for the C, C++ and Objective C languages. As part of this project the compiler has been enhanced to run a collection of "checkers" which implement various methods of analysis on the code being compiled.

The "scan-build" tool is provided to make the using these features straightforward. This tool generates defect reports as a series of html files which show the analysis results.


NetSurf CI system scan-build overview
Because a scan-build run takes in excess of eight minutes even on powerful hardware, the NetSurf developers are not going to run this tool themselves as a matter of course. To get the useful output without the downsides, it was decided to integrate the scan into the CI system's code quality checks.

NetSurf CI system scan-build result list
Whenever a git commit happens to the mainline branch and the standard check build completes successfully on all target architectures the scan is performed and the results are published as a list of defects.

The list is accessible directly through the CI interface and also incorporates a trend graph showing how many defects were detected in each build.

A scan-build report showing an extremely unlikely path to a defect
Each defect listed has a detail link which reveals the full analysis and logic necessary to cause the defect to occur.

Unfortunately, even NetSurf, which is a relatively small piece of software (around 200,000 lines of code at the time of writing), causes scan-build to emit 107 defects.

All but 25 of the defects are, however, "Dead Store" reports, where a value is assigned but never subsequently read. These reports are simply not interesting to the developers, and they occur in code generated by a tool.
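
For reference, a dead store is the pattern in this hypothetical fragment: the value produced by the first call is overwritten before it is ever read, so the first assignment is reported.

/* Hypothetical dead-store example: the result of step_one() is never
 * read before being overwritten, so a checker flags the assignment. */
static int step_one(void) { return 1; }
static int step_two(void) { return 0; }

int main(void)
{
    int status = step_one();   /* dead store: value never read */
    status = step_two();
    return status;
}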

Of the remaining defects, the majority are false positives, and several (like the example in the image above) are simply improbable, requiring a large number of steps to reach.

This exposes the main problem with the scan-build tool: there is no way to suppress certain checks, mark defects as erroneous, or avoid false positives using a model file. This reduces the usefulness of these builds because every developer has to remember that much of the defect list is not relevant.

Most of the NetSurf developers know that the project currently has 107 outstanding issues, and if a code change or tool improvement were to alter that value we would have to work through the defect list manually, one entry at a time, to check what had changed.
Coverity
The Coverity SAVE tool is a commercial offering from a company founded in the Computer Systems Laboratory at Stanford University in Palo Alto, California. The original, novel research has produced a good solution which improved on the analysis tools previously available.

Coverity Interface showing summary of NetSurf analysis. Layout issues are a NetSurf bug
The company hosts a gratis service for open source projects; they even provide scans for the Linux kernel, so project size does not appear to be an issue.

The challenges faced integrating the Coverity tool into the build process differed from those with clang; however, the issue of execution time remained, so the CI service was again used.

The Coverity scanning tool is a binary executable which collects data on the build; that data is then submitted to the Coverity service to be analysed. This obviously requires the developer running the executable to trust Coverity to some degree.

A basic examination of the binary was performed and determined that the executable was not establishing network connections or performing any observably undesirable behaviour. From this investigation the decision was made that running the tool inside a sandbox environment on a CI build slave was safe. The CI system also submits the collected results in a compressed form directly to the Coverity scan service.

Care must be taken to only submit builds in accordance with the service's Acceptable Use Policy, which limits the submission frequency of NetSurf scans to every other day. To ensure the project stays within the rules, the build performed by the CI system is manually controlled and confined to a subset of NetSurf developers.

Coverity Connect defect management console for NetSurf
The results are presented using the Coverity Connect web-based defect management tool. Access to the Coverity Connect interface is controlled by a user management system, which precludes publishing the results publicly within the CI system.

Unfortunately NetSurf itself does not currently have good enough JavaScript DOM bindings to support this interface so another browser must be used to view it.

Despite these drawbacks the quality of the analysis results is greatly superior to the clang solution. The false positive rate is very low, while many real issues are found which had not previously been detected.

The analysis can be enhanced through collection configuration and modelling files, which remove intended constructions from consideration and reduce the false positive rate to very low levels. The ability to easily and persistently suppress false positives through the web interface is also available.

The false positive management capabilities, coupled with a user interface that makes understanding the defect path simple, make this solution very practical; indeed, the NetSurf developers have fixed over 50 actual issues in the relatively short period since the tool was introduced.

Not all of those defects could be considered serious but they had the effect of encouraging deeper inspection of some very dubious smelling source.
Conclusions
The principal conclusions of implementing and using static analysis have been:

When I started looking at this technology I was somewhat dubious about its usefulness, but I have definitely changed my mind. It is a useful addition to any non-trivial project, and the investment of time and effort should be repaid handsomely in all but already perfect code (if you believe you have such code, I have a bridge to sell you).

22 February 2013

Hideki Yamane: Postgresql + GPU / SELinux approach

I participated in an event (18 Feb) where Kohei Kaigai from NEC talked about three topics:


  • Places to Visit in Europe
  • PG-Strom: GPU Accelerated Asynchronous Super-Parallel Query
  • Row-Level Security: DB Security, prevent information leak

I'm interested in PG-Strom; it can accelerate queries 8 times with a $100 GPU card. For more detail, see the link above.

This event was hosted at the Red Hat office in Yebisu, Tokyo. Thanks to Ryo Fujita from Red Hat for his coordination.


25 December 2011

Matthew Palmer: The Other Way...

Chris Siebenmann sez:
The profusion of network cables strung through doorways here demonstrates that two drops per sysadmin isn't anywhere near enough.
What I actually suspect it demonstrates is that Chris' company hasn't learnt about the magic that is VLANs. All of the reasons he cites in the longer, explanatory blog post could be solved with VLANs. The only time you can't get away with one gigabit drop per office and an 8 port VLAN-capable switch is when you need high capacity, and given how many companies struggle by with wifi, I'm going to guess that sustained gigabit-per-machine is not a common requirement. So, for Christmas, buy your colleagues a bunch of gigabit VLAN-capable switches, and you can avoid both the nightmare of not having enough network ports, and the more hideous tragedy of having to crawl around the roofspace and recable an entire office.

1 June 2010

Debian News: New Debian Developers (May 2010)

The following developers got their Debian accounts in the last month: Congratulations!

23 December 2008

Emilio Pozuelo Monfort: Collaborative maintenance

The Debian Python Modules Team is discussing which DVCS to switch to from SVN. Ondrej Certik asked how to generate a list of committers to the team's repository, so I looked at it and got this:
emilio@saturno:~/deb/python-modules$ svn log | egrep "^r[0-9]+ " | cut -f2 -d'|' | sed s/-guest// | sort | uniq -c | sort -n -r
865 piotr
609 morph
598 kov
532 bzed
388 pox
302 arnau
253 certik
216 shlomme
212 malex
175 hertzog
140 nslater
130 kobold
123 nijel
121 kitterma
106 bernat
99 kibi
87 varun
83 stratus
81 nobse
81 netzwurm
78 azatoth
76 mca
73 dottedmag
70 jluebbe
68 zack
68 cgalisteo
61 speijnik
61 odd_bloke
60 rganesan
55 kumanna
52 werner
50 haas
48 mejo
45 ucko
43 pabs
42 stew
42 luciano
41 mithrandi
40 wardi
36 gudjon
35 jandd
34 smcv
34 brettp
32 jenner
31 davidvilla
31 aurel32
30 rousseau
30 mtaylor
28 thomasbl
26 lool
25 gaspa
25 ffm
24 adn
22 jmalonzo
21 santiago
21 appaji
18 goedson
17 toadstool
17 sto
17 awen
16 mlizaur
16 akumar
15 nacho
14 smr
14 hanska
13 tviehmann
13 norsetto
13 mbaldessari
12 stone
12 sharky
11 rainct
11 fabrizio
10 lash
9 rodrigogc
9 pcc
9 miriam
9 madduck
9 ftlerror
8 pere
8 crschmidt
7 ncommander
7 myon
7 abuss
6 jwilk
6 bdrung
6 atehwa
5 kcoyner
5 catlee
5 andyp
4 vt
4 ross
4 osrevolution
4 lamby
4 baby
3 sez
3 joss
3 geole
2 rustybear
2 edmonds
2 astraw
2 ana
1 twerner
1 tincho
1 pochu
1 danderson
As it's likely that the Python Applications Packaging Team will also switch to the same DVCS at the same time, here are the numbers for its repo:

emilio@saturno:~/deb/python-apps$ svn log | egrep "^r[0-9]+ " | cut -f2 -d'|' | sed s/-guest// | sort | uniq -c | sort -n -r
401 nijel
288 piotr
235 gothicx
159 pochu
76 nslater
69 kumanna
68 rainct
66 gilir
63 certik
52 vdanjean
52 bzed
46 dottedmag
41 stani
39 varun
37 kitterma
36 morph
35 odd_bloke
29 pcc
29 gudjon
28 appaji
25 thomasbl
24 arnau
20 sc
20 andyp
18 jalet
15 gerardo
14 eike
14 ana
13 dfiloni
11 tklauser
10 ryanakca
10 nxvl
10 akumar
8 sez
8 baby
6 catlee
4 osrevolution
4 cody-somerville
2 mithrandi
2 cjsmo
1 nenolod
1 ffm
Here I'm the 4th biggest committer :D And while I was at it, I thought I could do the same for the GNOME and GStreamer teams:
emilio@saturno:~/deb/pkg-gnome$ svn log | egrep "^r[0-9]+ " | cut -f2 -d'|' | sed s/-guest// | sort | uniq -c | sort -n -r
5357 lool
2701 joss
1633 slomo
1164 kov
825 seb128
622 jordi
621 jdassen
574 manphiz
335 sjoerd
298 mlang
296 netsnipe
291 grm
255 ross
236 ari
203 pochu
198 ondrej
190 he
180 kilian
176 alanbach
170 ftlerror
148 nobse
112 marco
87 jak
84 samm
78 rfrancoise
75 oysteigi
73 jsogo
65 svena
65 otavio
55 duck
54 jcurbo
53 zorglub
53 rtp
49 wasabi
49 giskard
42 tagoh
42 kartikm
40 gpastore
34 brad
32 robtaylor
31 xaiki
30 stratus
30 daf
26 johannes
24 sander-m
21 kk
19 bubulle
16 arnau
15 dodji
12 mbanck
11 ruoso
11 fpeters
11 dedu
11 christine
10 cpm
7 ember
7 drew
7 debotux
6 tico
6 emil
6 bradsmith
5 robster
5 carlosliu
4 rotty
4 diegoe
3 biebl
2 thibaut
2 ejad
1 naoliv
1 huats
1 gilir

emilio@saturno:~/deb/pkg-gstreamer$ svn log | egrep "^r[0-9]+ " | cut -f2 -d'|' | sed s/-guest// | sort | uniq -c | sort -n -r
891 lool
840 slomo
99 pnormand
69 sjoerd
27 seb128
21 manphiz
8 he
7 aquette
4 elmarco
1 fabian
Conclusions:
- Why do I have the full python-modules and pkg-gstreamer trees, if I have just one commit to DPMT, and don't even have commit access to the GStreamer team?
- If you don't want to seem like you have made fewer commits than you actually have, don't change your alioth name when you become a DD ;) (hint: pox-guest and piotr in python-modules are the same person)
- If the switch to a new VCS were based on a vote with one vote per commit, the top 3 committers in pkg-gnome could win the vote if they chose the same option! For python-apps it's the top 4 committers, and the top 7 for python-modules. pkg-gstreamer is a bit special :)

29 October 2008

Benjamin Mako Hill: An Invisible Handful of Stretched Metaphors

The following list is merely a small selection of scholarly articles listed in the ISI Web of Knowledge with "invisible hand" in their title: And, finally:

9 June 2008

Igor Genibel: First session

Up early to be fresh and ready for the first session, I got ready quickly. The night was rather short: in bed around midnight to finish the course material. At last it is ready. I just have a few adjustments to make and it will go off for printing. Better late than never ;) So I was ready at 7:30 for a rather pleasant breakfast by the pool, in the company of a weaver bird and a superb Crowned Crane a few dozen centimetres from me ;) A quick stop at the wifi corner to send two or three mails, and then I was left waiting. All told, I had to wait until 10:30 for someone to come and fetch me. The room was not available: last-minute administrative hassles. Once on site at the ENA (Ecole Nationale de l'Administration) I set about presenting the system to 25 people. But it was impossible to get the overhead projector I had been given to work. An Xorg configuration problem to look at this evening. So I projected my slides, made with MagicPoint and generated as HTML, from a Windows machine. After a brief introduction we got into the heart of the subject. The trainees were all ears, so I did not hold back on the information. Lunch at 2pm, frugal on my part, I was not very hungry. Chicken in sauce, raw vegetables, rice, potatoes. Off again at 3pm for the second part of the day. I asked the trainees to introduce themselves... Well, I am going to have to cover the basics of Unix systems for everyone, without exception. The session finished around 6pm, after which I went back to the hotel alone. So on the programme for the evening: write my daily post, sort out the Xorg configuration so I can use the projector, eat, and then off to bed ;)

11 October 2007

Matthew Palmer: EBay Sez: Linux is for scammers

In what I can only assume is a case of "graphic design spec gone horribly wrong", EBay has decided to portray our lovable penguin friend as the mascot of the online scam artist: Linux is for scammers, says EBay (Click on the image for a full-size version) Personally, I think the scam artists' best friend is a Windows machine, as it provides a rich source of untraceable connectivity, but my guess is that Microsoft has scarier lawyers. I don't know how long the image has been up there for, but I'd imagine it won't last long. Linux users may not be that numerous, but we're a noisy lot. <grin>

21 May 2007

Stefano Zacchiroli: i had a dream

I'm Going to Play Some Numbers at the Lotto
The Lotto game is a really popular gambling game in Italy, run by the state itself. As in all good gambling games: you, the player, can win a lot of money, while whoever runs the game (the Italian state in our case) will win a lot of money for sure, no matter what. Anyhow, to the facts: it's well-known folklore that people win playing Lotto FOR SURE by playing numbers told to them while dreaming by parents, friends, kitties, whatever... As a scientist, I've no reason not to trust this folklore, why should I? As a strange coincidence (another fact that strongly hints I HAVE TO PLAY), this night I had a dream (well, it was more this morning while reading the planet, but it doesn't really matter..., folklore is folklore!). There were a lot, really a lot of people whispering this to my ears:
09 F9 11 02 9D 74 E3 5B D8 41 56 C5 63 56 88 C0

well, ok, they aren't in decimal notation, and they aren't even between 1 and 90... but what would you expect from a bunch of geeks??? They are probably trying to exploit some overflow in the hands of the lotto extractor or something like that... But I won't desist, I'll play those numbers! Open questions: is there any proficient Lotto player around to answer them?

11 January 2007

Martin F. Krafft: Destinationen

Just now on the Swiss hotline, a lady announced in the typical Swiss way of speaking "high German" that the airline now offers bla bla bla to over 200 "Destinationen". I had to snicker. English readers may wonder what the deal is, and even Germans might just yawn: "Destinationen" is not a German word, it's "destinations" germanified, and it's no news that the German language is seriously deteriorating as English words creep in. At the risk of being repetitive: it's not the English words per se, it's the fact that they are being conjugated or declined according to German rules which irritates me (and many others). The reason for this blog post was simply the humour: a Swiss lady speaking "high German" with her subtle Swiss accent, using words that don't exist as if there were nothing to it. I actually had to laugh out loud. Note that I have great respect for Swiss people speaking German, it being somewhat of a foreign language to them. I am perfectly aware that many are uncomfortable doing so, and this post is not trying to make fun of them, but rather to expose the irony of the use of a non-German word. NP: Dream Theater / When Dream and Day Reunite Update: Christof Roduner informs me that "Destinationen" is actually a German word, or at least recognised by the Duden:
Destination, die; -, -en <lat.> (Reiseziel; veraltet für Bestimmung, Endzweck)
[roughly: travel destination; archaic for purpose, final end]
From: Duden - Die deutsche Rechtschreibung, 24. Aufl. Mannheim 2006.
That must be the new orthography, which I won't comment on at this point.

4 January 2007

Anthony Towns: Five Things

Supposedly, card number five in the Tarot is the Hierophant, described as “someone who interprets secret knowledge” and representing concepts such as “conformity” and “group identification”. Not that any of that is related to this “five things you don’t know about me” meme, for which I’ve apparently been tagged by both Pia and Tony. And since I wouldn’t want to be accused of being either cool or lacking in vanity, here’s some from me. Let’s see, I’ll tag: vocalist extraordinaire James, companionable carnivore Pat, sometime C hacker David, fellow motorcyclist Sez, and future housemate Clinton.

11 November 2006

David Welton: Bicycling, Running, and Open Source Economics

I am very passionate about open source (or free software, or whatever you want to call it) software. In one way or the other, it seems I'm always drawn back to it, whatever else I'm supposed to be working on. I get a good feeling from the community aspects of it, believe in the potential for technological benefits, and appreciate that being open is often the best way to advance the state of the art. One thing that has always been something of a mystery though, is how the whole thing works in economic terms. There is no doubt in my mind that open source works in terms of creating value, but what isn't so simple is how it sustains itself, and where it makes sense. Some people, like Richard Stallman, sustain that all software should be free. Others take a more laissez faire attitude. Aside from philosophical debates, though, there is a very pragmatic question of how it works in practice, and if the process could be made more efficient, and in which cases should one simply decide to 'go proprietary' in terms of maximizing monetary returns. In the proprietary software world, this is all rather "simple" - (well, conceptually at least - it's not actually easy for anyone). If you write some software that you then sell, and people like it, they'll buy it, which in turn gives you money that you can funnel back into improving the product. If it stinks, you won't get any money, and presumably you'll move on to something you are more qualified for. It's a pretty simple and direct feedback loop. With free software though, it gets much more complex, because there is not necessarily an economic link between users and developers, which might mean that while the software provides very real economic benefits in terms of flexibility, avoiding vendor lock in, and low costs, none of those benefits necessarily flow to the person or people who actually wrote the software. This, in turn, means that they may have to support themselves in some other way and may not be able to dedicate much time to working on the code. This system is thus 'inefficient', because there is a disconnect between the value and the person who created it. A lot of thought has been dedicated to "solving" this problem, but it doesn't seem to have a satisfying answer just yet, although there are several free software companies that clearly do function, with a variety of models. One of the original ideas about how people would live in an entirely free software based economy, proposed by Stallman, is services. In other words, the software is free, but you get money to fix it, teach people about it, improve it, provide support for it, customize it and so on. While riding my bike around the Colli Euganei the other day, it occured to me that a pure services business is a lot like running. The minute you have fewer clients, go on vacation, or otherwise ease up... revenue drops proportionately, just as if you had slowed from a run to a walk. Ease off on the effort completely, and the money is gone, because you have no way of carrying your momentum forward. With a bicycle, on the other hand, it's more like a product-based business. It's certainly still an effort, and you have to be strong to beat your competitors, however, you can stop pedaling for a moment now and then, and you do get the descents after riding up the hills, and for a few meters at least, you can stop pedaling altogether and coast. What prompted me to think about products versus services was a small web application I developed recently. 
It's a nice ajaxy to-do list/time-tracker that I started developing with Ruby on Rails. My first instinct, and the easiest thing to do, of course, would be to open source it and let it go at that, but... I also wonder if I could make a little bit of money at it if I were to commercialize the product somehow. Not "get rich quick" money, but enough to pay me back for the time I sunk into making it work, and enough to let me justify doing more work on it to meet people's needs and requests. A couple of strategies come to mind: Whatever ends up happening with it, I'll be sure to post further information as it becomes available.

27 July 2006

Michael Janssen: Weekend Weeviews: Omega Man, Conquest of the Planet of the Apes

A month between postings. Who needs regular updates? I've got a few of these weekend weeviews piled up, so I'm going to do them two at a time every couple of days.

Omega Man

Charlton Heston makes this post-apocalyptic world view quite the masterpiece. Considering his current stance on guns, it is easy to throw a couple of cheap shots at the movie in which he plays a gun-crazed doctor. Thinking you're the only one left in existence has got to be pretty hard on the psyche, and it shows in the character. Robert Neville (Heston's character) obviously doesn't think he's really alone, just the only one left who shouldn't be shot on sight. The rest of the human race was hit by a crappy plague, 28 Days Later-style. This plague apparently doesn't kill all brain functions, but only crams you into a religious sect which could only be described as luddite. Movies aren't interesting without a love interest, so Heston finds out early about Lisa (Rosalind Cash), who is keeping a set of children who are immune. Robert is also immune, and hopes to create a serum from his blood. This 70's end of the world is remarkably watchable, although it has its moments of Action Movie. The acting is well done, although the cult of luddites is somewhat overplayed and has strange overtones of vampirism for some reason. Omega Man gets a 7. (imdb, amazon)

Conquest of the Planet of the Apes

As a child of the 80s, I never saw any of the original Planet of the Apes movies, and always assumed they were campy and not well done. Imagine my surprise when I discovered that they actually have a plot and are interesting. This fourth movie in the series of 70s movies presents its moral message with a generous heaping of.. its moral message. Years after his parents travelled backward in time in Escape from the Planet of the Apes, Caesar, the only talking ape left, is incognito with the circus master who raised him and comes to the city for the first time. At this time the plague which killed all pets has already ravaged the nation, humans have decided that apes are more useful as slaves than pets, and already have a large monolithic "Ape Management" section of government. Caesar gets thrown through the system when his "owner" gets brought in for questioning for.. questioning the treatment of an ape. Being highly intelligent, he quickly gets snatched up for his skills by a top government official. Starting an underground resistance movement is the next step, and dominoes start to fall from there. Conquest.. then degenerates into a large-cast action film, with hundreds of extras in a street-level ape vs. human fight. It's not the most interesting movie ever, but considering my expectations I can happily give it a 6. (imdb, amazon)

19 June 2006

Michael Janssen: Weekend Weeviews: Jawbreaker, Real Genius, Primer

There are three reviews this week, because I neglected to review one which I watched earlier. I'm also trying to decide whether the links at the end are of any use, and whether it's useful to syndicate this to Planet Debian. Feedback is encouraged! ;-)

Jawbreaker

The whole premise of Jawbreaker gets laid out right at the beginning - after an opening sequence in which we see the title candy being made, a couple of girls kill a friend by surprising her and using a jawbreaker as a gag. The movie then degenerates into a formulaic nerdy-girl-gets-to-be-pretty storyline. Unfortunately Jawbreaker doesn't pull it off as well as Mean Girls does, and ends up falling pretty flat. One saving grace of this movie is that it is very short at just 87 minutes. One small treat is that Rose McGowan is playing a decent role, although I have a much older vision of her whenever I see her, from her work on Charmed. One thing which keeps bugging me is how the movie reminds me of But I'm a Cheerleader, even though the plots and premises are very different. The soundtrack is very well done with good placement of good songs. Jawbreaker is bittersweet at a 5. (imdb, amazon)

Real Genius

I have always heard good things about Real Genius, but never saw it all the way through. Supposedly this is a tragedy. Apparently ceilingsarecool saw it before, but didn't remember a bunch of the plot. A whiz kid enters college early and starts working for a professor. The premise is set up in the first 10 minutes or so, which is that Herr Professor is really corrupt and working to provide a laser weapon to the military. This leaves ample time for the comedy in the movie to play out, as Whiz Kid gets to experience all the wild antics of college with a high-IQ twist. Lasers are used fairly extensively in the movie (mostly as part of the plot) and get a nice chunk of credits in the end. The movie unfortunately doesn't really age well with time, unless you're a fan of 80's movies. There are no less than 2 montages with music, and the ultimate 80's end-of-movie song "Everybody Wants to Rule the World" by Tears for Fears is playing at the end. Real Genius still is smarter than many movies you'll see nowadays, so it burns in a 7. (imdb, amazon)

Primer

Small budget films are usually interesting to watch, because either they are simple masterpieces, or they try to be way more than they can be and end up being a trainwreck which you just can't stop watching. Primer is the rare gem which found a happy medium between complex and simple. At the start, a garage business is introduced with four tech friends making a bit of money selling to hobbyists. Apparently they have some sort of deal about who gets to pick the next thing they try, and one of them wants to try something related to physics. Two decide to attempt it on their own and they create a device which apparently lowers density, or gravity, or something. Then they discover it does much more than that - it can manipulate time. Fortunately there is still much of the movie left after this. The acting is not top notch by any means, and comes off as cardboardish, but it seems well-placed in this script because most techies that I know are not the best with social skills. It's important to note that the main focus doesn't lie on the device, but on the implications and ramifications. As the movie approaches its end, the pace and convolution increase quickly. A rewatch is in order for Primer - and even then you still may be missing parts in the giant puzzle which is laid out before you. Shane Carruth directs, writes, and produces an amazing film, especially considering the sub-10k budget. Primer scores e^2.30258509. (imdb, amazon)

12 May 2006

David Welton: "Hydras" and "real" open source

An interesting post by Ian Holsman, one of my colleagues in the Apache Software Foundation who has similar interests to mine: http://feh.holsman.net/articles/2006/05/12/is-your-project-a-hydra I take a more laissez-faire approach, myself. Ian's advice is valuable to those sorting through open source projects to use or get involved with - a full-fledged free software project with many users and committers who are independent of one another is usually going to be a better proposition than something run by one company, or one person. A lot of ASF thinking focuses on the community being the real value in an open source project, and it's obvious that it adds a lot of value. However, the licensing is important too, because it's your escape hatch. It means that if the company goes away, or decides to create a proprietary product, you have the option to create a community around the open source code, by taking over its development and maintenance.

David Welton: "Hydras", "real" open source, and the ASF Incubator

An interesting post by Ian Holsman, one of my colleagues in the Apache Software Foundation who is also interested in open source business and economics: http://feh.holsman.net/articles/2006/05/12/is-your-project-a-hydra I take a more laissez-faire approach, myself. Ian's advice is valuable to those sorting through open source projects to use or get involved with - a full-fledged free software project with many users and committers who are independent of one another is usually going to be a better proposition than something run by one company, or one person. A lot of ASF thinking focuses on the community being the real value in an open source project, and it's obvious that it adds a lot of value. However, the licensing is important too, because it's your escape hatch. It means that if the company goes away, or decides to create a proprietary product, you have the option to create a community around the open source code, by taking over its development and maintenance.

ASF Incubator

Ian also mentions the ASF Incubator, which I've been involved with first hand through my involvement in incubating OFBiz. One of the big hurdles that the project faces is getting a scrap of paper from everyone who has ever contributed anything important to the project. And since OFBiz is a real open source project with committers all over the world, and, over the years, many contributions, that is a lot of paper to collect! Thanks to the efforts of the OFBiz team, they're doing an admirable job of completing the task. However, I can't help but observe that "incubation" is a far easier process to go through if the code arrives in the form of a corporate donation, because it all comes from one place. Unfortunately, I think that leads to some selection for "hydra style" projects, although to their credit they may be trying to break out of that by joining the ASF (they need to if they want to successfully complete incubation). Still, I think it is likely to lead to a more "corporate" organization.

4 April 2006

José Parrella: Free Software vs. Privative Software in the Venezuelan National Assembly

For my hispanohablantes readers: this post is written in English since it's going to be syndicated into Planet Debian. Expect a post in Spanish later in the day. If you can understand spoken Spanish, I recommend you listen to the Free Software talk by M.Sc. Ernesto Hernández-Novich (Debian Maintainer), which ends by noting several benefits of using Debian GNU/Linux. Available here. Update: other posts in Spanish are already available in Planeta Linux Venezuela. Yesterday we (SOLVE, a Venezuelan Association of Free Software Users, Developers, Cooperatives and Entrepreneurs) had the opportunity to attend a Free Software vs. Privative Software forum. This doesn't sound amazing, indeed. But if you take into account that the Forum was held in the National Assembly (or House of Representatives, anyway I'll call it AN) it gets better. Now, get this idea: the AN is holding a forum to discuss a Law, and it's listening to the parties involved. Let that sink in for a minute. But what if the AN is also bringing these parties to the table to actually write the Law? This is how laws are made in our Country now (Damog always says we are terribly nationalistic. It may be true, indeed.) Anyway, we were there around 10 AM, in the Protocol Room of the AN. The event was a full house, featuring two thirds of people supporting proprietary software (students between 17 and 20 years old from the countryside and big entrepreneurs - and Microsoft, of course) and one third of people supporting free software, including people working in the Government, students, cooperatives, developers, etc. The session was opened by Representative Luis Tascón, who was once a great supporter of Free Software (it still seems that way, anyway he's doing a great job changing the law-making roadmap) and the speakers were: Microsoft, the National Center for IT, Cavecom (the association of big software enterprises in Venezuela) on the Dark Side, and Felipe Pérez Martí (from SOLVE) and Ernesto Hernández-Novich (Simón Bolívar University) on the White Side. Kudos to IBM, on the Gray Side with Black Spots. The session of talks was incredibly amusing. The speakers kept changing their positions based on the previous talk. So, this way, Berrizbeitia (Director, CNTI) tried to argue with Tascón (Representative), the Brazilian Microsoft guy tried to oppose Berrizbeitia, the guy from Cavecom fought Microsoft (this guy was a complete moron, by the way: when he was told that a group of Venezuelan people had developed translations into Wayuunaiki he said that was useless since we only have 800 Wayuu - native Americans), the IBM boy tried to say they were the best, Felipe Pérez (SOLVE) tried to dismantle Berrizbeitia's talk and finally Ernesto Hernández-Novich roundkicked them all. The kids from Microsoft (wearing MSDN and Visual Studio shirts) said that Microsoft is a good company which is open source and gives the code away for free. I argued with some of them that they were working for Microsoft for free. They "hadn't seen it that way". They also gave the plain old good excuse that "free software is weaker since everyone can see its code". They kept changing their strategies and opinions systematically as the Forum evolved. They argued incompatibility, security problems, and finally they arrived at their real reason to oppose Free Software.
Several girls and boys (Daddy's Boys, as we call them) were arguing that there is "no free speech" in the Country (which was amusing since this is the first time the National Assembly is writing the laws with the People) and that the Law was going to make them lose their years of University studies. Then came the most accurate of the interventions, made by Rogmar Marin from the Venezuelan Patents and Author Rights Office. He said that this event was all about the people controlling the State, not the State controlling the people. He cited two articles from the draft and the 3390 decree explicitly saying that Free Software is mandatory for Government (the Decree says the "National Public Administration" and the Law says the "National Public Power"), so he said that they were not going to lose their "bicoca" (a popular way of saying "rivers of money") since they can still program in whatever they want; it is just that the State has the right to decide what is best for its people, and the best for its people is Free Software. This was the whole crux of the event. This is not about Free Software vs. Proprietary Software. This is not about Linux vs. Windows. This is not about IBM vs. Microsoft vs. Venezuelan Companies. This is about the best for the People (which is the overriding interest of the AN) and, in this case, the People were represented not only by the closed intellectual circles of Free and Proprietary Software, but by Ana Maria Morillo, from the Cambalache community in Puerto Ordaz, in the southern part of the Country. She said that their Community had absolutely no access to technology. They were poor and humble, and willing to learn. CVG Telecom, a State-based telecom company, installed a Nudetel (no, it's not an ethics-relaxed phone operator; it stands for Telecommunications Endogenous Development Kernel) in their Community, using Free Software. This was their first approach to technology and computers. They didn't even know how to turn on a computer. Free Software helped them approach the reality of computers. They felt prepared to use a computer, and proud of themselves. When they used privative software in other places, they needed to adapt to that technology. This demonstrates that Free Software is as easy as any Software. You need to adapt to it. The difference with Free Software is that you are free when you use it. The people from Cambalache appreciate that. Felipe Pérez Martí's talk was really good. He slapped Microsoft and pushed back hard against the "technological neutrality" position of the CNTI Director, Jorge Berrizbeitia. Technological neutrality is a neo-liberal, right-wing doctrine which is the technological counterpart to laissez faire, laissez passer. The problem is that (Felipe is an economist, and a great mathematician) technological neutrality isn't actually neutral. It only holds under the conditions where Smith's "Invisible Hand" theorem is valid, so, in a scenario of technological neutrality, the powerful always win. So, Microsoft wins. He also noted the recent study funded by United States Homeland Security and carried out by Carnegie Mellon, Coverity and Symantec, which stated that Proprietary Software had between 20 and 30 bugs per TLOC (thousand lines of code) while FOSS had only 0.434 bugs per TLOC. The Microsoft guy seemed astonished by the "discovery" and the students were offended by this. Great. What next?
We brought CD’s for people, made Debian evangelization to several “disident” students, made a great cheer to every Free Software supporter, brought signs and SOLVE shirts. While the Microsoft dudes where more than twice times us, we were a coherent group which stayed until 1930 (they left at 1600, you know, it’s a Micro matter of Soft) and held great arguments, sometimes slapping the student’s moral, but definitely making our voice be heard after tenths of years being discriminated by companies, Universities and the Government. As a Debian Project volunteer, what’s your opinion about this? Is the State violating Free Speech taking this decisions? Does it have the right to take it? Is Free Software the best solution for the corporate environments of the Government, taking into account that it handles the citizen information? What are your general comments about this?

15 January 2006

Pierre Habouzit: DAVDSI - Déroute Annoncée du Virtuel, par Déficience de ses Systèmes Immunitaires... (the Announced Rout of the Virtual, through the Deficiency of its Immune Systems)

I am not going to give a presentation of DAVDSI; I am not a lawyer, and others[1] have already done so, far better than I could probably ever manage. The Wikipedia article is (as always) very clear. Instead I will come back to the various problems this law raises and which are currently being debated. I see essentially three main strands. This bill is meant (at least that is how it was presented to me as a citizen, and how I understood it) to put a legal framework around digital exchanges, notably on the Internet (but also from any digital medium to another). Legalisation of peer-to-peer? Or a global licence? At the moment, peer-to-peer and file sharing on networks sit in a legal vacuum, something everyone agrees on. Internet users would like to see their favourite pastime legalised, or at least protected from the inquisitions of the RIAA. The distributors (and I stress the word distributor, as opposed to authors) are against it, since it would represent a net loss to their business. I phrase that in the conditional because I do not want to get into the debate about whether or not p2p benefits distributors. Others have done so (for example in Laurent Chemla's Confessions d'un voleur, and more recently Roberto Di Cosmo in his open letter to Eddy Mitchell). But however much I respect Mr Di Cosmo (whose fascinating lectures I attended[2]), I think that he, like everyone fighting for or against the global licence, is fighting the wrong battle. Let me explain: the proponents of DAVDSI knew this would be a major point of debate, and since it is the point to which content distributors are most sensitive, they prepared for it very thoroughly. It is therefore dangerous ground, where it is easy to stop being taken seriously. All the more so since, for example, the SPEDIDAM survey seems to me completely biased (it only offers the choice of global licence versus illegality, which produces a majority in favour of the global licence, whereas it is really a majority against illegality). But I said I did not want to sink any deeper into this debate, above all because it is premature. The real problem is that we do not know who or what we are protecting. What is missing is a genuine debate and in-depth work on intellectual property in the face of the new problems raised by the digital era. We have not faced anything like it since Gutenberg, which is, of course, starting to date a little... Instead of asking that question, which seems to me a prerequisite for the rest of the debate, we see people on every side rushing towards patch laws and window dressing. For DAVDSI is a hasty law that wants, as quickly as possible, to stick a plaster over the crack that p2p represents, under pressure from the content distributors. So, for or against the global licence? Personally I am against (and I will not go into the details here; a rough explanation can be found in the comments on the post DAVDSI CODE (5)), but like the people who answered the SPEDIDAM survey, I am even more against it being illegal. Between two evils, I prefer the one less dangerous to me. If the law were limited to the global licence, at a pinch I would say "what's the point...": it would be a bad precedent, but the levy on hard disks already was one, so it would merely be an extension.
Unfortunately, the law goes further, and it is these two points that trouble me most: the incompatibility with free software, and the legal protection of DRM. My worry concerns the criminally sanctioned ban on circumventing DRM. The free software ayatollahs, and even more moderate people, immediately noted this law's flagrant incompatibility with free software (I refer the reader to Tristan Nitot's blog on this point). But the problem seems to me much broader. Indeed, even if, as Me Eolas points out (and I quote): "The law obliges the users of these effective technical measures (the record labels and production companies) to grant usage licences 'in a non-discriminatory manner' to the manufacturers of players and the designers of playback software, on condition that these in turn respect those measures." The consequence is that no free software (far beyond the licensing concerns raised by Tristan Nitot) will ever be granted such a licence: as long as the hardware does not handle DRM itself, software able to play protected content will necessarily, at some point in the rendering process (graphics for books, audio for mp3s, etc.), have the unprotected version of the content in memory. The source code being open, anyone (not even necessarily a great hacker) will be able to divert a piece of free software (its source code is open!) to grab that decoded version of the content and save it. I am not a lawyer, but it seems very easy for a content distributor (Sony, say) to argue before a court that even if the maintainer of the free project (VLC, say) requesting a licence is sincere, no VLC developer can guarantee to Sony that their software will never be used to circumvent DRM. Until hardware is able to decode DRM by itself (taking that responsibility away from software), no free software, and more generally no open source software, will be able to hold such a licence, because the law gives content distributors and publishers every tool to prevent it. Which brings me to my second worry: the day hardware can handle protected content by itself. Indeed, the day the computer handles all the digital protection around content in hardware, we will have arrived at the dream of a Microsoft-AMD-Intel consortium called Palladium. Under the pretext of computer security, this consortium aims to implant, at the heart of your computer, a chip that knows whether a given device respects DRM or not[3] and is in charge of refusing to pass content in its decoded form to any "suspect" peripheral. Of course, to be recognised by this chip (called the TCPA chip, for "Trusted Computing Platform Alliance") you will have to prove your good faith to the consortium, pay for the right to be recognised, and so on. On the TCPA FAQ you can read why such a system would inevitably lead computing towards a world like the one described in the famous "letter from 2020", of which a faithful translation is available[4]. Independently of any personal view on free software, diversity is worth preserving everywhere: in politics, in biology and genetics, ... and in computing. From diversity come competition and creation. Diversity is the mother of innovation.
These are commonplaces, but I want to restate them. A multimedia world where DRM is king would be a world of a single culture, where Microsoft would be the Moscow of silicon and nothing could be done without the blessing of Comrade Gates. On the other hand -- and here I come back to my claim that DAVDSI is premature -- banning the circumvention of DRM leads directly to a world where editorial content is owned only by the distributors (read this post by another lawyer on the subject, comment 22 by Luc Saint-Élie, and my earlier post, itself a comment on Me Eolas's blog). Far beyond issues of private copying, it is the cultural heritage of humanity that would be entirely owned and managed by the distributors (read: Sony, Universal, Microsoft, ...). A second reading of the letter from 2020, with such a law in place, sends an even bigger chill down my spine. Such a law means the cultural heritage of an entire nation would be split across the servers of large multinationals, which moreover would a priori have no duty to preserve it (!!!!!) or to take any particular care of it. The best weapon against oblivion and the loss of documents is the dispersal of content, and replication (ask the librarians of Alexandria... even Sony is not safe from a fire... or from a pirate who destroys all its servers!). The French state is backing a law that ultimately leads to the dispossession of its own culture (that way it can save money on a ministry, after all... not a bad move!). And I will spare you (because this post is already quite long enough) my little theory about the relations between monopolies and DRM/TCPA chips (I assume the subject is covered in the TCPA FAQ anyway). In short, instead of asking who should be the guardians of our culture and our digital content, the state is concerned only with over-protecting multinationals that are already over-powerful, with consolidating their monopoly, and above all with enshrining in the marble of the law[5] the principles that could allow those same multinationals to become the Big Brothers of tomorrow.
Notes [1] voir le billet Le DADVSI Code(5) : l'Opus d'Elie de Maitre Eolas sur le sujet. [2] si si, je le pense. [3] et ce de mani re incontournable. [4] Je n'arrive pas retrouver un pointeur sur l'original... [5] et c'est sans doute a que je trouve le plus scandaleux.

Pierre Habouzit: DAVDSI -- Déroute Annoncée du Virtuel, par Déficience de ses Systèmes Immunitaires (roughly: the Announced Rout of the Virtual, through the Deficiency of its Immune Systems) ...

I am not going to present DAVDSI here; I am not a lawyer, and others[1] have already done so, far better than I could probably ever manage. The Wikipedia article is (as always) very clear. Instead I want to come back to the various problems this law raises, which are currently being debated. I see essentially three main threads. This bill is meant (at least that is how it was presented to me as a citizen, and how I understood it) to put a legal framework around digital exchanges, notably on the Internet (but also from any digital medium to any other).

Legalisation of peer-to-peer, or a global licence? At the moment, peer-to-peer and file sharing over networks sit in a legal void, a point on which everyone agrees. Internet users would like to see their favourite pastime legalised, or at least protected from the inquisitions of the RIAA. The distributors (and I stress the word distributor, as opposed to authors) are against it, since it would represent a net loss for their business. I put that in the conditional because I do not want to enter the debate over whether or not p2p actually benefits distributors. Others have done so (for instance Laurent Chemla in Confessions d'un voleur, and more recently Roberto Di Cosmo in his open letter to Eddy Mitchell). But however much I respect Mr Di Cosmo (whose fascinating lectures I attended, by the way[2]), I think that he, and everyone else fighting for or against the global licence, is fighting the wrong battle.

Let me explain: the backers of DAVDSI knew this would be a major point of contention, and since it is the point content distributors are most sensitive about, they prepared for that debate very thoroughly. It is therefore dangerous ground, where it is easy to stop being taken seriously. All the more so since, for example, the SPEDIDAM survey seems completely biased to me (it only offers the choice between a global licence and illegality, which produces a majority in favour of the global licence, when it is really a majority against illegality). But I said I did not want to sink any deeper into this debate, above all because it is premature. The real problem is that we do not know who or what we are protecting. What is missing is a genuine debate and some in-depth work on intellectual property in the face of the new problems raised by the digital age. We have not faced anything like this since Gutenberg, which admittedly is starting to date a little... Instead of asking that question, which nevertheless seems to me a prerequisite for the rest of the debate, people everywhere are rushing towards patch-laws and window dressing. DAVDSI is a hasty law that wants, as quickly as possible, to stick a plaster over the crack that p2p represents, under pressure from the content distributors.

So, for or against the global licence? Personally I am against it (and I will not go into the details here; a rough explanation can be found in the comments on the DAVDSI CODE (5) post), but like the people who answered the SPEDIDAM survey, I am even more against it being illegal. Between two evils, I prefer the one less dangerous to me. And if the law limited itself to the global licence, at a pinch I would just shrug: it would be a bad precedent, but the levy on hard disks already was one, so it would merely be an extension.
Unfortunately, the law goes further, and it is these two points that trouble me the most: the incompatibility with free software, and the legal protection of DRM.

My worry is tied to the criminally sanctioned ban on circumventing DRM. The ayatollahs of free software, as well as more moderate people, immediately noted the blatant incompatibility of this law with free software (I refer the reader to Tristan Nitot's blog on this point). But the problem seems to me far broader. Indeed, even if, as Me Eolas points out (I quote): "The law obliges the users (the record labels and production companies) of these effective technical measures to grant, 'in a non-discriminatory manner', usage licences to manufacturers of players and designers of playback software, provided these in turn respect those measures." This means that no free software project (well beyond the licensing worries raised by Tristan Nitot) will ever be granted such a licence: as long as computer hardware does not support DRM natively, any software capable of playing protected content will necessarily, at some point in the rendering process (graphics for books, audio for mp3s, and so on), hold the unprotected version of the content in memory. Since the source code is open, anyone (not even necessarily a great hacker) will be able to divert a free software program (its source code is open!) to recover that decoded version of the content and take the opportunity to save it (the short sketch below illustrates the point). I am not a lawyer, but it seems to me very easy for a content distributor (say Sony) to argue before a court that even if the maintainer of the free project (say VLC) asking for a licence is sincere, no VLC developer can assure Sony that their software will never be used to circumvent DRM. As long as no hardware is able to decode DRM by itself (taking that responsibility away from the software), no free software, and more generally no open-source software, will ever be able to hold such a licence, because the law gives distributors and content publishers every tool they need to prevent it.

Which brings me to my second worry: the day computer hardware can handle protected content by itself. The day the computer manages, in hardware, all the digital protections around content, we will have arrived at the dream of a Microsoft-AMD-Intel consortium called Palladium. Under the pretext of computer security, this consortium aims to implant, in the heart of your computer, a chip that knows whether a given piece of hardware respects DRM or not[3], and that is in charge of refusing to pass content in its decoded form to any "suspicious" peripheral. Of course, to be recognised by this chip (called the TCPA chip, for "Trusted Computing Platform Alliance"), you will have to have shown your credentials to this consortium, paid for the right to be recognised, and so on. On the TCPA FAQ you can read why such a system would inevitably lead computing towards a world like the one described in the famous "letter from 2020", of which you can read a faithful translation[4].

Independently of any personal views on free software, diversity is something to be maintained everywhere: in politics, in biology and genetics, ... and in computing. From diversity are born competition and creation. Diversity is the mother of innovation.
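To make the architectural point above concrete, here is a minimal sketch in plain C. All names are invented for illustration; this is not VLC code, and no real DRM scheme is touched -- the decryption is simply assumed to have happened upstream. The point is purely structural: once content has been decrypted for playback it sits in ordinary memory, and anyone who can rebuild an open player from source can add a couple of lines in the render path to keep a clear copy.

    /* Toy illustration only: a hypothetical audio playback path in an
     * open-source media player.  The names are made up; the DRM layer is
     * assumed to have already produced clear PCM samples upstream. */
    #include <stdio.h>
    #include <stdint.h>
    #include <stddef.h>

    /* Decoded PCM samples exactly as the rendering pipeline sees them. */
    struct pcm_frame {
        int16_t *samples;
        size_t   count;
    };

    /* Hypothetical playback callback: by the time audio reaches this point
     * it has already been decrypted, so it is ordinary data in memory. */
    static void play_frame(const struct pcm_frame *frame, FILE *tap)
    {
        /* One added line is enough to keep a clear copy of every frame. */
        if (tap)
            fwrite(frame->samples, sizeof frame->samples[0], frame->count, tap);

        /* ... the real player would now hand frame->samples to the sound card ... */
    }

    int main(void)
    {
        /* Stand-in for a frame that some upstream DRM module just decrypted. */
        int16_t clear_samples[1024] = {0};
        struct pcm_frame frame = { clear_samples, 1024 };

        FILE *tap = fopen("clear_audio.raw", "wb");  /* where the copy ends up */
        play_frame(&frame, tap);
        if (tap)
            fclose(tap);
        return 0;
    }

Which is exactly why, the argument goes, no promise from an open-source project can guarantee to a distributor that its player will never be turned into a circumvention tool.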
Those remarks about diversity are commonplaces, but I want to restate them. A multimedia world where DRM is king would be a world of a single culture, where Microsoft would be the Moscow of silicon, and where nothing could be done without the approval of Comrade Gates.

On the other hand -- and here I come back to my claim that DAVDSI is premature -- banning the circumvention of DRM leads directly to a world where editorial content is owned only by the distributors (read this post by another jurist on the subject, as well as comment 22 by Luc Saint-Élie and my own earlier one, also left as a comment on Me Eolas's blog). Far beyond issues of private copying, it is the cultural heritage of humanity that would be entirely owned and managed by the distributors (read: Sony, Universal, Microsoft, ...). Rereading the "letter from 2020" with such a law in mind sends an even colder chill down my spine. Such a law means that the cultural heritage of an entire nation will be split across the servers of big multinationals, which moreover will a priori have no duty to preserve it (!!!!!) or to take any particular care of it. The best weapon against oblivion and the loss of documents is the dispersal of content, and replication (ask the librarians of Alexandria... even Sony is not safe from a fire... or from a pirate who destroys all its servers!). The French state is backing a law that ultimately leads to it being dispossessed of its own culture (that way we can save money on a ministry, after all... clever!). And I will spare you (because this post is already quite long enough) my little theory about the relationship between monopolies and DRM/TCPA chips (I assume the subject is covered in the TCPA FAQ anyway).

In short, instead of asking who should be the guardians of our culture and of our digital content, the state is only concerned with over-protecting already over-powerful multinationals, consolidating their monopoly, and above all enshrining in the marble of the law[5] the very principles that could allow those same multinationals to become the Big Brothers of tomorrow.
Notes [1] see the post "Le DADVSI Code(5) : l'Opus d'Elie" by Maître Eolas on the subject. [2] yes, really, I mean it. [3] and this in a way that cannot be got around. [4] I cannot manage to find a pointer to the original again... [5] and that is probably what I find the most scandalous.
