Last entries

Introductory slides on Frama-C

I recently made slides at work to present Frama-C: what it is, the various plug-ins, and how to use the two main ones, Value analysis and WP. The examples used in the slides are also available.

The attached version has been updated to fix minor issues found by readers of the frama-c-discuss mailing list.

Enjoy! (and let me know if you find them useful)

Sat 29 Jun 2013

WE programming idea: opportunistic secure email exchanges

A long time ago, a French computer science magazine used to propose programming ideas that ranged from a few hours to a complete week-end (WE). Here is an idea to elaborate on, even if it might take a little more than a WE to implement fully. ;-)

Observation: secure email exchange with OpenPGP or S/MIME does not work

Like many others, I have tried to exchange secure (encrypted and strongly authenticated) emails with friends and other people, in my case in OpenPGP format using the GnuPG free software. But, like many others, I have stopped because it simply does not work.

Why? Probably for several reasons:

  • One needs to understand at least the basic principles of asymmetric cryptography: public and private keys. It is not that complicated (if you don't go into the fine details ;-) ) but it is probably already too complicated for the average user;
  • One needs to create one's key and load it into the email program. If one has several computers, one needs to do this on each of them. Creating the key adds complicated steps; loading it on each computer is cumbersome;
  • If you want to participate in the "web of trust" (for OpenPGP emails), you need to have your key signed by other people and to sign other people's keys. Once again, this is very complicated for the average user to understand;
  • Even if you don't want to participate in the "web of trust", you need to check the fingerprints of your correspondents to gain strong authentication. Once again, a complicated step to understand and to carry out;
  • Even if you have done all of this and understand it, each time you want to send an email you need to enter the password to unlock your private key. This is annoying.

Regarding S/MIME, the complications are broadly the same. It can be a little simpler, but as you need a Public Key Infrastructure (PKI), S/MIME's usefulness is limited to a single administrative entity managed by trained system administrators, in other words a big company.

A proposal: opportunistic secure email exchange

The basic approach is pretty simple: make a plug-in for existing email programs. The first time the plug-in is installed, it automatically creates a public/private key pair for each email address used by the user.

Then, each time a user A sends an email, the public key attached to A's email address is automatically sent with the email. Therefore, if A communicates with another person B using the same kind of plug-in, the receiver detects that A is capable of using secure emails. With the next email from B to A, B's plug-in automatically attaches its own public key.

Therefore, after two email exchanges between A and B, each has the public key of the other and they can exchange secure emails. When one of them sends an email, the email program detects that the correspondent's public key is known and automatically encrypts and signs the email.
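
To make the idea more concrete, here is a minimal sketch of that logic, written in C. Everything in it is hypothetical: the key store is a toy in-memory table and the cryptographic operations are mere print-outs where a real plug-in would call a library such as GnuPG Made Easy (GPGME).

  /* Hypothetical sketch of the opportunistic plug-in logic. */
  #include <stdbool.h>
  #include <stdio.h>
  #include <string.h>

  #define MAX_KEYS 16

  /* Toy key store: only remembers which correspondents already sent us a key. */
  static char known_keys[MAX_KEYS][64];
  static int n_keys;

  static bool has_public_key(const char *email)
  {
      for (int i = 0; i < n_keys; i++)
          if (strcmp(known_keys[i], email) == 0)
              return true;
      return false;
  }

  static void store_public_key(const char *email)
  {
      if (!has_public_key(email) && n_keys < MAX_KEYS)
          snprintf(known_keys[n_keys++], sizeof known_keys[0], "%s", email);
  }

  /* Called just before an email is sent. */
  static void on_send(const char *from, const char *to)
  {
      printf("attaching %s's public key to the email\n", from);
      if (has_public_key(to))
          printf("encrypting and signing for %s\n", to);   /* real crypto call here */
      else
          printf("sending in clear, %s's key is not yet known\n", to);
  }

  /* Called when a received email carries the sender's public key. */
  static void on_receive(const char *from)
  {
      store_public_key(from);                              /* real key import here */
  }

  int main(void)
  {
      on_send("A@example.org", "B@example.org");  /* first email: sent in clear */
      on_receive("B@example.org");                /* B's reply brings B's key   */
      on_send("A@example.org", "B@example.org");  /* now encrypted and signed   */
      return 0;
  }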

Of course, with this scheme, you don't gain strong authentication of the remote party. A man-in-the-middle attack is still possible. But nothing prevents using another cryptographic protocol to check afterwards that the remote user really is who he pretends to be, as in the ZRTP protocol.
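
As an illustration of what such an after-the-fact check could look like, here is a small sketch (only the general idea, not the actual ZRTP mechanism, and a real scheme would use more than 16 bits): derive a short authentication string from the two exchanged public keys and let both users compare it, for example over the phone. It uses OpenSSL's SHA-256 routines; the keys are placeholders.

  /* Sketch: short authentication string (SAS) derived from both public keys. */
  #include <openssl/sha.h>
  #include <stdio.h>
  #include <string.h>

  static void short_auth_string(const unsigned char *key_a, size_t len_a,
                                const unsigned char *key_b, size_t len_b,
                                char sas[5])
  {
      unsigned char digest[SHA256_DIGEST_LENGTH];
      SHA256_CTX ctx;

      SHA256_Init(&ctx);
      SHA256_Update(&ctx, key_a, len_a);
      SHA256_Update(&ctx, key_b, len_b);
      SHA256_Final(digest, &ctx);
      /* Keep only the first two bytes: short enough to read aloud. */
      snprintf(sas, 5, "%02x%02x", digest[0], digest[1]);
  }

  int main(void)
  {
      const unsigned char key_a[] = "A's public key";      /* placeholder */
      const unsigned char key_b[] = "B's public key";      /* placeholder */
      char sas[5];

      short_auth_string(key_a, sizeof key_a - 1, key_b, sizeof key_b - 1, sas);
      printf("both users should see the same code: %s\n", sas);
      return 0;
  }

(Compile with something like cc sas.c -lcrypto.)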

But the main danger nowadays is not the man-in-the-middle attack, it is continuous spying on servers, as with the USA's PRISM program. This opportunistic encryption scheme would allow the average user to use encryption. The emails would be stored encrypted on GMail, Microsoft or Yahoo servers and be in clear only on the users' computers.

The WE programming idea

I think you have now understood this WE programming idea: implement such a plug-in doing opportunistic email encryption, e.g. as a Thunderbird plug-in. :-) All the needed libraries are there, like GnuPG's GnuPG Made Easy (GPGME) library to manage keys, encryption and authentication.

Anybody willing to take the challenge? ;-)

Sun 26 May 2013

Issues with distributions, not only a Debian specific problem

Some time ago, I blamed Debian for not paying enough attention to its users. Apparently, I'm not the only one making such remarks. Ingo Molnar, a famous Linux kernel developer working for Red Hat, made similar remarks. He even proposed some technical solutions.

Other solutions are already available. For example, for OCaml software, OPAM, an OCaml-specific package system, seems very interesting as a way to work around the freshness issue of OCaml Debian packages.

I'm not sure Ingo's answers or OPAM are THE answers, but they at least open interesting perspectives. Handling security and licensing issues is very difficult and not yet solved. Distributions are doing a heavy work of integration that is currently not done by anybody else.

It is nonetheless refreshing to see people thinking about those issues and proposing new solutions. I find Ingo's proposal of sandboxing, flat dependencies and not forcing people to upgrade very interesting. If you read French, you can also read this LinuxFR article that gives a small review of current proposals.

Tue 14 May 2013

High-level requirements for re-demexp

I recently spoke about the three main points to work on if one were to start some re-engineering work on demexp. The first point was to start from some High-Level requirements. I have started to write those High-Level requirements. Let me know if you have comments or questions, either directly at dmentre@linux-france.org or through this post's comments.

Sat 04 May 2013

Re-engineering demexp

A long, long time ago, I worked on demexp, a software specifically made to support the Democratic Experience, a direct democracy initiative. demexp is programmed in the OCaml language. The software contains both a server and a client. The client user interface is programmed with Gtk+. I also tried to make a web interface, without much success. The whole software is written in literate programming style. After several years of development, I got bored and stopped working on it. The software project never had a lot of momentum. The Democratic Experience political project is still alive, though without much activity as far as I know.

After all those years, one (me at least ;-) ) may wonder what failed in the project and how it could be improved, from a software engineering point of view. I see three main points to work on if one wanted to re-engineer demexp.

1. Start from a well-defined high-level specification. When we started demexp, we did not know exactly what kind of software we wanted. We were exploring new territory. Nowadays, I think we could have a much clearer picture of the requirements. As for any software, starting from clear requirements is mandatory. :-) So we should start from a set of high-level requirements, from which to derive low-level requirements for the software.

2. Design a modular architecture. I made demexp as a big monolithic piece of software. The server part contained the network interface, the question and response database, the security model, the delegation system, etc. The client contained a user interface as well as code to authenticate the user and access the server. I now think I should have started from a much more modular architecture, with several small programs, each of them focused on one part of the system. Having several programs would have forced me to define clear interfaces between them, thus imposing a cleaner architecture. And if the task of developing all those programs proved too big, I could at least focus on a subset of them to get a system with fewer functionalities, but a working one.

3. Use formal methods to verify some properties. Using formal methods is not strictly required, but, well, it interests me :-) and I think they could bring benefits to such software, for example to prove that some properties of the system hold, e.g. that the security model is correctly designed and implemented.

Moreover, I see some additional points to review in the original demexp design:

  1. Drop the literate programming approach. It is too costly to maintain (it is heart-breaking to remove some code, but it is even harder to remove some documentation). Some well-defined documentation for the tricky parts and a global overview of the system would be enough, I think.
  2. Traceability, from the high-level specification to the low-level one, down to the code and its tests or proofs. As for safety-critical software, maintaining traceability would bring a cleaner view of the system, would help code reviews (one can dream! ;-) ) and would show that the software meets its goals.
  3. Focus on smaller goals. The overall objectives of demexp were very ambitious and thus difficult to reach. One should probably start from less ambitious goals and try to make useful software in small steps.

Regarding the use of the OCaml language, it was probably part of the reason why we never gained many contributions. But OCaml is so nice to use! I would probably keep that part... or not. I currently have some other interesting and rather little-known languages to look at. :)

Sun 27 Jan 2013

The failures of Debian (and its derivatives)

I am a long-term Debian user. I have used Debian since probably the mid-90s. I use the latest Debian stable on my personal server and at work, and Ubuntu on my desktop and laptop machines at home. Using it does not mean I am entirely satisfied with it, to say the least. I think the distribution is not making enough effort for its users and I'll try to explain why. Of course, YMMV. ;-)

  • Debian packaging is made for developers, not users. Debian has too many packages. To take just one example, the OCaml compiler is split into the main compiler, its libraries, the native compiler and the byte-code one, etc. Why so many packages? If I am an OCaml developer, I want all of them, so why do I need to manually find (the naming is far from obvious, at least for beginners) and install so many packages? I have heard several reasons: it allows common parts to be factorised between the different binary architectures, it allows installing the command-line parts without the graphical parts, it allows installing only what the user wants, etc. For me, those reasons are just plain wrong. We have more and more disk capacity on our machines, so disk usage is no longer a limitation. A package should be able to activate its graphical parts automatically if the X server is available. And the factorisation of shared parts between Debian architectures should be done on the servers storing the packages, not through the packaging system.
  • Debian has outdated software. Debian Wheezy is about to be released and it will have GNOME 3.4. But GNOME 3.6 is already out and GNOME 3.8 is on its way. I am taking GNOME as an example, but it is the same issue for a lot of software within Debian. I have heard this is for stability reasons. But software packaged in Debian is already stable! It should take 10 or 15 days to integrate a new piece of software into Debian stable, not months or even years (the time between successive stable releases). I acknowledge that some packages have complex interdependencies. For example, when a new release of the OCaml compiler is out, one needs to recompile all OCaml packages. But this constraint only affects OCaml packages. Why should I wait for a new stable version of Debian to get the newly released OCaml compiler? For me this just sounds plain wrong.
  • Nobody uses plain Debian stable. Debian developers use unstable. Debian users use Debian stable, but enriched with backports because of the outdated software, or derivatives like Ubuntu. The fact that Debian developers are not using what they recommend to users is bogus. I know they do that for technical reasons, but that is not a good reason. Debian developers should use what they provide to their users, except maybe for the few packages they are working on.
  • There are too many dependencies between packages. The dependency system of Debian is a nice piece of work; it was ahead of its time when it was created. But the use of dependencies has been perverted. The dependencies are computed manually (thus a source of errors and bugs) and at the same time any software can write to about any part of the file system. Due to too many dependencies and a lack of clean interfaces between sub-systems, the installation of one recent user application can pull in a ton of packages, down to a new kernel or libc. Why is it so? I think the sub-systems of Debian (e.g. the X graphical infrastructure, the kernel and base C libraries, the OCaml system, ...) should be isolated from one another. It would allow each of them to be updated without waiting for the others. Having dependencies between 29,000 packages is just not scalable. It is even more true if those dependencies are computed manually.
  • Debian packaging lacks automation. I am a developer. Debian packagers are developers. It has always astonished me how much manual work is needed to make and maintain a Debian package. All developers know that if they want to be efficient, they need to automate their work as much as possible, so as to focus their manpower on the complex parts. Everything should be automated in Debian packaging, starting from a minimal description file. Automation should be the default (package creation, tests, static analysis, ...), not the exception.
  • One cannot install several versions of the same software simultaneously. As a user or developer, I want to use the stable version of a piece of software and maybe also the latest version that has just been released, in order to do a few tests or check that my software still compiles with the shiny new compiler. Debian does not allow me to do that. And if I install a newer package, downgrading to the previous version is complex and error-prone.
  • Debian is undocumented. I am not talking about the installation guide, which is nice, I am talking about the modifications made to software by Debian developers. Modifying the "standard" (for the software) way of configuring or using it has always seemed suspect to me, even if I agree that having a harmonized configuration is a real advantage (all configuration files in /etc, for example). But all of those modifications should be documented in the README.Debian file. To take an example, the last time I tried to install the dokuwiki Debian package, I was unable to configure it! The way to add new users had been changed compared to a regular dokuwiki (the web interface was disabled), and those changes were undocumented. That should be a release-critical bug! Without proper documentation, the user cannot use the software. And, of course, the reason behind those changes should be questioned, even when made for security reasons (a very secure but unusable piece of software is useless; security is a trade-off).

So, why am I complaining? Why don't I become a Debian Developer, so I can fix it? Because a single developer is not going to change the root causes of those issues. They need a massive development effort, or at least a massive acknowledgement by the Debian Developers. And I don't have ready-made answers to those issues (even if I have some ideas to solve them).

Is the grass greener in other fields? I don't think so, or at least I am not aware of it. I like Debian for its community approach, its focus on Free Software (even if it is sometimes imperfect) and the wide range of software packaged in it (the OCaml packages are numerous, for example). I just hope that the whole Debian community will focus more on user-related issues in the future.

How to install Atelier B 4.1 on a Debian-like 64-bit machine

Atelier B 4.1 has just been released. For Linux machines, it is available as binary packages for 32-bit x86 machines in RPM and DEB formats. Unfortunately, the DEB package won't work out of the box on a Debian-like 64-bit machine, for example an Ubuntu. Here is the approach I used to install the package:

Install Atelier B 4.1 in /opt/atelierb-4.1

  • mkdir /tmp/ab
  • cd /tmp/ab
  • wget http://www.atelierb.eu/atelier-b/4.1/free/AtelierB-4.1.0-free-linux.deb
  • ar xv AtelierB-4.1.0-free-linux.deb
  • cd /
  • sudo tar zxf /tmp/ab/data.tar.gz
  • sudo chown -R YourUserID:YourGroup /opt/atelierb-4.1/ # substitute YourUserID and YourGroup with adequate values

Configure the execution environment on an Ubuntu 12.04

  • sudo apt-get install ia32-libs-multiarch

You can now start Atelier B with:

 /opt/atelierb-4.1/startAB &

The procedure might be different for another Debian-like distribution, including Debian itself. You can add specific procedures for a given distribution in the comments.

Mon 10 Sep 2012

Book review: Rework. Change the way you work forever

This book is written by people from 37signals, a successful American company making web-based products. They are known for their previous book, Getting Real, which I have not read but which looks quite similar in spirit if not in content.

"Rework. Change the way you work forever" by Fried and Hansson is made of several short essays, each one containing a "catch phrase" (e.g. "Ignore the real world", "Workaholism", "Scratch your own itch", ...) and a short explanation in two or three pages. Those essays are organised into several chapters (e.g. "Takedowns", "Go", "Progress", "Productivity", ...) that look at the various issues faced when starting and running a business (getting the idea, developing the product, hiring new people, ...).

This book is interesting because it is written by people running a successful business, so you can believe them when they say that it can be done that way. And that way is quite unusual (for this kind of book)! They talk about staying small, saying no to the client, avoiding overly long hours ("workaholism") or funding your company by yourself instead of using venture capital funds. It is thus interesting to get a fresh view on running a business, even if this view is also told by other people around you or in the Free Software world. I particularly liked the spirit of the book, which says that instead of doing market studies or imagining several different products, it is better to actually make a good product, targeted at a well-identified need (and the authors detail how to identify it).

On the downside, even if the book is illustrated with several examples coming from successful businesses, I would have preferred those illustrations to come from 37signals itself. It would have been much more credible and more interesting, in my humble opinion.

The book is short. I liked it that way. It won't take you days to read. It could have been even shorter (there is room for a reference card extracted from this book) but in its current form the ideas are illustrated well enough that you have a better chance of understanding them.

Fri 17 Aug 2012

GNATprove tools available

Since the end of July, a GPL release of GNATprove, the tool for formal verification of Ada 2012 programs, has been available.

One can now formally verify Ada programs using only Free Software! If you are interested, you can look at my previous attempt to verify a toy electronic booth program using GNATprove.

Which approach for Hoare logic formal verification?

Currently, if one wants to do program verification in a Hoare logic framework, three approaches are possible:

  1. Down-to-code approach, where one annotates code functions with pre- and post-conditions. This is the approach proposed by Frama-C on C code, and by SPARK or GNATprove on Ada code. This approach has the advantage of being directly palatable to programmers because they keep their preferred language (see the small sketch after this list). Moreover, one can use such an approach on existing code, even if fully verifying it will probably require some changes. The downside of this approach is that the formal verification is tightly linked to the code, making the formalisation of program-wide properties more complex;
  2. Abstract specification refined into implementable code, like the B Method (and its variant Event-B for system specification). This approach uses pre- and post-conditions but also invariants, on which all the correctness of the code is built. Thanks to its abstract specifications, it can express complex properties that help reasoning about the built program or system. However, it uses very mathematical concepts like set theory or relations, which make engineers uneasy. And the proofs can be somewhat complex, involving some manual steps. Moreover, one needs to fit one's development process into the B approach, without reuse of existing code;
  3. A new approach proposed by Bertrand Meyer (of Eiffel fame, who popularized the Design by Contract approach) which relies on pre- and post-conditions on code, but where such specifications can use more abstract operators made available through libraries called domain theories. The main idea is that such domain theories give the programmer an abstract view of some domains (like the maximum of an array), so formal specifications are easier to read (and thus probably more correct) and hopefully easier to prove.
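
To give a flavour of the first, down-to-code approach, here is a minimal example in the spirit of Frama-C/WP: a tiny C function annotated with an ACSL contract (the function and its contract are mine, written for illustration only).

  /*@ requires \valid(a) && \valid(b);
      assigns  *a, *b;
      ensures  *a == \old(*b) && *b == \old(*a);
  */
  void swap(int *a, int *b)
  {
      int tmp = *a;
      *a = *b;
      *b = tmp;
  }

Running the WP plug-in on such a file (something like frama-c -wp swap.c) should discharge the proof obligations automatically. Program-wide properties, on the other hand, are much harder to express with such function-level contracts, which is precisely the downside mentioned above.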

Meyer's third approach seems to combine most of the advantages of the first two: use of a current programming language, so the programmer is at ease, and use of abstract formalisation, so the specification is easier to read and more abstract. Moreover, it relies heavily on invariants (already available in the Eiffel language), so a B Method-like approach is possible. However, Meyer's approach lacks the refinement capabilities of the B Method. But as far as I know, refinements are rarely used in the B Method, except for making difficult proofs easier by using several steps to reach concrete algorithms or data structures[1].

It remains to be seen how it will be implemented in a tool. The effective capabilities of formal methods are determined by the supporting tools. Time will tell if Meyer's approach is really useful! :-)

Note

[1] But that might make the difference between a nice approach and a usable one. ;-)
