21 October 2014

A GNOME Kernel wishlist

GNOME has long had a close relationship with Linux kernel development, in that we would have some developers do our bidding, helping us solve hard problems. Features like inotify, memfd and kdbus were all originally driven by the desktop.

I've posted a wishlist of kernel features we'd like to see implemented on the GNOME Wiki, and referenced it on the kernel mailing-list.

I hope it sparks healthy discussions about alternative (and possibly existing) features, allowing us to make progress quickly.

11 October 2014

And now for some hardware (Onda v975w)

Prodded by Adam Williamson's fedlet work, and by my inability to get an Android phone to display anything, I bought an x86 tablet.

At first, I was more interested in buying a brand-name one, such as the Dell Venue 8 Pro Adam has, or the Lenovo Miix 2 that Benjamin Tissoires doesn't seem to get enough time to hack on. But all those tablets are around 300€ at most retailers, and have smaller 7 or 8-inch screens.

So I bought a "not exported out of China" tablet, the 10" Onda v975w. The prospect of getting a no-name tablet scared me a little. Would it be as "good" (read bad) as a PadMini or an Action Pad?


Vrrrroooom.


Well, the hardware's pretty decent, and feels rather solid. There's a small amount of light leakage on the side of the touchscreen, but nothing too noticeable. I wish it had a button on the bezel to mimic the Windows button on some other tablets, but the edge gestures should replace it nicely.

The screen is pretty gorgeous and its high DPI triggers the eponymous mode in GNOME.

With the help of various folks (Larry Finger, and the aforementioned Benjamin and Adam), I got the tablet to a state where I could use it to replace my force-obsoleted iPad 1 to read comic books.

I've put up a wiki page with the status of hardware/kernel support. It doesn't contain all my notes just yet (sound is working, the touchscreen will work very soon, and various "basic" features are being worked on).

I'll be putting up the fixed-up Wi-Fi driver and more instructions about installation on the Wiki page.

And if you want to make the jump, the tablets are available at $150 plus postage from Aliexpress.

Update: On Google+ and in the comments of this blog, it was pointed out that the seller on Aliexpress was trying to scam people. My apologies, I just selected the cheapest one on that website. I personally bought mine on Amazon.fr using NewTec24 FR as the vendor.

06 October 2014

GNOME 3.14

It's been ten days since 3.14 was published, and what have I been up to? Parties, mostly. It started even before the actual release date with our biannual Brussels gathering, organized by Guillaume of Empathy fame; a few beers at the Chaff, a nice Lebanese restaurant, then more beers in the hidden bar of a youth hostel. A quiet evening before getting back to work and making the release happen (e.g. testing stuff and pestering a few people for tarballs, as the technical release process was handled by Matthias).

Then on Thursday morning I went back to Strasbourg, as Alexandre was also having a release party; around ten people showed up for a dinner with Alsatian dishes I couldn't spell or pronounce. It was also the unannounced start of a series of podcasts Alexandre is preparing, so we discussed the GNOME project with microphones on between courses and desserts.

tables at release party

Strasbourg, September 25th, 2014

My plan was then to go to Lyon, but they had already had their party, on Tuesday evening, and so my tour of GNOME parties was cut short.

Fortunately that idea of a tour was just an excuse to get on the road^Wtracks, as friends were having a big party in Ariège, in the south of France, on Saturday night. And while some computer-related conversations popped up, it was mostly a GNOME-free night, which later turned into a GNOME-free week as I joined an old friend and lent a hand with his market gardening, in beautiful Périgord.

market gardening in Périgord

Savignac-les-Églises, October 1st, 2014

I am now back and enjoying 3.14, a quality release, with a very strong GTK+, applications converging on the GNOME 3 features (headerbars, popovers...), and the long-missing piece, updated human interface guidelines (thanks Allan!).

27 September 2014

GNOME 3 and HIG Love

I am making progress on the next Ekiga release. I have spent the last few months of spare time working on the user interface. The purpose is to adapt the UI and use the great GTK+ 3 features where appropriate. There is still much left to do, but here are before and after screenshots. Stay tuned […]

24 September 2014

Ubuntu Developer Tools Center: how do we run tests?

We are starting to see multiple awesome code contributions and suggestions on our Ubuntu Loves Developers effort, and we are eagerly waiting for yours! As a consequence, the spectrum of supported tools is going to expand quickly, and we need to ensure that all those different targeted developers are well supported, on multiple releases, always getting the latest version of those environments, at any time.

A huge task, one we can only take on thanks to a large suite of tests! Here are some details on what we currently have in place to achieve and ensure this level of quality.

Different kinds of tests

pep8 test

The pep8 test is there to ensure code quality and consistency. Test results are trivial to interpret.

This test runs on every commit to master, on each release during package build, as well as every couple of hours on Jenkins.

small tests

Those are basically unit tests. They enable us to quickly see if we've broken anything with a change, or if the distribution itself broke us. In particular, we try to cover multiple corner cases that are easy to test that way.

They run on every commit to master, on each release during package build, every time a dependency changes in Ubuntu (thanks to autopkgtests), and every couple of hours on Jenkins.

large tests

Large tests are real user-based testing. We execute udtc and feed various scenarios through stdin (installing, reinstalling, removing, installing with a different path, aborting, ensuring the IDE can start…) and check that the resulting behavior is the one we expect.

Those tests enable us to know if something in the distribution broke us, if a website changed its layout or modified its download links, or if a newer version of a framework can't be launched on a particular Ubuntu version or configuration. That way we are aware, ideally even before the user, that something is broken, and can act on it.

Those tests run every couple of hours on Jenkins, using real virtual machines running an Ubuntu Desktop install.
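To give a feel for the shape of such a scenario, here is a minimal sketch of stdin-driven testing. This is not UDTC's actual harness; the tool invocation, prompt strings and the pexpect dependency are assumptions for illustration only.

# Minimal illustration (not UDTC's real test code): drive the CLI through
# stdin and assert on the outcome. Prompt text and framework are hypothetical.
import pexpect

def test_install_accepts_default_path():
    child = pexpect.spawn("udtc android", timeout=300)
    child.expect("Choose installation path")  # hypothetical prompt
    child.sendline("")                        # accept the proposed default
    child.expect(pexpect.EOF)                 # wait for the tool to finish
    child.close()
    assert child.exitstatus == 0              # the install scenario succeeded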

medium tests

Finally, the medium tests inherit from the large tests. Thus, they run exactly the same suite of tests, but in a Docker containerized environment, with mocks and small assets, not relying on the network or any archives. This means that we ship and emulate a webserver delivering web pages to the container, pretending we are, for instance, https://developer.android.com. We then deliver fake requirements packages and mock tarballs to udtc, and run against those.

Implementing a medium test is generally really easy, for instance:

class BasicCLIInContainer(ContainerTests, test_basics_cli.BasicCLI):

"""This will test the basic cli command class inside a container"""

is enough. That means "take all the BasicCLI large tests, and run them inside a container". All the hard work of wrapping, sshing and test setup is done for you. Simply implement your large tests and they will be able to run inside the container thanks to this inheritance!

We also added more complex use cases, like emulating a corrupted download with an md5 checksum mismatch. We generate this controlled environment and share it using trusted containers from the Docker Hub, which we generate from the Ubuntu Developer Tools Center Dockerfile.

Those tests are running as well every couple of hours on jenkins.

By comparing medium and large tests, since the former run in a completely controlled environment, we can decipher whether we or the distribution broke us, or whether a third party changing their website or requesting newer version requirements impacted us (as the failure will then only occur in the large tests and not in the medium ones, for instance).

Running all tests, continuously!

As some of the tests can show the impact of external parts, be it the distribution or even websites (as we parse some download links), we need to run all those tests regularly[1]. Note as well that we can experience different results on various configurations. That's why we run all those tests every couple of hours, once using the system-installed tests, and then with the tip of master. Those run on various virtual machines (like here, 14.04 LTS on i386 and amd64).

By comparing all this data, we know if a new commit introduced regressions, or if a third party broke something and we need to fix or adapt to it. Each test suite has a bunch of artifacts attached so that we can inspect the dependencies installed and the exact version of UDTC under test, and ensure we don't corner ourselves with subtleties like "it works in trunk, but is broken once installed".

jenkins test results

You can see on that graph that trunk has more tests (and features… just wait a few days before we tell you more about them ;)) than the latest released version.

As metrics are key, we collect code coverage and line metrics on each configuration to ensure we are not regressing from our target of keeping coverage high. This also tracks various stats like the number of lines of code.

Conclusion

Thanks to all this, we'll probably know even before any of you if anything suddenly breaks, and we can put actions in place to quickly deliver a fix. With each new kind of breakage, we plan to back it up with a new suite of tests to ensure we never see the same regression again.

As you can see, we are pretty hardcore about tests and believe it's the only way to keep quality and a sustainable system. With all that in place, as a developer, you should just have to enjoy your productive environment and not have to bother with the operating system itself. We have you covered!

Ubuntu Loves Developers

As always, you can reach me on G+, as didrocks on #ubuntu-desktop on IRC (freenode), or by filing an issue or even a pull request against the Ubuntu Developer Tools Center project!

Note

[1] if tests are not running regularly, you can consider them broken anyway

17 September 2014

What’s in a job title?

Over on Google+, Aaron Seigo in his inimitable way launched a discussion about people who call themselves community managers. In his words: “the “community manager” role that is increasingly common in the free software world is a fraud and a farce”. As you would expect when casting aspersions on people whose job is to talk to people in public, the post generated a great, and mostly constructive, discussion in the comments – I encourage you to go over there and read some of the highlights, including comments from Richard Esplin, my colleague Jan Wildeboer, Mark Shuttleworth, Michael Hall, Lenz Grimmer and other community luminaries. Well worth the read.

My humble observation here is that the community manager title is useful, but does not affect the person’s relationships with other community members.

First: what about alternative titles? Community liaison, evangelist, gardener, concierge, “cat herder”, ombudsman, Chief Community Officer, community engagement… all have been used as job titles to describe what is essentially the same role. And while I like the metaphors used for some of the titles like the gardener, I don’t think we do ourselves a service by using them. By using some terrible made-up titles, we deprive ourselves of the opportunity to let people know what we can do.

Job titles serve a number of roles in the industry: communicating your authority on a subject to people who have not worked with you (for example, in a panel or a job interview), and letting people know what you did in your job in short-hand. Now, tell me, does a “community ombudsman” rank higher than a “chief cat-herder”? Should I trust the opinion of a “Chief Community Officer” more than a “community gardener”? I can’t tell.

For better or worse, “Community manager” is widely used, and more or less understood. A community manager is someone who tries to keep existing community members happy and engaged, and grows the community by recruiting new members. The second order consequences of that can be varied: we can make our community happy by having better products, so some community managers focus a lot on technology (roadmaps, bug tracking, QA, documentation). Or you can make them happier by better communicating technology which is there – so other community managers concentrate on communication, blogging, Twitter, putting a public face on the development process. You can grow your community by recruiting new users and developers through promotion and outreach, or through business development.

While the role of a community manager is pretty well understood, it is a broad enough title to cover evangelist, product manager, marketing director, developer, release engineer and more.

Second: The job title will not matter inside your community. People in your company will give respect and authority according to who your boss is, perhaps, but people in the community will very quickly pigeon-hole you – are you doing good work and removing roadblocks, or are you a corporate mouthpiece, there to explain why unpopular decisions over which you had no control are actually good for the community? Sometimes you need to be both, but whatever you are predominantly, your community will see through it and categorize you appropriately.

What matters to me is that I am working with and in a community, working toward a vision I believe in, and enabling that community to be a nice place to work in where great things happen. Once I’m checking all those boxes, I really don’t care what my job title is, and I don’t think fellow community members and colleagues do either. My vision of community managers is that they are people who make the lives of community members (regardless of employers) a little better every day, often in ways that are invisible, and as long as you’re doing that, I don’t care what’s on your business card.


10 September 2014

How to help on Ubuntu Developer Tools Center

Last week, we announced our "Ubuntu Loves Developers" effort! We got some great feedback and coverage, and multiple questions arose around how to help and be part of this effort. Here is a post to answer them :)

Our philosophy

First, let's define the core principles around the Ubuntu Developer Tools Center and what we are trying to achieve with this:

  1. UDTC will always download, test and support the latest available upstream developer stack. No version set in stone for 5 years; we get the latest and best release that upstream delivers to all of us. We are conscious that being able to develop on a freshly updated environment is one of the core values of the developer audience, and that's why we want to deliver that experience.
  2. We know that developers want stability overall and don't want to upgrade or spend time maintaining their machine every 6 months. We agree they shouldn't have to, and the platform should "get out of my way, I've got work to do". That's the reason why we focus heavily on the latest LTS release of Ubuntu. All tools will always be backported and supported on the latest Long Term Support release. Tests run multiple times a day on this platform. In addition to this, we support, of course, the latest available Ubuntu release for developers who like to live on the edge!
  3. We want to ensure that the supported developer environment is always functional. Indeed, by always downloading the latest version from upstream, the software stack can change its requirements, require newer or extra libraries, and thus break. That's why we run a whole suite of functional tests multiple times a day, on both the version you can find in the distro and the latest trunk. That way we know if:
  • we broke ourselves in trunk and need to fix it before releasing.
  • the platform broke one of the developer stacks and we can promptly fix it.
  • a third-party application or a website changed and broke the integration. We can then fix this really early on.

All those tests ensure the best experience we can deliver, while always fetching the latest released version from upstream, and all this on a very stable platform!

Sounds cool, how can I help?

Report bugs and propose enhancements

The most direct way of reporting a bug or giving any suggestion is through the upstream bug tracker. Of course, you can also reach out to us on social networks like G+, through the comments section of this blog, or on IRC: #ubuntu-desktop, on freenode. We are also starting to look at the #ubuntulovesdevs hashtag.

The tool is really there to help developers, so do not hesitate to help us steer the Ubuntu Developer Tools Center in the direction that works best for you. :)

Help translating

We already had some good translation contributions through Launchpad! Thanks to all our translators, we got Basque, Chinese (Hong Kong), Chinese (Simplified), French, Italian and Spanish! There are only a few strings up for translation in udtc, and it should take less than half an hour in total to add a new one. It's a very good and useful way to contribute for people speaking languages other than English! We do look at the translations and merge them into the mainline automatically.

Contribute to the code itself

Some people have started to offer code contributions, and that's very good and motivating news. Do not hesitate to fork us on the upstream GitHub repo. We'll ensure we keep up to date with all code contributions and pull requests. If you have any questions, or for better coordination, open a bug to start the discussion around your awesome idea. We'll try to be around and guide you on how to add support for any framework. You will not be alone!

Write some documentation

We have some basic user documentation. If you feel there are any gaps or anything missing, feel free to edit the wiki page! You can also merge in some of the documentation from the README.md file or propose enhancements to it!

To give an easy start to any developer who wants to hack on udtc itself, we try to keep the README.md file readable and in line with the current code. However, it can drift a little; if you think any part is missing or needs more explanation, you can propose modifications to give future hackers an easier start. :)

Spread the word!

Finally, spread the word that Ubuntu Loves Developers, and we mean it! Talk about it on social networks (tagging with #ubuntulovesdevs), in blog posts, or just by chatting with your local community! We deeply care about our developer audience on the Ubuntu Desktop and Server, and we want this to be known!


For more information and hopefully goodness, we'll have an Ubuntu On Air session soon! We'll keep you posted on this blog when we have the final dates and details.

If you feel that I forgot to mention anything, do not hesitate to point it out as well; that is another form of very welcome contribution! ;)

Next week, I'll discuss how we maintain and run tests to ensure your developer tools are always working and supported!

01 September 2014

Two weeks little sleep

That was now more than a month ago: on the eve of July 25th I took the train from Brussels to attend GUADEC in Strasbourg. I really love that place; my first stay was not in the center proper but right next to the Rhine, in the "jardin des Deux Rives", and the sheer joy of that first visit still shines over all of the city. But that's not all: there's also a strong free software connection, as it was the first stop in my GNOME 3 city tour after the Bangalore hackfest and the release of 3.0.

It was Alexandre who organized that; it happened in the Epitech building, and here we were again, three years later. Only three years, such a short time but such a long way, the project still going strong.

So on Friday evening we assembled a small party and went out to find a restaurant. Along the road the party grew, but we found a place that could accommodate us all, and enjoyed it; the next stop was at a terrace, meeting Hubert, who had just arrived.

Then it started: Saturday morning arrived and the coffee machine was running strong (too strong for some...), and the day passed like a blur, between talks, discussions, and somehow scribbling some notes for the release team report at the end of the day, which was mostly about inviting people to the BoF session planned for Wednesday.

Sunday, Monday, Tuesday: classic GUADEC days and nights. There have been reports about the talks, videos are coming, and you should watch them for their diversity, from the evolutions of GTK+ dialogs to the future of access to GNOME services (and you should really thank Patrick, he does a fantastic job).

The BoF session came on Wednesday morning. I didn't quite know what to expect, but it went well: a first part presenting the way things work at the moment and getting feedback from people new to the processes, then, once a concurrent Red Hat meeting was over, being joined by Matthias, Kalev and Allan, and digging further into the topics. Allan posted his summary and I sent the raw minutes.

Sponsored by the GNOME Foundation

GUADEC was then over for me; I didn't even have time to say proper goodbyes. Back to the train, but not to Brussels: I stopped midway for my other yearly summer event, radio Esperanzah!, again long days and short nights, enormous laughs with the press team, and packing everything up on the Monday.

Back in Brussels for a single night; in the morning I flew to Sofia to give a two-day workshop and hands-on coding session (and at last bought a new camera), and flew back on Friday night, too late for a comfortable airport connection to the city, but back.

Sofia, August 8th, 2014

As for sleeping, I did, a bit, but not much: we have now entered the 20th-birthday celebrations of the Magasin 4 music venue.

15 August 2014

GNOME.Asia Summit 2014

Everyone has been blogging about GUADEC, but I’d like to talk about my other favorite conference of the year, GNOME.Asia. This year it was in Beijing, a mightily interesting place: a giant megalopolis with grandiose architecture, yet surprisingly easy to navigate thanks to its efficient metro system and affordable taxis. But the air quality is as bad as they say, at least during the incredibly hot summer days when we visited.

The conference itself was great. This year it was co-hosted with FUDCon’s Asian edition, and it was interesting to see a crowd that’s really different from the one attending GUADEC: many more people involved in evangelising, deploying and using GNOME as opposed to just developing it, which gives me a different perspective.

On a related note, I was happy to see a healthy delegation from Asia at GUADEC this year!

Sponsored by the GNOME Foundation

17 July 2014

Stepping down as openSUSE Board Chairman

Two years ago, I got appointed as chairman of the openSUSE Board. I was very excited about this opportunity, especially as it allowed me to keep contributing to openSUSE after having moved to work on the cloud a few months before. I remember how I wanted to find new ways to participate in the project, and this was just a fantastic match. I had been on the GNOME Foundation board for a long time, so I knew it would not always be easy and fun, but I also knew I would pretty much enjoy it. And I did.

Fast-forward to today: I still care deeply about the project and I'm still excited about what we do on the openSUSE board. However, a happy event coming in a couple of months means that I'll have much less time to dedicate to openSUSE (and other projects). So I decided a couple of months ago that I would step down before the end of the summer, after we had prepared a plan for the transition. Not an easy decision, but the right one, I feel.

And here we are now, with the official news out: I'm no longer the chairman :-) (see also this thread). Of course I'll still stay around and contribute to openSUSE, no worries there! But as mentioned above, I'll have less time for that, as offline life will be more "busy".

openSUSE Board Chairman at oSC14

Since I mentioned that we were working on a transition... First, knowing the current board, I have no doubt everything will keep being pushed in the right direction. But on top of that, my good friend Richard Brown has been appointed as the new chairman. Richard knows the project pretty well and has been on the board for some time now, so he is aware of everything that's going on. I've been able to watch his passion for the project, and that's why I'm 100% confident that he will rock!

11 May 2014

Coffee: my personal history

As always, long time no blog. These days I don't have enough energy (nor content, IMO) to write blog posts, mostly on Free Software, that would be relevant to other people.

Why, you would ask? Mostly because with my not-so-new-anymore position at SUSE (Enterprise Desktop Release Manager), I'm mostly working behind the scenes (discovering the joys of OBS to create ISO images and lots of similarly crazy stuff), which might not be that sexy to describe but still needs to be done ;)

So, instead of closing this blog to new posts, I'm trying something new to me: writing about things which aren't Free Software but might still interest people:

My new thing these days (ask my wife ;) is coffee.



I've always been fond of coffee (and tea; they aren't mutually exclusive, fortunately), probably because when I was a child, my parents loved good coffee and I was happy to be the one taking care of both the electric grinder and the espresso machine we had. And I remember how difficult it was to find good coffee, even more so when you were living in a very rural area of France and the only online services were accessible with a Minitel and were definitely not selling coffee ;)

Fast forward ten years: when I started working in Paris, I was still into coffee and I discovered something which wasn't known at all at that time (it was in 2002 and George was still working in ER ;): Nespresso. This was a great thing (even if I was a bit worried by the closed system around it) because I was able to get an espresso at home which was always good (IMO, at that time) and which also allowed me to switch between various coffees without any hassle (try that with several opened bags of ground coffee when you are single and only drink one espresso per day ;)

And then started my love story with Nespresso, which has not ended (yet), with its ups (being part of a customer panel once, including UI designers, very interesting) and downs. I often skipped coffee in cafés and restaurants because I knew it wouldn't be good!

Nespresso Drinker
Fast forward again 10 years. We are in 2014. The caps war has been on for a few years in France, since some of Nespresso's patents entered the public domain and competitors are trying to get a share of this huge market (France is apparently one of the biggest markets for Nespresso). I've tried various alternative caps, and most of them are just cheaper and not as good as the originals, except one or two made by some "small" roasters (Terre de Café, for instance). I ended up sticking with the originals, until something better "happened".

And it happened these days, somewhat unexpectedly: for a few years I had been reading about strange devices (the Aeropress being cited often) and tasty filter coffee (which, for me, had always been synonymous with bad coffee), and I also heard some radio shows on coffee which made me think: let's try. I ordered an Aeropress and tried it (with some fair trade coffee from my supermarket, since I didn't have any ground coffee at home and opening caps wasn't really a good idea). Result: not bad compared to the consistency of Nespresso, but not that great. I knew I wasn't using great coffee.

The Aeropress
So, I decided to explore a bit more and searched for good coffee roasters in Paris. One of those often mentioned is Coutume Café (their main website is not great ATM; better to look at their FB account), who also have a coffee shop. I went there, tried one of their coffees and was astonished. It was the best coffee I had ever tasted, with flavors of red fruit and chocolate. It was incredible, and it wasn't even an espresso (which had been my reference for coffee) but filter coffee, which looks like dishwater ;)



So, I now have this exact same coffee at home and am waiting for the delivery of a freshly ordered manual grinder to try to reproduce this coffee experience, before I try other coffees and other Paris roasters.

Let's see if I succeed :)

18 April 2014

Project naming

Number two in an occasional series of “time I wish I could have back” topics related to releasing proprietary software projects as free software.

What’s in a name?

It was famously said that there are 2 hard problems in computer programming: cache invalidation, naming things, and off-by-one errors.

Naming a project is a pain. Everyone has their favourite, everyone’s an expert, and there are a dozen different creative processes that people will suggest. Also, project names are often subject to approval by legal and brand departments, two departments most engineers don’t want anything to do with. So how do you come up with an acceptable name without spending weeks talking about it? Here are some guidelines.

Avoid anything related to the company name or products

You don’t want to have corporate trademark guidelines, certification programmes, etc. impacting the way your community can use the project name, and avoiding names related to company assets will make it easier to transfer trademark ownership to an independent non-profit, should you decide to do that in the future. In terms of maintaining a clear separation in people’s minds between the community project and your company’s products, it’s also a good idea to avoid “line extension” by reusing a product name.

Outside of that, the number one thing to remember is:

Project names are not the most important thing about your project

What’s important is what problems you solve, and how well you solve them. The name will grow on people if they’re using the project regularly. You can even end up with interesting discussions like “is it pronounced Lee-nooks or Lie-nucks?” which can galvanise a nascent community. Also, remember that:

Project names can be changed

How many projects fall foul of existing trademarks, end up rebranding soon after launch, or are forced to change names because of a change of corporate sponsor or a fork? Firefox, Jitsi, WildFly, Jenkins, LibreOffice, Joomla, Inkscape all started life under different names, and a rename has not prevented them from going on to be very successful projects. The important thing, in an open source project, is to start small so that you don’t have a huge amount invested in the old name, if circumstances require you to change it.

Avoid a few pitfalls

Avoid using anything which is related to the trademarks of competing companies or projects, unless it is pretty abstract (Avid to Diva, Mozilla to Mosaic Killer, Eclipse to Sun).

That said, don’t worry too much about trademarks. Yes, do a quick search for related projects when you have a shortlist, and check out USPTO. But just because there is a Gnome Chestnut Farms in Bend, Oregon doesn’t mean you can’t call your free software desktop environment GNOME. Domain of use is a powerful constraint, take advantage of it.

Avoid potentially politically incorrect or “bad language” words. Also, avoid artificially smart acronyms. The Flexible Add-on Release Tracker might seem like a good idea, but… don’t. GIMP is a notable exception to both rules, and countless days have been spent defending the choice of name over the years.

Do worry about the domain name. This will be the primary promotion mechanism. People shouldn’t spend time trying to figure out if your project is hosted at “sandbla.st” or “sandblast.org” or “ProjectSandblast.org”. Make sure you get a good domain name.

Empower a small group to choose

The decision on a name should belong to 1, 2 or 3 people. No more. Once you realise that names are not the most important thing, and that the name can be changed if you mess up badly, that frees you from getting buy-in from everyone on the development team. The “committee” should include the project leaders (the person or people who will be identified as the maintainers afterwards), and one person who is good at facilitating naming discussions (perhaps someone from your Brand department to ensure their buy-in for the result). Beyond that, do not consider surveys, general calls for names, or any other process which gives a sense of ownership of the process to more than 2 or 3 people. This way lies many weeks and months of bikeshedding arguments.

Have a process

  1. Start with a concept and work from there. Break out the Thesaurus, make a list of related concepts.
  2. Names can be abstract or prosaic, it doesn’t really matter. Discourse is one of the most wonderfully prosaic project names I’ve seen, while StackOverflow has nothing to do with a questions & answers forum. Ansible is a made-up word, and Puppet and Chef both wonderfully evoke orchestration while being dictionary words.
  3. Keep the shortlist to names which are short and pronounceable in multiple languages.
  4. Cull ruthlessly – don’t keep “maybe” names. If you get to the end, go back to the concepts list and start again.
  5. If you get to a shortlist of 2 or 3 and can’t decide, use random() to pick the winner or go with the choice of the project leader.

In general, don’t spend too much time on it. You should be able to get a couple of candidate names in a few days of discussion, submit them to Legal for a trademark review, and spend your time on what really matters, understanding your users’ problems and solving them as well as you can.

Of course, this is easier said than done – good luck!

31 March 2014

Some introduction seems to be necessary

It appears my blog is currently reaching some places like planet.gnome.org and planet.fedoraproject.org, so I think some introduction may be necessary.

My name is Baptiste Mille-Mathias. I’m French, and I live in the south of France, near Cannes, with my partner Célia, my son Joshua and my daughter Soline.

During work days I’m a System/Application Administrator on the search engine of a French ISP whose name refers to a color and a fruit, and during my free time I contribute mainly to GNOME and somewhat to Fedora.

I’ve been contributing to GNOME since around 2002 or 2003: translating software, writing documentation, reporting bugs, providing patches here and there. I’m also president of the GNOME-FR association, which takes care of the promotion of GNOME in France and French-speaking countries (we have members from Belgium and Switzerland, for instance).

I started contributing to Fedora a few months ago, after using Ubuntu for a few years (actually since the private betas); I co-maintain a package called office-runner with Haikel Guemar.

This week I’m contributing to GNOME and Fedora indirectly by taking part in the Winter DocFest 2014 in the UK; I’ve already published a short report.

Thanks for your attention

30 March 2014

Pitivi and MediaGoblin, same fight!

Rather belatedly, let me point out (or remind you) that Pitivi, the video editing software, has launched a donation drive through the GNOME Foundation. The goal: help get version 1.0 out by raising enough funds to dedicate people to the task full time and, if there is money to spare, to finance new features, which you can vote on. You will find more information in this post by antistress.

On another front, MediaGoblin is a piece of software that lets you host your own equivalent of YouTube, your photos, etc., and share all of it with your friends. No more censorship problems: you host the content yourself, and this decentralized model is what made the web strong. MediaGoblin is also running a donation drive. I can only encourage you to donate to support these projects :)

14 March 2014

Translate GNOME 3.12!

This weekend, the Mozilla Foundation is hosting us in its Paris offices for the Traducthon/Translathon 2014. It's the place where you can easily put your translator or proofreader talents to use to make sure your favorite desktop environment is perfectly translated when GNOME 3.12 comes out. If you're a beginner, come anyway: we'll be there to help you get started and explain the mysteries of the shadow translators :-). For the address and details, head to the Translathon 2014 page on the GNOME wiki.

08 March 2014

SXSW: lemonde.fr talking about Karen Sandler

lemonde.fr, a famous French newspaper, is covering SXSW, and on day 1 Karen is listed among the highlighted guests and appears in a dedicated article about her fight to open source the code of pacemakers. There is only a small mention of GNOME, though.

Update: corrected the links, as the mobile version of WordPress didn’t work properly when it came to adding the links.

14 November 2013

I’m finally on twitter

I have finally decided to activate a Twitter account. Short posts and news will go to Twitter. I will of course keep this blog alive for longer news about Ekiga.

08 November 2013

Building a single GNOME module using jhbuild

I often see new contributors (and even seasoned hackers) who want to hack on a GNOME module, say Empathy, try to build it using:

jhbuild build empathy

This is obviously correct, but it asks jhbuild to build not only Empathy but also all its dependencies (62 modules), so you'll end up building most of the GNOME stack. While building a full GNOME stack may sometimes be useful, that's generally not needed, and it's definitely not the easiest and fastest way to proceed.

Here is what I usually do.

First I make sure all the dependencies of the module are installed, using the distribution's packaging system. On Fedora this is done using:

sudo yum-builddep empathy

or on Ubuntu/Debian:

sudo apt-get build-dep empathy

If you are using a recent distribution, there are good chances that most of these dependencies are still recent enough to build the module you want to hack on. Of course, as you are trying to build the master branch of the project, some dependencies may have been bumped to one of the latest development releases. But first let's try to build just Empathy:

jhbuild buildone empathy

There are good chances that some dependencies are missing or too old; you'll then see this kind of error message:

No package 'libsecret-1' found
Requested 'telepathy-glib >= 0.19.9' but version of Telepathy-GLib is 0.18.2

That means you'll have to build these two libraries in your jhbuild as well. Just check the list of dependencies of the module to find the exact module names:

jhbuild list empathy | grep secret
jhbuild list empathy | grep telepathy

In this example you'll see you have to build the libsecret and telepathy-glib modules:

jhbuild buildone libsecret telepathy-glib

Of course these modules may have some extra dependencies of their own, so you may have to go through a few iterations of this process before being able to actually build the module you care about. But, from my experience, if you are using a recent distribution (like the latest Fedora release) the whole process will still be much faster than building the full stack. Furthermore, it will save you from having to deal with build errors from potentially 62 modules.

18 September 2013

GIR for java-gnome: last update / summary


This is the last week of this year's Google Summer of Code, so it is time to look at all the work that has been done. To summarize, for the people who did not follow my work: the goal was to rewrite a piece of java-gnome's code generator to make it able to use GObject Introspection data in place of the .defs data.

As of about a week ago, I am proud to say that the goal has been reached and we have a ready-to-use GObject Introspection powered code generator for java-gnome. The code that has been produced is not merged into the master branch yet, but I expect this will happen some day after this GSoC.

Now let's take a look at the steps along the way:

  • The first step was to decide what kind of Introspection data to use: typelibs or XML files. After some tests and thinking, my mentor Serkan and I agreed to use the XML format. It has the advantage of being human readable, but it is boring to parse in Java. To make this easier and a little less boring, I decided to use the XOM API.
  • After some reading and quick tests, I tried to convert the XML data to .defs files, in order to still be able to use the DefsParser that had been used and tested for years. The idea was to understand the XML format properly and to find equivalences between XML elements and Defs blocks.
  • After understanding the XML format and its elements well enough, I could start the real work and implement the GIR data parser right inside the code generator. This was the most difficult part of my work: a long and complex process to ensure that we had a way to get all the data needed to generate the bindings properly. This part took almost half of the GSoC to complete and to eventually become stable enough to be used.
  • Some code polishing was done to have something maintainable and well working. This involved a small rework of the ./configure step when building java-gnome: this step now tries to find the required .gir files needed by the IntrospectionParser class.
  • Another big part of the work was to find a way to blacklist objects, functions and other things that we do not need a binding for. Without it, the IntrospectionParser tries to generate bindings for everything it finds in the GIR data. After some research and some trials, I decided to use an XML file containing a list of the types we care about and, for each type, a list of things that must be ignored (a hypothetical sketch of such a file follows this list).
  • The java-gnome code generator also used to rely on some custom-made Defs data to generate code the way we wanted it to be generated. To be able to keep doing such a thing, I decided to add a way to feed the bindings generator with .defs files containing data that can override the XML data or simply add some custom code.
  • Of course, I ran the full java-gnome test suite once the bindings eventually compiled, to ensure that there were no regressions after switching to the Introspection-based code generator. At the beginning I found some regressions, which were fixed quite easily.
  • I took some time to test the IntrospectionParser with several .gir files to ensure that it was working properly with each of them. I guess there will probably be some improvements to make in the future, but for now it is working well enough.
  • Eventually, the last (and bonus) step of this GSoC was to propose the coverage of another library, which I did over the past weeks. So now we have coverage of libgweather. I chose libgweather because covering it involves touching almost all elements of the code generator (types.list files, adding overriders, modifying the generated Java package name). Here is a quick screenshot; yes, my work was not really "screenshotable" before that.
    java-gnome GWeather example screenshot
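For the curious, here is a purely hypothetical sketch of the kind of blacklist file described above; the element and attribute names are invented for illustration and may not match the format java-gnome actually settled on:

<!-- hypothetical filter file: the types we care about, and for each
     type the things the generator must ignore -->
<filter>
  <type name="GtkButton">
    <ignore function="gtk_button_get_event_window"/>
  </type>
  <type name="GtkWidget"/>
</filter>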

To conclude, I would like to thank everyone who was involved in this Google Summer of Code. Thanks to Carol Smith from Google: without her I would not have spent 3 awesome summers. Thanks to GNOME for selecting me as a student this year; it was a real pleasure to work for you guys. Big thanks to Serkan Kaba for mentoring me, and also to Andrew Cowie, who believed in me to achieve such an important goal for java-gnome. Thank you all for doing what you did for me!

11 September 2013

GIR for java-gnome: update week 36

There is not a lot to talk about for the past week.

The code generator can now be considered stable, and I decided to cover the GWeather library to demonstrate how to add coverage of a new library. You can check the progress of this work in this branch. There are still some examples and tests to write. I will also write a quick how-to guide to help contributors implement a new library.

See you next week for an even shorter report? No, it will be the last one, so I will do a recap of all the work I have done during this great Google Summer of Code.

03 August 2013

GNOME.Asia 2013

This June, I was in Seoul, Korea for the GNOME.Asia Summit, the yearly occasion to meet up with the Asian side of the GNOME community. As always, it was an awesome conference, with so many cool people. I learned about new projects like Seafile and got to meet new friends and catch up with old ones.

I’d also like to thank my employer, Collabora, for sponsoring my flight, and the GNOME Foundation for paying for the hotel.

Sponsored by Collabora                              Sponsored by the GNOME Foundation

28 July 2013

From Thessaloniki with love -- openSUSE Conference 2013

Last weekend I was in Thessaloniki, Greece, enjoying the openSUSE Conference 2013. If I had to summarize the event in one word, it would be: wow! It was the first time we held this event in a city other than Nuremberg or Prague (two places where SUSE has offices), and it was the first time the organization was fully led by the community. I was quite confident that things couldn't go wrong since, after all, what matters is that we're all in the same place. But I was amazed that the whole event went so smoothly! This was really a great job from a whole team of volunteers:

oSC13 volunteers

Just to give an example of the hard work that was accomplished: most (all?) talks were successfully streamed, and the recordings are already online! Stella and Kostas definitely deserve credit for the overall success, as they have kept leading the organization in the right direction since last year, and the event wouldn't have been possible without their dedication. Our sponsors also helped make all this happen, so many thanks to SUSE, ARM, DevHdR and Oracle!

Having people from all over the world was once again an opportunity to meet up with old and new friends, coming from Brazil (Izabel, Carlos), the US, all over Europe obviously, but also India (Manu, Saurabh), China and Taiwan (Sunny, Max, David, etc.)... The conference is the global event of the openSUSE community, without any doubt. With 250 attendees, there were a lot of hallway chats and informal meetings; I'm sure the GNOME couch tradition that we initiated with Dominique and Richard will live on over the years ;-)

oSC13 volunteers

Unsurprisingly, the openSUSE Board took the opportunity of having so many community members around to discuss several topics with as many people as possible. The board also organized, for the first time, a session of team reports. Even though several teams didn't participate in that session (generally because no team member was there), we had more than ten teams join the party on stage, and this was probably one of the best ways to see how broad our community really is and to learn about the latest developments in various areas of the project. We also had our usual town hall meeting, which went rather nicely, with useful feedback for the board.

oSC13 volunteers

The bad thing for me is that I could only stay a few days due to work, but there's already a next opportunity to meet with the community: the openSUSE Summit in Orlando next November. And if you cannot make it, then I can only recommend making sure that you join us next year, for the openSUSE Conference in Dubrovnik!

oSC13 volunteers

12 April 2013

Git workshop this Wednesday, April 17

This Wednesday, as part of the ULB hackerspace, I will be running a workshop on the Git version control system.

The exact content of the workshop has not been settled yet, but it should be accessible to anyone with little Git experience. Depending on the time available and the participants' level, we may cover some more advanced features, but those will more likely be explored in a future workshop.

If the topic interests you, you are welcome to join (registration is optional but appreciated).

25 March 2013

SPICE on OSX, take 2

A while back, I made a Vinagre build for OSX. However, reproducing this build needed lots of manual tweaking, the build did not work on newer OSX versions, and in the meantime the recommended SPICE client became remote-viewer. In short, this work was obsolete.

I've recently looked at this again, but this time with the goal of documenting the build process and making the build as easy as possible to reproduce. This is once again based on gtk-osx, with an additional moduleset containing the SPICE modules, and a script to download/install most of what is needed. I've also switched to building remote-viewer instead of Vinagre.

This time I've documented all of this work, and all you should have to do to build remote-viewer for OSX is run a script, copy a configuration file to the right place, and then run a usual jhbuild build. Read the documentation for more detailed information about how to do an OSX build.

I've uploaded a binary built using these instructions, but it's lacking some features (USB redirection comes to mind), it's slow, etc., so... patches welcome! ;) Feel free to contact me if you are interested in making OSX builds and need help getting started, have build issues, ...

02 February 2013

Secure Boot on openSUSE talk at FOSDEM cancelled

Hi folks,

for those of you who are attending FOSDEM this year and were planning to attend my talk about Secure Boot on openSUSE on Sunday, I'm sorry to announce that I had to cancel my trip to Brussels (and my talk) for family reasons.

Since my slides were already written, I thought I could still share them with you. Feel free to ask questions or leave comments on this blog post.

11 December 2012

FOSDEM 2013 Crossdesktop devroom Call for talks

The Call for talks for the Crossdesktop devroom at FOSDEM 2013 comes to an end this Friday. Don't wait: submit your talk proposal about your favourite part of GNOME now!

Proposals should be sent to the crossdesktop devroom mailing list (you don't have to subscribe).

29 November 2012


Wandering in embedded land: part 2, Arduino turned remote control

Now, with the Midea AC remote control mostly deciphered, the next step is to emulate the remote with an Arduino, since that's the system I use for the embedded greenhouse control. While waiting for my mail-ordered IR LED (I didn't want to solder one off my existing AC controllers), I started writing a bit of code and looking at the integration problems.

The hardware side

One of the challenges is that the Arduino system is already heavily packed: basically I use all the digital I/O pins except 5 (and 0 and 1, which are hooked to the serial support), and 2 of the 6 analog inputs, as the card already drives 2 SHT1x temperature/humidity sensors, 2 light sensors, a home-made 8-way relay board, and a small LCD display. There isn't much room left, physically or in memory, for more wires or code! Fortunately, driving a LED requires minimal resources, and the schematic is trivial: just the IR LED in series with a current-limiting resistor on a digital output pin.

I actually used a 220 Ohm resistor since I didn't have a 100 Ohm one; the only effect is on how far the signal can be received, really not a problem in my case. Also, I initially hooked it to pin 5, which shouldn't have been a problem, as that's the free slot I have available on the Arduino.

The software side

My thinking was: well, I just need to recreate the same set of light patterns to emulate the remote control and that's done. It sounded fairly simple, and I started coding routines which would switch the LED on or off for 1T, 3T and 8T durations. Thus the core of the code was like:


/* frame header: an 8T mark followed by an 8T space */
void emit_midea_start(void) {
    ir_down(T_8);
    ir_up(T_8);
}

/* end of frame: a short 1T mark followed by an 8T space */
void emit_midea_end(void) {
    ir_down(T_1);
    ir_up(T_8);
}

/* send one byte then its complement; each bit is a 1T mark followed
   by a 3T space for a logical 1 or a 1T space for a logical 0 */
void emit_midea_byte(byte b) {
    int i;
    byte cur = b;

    for (i = 0; i < 8; i++) {
        ir_down(T_1);
        if (cur & 1)
            ir_up(T_3);
        else
            ir_up(T_1);
        cur >>= 1;
    }
    cur = ~b;                 /* now send the complemented byte */
    for (i = 0; i < 8; i++) {
        ir_down(T_1);
        if (cur & 1)
            ir_up(T_3);
        else
            ir_up(T_1);
        cur >>= 1;
    }
}

where ir_down() and ir_up() respectively switch pin 5 (set as OUTPUT) on and off for the given duration, the durations being defined as macros.
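The helpers themselves are not shown here; a minimal sketch of what this first, naive version might have looked like (the pin number and placeholder durations are assumptions, and, as explained below, this approach turns out to be insufficient):

#define IR_PIN 5      /* digital pin wired to the IR LED (assumed) */
#define T_1   560     /* 1T duration in microseconds (placeholder value) */
#define T_3   (3 * T_1)
#define T_8   (8 * T_1)

/* "down" = LED emitting for the given duration */
void ir_down(unsigned int usec) {
    digitalWrite(IR_PIN, HIGH);
    delayMicroseconds(usec);
}

/* "up" = LED off for the given duration */
void ir_up(unsigned int usec) {
    digitalWrite(IR_PIN, LOW);
    delayMicroseconds(usec);
}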

Playing with two Arduinos simultaneously

Of course, to test my code, the simplest thing was to set up the new module on a second Arduino, positioned in front of the Arduino with the IR receptor and running the same code used for decoding the protocol.

The nice thing is that you can hook up both Arduinos to two different USB cables connected to the same machine; they will show up as ttyUSB0 and ttyUSB1, and once you have looked at the serial output you can tell which is which. The only cumbersome part is having to select the other serial port when you want to switch boxes, either to monitor the output or to upload a new version of the code; so far, things are rather easy.

Except it just didn't work!!!

Not the Arduino itself: I replaced the IR LED with a normal one from time to time to verify it was firing for a fraction of a second when emitting the sequence. No, the problem was that the IR receiver was detecting transitions, but none of the expected durations or order, nothing I could really consider a mapping of what my code was sending. So I tweaked the emitting code over and over, rewriting the timing routines in 3 different ways, trying to disable interrupts, etc... Nothing worked!

Clearly there was something I hadn't understood... and I started searching on Google and reading, first about timing issues on the Arduino, but things ought to be correct there, and then about existing remote control code for the Arduino and others. Then I hit Ken Shirriff's blog on his IR library for the Arduino and realized that the IR LED and the IR receiver don't operate at the same level. The LED really can just be switched on or off, but the IR receiver is calibrated for a given frequency (38 KHz in this case) and does not report whether it gets IR light, but whether it gets the 38 KHz pulse carried by the IR light. In a nutshell, the IR receiver was decoding my analog 0s but not the 1s, because it failed to catch a 38 KHz pulse: I was switching the IR LED permanently on, and that was not recognized as a 1, generating erroneous transitions.

Emitting the 38KHz pulse

Ken Shirriff has another great article, titled Secrets of Arduino PWM, explaining the details of generating a pulse automatically on *selected* Arduino digital outputs. This is rather complex and nicely encapsulated in his infrared library code, but I would suggest having a look if you're starting advanced development on the Arduino.

The simplest approach is then to use Ken's IRremote library, by first installing it into the installed Arduino environment:

  • create a new directory /usr/share/arduino/libraries/IRremote (as root)
  • copy IRremote.cpp, IRremote.h and IRremoteInt.h there

and then use it in the midea_ir.ino program:


#include <IRremote.h>

IRsend irsend;    /* object driving the IR LED through the library */

int IRpin = 3;    /* IRremote emits on digital pin 3 by default */

This includes the library in the resulting program and defines an IRsend object that we will use to drive the IR LED. One thing to note is that by default the IRremote library drives only digital pin 3; you can modify it to use a couple of other pins, but it is not possible to drive the PWM for digital pin 5, which is the pin currently unused on my greenhouse Arduino.

The idea is then to just replace ir_down() and ir_up() in the code with the equivalent low-level entry points driving the LED in the IRsend object: first use irsend.enableIROut(38) to enable the pulse at 38 KHz on the default pin (digital 3), then use irsend.mark(usec) as the equivalent of ir_down() and irsend.space(usec) as the equivalent of ir_up():


void emit_midea_start(void) {
    irsend.enableIROut(38);   /* set up the 38 KHz carrier */
    irsend.mark(4200);        /* 8T header mark */
    irsend.space(4500);       /* 8T header space */
}

void emit_midea_end(void) {
    irsend.mark(550);
    irsend.space(4500);
}

void emit_midea_byte(byte b) {
    int i;
    byte cur = b;
    byte mask = 0x80;         /* most significant bit first */

    for (i = 0; i < 8; i++) {
        irsend.mark(450);
        if (cur & mask)
            irsend.space(1700);   /* logical 1: long space */
        else
            irsend.space(600);    /* logical 0: short space */
        mask >>= 1;               /* move on to the next bit */
    }

...

Checking with a normal LED allowed me to spot a brief light when emitting
the frame, so it was basically looking okay...
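
Given these helpers, a complete command can be wrapped up following the frame
layout worked out in part 1: each payload byte followed by its complement, and
the whole frame sent twice. Here is a minimal sketch (the wrapper name and
structure are mine, not necessarily those of the final code):

/* hypothetical wrapper: emit a full Midea command */
void emit_midea_cmd(byte b1, byte b2, byte b3) {
    byte data[3] = { b1, b2, b3 };

    /* the whole frame is emitted twice in a row */
    for (int frame = 0; frame < 2; frame++) {
        emit_midea_start();
        for (int i = 0; i < 3; i++) {
            emit_midea_byte(data[i]);
            emit_midea_byte((byte) ~data[i]); /* complement, for error detection */
        }
        emit_midea_end();
    }
}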

And this worked: placing the emitting Arduino in front of the receiving
one, the IRanalyzer started to decode the frames just as with the real
remote control. Things were looking good again!

But it failed the real test... when put in front of the AC, the hardware
didn't react; some improvement was still needed.

Check your timings, theory vs. practice

I suspected some timing issue: not with the 38 kHz pulse, as the code from
Ken was working fine for an array of devices, but rather with how my code was
emitting. Another precious hint was found in the blog post about the library:


IR sensors typically cause the mark to be measured as longer than expected and the space to be shorter than expected. The code extends marks by 100us to account for this (the value MARK_EXCESS). You may need to tweak the expected values or tolerances in this case.

Remember that the receiver does some logic on the input to detect the
pulse at 38 kHz; that means that while a logic 0 can be detected relatively
quickly, it takes at least a few beats before the sync to the pulse is
recognized and the receiver switches its output to a logic 1. In a nutshell,
a 1T low duration takes less time to recognize than a 1T high duration.
I was also afraid that the overall time to send a full frame would drift
over the fixed limit needed to transmit it.

So I tweaked the emitting code to count the actual overall duration of
the frames, and also added to the receiver decoding code the display of the
first 10 durations between transitions. I then reran the
receiver looking at the same input from the real remote control and from the
Arduino emulation, and found that on average:

  • the emulated 8T down was 200us too long
  • the emulated 8T up was 100us too short
  • the emulated 1T down at the beginning of a bit was 100us too long
  • the emulated 1T up at the end of logical 0 was 80us too short
  • the emulated 3T up at the end of logical 1 was 50us too short

After tweaking the durations accordingly in the emitter code, I got
my first successful emulated command to the AC, properly switching it off.
SUCCESS!!!

I then finished the code, providing front-end routines for the weird
temperature conversions, and glued it all together as a test application
looping over a minute (sketched after the list):

  • switching to cooling to 23C for 15s
  • switching to heating to 26C for 15s
  • switching the AC off for 30s
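
A minimal sketch of that test loop, not the actual midea_ir_v1.ino: the
command bytes are taken from the protocol tables of part 1, and
emit_midea_cmd() is the hypothetical wrapper shown earlier:

void loop(void) {
    emit_midea_cmd(0xB2, 0x9F, 0x50);  /* cool to 23C, fan low */
    delay(15000);
    emit_midea_cmd(0xB2, 0x9F, 0xDC);  /* heat to 26C, fan low */
    delay(15000);
    emit_midea_cmd(0xB2, 0x7B, 0xE0);  /* switch the AC off */
    delay(30000);
}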

The midea_ir_v1.ino code
is available for download, analysis and reuse. I would suggest not letting
this run for long in front of an AC, as the very frequent changes of mode
may not be good for the hardware (nor for the electricity bill!).

Generating the 38 kHz pulse in software

While the PWM generation has a number of advantages, especially w.r.t.
the regularity of the pattern and the absence of drift due, for example, to
delays handling interrupts, in my case it has the serious drawback of forcing
the use of a given pin (3 by default, or 9 if switching to a different timer
in the IRremote code), and those are not available unless I get the soldering
iron out and change some of the existing routing on my add-on board. So
the next step is to also implement the 38 kHz pulse in software. First,
this should only affect the up phase; the down phase consists of no emission
and hence is implemented by a simple:


void send_space(int us) {
    digitalWrite(IRpin, LOW);   /* no emission during a space */
    delayMicroseconds(us);
}

The up part should be divided into HIGH for most of the duration, followed
by a small LOW indicating the pulse. 38 kHz means a 26.316 microsecond
period. Since the Arduino documentation indicates delayMicroseconds() can be
reliable only above 3 microseconds, it seems reasonable to use
a 22us HIGH / 4us LOW split and let the remaining computation fill
the sub-microsecond part of the period; that ought to be accurate enough. One
of the points of the code below is to try to avoid excessive drift, in two
ways:

  • by doing the accounting over the total length of the up period,
    not trying to just stack 21 periods
  • by running a busy loop when the remaining delay is minimal, rather than
    calling delayMicroseconds() for too small an amount (I'm not sure this is
    effective: the micros() value seems to be updated periodically by a timer
    interrupt handler, and the chip doesn't seem to provide a fine-grained
    counter).

The resulting code doesn't look very nice:


void send_mark(int us) {
    unsigned long e, t = micros();

    e = t + us;                      /* absolute end time of the whole mark */
    while (t < e) {
        digitalWrite(IRpin, HIGH);
        if (e - t < 4) {             /* too short for delayMicroseconds() */
            while ((t = micros()) < e);
            digitalWrite(IRpin, LOW);
            break;
        }
        if (e - t < 22) {            /* last, partial period: stay HIGH */
            delayMicroseconds(e - t);
            digitalWrite(IRpin, LOW);
            break;
        }
        delayMicroseconds(22);       /* 22us HIGH part of the ~26us period */
        digitalWrite(IRpin, LOW);
        t = micros();
        if (e - t < 4) {             /* no time left for the full LOW part */
            while ((t = micros()) < e);
            break;
        }
        delayMicroseconds(4);        /* 4us LOW part of the period */
        t = micros();
    }
}

But to my surprise, once I replaced all the irsend.mark() and irsend.space()
calls by equivalent calls to send_mark() and send_space(), the IRanalyzer
running on the second Arduino properly understood the sequence, proving that
the IR receiver properly picked up the signal, yay!

Of course that didn't work the first time on the real hardware. After
a bit of analysis of the resulting timings exposed by the IRanalyzer,
I noticed the marks at the beginning of bits were all nearly 100us too long;
I switched the generation from 450us to 350us, and bingo, that worked with the
real aircon!

The resulting midea_ir_v2.ino module
is very specific code, but it is tiny, less than 200
lines, and the hardware side is also really minimal: a single resistor and
the IR LED.

Epilogue

The code is now plugged in and working, but v2 just could not work in the
real environment with all the other sensors and communication going on.
I suspect that the amount of unrelated interrupts breaks the 38 kHz
pulse generation; switching back to the PWM-generated pulse using the
IRremote library works in a very reliable way. So I had to unsolder pin 3
and reassign it to the IR LED, but that was a small price to pay
compared to trying to debug the timing issues in situ!

The next step in the embedded work will be to replace the aging NSLU2
driving the Arduino with a shiny new Raspberry Pi!

This entry will be kept at http://veillard.com/embedded/midea.html.

26 November 2012

Wandering in embedded land: part 1, Midea 美的 aircon protocol

I have been a user of Arduinos for a few years now; I use them to
control my greenhouse (I grow orchids). This means collecting data
for various parameters (temperature, hygrometry, light) and actuating
a collection of devices in reaction (fan, misting pump, fogging machine,
a heater). The control part is actually done by an NSLU2, which also
collects the data, exports it as graphs on the internet, and allows me
to manually jump in and take action if needed, even if I'm far away,
using an ssh connection.

This setup has been working well for me for a few years, but since our
move to China I have had an aircon installed in the greenhouse, like in
other parts of the home. And that's where I have a problem: this AC of
the brand Midea (a very common home appliance brand in China) can only be
controlled through a remote control. Until now that meant I had no
way to automate heating or cooling, which is perfectly unreasonable :-)

After some googling, the most useful reference I found about those
is the Tom's
Site page on building a remote adapter
for them. It explains most
parts of the protocol, but not all of them: basically he stopped at the
core of the interface and didn't go into details, for example the
command encoding. The 3 things I really need are:

  • Start cooling to a given temperature
  • Start heating to a given temperature
  • Stop the AC

I don't really need full fan speed control, low speed is quite sufficient
for the greenhouse.

Restarting the Arduino development

I hadn't touched the Arduino development environment for the last few
years, and I remember it being a bit painful to set up at the time. With
Fedora 17, things have changed: a simple

yum install arduino

and launching the arduino tool worked the first time; it actually asked me
for permission to tweak groups to allow me, as the current user, to talk
through the USB serial line to the Arduino. Once that was done and I had
logged in again, everything worked perfectly. Congratulations to the
packagers, well done!
The only software annoyance is that it often takes a dozen seconds between
the time an Arduino is connected or powered and when it appears in the
ttyUSB? serial port options in the UI, but that's probably not arduino's
fault.

The arduino environment didn't really change in all those years;
the two notable exceptions are the very long list of different boards
supported now, and the fact that arduino code files are renamed from .pde
to .ino!

Learning about the data emitted

The first thing needed was to double-check Tom's results with
our own hardware, then learn enough about the protocol to be able to construct
the commands above. To do this I hooked an IR receiver to the Arduino on
digital pin 3; the graphic below shows the logic, it's very simple:

Then I loaded a modified (for IRpin 3) version of Walter Anderson's
IRanalyzer.pde
onto the Arduino, started firing the aircon remote control at the
receiver and looked at the result: total garbage! Whatever key was
pressed, the output had no structure and actually looked as random as the
input without any key being pressed :-\

It took me a couple of hours of tweaking to find out that the metal
enclosure of the receiver had to be grounded too: the GND pin wasn't
connected, and not doing so led to random results!

Once that was fixed, the data read by the Arduino started to make some
sense, and it was looking like the protocol was indeed the same as the
one described on Tom's site.

The key to understanding how the remote works is that it
encodes a digital input (3 bytes for the Midea AC protocol) as a set of
0 and 1 patterns, each of them being defined by a 1T pulse at 38 kHz,
followed by a short pause to encode a 0, or by a long pause to encode a 1:

Each T delay corresponds to 21 pulses of the 38 kHz signal; this is
thus a variable-length encoding.

As I was making progress on the recognition of the patterns sent
by the aircon, I modified the program to give a more synthetic view
of the received frames. You can use my own
IRanalyzer.ino:
it is extended to allow recording a variable number of transitions,
detects the start transition as a 3-4 ms up and the end as a 3-4 ms
down from the emitter, then shows the transmitted data as a bit field and
hexadecimal bytes (a toy sketch of the underlying idea follows the sample
output):


Waiting...
Bit stream detected: 102 transitions
D U 1011 0010 0100 1101 1001 1111 0110 0000 1011 0000 0100 1111 dUD Bit stream end : B2 4D 9F 60 B0 4F !
4484 4324 608 1572 604 472 596 1580 600 1580 !
Waiting...
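
A toy illustration of the idea, not the actual IRanalyzer.ino: it just prints
the time between level changes on the receiver pin. The real analyzer buffers
the durations and prints only once the frame has ended, since serial output
is far slower than the signal:

int IRpin = 3;

void setup(void) {
    pinMode(IRpin, INPUT);
    Serial.begin(115200);
}

void loop(void) {
    static int last_state = HIGH;           /* receiver output idles high */
    static unsigned long last_change = 0;
    int state = digitalRead(IRpin);

    if (state != last_state) {              /* a transition happened */
        unsigned long now = micros();
        Serial.println(now - last_change);  /* duration of the level that ended */
        last_change = now;
        last_state = state;
    }
}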

So basically, what we find in the dump above is:

  • the frame start marker: 8T down, 8T up
  • 6 bytes of payload; this is actually 3 bytes of data, but after
    each byte its complement is sent too
  • the end of the frame: 1T down, followed by 8T up

there are a few interesting things to note about this encoding:

  • It is redundant, allowing detection of errors or of stray data coming
    from other 38 kHz remotes (which are really common!)
  • All frames are actually sent a second time just after the first one,
    so the amount of redundancy is around 4 to 1 in the end!
  • by re-emitting inverted values, the amount of 0s and 1s sent is the
    same; as a result a frame always has a constant duration even if the
    encoding uses variable length
  • A double frame duration is around: 2 * (8 + 8 + 3*2*8 + 3*4*8 + 1 + 8) * 21 / 38000 ~= 186 ms
    (in T units: 8 + 8 for the start marker, 48 marks of 1T plus 96T of
    spaces for the 48 payload bits since 0s and 1s balance out, and 1 + 8
    for the end, each T being 21 periods of the 38 kHz carrier)

The protocol decoding

Once the frame is decoded properly, we are down to analyzing
only 3 bytes of input per command. So I started pressing the buttons
in various ways and recording the emitted sequences:


Cool 24 fan level 3
1011 0010 0100 1101 0011 1111 1100 0000 0100 0000 1011 1111 B2 4D 3F C0 40 BF
Cool 24 fan level 1
1011 0010 0100 1101 1001 1111 0110 0000 0100 0000 1011 1111 B2 4D 9F 60 40 BF
Cool 20 fan level 1
1011 0010 0100 1101 1001 1111 0110 0000 0010 0000 1101 1111 B2 4D 9F 60 20 DF
Cool 19 fan level 1
1011 0010 0100 1101 1001 1111 0110 0000 0011 0000 1100 1111 B2 4D 9F 60 30 CF
Heat 18 fan level 1
1011 0010 0100 1101 1001 1111 0110 0000 0001 1100 1110 0011 B2 4D 9F 60 1C E3
Heat 17 fan level 1
1011 0010 0100 1101 1001 1111 0110 0000 0000 1100 1111 0011 B2 4D 9F 60 0C F3
Heat 29 fan level 1
1011 0010 0100 1101 1001 1111 0110 0000 1010 1100 0101 0011 B2 4D 9F 60 AC 53
Heat 30 fan level 1
1011 0010 0100 1101 1001 1111 0110 0000 1011 1100 0100 0011 B2 4D 9F 60 BC 43
Stop Heat 30 fan level 1
1011 0010 0100 1101 0111 1011 1000 0100 1110 0000 0001 1111 B2 4D 7B 84 E0 1F
Cool 28 fan 1
1011 0010 0100 1101 1001 1111 0110 0000 1000 0000 0111 1111 B2 4D 9F 60 80 7F
Stop Cool 28 fan 1
1011 0010 0100 1101 0111 1011 1000 0100 1110 0000 0001 1111 B2 4D 7B 84 E0 1F

The immediately obvious information is that the first byte is the constant
0xB2, as noted on Tom's Site. Another thing one can guess is that the command
from the control is (in general) absolute, not relative to the current state
of the AC, so commands are idempotent: if the AC fails to catch one key press,
it will still get to a correct state when the command is repeated. This just
makes sense from a UI point of view! After a bit of analysis and further
testing, the code for the 3 bytes seems to be:

[1011 0010] [ffff 1111] [tttt cccc]

Where tttt == temperature in Celsius, encoded as follows:

17: 0000, 18: 0001, 19: 0011, 20: 0010, 21: 0110,
22: 0111, 23: 0101, 24: 0100, 25: 1100, 26: 1101,
27: 1001, 28: 1000, 29: 1010, 30: 1011, off: 1110

I fail to see any logic in the encoding there; I dunno what the Midea
guys were thinking when picking those values. What sucks is that the protocol
seems to have a hardcoded range of 17-30, while for the orchids
I basically try to keep the range 15-35, i.e. I will have to play with the
sensor output to do the detection. Moreover, my testing shows that even when
asked to keep warm at 17, the AC will continue to heat until well above
19C; I can't trust it to be accurate, so best is to keep the control and
logic on our side!
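
Lacking a visible pattern, the simplest is to treat the encoding as a lookup
table; a minimal sketch (the helper name is mine, not from the actual code):

/* map 17-30C to the tttt nibble from the table above */
byte midea_temp_nibble(int celsius) {
    static const byte tbl[14] = {
        0x0, 0x1, 0x3, 0x2, 0x6, 0x7, 0x5, 0x4,  /* 17 to 24 */
        0xC, 0xD, 0x9, 0x8, 0xA, 0xB             /* 25 to 30 */
    };

    if (celsius < 17) celsius = 17;  /* the protocol range is hardcoded */
    if (celsius > 30) celsius = 30;
    return tbl[celsius - 17];
}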

cccc == command: 0000 to cool, 1100 to heat, 1000 for automatic selection,
and 1101 for the mode to remove moisture.

Lastly, ffff seems to be the fan control: 1001 for low speed, 0101 for
medium speed, 0011 for high speed, 1011 for automatic, and 1110 for off. There
is also a mode which is about minimizing energy use, useful at night, where
the fan runs even slower than low speed, but I haven't yet understood
how that actually works.
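
Putting the fields together, building the 3 command bytes can be sketched as
follows (constants and helper names are mine, derived from the captures
above):

#define FAN_LOW   0x9  /* 1001 */
#define FAN_MED   0x5  /* 0101 */
#define FAN_HIGH  0x3  /* 0011 */
#define CMD_COOL  0x0  /* 0000 */
#define CMD_HEAT  0xC  /* 1100 */

/* byte 1 is always 0xB2; bytes 2 and 3 pack the fan nibble and the
   temperature + command nibbles */
byte midea_byte2(byte fan)            { return (fan << 4) | 0x0F; }
byte midea_byte3(byte tnib, byte cmd) { return (tnib << 4) | cmd; }

For instance, cool at 24C with fan low comes out as 0xB2 0x9F 0x40, matching
the capture above.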

There are still 4 bits left undeciphered; they could be related to two
functions that I don't use, a timer and oscillation of the air flow. I
didn't try to dig, especially with a remote control and documentation
in Chinese!

Last but not least: the stop command is 0xB2 0x7B 0xE0, it's the same
whatever the current state might be.

At this point I was relatively confident I would be able to control the
AC from an Arduino using a relatively simple IR LED control; it ought to
be a "simple matter of programming", right?

Well, that will be the topic of the next part ;-) !

This entry will be kept at http://veillard.com/embedded/midea.html.

22 May 2012

Pizza

I have spent a few days in Berlin recently and found a nice small pizza restaurant which was very good.

So, after Il Campionissimo in Paris and Franco Manca in London, I recommend you 'A Magica in Berlin :)

It is small and you will probably need to share your table with strangers, but it was delicious!

While I'm on the topic I'll shamelessly advertise my Berlin photos.

Photos: Auswärtiges Amt, Ernst Tälmann, Friedrichswerdersche Kirche

19 February 2012

Tip for python-mode with Emacs

If you expect 'Alt + d' to remove only the first part 'foo_' of 'foo_bar' with the great python-mode, you can make this change to python-mode.el:

- (modify-syntax-entry ?\_ "w" py-mode-syntax-table)
+ (modify-syntax-entry ?\_ "_" py-mode-syntax-table)

Thank you Ivan.

Update: with python-mode v6.0.4, add this line to python-mode-syntax-table (line 153):

(modify-syntax-entry ?\_ "_" table)

27 January 2012

FOSDEM 2012

My last FOSDEM participation was in 2004, and I still keep in mind many good moments with my French and Belgian GNOME friends!

So I'm totally excited to meet them again in 2012 ... :)

I'm going to FOSDEM, the Free and Open Source Software Developers' European Meeting

10 January 2012

Gtk client for HP TopTools P1218A card

From December 19 to December 28 zarb.org main server was down. This server host(s|ed) many things including this blog, Mageia website, PLF, ... The reason why it took so long is that the server is in the south of France, kindly hosted by Lost Oasis and we have no one nearby to physically access it, and in this case we had lost our main raid array.

This server (kindly donated by HP almost 10 years ago) has a remote administration card (P1218A) but it is not really usable for anything except rebooting the machine. The remote console more or less works with some of the Java versions from Sun, but most of the time it only displays the top third of the screen, until the next refresh when it goes black, and misses many keystrokes. This made it unsuitable for accessing the RAID BIOS and finding the problem.

After about a week, for some unknown reason (I could have done it many times over the last 10 years), I thought of looking at the communications between the applet and the management card. Everything was clear text and very simple. Over the next days I wrote a ruby-gtk client for the card, accessed the BIOS, found that the 4 disks had been marked as failed without errors and were correctly synchronized, and put them back online.

Login
The first (and longest) part was to find how to log in and get the session cookie. The exchange looks like:
GET /cgi/challenge HTTP/1.1
<?xml version='1.0'?><?RMCXML version='1.0'?><RMCLOGIN><CHALLENGE>DJRhNVfOWfuB8fS/6PFazg==</CHALLENGE><RC>0x0</RC></RMCLOGIN>
GET /cgi/login?user=FOO&hash=UtPRDzFS36s0jJBgTmtS4JDR HTTP/1.1

The challenge was obviously 16 bytes of data, base64 encoded. The response was called hash and was 18 bytes whatever the password. Given that it was written more than 10 years ago, I supposed it would be MD5, even if that only gives 16 bytes.

I then wrote a small ruby application trying various combinations (md5(challenge + password), md5(xor(challenge,password)), xor(challenge,md5(password)), ...) and found that md5(xor(challenge,md5(password))) was giving me the correct first 16 bytes.

I then used an online CRC calculator to find that the remaining 2 bytes are "CRC-CCITT (XModem)".
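
For the curious, the whole response computation can be sketched in a few lines
of C using OpenSSL (the real client is in ruby-gtk; this is just an
illustration, and the exact bytes the CRC covers and its byte order are my
assumptions):

#include <openssl/md5.h>
#include <stdint.h>
#include <string.h>

/* CRC-CCITT (XModem): polynomial 0x1021, initial value 0 */
static uint16_t crc_xmodem(const unsigned char *buf, size_t len) {
    uint16_t crc = 0;

    for (size_t i = 0; i < len; i++) {
        crc ^= (uint16_t)buf[i] << 8;
        for (int b = 0; b < 8; b++)
            crc = (crc & 0x8000) ? (uint16_t)((crc << 1) ^ 0x1021)
                                 : (uint16_t)(crc << 1);
    }
    return crc;
}

/* response = md5(challenge XOR md5(password)), plus a 2-byte CRC;
   challenge is the 16 raw bytes after base64 decoding */
static void toptools_hash(const unsigned char challenge[16],
                          const char *password, unsigned char out[18]) {
    unsigned char pwmd5[16], xored[16];

    MD5((const unsigned char *)password, strlen(password), pwmd5);
    for (int i = 0; i < 16; i++)
        xored[i] = challenge[i] ^ pwmd5[i];
    MD5(xored, 16, out);

    uint16_t crc = crc_xmodem(out, 16);    /* assumed to cover the MD5 bytes */
    out[16] = crc >> 8;                    /* assumed big-endian */
    out[17] = crc & 0xff;
}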

Console
The other big part was the remote console.

Getting the current screen content is quite easy, it's a GET on /cgi/scrtxtdump (with an optional force=1 parameter).

In my initial tests there was a 0x10 between each character, so I just filtered them out. I found later that it actually gives attributes for the character (bold, color, ...), and I now support the ones I have seen so far.

Sending a keypress is quite easy too, it's a POST to /cgi/bin with data being <RMCSEQ><REQ CMD="keybsend"><KEYS>space separated scancodes</KEYS></REQ></RMCSEQ>.

The result

The code is now online, still very ugly, but hopefully helpful :)

BIOS before I handle colors

15 November 2011

Outreach in GNOME

The GNOME Montréal Summit was held a month ago now, and not only was it lots of fun, but also a very productive time. Marina held a session about the outreach in GNOME, and we spent time discussing different ways to improve welcoming and attracting people in GNOME. Let me share some of the points we raised, supplemented by my own personal opinions, which do not reflect those of my employer, when I’ll have one.

A warm welcome

There has been a lot of nice work done structuring and cleaning up the GNOME Love page. We now have a list of people newcomers can contact if they are interested in a particular project. Feel free to add your name and project to the list, the more entry points we get, the better for them!

I tend to think there is still a bit too much content on the GNOME Love page, maybe we could use more pretty diagrams (platform overview, ways to get involved) to keep the excitement growing and to reduce the amount of text we have right now (GUI tutorials, books about GNOME,  tips & tricks). Feedback appreciated!

Start small

We tend to think of contributions as patches and a certain amount of code added to a project. However, it’s not easy at all for newcomers to just pop in and work on a patch, especially in GNOME where most software follows strict rules (as in coding style, GObject API style, etc.). And since GNOME maintains (again, for the most part) a very high quality in its code, backed by many hackers, whether they’re part of a company or independent contributors, it makes the landing of a patch even tougher.

Which is why we should encourage everyone who wants to get involved to work on small tasks, be it fixing a string typo, rewording a message, or marking plural forms for translation. Working on manageable changes ensures that the patches are completed, and landing these patches builds confidence to work on bigger ones. Having your name in the commit log is a great reward that encourages sticking around and digging for more.

Advertise early, advertise often

If we want to get loads of people coming toward GNOME, we should definitely talk more and spread the word about the GNOME Outreach Program for Women (GOPW) and the Google Summer of Code (GSoC) earlier.

Google doesn’t announce the program very far in advance and approved organizations are only published three weeks before the application deadline, but we should encourage students to get involved in GNOME early and keep an eye out for such announcements. Having a list of mentors who can help newcomers anytime throughout the year and having that list included on the Google Summer of Code wiki page of organizations that provide year-round informal mentorship should help attract students to GNOME.

On our side, we could definitely gather ideas and promote the programs earlier. I don’t have exact dates in mind, but our KDE fellows promote the Summer of Code in early March, if not before. Not only will that help spread the word better, but students might get involved earlier, and get to know the tools/community before the actual program.

Communication is key to success

We have to get better at communicating with interns, and make sure they get the help and feedback they need. We have different channels of communication in GNOME, mainly IRC and mailing lists. Both are a bit intimidating to the newcomer (I still proceed with extreme care when I use them), so it would be good to have a short tutorial about the main mailing lists around, how to connect on IRC and what to expect out of it.

Always two there are, no more, no less

In order to increase the chances of success for the interns, we need good mentors. Most people underestimate what it takes to be a good mentor: being nice, supportive, competent, enthusiastic. You have to remember you’re helping someone to land in the big GNOME land without too much hassle, so consider it carefully. I encourage you to read this very informative blog post if you’re thinking about mentoring a student.

The Summer of Code administrators at GNOME could perhaps keep an eye on mentors as well as students, not with weekly reports but just by poking them from time to time and making sure everything is going well.

Show me the way

To help students set up their workflow, it would be great to have full-length screen-casts demonstrating how to fix a bug in GNOME, starting on the Bugzilla page and finishing on the same page when attaching the final patch. This means going through cloning the module with Git, using grep to find the faulty line, editing the code, using Git to look at the diff and format the final patch. All this in one video would really help connect the parts and suggest a way to work for students.

GNOME Love bug drive

Please consider attaching the gnome-love keyword when you file or triage a bug that is easy to fix. A selection of current GNOME Love bugs is essential to help newcomers figure out how they can start contributing.

Good GNOME Love bugs are trivial or straightforward bugs that everyone agrees on, e.g. paper cut bugs or corner cases. It’s helpful to specify in the bug the file or files that will need to be modified, and any reference code that does something similar. Even the most trivial bugs are suitable candidates, because in the end, fixing a GNOME Love bug is as much about learning the process as about the fix itself!

Get involved

If you want to help us gather more people around GNOME and help them find their spot in our community, make sure to subscribe to the outreach-list mailing list.

Thanks for reading!

And thanks to Marina and Karen for reviewing this post!

09 October 2011

Feedback on GNOME 3.0

After 5 months with GNOME 3.0, I'm really happy with the experience. At the end of the work day,
my mind is no longer exhausted by fighting window placement and hunting for applications.

GNOME 3.0 is really stable, except with the Open Source driver on my Radeon 5870 (4 crashes in 2 months).

I really like the behavior of dual-head where the secondary screen has only one virtual screen.
For me, there are just 3 annoying points:

  • Ctrl + Del to remove a file in Nautilus; maybe it's a Fedora setting but this change is just @!#, I already have a Trash to undo my mistakes (http://www.khattam.info/howto-enable-delete-key-in-nautilus-3-fedora-15-2011-06-01.html)
  • the Alt key to shut down; no, I don't want to waste energy for days, and my PC boots quickly.
  • only vertical virtual screens; I find it a bit painful to move down two screens when the target would be reachable in one move with a 2x2 layout, but I understand that layout doesn't fit well with the GNOME 3 design.

To have a good experience with GNOME 3, I use:

  • Windows key + type to launch everything
  • Ctrl + Shift + Alt + arrows to move the application between virtual screens
  • Ctrl + click in the launcher when I really want a new instance (the default behavior is perfect)
  • snap à la Windows 7 is great
  • Alt + Tab then the arrow keys to select an app

Don't forget to read https://live.gnome.org/GnomeShell/CheatSheet or the Help (System key + 'Help').

It's not specific to GNOME 3, but you can change the volume when your mouse is over the applet (don't click, just hover) with a mouse scroll.
With GTK+, did you know you can reach the end of a scrolled area with a left click on the arrow, and a specific position with a middle click?

I'm impressed by the new features of GNOME 3.2 and I'm waiting for Fedora 17 to enjoy it!

23 August 2011

GNU Hackers Meeting 2011 in Paris

In case you are in the Paris area and don't know already, there is a GNU Hackers Meeting event being held from Thursday 25th to Sunday 28th August, 2011 at IRILL. If you are a GNU user, enthusiast, or contributor of any kind, feel free to come. I guess you can still drop an email to ghm-registration@gnu.org.

For folks around on Wednesday (yeah, that's tomorrow), we are having a dinner around 8 PM at the Mussuwam, a Senegalese restaurant in Paris, near Place d'Italie. When you get there, just give them the secret password (which is 'GNU') and they'll show you where the rest of the crowd sits. Be sure to keep that password secret though. No one else should be in the know.

Happy hacking and I hope to see you guys there.

22 August 2011

How to install a digital CA certificate on Red Hat based GNU/Linux distributions

This is just as a reminder for myself, as I keep forgetting about this stuff.

If like me you run a server with services that depend on SSL and need to install a certificate issued by a Certificate Authority (CA) like CACert, this might be interesting to you as well.

On Red Hat based systems the CA certificate for SSL is usually installed in the /etc/pki/tls/certs directory. The certificate is basically just dropped there in a file whose name is its hash – built with the openssl program.
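
A sketch of what that hash means in practice, using OpenSSL's C API (this is equivalent to what `openssl x509 -hash -noout -in your-ca.crt` prints; the file itself is conventionally named with a .0 suffix, bumped on collisions):

#include <openssl/pem.h>
#include <openssl/x509.h>
#include <stdio.h>

int main(int argc, char **argv) {
    if (argc < 2)
        return 1;
    FILE *f = fopen(argv[1], "r");
    if (f == NULL)
        return 1;
    X509 *cert = PEM_read_X509(f, NULL, NULL, NULL);
    fclose(f);
    if (cert == NULL)
        return 1;
    /* print the subject-name hash used to name the file, e.g. a1b2c3d4.0 */
    printf("%08lx.0\n", X509_subject_name_hash(cert));
    X509_free(cert);
    return 0;
}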

I wrote the shell scriptlet http://dodji.seketeli.net/install-ca-cert.txt. Download it, save it as install-ca-cert.sh and turn it into an executable.

Then, assuming your certificate is in a file named your-ca.crt, install it by doing:

sudo ./install-ca-cert.sh ./your-ca.crt

Voila. I don't know how that works on other distributions, though.

Update

A wise person taught me about the c_rehash utility from openssl, that does the same thing as my dirty script above. To use it, you need to install the openssl-perl package. Thank you, Daniël.

04 July 2011

Going to RMLL (LSM) and Debconf!

Next week, I’ll head to Strasbourg for Rencontres Mondiales du Logiciel Libre 2011. On Monday morning, I’ll be giving my Debian Packaging Tutorial for the second time. Let’s hope it goes well and I can recruit some future DDs!

Then, at the end of July, I’ll attend Debconf again. Unfortunately, I won’t be able to participate in Debcamp this year, but I look forward to a full week of talks and exciting discussions. There, I’ll be chairing two sessions about Ruby in Debian and Quality Assurance.

17 February 2011

Recent Libgda evolutions

It’s been a long time since I blogged about Libgda (and for the matter since I blogged at all!). Here is a quick outline on what has been going on regarding Libgda for the past few months:

  • Libgda’s latest version is now 4.2.4
  • many bugs have been corrected and it’s now very stable
  • the documentation is now fairly exhaustive and includes a lot of examples
  • a GTK3 branch is maintained, it contains all the modifications to make Libgda work in the GTK3 environment
  • the GdaBrowser and GdaSql tools have had a lot of work and are now both mature and stable
  • using the NSIS tool, I’ve made available a new Windows installer for the GdaBrowser and associated tools, available at http://www.gnome.org/~vivien/GdaBrowserSetup.exe. It’s only available in English and French, please test it and report any error.

In the next months, I’ll keep polishing the GdaBrowser tool, which I use on a daily basis (and of course correcting bugs).

21 March 2010

16 March 2010

Webkit fun, maths and an ebook reader

I have been toying with webkit lately, and even managed to do some pretty things with it. As a consequence, I haven’t worked that much on ekiga, but perhaps some of my experiments will turn into something interesting there. I have an experimental branch with a less than fifty lines patch… I’m still trying to find a way to do more with less code : I want to do as little GObject-inheritance as possible!

That little programming was done while studying class field theory, which is pretty nice on the high-level principles and somewhat awful on the more technical aspects. I also read again some old articles on modular forms, but I can’t say that was “studying” : since it was one of the main objects of my Ph.D, that came back pretty smoothly…

I found a few minutes to enter a brick-and-mortar shop and have a look at the ebook readers on display. There was only *one* of them: the Sony PRS-600. I was pretty unimpressed: the display was too dark (because it was a touch screen?), but that wasn’t the worst deal breaker. I inserted an SD card where I had put a sample of the type of documents I read: they showed up as a flat list (pain #1), and not all of them (no djvu) (pain #2), and finally, one of them showed up too small… and ended up fully unreadable when I tried to zoom (pain #3). I guess that settles the question I had on whether my next techno-tool would be a netbook or an ebook reader… That probably means I’ll look more seriously into fixing the last bug I reported on evince (internal bookmarks in documents).

24 February 2010

A fresh start in my professional life

Hello everyone,

I have been neglecting you for some time now. Whether it's the weather, a period in my life or simply something else, I haven't the slightest idea.

I just wanted to announce that I am going to leave my current employer, which is a government agency, to seek experience in the private sector. Indeed, I am more and more disappointed by the Administration.

For a few years now, as you know, I have been passionate about information security. Combined with training in Information Security Management, I have the ambition to put my experience to use with an employer (to be determined) who could allow me to improve it while benefiting from my skills.

If you have any good leads, I am obviously interested. ^^

16 January 2010

New Libgda releases

With the beginning of the year comes new releases of Libgda:

  • version 4.0.6 which contains corrections for the stable branch
  • version 4.1.4, a beta version for the upcoming 4.2 version

The 4.1.4 API is now considered stable and, except for minor corrections, should not be modified anymore.

This new version also includes a new database adapter (provider) to connect to databases through a web server (which of course needs to be configured for that purpose), as illustrated by the following diagram:

WebProvider usage

The database being accessed by the web server can be any type supported by the PEAR::MDB2 module.

The GdaBrowser application now supports defining presentation preferences for each table’s column, which are used when data from a table’s column need to be displayed:
GdaBrowser table column's preferences
The UI extension now supports improved custom layout, described through a simple XML syntax, as shown in the following screenshot of the gdaui-demo-4.0 program:

Form custom layout

For more information, please visit the http://www.gnome-db.org web site.

08 January 2010

Attending XMPP Summit and FOSDEM, 5th-8th of February in Brussels

I'm going to FOSDEM, the Free and Open Source Software Developers' European Meeting

For the third year in a row, I’ll be flying to Brussels, Belgium next month to attend the XMPP Summit/FOSDEM combo. I didn’t look through the FOSDEM schedule yet, but when it comes to XMPP, I’m looking forward to some discussions on Jingle Nodes and Publish-Subscribe. I’ve been working more and more with XMPP in the past months, especially hacking on ejabberd, and attending is a good motivation to get some of my Jingle Nodes related code shaped up on time. See you there!


30 December 2009

Reminder - Definition of a Hacker

A hacker is a computing enthusiast, often very gifted, whose only goals are to tinker with programs and hardware (software and hardware) in order to obtain quality results for himself, for the evolution of technology and for the recognition of his peers.

Hacker conventions are gatherings where these computing devotees meet, discuss and compare their work.

For many years, the tendency has been to wrongly confuse the hacker with the cracker, whose goals are not always legal.

Yet, and it can never be repeated enough, the hacker's goals are laudable and contribute actively to progress in computing and to the tools we use daily.

05 November 2009

Attracted to FLT

I have been a little stuck for some weeks: a new year started (no, that post hasn’t been stuck since January — school years start in September) and I have students to tend to. As I’m in the habit of saying: good students bring work because you have to push them high, and bad students bring work because you have to push them up from low! Either way, it has been keeping me pretty busy.

Still, I found the time to read some more maths, but got lost on something quite unrelated to my main objective : I just read about number theory and the ideas behind the proof of Fermat’s Last Theorem (Taylor and Wiles’ theorem now). That was supposed to be my second target! Oh, well, I’ll just try to hit my first target now (Deligne’s proof of the Weil conjectures). And then go back to FLT for a new and deeper reading.

I only played a little with ekiga’s code — mostly removing dead code. Not much : low motivation.

15 October 2009

gwt-strophe 0.1.0 released

I just released the first version of gwt-strophe, GWT bindings for the Strophe XMPP library. Nothing much to say else than it is pretty young, with all that can imply. The project is hosted at https://launchpad.net/gwt-strophe


11 July 2009

Slides from RMLL (and much more)

So, I’m back from the Rencontres Mondiales du Logiciel Libre, which took place in Nantes this year. It was great to see all those people from the french Free Software community again, and I look forward to seeing them again next year in Bordeaux (too bad the Toulouse bid wasn’t chosen).

The Debian booth, mainly organized by Xavier Oswald and Aurélien Couderc, with help from Raphaël, Roland and others (but not me!), got a lot of visits, and Debian’s popularity is high in the community (probably because RMLL is mostly for über-geeks, and Debian’s market share is still very high in this sub-community).

I spent quite a lot of time with the Ubuntu-FR crew, which I hadn’t met before. They do an awesome work on getting new people to use Linux (providing great docs and support), and do very well (much better than in the past) at giving a good global picture of the Free Software world (Linux != Ubuntu, other projects do exist and play a very large role in Ubuntu’s success, etc). It’s great to see Free Software’s promotion in France being in such good hands. (Full disclosure: I got a free mug (recycled plastic) with my Ubuntu-FR T-shirt, which might affect my judgement).

I gave two talks, on two topics I wanted to talk about for some time. First one was about the interactions between users, distributions and upstream projects, with a focus on Ubuntu’s development model and relationships with Debian and upstream projects. Second one was about voting methods, and Condorcet in particular. If you attended one of those talks, feedback (good or bad) is welcomed (either in comments or by mail). Slides are also available (in french):

On a more general note, I still don’t understand why the “Mondiales” in RMLL’s title isn’t being dropped or replaced by “Francophones“. Seeing the organization congratulate themselves because 30% of the talks were in english was quite funny, since in most cases, the english part of the talk was “Is there someone not understanding french? no? OK, let’s go on in french.“, and all the announcements were made in french only. Seriously, RMLL is a great (probably the best) french-speaking community event. But it’s not FOSDEM: different goals, different people. Instead of trying (and failing) to make it an international event, it would be much better to focus on making it a better french-speaking event, for example by getting more french-speaking developers to come and talk (you see at least 5 times more french-speaking developers in FOSDEM than in RMLL).

I’m now back in Lyon for two days, before leaving to Montreal Linux Symposium, then coming back to Lyon for three days, then Debconf from 23rd to 31st, and then moving to Nancy, where I will start as an assistant professor in september (a permanent (tenured) position).

26 February 2009

fatal: protocol error: expected sha/ref

Dear Lennart,

You should probably know that typing the correct URL would work better for cloning a bzr branch (yes a branch, not a repository).

This is what I get when I try to feed git a random invalid URL:

$ git clone git://github.com/idontexist
Initialized empty Git repository in /home/asabil/Desktop/idontexist/.git/
fatal: protocol error: expected sha/ref, got ‘
*********’

No matching repositories found.

*********’

Now is probably the time to stop this non constructive “my DVCS is better than yours”, and focus on writing code and fixing bugs.


19 November 2008

19 Nov 2008

WOW... Four fucking years without blogging on my advogato page. I needed time to put my head and my body in the right place. Four years of doubt, sadness and happiness as well. So, a few days ago, I decided to blog again.

It's all for the moment :)

22 July 2008

Looking for a job

In September I finish my studies of computer science, so I have started to look for a job. I really enjoyed my current job at Collabora maintaining Empathy; I learned lots of things about the Free Software world and I would like to keep working on free software related projects if possible. My CV is available online here.

Do you guys know any company around free software and GNOME looking for new employees? You can contact me by email at xclaesse@gmail.com

10 June 2008

The parallax of Suzumiya Haruhi

In light of the concepts developed by Slavoj Zizek in "The Parallax View", one can attempt a new, more fundamental interpretation of the adventures of Suzumiya Haruhi.

The melancholy of Suzumiya Haruhi is due to the feeling of unease created by the fundamental incompleteness that characterizes us all. Haruhi thus sets out in search of the big Other, the answer supposed to fill this void, here fetishized in aliens, espers and time travelers. This behavior can be seen as similar to that of individuals seeking the answer to their constitutive unease in religion, or even, and we will come back to this, in philosophy and politics.

However, the reality of Haruhi's world is that there is no big Other, nothing extraordinary filling the boring voids of reality, no character pulling the strings in the shadows. Or rather, more importantly, this big Other is Haruhi herself, which constitutes the fundamental answer: it is indeed herself that she is looking for in trying to resolve this incompleteness.

Under these hypotheses, the tale of her adventures can thus ultimately be seen as that of humans' search for Truth, the reflections around her behavior shedding a very interesting light on several issues tied to this process.

The interpretation of the end of the series, where Haruhi seems to find happiness with Kyon, remains problematic. There is no big Other, so the lack cannot really be filled by something external, and thus not by something materialized in a fetish, even a human one, like Kyon. However, Kyon is not something external either, since he is, like all the objects of Haruhi's world, a product of her imagination. He would thus be a pure materialization in human form of the true answer to her lack, which would make Kyon a part of Haruhi and not a distinct character. One can therefore, with a little audacity, argue that Kyon and Haruhi are but one, that he really is her other half, which recalls the quintessential Christian "happy end" while giving it a piquant new perspective. Despite everything, the fact that Haruhi does not recognize him as such, since they clearly remain two distinct persons, suggests that the problem is not solved.

01 June 2008

Secularism

We usually only acknowledge that a religion poses a problem when it constitutes a potential risk to the liberal capitalist system in which we live. These religions therefore have, in fact, a subversive potential.

It is because of this potential that the many individuals hit head-on by the malaise created by this society turn in ever greater numbers to this type of religious community.

Yet what is the process of secularization, as we hear it from the mouths of liberals, if not making religions fit into the liberal framework, or, failing that, marginalizing and stigmatizing those that will not, thereby stripping them of any aspect harmful to it?

This process can therefore be seen as the sine qua non condition for the opium of the people to function as an instrument of the powers that shape the social order, even if the resurgence of fundamentalisms in times of crisis shows us that it is doomed to failure anyway.

The ambivalent attitude of the secularism promoted by the right, which says yes to, or even encourages, belief that claims to be unconditional, while simultaneously placing restrictions on it, reflects this contradiction.

To illustrate this, one can take the example of laws forbidding religion any visible character in public, by which religions are stripped of anything shocking to those who do not take part in them, while doing nothing against their ideological effect on the populations concerned.

The radical left therefore has no interest in helping the liberal order maintain itself by normalizing religion in order to integrate it, and then strengthen itself, through this secularism.

What it should promote is the dominated class's awareness that its malaise is due to the structure of society, and that the only way to remedy it is the political struggle that makes it possible to overcome it. Consequently, the only secularism it makes sense for the left to defend is one that allows the emancipation of everyone, in order to achieve this.

22 April 2008

Enterprise Social Search slideshow

Enterprise Social Search is a way to search, manage, and share information within a company. Who can help you find relevant information and nothing but relevant information? Your colleagues, of course.

Today we are launching at Whatever (the company I work for) a marketing campaign for our upcoming product: Knowledge Plaza. Exciting times ahead!

28 January 2008

Ubuntu stable updates

There were some blog entries this week about GNOME stable updates on Ubuntu. There is no reason new bug-fix versions could not be uploaded to stable, other than the fact that the SRU rules require checking all the changes carefully, and doing this job on all the GNOME tarballs is quite some work; the Ubuntu desktop team is quite small and already overworked.

There is a list of packages which have relaxed rules though. We have discussed adding GNOME to those, since the stable series usually has fixes worth having and not too many unstable changes (though the stable SVN code usually doesn’t get a lot of testing), and decided that the stable updates which look reasonable should be uploaded to hardy-updates.

There were also some concerns about gnome-games: 2.20.3 has been uploaded to gutsy-proposed today, which should reduce the number of bugs sent to the GNOME bugzilla. The new dependencies on ggz have also been reviewed and 2.21 should be built soon in hardy.

14 November 2007

GNOME and Ubuntu

The FOSSCamp and UDS week has been nice and a good occasion to talk to upstream and people from other distributions. We had desktop discussions about the new technologies landing in GNOME this cycle (the next Ubuntu will be a LTS so we need a balance between new features and stability), the desktop changes we want to do, and how Ubuntu contributes to GNOME.

Some random notes about the Ubuntu upstream contributions:

  • Vincent asked again for an easy way to browse the Ubuntu patches and Scott picked up the task, the result is available there
  • The new Canonical Desktop Team will focus on making the user experience better, most of the changes will likely be upstream material and discussed there, etc
  • Canonical has open Ubuntu Desktop Infrastructure Developer and Ubuntu Conceptual Interface Designer positions, if you want to do desktop work for a cool open source company you might be interested by those ;-)

GNOME updates in gutsy and hardy

  • Selected GNOME 2.20.1 changes have been uploaded to gutsy-updates
  • The GNOME 2.21.2 packaging has started in hardy, some updates and lot of Debian merges are still on the TODO though
  • We have decided to use tags in patches to indicate the corresponding Ubuntu and upstream bugs so it’s easier to get the context of the change, technical details still need to be discussed though

Update: Scott pointed out that you can use http://patches.ubuntu.com/n/nautilus/extracted to access the current nautilus version

03 November 2007

git commit / darcs record

I’ve been working with git lately but I have also missed the darcs user interface. I honestly think the darcs user interface is the best I’ve ever seen, it’s such a joy to record/push/pull (when darcs doesn’t eat your cpu) :)

I looked at git add --interactive because it had hunk-based commit, a pre-requisite for darcs record-style commit, but it has a terrible user interface, so I just copied the concept: running a git diff, filtering hunks, and then outputting the filtered diff through git apply --cached.

It supports binary diffs, file additions and removals. It also asks for new files to be added, even if this is not exactly how darcs behaves, but I always forget to add new files, so I added it. It will probably break on some extreme corner cases I haven’t been confronted with, but I gladly accept any patches :)

Here’s a sample session of git-darcs-record script:

$ git-darcs-record
Add file:  newfile.txt
Shall I add this file? (1/1) [Ynda] : y

Binary file changed: document.pdf

Shall I record this change? (1/7) [Ynda] : y

foobar.txt
@@ -1,3 +1,5 @@
 line1
 line2
+line3
 line4
+line5

Shall I record this change? (2/7) [Ynda] : y

git-darcs-record
@@ -1,17 +1,5 @@
 #!/usr/bin/env python

-# git-darcs-record, emulate "darcs record" interface on top of a git repository
-#
-# Usage:
-# git-darcs-record first asks for any new file (previously
-#    untracked) to be added to the index.
-# git-darcs-record then asks for each hunk to be recorded in
-#    the next commit. File deletion and binary blobs are supported
-# git-darcs-record finally asks for a small commit message and
-#    executes the 'git commit' command with the newly created
-#    changeset in the index
-
-
 # Copyright (C) 2007 Raphaël Slinckx
 #
 # This program is free software; you can redistribute it and/or

Shall I record this change? (3/7) [Ynda] : y

git-darcs-record
@@ -28,6 +16,19 @@
 # along with this program; if not, write to the Free Software
 # Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA  02110-1301, USA.

+# git-darcs-record, emulate "darcs record" interface on top of a git repository
+#
+# Usage:
+# git-darcs-record first asks for any new file (previously
+#    untracked) to be added to the index.
+# git-darcs-record then asks for each hunk to be recorded in
+#    the next commit. File deletion and binary blobs are supported
+# git-darcs-record finally asks for a small commit message and
+#    executes the 'git commit' command with the newly created
+#    changeset in the index
+
+
+
 import re, pprint, sys, os

 BINARY = re.compile("GIT binary patch")

Shall I record this change? (4/7) [Ynda] : n

git-darcs-record
@@ -151,16 +152,6 @@ def read_answer(question, allowed_responses=["Y", "n", "d", "a"]):
        return resp

-def setup_git_dir():
-       global GIT_DIR
-       GIT_DIR = os.getcwd()
-       while not os.path.exists(os.path.join(GIT_DIR, ".git")):
-               GIT_DIR = os.path.dirname(GIT_DIR)
-               if GIT_DIR == "/":
-                       return False
-       os.chdir(GIT_DIR)
-       return True
-
 def git_get_untracked_files():

Shall I record this change? (5/7) [Ynda] : y

# On branch master
# Changes to be committed:
#   (use "git reset HEAD file..." to unstage)
#
#       modified:   document.pdf
#       modified:   foobar.txt
#       modified:   git-darcs-record
#       new file:   newfile.txt
#
# Changed but not updated:
#   (use "git add file file..." to update what will be committed)
#
#       modified:   git-darcs-record
#
What is the patch name? Some cute patch name
Created commit a08f34e: Some cute patch name
 4 files changed, 3 insertions(+), 29 deletions(-)
 create mode 100644 newfile.txt

Get the script here: git-darcs-record script and put it somewhere in your $PATH. Any comments or improvements are welcome!

22 January 2007

A new laptop, without Windows!

There we go: I had been thinking about it for a long time and it is now done, I bought myself a brand new laptop.

I bought it on the French site LDLC.com and inquired whether it was possible to buy the computers in their catalogue without software (mainly without Windows). So I sent them an email, and to my great surprise they answered that it was entirely possible: you just have to place the order and then send an email asking them to remove the software from it. So I ordered my laptop, and they refunded me 20€ for the software; it's not much compared to the price of a laptop, but symbolically it's already something.

Still, I have questions: why isn't this offer listed on LDLC's website? Looking under my brand new laptop I notice something strange: the remains of a sticker that was peeled off, exactly where the WinXP activation key is usually stuck. The round 20€ refund from LDLC also seems strange to me, given that LDLC is only a reseller, not a manufacturer, so they buy the computers with Windows already installed. All this leads me to believe that it is LDLC that loses the 20€, and I wonder to what end?!? To please free-software customers? To avoid lawsuits over bundled sales? To get refunded in turn by the manufacturer/Microsoft for the licenses customers didn't want, and possibly make more than 20€ if OEM licenses are worth more than that? Anyway, this will probably always remain a mystery.

So I installed Ubuntu, which runs rather well. I was even very impressed by network-manager, which connects me automatically to wifi or wired networks depending on availability, and even configures a zeroconf network if it cannot find a DHCP server. It's very handy for transferring data between two computers: just plug an ethernet cable between the two (it also works over wifi but I haven't tested that yet) and presto, the whole network is configured automatically without touching anything, truly magical! Windows can go hide, Ubuntu is far easier to use!

20 December 2006

Documenting bugs

I hate having to write about bugs in the documentation. It feels like waving a big flag that says ‘Ok, we suck a bit’.

Today, it’s the way fonts are installed, or rather, they aren’t. The Fonts folder doesn’t show the new font, and the applications that are already running don’t see them.

So I’ve fixed the bug that was filed against the documentation. Now it’s up to someone else to fix the bugs in Gnome.

05 December 2006

Choice and flexibility: bad for docs

Eye of Gnome comes with some nifty features like support for EXIF data in jpegs. But this depends on a library that isn’t a part of Gnome.

So what do I write in the user manual for EOG?

‘You can see EXIF data for an image, but you need to check the innards of your system first.’
‘You can maybe see EXIF data. I don’t know. Ask your distro.’
‘If you can’t see EXIF data, install the libexif library. I’m sorry, I can’t tell you how you can do that as I don’t know what sort of system you’re running Gnome on.’

The way GNU/Linux systems are put together is perhaps great for people who want unlimited ability to customize and choose. But it makes it very hard to write good documentation. In this sort of scenario, I would say it makes it impossible, and we’re left with a user manual that looks bad.

I’ve added this to the list of use cases for Project Mallard, but I don’t think it’ll be an easy one to solve.

02 August 2005

Still alive...

It's been a while since my last update. A lot of stuff has happened on the MlView front!
First of all, Dodji finally decided to make a release, so go and get the 0.8 release!
On my side, work has been done on the Preferences, which is a really repetitive (who said boring?) thing to code. It had to be done... I did it!



This weekend I wanted to take a break from the Preferences, so I added a simple command launcher for MlView. This was a feature request (see #305075) so it might even make someone happy :)



I've been talking with Dodji about porting the main view container to GDL, so my next task will (probably) be to look at that...
I had made a simple proof of concept with textviews and it really was convincing... Watch the video to get an idea....

^^
