GNOME.ORG


January 29, 2015

DX hackfest 2015: day 5

Day 5, and the DX and docs hackfest in Collabora HQ, Cambridge has drawn to a close. It’s been great to have everyone here, and there have been a lot of in-depth discussions over the last few days about the details of app sandboxing, runtimes, Builder integration with various new services, the development of an IDE abstraction layer, approaches for making build systems accessible to Builder, lots of new things to statically analyse, and some fairly fundamental additions to GLib in the form of G_DECLARE_[FINAL|DERIVABLE]_TYPE and general-purpose reference counted memory areas. Whew! We even had a fleeting visit by Richard Hughes to discuss packaging issues for apps.

For all the details, see the blogs by Ryan, Cosimo, Ryan again, Alberto, Christian and Emmanuele.

I can’t do justice to the work of the docs team, who put in consistent, solid effort throughout the hackfest. See the blogs by Petr, Bastian, Kat and Jim for all the details. They even left me with a seemingly endless supply of Mallard balls to throw around the office!

Dave and I have spent a little while working on further deprecating gnome-common. More details to come once the migration guide is finished.

Thank you to Collabora for hosting, and Endless, Codethink and Red Hat for letting people attend; and thank you to the GNOME Foundation for sponsoring some of the attendees. It would not be a hackfest without the hackers!


2015 Winter Docs Hackfest

I’m here in lovely Cambridge for the winter GNOME docs hackfest. This time, the docs team is sharing a room with the Developer Experience (DX) hackfest, which gives us a great opportunity to reach out to GNOME developers for expert advice.

Yesterday, Christian Hergert presented a new GNOME IDE in development, called Builder:

Builder comes with a feature-rich text editor that can also be useful for documentation writers who often author documents in XML.

Cosimo Cecchi showed us some of the downstream changes the Endless team made to gnome-user-docs and gnome-getting-started-docs. For me, personally, the most interesting part was their feedback on the GNOME docs style and content. Endless seem to target their product at a slightly different customer; still, they appear to have data on their users that the upstream project lacks. The GNOME help suite, written by different authors in different styles over the course of many years, is actually targeted at multiple audiences, spanning from quite inexperienced desktop end-users to skilled users who need to troubleshoot VMs in GNOME Boxes.

Shaun McCance showcased some of the cool features of Ducktype, a new lightweight syntax for Mallard. Although still a work in progress, this new syntax brings to the world of Mallard docs the flexibility of formats such as AsciiDoc or Markdown, which are now gaining strong popularity in both the developer and technical communication communities.

The docs team focused on squashing the bugs filed against GNOME Help and application help, and on content improvements in different areas of the desktop documentation stack. Jim Campbell worked on changing the structure and layout of Files help. He also worked with Jana Svarova on VPN docs for the GNOME sysadmin guide. Jana went through the docs feedback ML archives, responding to user comments and filing new bugs. Kat worked on application help with Jim and fixed a couple of bugs in gnome-user-docs. I worked on triaging docs bugs, and then on reviewing and updating some parts of GNOME Help and the sysadmin guide.

I would like to thank Collabora for providing the venue and catering, Kat and Philip Withnall for running the hackfests, and the GNOME Foundation for sponsoring me.

It’s been great to see old and new faces from the community, now off to Brussels for FOSDEM, then back to Brno for DevConf!


Docs / Developer Experience Hackfest 2015: Tuesday

Me and my computer have been attending the Docs/DevX hackfest happening in Cambridge, England between Sunday the 25th of January and Thursday the 29th of January.

2015-01-29 DevXHackfest

As you might see in the picture above, we are all seated in a cozy conference room at the Collabora office. I’m sitting with a whole bunch of people from the #docs team, being a busy bee.

Tuesday whereabouts

  • Shobha (curiousDTU) has been doing some review of the documentation for GNOME’s games.
  • Ekaterina (kittykat) and Shaun (shaunm) have been discussing how new contributors could be attracted to the documentation team. Furthermore they have also discussed Mallard and how the future could look for it.
  • Ekaterina has also been fixing application documentation bugs here and there across GNOME’s vast collection of apps.
  • Peter (pmkovar) successfully converted the translated release video subtitles from *.po back to *.srt. This means that the GNOME 3.14 release video is now available in 14 different languages! A big thanks goes to the translation team for translating them.
  • Jana (jsvarova) got a public GNOME blog and carried on fixing bugs from the queue that were in scope for this hackfest.
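The subtitle round trip mentioned above can be sketched with a minimal SRT parser (purely illustrative; the real conversion to and from *.po would go through gettext tooling, and these helper names are invented):

```python
# Minimal sketch: split SRT cues so their text can be round-tripped
# through *.po and reassembled. Hypothetical helpers, not the real tool.

def parse_srt(text):
    """Split an SRT file into (index, timing, text) cues."""
    cues = []
    for block in text.strip().split("\n\n"):
        lines = block.splitlines()
        index, timing = lines[0], lines[1]
        cues.append((int(index), timing, "\n".join(lines[2:])))
    return cues

def write_srt(cues):
    """Reassemble cues (with possibly translated text) into SRT."""
    return "\n\n".join(
        "%d\n%s\n%s" % (index, timing, text) for index, timing, text in cues
    ) + "\n"

sample = """1
00:00:01,000 --> 00:00:03,000
Welcome to GNOME 3.14

2
00:00:03,500 --> 00:00:06,000
Thanks to our translators
"""

cues = parse_srt(sample)
# Stand-in for the translation step that really happens in the *.po file.
translated = [(i, t, text.replace("Welcome", "Willkommen"))
              for i, t, text in cues]
output = write_srt(translated)
```

The same cue structure is what a *.srt → *.po extractor would expose to translators as individual messages.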

I myself have been focusing on rewriting the Getting started with GTK+ tutorial and learning how to make applications with GTK 3 along the way. My patch is currently undergoing review. Furthermore, the GNOME Platform demos have gotten an overhaul, which is also currently under review. Once the patches land, new developers should hopefully have a better experience that is more up to date with how we currently recommend making applications with GTK 3.

Hackfest overall

I am writing this blog post while I’m on my way back to Denmark. The hackfest has been a great experience in many ways. First, it is great to meet with fellow GNOME contributors face-to-face. This hackfest has also been a big learning experience for me in terms of Git, Mallard, GTK 3 and C. I have gained knowledge much more rapidly because I have had great people right next to me, ready to answer any questions of mine (+whisky) and review my (initially poorly written) patches. Thanks everyone!

Once I get home I will probably have a few more patches to submit. Afterwards, it is time to work on planning the GNOME 3.16 release video again. I would definitely love to work further with the developer experience again in the future. And I would definitely attend another GNOME hackfest.

LibreOffice under the hood: progress to 4.4.0

Today we release LibreOffice 4.4.0, packed with a load of new features for people to enjoy - you can read all the great news about the user visible features from so many great hackers, but there are, as always, many contributors whose work is primarily behind the scenes in places that are not so easy to see. That work is, of course, still vitally important to the project. It can be hard to extract those details from the over eleven thousand commits since LibreOffice 4.3 was branched, so let me expand:

Complete User Interface Dialog / Layout

The UI migration to a much improved, Glade-compatible XML representation of VCL dialogs, complete with automatic layout, is now almost complete (after thinking we'd done them all, Caolán discovered a lot of docking windows that needed further work, but these are now also migrated, all but two). Also a lot of work was put into cleaning up and tweaking the look / arrangement of dialogs. Many thanks to Caolán McNamara (Red Hat) - for his incredible work & leadership here, and to Adolfo Jayme Barrientos, Palenik Mihály (GSoC 2014), Olivier Hallot (EDX), Szymon Kłos (GSoC 2014), Rachit Gupta (GSoC 2014), Tor Lillqvist (Collabora), Jan Holesovsky (Collabora), Maxim Monastirsky, Efe Gürkan YALAMAN, Yousuf Philips and many others. Thanks also to our translators, who hopefully will have much less string churn to suffer now. As a side-note, the resource compiler in rsc/ has gone on a nice diet.

Graph of progress in UI layout conversion

Initial OpenGL rendering backend

The switch to move VCL to use OpenGL for rendering is one of those things that ideally should be entirely under-the-hood, but ends up having a small but important visual impact. All the work here was done by Collabora engineers, with a beefy re-factor and the initial OpenGLContext management by Markus Mohrhard, much of the rendering implemented by Louis-Francis Ratté-Boulianne, with anti-aliasing and image scaling work from Lubos Lunak, various Windows fixes and porting work from Jan Holesovsky, and some bits from Chris Sherlock. During the work we also implemented a half-way decent and increasingly complete VCL demo application exercising rendering. A rationale for the work, with some pictures, is available.

By moving to a pure OpenGL rendering model, we can accelerate those operations that badly need it, taking advantage of the power and parallelism of the huge APU die-area given over to modern GPUs. Being able to interact much more directly with the underlying graphics hardware helps us to both render our image previews at high quality, and not to sacrifice scroll / zoom performance: having our cake and eating it too. We've also used some of that power to not only significantly accelerate our image rendering, but also improve its quality from before:

Before: image down-scaling
to After (NB. if your browser scales it too you're likely to lose the sense; zoom to 1:1 and check out e.g. the top of the circular window, or other high-frequency areas.)
After: faster, better, GL image down-scaling

There is a fair bit more work to get OpenGL into a suitable state including several odd Windows / lifecycle bugs; it is necessary to export SAL_FORCEGL=1 to override the black-listing, but we hope to nail these in the 4.4.x cycle. Several ongoing and intersecting features such as the true Idle handler work from Munich's Jennifer Liebel and Tobias Madl as well as more future / pending work in-progress from Munich's Michael Jaumann (working on OpenGL canvas) and Stefan Weiberg (on OpenGL Transitions) are due in 4.5, both mentored by Thorsten Behrens (SUSE).

Mobile Viewer / LibreOfficeKit

The recently announced Android Viewer (Beta) has a number of invisible pieces included there. Particularly the improvements to LibreOfficeKit: an easy way to re-use the rendering and file-format goodness of LibreOffice, from Andrzej Hunt and Kohei Yoshida (Collabora), to get Impress and Calc rendering to tiles at least to a Beta level. You can read more about the just-started editing work done for TDF there too. LibreOfficeKit has also become more powerful at extracting key document meta-data from yet more of the host of file formats that LibreOffice supports - important for indexing un-structured data.

Build / platform improvements

30% faster Windows builds

With the new build system functionally completed, we've looked at the most significant problem with it: rather slow build times on Windows. An investigation and some benchmarking revealed that the usage of Cygwin make was the main cause of the slowness, and hence Michael Stahl (Red Hat) made it possible to build LO 4.4 with a Win32 native build of GNU make, cutting from-scratch build time by almost a third over stock Cygwin make, and speeding up incremental rebuilds even more.

Win64 porting action

Another major improvement is from David Ostrovsky (CIB), who has done some significant work towards completing the native Win64 port. This we expect will ship in LibreOffice 4.5, but should significantly help e.g. Java users and those with very large spreadsheets. See the Windows 64bit wiki page for more detail; thanks also to Mark Williams for some tricky UNO bridge fixing work, and to Tor Lillqvist (Collabora) who laid a lot of the initial ground-work here.

Code quality work

There has been a lot of work on code quality and improving the maintainability and cleanliness of the code. Another 59 or so commits to fix cppcheck errors are thanks to Thomas Arnhold, Julien Nabet and Simon Danner, along with the daily commits to build without any compile warnings -Werror -Wall -Wextra on many platforms with thanks primarily to Tor Lillqvist (Collabora), Caolán McNamara (Red Hat), and Thomas Arnhold.

Awesome Coverity

We have been chewing through the huge amount of analysis from the Coverity Scan, well - in particular Caolán McNamara (Red Hat) has done an awesome job here; his blog on that is typically modest.

We now have a defect density that bumps along close to 0.00, though as Coverity introduces new checks, and new code gets committed, that goes up and down a little; currently 0.02, i.e. 2 static checking warnings per 100,000 lines. That compares extremely well with the average open source project, which has 65 warnings per 100,000 lines.
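Since Coverity reports defect density as warnings per 1,000 lines of code, the figures above are easy to sanity-check:

```python
# Defect density = static-analysis warnings per 1,000 lines of code.
def defect_density(warnings, lines_of_code):
    return warnings / (lines_of_code / 1000.0)

# 2 warnings per 100,000 lines -> density 0.02 (LibreOffice at the time)
lo_density = defect_density(2, 100_000)
# 65 warnings per 100,000 lines -> density 0.65 (open-source average)
avg_density = defect_density(65, 100_000)
```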

Grepping for commits with Coverity in them, we have 1,530 fixes since LibreOffice 4.3, with the top three contributors after Caolán (1,378 commits) being: Norbert Thiebaud, David Tardon (Red Hat) and Miklos Vajna (Collabora).

Increasing use of asserts

In the 3.5 release we switched away from custom macros to use normal 'assert' calls to sanity check various invariants; we're doing more sanity checking left and right these days:

Graph of number of run-time assertions
Import and now export testing

Markus Mohrhard (Collabora)'s great import/export crash testing has been further expanded to cover 76,000+ problem/bug documents, up from 55k last release, with a selection of odd images now also included. Another major win here was the provision by TDF (thanks to our donors) of a beefy new 64 core box to run the load/save/validate tests on. This, combined with some re-working and better parallelism of the Python scripts driving that, has sped up our test runs from five days to under one - allowing rapid diagnosis of new regressions in a much smaller range. We've also been able to do some Address Sanitizer runs of the document set which has resulted in a number of fixes, thanks too to Caolán McNamara (Red Hat) for some great work there.
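The driver described above can be imagined roughly like this (a hypothetical sketch; the names are invented, and the real scripts shell out to LibreOffice in subprocesses with timeouts and validators):

```python
# Invented sketch of a parallel load/save crash-test driver: round-trip
# many documents concurrently and collect the ones that fail.
from concurrent.futures import ThreadPoolExecutor

def roundtrip(doc):
    """Pretend to load and re-save one document, returning (doc, ok).
    A real driver would invoke soffice with a timeout and validate the
    saved output; here a filename suffix stands in for a crash."""
    return doc, not doc.endswith(".broken")

documents = ["a.docx", "b.xlsx", "c.odt.broken", "d.pptx"]

with ThreadPoolExecutor(max_workers=4) as pool:
    results = dict(pool.map(roundtrip, documents))

failures = [doc for doc, ok in results.items() if not ok]
```

The parallelism win comes simply from keeping all 64 cores busy on independent documents at once.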

Clang plugins / checkers

We have continued to add to our clang compiler plugins; a quick git grep for 'Registration' in compilerplugins shows that we've gone from 27 to 38 plugins in the last six months. These check all manner of nasty gotchas that people can fall into in our code. Some of these plugins are used manually but many are run by a tinderbox and some users to catch badness quickly. Thanks to: Stephan Bergmann (Red Hat) and Noel Grandin (Peralex) for their hard work on these checkers this cycle.

The plugins do all sorts of things, for example Bjoern Michaelsen (Canonical) wrote a plugin that detects deeply-nested conditionals such as these monsters. These are hard to read and a severe pain to debug through. Some of the worst offenders in sw/ have been rewritten and the plugin can easily be applied elsewhere in the codebase.

Unit testing

We also built and executed more unit tests than in LibreOffice 4.3, to avoid regressions as we change the code. Grepping for the relevant TEST and ASSERT macros, we continue to grow the number of unit tests:

Graph of number of unit tests and assertions
Our ideal is that every bug that is fixed gets a unit test to stop it ever recurring. With around 1000 commits, and over seventy committers to the unit tests in 4.4 it is hard to list everyone involved here, apologies for that; what follows is a sorted list of those with over 10x commits to the qa/ directories: Miklos Vajna (Collabora), Caolán McNamara (Red Hat), Kohei Yoshida (Collabora), Michael Stahl (Red Hat), Stephan Bergmann (Red Hat), Zolnai Tamás (Collabora), David Tardon (Red Hat), Noel Grandin (Peralex), Matúš Kukan (Collabora), Luboš Luňák (Collabora), Markus Mohrhard (Collabora), Tor Lillqvist (Collabora), Thomas Arnhold, Andrzej Hunt (Collabora), Eike Rathke (Red Hat), Jan Holesovsky (Collabora)

QA / bugzilla

Over the last six months the QA team has grown in size and effectiveness, doing some amazing work to bring our un-triaged bug count right down from one thousand (which we thought was good) to just over three hundred bugs. It's particularly knotty triaging some of those last bugs - with rather deeply technical, or super-hard-to-reproduce combinations lurking at the bottom: some excellent work there. It is rather hard to extract credits for confirming bugs, but the respective hero list overlaps with the non-developer / top closers listed below.

One metric we watch in the ESC call is who is in the top ten in the freedesktop Weekly bug summary. Here is a list of the people who have appeared more than five times in the weekly list of top bug closers in order of frequency of appearance: Caolán McNamara (Red Hat), Adolfo Jayme, tommy27, Julien Nabet, Jean-Baptiste Faure, Jay Philips, Urmas, Maxim Monastirsky, Beluga, raal, Michael Stahl (Red Hat), Joel Madero, ign_christian, Cor Nouws, V Stuart Foote, Eike Rathke (Red Hat), Robinson Tryon (TDF), Miklos Vajna (Collabora), Matthew Francis, foss, Sophie (TDF), Samuel Mehrbrodt, Markus Mohrhard (Collabora). And thanks to the many others that helped to close so many bugs for this release.

Bjoern Michaelsen (Canonical) also wrote up a new year QA update which is well worth reading.

Another win that should help us tweak our bugzilla to make it more user friendly and better structured is the migration from FreeDesktop infrastructure to TDF, with thanks to FreeDesktop for taking our large bugzilla load for all these years. This was completed recently - so now we file bugs at http://bugs.documentfoundation.org/. Thanks to Robinson 'colonelqubit' Tryon (TDF), and Tollef Fog Heen as well as our sysadmin team for that work. As is perhaps obvious, Robinson is working for TDF (funded by our generous donors) half-time to help improve our QA situation.

Code cleanup

Code that is dirty should be cleaned up - so we did a lot of that.

Ongoing German Comment redux

We continued to make progress, though sadly only a little, on translating our last lingering German comments across the codebase into good, crisp technical English. This is a great way to get involved in LibreOffice development. Many thanks to: Philipp Weissenbacher, Christian M. Heller, Jennifer Liebel (Munich), Chris Sherlock (Collabora), Michael Jaumann (Munich), Luc Castermans, Jeroen Nijhof, Florian Reisinger and a number of others with just one commit. Further reductions in the number of false positives from bin/find-german-comments suggest that there are only ten top-level modules left containing German, nine of them worth translating: i18npool, include, reportdesign, sc, scaddins, sfx2, stoc, svx, sw

Graph of remaining lines of German comment to translate

One particularly encouraging contributor to our German Comment translation efforts was Lennart Poettering who it seems has an amusing plan afoot.

Upgrading to (some) C++11 subset

As time advances, C++ improves; with the upgrade of Visual Studio we've been able to move to a subset of C++11 (as supported by VS 2012) as a new compiler baseline. We also removed several optimization-disabling workarounds for bugs in old GCC versions that don't do C++11 anyway, and hence both GCC and MSVC can now build all of LO with optimization. Thanks to Stephan Bergmann (Red Hat) for researching and driving this work.

OOXML Tokenizer cleanup

This cleanup builds on work by Miklos Vajna (Collabora) in the last release. A big chunk of our OOXML tokenizer was generated code, which is reasonable, but it was generated using XSLT (which is trending below COBOL). This was re-written from 4,200 lines of XSLT into 1,300 lines of Python - producing the same output with a large increase in hack-ability. Then some optimization was done by Jan Holesovsky (Collabora for CloudOn) to reduce inefficiency in the generated output, saving 2.2MB from the 8MB (stripped) writerfilter DSO. Great to see this sort of code cleanup, source size shrink and binary shrink at the same time. You can read more about it in Miklos' blog.
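The general shape of such a generator is simple; here is an invented miniature (the token names and the emitted C++ are only for illustration, not the real writerfilter code):

```python
# Invented sketch of table-driven code generation in plain Python
# instead of XSLT: turn a token list into a C++ header of constants.
tokens = ["document", "body", "p", "r", "t"]

def generate_header(tokens):
    """Emit one C++ constant per OOXML token name."""
    lines = ["// generated - do not edit"]
    for i, name in enumerate(tokens):
        lines.append("const sal_Int32 OOXML_%s = %d;" % (name, i))
    return "\n".join(lines)

header = generate_header(tokens)
```

The win over XSLT is that ordinary loops, dicts and string formatting replace template-matching rules, so the generator is far easier to read and change.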

std:: containers

A systematic set of improvements to our usage of the std:: containers has been going on through the code. Things like avoiding inheritance from std::vector, changing std::deque to std::vector and starting to use the newer C++ constructs for iteration like for (auto& it : aTheContainer) { ... }. There are many people to credit here, thanks to Stephan Bergmann (Red Hat), Takeshi Abe, Tor Lillqvist (Collabora), Caolán McNamara (Red Hat), Michaël Lefèvre, and many others.

Performance improvements

Performance is one of those very hard to see things, that is nevertheless viscerally felt: "why am I still waiting ?". There are a number of rather encouraging performance improvements by different people in LibreOffice 4.4 that are worth noticing.

Autocorrect performance

For reasons that elude me, some people like to have huge auto-correct lists. These are stored as zipped XML. Daniel Sikeler (Munich) put some lovely improvements into the loading of these. In particular he discovered that we were re-parsing our BlockList.xml a large number of times; fixing this made a big difference. Combining that with switching to the threaded & improved FastParser yielded a further win. The auto-correct list is loaded after the first key-press, so getting this from 4.3 seconds down to 1.5 seconds (for huge correction lists) is a big win.
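The BlockList.xml fix boils down to parsing once and reusing the result; a toy sketch (function names are hypothetical, and a dict stands in for the real XML parse):

```python
# Sketch of the caching fix: parse the block list at most once per file
# instead of re-parsing it on every lookup.
parse_count = 0

def parse_block_list(path):
    """Stand-in for unzipping and parsing BlockList.xml."""
    global parse_count
    parse_count += 1
    return {"teh": "the", "recieve": "receive"}

_cache = {}

def lookup(path, word):
    if path not in _cache:            # parse at most once per file
        _cache[path] = parse_block_list(path)
    return _cache[path].get(word, word)

for word in ["teh", "recieve", "hello", "teh"]:
    lookup("BlockList.xml", word)
```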

Image management

While profiling saving in various file formats, it was discovered that we frequently swap in (i.e. re-load and de-compress) images - this of course takes significant CPU time, particularly since we then immediately continue to preserve the (original) data in the file. In some cases this was taking a large proportion of the save time for large image-filled presentations. Thanks to Zolnai Tamás (Collabora) for cleaning up and fixing this, as well as hunting perennial image loss issues.

Fast Serializer

As a general rule, any class named 'Fast' in the inherited OpenOffice code is a horrible misnomer. Many thanks to Matúš Kukan (Collabora) for fixing this. We discovered that 25% of the save time of large XLSX sheets was consumed in the Fast Serializer, which did a staggering 9.9 million system calls, each writing some tiny fragment of an XML attribute: separate writes for opening elements, element names, attribute names, namespaces etc. Matúš reduced this to 76k calls to do the same thing, a 99% decrease. Quite apart from the system-call overhead, we reduced cachegrind CPU cycles for 'SaveXML' from over 12bn to under 3bn for a simple sample.
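The buffering idea can be sketched as follows (an invented model, not the real serializer; the write counter stands in for system calls):

```python
# Sketch: instead of one OS write per tiny XML fragment, accumulate
# fragments and flush in large chunks, collapsing millions of system
# calls into a handful.
class BufferedSerializer:
    def __init__(self, flush_size=64 * 1024):
        self.buffer = []
        self.buffered = 0
        self.flush_size = flush_size
        self.writes = 0              # stands in for actual system calls

    def write(self, fragment):
        self.buffer.append(fragment)
        self.buffered += len(fragment)
        if self.buffered >= self.flush_size:
            self.flush()

    def flush(self):
        if self.buffer:
            self.writes += 1         # one big write instead of many tiny ones
            self.buffer = []
            self.buffered = 0

ser = BufferedSerializer()
for i in range(100_000):             # many tiny attribute/element fragments
    ser.write("<c r='A%d'/>" % i)
ser.flush()
```

Here 100,000 fragment writes collapse into a few dozen flushes, which is the same order-of-magnitude reduction described above.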

Bundle libjpeg-turbo

It has been known for many years that JPEG-turbo provides superior de-compression performance - "In the most general terms, libjpeg-turbo is 2.1 - 5.3x as fast as libjpeg v6b and 2.0 - 5.8x as fast as libjpeg v8d.". Naturally Linux vendors use the system packaged libjpeg, but when we distribute on Windows - we now bundle a 2x speed-up in the form of libjpeg-turbo - thanks to Matúš Kukan (Collabora) with some cleanups from Stephan Bergmann (Red Hat). Volunteers to make jpeg-turbo integrate nicely on Mac appreciated.

Mail merge performance

Mail-merge works by building a huge document containing the result of all the mails to be printed / merged into a single file. The wisdom of this is highly debatable, but nevertheless thanks to Lubos Lunak & Miklos Vajna (both Collabora for Munich) who put some significant effort in to very substantially accelerate large document merge, in some cases by several orders of magnitude. Sadly OpenOffice.org took a major regression here in version 3.3, and that is now comprehensively fixed. This turns a 2000 record mail-merge from a matter of hours down to a few minutes.

Calc Performance

There were a number of rather pleasant performance wins in this release of LibreOffice, which cumulatively have rather a helpful effect.

Range dependency re-work

For previous LibreOffice releases Kohei Yoshida (Collabora) spent a big block of time unifying runs of similar formulae into FormulaGroups - that fill down a large span of a column - since this is a common case for large data sets. This allowed a large memory reduction, and lots of great data sharing. However, dependency management was de-coupled from this and was still performed per cell. That is particularly expensive if you consider a range reference that is common to the whole formula group: resulting in lots of setup and tear-down cost, essentially to notify the entire formula group. In 4.4 Calc adds a listener type that is tailored for these formula groups - potentially turning tens of thousands of complex data structure entries into a single entry. This saves a large chunk of memory and a lot of CPU time walking lists; it also saves a ton of time when broadcasting changes. There is plenty more work to be done to extend this, and ideally in future we should use the same approach for single-cell references as well. Thanks too to Eike Rathke (Red Hat) and Markus Mohrhard (Collabora) for some associated fixes.
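A toy model of the listener change (invented classes; real Calc structures are far richer) shows why one entry per group is so much cheaper than one per cell:

```python
# Toy model: a broadcaster with one listener entry per formula group
# versus one entry per cell in the group.
class Broadcaster:
    def __init__(self):
        self.listeners = []

    def notify(self):
        for listener in self.listeners:
            listener()

group_size = 10_000
recalculated = [0]

# Per-cell approach: group_size separate listener entries to set up,
# walk and tear down.
per_cell = Broadcaster()
for _ in range(group_size):
    per_cell.listeners.append(
        lambda: recalculated.__setitem__(0, recalculated[0] + 1))

# Group approach: a single entry that recalculates the whole group.
grouped = Broadcaster()
grouped.listeners.append(
    lambda: recalculated.__setitem__(0, recalculated[0] + group_size))

grouped.notify()                     # one list entry, whole group covered
```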

Script type optimizations

For various reasons, detecting the script type of a cell is an expensive operation: is it some Asian text, complex text or simple? This affects the font, sizing & various metrics. Kohei Yoshida (Collabora) discovered that in several common operations - copying/pasting large chunks of data - this work was being needlessly re-done, and removed that cost. Similarly, for simple data types with standard formatting on, e.g., a large span of doubles, it was possible to significantly simplify the calculation of script types.
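The uniform-run shortcut can be illustrated like this (a hypothetical sketch; real detection inspects Unicode script classes, not Python types):

```python
# Sketch: for a run of cells with identical simple content, detect the
# script type once instead of once per cell.
detect_calls = 0

def detect_script_type(value):
    """Stand-in for the expensive per-cell script-type detection."""
    global detect_calls
    detect_calls += 1
    return "simple" if isinstance(value, float) else "complex"

def script_types_for_run(values):
    # If the run is uniform (e.g. a span of plain doubles with standard
    # formatting), one detection covers every cell in it.
    if values and all(type(v) is type(values[0]) for v in values):
        return [detect_script_type(values[0])] * len(values)
    return [detect_script_type(v) for v in values]

run = [1.0] * 5000                   # a span of plain doubles
types = script_types_for_run(run)
```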

Chart deferred re-rendering

Another area that (still) causes some grief is that whenever a data range changes which a chart depends on, the entire chart is re-generated. That involves tearing down a lot of drawing shapes and re-creating them, which in the case of text is particularly expensive. Kohei Yoshida (Collabora) implemented a great optimization to defer this work until the chart is visible. This should have a pleasant effect on editing time for large data sets which are charted on many other sheets, and also for macros operating on many charts.
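Deferred re-rendering is essentially a dirty flag plus render-on-show; a minimal sketch (invented class, not the real chart code):

```python
# Sketch: mark the chart dirty on every data change, but only rebuild
# its drawing shapes when it actually becomes visible.
class Chart:
    def __init__(self):
        self.dirty = True
        self.renders = 0

    def data_changed(self):
        self.dirty = True            # cheap: just record that work is needed

    def show(self):
        if self.dirty:               # expensive rebuild happens at most once
            self.renders += 1
            self.dirty = False

chart = Chart()
for _ in range(1000):                # e.g. a macro touching the range 1000x
    chart.data_changed()
chart.show()
```

A thousand data changes on a hidden chart thus cost one rebuild instead of a thousand.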

Getting involved

I hope you get the idea that more developers continue to find a home at LibreOffice and work together to complete some rather significant work both under the hood, and also on the surface. If you want to get involved there are plenty of great people to meet and work alongside. As you can see, individuals make a huge impact on the diversity of LibreOffice (the colour legends on the right should be read left to right, top to bottom, which maps to top down in the chart):

Graph showing individual code committers per month

And also in terms of diversity of code commits, we love to see the unaffiliated volunteers' contributions by volume, though clearly the volume and balance change with the season, release cycle, and volunteers' vacation / business plans:

Graph of number of commits per month by affiliation

Naturally we maintain a list of small, bite-sized tasks which you can use to get involved at our Easy Hacks page, with simple build / setup instructions. It is extremely easy to build LibreOffice, and each easy hack should have code pointers and be a nicely self-contained task that is easy to solve. In addition, some of them are really nice-to-have features or performance improvements. Please do consider getting stuck in with something.

Another thing that really helps is running pre-release builds and reporting bugs: just grab and install a pre-release and you're ready to contribute alongside the rest of the development team.

Conclusion

LibreOffice 4.4 is the next in a series of releases that incrementally improve not only the features, but also the foundation of the Free Software office suite. It is of course not perfect yet, this is just the first in a long series of monthly 4.4.x releases which will bring a stream of bug fixes and quality improvements over the next months as we start working in parallel on LibreOffice 4.5.

I hope you enjoy LibreOffice 4.4.0. Thanks for reading, don't forget to check out the user visible feature page, and thank you for supporting LibreOffice.

Raw data for many of the above graphs is available.

2015-01-29 Thursday

  • Quick mail check, morning off - out for a walk with J. along a rather perplexing Devil's Dyke nearby - apparently each end used to be an impassable fen / swamp. Nice to wander along.

How Do I…

I’ve struggled for some time with long-form tutorial style documentation for various bits of things in our platform. It feels out of place in the reference documentation (since it’s not reference documentation) and often it doesn’t fit neatly into one module or another.

In 2013 the GNOME foundation sponsored my attendance at OpenHelp and the documentation hackfest in Cincinnati. We talked about this problem for a while and I laid out a few simple criteria that I had at the time for making it less painful to write docs:

  • must be a non-XML, markdown-style language
  • must not involve using version control tools (git, etc.)
  • must not involve getting patches reviewed
  • needs to go online instantly (and not after the next tarball is released)

The best possible solution that we could think up at the time was to make use of the wiki to launch a new experiment called “HowDoI”. A HowDoI is a wiki page that describes how to make use of a specific GNOME technology or platform feature. The target audience is generally developers who know their way around but are not yet familiar with a particular new feature, or for those looking for the latest “best practice”.

There is a How Do I HowDoI? page that you can read for more information.

A couple of years on, this has been a moderate success. We have HowDoI pages on a reasonable range of important topics and they have been very popular with the people who have used them.

In general, it is my opinion that we should be aiming to write these pages for new technologies as they appear in GNOME. I just wrote one that makes use of the new type declaration macros, for example.

If you didn’t know about these, check them out — they contain some helpful hints. If you did know about these, and you are writing new GNOME technologies, please write one!


Going to FOSDEM 2015

Tomorrow I am flying to Brussels to attend FOSDEM for the 8th time!
It is amazing to see how much the event grew in these 8 years and I am looking forward to having another great weekend of interesting presentations, meeting old friends and sipping tasty beer.

I need to thank CERN for making this trip possible and if you want to find out about my current project there (soon to be announced), do let me know.

See you in Brussels!

FOSDEM

January 28, 2015

The GNOME Infrastructure Apprentice Program

Many times I have seen someone join the #sysadmin IRC channel requesting to join the team, spending around 5 minutes trying to explain what their skills and knowledge were and why they felt they were the right person for the position. And it was always very disappointing for me having to reject all these requests, as we just didn’t have the infrastructure in place to let new people join the rest of the team with limited privileges.

With the introduction of FreeIPA, more fine-grained ACLs (and hiera-eyaml-gpg for securing tokens, secrets, passwords out of Puppet itself) we are so glad to announce the launch of the “GNOME Infrastructure Apprentice Program” (from now till the end of the post just “Program”). If you are familiar with the Fedora Infrastructure and how it works you might know what this is about already. If you don’t please read further ahead.

The Program will allow apprentices to join the Sysadmin Team with a limited set of privileges, which mainly consist in being able to access the Puppet repository and all the stored configuration files that run the machines powering the GNOME Infrastructure every day. Once approved to the Program, apprentices will be able to submit patches for review to the team and finally see their work merged into the production environment if the proposed changes match expectations and address review comments.

While the Program is open to everyone to join, we have some prerequisites in place. The interested person should be:

  1. Part of an existing FOSS community
  2. Familiar with how a FOSS Project works behind the scenes
  3. Familiar with popular tools like Puppet, Git
  4. Familiar with RHEL as the OS of choice
  5. Familiar with popular Sysadmin tools, software and procedures
  6. Eager to learn new things, have constructive discussions with the team, and provide feedback and new ideas

If you feel you have all the needed prerequisites and would like to join, follow these steps:

  1. Subscribe to the gnome-infrastructure and infrastructure-announce mailing lists
  2. Join the #sysadmin IRC channel on irc.gnome.org
  3. Send a presentation e-mail to the gnome-infrastructure mailing list stating who you are, what your past experience is, and what your plans as an apprentice are
  4. Once the presentation has been sent, an existing Sysadmin Team member will evaluate your application and follow up with you, introducing you to the Program

More information about the Program is available here.

Detecting fake flash

I’ve been using F3 to check my flash drives, and this is how I discovered my drives were counterfeit. It seems to me this kind of feature needs to be built inside gnome-multi-writer itself to avoid sending fake flash out to customers. Last night I wrote a simple tool called gnome-multi-writer-probe which does the following few things:

* Reads the existing data from the drive into RAM, in 32KB chunks every ~32MB
* Writes random 32KB blocks every ~32MB, also keeping them in RAM
* Resets the drive
* Reads all the 32KB blocks back, from slightly different addresses and sizes, and compares them to the random data in RAM
* Writes all the saved data back to the drive.
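
The real probe is written in C inside gnome-multi-writer, but the write-and-verify round trip described above can be sketched in a few lines. This is a simplified illustration against an in-memory “drive” (a bytearray stand-in, not real device I/O) using the chunk and stride sizes from the list; it skips the drive reset and the slightly-different-addresses trick:

```python
import os

CHUNK = 32 * 1024           # 32KB test blocks
STRIDE = 32 * 1024 * 1024   # one test block every ~32MB

def probe(drive: bytearray) -> bool:
    """Write random blocks across the drive, read them back and compare,
    then restore the original contents."""
    offsets = range(0, max(len(drive) - CHUNK, 0), STRIDE)
    # Step 1: save the existing data at each test offset.
    saved = {off: bytes(drive[off:off + CHUNK]) for off in offsets}
    # Step 2: generate random blocks and write them out.
    random_blocks = {off: os.urandom(CHUNK) for off in offsets}
    for off, block in random_blocks.items():
        drive[off:off + CHUNK] = block
    # Step 3: re-read every block and compare against what we wrote.
    ok = all(bytes(drive[off:off + CHUNK]) == block
             for off, block in random_blocks.items())
    # Step 4: put the original data back.
    for off, block in saved.items():
        drive[off:off + CHUNK] = block
    return ok

# A genuine 64MB in-memory "drive" passes and is left untouched:
drive = bytearray(64 * 1024 * 1024)
assert probe(drive)
```

On a fake drive, the wrapped-around writes clobber earlier test blocks, so the comparison in step 3 fails.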

It only takes a few seconds on most drives. It also tries to be paranoid, and saves the data back to the drive as best it can when it encounters an error. That said, please don’t use this tool on any drives that have important data on them; assume you’ll have to reformat them after using this tool. Also, it’s probably a really good idea to unmount any drives before you try this.

If you’ve got access to gnome-multi-writer from git (either from jhbuild, or from my repo) then please could you try this:

sudo gnome-multi-writer-probe /dev/sdX

Where sdX is the USB drive you want to test. I’d be interested in the output, and especially interested if you have any fake flash media you can test this with. Either leave a comment here, grab me on IRC or send me an email. Thanks.

GNOME Docs in Cambridge: day two

Working hard (or lazing around)

Day two of the hackfest saw more progress…

Application and desktop help

I worked on merging new games documentation written by Rashi Aswani, and on fixing some of our 100-odd bugs against application help, which Petr Kovar has continued triaging.

Jim Campbell started refactoring Files (nautilus) desktop help as the style of the pages was a bit outdated. It now looks awesome.

In the meantime, Jana Švárová continued powering through the feedback.

Licensing

gedit documentation saw some licensing improvements thanks to Jim. A number of the help pages had previously been published without a license which is something that the team has been fixing over the last few years. Adding the license after the pages have been written is a bit of an arduous task. Progress has been slow but steady.

Developer Documentation

Bastian Ilsø and David King made further progress on gnome-devel-docs. Bastian made improvements to the first user experience of writing an application using the platform demos and learnt the importance of validating the XML.

yelp

Around August 2014, the documentation team started accepting emailed feedback about the documentation from help.gnome.org. It has been quite a success and yelp will see this feature as soon as Shaun McCance can build it.

Mallard

Shaun also improved projectmallard.org, the home of Mallard, and furthered the development of DuckType, a lightweight Markdown-like syntax for Mallard.

The end

We finished the day with lovely cream tea, which caused a number of altercations due to cultural and regional conflicts.



android-galaxyzoo: Superficial porting to Android 5.0 (Material design)

Here are some notes about my experience adapting android-galaxyzoo to Material design for Android 5.0 (Lollipop) though I only used the most superficial parts of Material design.

AppCompat v21

Android 5.0 (Lollipop) has a new UI theme and some new APIs. However, for the next few years, almost everyone will use the slightly awkward AppCompat v21 compatibility API instead to achieve most of the same behavior on older devices too. Chris Banes wrote up a nice overview of AppCompat v21, some of which I mention here again for completeness.

I’m using Gradle, as should you, so I added this to the dependencies block in my app/build.gradle file. You’ll want to use the latest version.

compile "com.android.support:appcompat-v7:21.0.3"

Theme

First, I switched from the dark Holo theme to the (AppCompat) dark Material theme by changing the parent theme in my styles.xml. See the Toolbar section below about the use of the “.NoActionBar” versions of these themes.

- <style name="AppTheme" parent="android:Theme.Holo">
+ <style name="AppTheme" parent="Theme.AppCompat.NoActionBar">

If you were using the light theme, that would be:

- <style name="AppTheme" parent="android:Theme.Holo.Light">
+ <style name="AppTheme" parent="Theme.AppCompat.Light.NoActionBar">

Note that we don’t use the android: prefix with the AppCompat theme, because the theme is bundled directly into our app via the appcompat-v7 library.

I then specified the standard colorPrimary and colorAccent colors, along with some more shenanigans to get the right text and icon colors in my toolbar.

I also used the TextAppearance_AppCompat_* widget styles instead of the regular textAppearance* style attributes, because that’s recommended in the Typography section of the official “Implementing Material Design in Your Android app” blog entry. However, I didn’t notice any difference in appearance, and I wonder why we wouldn’t just get the correct styles from the new overall theme.

I actually created a base style and two derived styles, to support Transitions – see below.

Toolbar

The new Toolbar widget replaces the ActionBar, though the documentation doesn’t actually say that yet. Generically, they are called the “App Bar” in the Material Design document. I’m not sure that I really got any benefit from using it because my App Bar doesn’t do anything special, but I wanted to use the latest API.

To use Toolbar instead of ActionBar,  you should derive from the .NoActionBar version of the theme, such as Theme.AppCompat.NoActionBar, though I used the regular Theme.AppCompat for a long time without noticing any difference.

Then you’ll want to add a Toolbar widget to the layout XML files for every activity. I did that by creating a toolbar.xml file:

<?xml version="1.0" encoding="utf-8"?>

<android.support.v7.widget.Toolbar
    xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:app="http://schemas.android.com/apk/res-auto"
    android:id="@+id/toolbar"
    android:layout_width="match_parent"
    android:layout_height="wrap_content"
    android:background="@color/color_primary">
</android.support.v7.widget.Toolbar>

and then included it from all the activity layouts like so:

<include layout="@layout/toolbar" />

I also specified the app:theme and app:popupTheme to get the right text and icon colors in my toolbar.

To use this toolbar as the App Bar, you need to derive from ActionBarActivity and call its setSupportActionBar() method. I did that in a utility function, like so:

Toolbar toolbar = (Toolbar) activity.findViewById(R.id.toolbar);
activity.setSupportActionBar(toolbar);

RecyclerView

Android 5.0 adds the RecyclerView widget, available for older API levels via the recyclerview-v7 support library. For instance, I added this to the dependencies block in my app/build.gradle file. You’ll want to use the latest version.

compile 'com.android.support:recyclerview-v7:21.0.0'

RecyclerView apparently replaces ListView and GridView, though the documentation doesn’t yet actually say that, adding confusion for new developers. I replaced my use of GridView with RecyclerView to get support for simple Material Design transitions.

Unfortunately, RecyclerView has no real cursor support, so it’s hard to use it to view data from a ContentProvider. This is particularly annoying because the API of ListView and GridView previously pushed us towards moving code into a ContentProvider. My own Cursor-based RecyclerView.Adapter is terribly inefficient and totally unsuitable for a large number of items.

There’s an awful lack of practical documentation or example code for RecyclerView, even for simple tasks like responding to a click. Until there’s an easier way, you’ll need some tedious boilerplate code to derive your RecyclerView.ViewHolder from View.OnClickListener and call setOnClickListener() on one of your child views.

Transitions

I wanted the typical move-and-scale transition that we see in Material Design apps, so that clicking on one of many items makes its image expand and move into the subsequent detail activity, and shrink back again when you go back.

To achieve this, I had to specify various settings in my theme, but only Android 5.0 (Lollipop) devices support these transition items, so I created a base theme in res/styles.xml, and derived from it:

<?xml version="1.0" encoding="utf-8"?>
<resources>

    <style name="AppTheme" parent="AppTheme.Base" />

    <style name="AppTheme.Base" parent="Theme.AppCompat.NoActionBar">
        <item name="colorPrimary">@color/color_primary</item>
        ...

I then created a styles.xml file just for API level 21, by putting it in res/values-v21/styles.xml, where I derived from the base theme again:

<?xml version="1.0" encoding="utf-8"?>
<resources>
    <style name="AppTheme" parent="AppTheme.Base">
        <item name="android:windowContentTransitions">true</item>
        <item name="android:windowAllowEnterTransitionOverlap">true</item>
        <item name="android:windowAllowReturnTransitionOverlap">true</item>

        <!-- specify shared element transitions -->
        <item name="android:windowSharedElementEnterTransition">
            @transition/change_image_transform</item>
        <item name="android:windowSharedElementExitTransition">
            @transition/change_image_transform</item>
    </style>
</resources>

I then defined that change_image_transform transition in my res/transition/change_image_transform.xml file, like so:

<?xml version="1.0" encoding="utf-8"?>
<transitionSet>
    <changeBounds/>
    <changeImageTransform/>
</transitionSet>

To actually use this transition on images, I needed to specify android:transitionName on the two ImageViews in the layout XML files for the two Activities (Fragments in my case).

Mostly this was all voodoo which I put together gradually after finding clues scattered around the internet. I haven’t found a good official example that shows this.

Unfortunately, the transition doesn’t seem to work when the user presses the Up button on the toolbar instead of using the standard Back button, even though that’s indistinguishable from Back for most users in most activities.

Metrics and Keylines

I made a fair effort to adapt my margins and padding to fit in with the Material Design Metrics and Keylines, which wasn’t too hard.

Unfortunately, the standard Android Button’s appearance is just as usless for Material design as it was for the Holo theme. It has a fake margin around its inside edges, which is part of its background graphic rather than any adjustable margin or paddng property.

So, to make the button’s sides actually flush with other widgets, and to position them properly on the layout grid, I had to specify a custom background image or color. But then I lost the nice Material Design ripple effect. I hope someone knows how to do this properly.

Activity classes without Toolbar support

There are a few helpful derived activity classes, such as AccountAuthenticatorActivity and PreferenceActivity, but these haven’t been changed to derive from ActionBarActivity, so you can’t call setSupportActionBar() on them. They can’t be changed without breaking compatibility, so you’ll have to reimplement them in your code. It’s not a lot of code, but it’s an unpleasant developer experience.

January 27, 2015

Can or Can’t?


Can read or can’t eat books?

What I love about open source is that it’s a “can” world by default. You can do anything you think needs doing and nobody will tell you that you can’t. (They may not take your patch but they won’t tell you that you can’t create it!)

It’s often easier to define things by what they are not or what we can’t do. And the danger of that is you create a culture of “can’t”. Anyone who has raised kids or animals knows this. “No, don’t jump.” You can’t jump on people. “No, off the sofa.” You can’t be on the furniture. “No, don’t lick!” You can’t slobber on me. And hopefully when you realize it, you can fix it. “You can have this stuffed animal (instead of my favorite shoe). Good dog!”

Often when we aren’t sure how to do something, we fill the world with can’ts. “I don’t know how we should do this, but I know you can’t do that on a proprietary mailing list.” “I don’t know how I should lose weight, but I know you can’t have dessert.” I don’t know. Can’t. Don’t know. Can’t. Unsure. Can’t.

Watch the world around you. Is your world full of can’ts or full of “can do”s? Can you change it for the better?

Tue 2015/Jan/27

  • An inlaid GNOME logo, part 2

    Esta parte en español

    To continue with yesterday's piece — the amargoso board I glued is now dry, and it is time to flatten it. We use a straightedge to see how bad it is on the "good" side.

    Not flat

    We use a jack plane with a cambered blade. There is a slight curvature to the edge; this lets us remove wood quickly. We plane across the grain to remove the cupping of the board. I put some shavings in strategic spots between the board and the workbench to keep the board from rocking around, as its bottom is not flat yet.

    Cambered iron Cross-grain planing

    We use winding sticks at the ends of the board to test if the wood is twisted. Sight almost level across them, and if they look parallel, then the wood is not twisted. Otherwise, plane away the high spots.

    Winding sticks Not twisted

    This gives us a flat board with scalloped tracks. We use a smoothing plane to remove the tracks, planing along the grain. This finally gives us a perfectly flat, smooth surface. This will be our reference face.

    Scalloped board Smoothing plane Smooth, flat surface

    On that last picture, you'll see that both halves of the board are not of the same thickness, and we need to even them up. We set a marking gauge to the thinnest part of the boards. Mark all four sides, using the flat side as the reference face, so we have a line around the board at a constant distance to the reference face.

    Gauging the thinnest part Marking all around Marked all around

    Again, plane the board flat across the grain with a jack plane and its cambered iron. When you reach the gauged line, you are done. Use a smoothing plane along the grain to make the surface pretty. Now we have a perfectly flat board of uniform thickness.

    Thicknessing with the jack plane Smoothing plane Flat and uniform board

    Now we go back to the light-colored maple board from yesterday. First I finished flattening the reference face. Then, I used the marking gauge to put a line all around at about 5mm to the reference face. This will be our slice of maple for the inlaid GNOME logo.

    Marking the maple board

    We have to resaw the board in order to extract that slice. I took my coarsest ripsaw and started a bit away from the line at a corner, being careful to sight down the saw to make it coplanar with the lines on two edges. It is useful to clamp the board at about 45 degrees from level.

    Starting to resaw at a corner

    Once the saw is into the corner, tilt it down gradually to lengthen the kerf...

    Kerfing one side

    Tilt it gradually the other way to make the kerf on the other edge...

    Kerfing the other side

    And now you can really begin to saw powerfully, since the kerfs will guide the saw.

    Resawing

    Gradually extend the cut until the other corner, and repeat the process on all four sides.

    Extending the cut Resawing

    Admire your handiwork; wipe away the sweat.

    Resawn slice

    Plane to the line and leave a smooth surface. Since the board is too thin to hold down with the normal planing stops on the workbench, I used a couple of nails as planing stops to keep the board from sliding forward.

    Nail as planing stop

    Now we can see the contrast between the woods. The next step is to glue templates on each board, and start cutting.

    Contrast between woods

Re: Scammers at promo-newa.com

Beware of promo-newa.com: they scammed my colleague and friend Richard. As someone with experience importing from China, I know how scary and risky it can be, so I completely sympathize with him. Apparently they sent hacked 96MB flash drives that reported themselves as 1GB flash drives.

Let's make sure the internet is filled with references to this scam. Also, if you live in Shenzhen and/or can think of any way of helping him, that'd be really nice.

At the very least, make sure you share this post around in your preferred social media wall!

DX Hackfest in Cambridge

I’m thrilled to spend this week attending the 2015 edition of the Developer Experience hackfest! I arrived in Cambridge Sunday morning and I’ll be hacking away with my GNOME friends until Thursday. My main focus this year is on application bundling and sandboxing: I’ve been giving Alex’s xdg-app a spin, and even though there’s still a lot to do in terms of tooling, it looks extremely promising.

I also spent some time working on a proposal for a preview widget in GTK+, which is being discussed on the GTK+ mailing list as I write this.

Stay tuned to Planet GNOME for more updates about this week’s hackfest.

I would like to thank the GNOME Foundation for contributing to my travel expenses, and my employer Endless for letting me attend this event.


Scammers at promo-newa.com

tl;dr Don’t use promo-newa.com, they are scammers that sell fake flash.

Longer version: For the ColorHug project we buy a lot of the custom parts direct from China at a fraction of the price available to us in the UK, even with import tax considered. It would be impossible to produce such a low cost device and still make enough money to make it worth giving up our evenings and weekends. This often means sending thousands of dollars to sketchy-looking companies willing to take on small (to them!) custom orders of a few thousand parts.

So far we’ve been very lucky, until last week. I ordered 1000 customized 1GB flash drives to use as a LiveUSB image rather than using a LiveCD. I checked out the company as usual, and ordered a sample. The sample came back good quality, with 1GB of fast flash. Payment in full was sent, which isn’t unusual for my other suppliers in China.

Fast forward a few weeks. 1000 USB drives arrived, which look great. Great, until you start using them with GNOME MultiWriter, which kept throwing validation warnings. Using the awesome F3 and a few remove-insert cycles later, the f3probe tool told me the flash chip was fake, reporting the capacity to be 1GB when it was actually 96MB looped around 10 times.
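
The wraparound that f3probe detects is simple: a counterfeit controller maps the whole advertised address space onto the small amount of real flash, so writes past the real capacity silently land on earlier blocks. The following is not f3probe’s actual algorithm, just a simplified sketch of the idea; the FakeFlash class is a made-up stand-in for a drive, and the detection assumes the real capacity is a multiple of the probe step:

```python
class FakeFlash:
    """A counterfeit drive: advertises `claimed` bytes, but wraps every
    access into only `real` bytes of actual flash."""
    def __init__(self, real: int, claimed: int):
        self.flash = bytearray(real)
        self.claimed = claimed

    def write(self, addr: int, data: bytes) -> None:
        for i, b in enumerate(data):
            self.flash[(addr + i) % len(self.flash)] = b

    def read(self, addr: int, n: int) -> bytes:
        return bytes(self.flash[(addr + i) % len(self.flash)]
                     for i in range(n))

def usable_capacity(drive, step: int = 1024 * 1024) -> int:
    """Write a marker at address 0, then scan forward: the first address
    that reads the marker back is where addressing wrapped around."""
    marker = b"F3PROBE!"
    drive.write(0, marker)
    for addr in range(step, drive.claimed, step):
        if drive.read(addr, len(marker)) == marker:
            return addr   # this address aliases physical address 0
    return drive.claimed  # no wraparound found: drive is genuine

# A "1GB" stick with only 96MB of real flash behind it:
fake = FakeFlash(real=96 * 1024 * 1024, claimed=1024 * 1024 * 1024)
print(usable_capacity(fake) // (1024 * 1024))  # → 96
```

On a genuine drive no address ever aliases address 0, so the scan runs to the end and reports the full claimed capacity.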

Taking the drives apart you could also see the chip itself was different from the sample, and the plastic molding and metal retaining tray was a lower quality. I contacted the seller, who said he would speak to the factory later that day. The seller got back to me today, and told me that the factory has produced “B quality drives” and basically, that I got what I paid for. For another 1600USD they would send me the 1GB ICs, which I would have to switch in the USB units. Fool me once, shame on you; fool me twice, shame on me.

I suppose people can use the tiny flash drives to get the .icc profile off the LiveCD image, which was always a stumbling block for some people, but basically the drives are worthless to me as LiveUSB devices. I’m still undecided whether to include them in the ColorHug box; i.e. is a free 96MB drive better than them all going into landfill?

As this is China, I understand all my money is gone. The company listing is gone from Alibaba, so there’s not a lot I can do there. So that other people can avoid the same mistake, I’ve listed all the details here, which will hopefully become googleable:

Promo-Newa Electronic Limited(Shenzhen)
Wei and Ping Group Limited(Hongkong)  

Office: Building A, HuaQiang Garden, North HuaQiang Road, Futian district, Shenzhen China, 0755-3631 4600
Factory: Building 4, DengXinKeng Industrial Zone, JiHua Road,LongGang District, Shenzhen, China
Registered Address: 15/B—15/F Cheuk Nang Plaza 250 Hennessy Road, HongKong
Email: sales@promo-newa.com
Skype: promonewa

Your app is not a lottery ticket

Many app developers are secretly hoping to win the lottery. You know all those horrible free apps full of ads? I bet most of them were hoping to be the next Flappy Bird app. (The Flappy Bird author was making $50K/day from ads for a while.)

The problem is that when you are that focused on making millions, you are not focused on making a good app that people actually want. When you add ads before you add value, you’ll end up with no users no matter how strategically placed your ads are.

So, the secret to making millions with your app?

  • Find a need or problem that people have that you can solve.
  • Solve the problem.
  • Make your users awesome. Luke first sent me a pointer to Kathy Sierra’s idea of making your users awesome.  Instagram let people create awesome pictures. Then their friends asked them how they did it …
  • Then monetize. (You can think about this earlier but don’t focus on it until you are doing well.)

If you are a good app developer or web developer, you’ll probably find it easier to do well financially helping small businesses around you create the apps and web pages they need than you will trying to randomly guess what game people might like. (If you have a good idea for a game, that you are sure you and your friends and then perhaps others would like to play, go for it!)

G_DECLARE_{FINAL,DERIVABLE}_TYPE

… 7 years later.

This is a public service announcement.

Please stop writing this:

#define G_DESKTOP_APP_INFO(o) (G_TYPE_CHECK_INSTANCE_CAST ((o), G_TYPE_DESKTOP_APP_INFO, GDesktopAppInfo))
#define G_DESKTOP_APP_INFO_CLASS(k) (G_TYPE_CHECK_CLASS_CAST((k), G_TYPE_DESKTOP_APP_INFO, GDesktopAppInfoClass))
#define G_IS_DESKTOP_APP_INFO(o) (G_TYPE_CHECK_INSTANCE_TYPE ((o), G_TYPE_DESKTOP_APP_INFO))
#define G_IS_DESKTOP_APP_INFO_CLASS(k) (G_TYPE_CHECK_CLASS_TYPE ((k), G_TYPE_DESKTOP_APP_INFO))
#define G_DESKTOP_APP_INFO_GET_CLASS(o) (G_TYPE_INSTANCE_GET_CLASS ((o), G_TYPE_DESKTOP_APP_INFO, GDesktopAppInfoClass))

typedef struct _GDesktopAppInfo GDesktopAppInfo;
typedef struct _GDesktopAppInfoClass GDesktopAppInfoClass;

struct _GDesktopAppInfoClass
{
  GObjectClass parent_class;
};

GType g_desktop_app_info_get_type (void) G_GNUC_CONST;

and start writing this:

G_DECLARE_FINAL_TYPE(GDesktopAppInfo, g_desktop_app_info, G, DESKTOP_APP_INFO, GObject)

Thank you for your attention.

DevX Hackfest 2015

Yesterday I arrived in Cambridge to attend the DevX hackfest. Loads of good stuff going on; I am mostly focusing on trying to automatically integrate the hundreds of ignored pull requests we're getting on GitHub's mirror with Bugzilla. In the meantime there are loads of interesting discussions about sandboxing, Builder and docs, and Mallard balls being thrown all over the place and hitting my face (thanks Kat).

It is really nice to catch up with everyone. We went for dinner at a pretty good Korean place; I should thank Codethink for kindly sponsoring the dinner. Afterwards we went to The Eagle, apparently the pub where the discovery of DNA was celebrated and discussed.

And this morning we are celebrating Christian Hergert making it to the 50K stretch goal just before the end of the crowdfunding campaign for GNOME Builder.

I would like to thank my employer, Red Hat, for sponsoring my trip here too.

imgflo 0.3: GEGL metaoperations++

Time for a new release of imgflo, the image processing server and dataflow runtime based on GEGL. This iteration has been mostly focused on ironing out various workflow issues, including documentation, primarily so that the creatives in our team can be productive in developing new image filters/processing. Eventually this will also be an extension point for third parties on our platform.

By porting the PNG and JPEG loading operations in GEGL to GIO, we’ve added support for loading images into imgflo over HTTP or data URLs. The latter enables opening a local file through a file selector in Flowhub. Eventually we’d also like to support picking images from web services.

Loading a local file using HTML5 input type="file"

Another big feature is the ability to live-code new GEGL operations (in C) and load them. This works by sending the code over to the runtime, which then compiles it into a new .so file and loads it. Newly instantiated operations then use that revision of the code. We currently do not change the active operation of currently running instances, though we could.
Operations are never unloaded, due both to a GLib limitation and the general trickiness of guaranteeing that this is safe for native code. This is not a big deal, as this is a development-only feature and the memory growth is slow.

Live-coding new image processing operations in C

imgflo now supports showing the data going through edges, which is very useful to understand how a particular graph works.

Selecting edges shows the buffer at that point in the graph

Using Heroku, one can get started without installing anything locally. Eventually we might have installers for common OSes as well.

Get started with imgflo using Heroku

Vilson Viera added a set of new image filters to the server, inspired by Instagram. Vilson is also working on our image analytics pipeline, the other piece required for intelligent automatic- and semi-automatic image processing.

Various insta filters

GEGL has long supported meta-operations: operations which are built as a sub-graph of other operations. However, they had to be built programmatically using the C API, which limited tooling support, and their platform-specific nature made them hard to distribute.
Now GEGL can load such operations from the JSON format also used by imgflo (and several other runtimes). This lets one use operations built with Flowhub+imgflo in GIMP:


This makes Flowhub+imgflo a useful tool outside the web-based processing workflow it is primarily built for, too. The feature is available in GEGL and GIMP master as of last week, and will be released in GIMP 2.10 / GEGL 0.3.
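
Concretely, such a meta-operation is just a graph definition serialized as JSON. Below is a sketch of the general FBP-style graph shape used by Flowhub and imgflo (processes, connections, and exported ports); the component names, port names, and the filter itself are illustrative, not taken from a real shipped filter:

```json
{
  "properties": { "name": "my-filter" },
  "processes": {
    "blur":     { "component": "gegl/gaussian-blur" },
    "multiply": { "component": "gegl/multiply" }
  },
  "connections": [
    { "src": { "process": "blur", "port": "output" },
      "tgt": { "process": "multiply", "port": "input" } }
  ],
  "inports":  { "input":  { "process": "blur",     "port": "input" } },
  "outports": { "output": { "process": "multiply", "port": "output" } }
}
```

When loaded, a graph like this behaves as a single operation whose inputs and outputs are the exported inports/outports.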

 

Next iteration will be primarily about scaling out: both allowing multiple “apps” (including individual access to graphs and usage monitoring/quotas) to be served from a single service, and scaling performance horizontally. The latter will be critical when the ~20k users who have signed up start coming on board.
If you have an interest in using our hosted imgflo service outside of The Grid, get in contact.


January 26, 2015

Thanks for all the applications

Jobs at Red Hat
So I got a LOT of responses to my blog post about the open positions we have here at Red Hat working on Fedora and the Desktop. In fact, I got so many that it will probably take a bit of time before we can work through them all, so you might have to wait a little while before getting a response from us. Anyway, thank you to everyone who sent me their CV; much appreciated, and I'm looking forward to working with those of you we end up hiring!

Builder campaign closes in 13 hours
I want to make one last pitch for everyone to contribute to the Builder crowdfunding campaign. It has just passed 47,000 USD as I write this, which means we just need another 3,000 USD to reach the graphical debugger stretch goal. Don’t miss out on this opportunity to help this exciting open source project!

Mon 2015/Jan/26

  • An inlaid GNOME logo, part 1

    Esta parte en español

    I am making a special little piece. It will be an inlaid GNOME logo, made of light-colored wood on a dark-colored background.

    First, we need to make a board wide enough. Here I'm looking for which two sections of those longer/narrower boards to use.

    Grain matching pieces

    Once I am happy with the sections to use — similar grain, not too many flaws — I cross-cut them to length.

    Cross cutting

    (Yes, working in one's pajamas is fantastic and I thoroughly recommend it.)

    This is a local wood which the sawmill people call "amargoso", or bitter one. And indeed — the sawdust feels bitter in your nose.

    Once cut, we have two pieces of approximately the same length and width. They have matching grain in a V shape down the middle, which is what I want for the shape of this piece.

    V shaped grain match

    We clamp the pieces together and match-plane them. Once we open them like a book, there should be no gaps between them and we can glue them.

    Clamped pieces Match-planing Match-planed pieces

    No light shows between the boards, so there are no gaps! On to gluing. Rub both boards back and forth to spread the glue evenly. Clamp them, and wait overnight.

    No gaps! Gluing boards Clamped boards

    Meanwhile, we can prepare the wood for the inlaid pieces. I used a piece of soft maple, which is of course pretty hard — unlike hard maple, which would be too goddamn hard.

    Rough maple board

    This little board is not flat. Plane it cross-wise and check for flatness.

    Checking for flatness Planing

    Tomorrow I'll finish flattening this face of the maple, and I'll resaw a thinner slice for the inlay.

    Planed board

summing up 68

i am trying to build a jigsaw puzzle which has no lid and is missing half of the pieces. i am unable to show you what it will be, but i can show you some of the pieces and why they matter to me. if you are building a different puzzle, it is possible that these pieces won't mean much to you, maybe they won't fit or they won't fit yet. then again, these might just be the pieces you're looking for. this is summing up, please find previous editions here.

  • start-ups and emotional debt, i realize that many people who do successful start-ups say it was the best thing that ever happened to them. but they've also become different people, and they are not the same people they would have been if they had decided to pursue another course. they have different sets of relationships, different skills, different attitudes, and different desires. they really have no idea what kind of person they otherwise would have been become. recommended
  • waffling, i've always been very dubious about the idea of learning from people who have been successful. there's this whole cult of worshipping rich people, reading interviews with them, getting their opinions on things, trying to learn what made them successful. i think it's mostly nonsense. the thing is, if you just look at who the biggest earners are, it's almost entirely luck. the point is if you just look at successful business people, they will probably be confident, decisive, risk takers, aggressive at seizing opportunities, aggressive about growing the business quickly, etc. that doesn't mean that those are the right things to do. it just means that those are variance-increasing traits that give them a chance to be a big success
  • why don't software development methodologies work?, my own experience, validated by cockburn's thesis and frederick brooks in no silver bullet, is that software development projects succeed when the key people on the team share a common vision, what brooks calls "conceptual integrity." this doesn't arise from any particular methodology, and can happen in the absence of anything resembling a process. i know the feeling working on a team where everyone clicks and things just get done
  • 7 principles of rich web applications, the web remains one of the most versatile mediums for the transmission of information. as we continue to add more dynamism to our pages, we must ensure that we retain some of its great historical benefits while we incorporate new ones
  • "the road to wisdom? - well, it's plain and simple to express: err and err and err again but less and less and less", piet hein

Cambridge DX Hackfest

Like @ebassi and many others, I'm here at the DX Hackfest in Cambridge. Thanks Collabora for hosting us in your lovely office!

It's a bit cold for a laid-back hippy Californian like me, but good beer fixes that.

Lots to update you all on, and I hope to get to that soon, as I collect my thoughts. But in the meantime, there are about 20 hours left on the crowdfunding campaign. As I write this, we are at 46k. That means only 4k more to get external hardware support and a graphical debugger!

My primary goal for this hackfest is to start solidifying the designs around LibIDE, which will implement the core IDE features underneath Builder. It is sort of a meta-layer above tools like autotools, clang, hardware devices, debuggers, profilers, and such.

Lots to do, more updates soon :)

Cambridge GNOME DX Hackfest / Day 1

we’re here in sunny (and cold) Cambridge, in the offices kindly sponsored by Collabora.

attendees arrived on Sunday, and various discussions already started on the performance of the GLib slice allocator; GNOME Builder; and improvements of GTK+.

the evening continued at a local curry house, and ended with a pint at the pub — thus giving everyone the full English experience.

this morning is starting a bit slowly — evidently, the jet lag is hitting everyone really hard — but we’ll surely kick it up a notch as soon as a proper amount of caffeine enters our bodies.

I’m going to be working on a bunch of things, this week:

  • incorporating feedback from users of the OpenGL support in GDK
  • GSK, the GTK scene graph API
  • performance improvements in GLib/GObject

and I’ll also keep an eye on what GNOME Builder needs from GTK to improve the user experience.

I’d like to thank Collabora for giving us access to their office and their coffee machine, and to the GNOME Foundation to sponsor various attendees.

GNOME Docs in Cambridge: Day One

The Winter 2015 edition of GNOME docs hackfests is underway in Cambridge, UK, and the first day is in the books. We're making some good, initial progress. Thus far we've been able to update the status of Application Help on our wiki, triaged a lot of bug and docs-feedback reports, and have made some initial updates to platform developer documentation.

Application Help Status Review

Ekaterina Gerasimova and I discussed GNOME Application Help, reviewing the status of each application's help, and setting priorities for the upcoming release and for future releases. We want to make sure that our core set of applications is consistently covered for each release, and setting the right priorities will help us in this effort.

Based on this, Kat went through and updated the Application Help wiki page, bringing the status of each application's notes up to date and making it clearer where our priorities are focused.

Feedback and Bug Review, Plus Various Fixes

Jana Švárová focused on responding to comments that we receive on the documentation feedback mailing list. The mailing list is actually the receiving end of mail that users send via the GNOME help website. If users spot a problem in our help, they're able to click on a link at the bottom of the page, and let us know of the issue they're experiencing via a simple email.

We're finding it to be a useful resource. It's helping us to get feedback from users who don't know how to (or don't want to) use Bugzilla. Also, each email that they send us includes a reference to the help page that they were visiting, making it easier for us to see and fix the problem ourselves.

With regards to bugs, Petr Kovar did a great job of triaging bugs yesterday. Actually, because he triaged over 50 bugs in one day, I think that raises his status from that of a bug triager to that of a one-person Bug Medi-vac Unit. Good job, Petr.

I spent my time alongside Petr, triaging bugs and implementing fixes in the user docs that were referenced in the bugs. For me, personally, the user docs for GNOME and GNOME Shell will be my primary focus during this event.

As part of updating the user help, we're making sure to reference the latest bits by using jhbuild builds of GNOME Shell and GNOME applications. Of course, this also means that we occasionally run into new application bugs. Fortunately, David King is here to help us out in such situations. Yesterday he fixed issues in both GNOME Control Center and Nautilus, making our documentation tasks a bit easier.

GNOME Help Website and Developer Documentation Updates

Bastian Ilsø is focusing his attention on the GNOME help website, giving an initial review of the CSS on the site. Beyond that, though, he's focusing on the developer documentation as part of this hackfest. He arrived here later in the day yesterday, but is off to a good start.

His main focus for the hackfest is updating the GNOME Platform demos. This includes GTK code examples and small tutorials that help developers get oriented to GTK and GNOME development tools. Because he's coming into this from the perspective of a new GTK developer himself, he's able to identify areas that need to be fleshed out and made clearer for others who may be new to the platform as well. He and David King are working through this quite a bit as we begin day two.

Hosting and Sponsorship Thanks

All of this is possible because we have a good venue and have been able to travel here to work together. Many thanks to Collabora for providing the office space for this event, and kudos from the Documentation Project team for their helpful, clear kitchen cleanliness documentation:

Dishwasher flow chart

Additional thanks go out to the GNOME Foundation for sponsoring my travel to this hackfest. We are off to a steady and productive start, and are making good use of our time here.

GNOME Foundation Sponsorship Banner

DX hackfest 2015: day 1

It’s a sunny Sunday here in Cambridge, UK, and GNOMErs have been arriving from far and wide for the first day of the 2015 developer experience hackfest. This is a week-long event, co-hosted with the winter docs hackfest (which Kat has promised to blog about!) in the Collabora offices.

Today was a bit of a slow start, since people were still arriving throughout the day. Regardless, there have been various discussions, with Ryan, Emmanuele and Christian discussing performance improvements in GLib, Christian and Allan plotting various different approaches to new UI in Builder, Cosimo and Carlos silently plugging away at GTK+, and Emmanuele muttering something about GProperty now and then.

Tomorrow, I hope we can flesh out some of these initial discussions a bit more and get some roadmapping down for GLib development for the next year, amongst other things. I am certain that Builder will feature heavily in discussions too, and apps and sandboxing, now that Alex has arrived.

I’ve spent a little time finishing off and releasing Walbottle, a small library and set of utilities I’ve been working on to implement JSON Schema, which is the equivalent of XML Schema or RELAX-NG, but for JSON files. It allows you to validate JSON instances against a schema, to validate schemas themselves and, unusually, to automatically generate parser unit tests from a schema. That way, you can automatically test json-glib–based JsonReader/JsonParser code, just by passing the JSON schema to Walbottle’s json-schema-generate utility.

It’s still a young project, but should be complete enough to be useful in testing JSON code. Please let me know of any bugs or missing features!
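Walbottle itself is a C library, so to make the idea concrete here is a toy sketch in Python of what schema validation does. This is a hand-rolled checker covering only the `type` and `required` keywords; it is an illustration of the concept, not Walbottle's API.

```python
# Toy illustration of JSON Schema validation: check an instance against a
# schema's "type" and "required" keywords. NOT Walbottle's API; Walbottle
# is a C library, and real validators cover many more keywords.
import json

def validate(instance, schema):
    """Return a list of error strings; an empty list means the instance is valid."""
    errors = []
    expected = schema.get("type")
    type_map = {"object": dict, "array": list, "string": str, "integer": int}
    if expected in type_map and not isinstance(instance, type_map[expected]):
        errors.append(f"expected {expected}, got {type(instance).__name__}")
    if expected == "object" and isinstance(instance, dict):
        for key in schema.get("required", []):
            if key not in instance:
                errors.append(f"missing required property '{key}'")
    return errors

schema = json.loads('{"type": "object", "required": ["name"]}')
print(validate({"name": "walbottle"}, schema))  # []
print(validate({}, schema))                     # ["missing required property 'name'"]
```

A real validator (or Walbottle's generated unit tests) would exercise both valid and invalid instances like this, which is exactly what makes schema-driven test generation useful for json-glib parser code.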

Tomorrow, I plan to dive back into static analysis of GObject code with Tartan.

January 25, 2015

GNOME Calendar added to Fedora

I packaged Calendar for Fedora and today it was approved for inclusion in the Fedora repositories. I'm building it right now.

It will be included only for Fedora 22+.

Ekiga 5 – Progress Report

Current Status Ekiga 5 has progressed a lot lately. OpenHUB is reporting High Activity for the project. The main reason behind this is that I am again dedicating much of my spare time to the project. Unfortunately, we are again facing a lack of contributions. Most probably (among others) because the project has been […]

GNOME Builder added to Fedora

Thanks to Mathieu Bridon and David King, who worked on packaging Builder for Fedora. It will be included in Fedora 22 (right now it's in rawhide).

GNOME Bugzilla and You.

GNOME Logo

GNOME’s Bugzilla instance is old. Too old.

It will get upgraded in the next weeks.

You should help by playing with the test instance. Go read these instructions and do it!

Let me make one thing clear already: Dear GNOME community, you all owe Krzesimir, Olav and Andrea some ice cream and drinks.

More information to come. Stay tuned.

January 24, 2015

Link Pack #05

Lever Rukhin Photographs Los Angeles From His Car
Lever Rukhin shoots the sketchiest parts of Los Angeles from his car, taking a really unique perspective that helps you perceive what LA looks like from a car… An experience that is apparently common to all LA people. People drive too much in the US :-).

It’s a very interesting interview that goes well with his full site: Lev Rukhin.

What I love about this, besides the whole premise, is that Lev went the extra mile and actually hacked his car to make the images he wanted:

Phoblographer: It looks like many of these images have artificial lighting in them. What’s your gear setup, and how do you introduce so much light into the scene from your car?

Lever: About 9 months ago, I affixed a Mola beauty dish onto the roof rack of my ’75 Volvo and juice it with a profoto bi-tube. This takes a bit of practice, as making a turn changes the light completely, which I always try to keep balanced. The Canon 5D3 with a 24mm f1.4 is set up on a tripod. The strobe has allowed me to capture more detail as well as creating a somewhat surreal feel to the sets.

Lev Rukhin (http://www.levrukhin.com/)

The Invisible Woman: A conversation with Björk
Björk is that Icelandic singer we all hear about but never really pay much attention to because her music is too smart for our simple ears. In this interview she goes over how her latest album is a very personal work, and unexpectedly (?) ends up talking about how problematic it’s been to be a female auteur in her generation.

I have seen the same problem she denounces about people assuming that the male members of a team did all the work while the women just stuck to making coffee and sandwiches. I’ve worked with exceptional women that don’t get enough credit, but I’ve also worked with potentially exceptional women who don’t give themselves enough credit.

It’s a very interesting read, especially since it comes from someone who couldn’t be higher in the “art” food chain. Björk is god-damn Björk.

Only thing that bugs me is that Pitchfork decided to hold back most of the interview for publishing next month. I’ll try to go back and read it in full, but I wonder if the technique works for them or if perhaps they are missing the opportunity for a bigger impact. But I digress.

Pitchfork: The world has a difficult time with the female auteur.

B: I have nothing against Kanye West. Help me with this—I’m not dissing him—this is about how people talk about him. With the last album he did, he got all the best beatmakers on the planet at the time to make beats for him. A lot of the time, he wasn’t even there. Yet no one would question his authorship for a second. If whatever I’m saying to you now helps women, I’m up for saying it. For example, I did 80% of the beats on Vespertine and it took me three years to work on that album, because it was all microbeats—it was like doing a huge embroidery piece. Matmos came in the last two weeks and added percussion on top of the songs, but they didn’t do any of the main parts, and they are credited everywhere as having done the whole album. [Matmos’] Drew [Daniel] is a close friend of mine, and in every single interview he did, he corrected it. And they don’t even listen to him. It really is strange.

In Defense of the Selfie Stick
Miguel proposes a different take on the consequences of the selfie stick:

When you ask someone to take a picture of you, technically, they are the photographer, and they own the copyright of your picture.

(…)

All of a sudden, your backpacking adventure in Europe requires you to pack a stack of legal contracts.

Now your exchange goes from “Can you take a picture of us?” to “Can you take a picture of us, making sure that the church is on the top right corner, and also, I am going to need you to sign this paper”.

I don’t know what’s with the selfie stick hate. Let people have fun, it doesn’t hurt. If anything, it prevents them from asking you to take their photo, and if we’ve already established that you’re not a big fan of strangers, all the better, right?

Why Top Tech CEOs Want Employees With Liberal Arts Degrees
Here’s a small extra. When I decided to pursue a humanities/art formal training, I got many naysayers telling me that I was screwing up by not specializing even further as a formal (titled) engineer. I argued then, and now, that if I was gonna pay for training, I might as well pay for training outside my comfort zone.

The result resonates perfectly with this article. Of course, it’s not like the thing is settled, but I can back the various quotes in there.

Working with purely technical/engineering types can be an echo chamber, and having trained myself in the humanities and arts I have become much more sensitive to the human factor of things. I used to think I was already good at this (because we hacker types have lots of confidence), but studying humanities like human communication, social conflict and development, film language, etc. has made me a much more capable hacker of things.

There’s also a nice argument to be made about joining the arts when you are already highly skilled on technical matters. Like Robert Rodríguez’s teacher (mentioned in his diary/book Rebel Without a Crew, which I also have to review soon) used to say (generous paraphrasing here): the world is of those who can be their own creative and their own technician.

Both Yi and Sheer recognize that the scientific method is valuable, with its emphasis on logic and reason, especially when dealing with data or engineering problems. But they believe this approach can sometimes be limiting. “When I collaborate with people who have a strictly technical background,” says Yi, “the perspective I find most lacking is an understanding of what motivates people and how to balance multiple factors that are at work outside the realm of technology.”

Interesting food for thought, especially if you know an engineer who dismisses the arts as being of little value for personal growth in their career/life.



January 23, 2015

In Defense of the Selfie Stick

From the sophisticated opinion of the trendsetters to Forbes, the Selfie Stick is the recipient of scorn and ridicule.

One of the popular arguments against the Selfie Stick is that you should build the courage to ask a stranger to take a picture of you or your group.

This poses three problems.

First, the courage/imposition problem. Asking a stranger in the street assumes that you will find such a volunteer.

Further, it assumes that the volunteer will have the patience to wait for the perfect shot ("wait, I want the waves breaking" or "Try to get the sign, just on top of me"). And that the volunteer will have the patience to show you the result and take another picture.

Often, the selfista that has amassed the courage to approach a stranger on the street, out of politeness, will just accept the shot as taken. Good or bad.

Except for a few of you (I am looking at you Patrick), most people feel uncomfortable imposing something out of the blue on a stranger.

And out of shyness, will not ask a second stranger for a better shot as long as the first one is within earshot.

I know this.

Second, you might fear that the stranger will take your precious iPhone 6+ and run, or even worse, that he might sweat all over your beautiful phone and you might need to disinfect it.

Do not pretend like you do not care about this, because I know you do.

Third, and most important, we have the legal aspect.

When you ask someone to take a picture of you, technically, they are the photographer, and they own the copyright of your picture.

This means that they own the rights to the picture and are entitled to copyright protection. The photographer, and not you, gets to decide on the terms to distribute, redistribute, publish or share the picture with others, including making copies of it, or almost anything else you might want to do with those pictures.

You need to explicitly get a license from them, or purchase the rights. Otherwise, ten years from now, you may find yourself facing a copyright lawsuit.

All of a sudden, your backpacking adventure in Europe requires you to pack a stack of legal contracts.

Now your exchange goes from "Can you take a picture of us?" to "Can you take a picture of us, making sure that the church is on the top right corner, and also, I am going to need you to sign this paper".

Using a Selfie Stick may feel awkward, but just like a condom, when properly used, it is the best protection against unwanted surprises.


January 22, 2015

Star Battle: Pentominoes, and Pentominous: F is for Fiendish

I just had my fifth puzzle published at GM Puzzles.

This was a fun one. It's my second Star Battle puzzle, and it's definitely more approachable than my previous dual-grid Star Duel. (Incidentally, that giant Star Duel was noted as one among several of the best object-placement puzzles of 2014 at GM Puzzles. I'm happy to see that people liked it.)

I got the idea for today's Star Battle when reading that best-of post. One of the other puzzles noted there was this lovely 9-pentomino Star Battle by Zoltán Horváth. The post mentioned that another designer, Jiří Hrdina, had independently designed a similar 9-pentomino Star Battle. (I can't link directly to that one since it's not freely available on the web. It's contained in The Art of Puzzles: Star Battle e-book available for sale). Then, in the comments, Matúš Demiger mentioned that he had also independently constructed a third 9-pentomino Star Battle as part of the 2014 24-Hour Puzzle Competition.

So at least three puzzle authors all happened to construct pentomino-themed Star Battle puzzles last year, and all three happened to choose the fairly-standard 10x10 grid size, (forcing the puzzle to include only 9 of the 12 possible pentomino shapes). Matúš's comment was "I hope someone will try to include all twelve pentominoes" and I couldn't resist the challenge.

And it was an interesting challenge since putting 12 pentominoes into a Star Battle requires a 13x13 grid. That's not too much of a problem in and of itself, (my Star Duel used a 15x15 grid, for example). But with this particular theme, as the puzzle grows the total area of the pentominoes grows linearly, while the total puzzle area grows quadratically. In my final puzzle there are 12 regions of size 5 and then one giant outer region with 109 cells. But all 13 regions each only contain two stars. So the real challenge here was to ensure that when solving the puzzle the stars in the huge region didn't get determined early, (causing a bunch of cells to be wasted and forcing the user to tediously mark off all of the unused cells).
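The arithmetic is easy to check; a quick sketch, using only the numbers stated above:

```python
# The pentomino area grows linearly with the number of shapes, while the
# grid area grows quadratically with the side length. For the 13x13 puzzle,
# the leftover outer region works out to exactly 109 cells.
pentominoes = 12
cells_per_pentomino = 5
grid_side = 13

pentomino_area = pentominoes * cells_per_pentomino  # 60 cells in small regions
grid_area = grid_side ** 2                          # 169 cells total
outer_region = grid_area - pentomino_area
print(outer_region)  # 109
```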

Luckily, I think it just worked out. In all of my test-solving, the stars in the large region are among the very last determined.

Anyway, give this puzzle a try if you'd like. The large number of tiny regions means there are a lot of easy steps early on in the puzzle. But there are still a few more interesting deductions in store later on, (but nothing ever all that difficult in this Wednesday-level puzzle).

I should also thank Thomas Snyder for his editorial help. He found and fixed a small ambiguity in the first version of this puzzle that I submitted. I've since coded up a deductive Star Battle solver just to be able to verify uniqueness for puzzles I construct. But maybe I'll talk about that in a future post.

PS. If anyone is following closely, I neglected to mention my fourth puzzle when it was published a few weeks ago on a Friday. It's a Pentominous puzzle with almost no clues other than 12 F's, and I named it "F is for Fiendish". The title is a warning, and I think it deserves it. I think this is the hardest puzzle I've published so far. My Star Duel earned a longer estimated "Expert" time, (37 minutes compared to 20 minutes for "F is for Fiendish"), but that's mostly because Star Duel is so much bigger, (2 15x15 grids compared to a single 10x10 grid). The deductions required here are definitely harder to find.

My sister is really kind to do some of the initial testing of several of my puzzles. After I handed her a copy of "F is for Fiendish" one evening, she called me later that night to ask, "Can you email me a fresh copy of that puzzle? My husband and I have been trying it over and over and the paper is all disintegrating after so much erasing." That's a beautiful thing for a puzzle designer to hear---that someone is terribly frustrated with a puzzle, but still determined to stick with it and keep trying.

So if you want a challenge, give "F is for Fiendish" a try. There are logical steps that can be found at every point to solve the puzzle without needing any guessing or back-tracking, (but they may not be easy to find). Good luck, and happy puzzling!

Sandboxed applications for GNOME

It is no secret that we’ve been interested in sandboxed applications for a while. It is evident here, here, here or here, to name just a few.

What may not be widely known yet is that we have been working on putting together a working implementation of these ideas. Alexander Larsson has made steady progress, and we’re now almost at the point where it is useful for other people to start playing with it.

If you want to go straight to the playing part, you can head to this wiki page, which has all the links and explanations you need.


Some rights reserved by whiteoakart

Why sandboxed apps ?

There are several reasons:

  • We want to make it possible for 3rd parties to create and distribute applications that work on multiple distributions.
  • We want to run the applications with as little access as possible to the host (for example user files or network access), so that users can try applications without necessarily having to trust them fully.
  • We want to make it much easier to write applications – jhbuild has its uses, but it is an endless source of frustration and a very high hurdle for new contributors.

Traditionally, the only answer available to people who need to distribute an executable that works across several Linux distributions is to statically link all but the lowest-level dependencies.

That is not only wasteful in terms of bandwidth for downloading, but also at runtime, when every application loads its own copy of those dependencies, instead of sharing them.

And the real problem comes when one of those dependencies needs to be updated, e.g. because of a security issue (some people still remember the infamous zlib double-free incident). Dealing with this in a fairly efficient way is a strong point of the Linux packaging model, as its proponents are quick to point out.

So, are sandboxed apps any better ? By themselves, they aren’t.

Runtimes and bundles

We suggest introducing the concept of a runtime to help with this. A runtime provides a well-defined environment that an app can run in – one way to think of it is as a /usr filesystem with fixed contents.
Typical examples would be “GNOME 3.14” or “KDE 5.6”.

It is important to note: you can have multiple runtimes installed on the system, and even multiple versions of the same runtime. Things that are not included in the runtime will still have to be bundled with the application – but the problem becomes much more manageable.

What about the applications themselves ? In the filesystem, an app bundle is simply a directory which contains a metadata file that describes the application, what runtime it needs, and various other things. The actual contents of the app bundle are in a subdirectory. The last component of an app bundle is another subdirectory, containing the various files that are needed by the host system and the session to present the app to the user: desktop files, icons, etc.

The only way to run such a bundled application is through a helper, which sets up the sandbox. It uses kernel namespaces and bind mounts to isolate the application from the host system and its filesystem. The app bundle contents get mounted under /self, and the runtime gets mounted under /usr.

But what about the developer experience ? The runtime idea has a counterpart that helps with this: the developer runtime, or sdk. It is basically the runtime with the ‘devel’ parts added, including tools like a compiler and a debugger. And similar to the ‘xdg-app run’ command that sets up a sandbox to run an application in, there is an ‘xdg-app build’ command that sets up a ‘developer sandbox’ with the sdk.
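To sketch the mechanism described above, here is a hypothetical helper that reads a bundle's metadata file and plans the two bind mounts (app contents at /self, runtime at /usr). The metadata key names and paths here are invented for illustration and are not xdg-app's actual format.

```python
# Sketch of how a launcher helper might map an app bundle and its runtime
# into the sandbox: the bundle's contents at /self, the runtime at /usr.
# Key names and directory paths are hypothetical, not xdg-app's real format.
import configparser

METADATA = """
[Application]
name = org.example.TextEditor
runtime = org.gnome.Platform/3.16
"""

def plan_mounts(metadata_text, bundles_dir="/var/lib/apps",
                runtimes_dir="/var/lib/runtimes"):
    cfg = configparser.ConfigParser()
    cfg.read_string(metadata_text)
    app = cfg["Application"]["name"]
    runtime = cfg["Application"]["runtime"]
    # Bind mounts the helper would set up inside the new mount namespace.
    return {
        f"{bundles_dir}/{app}/files": "/self",
        f"{runtimes_dir}/{runtime}/files": "/usr",
    }

print(plan_mounts(METADATA))
```

The point of the indirection is visible here: the app names a runtime rather than shipping one, so several bundles can resolve to (and share) the same /usr contents.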

Is this progress ?

One question I expect is: What about freedom ? This sounds just like corporate walled gardens and app stores. I think this is a fair question – we are trying to replicate some of the strong points of the app store model. The existing examples of this model have a strong flavor of control, and focus entirely on consumption as opposed to creation.

But I think we can actually turn this into a freedom-enhancing change, if we pay attention while building it.

One vision I have is that we could have a “Modify this application” context menu item in gnome-shell which downloads the sources of the app bundle, sets up the right sdk, opens the sources in your favourite IDE, where you can make your modifications, build it, test it and create a new bundle that you can share with your friends.

In particular the last part (wrapping your modifications in an easy-to-share form) is really not easy in the traditional distribution world, where everything has to be a package that comes from your distributor.

This might be a great fit for gnome-builder, which will hopefully gain support for building bundled applications. Coincidentally, the gnome-builder project is just entering the last week of its fundraising campaign – if you haven’t donated yet, you have 7 days left to do so.

Our implementation

Some notable facts about the implementation that Alex has been working on:

  • Both runtimes and app bundles can be installed per-user and system-wide, and they can coexist perfectly fine with traditional applications. There’s no need for everybody to adopt this model at once, we can transition gradually to it.
  • We use OSTree to distribute both runtimes and applications, as well as updates. OSTree is a good fit for this, because its use of content-based addressing and hardlinks transparently makes runtimes and bundles share disk space, and at the same time it doesn’t impose strong requirements on the host system. But OSTree does not have to be the only distribution mechanism – the definition of the filesystem layout for applications and the sandboxing setup have no dependencies on it.
  • The build tooling supports using rpmbuild and rpms to build runtimes and applications. With this, what we do becomes very similar to the rpm-ostree project: they use rpms to populate OS images on the server side, we use rpms to put together runtimes and applications. In both cases, the results get distributed to end users via OSTree.
  • We have a repository with a few example applications and a yocto-based runtime for GNOME 3.15.
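The disk-space sharing mentioned in the list above can be illustrated with a small, self-contained Python sketch. This is a toy model only – OSTree’s actual object store is considerably more involved – but it shows the core trick: files with identical contents, checked in from different trees, end up as hardlinks to a single content-addressed object:

```python
import hashlib
import os
import tempfile

def store(repo, path):
    """Check a file into a content-addressed object store and replace it
    with a hardlink to the stored object, so identical content across
    trees (e.g. two runtimes) occupies disk space only once."""
    with open(path, "rb") as f:
        digest = hashlib.sha256(f.read()).hexdigest()
    obj = os.path.join(repo, digest)
    if not os.path.exists(obj):
        os.link(path, obj)   # first copy: keep the data as the object
    else:
        os.remove(path)      # duplicate content: drop this copy...
        os.link(obj, path)   # ...and hardlink back to the stored object
    return digest

# Two hypothetical "runtimes" containing an identical file share one inode:
tmp = tempfile.mkdtemp()
repo = os.path.join(tmp, "objects")
os.mkdir(repo)
for name in ("runtime-3.14/libgtk.so", "runtime-3.16/libgtk.so"):
    p = os.path.join(tmp, name)
    os.makedirs(os.path.dirname(p))
    with open(p, "wb") as f:
        f.write(b"identical library bytes")
    store(repo, p)

a = os.stat(os.path.join(tmp, "runtime-3.14/libgtk.so"))
b = os.stat(os.path.join(tmp, "runtime-3.16/libgtk.so"))
print(a.st_ino == b.st_ino)  # → True: the data is stored only once
```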
What’s next?

There are lots of smaller (and some bigger) things left to do.

Currently, we are working on making gnome-software show and handle these application bundles in addition to traditional packaged applications.

Our short-term goal is to provide an initial test version of a ‘reference runtime’ for the GNOME 3.16 release.

If you want to learn more, you can study the SandboxedApps wiki page that I’ve already mentioned, or you can come to DevConf.cz, where Alex will be presenting his work.

Link Pack #04

Writing Your Way to Happiness (nytimes.com)
Researchers believe that the way we think about, and remember, “our story” can be so powerful that it can actually influence our happiness and success. It’s a nice little article summarizing actual research. The main study referenced put first-year university students to the test: one group received tools to “rewrite” their memory and the story of their academic performance, another group didn’t. The first group improved their grades and had only 1 student drop out within a year; the other group had 4 drop-outs and no particular improvement.

I’ve been thinking about this as I recently rewrote my About page and also started writing down some past travel journals. Looking back and rewriting your own story is incredibly empowering, a fantastic rush of confidence and self-assertion. Memory is always betraying us, and remembering our successes is not particularly high on its list of priorities.

The concept is based on the idea that we all have a personal narrative that shapes our view of the world and ourselves. But sometimes our inner voice doesn’t get it completely right. Some researchers believe that by writing and then editing our own stories, we can change our perceptions of ourselves and identify obstacles that stand in the way of better health.

It may sound like self-help nonsense, but research suggests the effects are real.

Students who had been prompted to change their personal stories improved their grade-point averages and were less likely to drop out over the next year than the students who received no information. In the control group, which had received no advice about grades, 20 percent of the students had dropped out within a year. But in the intervention group, only 1 student — or just 5 percent — dropped out.

Old Masters at the Top of Their Game (nytimes.com)
Fantastic read on how these artists defy the convention that old means useless. Masters of their art, they haven’t quit, nor have they settled down to rest and cash in on their reputation. They keep making; they stay alive (physically and metaphorically) through art.

No rush to get to their age, but still a really interesting “letter from the future”. Full of cheat codes, read this now.

Now I am 79. I’ve written many hundreds of essays, 10 times that number of misbegotten drafts both early and late, and I begin to understand that failure is its own reward. It is in the effort to close the distance between the work imagined and the work achieved wherein it is to be found that the ceaseless labor is the freedom of play, that what’s at stake isn’t a reflection in the mirror of fame but the escape from the prison of the self.

T. H. White, the British naturalist turned novelist to write “The Once and Future King,” calls upon the druid Merlyn to teach the lesson to the young prince Arthur:

“You may grow old and trembling in your anatomies, you may lie awake at night listening to the disorder of your veins, you may miss your only love, you may see the world about you devastated by evil lunatics, or know your honour trampled in the sewers of baser minds. There is only one thing for it then — to learn. Learn why the world wags and what wags it. That is the only thing which the mind can never exhaust, never alienate, never be tortured by, never fear or distrust, and never dream of regretting.”

A Life with a View (ribbonfarm.com)
A somewhat tricky read, but with a nice payback. Take your time, and savor it slowly. It’s a very interesting look into how we keep wanting new stuff, and how we shield ourselves from that truth by seeking the “place with no yearning” – the place where we won’t want anything anymore, which doesn’t exist.

Chains very well into the reads I shared a few days ago on practical contentment.

The arrival fallacy is about seeking a life from which one can look with a complacent equanimity upon the rest of reality, without yearning. It is an ideal of a life that is defined primarily by blindness to itself. You yearn while you see your life as others see it, until you arrive at a situation where you can disappear into the broader background, and see comfortably without being seen discomfittingly, especially by yourself.

Once you’re there, the yearning stops, so the theory goes. Of course it is a laughably bad theory.

How To Escape From A Moving Car (mrporter.com)
By Adam Kirley, stunt double for Daniel Craig in the crazy crane scene of Casino Royale (where 007 jumps from monkey nuts high to donkey bonkers high, a badger bum crazy distance). Really funny, and one of those things I always find myself thinking… Almost as much as what to do in case of a Post Office Showdown (xkcd.com)

Everyone’s first instinct is to put their hands or legs down first. That’s the worst thing you can do: you will break something. The pointy parts of your body hurt – elbows, knees, hips, ankles. Put your fists under your chin, and bring your elbows together. Keep your chin tucked in to your chest to protect your head. The best point of impact is the back of the shoulder and your back. If you dive out directly onto your shoulder you’ll break it.

What the World Looks Like with Social Anxiety (collegehumor.com)
Funny vignettes about what the world looks like when you are socially anxious. I can only really identify with the last one:

(Illustration by Shea Strauss.)

Helsinki Bus Station Theory (fotocommunity.com)
Don’t get off the bus. Art comes to those who wait and persevere. At first you replicate the same route others have taken, but only if you stay on that path long enough do you begin to find your own. Although perhaps a little classic in conception, this is an interesting text advising artists not to give up just because they don’t compare well to the masters of their current art or genre. Only those who persevere will catch up to, and diverge from, the masters.

You could say that diverging early is also a way to find your path, but there’s still a case to be made for learning from those who came before. Whether you want to imitate them, or rebel against them, you still need to know them.

My take: it doesn’t hurt to pick up some biographies or works from past masters and see what made them masters. Create your master genealogy, kinda like in Steal Like an Artist (which I recently read but haven’t gotten around to writing about yet).

Georges Braque has said that out of limited means, new forms emerge. I say, we find out what we will do by knowing what we will not do.

And so, if your heart is set on 8×10 platinum landscapes in misty southern terrains, work your way through those who inspire you, ride their bus route and damn those who would say you are merely repeating what has been done before. Wait for the months and years to pass and soon your differences will begin to appear with clarity and intelligence, when your originality will become visible, even the works from those very first years of trepidation when everything you did seemed so done before.

At 90, She’s Designing Tech For Aging Boomers (npr.org)
The inspiring tale of a 90-year-old woman who joined IDEO to contribute a unique point of view to the design process. You can never stop learning; life never ceases to be interesting. It’s short, and not incredibly shocking, but that this happened somewhere as referenced and revered as IDEO says a lot.

And for the bulging demographic of baby boomers growing old, Beskind has this advice: Embrace change and design for it.



January 21, 2015

Want to join our innovative development team doing cool open source software?

So Red Hat is currently looking to hire into the various teams building and supporting efforts such as the Fedora Workstation, the Red Hat Enterprise Linux Workstation, and of course Fedora and RHEL in general. We are looking at hiring around 6-7 people to help move these and other Red Hat efforts forward. We are open to candidates from any country where Red Hat has a presence, but for a subset of the positions relocating to our main engineering offices in Brno, Czech Republic or Westford, Massachusetts, USA will be a requirement, or candidates willing to relocate will be given preference. We are looking for a mix of experienced and junior candidates, so regardless of whether you are fresh out of school or have been around for a while, this might be for you.

So instead of providing a list of job descriptions, what I want to do here is list the various skills and backgrounds we want, and then we will adjust the exact roles of the candidates we end up hiring depending on the mix of skills we find. So this might be for you if you are a software developer and have one or more of these skills or backgrounds:

* Able to program in C
* Able to program in Ruby
* Able to program in Javascript
* Able to program in Assembly
* Able to program in Python
* Experience with Linux Kernel development
* Experience with GTK+
* Experience with Wayland
* Experience with x.org
* Experience with developing for PPC architecture
* Experience with compiler optimisations
* Experience with llvm-pipe
* Experience with SPICE
* Experience with developing software like Virtualbox, VNC, RDP or similar
* Experience with building web services
* Experience with OpenGL development
* Experience with release engineering
* Experience with Project Atomic
* Experience with graphics driver enablement
* Experience with other PC hardware enablement
* Experience with enterprise software management tools like Satellite or ManageIQ
* Experience with accessibility software
* Experience with RPM packaging
* Experience with Fedora
* Experience with Red Hat Enterprise Linux
* Experience with GNOME

It should be clear from the list above that we are not just looking for people with a background in desktop development this time around; two of the positions, for instance, will mostly be dealing with Linux kernel development. We are looking for people who can help us make sure the latest and greatest laptops on the market work great with Fedora and RHEL, be that on the graphics side or in terms of general hardware enablement. These jobs will put you right in the middle of the action in terms of defining the future of the 3 Fedora variants, especially the Workstation; defining the future of Red Hat’s Enterprise Linux Workstation products; and pushing the Linux platform in general forward.

If you are interested in talking with us about whether we can find a place for you at Red Hat as part of this hiring round, please send me your CV and a few words about yourself, and I will make sure to put you in contact with our recruiters. And if you are unsure about what kind of things we work on here, I suggest taking a look at my latest blog post about our Fedora Workstation 22 efforts to see a small sample of some of the things we are currently working on.

You can reach me at cschalle(at)redhat(dot)com.

January 20, 2015

Education Freedom Day registration launched!


We have just opened registration for Education Freedom Day, scheduled for March 21st, 2015. For its second edition, EFD has been moved to March to facilitate its celebration in both the southern hemisphere and China (at least…), and we hope to cater to more events this year.

As usual for all our Freedom celebrations, the process is similar: you get together and decide to organize an event, then create a page in our wiki and register your team. As the date approaches, you add more information to your wiki page (or to your organization’s website, which is linked from the wiki), such as the date and time, the location, and what people can expect to see.

Education Freedom Day is really the opportunity to review all the available Free Educational Resources, how they have improved since last year, and what you should start planning to deploy in the coming months. More importantly, it is a celebration of what is available, and a chance to make people aware of it!

So prepare well and see you all in two months to celebrate Education Freedom Day!

Celebrate EFD with us on March 21, 2015!
