Firefox and Color Management

For some time now Firefox has been capable of doing some level of color management of web content, though there have always been caveats. Currently Firefox (version 26) enables color management only for images that have explicitly been tagged with a color profile (which isn’t that common yet). This default behavior causes a number of problems.

When an image is tagged with a color profile, Firefox converts that image to your display profile (if configured), or otherwise to sRGB. Untagged images and other colored elements (those defined by CSS, for example) are assumed to be sRGB and are not converted to your display profile (if configured), even though they should be. This means that if you do not have a display profile configured, everything is fine, since everything is either sRGB or is converted to sRGB. However, if you do have a display profile configured, and particularly if your display deviates significantly from sRGB, you may notice that page elements composited from multiple sources (tagged images and CSS, for example) have mismatching colors. This is essentially a bug: all page elements should always be converted to the same colorspace (whether that be sRGB or the display profile).

Firefox versions predating 19 required the user to manually configure a specific display profile, but since version 19 Firefox should automatically pick up the system display profile if one is properly configured.

So to get Firefox to do complete color management, you’ll need to set a few parameters using about:config, or you can do the following on Ubuntu to enable it system wide:

$ sudo sh -c 'echo "pref(\"gfx.color_management.rendering_intent\", 0);" >> /etc/firefox/syspref.js'
$ sudo sh -c 'echo "pref(\"gfx.color_management.mode\", 1);" >> /etc/firefox/syspref.js'
$ sudo sh -c 'echo "pref(\"gfx.color_management.enablev4\", true);" >> /etc/firefox/syspref.js'

IMPORTANT: Do be aware that enabling these features slightly increases Firefox’s attack surface.

Display Color Profiling (on Linux)

Attention: This article is a work in progress, based on my own practical experience up until the time of writing, so you may want to check back periodically to see if it has been updated.

This article outlines how you can calibrate and profile your display on Linux, assuming you have the right equipment (either a colorimeter, such as the i1 Display Pro, or a spectrophotometer, such as the ColorMunki Photo). For a general overview of what color management is and details about some of its parlance, you may want to read this before continuing.

A Fresh Start

First you may want to check whether any kind of color management is already active on your machine; if you see the following then you’re fine:

$ xprop -display :0.0 -len 14 -root _ICC_PROFILE
_ICC_PROFILE:  no such atom on any window.

However if you see something like this, then there is already another color management system active:

$ xprop -display :0.0 -len 14 -root _ICC_PROFILE
_ICC_PROFILE(CARDINAL) = 0, 0, 72, 212, 108, 99, 109, 115, 2, 32, 0, 0, 109, 110

If this is the case you need to figure out what is setting it and why. For GNOME/Unity based desktops this is fairly typical, since they extract a simple profile from the display hardware itself via EDID and use that by default. I’m guessing KDE users may want to look into this before proceeding as well. I can’t give much advice about other desktop environments, as I’m not particularly familiar with them. That said, I tested most of the examples in this article with XFCE 4.10 on Xubuntu 13.10 “Saucy”.

Display Types

For the purposes of this discussion, modern flat panel displays consist of two major components: the backlight and the panel itself. There are various types of backlights: White LED (most common nowadays), CCFL (most common a few years ago), RGB LED and Wide Gamut CCFL, the latter two of which you’d typically find on higher end displays. The backlight primarily defines a display’s gamut and maximum brightness. The panel, on the other hand, primarily defines the maximum contrast and acceptable viewing angles. The most common types are variants of IPS (usually good contrast and viewing angles) and TN (typically mediocre contrast and poor viewing angles).

Display Setup

There are two main cases: laptop displays, which usually allow for little configuration, and regular desktop displays. For regular displays there are a few steps to prepare your display to be profiled. First, reset your display to its factory defaults, and leave the contrast at its default value. If your display has a feature called dynamic contrast you need to disable it; this is critical, and if you’re unlucky enough to have a display on which it cannot be disabled, there is no use in proceeding any further. Then set the color temperature setting to custom and set the R/G/B values to equal values (often 100/100/100 or 255/255/255). As for the brightness, set it to a level which is comfortable for prolonged viewing; typically this means reducing the brightness from its default setting, which will often land somewhere around 25-50 on a 0-100 scale. Laptops are a different story: since you’ll often be fighting varying lighting conditions, you may want to consider profiling your laptop at its full brightness. We’ll get back to the brightness setting later on.

Before continuing any further, let the display settle for at least half an hour (as its color rendition may change while the backlight is warming up) and make sure the display doesn’t go into power saving mode during this time.

Another point worth considering is cleaning the display before starting the calibration and profiling process. Do keep in mind that displays often have relatively fragile coatings, which may be damaged by traditional cleaning products or easily scratched by regular cleaning cloths. There are specialist products available for safely cleaning computer displays.

You may also want to consider dimming the ambient lighting while running the calibration and profiling procedure to prevent (potential) glare from being an issue.

Software

If you’re in a GNOME or Unity environment it’s highly recommended to use GNOME Color Manager (with colord and argyll). If you have recent versions (3.8.3, 1.0.5 and 1.6.2 respectively), you can profile and set up your display completely graphically via the Color applet in System Settings. It’s fully wizard driven and couldn’t be much easier in most cases. This is what I personally use and recommend. The rest of this article focuses on the case where you are not using it.

Xubuntu users in particular can get experimental packages for the latest argyll and optionally xiccd from my argyll-testing and xiccd-testing PPAs. If you’re using a different distribution you’ll need to source help from its respective community.

Report On The Uncalibrated Display

To get an idea of the display’s uncalibrated capabilities we use argyll’s dispcal:

$ dispcal -H -y l -R
Uncalibrated response:
Black level = 0.4179 cd/m^2
50%   level = 42.93 cd/m^2
White level = 189.08 cd/m^2
Aprox. gamma = 2.14
Contrast ratio = 452:1
White     Visual Daylight Temperature = 7465K, DE 2K to locus =  3.2

Here we see the display has a fairly high uncalibrated native whitepoint of almost 7500K, which means the display is bluer than it should be. When we’re done you’ll notice the display becoming more yellow. If your display’s uncalibrated native whitepoint is below 6500K, you’ll notice it becoming more blue when loading the profile.

Another point to note is the fairly high white level (brightness) of almost 190 cd/m2; it’s fairly typical to target 120 cd/m2 for the final calibration, keeping in mind that we’ll lose 10 cd/m2 or so to the calibration itself. So if your display reports a brightness significantly higher than 130 cd/m2, you may want to consider turning down the brightness another notch.

Calibrating And Profiling Your Display

First we’ll use argyll’s dispcal to measure and adjust (calibrate) the display, compensating for the display’s whitepoint (targeting 6500K) and gamma (targeting the industry standard 2.2, more info on gamma here):

$ dispcal -v -m -H -q l -y l -F -t 6500 -g 2.2 asus_eee_pc_1215p
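
If you’d rather have dispcal aim for a specific white luminance as well (instead of only turning down brightness via the display’s own controls), it also accepts a target brightness in cd/m2; a hedged variant of the command above:

$ dispcal -v -m -H -q l -y l -F -t 6500 -g 2.2 -b 120 asus_eee_pc_1215p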

Next we’ll use argyll’s targen to generate measurement patches to determine the display’s gamut:

$ targen -v -d 3 -G -e 4 -s 5 -g 17 -f 64 asus_eee_pc_1215p

Then we’ll use argyll’s dispread to apply the calibration file generated by dispcal, and measure (profile) the display’s gamut using the patches generated by targen:

$ dispread -v -H -N -y l -F -k asus_eee_pc_1215p.cal asus_eee_pc_1215p

Finally we’ll use argyll’s colprof to generate a standardized ICC (version 2) color profile:

$ colprof -v -D "Asus Eee PC 1215P" \
             -C "Copyright (c) 2013 Pascal de Bruijn. Some rights reserved." \
             -q m -a G -Z p -n c asus_eee_pc_1215p
Profile check complete, peak err = 9.771535, avg err = 3.383640, RMS = 4.094142

The parameters used to generate the ICC color profile are fairly conservative and should be fairly robust; they will likely provide good results for most use-cases. If you’re after better accuracy you may want to try replacing -a G with -a S or even -a s, but I very strongly recommend starting out with -a G.

You can inspect the contents of a standardized ICC (version 2 only) color profile using argyll’s iccdump:

$ iccdump -v 3 asus_eee_pc_1215p.icc

To try the color profile we just generated we can quickly load it using argyll’s dispwin:

$ dispwin -I asus_eee_pc_1215p.icc

Now you’ll likely see a color shift toward the yellow side. For some (possibly aged) displays you may notice it shifting toward the blue side instead.
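
If you want to compare against the uncorrected state, you can clear the video card gamma table again and then reload the calibration from the profile; a quick sketch (dispwin’s -c option resets the video LUTs to linear):

$ dispwin -c
$ dispwin asus_eee_pc_1215p.icc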

If you’ve used a colorimeter (as opposed to a spectrophotometer) to profile your display and if you feel the profile might be off, you may want to consider reading this and this.

Report On The Calibrated Display

Next we can use argyll’s dispcal again to check our newly calibrated display:

$ dispcal -H -y l -r
Current calibration response:
Black level = 0.3432 cd/m^2
50%   level = 40.44 cd/m^2
White level = 179.63 cd/m^2
Aprox. gamma = 2.15
Contrast ratio = 523:1
White     Visual Daylight Temperature = 6420K, DE 2K to locus =  1.9

Here we see the calibrated display’s whitepoint nicely around 6500K, as it should be.

Loading The Profile In Your User Session

If your desktop environment is XDG autostart compliant, you may want to consider creating a .desktop file which will load the ICC color profile at session login for all users:

$ cat /etc/xdg/autostart/dispwin.desktop
[Desktop Entry]
Encoding=UTF-8
Name=Argyll dispwin load color profile
Exec=dispwin -I /usr/share/color/icc/asus_eee_pc_1215p.icc
Terminal=false
Type=Application
Categories=

Alternatively you could use colord and xiccd for a more sophisticated setup. If you do, make sure you have recent versions of both, particularly of xiccd, as it’s still a fairly young project.

First we’ll need to start xiccd (in the background), which detects your connected displays and adds them to colord‘s device inventory:

$ nohup xiccd &

Then we can query colord for its list of available devices:

$ colormgr get-devices

Next we need to query colord for its list of available profiles (or alternatively search by a profile’s full filename):

$ colormgr get-profiles
$ colormgr find-profile-by-filename /usr/share/color/icc/asus_eee_pc_1215p.icc

Next we’ll need to assign our profile’s object path to our display’s object path:

$ colormgr device-add-profile \
           /org/freedesktop/ColorManager/devices/xrandr_HSD121PHW1_70842_pmjdebruijn_1000 \
           /org/freedesktop/ColorManager/profiles/icc_e7fc40cb41ddd25c8d79f1c8d453ec3f

You should notice your display’s colors shift within a second or so (xiccd applies the profile asynchronously), assuming you haven’t already applied it via dispwin earlier (in which case you’ll notice no change).
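
If another profile was already assigned to the display, you may also need to explicitly make the new one the default; a hedged sketch reusing the same object paths as above:

$ colormgr device-make-profile-default \
           /org/freedesktop/ColorManager/devices/xrandr_HSD121PHW1_70842_pmjdebruijn_1000 \
           /org/freedesktop/ColorManager/profiles/icc_e7fc40cb41ddd25c8d79f1c8d453ec3f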

If you suspect xiccd isn’t properly working, you may be able to debug the issue by stopping all xiccd background processes, and starting it in debug mode in the foreground:

$ killall xiccd
$ G_MESSAGES_DEBUG=all xiccd

In xiccd‘s case you’ll also need to create a .desktop file, to load xiccd at session login for all users:

$ cat /etc/xdg/autostart/xiccd.desktop
[Desktop Entry]
Encoding=UTF-8
Name=xiccd
GenericName=X11 ICC Daemon
Comment=Applies color management profiles to your session
Exec=xiccd
Terminal=false
Type=Application
Categories=
OnlyShowIn=XFCE;

You’ll note that xiccd does not need any parameters, since it queries colord‘s database for which profile to load.

If your desktop environment is not XDG autostart compliant, you’ll need to ask its community how to start custom commands (dispwin or xiccd respectively) at session login.

Dual Screen Caveats

Currently, having a color managed dual screen setup is complicated at best. Most programs use the _ICC_PROFILE atom to get the system display profile, and there is only one such atom. To resolve this issue new atoms were defined to support multiple displays, but not all applications actually honor them. So with a dual screen setup there is always a risk of applications applying the profile for your first display to your second display, or vice versa.

So practically speaking, if you need a reliable color managed setup, you should probably avoid dual screen setups altogether.

That said, most of argyll’s commands support a -d parameter for selecting which display to work with during calibration and profiling, but I have no personal experience with it whatsoever, since I purposely don’t have a dual screen setup.
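
Purely as an untested sketch, the idea would be to run the whole calibration and profiling procedure once per display with the appropriate -d value, and then load each resulting profile against its own display (the profile names here are placeholders):

$ dispwin -d 1 -I first_display.icc
$ dispwin -d 2 -I second_display.icc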

Application Support Caveats

As my other article explains, display color profiles consist of two parts. One part (whitepoint & gamma correction) is applied via X11 and thus benefits all applications. There is however a second part (gamut correction) that needs to be applied by the application itself, and application support for both input and display color management varies wildly. Many consumer grade applications have no color management awareness whatsoever.

Firefox can do color management, but it’s only half-enabled by default; read this to properly configure Firefox.

GIMP for example has display color management disabled by default; you need to enable it via its preferences.

Eye of GNOME has display color management enabled by default, but it has nasty corner case behaviors; for example, when a file has no metadata, no color management is done at all (instead of assuming sRGB input).

Darktable has display color management enabled by default and is one of the few applications which directly support colord and the display specific atoms as well as the generic _ICC_PROFILE atom as fallback. There are however a few caveats for darktable as well, documented here.

Video Sharpening Before Encoding

After encoding a video and playing it back, it seldom looks as good as I’d expect, especially compared to professionally produced DVDs for example. While there are likely a multitude of differences, one of them seems to be acutance (perceived sharpness), which can be enhanced by applying some sharpening after scaling the source material, before encoding the video.

The following small script encodes a source video to an anamorphic widescreen PAL DVD resolution WebM file at a nominal (total) bitrate of 2 Mbit/s, while applying some sharpening:

#!/bin/sh
INPUT="$1"

# scale to anamorphic widescreen PAL DVD resolution using the lanczos scaler
SCALE_OPTS="-sws_flags lanczos -s 720x576 -aspect 16:9"
# sharpen luma only (3x3 matrix, strength 0.5), leave chroma untouched
SHARP_OPTS="-vf unsharp=3:3:0.5:3:3:0"
# VP8 video at a nominal 1.8 Mbit/s, capped at 3.8 Mbit/s
VIDEO_OPTS="-vcodec libvpx -g 120 -lag-in-frames 15 -deadline good -profile 0 -qmax 51 -qmin 11 -slices 4 -b 1800000 -maxrate 3800000"
# stereo Vorbis audio at 192 kbit/s
AUDIO_OPTS="-acodec libvorbis -ac 2 -ab 192000"

# two-pass encode: first pass analyzes video only, second pass writes video and audio
avconv -i "${INPUT}" ${SCALE_OPTS} ${SHARP_OPTS} ${VIDEO_OPTS}           -an -pass 1 -f webm -y "out-${INPUT}.webm"
avconv -i "${INPUT}" ${SCALE_OPTS} ${SHARP_OPTS} ${VIDEO_OPTS} ${AUDIO_OPTS} -pass 2 -f webm -y "out-${INPUT}.webm"


What particularly matters here are the unsharp parameters, which can be divided into two triplets: the first set of three applies to luma (brightness/greyscale information) and the second set of three applies to chroma (color information). In each triplet the first two parameters are the horizontal and vertical matrix dimensions (i.e. the evaluation area); in our case a small 3 by 3 matrix is the only configuration that makes sense, as a 5 by 5 matrix is possible but will give an exaggerated effect and halo artifacts. The last parameter in each triplet is the strength (0.5 for luma and 0 for chroma respectively), which means we’re enhancing acutance for luma and leaving the chroma unmodified. For the luma strength, values between 0.5 and 1.0 are likely to be the useful range, depending on taste and source material. Typically you’d want to leave chroma alone, but in some cases you could use it as a poor man’s denoising method by specifying negative values, which effectively turns it into a blur.
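
For example (hedged values, to be tuned by eye against your own source material), a stronger luma sharpening pass, or the poor man’s denoising variant mentioned above, would look like this in the script:

SHARP_OPTS="-vf unsharp=3:3:0.8:3:3:0"     # stronger luma sharpening, chroma untouched
SHARP_OPTS="-vf unsharp=3:3:0.5:3:3:-0.5"  # same luma sharpening, mild chroma blur (denoise)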


Display profiles generated from EDID

If you’re running a GNOME or Unity desktop (and probably recent versions of KDE too), you may notice differences in color rendition between different applications. The difference you’re seeing is between applications that apply the system configured display profile and those that don’t. Eye of GNOME and Darktable, for example, do this by default; GIMP, for example, doesn’t…

Now, as many people have noticed, most displays render color quite differently, and display profiles are a technical means to correct for that to some degree. There are several ways of obtaining a display profile: one is to buy a measurement device (called a colorimeter) and actually measure your particular display; some vendors supply a display profile ICC file on a CD that comes with the display; and lastly, more recent displays provide information via EDID (a protocol for information exchange over VGA/DVI/HDMI) which can be used to generate a display profile. These methods are listed in order of decreasing accuracy. For a bit more in-depth information you might want to consider reading this.

At least since distributions have been shipping colord and GNOME Color Manager (so I’m guessing since Oneiric for Ubuntu users), colord actually queries your display via EDID to extract the information required to generate an automatic display profile, which allows certain applications to correct for the display’s behavior.

We Need You

Now, recently we’ve begun to get the impression that some vendors may be shipping bogus information in their displays (possibly under the assumption that it would not be used anyhow). But currently we have no information to substantiate this.

Please read this and this first before continuing.

I’d like to ask you, to submit your EDID generated profile to the gnome-color-manager-list (you can post to the list without subscribing, your submission will be moderated and thus will take a few days to turn up) including the following:

  • Subject: [EDID] Display Make + Model
  • Attach ~/.local/share/icc/edid-*.icc
  • Display Make (if it’s a laptop, then the laptop make)
  • Display Model (if it’s a laptop, then the laptop model)
  • The Display’s Age (approx. how long ago did you buy it)
  • Duty Cycle (light usage: on average a few hours a day; heavy usage: approx. 8 or more hours a day)
  • The output of xprop -display :0.0 -len 14 -root _ICC_PROFILE
  • Subjective Impression (download this SmugMug calibration image and load it into GIMP, then go to GIMP’s Preferences, then to Color Management, and check/uncheck Try to use system monitor profile while keeping an eye on the image; tell us what looks most realistic to you (checked/unchecked) and why)

After more than 2000 submissions, colord-0.1.34 was released, which should detect and disable cases where displays supply bogus information via EDID. Based on the current statistics it seems 4% (or thereabouts) of displays supply bad information.

Working around bad EDID

Assuming some vendors actually provide bad information via EDID, you might need a way to disable this automatically generated profile. In older versions of GNOME Color Manager (3.6 and earlier) there wasn’t an easy way to do this, but there is a feasible workaround: install argyll on your system, then assign /usr/share/color/argyll/ref/sRGB.icm to your display (go to the GNOME System Settings, choose the Color applet, choose your display, click Add Profile, select Other Profile, and then select argyll’s sRGB.icm).
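
If you prefer the terminal over the GUI, a hedged sketch of the same workaround using colormgr would look roughly like this (the object paths in angle brackets are placeholders; query them with colormgr get-devices and colormgr get-profiles):

$ colormgr import-profile /usr/share/color/argyll/ref/sRGB.icm
$ colormgr device-add-profile <display object path> <sRGB profile object path>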

Missing Memory Card Icons on Ubuntu

Depending on your type of memory card reader, you may have noticed the following on Ubuntu (and possibly other distributions too): when you connect a USB flash drive to your system, an icon pops up informing you the drive has been mounted, but when you insert (for example) an SD card into your card reader, the notification that pops up may use the same icon as the flash drive.

While this isn’t the biggest problem in the world, it’s certainly a nuisance, as you’d need to hover over each icon to see the tooltip which explains to you which icon represents what. Ideally you’d want the SD card to show up with an appropriate SD card icon.

Which icon is displayed ultimately depends on disk management done by udisks and more importantly udev. In /lib/udev/rules.d/80-udisks.rules (do NOT modify this file) we find the following rules:

SUBSYSTEMS=="usb", ENV{ID_MODEL}=="*SD_Reader*", ENV{ID_DRIVE_FLASH_SD}="1"
SUBSYSTEMS=="usb", ENV{ID_MODEL}=="*Reader*SD*", ENV{ID_DRIVE_FLASH_SD}="1"
SUBSYSTEMS=="usb", ENV{ID_MODEL}=="*CF_Reader*", ENV{ID_DRIVE_FLASH_CF}="1"
SUBSYSTEMS=="usb", ENV{ID_MODEL}=="*SM_Reader*", ENV{ID_DRIVE_FLASH_SM}="1"
SUBSYSTEMS=="usb", ENV{ID_MODEL}=="*MS_Reader*", ENV{ID_DRIVE_FLASH_MS}="1"

The above rules are matched against the device names the hardware reports to the kernel. With one of my card readers, this sadly doesn’t match:

$ dmesg | grep -i Direct-Access
scsi 12:0:0:0: Direct-Access     Generic  Compact Flash    0.00 PQ: 0 ANSI: 2
scsi 12:0:0:1: Direct-Access     Generic  SM/xD-Picture    0.00 PQ: 0 ANSI: 2
scsi 12:0:0:2: Direct-Access     Generic  SDXC/MMC         0.00 PQ: 0 ANSI: 2
scsi 12:0:0:3: Direct-Access     Generic  MS/MS-Pro/HG     0.00 PQ: 0 ANSI: 2
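
You can double check the model strings udev actually sees for each LUN with udevadm; a hedged example (replace /dev/sdX with one of the reader’s device nodes), which in this case confirms that none of them match the *SD_Reader* style patterns from the stock rules:

$ udevadm info --query=property --name=/dev/sdX | grep ID_MODEL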

To create new rules, we first need to figure out which USB vendor/product IDs belong to the card reader. You can identify the USB devices attached to your computer like so:

$ lsusb
Bus 002 Device 012: ID 048d:1345 Integrated Technology Express, Inc. Multi Cardreader

Just run the command once before attaching the device and once after, and look for the differences; typically it’ll be the last device in the list. Once we have this information, create a new file (replace pmjdebruijn with your own nickname, using exclusively alphanumeric characters):

$ sudo nano -w /etc/udev/rules.d/80-udisks-pmjdebruijn.rules

In this file we put the following lines:

# ITE, Hama 00055348 V4 Cardreader 35 in 1 USB
SUBSYSTEMS=="usb", ATTRS{idVendor}=="048d", ATTRS{idProduct}=="1345", ENV{ID_INSTANCE}=="0:0", ENV{ID_DRIVE_FLASH_CF}="1"
SUBSYSTEMS=="usb", ATTRS{idVendor}=="048d", ATTRS{idProduct}=="1345", ENV{ID_INSTANCE}=="0:1", ENV{ID_DRIVE_FLASH_SM}="1"
SUBSYSTEMS=="usb", ATTRS{idVendor}=="048d", ATTRS{idProduct}=="1345", ENV{ID_INSTANCE}=="0:2", ENV{ID_DRIVE_FLASH_SD}="1"
SUBSYSTEMS=="usb", ATTRS{idVendor}=="048d", ATTRS{idProduct}=="1345", ENV{ID_INSTANCE}=="0:3", ENV{ID_DRIVE_FLASH_MS}="1"

You’ll notice the idVendor and idProduct coming from the lsusb line above; the ID_INSTANCE values also need to match the LUNs from the dmesg lines above. Once you’re done, double-check and save the file, and then reload the udev rules:

$ sudo udevadm control --reload-rules

Any newly mounted memory cards should get a proper icon now.
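
To verify a rule actually took effect, reinsert a card and query udev’s properties for its block device; the ID_DRIVE_FLASH_* variable you set in the rule should show up (a hedged example, replace /dev/sdX with the card’s device node):

$ udevadm info --query=property --name=/dev/sdX | grep ID_DRIVE_FLASH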

Not all card readers may be as easy as illustrated above; for example, I have a wonderful card reader that provides no useful information at all:

$ lsusb
Bus 002 Device 009: ID 05e3:0716 Genesys Logic, Inc. USB 2.0 Multislot Card Reader/Writer
$ dmesg | grep -i Direct-Access
scsi 6:0:0:0: Direct-Access     Generic  STORAGE DEVICE   9744 PQ: 0 ANSI: 0
scsi 6:0:0:1: Direct-Access     Generic  STORAGE DEVICE   9744 PQ: 0 ANSI: 0
scsi 6:0:0:2: Direct-Access     Generic  STORAGE DEVICE   9744 PQ: 0 ANSI: 0
scsi 6:0:0:3: Direct-Access     Generic  STORAGE DEVICE   9744 PQ: 0 ANSI: 0
scsi 6:0:0:4: Direct-Access     Generic  STORAGE DEVICE   9744 PQ: 0 ANSI: 0

In such a case you’ll need to experiment by actually inserting various types of memory cards and checking which device got mounted and which LUN it is. In the following example I inserted an SD card, which got mounted as sdk, which turns out to be LUN 0:2, which is what we need for the ID_INSTANCE entries:

$ mount | grep media
/dev/sdk1 on /media/FC30-3DA9 type vfat (rw,nosuid,nodev,uid=1000,gid=1000,shortname=mixed,dmask=0077,utf8=1,showexec,flush,uhelper=udisks)
$ dmesg | grep sdk
sd 12:0:0:2: [sdk] Attached SCSI removable disk
sd 12:0:0:2: [sdk] 248320 512-byte logical blocks: (127 MB/121 MiB)
sd 12:0:0:2: [sdk] No Caching mode page present
sd 12:0:0:2: [sdk] Assuming drive cache: write through
sd 12:0:0:2: [sdk] No Caching mode page present
sd 12:0:0:2: [sdk] Assuming drive cache: write through
 sdk: sdk1

Another peculiarity (or feature) of this reader is that it has 5 LUNs instead of 4; this is because it actually has two SD card slots, one for full size SD cards and one for microSD cards. In the end, after some fiddling, I ended up with:

# Genesys Logic, Conrad SuperReader Ultimate
SUBSYSTEMS=="usb", ATTRS{idVendor}=="05e3", ATTRS{idProduct}=="0716", ENV{ID_INSTANCE}=="0:0", ENV{ID_DRIVE_FLASH_CF}="1"
SUBSYSTEMS=="usb", ATTRS{idVendor}=="05e3", ATTRS{idProduct}=="0716", ENV{ID_INSTANCE}=="0:1", ENV{ID_DRIVE_FLASH_SM}="1"
SUBSYSTEMS=="usb", ATTRS{idVendor}=="05e3", ATTRS{idProduct}=="0716", ENV{ID_INSTANCE}=="0:2", ENV{ID_DRIVE_FLASH_SD}="1"
SUBSYSTEMS=="usb", ATTRS{idVendor}=="05e3", ATTRS{idProduct}=="0716", ENV{ID_INSTANCE}=="0:3", ENV{ID_DRIVE_FLASH_MS}="1"
SUBSYSTEMS=="usb", ATTRS{idVendor}=="05e3", ATTRS{idProduct}=="0716", ENV{ID_INSTANCE}=="0:4", ENV{ID_DRIVE_FLASH_SD}="1"

Arduino vi mode pedal

In the past I was a vim user, mostly because a full vim install has nice syntax highlighting. But since recent versions of nano also do syntax highlighting, and nano is available in the default installs of many distributions, I’ve switched to nano as my editor of choice.

However, some colleagues came across the vim clutch the other day, which is a cool hardware hack to switch between vi’s command and text entry modes using your foot. Coincidentally I had ordered a StealthDuino 32U4, which I received last week. The StealthDuino is basically a miniaturized Arduino Leonardo (in USB stick format), which easily does USB HID (keyboard & mouse). The other cool and convenient thing about the StealthDuino is that it has a QRE1113 infrared proximity sensor on board, so it can sense whether anything is closely overhead, which means it can effectively be used as a pedal.

My Arduino sketch can be downloaded here. When you load this sketch onto a StealthDuino you can control it directly with your foot (no pedal casing required), though I’d recommend only operating it with your socks on; controlling it with your boots on will quite likely decrease the unit’s lifespan significantly.


Ransomware Unlocker

Some days ago I came across a Windows machine where a lot of files had been renamed (locked-filename.[four random lowercase alphabetic characters]) and were no longer readable by their respective applications. In one of the home directories there was a randomly named executable which many antivirus agents didn’t see as dangerous (including the antivirus agent installed on the machine in question, namely AVG). We checked using VirusTotal; back then only 15 (or so) antivirus products would detect the file in question, and now (as I’m writing this) the detection rate has gone up to 29 antivirus products.

But regardless of the origin, we were still stuck with lots of unreadable “locked” files. Now, we had access to religiously kept backups, so we could have just reverted to the day before, but I opted to have some fun first.

I transferred a few sample files (two Word documents and a WAVE audio file, both the locked versions and the originals from backup) to a Linux machine. Running the file(1) utility on the locked files, they were all identified as data, while the originals were clearly identified as Word documents and WAVE audio. So I was pretty sure something had changed in the contents. Next I ran strings(1) on the locked and original versions of one of the Word documents, and strings(1) returned plain text in both cases. So I knew the files weren’t entirely scrambled, and since file(1)‘s main mechanism for identifying data formats is looking at the first few bytes of a file, the obvious theory was that only the first part of these files had been scrambled.

After searching a bit for a nice binary diff utility I found vbindiff(1), because xdelta(1) wasn’t cutting the mustard. Running vbindiff(1) on one of the Word documents (diffing the original against the locked version), it became immediately apparent that the first 4096 bytes were scrambled. Same story for the other Word document, though it was less obvious for the WAVE audio file; the difference is that classic Word documents (.doc, not .docx) have headers with lots of 0x00 and 0xFF bytes in them. Now, within the same locked file, multiple 0x00 bytes weren’t scrambled into the same byte value, so some form of crypto (with a key) was being applied. When looking at two different original Word documents I noticed that a large part of the header was nearly identical between the two documents, and when I took a look at the two respective locked Word documents, those were largely identical too. From this we can infer that the key used to encrypt the first 4096 bytes is most likely static across locked files (at least on this system).

Considering that a simple static key seems to have been used, and that locked files weren’t even entirely encrypted, my guess was that the algorithm probably wouldn’t be very sophisticated either. Would it be a simple XOR operation? To find out, I wrote some quick and dirty FreePascal code to read the first 4096 bytes from an original file and the first 4096 bytes from the corresponding locked file, and XOR them against each other, effectively outputting the key (at least, that was the theory). After I ran said utility against my three sample files, the resulting key was identical in all three cases (even for the WAVE audio file). So I was right: it was a simple XOR operation using a static key.

The next challenge was writing another small utility (again in FreePascal) which reads the first 4096 bytes from a locked file, XORs them with the data from my generated key file, and writes the result to an unlocked file; after processing the first 4096 bytes, the rest of the file is copied verbatim. After running this new utility on all of my samples, the resulting unlocked files were identical to the originals. So it really worked; it was that simple.

I built both of the above utilities in FreePascal, because Pascal is the language I fall back to whenever I have to code something up quickly. A nice side effect is that the FreePascal code should be fairly portable. You can download the sources here.

On an even more amusing note: if you place a 4096 byte file containing only zero bytes on your filesystem before the ransomware is activated, it will most likely accidentally hand you its own key, since 0x00 XOR key = key.

ColorHug red shift workaround

As most of you probably know already, there is a cool (and affordable) little colorimeter available now called the ColorHug, and it’s open-source too (including all companion software).

As the ColorHug’s firmware is still being improved, some people have noticed that a profile created with the ColorHug makes their display shift excessively toward red, possibly due to a slight measurement inaccuracy.

A display profile generally consists of two main parts. First there is the vcgt (sometimes also called the VideoLUT), which is loaded and applied by X11 itself; this is usually a correction for a display’s whitepoint (which is where things go wrong here). The second part is the gamma+matrix (gamma/hue/saturation correction). So to avoid the red shift we have to skip the first part of profile creation.

To prepare I recommend you (try to) do the following (for this particular procedure):

  1. Note down your display’s old settings (if you care to go back to them).
  2. Reset your display’s settings to factory defaults.
  3. Adjust the display’s brightness to a comfortable level (you often don’t really need maximum brightness).
  4. Generally it’s a good idea to leave contrast at the manufacturer’s default.
  5. Change the display’s color temperature to 6500K if possible (you might notice your display shift a bit toward yellow).

Then execute the following commands in a Terminal

# targen -v -d 3 -G -f 64 make_model
# ENABLE_COLORHUG=1 dispread -v -y l make_model
# colprof -v -A "Make" -M "Model" -D "Make Model" \
          -C "Copyright (c) 2012 John Doe. Some rights reserved." \
          -q l -a G make_model

The above commands skip vcgt creation with dispcal, do a fairly simple set of measurements, and create a fairly simple ICC profile. This simplicity gives us increased robustness in the profile’s creation, at the expense of potential accuracy. To be honest, I wouldn’t be surprised if commercial vendors use a similar strategy in their entry-level colorimetry products for the consumer market.

You’ll need to either manually import the resulting profile into GNOME Color Manager (to set up the profile at login), or directly configure it in programs like GIMP. You can load an image like this in GIMP to check whether the resulting profile makes sense. Please do mind that GIMP has color management disabled by default, so you need to set it up in its Preferences.

Even with the above method, the resulting profile may still be a bit off in the reds (though this will only be visible in color managed applications). If this is still an issue for you, you could try the Community Average CCMX, or possibly my Dell U2212HM CCMX, with which I’ve gotten decent results on non-Dell displays too.
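
Such a correction matrix can be passed to dispread during the measurement phase via its -X parameter; a hedged variant of the earlier dispread command (the .ccmx filename is a placeholder for whichever correction matrix you downloaded):

# ENABLE_COLORHUG=1 dispread -v -y l -X correction.ccmx make_model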

How I Screencast on Ubuntu

I’ve been screencasting for a while now, mostly about Darktable, and from time to time people ask me how I do it and what software and hardware I use. So here goes nothing…

Since my main topic is Darktable (a free software photography application), my target audience is primarily photographers who use free software, which means my videos should be easy to view on a random Linux/BSD desktop. Considering the only video and audio codecs available on most newly installed Linux desktops are Theora and Vorbis respectively, these were going to be my primary publishing formats. The fact that Theora and Vorbis have been the longest supported formats for HTML5 video is a big plus too (since Firefox 3.6 if I recall correctly). And I surely didn’t want to require anybody to install Flash to view my videos.

Another point of concern was audio quality. In general, when watching other people’s screencasts, the often poor audio quality was the biggest annoyance for me, especially for longer videos where I don’t want to listen to 20 minutes of someone talking through static noise. So I went a bit overboard with this and bought an M-Audio FastTrack MkII (which is plain USB Audio, no special driver required) and a RØDE NT1a Kit, which I later mounted on a RØDE PSA1 studio arm.

Which brings me to my choice of recording application. I can’t say I tried them all, but recording with ffmpeg seemed to slow down my machine too much, so I settled on recordmydesktop, and more particularly the gtk-recordmydesktop frontend. After some experimenting I found recording just a part of my screen (1920×1200) to be a nuisance, so I settled on doing all screencasts on my laptop, recording fullscreen (1280×800). The recordmydesktop application defaults to recording 15 frames per second, which seems to be fine for my purposes. It also defaults to recording audio at a 22050 Hz sampling rate; being a sucker for audio quality, I changed that to 48000 Hz, which is commonly used for DVDs and other professional audio applications. One of recordmydesktop’s potential disadvantages is that it only encodes its capture to Ogg/Theora/Vorbis (.ogv), which luckily for me really isn’t an issue at all. I do max out the encoding quality to 100% for both audio and video.
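
For reference, roughly the same settings on recordmydesktop’s command line would look like this (a hedged sketch; I use the gtk-recordmydesktop frontend myself):

$ recordmydesktop --fps 15 --freq 48000 --v_quality 63 --s_quality 10 -o screencast.ogv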

When publishing my screencasts on the web I just use the HTML5 video support of modern browsers. I use the .ogv file produced by recordmydesktop directly; I don’t re-encode to reduce the bitrate or anything, as the bitrate is already acceptable to begin with, and I don’t want to degrade the quality any more than I have to. While in the past I only provided the .ogv, I recently caved in and also provide an .mp4 (H264/AAC) fallback video, primarily to support the ever increasing ubiquity of tablets, and secondarily to support browsers like Safari which don’t support free media formats like Ogg/Theora/Vorbis out of the box. So now I’m using ffmpeg to transcode my videos, but there are a couple of concerns here.

My original recordings were made at a resolution of 1280×800, while most tablets (and most importantly the original iPad) only support video at a resolution of 1280×720 (H264 Level 3.1), so they would likely choke on it. That said, in many cases it’s not very useful to have 1280×800 on most tablets anyway, as 1024×768 is a common resolution for 10″ tablets. So I settled on resizing my screencasts to 1024×640 (which also reduces the bitrate a bit, in the process making them more suitable for mobile viewing). Initially I tried to encode the audio using the MP3 audio codec, however iPads seem to dislike that, while Android tablets handled it just fine. So I had to go with AAC, and while Ubuntu’s ffmpeg isn’t built with FAAC support, it does have ffmpeg’s builtin AAC encoder called libvo_aacenc, which isn’t as good as FAAC, but it had to do. So in the end my conversion commandline for ffmpeg is this:

avconv -i input.ogv -sws_flags bicubic -s 1024x640 -vcodec libx264 -coder 1 \
-flags +loop -cmp +chroma -partitions +parti8x8+parti4x4+partp8x8+partb8x8 \
-me_method umh -subq 8 -me_range 16 -g 250 -keyint_min 25 -sc_threshold 40 \
-i_qfactor 0.71 -b_strategy 2 -qcomp 0.6 -qmin 10 -qmax 51 -qdiff 4 -bf 3 \
-refs 5 -directpred 3 -trellis 1 -flags2 +bpyramid+mixed_refs+wpred+dct8x8+fastpskip \
-wpredp 2 -rc_lookahead 50 -coder 0 -bf 0 -refs 1 -flags2 -wpred-dct8x8 \
-level 30 -maxrate 10000000 -bufsize 10000000 -wpredp 0 -b 1200k \
-acodec libvo_aacenc -ac 1 -ar 48000 -ab 128k output.mp4

Darktable 1.0 Screencast Library (Addition)

Since I did my last darktable 0.9 screencast library, some things have changed, so at the very least this warranted an updated screencast.

Darktable 1.0 Update (download)

Darktable Archiving & Backup (download)

These are the first screencasts that should be viewable on most tablet devices too, albeit with slightly degraded quality.