Practical Printer Profiling with Gutenprint

Some time ago I purchased an Epson Stylus Photo R3000 printer, as I wanted to be able to print at A3 size and get good quality monochrome prints. For a while I struggled to get good quality color photo output from the R3000 using Gutenprint, as it took me a while to figure out which settings work best for generating and applying ICC profiles.

As a sidenote, if you happen to have an R3000 as well and want to get good results using Gutenprint, you can get some of my profiles here. Not all of these profiles have been practically tested, so obviously your mileage may vary.

Gutenprint’s documentation clearly indicates that you should use the “Uncorrected” Color Correction mode, which is very good advice, as we need deterministic output to be able to generate and apply our ICC profiles in a consistent manner. What threw me off is that the “Uncorrected” Color Correction mode produces linear gamma output, which practically means very dark output, which the ICC profile will need to correct for. While this is a valid approach, it generally means you need to generate a profile using more color patches, which means using more ink and paper for each profile you generate. A more practical approach is to also set Composite Gamma to a value of 1.0, which gamma corrects the output to look more perceptually natural. Consequently the ICC profile has less to correct for, and can thus be generated from fewer color patches, using less ink and paper to do so.
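
If you print via CUPS, you can make these choices the queue defaults, so the profiling target and your actual prints use identical settings. The exact Gutenprint option and choice names depend on your PPD, so list them first; the names in the second command are illustrative only:

# List the color options your Gutenprint PPD actually exposes:
lpoptions -p r3000 -l | grep -i -e correction -e gamma

# Set them as queue defaults (option/choice names are illustrative):
lpoptions -p r3000 -o StpColorCorrection=Uncorrected -o StpCompositeGamma=1.0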

Keep in mind that a printer profile is only valid for a particular combination of Printer, Ink set, Paper, Driver and Settings. Therefore you should document all these facets while generating a profile. This can be as simple as including a similarly named plain text file with each profile you create, for example:

Filename ............: epson_r3000_tecco_photo_matte_230.icc
MD5 Sum .............: 056d6c22ea51104b5e52de8632bd77d4

Paper Type ..........: Tecco Photo Matte 230

Printer Model .......: Epson Stylus Photo R3000
Printer Ink .........: Epson UltraChrome K3 with Vivid Magenta
Printer Firmware ....: AS25E3 (09/11/14)
Printer Driver ......: Gutenprint 5.2.13 (Ubuntu 18.04 LTS)

Color Model .........: RGB
Color Precision .....: Normal
Media Type ..........: Archival Matte Paper
Print Quality .......: Super Photo
Resolution ..........: 1440x1440 DPI
Ink Set .............: Matte Black
Ink Type ............: Standard
Quality Enhancement .: None
Color Correction ....: Uncorrected
Image Type ..........: Photograph
Dither Algorithm ....: EvenTone
Composite Gamma .....: 1.0

You’ll note I’m not using the maximum 5760×2880 resolution Gutenprint supports for this printer: the quality increase seems almost negligible, it slows down printing immensely, and it might also increase ink consumption with little to show for it.

While the Matte Black (MK) ink set and Archival Matte Paper media type work very well for matte papers, you should probably use the Photo Black (PK) ink set with the Premium Glossy Photo Paper media type for glossy papers, or with Premium Semigloss Photo Paper for pearl, satin and lustre media types.

The following profiling procedure uses only a single sheet of A4 paper, with very decent results. You can use multiple pages by increasing the patch count, but the increase in effective output quality will likely be underwhelming, though your mileage may vary of course.

To proceed you’ll need a spectrophotometer (a colorimeter won’t suffice) supported by ArgyllCMS, like for example the X-Rite ColorMunki Photo.

To install ArgyllCMS and other relevant tools on Debian (or one of its derivatives like Ubuntu):

apt-get install argyll liblcms2-utils imagemagick

First we’ll need to generate a set of color patches (we’re including a neutral grey axis, so the profile can more effectively neutralize Epson’s warm tone grey inks):

targen -v -d 3 -G -g 14 -f 210 myprofile
printtarg -v -i CM -h -R 42 -t 360 -M 6 -p A4 myprofile

This results in a TIF file, which you need to print using the exact settings you want to use the profile with. Make sure you let the print dry (and outgas) for an hour at the very least. After the print has dried we can start measuring the patches using our spectrophotometer:

chartread -v -H myprofile

Once all the patches have been read, we’re ready to generate the actual profile.

colprof -v -D "Tecco Photo Matte 230 for Epson R3000" \
           -C "Copyright 2018 Your Name Here" \
           -Zm -Zr -qm -nc \
           -S /usr/share/color/argyll/ref/sRGB.icm \
           -cmt -dpp myprofile

Note: if you’re generating a profile for a glossy or lustre paper type, remove the -Zm from the colprof command line.

Evaluating Your Profile

After generating a custom print profile we can evaluate the profile using xicclu:

xicclu -g -fb -ir myprofile.icc

Looking at the graph above, there are a few things of note. You’ll notice the graph doesn’t touch the lower right corner, which represents the profile’s black point; keep in mind that the blackest black any printer can print still reflects some light, and thus isn’t perfectly black, i.e. 0.

Another point of interest is the curvature of the lines. If the graph bows significantly toward the upper right, the media type you have chosen is causing Gutenprint to put down more ink than the paper you’re using can take. Conversely, if the graph bows significantly toward the lower left, the chosen media type is causing Gutenprint to put down less ink than the paper can take. While a profile will compensate for either, a profile that compensates too strongly may cause banding artifacts in rare cases, especially with an 8-bit workflow. While I haven’t had a case yet where I needed to, you can use the Density control to adjust the amount of ink put on paper.

Visualizing Printer Gamut

To visualize the effective gamut of your profile you can generate a 3D Lab colorspace graph using iccgamut, which you can view with any modern web browser:

iccgamut -v -w -n myprofile.icc
xdg-open myprofile.x3d.htm

Comparing Gamuts

To compare the gamut of our new custom print profile against a standard working colorspace like sRGB follow these steps:

cp /usr/share/color/argyll/ref/sRGB.icm .
iccgamut -v sRGB.icm
iccgamut -v myprofile.icc
viewgam -i -n myprofile.gam sRGB.gam srgb_myprofile
Intersecting volume = 406219.5 cubic units
'epson_r3000_hema_matt_coated_photo_paper_235.gam' volume = 464977.8 cubic units, intersect = 87.36%
'sRGB.gam' volume = 899097.5 cubic units, intersect = 45.18%
xdg-open srgb_myprofile.x3d.htm

From the above output we can conclude that our custom print profile covers about 45% of sRGB, meaning the printer has a gamut that is much smaller than sRGB. However we can also see that sRGB in turn covers about 87% of our custom print profile, which means that 13% of our custom print profile gamut is actually beyond the gamut of sRGB.
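
These percentages follow directly from the reported volumes:

406219.5 / 899097.5 ≈ 45.18%  (portion of sRGB covered by the print profile)
406219.5 / 464977.8 ≈ 87.36%  (portion of the print profile covered by sRGB)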

This is where gamut mapping comes in: the declared rendering intent determines how colors outside of the shared gamut are handled.

A Relative Colorimetric rendering intent limits your prints to the shared area, effectively giving you the smallest practical gamut, but it offers the best color accuracy.

A Perceptual rendering intent will scale down colors from areas where the working space profile has a larger gamut (the other 55% of sRGB) into the smaller gamut.

A Saturation rendering intent will additionally scale up colors from areas where the working space profile has a smaller gamut into the larger gamut (the 13% of our custom print profile beyond sRGB).

Manually Preparing Prints using liblcms2-utils

To test your profile, I suggest getting a good test image, for example from SmugMug, and applying your new profile, using either Perceptual gamut mapping or Relative Colorimetric gamut mapping with Black Point Compensation respectively:

jpgicc -v -o printer.icc -t 0    -q 95 original.jpg print.jpg
jpgicc -v -o printer.icc -t 1 -b -q 95 original.jpg print.jpg

When you open either of the print corrected images, you’ll most likely find they both look awful on your computer’s display. Keep in mind this is because the images are correcting for printer, driver, ink and paper behavior. If you actually print either image, the printed image should look fairly close to the original image on your computer’s display (presuming your display is set up properly and calibrated as well).

Manually Preparing Prints using ImageMagick

A more sophisticated way to prepare real images for printing is to use (for example) ImageMagick. The examples below illustrate how you can use ImageMagick to scale an image to a fixed resolution (360 DPI) for a given paper size, add print sharpening (this is why having a known static resolution is important, otherwise the sharpening would give inconsistent results across different images), then add a thin black border and a larger but equidistant (presuming a 3:2 image) white border, and finally convert the image to our custom print profile:

A4 paper

convert -profile /usr/share/color/argyll/ref/sRGB.icm \
        -resize 2466^ -density 360 -unsharp 2x2+1+0 \
        -bordercolor black -border 28x28 -bordercolor white -border 227x227 \
        -black-point-compensation -intent relative -profile myprofile.icc \
        -strip -sampling-factor 1x1 -quality 95 original.jpg print.jpg

A3 paper

convert -profile /usr/share/color/argyll/ref/sRGB.icm \
        -resize 3487^ -density 360 -unsharp 2x2+1+0 \
        -bordercolor black -border 28x28 -bordercolor white -border 333x333 \
        -black-point-compensation -intent relative -profile myprofile.icc \
        -strip -sampling-factor 1x1 -quality 95 original.jpg print.jpg

A3+ paper

convert -profile /usr/share/color/argyll/ref/sRGB.icm \
        -resize 4320^ -density 360 -unsharp 2x2+1+0 \
        -bordercolor black -border 28x28 -bordercolor white -border 152x152 \
        -black-point-compensation -intent relative -profile myprofile.icc \
        -strip -sampling-factor 1x1 -quality 95 original.jpg print.jpg
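
As a sanity check on these numbers: at 360 DPI the image width plus its borders spans exactly the short side of each paper size:

A4:  2466 + 2 × (28 + 227) = 2976 px ≈ 210 mm (8.27 in) × 360 DPI
A3:  3487 + 2 × (28 + 333) = 4209 px ≈ 297 mm (11.69 in) × 360 DPI
A3+: 4320 + 2 × (28 + 152) = 4680 px = 13 in × 360 DPI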

Automatically Preparing Prints via colord

While the above method gives you a lot of control over how images are prepared for printing, you may also want to use a profile for printing on plain paper, where the input is the output of any random application, as opposed to a raster image file that can easily be preprocessed.

Via colord you can assign a printer an ICC profile that will be automatically applied through cups-filters (pdftoraster), but keep in mind that this profile can only be changed through colormgr (or another colord frontend, like GNOME Control Center) and sadly not through an application’s print dialog. To avoid messing with driver settings too much, I would suggest duplicating your printer in CUPS (a minimal sketch follows the list below), for example:

  • a printer instance for plain paper prints (with an ICC profile assigned through colord)
  • a printer instance for matte color photographic prints (without a profile assigned through colord)
  • a printer instance for (semi)glossy color photographic prints
  • a printer instance for matte black and white photographic prints (likely without a need for a profile at all)
  • a printer instance for (semi)glossy black and white photographic prints (likely without a need for a profile at all)
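
Duplicating a queue boils down to registering extra CUPS destinations that point at the same device URI and PPD. A minimal sketch; the queue names, device URI and PPD path below are illustrative, check lpstat -v for your actual URI:

# Both queues point at the same physical printer, but can carry
# different default option sets and colord profile assignments:
lpadmin -p r3000-plain -E -v usb://EPSON/Epson%20Stylus%20Photo%20R3000 \
        -P /etc/cups/ppd/r3000.ppd
lpadmin -p r3000-photo-matte -E -v usb://EPSON/Epson%20Stylus%20Photo%20R3000 \
        -P /etc/cups/ppd/r3000.ppd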

One caveat of duplicating a printer in CUPS is that it also creates multiple print queues. If you have sent prints to multiple separate queues, you have a race condition where it’s anybody’s guess which queue delivers the next print to your single physical printer, so prints may come out in a different order than you sent them. My guess is that this disadvantage will hardly be noticeable for most people, and very tolerable to most who would notice it.

One thing to keep in mind is that pdftoraster applies an ICC profile by default using a Perceptual rendering intent, which means that out of gamut colors in a source image are scaled to fit inside the print profile’s gamut. Fundamentally the Perceptual rendering intent trades color accuracy for keeping gradients intact, which is most often a fairly sensible thing to do. Given this tidbit of information, and the fact that pdftoraster assumes sRGB input (unless explicitly told otherwise), I’d like to emphasize the importance of passing the -S parameter with an sRGB profile to colprof when generating print profiles for use on Linux.

To assign an ICC profile to be applied automatically by cups-filters:

sudo cp navigator_colour_documents_120.icc /var/lib/colord/icc/navigator_colour_documents_120.icc
colormgr import-profile /var/lib/colord/icc/navigator_colour_documents_120.icc
colormgr find-profile-by-filename /var/lib/colord/icc/navigator_colour_documents_120.icc
colormgr get-devices-by-kind printer
colormgr device-add-profile \
         /org/freedesktop/ColorManager/devices/cups_EPSON_Epson_Stylus_Photo_R3000 \
         /org/freedesktop/ColorManager/profiles/icc_c43e7ce085212ba8f85ae634085ecfd3
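
To verify the assignment took, you can query the device’s default profile back out of colord (assuming your colormgr version provides device-get-default-profile):

colormgr device-get-default-profile \
         /org/freedesktop/ColorManager/devices/cups_EPSON_Epson_Stylus_Photo_R3000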

More on Gutenprint media types

In contrast to commercial printer drivers, Gutenprint gives us the opportunity to peek under the covers and find out more about the different media types it supports for your printer. First look up your printer’s model number:

$ grep 'R3000' /usr/share/gutenprint/5.2/xml/printers.xml 
<printer ... name="Epson Stylus Photo R3000" driver="escp2-r3000" ... model="115" ...

Then find the relevant media definition file:

$ grep 'media src' /usr/share/gutenprint/5.2/xml/escp2/model/model_115.xml 
<media src="escp2/media/f360_ultrachrome_k3v.xml"/>

Finally you can dig through the relevant media type definitions, where the Density parameter is of particular interest:

$ less /usr/share/gutenprint/5.2/xml/escp2/media/f360_ultrachrome_k3v.xml
<paper ... text="Plain Paper" ... PreferredInkset="ultra3matte">
  <ink ... name="ultra3matte" text="UltraChrome Matte Black">
    <parameter type="float" name="Density">0.720000</parameter>
<paper ... text="Archival Matte Paper" ... PreferredInkset="ultra3matte">
  <ink ... name="ultra3matte" text="UltraChrome Matte Black">
    <parameter type="float" name="Density">0.920000</parameter>
<paper ... text="Premium Glossy Photo Paper" ... PreferredInkset="ultra3photo">
  <ink ... name="ultra3photo" text="UltraChrome Photo Black">
    <parameter type="float" name="Density">0.720000</parameter>
<paper ... text="Premium Semigloss Photo Paper" ... PreferredInkset="ultra3photo">
  <ink ... name="ultra3photo" text="UltraChrome Photo Black">
    <parameter type="float" name="Density">0.720000</parameter>
<paper ... text="Photo Paper" ... PreferredInkset="ultra3photo">
  <ink ... name="ultra3photo" text="UltraChrome Photo Black">
    <parameter type="float" name="Density">1.000000</parameter> 

Dedicated Grey Neutralization Profile

As mentioned earlier, the Epson R3000 uses warm tone grey inks, which results in very pleasant true black & white images, without any color inks used, at least when Gutenprint is told to print in Greyscale mode.

If, unlike me, you don’t like the warm tone effect, applying the ICC profile we generated should mostly neutralize it, though possibly not perfectly. That is fine for neutral areas in color prints, but may be less satisfactory for proper black & white prints.

While I haven’t done any particular testing on this issue, you may want to consider generating a second profile dedicated and tuned to grey neutralization. Just follow the normal profiling procedure with the following target generation command instead:

targen -v -d 3 -G -g 64 -f 210 -c previous_color_profile.icc -A 1.0 -N 1.0 grey_neutral_profile

Obviously you’ll need to use this particular profile in RGB color mode, even though your end goal may be monochrome, given that the profile needs to use color inks to compensate for the warm tone grey inks.

Transcoding DV Tapes

Today we enjoy half decent high definition video recording capabilities in pretty much any device that has an imaging sensor in it. About ten years ago that wasn’t the case; back then people bought so-called DV cameras. These DV cameras recorded more or less DVD resolution footage with limited compression onto digital video (DV) tapes, usually 60 minutes per tape. Some awkward choices were made in the DV format, particularly recording at 50 interlaced fields per second, which can be deinterlaced to either 25 or 50 actual frames per second.

Most DV cameras allowed the content of a DV tape to be directly transferred to a computer fitted with a FireWire 400 interface; a 60 minute tape results in slightly less than 15 GB of transferred data. This is fairly easy to do on a modern Linux system (keep in mind that the transfer is 1:1, so a 60 minute tape requires 60 minutes to transfer entirely):

# dvgrab -rewind -showstatus -timestamp -autosplit -size 20000 -format qt

While most media players on Linux will play these captured video fragments directly, many media players on other platforms won’t. Also, the resulting files are ridiculously large, so they aren’t particularly handy to keep around indefinitely. Typically you’ll want to transcode them into something more efficient, for example (at its simplest):

# mkdir mp4
# for F in *.mov; do \
    avconv \
      -i "$F" -filter:v yadif=1 -r 50 \
      -c:v libx264 -preset:v fast -profile:v main -level:v 31 -tune:v film -g 50 -crf 18 \
      -c:a ac3 -b:a 320k \
      "mp4/${F}.mp4"; \
  done

The above command will transcode all captured video fragments to high quality H.264/AVC video and AC-3 (Dolby Digital) audio. The effective video quality is controlled via the CRF parameter; transcoding a set of video fragments totaling about 60 minutes resulted in the following sizes for me:

CRF     17      18      19      20      21      22      23
Size    4.9 GB  4.1 GB  3.4 GB  2.8 GB  2.4 GB  2.0 GB  1.7 GB

Keep in mind that CRF encoding tries to keep the quality constant, so if you have very unsteady or action rich footage the resulting files may end up bigger for you. Also keep in mind that some filesystems don’t allow single files to exceed 2 GB, so if you have a single continuous piece of footage of 60 minutes you should probably use CRF 23; otherwise I’d recommend sticking to CRF 18 for very good quality.

Ideally we’d like our filesystem modification dates to match the recording timestamp, and having some metadata in each file can be helpful at times as well, so alternatively we might transcode like so:

# mkdir mp4
# for F in *.mov; do \
    avconv \
      -i "$F" -filter:v yadif=1 -r 50 \
      -c:v libx264 -preset:v fast -profile:v main -level:v 31 -tune:v film -g 50 -crf 18 \
      -c:a ac3 -b:a 320k \
      -metadata title="$(echo ${F} | sed 's#dvgrab-##' | sed 's#.mov##' | tr '_' ' ' | tr '-' ':')" \
      -metadata artist="John Doe" \
      -metadata album="Holiday" \
      -metadata description="JVC GR-XXXX" \
      mp4/PAL50_$(echo ${F} | sed 's#dvgrab-##' | sed 's#.mov##' | tr -d '.' | tr -d '-').MP4; \
    touch \
      -t $(echo ${F} | sed 's#dvgrab-##' | sed 's#.mov##' | tr -c '[0-9]\n' '_' | awk -F '_' '{print $1 $2 $3 $4 $5 "." $6}') \
      mp4/PAL50_$(echo ${F} | sed 's#dvgrab-##' | sed 's#.mov##' | tr -d '.' | tr -d '-').MP4; \
  done
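
To illustrate what all the sed/tr/awk munging does, assuming dvgrab’s timestamped naming scheme, a single capture file is mapped as follows:

dvgrab-2005.07.02_11-23-18.mov
  -metadata title   ->  "2005.07.02 11:23:18"
  output filename   ->  mp4/PAL50_20050702_112318.MP4
  touch -t argument ->  200507021123.18  (2005-07-02 11:23:18)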

Lastly you might want to burn the resulting video files to a DVD; generating a good ISO image is easy enough as well:

# cd mp4
# md5sum *.MP4 > MD5SUMS.TXT
# genisoimage -J -l -V HOLIDAY_2005 -o ../HOLIDAY_2005.ISO .
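
The MD5SUMS.TXT included on the disc lets you verify the files later, for example after mounting the burned DVD (the mount point is illustrative):

# cd /media/dvd
# md5sum -c MD5SUMS.TXT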

Most vaguely recent Blu-ray players should be able to play the files on the resulting disc.

KMZ Zorki 4 (Soviet Rangefinder)

The Leica rangefinder

Rangefinder type cameras predate modern single lens reflex cameras, but people still use them; it’s just a different way of shooting. Since they’re no longer a mainstream type of camera, most manufacturers stopped making them a long time ago. Except Leica: Leica still makes digital and film rangefinders, and as you might guess, they come at significant cost. Even old Leica film rangefinders easily cost upwards of €1000. While Leica certainly wasn’t the only brand to manufacture rangefinders throughout photographic history, it was (and still is) certainly the most iconic rangefinder brand.

The Zorki rangefinder

Now the Soviets essentially tried to copy Leica’s cameras, the result of which, the Zorki series of cameras, was produced at KMZ. Many different versions exist, and with nearly 2 million cameras produced across more than 15 years, the Zorki-4 was without a doubt its most popular incarnation. Many consider the Zorki-4 to be the one where the Soviets got it (mostly) right.

That said, the Zorki-4 vaguely looks like a Leica M with its single coupled viewfinder/rangefinder window. In most other ways it’s more like a pre-M Leica, with its 39mm LTM lens screw mount. Earlier Zorki-4s have a body finished with vulcanite, which is tough as nails, but if damaged is very difficult to fix or replace. Later Zorki-4s have a body finished with relatively cheap leatherette, which is much more easily damaged and commonly starting to peel off, but should be relatively easy to make better than new. Most Zorkis come with either a Jupiter-8 50mm f/2.0 lens (a Zeiss Sonnar inspired design) or an Industar-50 50mm f/3.5 (a Zeiss Tessar inspired design). I’d highly recommend getting a Zorki-4 with a Jupiter-8 if you can find one.

Buying a Zorki rangefinder with a Jupiter lens

If you’re looking to buy a Zorki there are a few things to be aware of. Zorkis were produced during the fifties, sixties and seventies in Soviet Russia, often favoring quantity over quality, presumably to be able to meet quotas. The same is likely true for most Soviet optics as well. So they are both old and may not have met the highest quality standards to begin with. When buying a Zorki you therefore need to keep in mind it might need repairs and a CLA (clean, lube, adjust). My particular Zorki had a dim viewfinder because of dirt both inside and out, the shutter speed dial was completely stuck at 1/60th of a second and the film takeup spool was missing. I sent my Zorki-4 and Jupiter-8 to Oleg Khalyavin for repairs, shutter curtain replacement and a CLA. Oleg was also able to provide me with a replacement film takeup spool or two as well. All in all, having work done on your Zorki will easily set you back about €100 including significant shipping expenses. Keep this in mind before buying. And even if you get your Zorki in a usable state, you’ll probably have to have it serviced at some point. You may very well want to consider having it serviced sooner rather than later, allowing yourself the benefit of enjoying a newly serviced camera.

Complementary accessories

Zorkis usually come without a lens hood, and the Jupiter-8’s glass elements are said to be only single coated, so a lens hood isn’t exactly a luxury. A suitable aftermarket lens hood isn’t hard to find though.

While my Zorki did come with its original clumsy (and in my case stinky) leather carrying case, it doesn’t come with a regular camera strap. Matin’s Deneb-12LN leather strap can be an affordable but stylish companion to the Zorki. The strap is relatively short, but long enough to wear around your neck or arm. It’s also fairly stiff when brand new, but it will loosen up after a few days of use. The strap does seem to show signs of wear fairly quickly though.

To some it might seem as if the Zorki has a hot shoe, but it doesn’t; it’s actually a cold shoe, merely intended as an accessory mount. And since it’s all metal, a flash connected via PC sync is likely to have its contacts permanently shorted. To mount a regular hot shoe flash you will need a hot shoe adapter, both for isolation and for PC sync connectivity.

Choosing a film stock

So now you have a nice Zorki-4, waiting for film to be loaded into it. As of this writing (2015) there is a smörgåsbord of film available. I like shooting black & white, and I often shoot Ilford XP2 Super 400. Ilford’s XP2 is the only B&W film left that’s meant to be processed along with color print film in regular C41 chemicals (so it can be processed by a one-hour-photo service, if you’re lucky enough to still have one of those around). Like most color print film, XP2 has a big exposure latitude, remaining usable between ISO 50 and 800, which isn’t a luxury since the Zorki-4 is not equipped with a built-in lightmeter. While Ilford recommends shooting it at ISO 400, I’d suggest shooting it as if it were ISO 200 film, giving you two stops of both underexposure and overexposure leeway.

With regard to color print film, I’ve only shot Kodak Gold 200 thus far, with pretty decent results. Kodak New Portra 400 quickly comes to mind as another good option. An inexpensive alternative could be Fuji Superia X-TRA 400, which can be found very cheaply as most store-brand 400 speed color print film.

Shooting with a Zorki rangefinder

Once you have a Zorki, there are still some caveats you need to be aware of. Most importantly, don’t change shutter speeds while the shutter isn’t cocked (cocking the shutter is done by advancing the film); not heeding this warning may damage the camera’s internal mechanisms. Other notable issues of lesser importance are minding the viewfinder’s parallax error (particularly when shooting at short distances) and making sure you load the film straight; I’ve managed to load film at a slight angle a couple of times already.

As I’ve mentioned, the Zorki-4 does not come with a built-in lightmeter, which means the camera won’t be helping you get the exposure right: you are on your own. You could use a pricy dedicated light meter (or a less pricy smartphone app, which may or may not work well on your particular phone), either of which is fairly cumbersome. However, XP2’s wide exposure latitude makes an educated guesswork approach feasible. There’s a rule of thumb system called Sunny 16 for making educated guesstimates of exposure in outdoor environments. Sunny 16 states that if you set your shutter speed to the closest reciprocal of your film speed, bright sunny daylight requires an aperture of f/16 to get a decent exposure. Other weather conditions require opening up the aperture according to this table:


Sunny    Slightly Overcast    Overcast    Heavy Overcast    Open Shade
f/16     f/11                 f/8         f/5.6             f/4

If you have doubts when classifying shooting conditions, you may want to err on the side of overexposure, as color print film tends to prefer overexposure over underexposure. If you’re shooting slide film you should probably avoid using Sunny 16 altogether, as slide film can be very unforgiving if improperly exposed. Additionally, you can manually read a film canister’s DX CAS code to see what a film’s minimum exposure tolerance is.

Quick example: when shooting XP2 on an overcast day, assuming an alternate base ISO of 200 (as suggested earlier), the shutter speed should be set at 1/250th of a second and the aperture at f/8, giving a fairly large depth of field. Now if we want to reduce our depth of field we can trade +2 aperture stops for -2 stops of shutter speed, ending up shooting at 1/1000th of a second at f/4.

Having film processed

After shooting a roll of XP2 (or any roll of color print film) you need to take it to a local photo shop, chemist or supermarket to have it processed, scanned and printed. Usually you’ll be able to have your film processed in C41 chemicals, scanned to CD and get a set of small prints for about €15 or so. Keep in mind that most shops, if left to their own devices, will cut your film roll into strips of 4, 5 or 6 negatives, depending on the type of protective sleeves they use. Some shops might not offer scanning services without ordering prints, since scanning may be considered a byproduct of the printmaking process. Resulting JPEG scans are usually about 2 megapixel (1800×1200), or sometimes slightly less (1536×1024).

A particular note when using XP2: since it’s processed as if it were color print film, it’s usually also scanned as if it were color print film, so the resulting should-be-monochrome scans (and prints for that matter) can often have a slight color cast. This color cast varies; my local lab usually does a fairly decent job, where the scans have a subtle color cast which isn’t too unpleasant, but I’ve heard about nastier, heavier color casts as well. Regardless, keep in mind that you might need to convert the scans to proper monochrome manually, which can easily be done with any random photo editing software in a heartbeat. The same goes for rotating the images: aside from the usual 90 degree turns, occasionally I get my images scanned upside down, where they need either 180 or 270 degree turns, and you’ll likely need to do that yourself as well.

Post-processing the scans

Generally speaking, I personally like preprocessing my scanned images using some scripted command line tools before importing them into an image management program like Shotwell.

First I remove all useless data from the source JPEG; in particular for black and white film like XP2, I remove the JPEG’s chroma channels, to losslessly remove any color cast (avoiding generational loss):

$ jpegtran -copy none -grayscale -optimize -perfect ORIGINAL.JPG > OUTPUT.JPG

Using the clean image we previously created as a base, we can then add basic EXIF metadata:

$ exiv2 \
   -M"set Exif.Image.Artist John Doe" \
   -M"set Exif.Image.Make KMZ" \
   -M"set Exif.Image.Model Zorki-4" \
   -M"set Exif.Image.ImageNumber \
      $(echo ORIGINAL.JPG | tr -cd '0-9' | sed 's#^0*##g')" \
   -M"set Exif.Image.Orientation 1" \
   -M"set Exif.Image.XResolution 300/1" \
   -M"set Exif.Image.YResolution 300/1" \
   -M"set Exif.Image.ResolutionUnit 2" \
   -M"set Exif.Photo.DateTimeDigitized \
      $(stat --format="%y" ORIGINAL.JPG | awk -F '.' '{print $1}' | tr '-' ':')" \
   -M"set Exif.Photo.UserComment Ilford XP2 Super" \
   -M"set Exif.Photo.ExposureProgram 1" \
   -M"set Exif.Photo.ISOSpeedRatings 400" \
   -M"set Exif.Photo.FocalLength 50/1" \
   -M"set Exif.Image.MaxApertureValue 20/10" \
   -M"set Exif.Photo.LensMake KMZ" \
   -M"set Exif.Photo.LensModel Jupiter-8" \
   -M"set Exif.Photo.FileSource 1" \
   -M"set Exif.Photo.ColorSpace 1" \
   OUTPUT.JPG

As I previously mentioned, I tend to get my scans back upside down, which is why I usually set the Orientation tag to 3 (180 degree turn). Other useful values are 1 (do nothing), 6 (rotate 90 degrees clockwise) and 8 (rotate 270 degrees clockwise).
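
If you’d rather rotate the pixels themselves instead of relying on viewers to honor the Orientation tag, jpegtran can also rotate losslessly; if you go this route, leave the Orientation tag at 1:

$ jpegtran -rotate 180 -optimize OUTPUT.JPG > ROTATED.JPG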

Keeping track

When you’re going to shoot a lot of film it can become a bit of a challenge keeping track of the various rolls of film you may have at an arbitrary point in your workflow. FilmTrackr has you covered.

Manual

You can find a scanned manual for the Zorki-4 rangefinder camera on Mike Butkus’ website.

Moar

If you want to read more about film photography you may want to consider adding Film Is Not Dead and Hot Shots to your bookshelf. You may also want to browse through istillshootfilm.org which seems to be a pretty good resource as well. And for your viewing pleasure, the [FRAMED] Film Show on YouTube.

KVM Basics

Not everybody likes having a full invasive install of VMware Workstation, VirtualBox or virt-manager for that matter. Luckily KVM can be installed without pulling in too many dependencies or having to kludge in poorly maintained kernel modules.

Using KVM bare isn’t particularly difficult. First you’ll need to create a disk image (in this particular example 50 GB, thinly provisioned):

qemu-img create -f qcow2 disk.img 50G
Formatting 'disk.img', fmt=qcow2 size=53687091200 encryption=off cluster_size=65536 ...

Next we’ll need to start KVM (assuming a Windows guest OS), for example:

kvm -m 2048 -localtime -monitor stdio -soundhw ac97 -usb -usbdevice tablet -hda disk.img

The -m 2048 parameter assigns 2 GB of virtual RAM. The -localtime parameter should be included for non-UNIX guest operating systems that do not save the system clock as GMT. The -monitor stdio parameter allows KVM to be controlled via its monitor interface presented on the terminal it was started from. The -soundhw parameter selects which audio hardware KVM should emulate; the optimal choice depends heavily on the guest operating system. The -usb -usbdevice tablet parameters tell KVM to emulate a tablet pointer, which at least for Windows allows decent mouse performance without requiring a guest driver.
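
For the very first boot you’ll typically want to attach an installation ISO image and boot from it, for example (install.iso being a placeholder for your actual installation media):

kvm -m 2048 -localtime -monitor stdio -soundhw ac97 -usb -usbdevice tablet \
    -hda disk.img -cdrom install.iso -boot d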

Once KVM is started, you’ll notice its monitor interface popping up on the terminal.

QEMU 2.0.0 monitor - type 'help' for more information

With this monitor interface you’ll be able to control the KVM virtual machine, for example changing or ejecting an emulated floppy disk image:

(qemu) change floppy0 myfloppy.img
(qemu) eject floppy0

Or configuring the KVM virtual machine to (re)boot from a CD-ROM image:

(qemu) change ide1-cd0 mycdrom.iso
(qemu) boot_set d 
(qemu) system_reset

Obviously there is more to KVM’s monitor interface; of course it accepts a help command, which will provide you with an elaborate list of possibilities and options.

OpenBSD Stable ISO

The OpenBSD project distributes a binary base system and packages, built from sources at release time. Any security or stability fixes after release require sources to be rebuilt by the end user. While this may not be much of an issue with either small deployments or fast systems, occasionally there might be a need to build your own Stable ISO for repeated installation, or quick installation onto low end systems (netbooks?). The procedure at hand is reasonably well documented, if slightly dispersed.

For this tutorial I’ll presume you have dedicated a specific multicore AMD64 machine to the purpose of building this Stable ISO, targeting either an AMD64 or i386 build. Adjust where required for your own purposes. I’d recommend against executing this procedure on production systems though.

Most steps in this tutorial will take between 5 and 10 minutes on vaguely recent hardware (Core 2 Duo), unless noted otherwise.

First do a basic install of OpenBSD (6.0 in our particular example). I’d highly recommend enabling NTP for time syncing and performing custom disk slicing, so you have plenty of space in /usr, because you will need it.

Step 1: Preparing sources

Log in as a regular user, and then su to root. The name of this regular user (and the FQDN) will show up in your newly built kernel’s dmesg.

Then get all the OpenBSD source tarballs and unpack them accordingly.

cd /usr
ftp ftp://ftp.whatever.org/pub/OpenBSD/6.0/ports.tar.gz
ftp ftp://ftp.whatever.org/pub/OpenBSD/6.0/sys.tar.gz
ftp ftp://ftp.whatever.org/pub/OpenBSD/6.0/src.tar.gz
ftp ftp://ftp.whatever.org/pub/OpenBSD/6.0/xenocara.tar.gz
ftp ftp://ftp.whatever.org/pub/OpenBSD/6.0/SHA256.sig
signify -C -p /etc/signify/openbsd-60-base.pub -x SHA256.sig ports.tar.gz
signify -C -p /etc/signify/openbsd-60-base.pub -x SHA256.sig sys.tar.gz
signify -C -p /etc/signify/openbsd-60-base.pub -x SHA256.sig src.tar.gz
signify -C -p /etc/signify/openbsd-60-base.pub -x SHA256.sig xenocara.tar.gz
cd /usr/src
tar xzf ../sys.tar.gz
tar xzf ../src.tar.gz
cd /usr
tar xzf xenocara.tar.gz
tar xzf ports.tar.gz

These are the unpatched release sources, so we’ll need to update them from CVS.

cvs -qd anoncvs@anoncvs.whatever.org:/cvs get -rOPENBSD_6_0 -P src
cvs -qd anoncvs@anoncvs.whatever.org:/cvs get -rOPENBSD_6_0 -P xenocara
cvs -qd anoncvs@anoncvs.whatever.org:/cvs get -rOPENBSD_6_0 -P ports 

Remove the old release source tarballs, and generate new updated source tarballs.

rm SHA256.sig
rm ports.tar.gz
rm xenocara.tar.gz
rm sys.tar.gz
rm src.tar.gz
tar czf ports.tar.gz ports
tar czf xenocara.tar.gz xenocara
cd /usr/src
mv sys ..
tar czf ../src.tar.gz .
cd /usr
tar czf sys.tar.gz sys
rm -Rf src sys xenocara ports
mkdir /usr/src
cd /usr/src
tar xzf ../src.tar.gz
tar xzf ../sys.tar.gz
cd /usr
tar xzf xenocara.tar.gz
tar xzf ports.tar.gz

Next…

Step 2: Building sources

First we’ll need to build and install an updated kernel:

cd /usr/src/sys/arch/i386/conf
config GENERIC.MP
cd /usr/src/sys/arch/i386/compile/GENERIC.MP
make clean && make
cd /usr/src/sys/arch/i386/compile/GENERIC.MP
make install
reboot

Make sure you’ve rebooted your system after having installed the new kernel, then login as your regular user again and su to root.

Next we’ll build (~1hour) and install an updated userland.

rm -rf /usr/obj/*
cd /usr/src
make obj
cd /usr/src/etc && env DESTDIR=/ make distrib-dirs
cd /usr/src
make build
reboot

Again make sure you’ve rebooted your system after having installed the new userland, then login as your regular user again and su to root.

Next we’ll build (~1hour) and install an updated Xenocara.

cd /usr/xenocara
rm -rf /usr/xobj/*
make bootstrap
make obj
make build
reboot

Next…

Step 3: Building a release

After having rebooted login as a regular user once again and su to root.

Then build a release like so.

export DESTDIR=/usr/dest
export RELEASEDIR=/usr/rel
mkdir -p ${DESTDIR} ${RELEASEDIR}
cd /usr/src/etc
make release
cd /usr/src/distrib/sets
sh checkflist

Then we do the same for Xenocara.

export DESTDIR=/usr/xdest
export RELEASEDIR=/usr/rel
mkdir -p ${DESTDIR} ${RELEASEDIR}
cd /usr/xenocara
make release

Next…

Step 4: Building Ports (optional)

Optionally you can build some ports, to include on your Stable ISO, for example…

cd /usr/ports/security/gnupg
env FLAVOR= make install
cd /usr/ports/shells/bash
make install
cd /usr/ports/editors/nano
make install
cd /usr/ports/www/links+ 
env FLAVOR=no_x11 make install
cd /usr/ports/net/wget
make install
cd /usr/ports/net/rsync
make install
cd /usr/ports/archivers/unzip
make install
cd /usr/ports/devel/gmake
make install
cd /usr/ports/lang/go
make install
cd /usr/ports/devel/git
make install

And so on…

You’ll note we’ve been using make install as opposed to make package, since make package won’t pull in run-time dependencies: they don’t matter at build time, but if missing they will likely prevent the package from installing properly.

Step 5: Create an ISO image

Prepare and populate a CD root tree.

mkdir -p /usr/cd/etc
echo 'set image /6.0/i386/bsd.rd' > /usr/cd/etc/boot.conf
mkdir -p /usr/cd/6.0/i386
cd /usr/cd
cp /usr/rel/* /usr/cd/6.0/i386
cp /usr/*.tar.gz /usr/cd/6.0

The release set includes a miniature ISO that merely contains the installation ramdisk, which doesn’t make much sense to include on a full Stable ISO, so optionally we’ll remove it.

cd /usr/cd/6.0/i386
rm cd60.iso 
rm SHA256; cksum -a sha256 * > SHA256

Then optionally add checksums for the source tarballs.

cd /usr/cd/6.0
rm SHA256; cksum -a sha256 *.tar.gz > SHA256

Optionally add the packages built from ports.

mkdir -p /usr/cd/6.0/packages/i386
cp /usr/ports/packages/i386/all/*.tgz /usr/cd/6.0/packages/i386

Then optionally add checksums for the packages built from ports.

cd /usr/cd/6.0/packages/i386
rm SHA256; cksum -a sha256 * > SHA256

And finally build the ISO image.

cd /usr/cd
mkhybrid -v -a -r -L -l -d -D -N \
         -sysid OPENBSD \
         -V OPENBSD \
         -volset OPENBSD \
         -p "PREPARER NAME" \
         -P "PUBLISHER NAME" \
         -b 6.0/i386/cdbr \
         -c 6.0/i386/boot.cat \
         -o ../unofficial-openbsd-stable-6.0.5-20160903-i386.iso .

Since OpenBSD 5.5, both the base system and packages are signed for proper releases. The above procedure will produce an unsigned base system and packages, resulting in (expected) signature warnings during installation.

Step 6: Burn

Once you have your freshly mastered ISO, you can burn it to your favorite brand of CD-R:

cdio -f cd0c tao -s 8 unofficial-openbsd-stable-6.0.5-20160903-i386.iso

And don’t forget to buy the official release media if you use OpenBSD in any significant capacity, as the project can really use your support.

Firefox and Color Management

For some time now Firefox has been capable of doing some level of color management of web content, though there have always been caveats. Currently Firefox (version 26) enables color management only for images that have explicitly been tagged with a color profile (which isn’t that common yet). This default behavior results in a number of problems.

When an image is tagged with a color profile, Firefox converts that image to your display profile (if configured), or otherwise to sRGB. Untagged images and other color elements, defined by CSS for example, are assumed to be sRGB and are not converted to your display profile (if configured), even though they should be. This means that if you do not have a display profile configured, everything is well, since everything is either sRGB or is converted to sRGB. However, if you do have a display profile configured, particularly if your display deviates significantly from sRGB, you might notice page elements which are composited from multiple sources (tagged images and CSS for example) having mismatching colors. This is essentially a bug: all page elements should always be converted to the same colorspace (whether that be sRGB or the display profile).

Firefox versions predating 19 required the user to manually configure a specific display profile, but since version 19 Firefox should automatically pick up the system display profile if properly configured.

So to get Firefox to do complete color management, you’ll need to set a few parameters using about:config, or you can do the following on Ubuntu to enable it system wide:

$ sudo sh -c 'echo "pref(\"gfx.color_management.rendering_intent\", 0);" >> /etc/firefox/syspref.js'
$ sudo sh -c 'echo "pref(\"gfx.color_management.mode\", 1);" >> /etc/firefox/syspref.js'
$ sudo sh -c 'echo "pref(\"gfx.color_management.enablev4\", true);" >> /etc/firefox/syspref.js'

IMPORTANT: You do need to be aware that enabling these features means slightly increasing Firefox’s security surface.

Display Color Profiling (on Linux)

Attention: This article is a work in progress, based on my own practical experience up until the time of writing, so you may want to check back periodically to see if it has been updated.

This article outlines how you can calibrate and profile your display on Linux, assuming you have the right equipment (either a colorimeter, like for example the i1 Display Pro, or a spectrophotometer, like for example the ColorMunki Photo). For a general overview of what color management is and details about some of its parlance, you may want to read this before continuing.

A Fresh Start

First you may want to check if any kind of color management is already active on your machine; if you see the following then you’re fine:

$ xprop -display :0.0 -len 14 -root _ICC_PROFILE
_ICC_PROFILE:  no such atom on any window.

However if you see something like this, then there is already another color management system active:

$ xprop -display :0.0 -len 14 -root _ICC_PROFILE
_ICC_PROFILE(CARDINAL) = 0, 0, 72, 212, 108, 99, 109, 115, 2, 32, 0, 0, 109, 110

If this is the case you need to figure out what and why… For GNOME/Unity based desktops this is fairly typical, since they extract a simple profile from the display hardware itself via EDID and use that by default. I’m guessing KDE users may want to look into this before proceeding. I can’t give much advice about other desktop environments though, as I’m not particularly familiar with them. That said, I tested most of the examples in this article with XFCE 4.10 on Xubuntu 14.04 “Trusty”.

Display Types

Modern flat panel displays are, for purposes of our discussion, comprised of two major components: the backlight and the panel itself. There are various types of backlights: White LED (most common nowadays), CCFL (most common a few years ago), RGB LED and Wide Gamut CCFL, the latter two of which you’d typically find on higher end displays. The backlight primarily defines a display’s gamut and maximum brightness. The panel on the other hand primarily defines the maximum contrast and acceptable viewing angles. The most common panel types are variants of IPS (usually good contrast and viewing angles) and TN (typically mediocre contrast and poor viewing angles).

Display Setup

There are two main cases: laptop displays, which usually allow for little configuration, and regular desktop displays. For regular displays there are a few steps to prepare the display to be profiled. First you need to reset your display to its factory defaults; we leave the contrast at its default value. If your display has a feature called dynamic contrast you need to disable it; this is critical, and if you’re unlucky enough to have a display on which it cannot be disabled, then there is no use in proceeding any further. Then we set the color temperature setting to custom and set the R/G/B values to equal values (often 100/100/100 or 255/255/255). As for the brightness, set it to a level which is comfortable for prolonged viewing; typically this means reducing the brightness from its default setting, often to somewhere around 25-50 on a 0-100 scale. Laptops are a different story: you’ll often be fighting different lighting conditions, so you may want to consider profiling your laptop at its full brightness. We’ll get back to the brightness setting later on.

Before continuing any further, let the display settle for at least half an hour (as its color rendition may change while the backlight is warming up) and make sure the display doesn’t go into power saving mode during this time.

Another point worth considering is cleaning the display before starting the calibration and profiling process. Do keep in mind that displays often have relatively fragile coatings, which may be deteriorated by traditional cleaning products or easily scratched by regular cleaning cloths. There are specialist products available for safely cleaning computer displays.

You may also want to consider dimming the ambient lighting while running the calibration and profiling procedure to prevent (potential) glare from being an issue.

Software

If you’re in a GNOME or Unity environment it’s highly recommended to use GNOME Color Manager (with colord and argyll). If you have recent versions (3.8.3, 1.0.5 and 1.6.2 respectively), you can profile and set up your display completely graphically via the Color applet in System Settings. It’s fully wizard driven and couldn’t be much easier in most cases. This is what I personally use and recommend. The rest of this article focuses on the case where you are not using it.

Xubuntu users in particular can get experimental packages for the latest argyll and optionally xiccd from my xiccd-testing PPAs. If you’re using a different distribution you’ll need to source help from its respective community.

Report On The Uncalibrated Display

To get an idea of the display’s uncalibrated capabilities we use argyll’s dispcal:

$ dispcal -H -y l -R
Uncalibrated response:
Black level = 0.4179 cd/m^2
50%   level = 42.93 cd/m^2
White level = 189.08 cd/m^2
Aprox. gamma = 2.14
Contrast ratio = 452:1
White     Visual Daylight Temperature = 7465K, DE 2K to locus =  3.2

Here we see the display has a fairly high uncalibrated native whitepoint of almost 7500K, which means the display is bluer than it should be. When we’re done you’ll notice the display becoming more yellow. If your display’s uncalibrated native whitepoint is below 6500K, you’ll notice it becoming more blue when loading the profile.

Another point to note is the fairly high white level (brightness) of almost 190 cd/m2. It’s fairly typical to target 120 cd/m2 for the final calibration, keeping in mind that we’ll lose 10 cd/m2 or so to the calibration itself. So if your display reports a brightness significantly higher than 130 cd/m2, you may want to consider turning down the brightness another notch.

Calibrating And Profiling Your Display

First we’ll use argyll’s dispcal to measure and adjust (calibrate) the display, compensating for the display’s whitepoint (targeting 6500K) and gamma (targeting the industry standard 2.2, more info on gamma here):

$ dispcal -v -m -H -y l -q l -t 6500 -g 2.2 asus_eee_pc_1215p

Next we’ll use argyll’s targen to generate measurement patches to determine the display’s gamut:

$ targen -v -d 3 -G -f 128 asus_eee_pc_1215p

Then we’ll use argyll’s dispread to apply the calibration file generated by dispcal, and measure (profile) the display’s gamut using the patches generated by targen:

$ dispread -v -N -H -y l -k asus_eee_pc_1215p.cal asus_eee_pc_1215p

Finally we’ll use argyll’s colprof to generate a standardized ICC (version 2) color profile:

$ colprof -v -D "Asus Eee PC 1215P" -C "Copyright 2013 Pascal de Bruijn" \
          -q m -a G -n c asus_eee_pc_1215p
Profile check complete, peak err = 9.771535, avg err = 3.383640, RMS = 4.094142

The parameters used to generate the ICC color profile are fairly conservative and should be fairly robust. They will likely provide good results for most use-cases. If you’re after better accuracy you may want to try replacing -a G with -a S or even -a s, but I very strongly recommend starting out using -a G.

You can inspect the contents of a standardized ICC (version 2 only) color profile using argyll’s iccdump:

$ iccdump -v 3 asus_eee_pc_1215p.icc

To try the color profile we just generated we can quickly load it using argyll’s dispwin:

$ dispwin -I asus_eee_pc_1215p.icc

Now you’ll likely see a color shift toward the yellow side. For some possibly aged displays you may notice it shifting toward the blue side.

If you’ve used a colorimeter (as opposed to a spectrophotometer) to profile your display and if you feel the profile might be off, you may want to consider reading this and this.

Report On The Calibrated Display

Next we can use argyll’s dispcal again to check our newly calibrated display:

$ dispcal -H -y l -r
Current calibration response:
Black level = 0.3432 cd/m^2
50%   level = 40.44 cd/m^2
White level = 179.63 cd/m^2
Aprox. gamma = 2.15
Contrast ratio = 523:1
White     Visual Daylight Temperature = 6420K, DE 2K to locus =  1.9

Here we see the calibrated display’s whitepoint nicely around 6500K, as it should be.

Loading The Profile In Your User Session

If your desktop environment is XDG autostart compliant, you may want to consider creating a .desktop file which will load the ICC color profile at session login for all users:

$ cat /etc/xdg/autostart/dispwin.desktop
[Desktop Entry]
Encoding=UTF-8
Name=Argyll dispwin load color profile
Exec=dispwin -I /usr/share/color/icc/asus_eee_pc_1215p.icc
Terminal=false
Type=Application
Categories=

Alternatively you could use colord and xiccd for a more sophisticated setup. If you do, make sure you have recent versions of both, particularly of xiccd, as it’s still a fairly young project.

First we’ll need to start xiccd (in the background), which detects your connected displays and adds them to colord‘s device inventory:

$ nohup xiccd &

Then we can query colord for its list of available devices:

$ colormgr get-devices

Next we need to query colord for its list of available profiles (or alternatively search by a profile’s full filename):

$ colormgr get-profiles
$ colormgr find-profile-by-filename /usr/share/color/icc/asus_eee_pc_1215p.icc

Next we’ll need to assign our profile’s object path to our display’s object path:

$ colormgr device-add-profile \
           /org/freedesktop/ColorManager/devices/xrandr_HSD121PHW1_70842_pmjdebruijn_1000 \
           /org/freedesktop/ColorManager/profiles/icc_e7fc40cb41ddd25c8d79f1c8d453ec3f

You should notice your display’s colors shift within a second or so (xiccd applies the profile asynchronously), assuming you haven’t already applied it via dispwin earlier (in which case you’ll notice no change).

If you suspect xiccd isn’t properly working, you may be able to debug the issue by stopping all xiccd background processes, and starting it in debug mode in the foreground:

$ killall xiccd
$ G_MESSAGES_DEBUG=all xiccd

Also, in xiccd‘s case you’ll need to create a .desktop file to load xiccd at session login for all users:

$ cat /etc/xdg/autostart/xiccd.desktop
[Desktop Entry]
Encoding=UTF-8
Name=xiccd
GenericName=X11 ICC Daemon
Comment=Applies color management profiles to your session
Exec=xiccd
Terminal=false
Type=Application
Categories=
OnlyShowIn=XFCE;

You’ll note that xiccd does not need any parameters, since it will query colord‘s database for which profile to load.

If your desktop environment is not XDG autostart compliant, you’ll need to find out how it starts custom commands (dispwin or xiccd respectively) at session login.

Dual Screen Caveats

Currently having a dual screen color managed setup is complicated at best. Most programs use the _ICC_PROFILE atom to get the system display profile, and there’s only one such atom. To resolve this issue new atoms were defined to support multiple displays, but not all applications actually honor them. So with a dual screen setup there is always a risk of applications applying the profile for your first display to your second display or vice versa.
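
The display-specific atoms follow an _ICC_PROFILE_n naming pattern, so on a dual screen setup honoring this convention you’d see something roughly like this (values illustrative):

$ xprop -display :0.0 -root | grep _ICC_PROFILE
_ICC_PROFILE(CARDINAL) = 0, 0, 72, 212, ...
_ICC_PROFILE_1(CARDINAL) = 0, 0, 70, 16, ...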

So practically speaking, if you need a reliable color managed setup, you should probably avoid dual screen setups altogether.

That said, most of argyll’s commands support a -d parameter for selecting which display to work with during calibration and profiling, but I have no personal experience with them whatsoever, since I purposefully don’t have a dual screen setup.

Application Support Caveats

As my other article explains, display color profiles consist of two parts. One part (whitepoint & gamma correction) is applied via X11 and thus benefits all applications. There is however a second part (gamut correction) that needs to be applied by the application, and application support for both input and display color management varies wildly. Many consumer grade applications have no color management awareness whatsoever.

Firefox can do color management and it’s half-enabled by default; read this to properly configure Firefox.

GIMP for example has display color management disabled by default; you need to enable it via its preferences.

Eye of GNOME has display color management enabled by default, but it has nasty corner case behaviors; for example, when a file has no metadata no color management is done at all (instead of assuming sRGB input). Some of these issues seem to have been resolved on Ubuntu Trusty (LP #272584).

Darktable has display color management enabled by default and is one of the few applications which directly support colord and the display specific atoms as well as the generic _ICC_PROFILE atom as fallback. There are however a few caveats for darktable as well, documented here.

Video Sharpening Before Encoding

Often after encoding a video and playing it back, it seldom looks as good as I’d expect, especially compared to professionally produced DVDs for example. While there are likely a multitude of differences, one of them seems to be acutance (perceived sharpness), which can be enhanced by applying some sharpening after scaling the source material, before encoding the video.

The following small script encodes a source video to an anamorphic widescreen PAL DVD resolution WebM file at a nominal (total) bitrate of 2 Mbit/s, while applying some sharpening:

#!/bin/sh
INPUT="$1"
SCALE_OPTS="-sws_flags lanczos -s 720x576 -aspect 16:9"
SHARP_OPTS="-vf unsharp=3:3:0.5:3:3:0"
VIDEO_OPTS="-vcodec libvpx -g 120 -lag-in-frames 15 -deadline good -profile 0 -qmax 51 -qmin 11 -slices 4 -b 1800000 -maxrate 3800000"
AUDIO_OPTS="-acodec libvorbis -ac 2 -ab 192000"

avconv -i "${INPUT}" ${SCALE_OPTS} ${SHARP_OPTS} ${VIDEO_OPTS}           -an -pass 1 -f webm -y "out-${INPUT}.webm"
avconv -i "${INPUT}" ${SCALE_OPTS} ${SHARP_OPTS} ${VIDEO_OPTS} ${AUDIO_OPTS} -pass 2 -f webm -y "out-${INPUT}.webm"

Now what particularly matters are the unsharp parameters, which can be divided into two triplets: the first set of three controls luma (brightness/greyscale information) and the second set of three controls chroma (color information). Each set has three parameters, of which the first two are the horizontal and vertical matrix dimensions (i.e. the evaluation area); in our case a small matrix of 3 by 3 is the only configuration that makes sense, as a 5 by 5 matrix would give an exaggerated effect and halo artifacts. The last parameter in each triplet is the strength (respectively 0.5 for luma, and 0 for chroma), which means we’re enhancing acutance for luma and leaving the chroma unmodified. For luma, strength values between 0.5 and 1.0 are likely to be the useful range, depending on taste and source material. Typically you’d want to leave chroma alone, but in some cases you could use a negative value as a poor man’s denoising method, which effectively turns the filter into a blur.
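
Spelled out, the unsharp filter string from the script maps onto its parameters like this:

unsharp =  luma_x : luma_y : luma_amount : chroma_x : chroma_y : chroma_amount
              3   :    3   :     0.5     :     3    :     3    :       0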

Display profiles generated from EDID

If you’re running a GNOME or Unity desktop (and probably recent versions of KDE too), you may notice differences in color rendition between applications. The difference you’re seeing is between applications that apply the system configured display profile and those that don’t. Eye of GNOME and Darktable, for example, do this by default; GIMP doesn’t…

Now, as many people have noticed, most displays render color quite differently, and display profiles are a technical means to correct for that to some degree. There are several ways of obtaining a display profile: one is to buy a measurement device (called a colorimeter) and actually measure your particular display. Some vendors supply a display profile ICC file on a CD that comes with the display. And lastly, more recent displays apparently provide information via EDID (a data format displays use to identify themselves over VGA/DVI/HDMI) which can be used to generate a display profile. These methods are listed in order of decreasing accuracy. For a bit more in-depth information you might want to consider reading this.

At least since distributions have been shipping colord and GNOME Color Manager (so I'm guessing since Oneiric for Ubuntu users), colord actually queries your display for its EDID information, and uses it to generate an automatic display profile, which allows certain applications to correct for the display's behavior.
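
If you're curious what your display actually reports, you can usually dump the raw EDID from sysfs and inspect it with the edid-decode tool; the connector name (card0-HDMI-A-1 here) is just an example and will differ per system:

$ edid-decode /sys/class/drm/card0-HDMI-A-1/edid

The automatically generated profile itself ends up in ~/.local/share/icc/ (as referenced below).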

We Need You

Now, recently we’ve begun to have the impression that some vendors may be shipping bogus information in their displays (possibly under the assumption that it would not be used anyhow). But currently we have no information to substantiate this.

Please read this and this before continuing.

I’d like to ask you to submit your EDID generated profile to the gnome-color-manager-list (you can post to the list without subscribing; your submission will be moderated and thus will take a few days to turn up), including the following:

  • Subject: [EDID] Display Make + Model
  • Attach ~/.local/share/icc/edid-*.icc
  • Display Make (if it’s a laptop, then the laptop make)
  • Display Model (if it’s a laptop, then the laptop model)
  • The Display’s Age (approx. how long ago you bought it)
  • Duty Cycle (light usage: on average a few hours a day; heavy usage: approx. 8 or more hours a day)
  • The output of xprop -display :0.0 -len 14 -root _ICC_PROFILE
  • Subjective Impression (download this SmugMug calibration image and load it into GIMP, then go to GIMP’s Preferences, then to Color Management, and check/uncheck Try to use system monitor profile while keeping an eye on the image; tell us what looks most realistic to you, checked or unchecked, and why…)

After more than 2000 submissions, colord-0.1.34 was released, which should detect and disable cases where displays supply bogus information via EDID. Based on the current statistics it seems about 4% of displays supply bad information.

Working around bad EDID

Assuming some vendors actually provide bad information via EDID, you might need a way to disable this automatically generated profile. In older versions of GNOME Color Manager (3.6 and earlier) there wasn't an easy way to do so. There is however a feasible workaround: install argyll on your system, then assign /usr/share/color/argyll/ref/sRGB.icm to your display (go to the GNOME System Settings, choose the Color applet, choose your display, click Add Profile, select Other Profile, and then select Argyll's sRGB.icm).
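
The same assignment can presumably also be done from the command line using colord's colormgr tool; a rough untested sketch, where the device and profile IDs are placeholders you'd look up on your own system:

$ colormgr get-devices
$ colormgr import-profile /usr/share/color/argyll/ref/sRGB.icm
$ colormgr device-add-profile <device-id> <profile-id>
$ colormgr device-make-profile-default <device-id> <profile-id>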

Missing Memory Card Icons on Ubuntu

Depending on your type of memory card reader, you may have noticed the following on Ubuntu (and possibly other distributions too): when you connect a USB flash drive to your system, an icon pops up informing you the drive has been mounted; but when you insert (for example) an SD card into your cardreader, another icon may pop up using the same icon as the flash drive.

While this isn’t the biggest problem in the world, it’s certainly a nuisance, as you’d need to hover over each icon to see the tooltip explaining which icon represents what. Ideally you’d want the SD card to show up with an appropriate SD card icon.

Which icon is displayed ultimately depends on the disk management done by udisks and, more importantly, udev. In /lib/udev/rules.d/80-udisks.rules (do NOT modify this file) we find the following rules:

SUBSYSTEMS=="usb", ENV{ID_MODEL}=="*SD_Reader*", ENV{ID_DRIVE_FLASH_SD}="1"
SUBSYSTEMS=="usb", ENV{ID_MODEL}=="*Reader*SD*", ENV{ID_DRIVE_FLASH_SD}="1"
SUBSYSTEMS=="usb", ENV{ID_MODEL}=="*CF_Reader*", ENV{ID_DRIVE_FLASH_CF}="1"
SUBSYSTEMS=="usb", ENV{ID_MODEL}=="*SM_Reader*", ENV{ID_DRIVE_FLASH_SM}="1"
SUBSYSTEMS=="usb", ENV{ID_MODEL}=="*MS_Reader*", ENV{ID_DRIVE_FLASH_MS}="1"

The above rules are matched against the model names the devices report to the kernel. With one of my cardreaders, this sadly doesn’t match:

$ dmesg | grep -i Direct-Access
scsi 12:0:0:0: Direct-Access     Generic  Compact Flash    0.00 PQ: 0 ANSI: 2
scsi 12:0:0:1: Direct-Access     Generic  SM/xD-Picture    0.00 PQ: 0 ANSI: 2
scsi 12:0:0:2: Direct-Access     Generic  SDXC/MMC         0.00 PQ: 0 ANSI: 2
scsi 12:0:0:3: Direct-Access     Generic  MS/MS-Pro/HG     0.00 PQ: 0 ANSI: 2
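
You can also ask udev directly which ID_MODEL value it derived for a given slot's device node (/dev/sdk here is merely an example):

$ udevadm info --query=property --name=/dev/sdk | grep ID_MODEL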

To create new rules, we first need to figure out which USB vendor and product IDs belong to the cardreader; you can identify the USB devices attached to your computer like so:

$ lsusb
Bus 002 Device 012: ID 048d:1345 Integrated Technology Express, Inc. Multi Cardreader

Just run the command once before attaching the device and once after, and look for the difference; typically it’ll be the last device in the list. Once we have this information, create a new file (replace pmjdebruijn with your own nickname, using exclusively alphanumeric characters):

$ sudo nano -w /etc/udev/rules.d/80-udisks-pmjdebruijn.rules

In this file we put the following lines:

# ITE, Hama 00055348 V4 Cardreader 35 in 1 USB
SUBSYSTEMS=="usb", ATTRS{idVendor}=="048d", ATTRS{idProduct}=="1345", ENV{ID_INSTANCE}=="0:0", ENV{ID_DRIVE_FLASH_CF}="1"
SUBSYSTEMS=="usb", ATTRS{idVendor}=="048d", ATTRS{idProduct}=="1345", ENV{ID_INSTANCE}=="0:1", ENV{ID_DRIVE_FLASH_SM}="1"
SUBSYSTEMS=="usb", ATTRS{idVendor}=="048d", ATTRS{idProduct}=="1345", ENV{ID_INSTANCE}=="0:2", ENV{ID_DRIVE_FLASH_SD}="1"
SUBSYSTEMS=="usb", ATTRS{idVendor}=="048d", ATTRS{idProduct}=="1345", ENV{ID_INSTANCE}=="0:3", ENV{ID_DRIVE_FLASH_MS}="1"

You’ll notice the idVendor and idProduct coming from the lsusb line above; also, the ID_INSTANCE entries need to have LUNs matching the dmesg lines above. Once you’re done, double-check and save the file, and then you can reload the udev rules:

$ sudo udevadm control --reload-rules

Any newly mounted memory cards should get a proper icon now.
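
If you’d rather verify a rule matches before physically reinserting a card, you can run udev’s test mode against the relevant device path (again using the hypothetical /dev/sdk example); the output should include your ID_DRIVE_FLASH_* variable:

$ sudo udevadm test $(udevadm info --query=path --name=/dev/sdk) 2>&1 | grep ID_DRIVE_FLASH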

Not all cardreaders may be as easy as illustrated above; for example, I have a wonderful cardreader that provides no useful information at all:

$ lsusb
Bus 002 Device 009: ID 05e3:0716 Genesys Logic, Inc. USB 2.0 Multislot Card Reader/Writer
$ dmesg | grep -i Direct-Access
scsi 6:0:0:0: Direct-Access     Generic  STORAGE DEVICE   9744 PQ: 0 ANSI: 0
scsi 6:0:0:1: Direct-Access     Generic  STORAGE DEVICE   9744 PQ: 0 ANSI: 0
scsi 6:0:0:2: Direct-Access     Generic  STORAGE DEVICE   9744 PQ: 0 ANSI: 0
scsi 6:0:0:3: Direct-Access     Generic  STORAGE DEVICE   9744 PQ: 0 ANSI: 0
scsi 6:0:0:4: Direct-Access     Generic  STORAGE DEVICE   9744 PQ: 0 ANSI: 0

In such a case, you’ll need to experiment by actually inserting various types of memory cards, and checking which device got mounted and which LUN it is. In the following example I inserted an SD card, which got mounted as sdk, which turns out to be LUN 0:2, which is what we need for the ID_INSTANCE entries:

$ mount | grep media
/dev/sdk1 on /media/FC30-3DA9 type vfat (rw,nosuid,nodev,uid=1000,gid=1000,shortname=mixed,dmask=0077,utf8=1,showexec,flush,uhelper=udisks)
$ dmesg | grep sdk
sd 12:0:0:2: [sdk] Attached SCSI removable disk
sd 12:0:0:2: [sdk] 248320 512-byte logical blocks: (127 MB/121 MiB)
sd 12:0:0:2: [sdk] No Caching mode page present
sd 12:0:0:2: [sdk] Assuming drive cache: write through
sd 12:0:0:2: [sdk] No Caching mode page present
sd 12:0:0:2: [sdk] Assuming drive cache: write through
 sdk: sdk1

Another peculiarity (or feature) of this drive is that it has 5 LUNs instead of 4; this is because it actually has two SD card slots, one for full size SD cards and one for microSD cards. In the end, after some fiddling, I ended up with:

# Genesys Logic, Conrad SuperReader Ultimate
SUBSYSTEMS=="usb", ATTRS{idVendor}=="05e3", ATTRS{idProduct}=="0716", ENV{ID_INSTANCE}=="0:0", ENV{ID_DRIVE_FLASH_CF}="1"
SUBSYSTEMS=="usb", ATTRS{idVendor}=="05e3", ATTRS{idProduct}=="0716", ENV{ID_INSTANCE}=="0:1", ENV{ID_DRIVE_FLASH_SM}="1"
SUBSYSTEMS=="usb", ATTRS{idVendor}=="05e3", ATTRS{idProduct}=="0716", ENV{ID_INSTANCE}=="0:2", ENV{ID_DRIVE_FLASH_SD}="1"
SUBSYSTEMS=="usb", ATTRS{idVendor}=="05e3", ATTRS{idProduct}=="0716", ENV{ID_INSTANCE}=="0:3", ENV{ID_DRIVE_FLASH_MS}="1"
SUBSYSTEMS=="usb", ATTRS{idVendor}=="05e3", ATTRS{idProduct}=="0716", ENV{ID_INSTANCE}=="0:4", ENV{ID_DRIVE_FLASH_SD}="1"