Step 5: Exporting & visualizing output

The whole point of this package is to make it easier to use other methods!

You can also tour the functions in the function gallery.


Making color maps is an obviously visual process, so it’s good to use visual feedback as much as possible. We’ve already seen a few of these functions in action, specifically plotColorPalette and plotImageArray, which are used in almost every function that produces a recolorize object. I’ll point out three others that I think are quite useful: imDist, plotColorClusters, and splitByColor (which also doubles as an export function).


imDist and imHeatmap

imDist compares two versions of the same image by calculating the color distance between each pair of corresponding pixels, and imHeatmap gives you a few more options for plotting the results. You can use imDist to get the distances between the original image and the color map:
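A minimal sketch, using the example image that ships with the package (the exact argument names are from my reading of the docs; `readImage` and `recoloredImage` are recolorize helpers for getting the original and recolored images as arrays):

```r
library(recolorize)

# fit a color map to the example image that ships with the package:
img <- system.file("extdata/corbetti.png", package = "recolorize")
rc <- recolorize(img, plotting = FALSE)

# distance between each original pixel and its recolored counterpart,
# calculated in CIE Lab space by default:
dist_matrix <- imDist(im1 = readImage(img),
                      im2 = recoloredImage(rc))
```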

The resulting object is a simple matrix with one entry per pixel: the distance, in the given color space, between that pixel's color in the two images. These are essentially residuals:
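For instance, you could summarize the residuals or re-plot them with a fixed color scale (a hedged sketch: `dist_matrix` is assumed to be a matrix returned by imDist, and the `range` argument to imHeatmap is per my reading of the docs):

```r
# assuming dist_matrix is the matrix returned by imDist():
mean(abs(dist_matrix), na.rm = TRUE)  # average residual per pixel
sum(dist_matrix^2, na.rm = TRUE)      # sum of squared 'errors'

# re-plot with a fixed color scale so different fits are comparable:
imHeatmap(dist_matrix, range = c(0, 30))
```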

A word of warning here: it is easy to look at this and decide to come up with a procedure for automatically fitting color maps using a kind of AIC metric, trying to get the lowest SSE with the minimum set of color centers. You’re welcome to try that, but given that this is discarding spatial information, it is probably not a general solution (I haven’t had much luck with it). But there is probably some room to play here.


splitByColor

splitByColor is a dual-use function: by splitting the color map into individual layers, you can examine each layer and decide whether it needs editing or merging, and you also get a binary mask for each layer, so you can export individual patches.
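A hedged sketch, assuming `rc` is a recolorize object from a previous step (the `plot_method` value is from my reading of the docs, and writing the masks out with the png package is just one way to export them):

```r
# split the color map into one binary mask per color patch:
layers <- splitByColor(rc, plot_method = "overlay")

# each element of 'layers' is a binary matrix (1 = patch, 0 = everything else),
# so each patch can be saved as its own grayscale image:
for (i in seq_along(layers)) {
  png::writePNG(layers[[i]], target = paste0("layer_", i, ".png"))
}
```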


Exporting to images

The most direct thing you can do is simply export your recolored images as images, then pass those to whatever other tool you’d like to use, although obviously this doesn’t take full advantage of the format:
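A one-line sketch, assuming `rc` is a recolorize object from a previous step:

```r
# write the recolored image (not the original) to a PNG file:
recolorize_to_png(rc, filename = "recolored.png")
```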

pavo package

You can also convert a recolorize object to a classify object in the wonderful pavo package and then run an adjacency analysis. Bonus points if you have reflectance spectra for each of your color patches: by combining the spatial information in the color map with the coldist object generated by spectral measurements, you can run adjacency analysis for the visual system(s) of your choice right out of the box!
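Sketched out, the conversion might look like this (argument names per my reading of the docs; `xscale = 10` is a placeholder for the real-world scale of your image, and in practice you would pass your spectra-derived coldist object on to the analysis):

```r
library(pavo)
library(recolorize)

# assuming 'rc' is a recolorize object from a previous step:
# convert it to a pavo classify object...
classified <- classify_recolorize(rc, imgname = "example")

# ...then run an adjacency analysis on it:
adj_results <- adjacent(classified, xscale = 10)
```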

You can also run an adjacency analysis with recolorize_adjacency, but only as long as you keep your skeptic hat on. This function works by calculating a coldist object right from the CIE Lab colors in the color maps, which are themselves probably derived from your RGB image, which is at best a very loose representation of how these colors appear to human eyes. The only reason this is at all reasonable is that it’s producing these values for human vision, so you will be able to see if it’s completely unreasonable. This is fine for getting some preliminary results or if you’re working with aggregate data from many sources and you’re content with specifically human (not just non-UV, but only human) vision. Otherwise, it’s probably a last resort.
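With the skeptic-hat caveats above in mind, the call itself is simple (a sketch, assuming `rc` is a recolorize object; `xscale` per my reading of the docs):

```r
# quick-and-dirty adjacency analysis straight from the CIE Lab colors
# of the color map -- human vision only:
adj_results <- recolorize_adjacency(rc, xscale = 10)
```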


patternize package

Because of the way patternize is structured, going from recolorize to patternize is slightly more involved. See the GitHub page for links to a worked example of this, but the general process is:

  1. Load images into patternize and align them into a list of rasters, using e.g. alignLan().
  2. Use brick_to_array to convert the list of rasters to a list of images.
  3. Cluster images as normal, ensuring that all images have the same set of colors.
  4. Use recolorize_to_patternize to convert clustered images back to a raster list.
  5. Proceed with downstream patternize analyses.
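The steps above might be sketched as follows (an untested outline: `raster_list` and `shared_palette` are placeholders you would produce with your own alignment step and chosen palette):

```r
library(recolorize)

# (1) 'raster_list' is assumed to come from patternize alignment,
#     e.g. patternize::alignLan()

# (2) convert the aligned rasters to plain image arrays:
img_list <- brick_to_array(raster_list)

# (3) fit every image with the same fixed set of color centers, so that
#     layer i means the same color in every image:
rc_list <- lapply(img_list, imposeColors,
                  centers = shared_palette,
                  adjust_centers = FALSE)

# (4) convert the fitted color maps back to a raster list:
raster_list_recolored <- lapply(rc_list, recolorize_to_patternize)

# (5) proceed with downstream patternize analyses on the raster list
```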