Abstract: Many remote-sensing applications produce large sets of images, such as hyperspectral images or time-indexed image sequences. We explore methods to display such image sets by linearly projecting them onto basis functions designed for the red, green, and blue (RGB) primaries of a standard tristimulus display, for the human visual system, and for the signal-to-noise ratio of the dataset, creating a single color image. Projecting the data onto three basis functions reduces the information but allows each data point to be rendered as a single color. Principal components analysis is perhaps the most commonly used linear projection method, but it is data adaptive and thus yields inconsistent visualizations that may be difficult to interpret. Instead, we focus on designing fixed basis functions by optimizing criteria in the perceptual colorspace CIELab and the standardized device colorspace sRGB. This approach yields visualizations with rich meaning that users can readily extract. Example visualizations are shown for passive radar video and Airborne Visible/Infrared Imaging Spectrometer hyperspectral imagery. Additionally, we show how probabilistic classification information can be layered on top of the visualization.
Maya Gupta completed her Ph.D. in Electrical Engineering in 2003 at Stanford University as a National Science Foundation Graduate Fellow. She received her B.S. in Electrical Engineering and B.A. in Economics from Rice University, 1994-1997. From 1999 to 2003 she worked at Ricoh's California Research Center as a color image processing research engineer. In the fall of 2003, she joined the EE faculty of the University of Washington as an Assistant Professor. She was awarded the 2007 Office of Naval Research Young Investigator Award and the 2007 University of Washington Department of Electrical Engineering Outstanding Teaching Award. More information about her research is available at her group's webpage: idl.ee.washington.edu.