
Hyperspectral imaging July 3, 2010

Posted by Sarah in Uncategorized.

For Luminita Vese’s course, we’re supposed to be doing group projects on various topics in image processing. My group is looking into hyperspectral imaging, and I started reading about it and found some basic techniques.

Hyperspectral images are visualized in a three-dimensional cube: two dimensions for the image, and the third dimension for a spectrum of frequencies, potentially outside the visual range. Whereas a color image has three bands, red, green, and blue, a hyperspectral image has a continuum (or, in practice, a large number) of bands.
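To make the cube concrete, here's how I'd store one in numpy (the dimensions below are just illustrative, and the random data is a stand-in for a real scene; 224 bands is roughly what an AVIRIS-type sensor gives):

import numpy as np

# Toy hyperspectral cube: two spatial axes and one spectral axis.
rows, cols, bands = 100, 100, 224          # 224 bands is roughly AVIRIS-like
cube = np.random.rand(rows, cols, bands)   # random stand-in for a real scene

# A color image would have bands == 3 (R, G, B); here each pixel carries
# a whole spectrum instead of a 3-vector.
pixel_spectrum = cube[40, 60, :]   # spectrum at one spatial location, shape (224,)
band_image = cube[:, :, 120]       # one spectral band as a grayscale image, (100, 100)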
Hyperspectral images are useful for pathology (distinguishing benign from malignant tumors), agriculture, mineralogy, and surveillance (interpreting high-altitude photographs). The salient problem is: how do you identify an anomaly in the image? An “interesting” non-background object (the tumor or missile or mineral deposit)?

I read about two techniques from Chein-I Chang and Shao-Shan Chiang, “Anomaly detection and classification for hyperspectral imagery,” IEEE Transactions on Geoscience and Remote Sensing, Vol. 40, No. 6, June 2002. (Link here.)
These are the RX detector (RXD) and the low probability detector (LPD). Both are matched filters; the RX detector uses the pixel currently being processed as the matched signal, while the low probability detector uses the ones vector.

The RX detector:
L: number of spectral bands
r: hyperspectral pixel vector
\delta_{RXD}(r) = (r - \mu)^T K_{L \times L}^{-1}(r - \mu)
Here \mu is the global sample mean and K_{L \times L} is the sample covariance matrix of the data.
How this works: interesting target pixels, which occur with low probability in the data, do not show up in the large-eigenvalue components of the covariance matrix (the ones PCA keeps) but rather in the small ones. RXD is then a sort of inverse of PCA.
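Here's a rough numpy sketch of the RXD as I understand it (the function name, the (rows, cols, L) cube convention, and the percentile cut-off are my own choices, not from the paper):

import numpy as np

def rx_detector(cube):
    """RXD score for every pixel of a (rows, cols, L) hyperspectral cube."""
    rows, cols, L = cube.shape
    X = cube.reshape(-1, L)            # one row per pixel, L bands per row
    mu = X.mean(axis=0)                # global sample mean spectrum
    K = np.cov(X, rowvar=False)        # L x L sample covariance matrix
    K_inv = np.linalg.inv(K)           # (use a pseudo-inverse if K is near-singular)
    D = X - mu
    # delta_RXD(r) = (r - mu)^T K^{-1} (r - mu), evaluated for all pixels at once
    scores = np.einsum('ij,jk,ik->i', D, K_inv, D)
    return scores.reshape(rows, cols)

# Usage: the largest scores are the anomaly candidates; the cut-off is a tuning choice.
# scores = rx_detector(cube)
# anomalies = scores > np.percentile(scores, 99)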

More specifically:
if we let \Lambda be the diagonal matrix of the L eigenvalues of K, there is an orthogonal matrix A (its columns are the eigenvectors of K) such that
A^T K_{L \times L} A = \Lambda.
Then, if we let y = A^T r, so that r = Ay (because A is orthogonal), we get
r^T K_{L\times L} r = (Ay)^T K_{L \times L} (Ay) = y^T [A^T K_{L \times L} A] y
= y^T \Lambda y = \sum_l \lambda_l y_l^2
Using this, the RXD value is
r^T K_{L \times L}^{-1} r = \sum_l \lambda_l^{-1} y_l^2
The smaller the eigenvalue, the larger the weight \lambda_l^{-1} on that component, so pixels with energy in the low-variance directions (the low-probability anomalies) produce large detector values.
Crucial and difficult step: determining how many targets there are, so you know where to draw the cut-off.
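To convince myself of the eigenvalue identity above, here's a quick numerical check (my own toy example, not from the paper):

import numpy as np

L = 5
samples = np.random.rand(L, 200)
K = np.cov(samples)                 # an L x L covariance matrix (positive definite here)
r = np.random.rand(L)

lam, A = np.linalg.eigh(K)          # K = A diag(lam) A^T, with A orthogonal
y = A.T @ r                         # coordinates of r in the eigenbasis, so r = A y

lhs = r @ np.linalg.inv(K) @ r
rhs = np.sum(y**2 / lam)
print(np.isclose(lhs, rhs))         # True: the small eigenvalues dominate the sum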

Uniform Target Detector (or Low Probability Detector).
\delta_{UTD}(r) = (1 - \mu)^T K_{L \times L}^{-1} (r-\mu)
Unlike the RXD, the UTD uses the ones vector as the matched signal. This means assuming that an anomalous target may have radiance uniformly distributed over all the spectral bands. If we know more about what wavelengths we’re interested in, we can change the matched signal. The UTD tends to extract the background; subtracting it from the RXD can increase accuracy of anomaly detection.
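And a corresponding sketch of the UTD, plus the RXD minus UTD subtraction mentioned above (again, the function names and cube convention are assumptions of mine):

import numpy as np

def utd_detector(cube):
    """UTD score for every pixel of a (rows, cols, L) hyperspectral cube."""
    rows, cols, L = cube.shape
    X = cube.reshape(-1, L)
    mu = X.mean(axis=0)
    K_inv = np.linalg.inv(np.cov(X, rowvar=False))
    ones = np.ones(L)                  # matched signal: flat over all spectral bands
    # delta_UTD(r) = (1 - mu)^T K^{-1} (r - mu), evaluated for all pixels at once
    scores = (X - mu) @ K_inv @ (ones - mu)
    return scores.reshape(rows, cols)

# Subtracting the background-like UTD response from the RXD response
# (using the rx_detector sketch above):
# combined = rx_detector(cube) - utd_detector(cube)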

Incidentally, hyperspectral images also reveal what was crossed out from the Declaration of Independence.
