
Hiking July 8, 2010

Posted by Sarah in Uncategorized.

We hiked to Sunset Peak in the Wasatch Mountains the other day; easily the most serious hike I’ve yet been on, but now I’m addicted. Humbling thought: the PCMI director runs this trail!
(Photo: I’m third from left.)

Fourth of July July 4, 2010

Posted by Sarah in Uncategorized.

Here in Utah, it comes with a 5k race, a parade, and fireworks.

I celebrate privately by reading Tom Paine.

The sun never shined on a cause of greater worth. ’Tis not the affair of a city, a county, a province, or a kingdom, but of a continent, of at least one eighth part of the habitable globe. ’Tis not the concern of a day, a year, or an age; posterity are virtually involved in the contest, and will be more or less affected, even to the end of time, by the proceedings now. Now is the seed time of continental union, faith and honor. The least fracture now will be like a name engraved with the point of a pin on the tender rind of a young oak; the wound will enlarge with the tree, and posterity read it in full grown characters.

I treasure living in America.

Hyperspectral imaging July 3, 2010

Posted by Sarah in Uncategorized.

For Luminita Vese’s course, we’re supposed to be doing group projects on various topics in image processing. My group is looking into hyperspectral imaging, and I started reading about it and found some basic techniques.

Hyperspectral images are visualized in a three-dimensional cube: two dimensions for the image, and the third dimension for a spectrum of frequencies, potentially outside the visual range. Whereas a color image has three bands, red, green, and blue, a hyperspectral image has a continuum (or, in practice, a large number) of bands.
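
For concreteness, here's how such a cube might be laid out in NumPy (a sketch of my own; the dimensions are invented):

```python
import numpy as np

# A hyperspectral cube: two spatial dimensions plus one spectral one.
rows, cols, bands = 128, 128, 200            # invented sizes
cube = np.random.rand(rows, cols, bands)     # stand-in for real sensor data

# An RGB color image is the special case bands == 3.
pixel_spectrum = cube[10, 20, :]             # the full spectrum at one pixel
print(pixel_spectrum.shape)                  # (200,)
```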
Hyperspectral images are useful for pathology (distinguishing benign from malignant tumors), agriculture, mineralogy, and surveillance (interpreting high-altitude photographs). The salient problem is: how do you identify an anomaly in the image? An “interesting” non-background object? (The tumor or missile or mineral deposit.)

I read about two techniques from Chein-I Chang and Shao-Shan Chiang, “Anomaly detection and classification for hyperspectral imagery,” IEEE Transactions on Geoscience and Remote Sensing, Vol. 40, No. 6, June 2002.
These are the RX detector (RXD) and the low probability detector (LPD). Both are matched filters; the RX detector uses the pixel currently being processed as the matched signal, while the low probability detector uses the ones vector.

The RX detector:
L: number of spectral bands
r: hyperspectral pixel vector (of length L)
\delta_{RXD} ( r) = (r - \mu)^T K_{L \times L}^{-1}(r - \mu)
Here \mu is the global sample mean and K_{L \times L} is the sample covariance matrix.
How this works: interesting target pixels, which occur with low probability in the data, show up not in the large-eigenvalue components of the covariance matrix (the ones PCA keeps) but in the small ones. RXD is then a sort of inverse of PCA.

More specifically:
let \Lambda be the diagonal matrix of the L eigenvalues of K_{L \times L}. Since K_{L \times L} is symmetric, there is an orthogonal matrix A such that
A^T K_{L \times L} A = \Lambda.
Then, if we let y = A^T r (so that r = Ay, since A is orthogonal),
r^T K_{L\times L} r = (Ay)^T K_{L \times L} (Ay) = y^T[A^T K_{L \times L} A]y
= y^T \Lambda y = \sum_l \lambda_l y_l^2.
The same computation with K_{L \times L}^{-1}, whose eigenvalues are the \lambda_l^{-1}, gives the RXD score:
r^T K_{L \times L}^{-1} r = \sum_l \lambda_l^{-1} y_l^2
The smaller the eigenvalue, the greater its contribution to the score.
Crucial and difficult step: determining how many targets there are, so you know where to draw the cut-off.
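
Here's a minimal NumPy sketch of the RXD score. The paper gives only the formula, so the function name and structure are mine, and a real implementation would worry about regularizing K:

```python
import numpy as np

def rxd_scores(cube):
    """RXD score for every pixel: (r - mu)^T K^{-1} (r - mu)."""
    rows, cols, bands = cube.shape
    X = cube.reshape(-1, bands)              # one row per pixel
    mu = X.mean(axis=0)                      # global sample mean
    K = np.cov(X, rowvar=False)              # L x L sample covariance
    K_inv = np.linalg.inv(K)
    D = X - mu
    # the quadratic form, computed for all pixels at once
    scores = np.einsum('ij,jk,ik->i', D, K_inv, D)
    return scores.reshape(rows, cols)

# Toy data; a real cube comes from a sensor.
cube = np.random.rand(64, 64, 50)
scores = rxd_scores(cube)
# Drawing the cut-off (how many targets?) is the hard part noted above;
# a high percentile is a crude stand-in.
anomalies = scores > np.percentile(scores, 99.5)
```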

The uniform target detector (UTD), also called the low probability detector:
\delta_{UTD}(r) = (\mathbf{1} - \mu)^T K_{L \times L}^{-1} (r-\mu)
Unlike the RXD, the UTD uses the ones vector as the matched signal. This means assuming that an anomalous target may have radiance uniformly distributed over all the spectral bands. If we know more about what wavelengths we’re interested in, we can change the matched signal. The UTD tends to extract the background; subtracting it from the RXD can increase accuracy of anomaly detection.
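
And a matching sketch of the UTD, under the same caveats (names mine); the commented-out last line is the RXD-minus-UTD combination just mentioned:

```python
import numpy as np

def utd_scores(cube):
    """UTD score for every pixel: (1 - mu)^T K^{-1} (r - mu)."""
    rows, cols, bands = cube.shape
    X = cube.reshape(-1, bands)
    mu = X.mean(axis=0)
    K_inv = np.linalg.inv(np.cov(X, rowvar=False))
    w = (np.ones(bands) - mu) @ K_inv        # fixed matched filter: ones vector
    return ((X - mu) @ w).reshape(rows, cols)

# Subtracting the background-like UTD response from the RXD response
# (rxd_scores from the sketch above):
# combined = rxd_scores(cube) - utd_scores(cube)
```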

Incidentally, hyperspectral images also reveal what was crossed out from the Declaration of Independence.

PCMI (1) July 1, 2010

Posted by Sarah in Uncategorized.

First, let me get it out of the way: Utah is beautiful. I’ve wanted to go to Utah ever since I read A Primer on Mapping Class Groups and realized that there are people doing cool math out west. It’s amazing up here. Liquid blue sky, sagebrush on the mountains, columbines and lupines and pine martens…
It’s actually so pretty that it tempts me to get too ambitious with the trail running.

It’s pretty busy, as I’m taking both undergraduate and graduate courses.

Highlights:

Jared Tanner’s course on compressed sensing is, as far as anyone knows, the first set of lectures on the subject for an undergraduate audience. We showed that sparse recovery is possible for matrices in general position (checking this property is impractical, as it takes exponential time, but random matrices are in general position with high probability). We went on to deal with Haar and Fourier bases, and we’re now studying basis pursuit and coherence.
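
Coherence, at least, is cheap to compute, unlike checking general position. Here's a small sketch of the mutual coherence of a matrix (the largest absolute inner product between distinct normalized columns); this is my own illustration, not anything from the course:

```python
import numpy as np

def mutual_coherence(Phi):
    """Max |<phi_i, phi_j>| over distinct unit-normalized columns."""
    cols = Phi / np.linalg.norm(Phi, axis=0)
    G = np.abs(cols.T @ cols)                # Gram matrix of normalized columns
    np.fill_diagonal(G, 0.0)                 # ignore the diagonal (i = j)
    return G.max()

# A random Gaussian matrix with few rows typically has modest coherence.
Phi = np.random.randn(20, 80)
print(mutual_coherence(Phi))
```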

Richard Baraniuk gave some graduate lectures on compressed sensing, from more of an application perspective. He gave a visual illustration that I really liked of why l^1 minimization is a better way to find sparsity than l^2 minimization — the l^1 ball is an octahedron, and it’s likely to intersect a line of random angle close to a coordinate axis. The l^2 ball is a sphere, which is less “concentrated” around the axes, and so is likely to intersect a line of random angle farther from a coordinate axis — that is, less sparse.
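
Here's a toy numerical version of that picture (my own sketch, assuming SciPy; the linear-programming encoding of basis pursuit is standard, not from the lectures): minimize the l^1 norm subject to Ax = b, and compare with the minimum-l^2-norm solution.

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
m, n, k = 20, 60, 3                          # 20 measurements, 60 unknowns, 3 nonzeros
A = rng.standard_normal((m, n))
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
b = A @ x_true

# l^2: minimum-norm solution via the pseudoinverse, generally dense.
x_l2 = np.linalg.pinv(A) @ b

# l^1 (basis pursuit) as an LP over [x; t] with |x_i| <= t_i: minimize sum(t).
c = np.concatenate([np.zeros(n), np.ones(n)])
I = np.eye(n)
A_ub = np.vstack([np.hstack([I, -I]),        #  x - t <= 0
                  np.hstack([-I, -I])])      # -x - t <= 0
A_eq = np.hstack([A, np.zeros((m, n))])
res = linprog(c, A_ub=A_ub, b_ub=np.zeros(2 * n), A_eq=A_eq, b_eq=b,
              bounds=[(None, None)] * n + [(0, None)] * n)
x_l1 = res.x[:n]

print(np.linalg.norm(x_l1 - x_true))         # near zero: the sparse vector is found
print(np.linalg.norm(x_l2 - x_true))         # much larger: energy smeared everywhere
```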

I’m now starting a course with Anna Gilbert about sparse approximation — she’s going to take more of the computational complexity approach.

The other nice thing about PCMI is meeting people you’ve only heard of through their research. I went hiking (and then pizza-eating) with Arthur Szlam, whom I knew about because he got his PhD from Yale and does the same kind of computational harmonic analysis that I’d (ideally) like to do. Talking to him was great — he runs at the prodigious rate of about three big mathematical insights per beer.