What is InSAR?
InSAR stands for Interferometric Synthetic Aperture Radar, a remote sensing technique that uses radar satellite images. Radar satellites (ERS-1, ERS-2, JERS, IRS, or Radarsat) constantly shoot beams of radar waves toward the Earth and record them after they bounce back off the Earth's surface.
Each image carries two pieces of information. The first is how much of the wave bounced back to the satellite (the signal intensity). That depends on how much of the wave was absorbed along the way and how much was reflected in the direction of the satellite.
The second piece of information is the 'phase' of the wave. As a wave travels through space, we can think of it as a hand on a clock. The hand starts at 12 when the wave leaves the satellite and keeps running round and round the clock until the wave reaches the ground. When the wave hits the ground, the hand stops, indicating a certain 'time', or phase. When the wave comes back, it tells the satellite the value at which the hand stopped.
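The clock analogy can be sketched numerically: the phase is simply the fractional part of the travel distance measured in wavelengths, mapped onto the face of a clock. The distance below is made up purely for illustration, not real orbit geometry.

```python
# Sketch of the clock analogy: the "hand" position is the fractional part
# of the travel distance measured in wavelengths.
# The distance here is illustrative, not a real satellite-to-ground range.

wavelength_cm = 5.66            # C-band radar wavelength (ERS), in cm
distance_cm = 80_000_000.37     # one-way travel distance (made-up value)

whole_cycles, fraction = divmod(distance_cm / wavelength_cm, 1.0)
clock_position = fraction * 12  # express the phase as a position on a 12-hour clock

print(f"hand stopped at about {clock_position:.1f} o'clock")
```

Note that the millions of whole cycles are discarded; only the leftover fraction of a wavelength is what the satellite can read off.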
Every point in a satellite image (pixel) carries these two pieces of information: the intensity and the phase. The intensity can be used to characterize the material of the surface the wave bounced off and its orientation. Oil slicks at sea, for instance, can be spotted this way: they look much smoother than the surrounding water.
The phase is used in another way. When the radar satellite revisits exactly the same portion of the Earth, the phase image should be identical. If it is not, then something has been going on. By combining the two images, we can measure how much, and where, the ground has moved.
Our research focuses on tectonic and volcanic ground deformation. And sometimes, quite unexpectedly, we find that the human impact is much larger than we would have thought.
How does satellite radar interferometry work?
Return signal from satellite holds the key
The technical details of how and why radar interferometry works are rooted in physics and radar engineering, but for our purposes a much simpler explanation will suffice. A pulse of radar energy is successively emitted from a satellite (left), scattered by the Earth's surface, and recorded back at the satellite (right). The radar energy received by the satellite contains two important types of information.
Figures from T. Freeman, Jet Propulsion Laboratory
The first type of information is encoded in the strength, or amplitude, of the return signal, which is influenced by various physical properties of the surface including ground slope, particle size (i.e., sand versus boulders), and soil moisture. The ERS satellites record return signal strength from a continuous swath of the Earth's surface about 100 km (60 mi) wide, and scientists on the ground assemble this information in the form of a radar image. The image is a portrayal of the surface that resembles a conventional photograph in some ways, but not entirely. Think of the difference between a conventional photo and an infrared image, which shows warm areas as bright irrespective of their brightness in visible light. Radar images differ from conventional photos in a similar way.

The second type of information contained in the return radar signal has to do with the round-trip distance from the satellite to the ground and back again. We can think of a radar pulse as an invisible tape measure calibrated in units of the radar wavelength. We call the fractional part of the round-trip distance the phase of the return signal. For the ERS satellites, the radar wavelength is 5.66 cm (2.2 inches).
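The "tape measure" idea can be written down directly: the phase of the return signal is the fractional part of the round-trip distance in wavelengths, scaled to an angle. The round-trip distance below is a made-up number for illustration only.

```python
import math

WAVELENGTH_M = 0.0566  # ERS C-band radar wavelength, 5.66 cm

def return_phase(round_trip_m: float) -> float:
    """Phase of the return signal in radians: the fractional part of the
    round-trip distance, measured in wavelengths, scaled to [0, 2*pi)."""
    fraction = (round_trip_m / WAVELENGTH_M) % 1.0
    return fraction * 2 * math.pi

# A made-up round-trip distance; only the leftover fraction of a
# wavelength survives as the measurable phase.
print(f"{return_phase(1_570_000.0283):.4f} rad")
```

The design point to notice is that the whole-wavelength count is lost in the modulo: the satellite never knows the absolute distance, only where within one wavelength the signal landed.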
If we were able to acquire two radar images at different times from exactly the same vantage point in space and compare them, any movement of the ground surface toward or away from the satellite would show up as a phase difference between the images. For example, if a point on the ground moved toward the satellite (mostly upward) by one-half wavelength, the phase of the return signal from that point would change by one full wavelength relative to the first image. It isn't possible to steer a satellite accurately enough to return it to exactly the same point in space on different orbits, but it's relatively easy to get within a few hundred feet and then apply the necessary geometric corrections.
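The half-wavelength example can be checked with a small calculation: moving the ground half a wavelength toward the satellite shortens the round trip by a full wavelength, so the measured phase wraps through one complete cycle. The starting distance is a made-up value.

```python
import math

WAVELENGTH_M = 0.0566  # ERS radar wavelength (5.66 cm)

def return_phase(round_trip_m: float) -> float:
    """Fractional part of the round trip, in wavelengths, as radians."""
    return (round_trip_m / WAVELENGTH_M) % 1.0 * 2 * math.pi

round_trip_before = 850_000.0    # made-up round-trip distance, metres
uplift = WAVELENGTH_M / 2        # ground moves half a wavelength toward the satellite
round_trip_after = round_trip_before - 2 * uplift  # round trip shrinks by one wavelength

dphi = (return_phase(round_trip_after) - return_phase(round_trip_before)) % (2 * math.pi)
residual = min(dphi, 2 * math.pi - dphi)

# The phase has gone through exactly one full cycle, so the difference wraps
# back to (nearly) zero: one whole fringe of motion toward the satellite.
print(f"residual phase difference: {residual:.6f} rad")
```

This also illustrates the ambiguity discussed below: a full-cycle phase change is indistinguishable from no change at all, which is why fringes must be counted rather than read off a single pixel.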
Combining or "interfering" images from different satellite passes
It turns out that the most accurate way to measure small phase changes is to combine two images after all of the necessary corrections have been made. This process is sometimes called "interfering" the images, because combining two waves causes them to either reinforce or cancel one another, depending on their relative phases. For example, you may have observed interference between two sources of water waves on a pond. Now imagine that we keep track of all the places where the two radar images reinforce one another, and all the places where they cancel one another. We'll represent the first case as a red pixel in a new image that we'll create, and the second case as a blue pixel. Intermediate cases will be represented as intermediate colors of the spectrum from red to blue. The resulting image is called an interferogram.
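A common way to carry out this combination is to store each pixel as a complex number (amplitude and phase together) and multiply one image by the complex conjugate of the other, which leaves just the phase difference. The three-pixel "images" below are entirely made up; real interferograms come from full SAR processing of millions of pixels.

```python
import cmath
import math

# Two toy single-look radar "images", one complex number per pixel,
# built from made-up amplitudes and phases.
image1 = [cmath.rect(1.0, 0.3), cmath.rect(0.8, 1.1), cmath.rect(1.2, 2.9)]
image2 = [cmath.rect(1.0, 0.3), cmath.rect(0.8, 2.7), cmath.rect(1.2, 6.0)]

# Multiplying pixel-by-pixel by the complex conjugate cancels the common
# phase and leaves the phase *difference*, which the colors encode:
# 0 -> waves reinforce ("red"), pi -> waves cancel ("blue").
interferogram = [p2 * p1.conjugate() for p1, p2 in zip(image1, image2)]
phase_diff = [cmath.phase(p) % (2 * math.pi) for p in interferogram]

for phi in phase_diff:
    print(f"{phi:.2f} rad")
```

The first pixel is unchanged between acquisitions, so its phase difference is zero (a "red" pixel); the others sit partway toward cancellation.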
The properties of waves are such that we can't tell the difference between waves that reinforce one another because they are exactly in phase with one another, or out of phase by any whole number of wavelengths (1, 2, 3...). As a result, an interferogram for an area that domed upward during the time interval between two radar images would show a concentric pattern of color bands, called fringes, not unlike the contours on a topographic map (above). In this case, though, each fringe would represent just one-half wavelength of surface movement toward the satellite--nearly 3 cm for ERS (just over an inch). To determine the total amount of movement, we only have to count the number of fringes. Our geodetic camera is ready to track volcano deformation from space!
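The fringe-counting arithmetic is a one-liner: total motion toward the satellite is the number of fringes times half the wavelength. The fringe count below is hypothetical.

```python
# Each fringe corresponds to one half wavelength of range change
# (for ERS, about 2.83 cm). Fringe count here is a made-up example.

WAVELENGTH_CM = 5.66
fringes = 4  # e.g. four concentric color bands counted on the interferogram

movement_cm = fringes * WAVELENGTH_CM / 2
print(f"{movement_cm:.2f} cm of motion toward the satellite")
```

Four fringes thus amount to roughly 11.3 cm of motion toward the satellite, all read straight off the interferogram.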