3 — Remote Sensing Floods

Brianna Pagan is first and foremost an environmentalist with a passion for making science accessible and understandable to the masses.

Over the past 10 years, she has worked on local and international environmental efforts in four different countries. Her research and expertise are in satellite-based remote sensing, eco-hydrology and climate with specific focuses on drought and climate change scenario planning. She is currently the Director of Remote Sensing at SpaceSense.ai, working to further democratize space intelligence for environmental purposes.

www.briannapagan.com

Transcript

Good morning everyone. First, thanks to Selena and Pascal for organizing this session. I’ve been really looking forward to participating and helping facilitate the conversations between scientists and artists. As Selena said, my name is Brianna and I’m going to be giving a space perspective of how I look at water, specifically water for global purposes and global studies.

To open the discussion, I want to first define what remote sensing is, before I explain what a remote-sensing engineer does. Remote sensing is a technique for obtaining information from a distance, so any observations we make are indirect measurements. If I want to measure how much rain is falling outside, I can put out a cup, let it capture the rain, measure that volume, and tell you exactly how much rain fell. But that approach doesn’t scale globally: we cannot put out a sensor to measure rainfall on every piece of land, and we can’t put out a sensor to measure soil moisture for every crop in the world.
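
The cup example is just arithmetic: the depth of rainfall is the collected volume divided by the opening area of the cup. A minimal sketch, with invented cup dimensions and an invented collected volume:

```python
import math

# Direct rain measurement: a cup of known size collects rain, and the
# rainfall depth is simply collected volume divided by the opening area.
# All numbers here are illustrative, not real measurements.
cup_radius_cm = 5.0
collected_ml = 157.0                          # 1 mL = 1 cm^3

area_cm2 = math.pi * cup_radius_cm**2         # opening area, ~78.5 cm^2
rain_depth_mm = collected_ml / area_cm2 * 10  # cm -> mm

print(f"rainfall: {rain_depth_mm:.1f} mm")    # -> rainfall: 20.0 mm
```

It is exactly this kind of direct, per-site measurement that cannot be repeated everywhere on earth.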

That’s where the power of remote sensing—especially remote sensing from satellites—comes into play. It allows us to monitor environmental conditions globally, where those constant direct measurements are just not possible. Satellites do this by flying around the world and essentially taking photos. We’re all familiar with taking photos, but in order to understand how we get information from photos, we have to understand something called the electromagnetic spectrum. We as humans see colors in the visible part of the spectrum. Some source of energy, usually the sun, reflects different colors off objects, our eyes capture that light, and our brains process the information. But those visible wavelengths are actually only a very small part of the spectrum. Every object also emits and reflects energy at other wavelengths, including UV, infrared, microwave, and radio. Different satellites orbiting the earth or going into deep space take photos using those different wavelengths—the Hubble Space Telescope is probably the most famous. We’ve seen really remarkable and beautiful visible imagery from the Hubble Space Telescope, but we are also sending different satellites out for different purposes.

Like I mentioned, every object reflects energy across a variety of these wavelengths, and how much it reflects at each wavelength gives it a spectral signature—which is like a fingerprint. If I take a picture of bare soil or a plant in the visible wavelengths, I can see features like how green the plant is. If I take that same photo in an infrared or microwave wavelength, I can tell different things about that object—more specifically, in infrared and microwave wavelengths we can actually detect water and see how much water is in soils or in plants.
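
One classic way this fingerprint idea is used in practice is a vegetation index: comparing reflectance in just two bands (red and near-infrared) already separates healthy plants, bare soil, and water. A minimal sketch with invented reflectance values:

```python
import numpy as np

# Toy reflectance values (fraction of incoming light) for three surfaces.
# Healthy vegetation absorbs red light and reflects strongly in the
# near-infrared; soil and water have very different "fingerprints".
red = np.array([0.05, 0.25, 0.08])   # vegetation, bare soil, water
nir = np.array([0.50, 0.30, 0.03])

# NDVI (Normalized Difference Vegetation Index): two wavelength bands
# combined into a single indicator of plant greenness, from -1 to +1.
ndvi = (nir - red) / (nir + red)

for name, value in zip(["vegetation", "bare soil", "water"], ndvi):
    print(f"{name:10s} NDVI = {value:+.2f}")
```

Dense vegetation comes out strongly positive, bare soil near zero, and water negative, which is why two bands outside our eyesight can say so much about what is on the ground.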

Image made by a “typical optical satellite.”

To give a few very simple visualizations of what we’re seeing from satellites, let’s say we have a typical optical satellite, which again takes pictures in the visible spectrum that we can understand with our own eyesight. In this image I can clearly make out different crop fields with different crops and different coloration; some crops are maybe doing worse than others. I can see the city; I can make out the road features. But while we use optical satellite data globally and widely, a huge limitation is simply clouds—it’s cloudy, especially where I’m based, in northern Europe, in Paris. We sometimes have clouds for fifty or sixty percent of the year, so we can’t rely on visible wavelengths alone. Also, if the satellite passes over at night, when it’s dark out, there is no source of energy to reflect, and we can’t record anything in the visible wavelengths.

But there are other types of satellites that, again, take photos in different parts of the spectrum. One example is something we call SAR: synthetic-aperture radar, a very fancy term. Basically, instead of relying on the sun as the source of energy to reflect back and record, the satellite itself sends down its own signal, a microwave pulse, and records how much of that signal bounces back to it.

Image created with synthetic-aperture radar.

And when we receive this information, to our naked eye it looks like this gray image—we can’t immediately understand it, because we’re used to seeing things in the visible spectrum. We’re now looking at different properties of this field in different wavelengths, and using different scientific algorithms and techniques to interpret that data.

Now I want to show you a very real use case of how I’m using satellite data to answer the question of how we can observe, monitor, and quantify floods. This is an interesting case study, because if it’s flooding, it’s raining, and that means there are clouds—so we can’t rely on the typical visible-spectrum picture from space. We have to use one of these other satellites, the SAR satellites, to evaluate it. I’m going to do a quick demonstration of a large flooding event that occurred in Mozambique:

This is our case study area, again in a visual RGB photo we can understand—this is clearly a river; there’s clearly some vegetation here. Before the flood, if I took a picture using that SAR imagery, this is what it would look like.

I can easily distinguish the water in this photo. The water in this photo is very black, and I can see the river outline very clearly. If I show you the same image, but after the flood, this is what I’m looking at.

If I take the difference between those two images, I can actually tell you which areas were flooded, and I can bring in other sources of information—like population density, or where we know crops are being grown—and quantify the impact: over 100,000 people were exposed to the flood, and 200,000 acres of land were lost.
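
The before/after differencing described here can be sketched in a few lines. This is a toy illustration, not the actual processing chain: the grids, the decibel threshold, and the population numbers are all invented for the example.

```python
import numpy as np

# Toy "SAR backscatter" grids before and after a flood, in decibels.
# Smooth open water reflects the radar pulse away from the sensor, so
# flooded pixels show up as a sharp drop in backscatter (much darker).
before = np.array([
    [-8, -7, -20, -8],
    [-9, -8, -21, -9],
    [-8, -9, -20, -8],
])  # the ~-20 dB column is the permanent river channel
after = np.array([
    [-8, -19, -20, -8],
    [-9, -20, -21, -19],
    [-8, -20, -20, -20],
])

# Difference the two acquisitions and flag pixels that got much darker;
# the permanent river was dark in both images, so it is not flagged.
change = after - before
flooded = change < -8            # dB threshold; tuned per scene in practice

print("flooded pixels:", int(flooded.sum()))

# Overlaying ancillary data turns the flood mask into impact numbers.
people_per_pixel = np.full(before.shape, 1000)   # toy population grid
print("people exposed:", int((flooded * people_per_pixel).sum()))
```

The same overlay step works with any gridded layer, such as crop maps, which is how a flood mask becomes acres of farmland lost.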

I just showed you a very small teaser, just one example of a satellite. I always like showing one other use case, and that is the image on the upper left.

That is a satellite mission called GRACE, and instead of taking pictures it’s actually measuring the gravity around earth. It’s two twin satellites flying around the earth, and they are pulled closer to the earth or drift farther away based on how much mass—including groundwater—there is under the soil. If there is more groundwater, the gravitational pull is stronger, and the satellites will actually move closer together; by tracking that distance very precisely, we can map changes in water storage.
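
To get a feel for how small the signal is, treat a large groundwater change as a point mass and compute the extra pull it exerts on a satellite overhead with Newton’s law. Every number below is an illustrative assumption, not a mission value:

```python
# Back-of-the-envelope scale of a gravity signal: the extra gravitational
# acceleration from an aquifer-sized mass of water on a satellite flying
# roughly 450 km overhead. All inputs are illustrative assumptions.
G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
water_mass = 1e14      # kg: ~100 km^3 of water, a large aquifer change
altitude = 450e3       # m: a typical low-earth-orbit altitude

extra_pull = G * water_mass / altitude**2   # m/s^2, Newton's law of gravity

print(f"extra acceleration: {extra_pull:.1e} m/s^2")
```

The result is on the order of a hundred-millionth of a meter per second squared, billions of times weaker than surface gravity, which is why the mission has to measure the distance between the twin satellites so precisely.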

There are many different ways that we’re using satellites to really look at water—and it’s also not just satellites. The figure on the right shows all the different techniques we use for remote sensing, whether it’s drones, or Doppler radar, et cetera; those are all remote-sensing techniques. The bottom-left photo is actually quite outdated—to date, there are over 5,000 satellites orbiting the earth (over 2,000 of them active). I’m sure we’re all familiar with efforts like Starlink and Elon Musk, and this is only going to continue to grow. Already, we have petabytes of observations of the world coming in daily.

So it’s a very exciting time to be in the field, especially as we see efforts like machine-learning applications continue to grow, because we as humans are still learning how to interpret all of these images at such high frequency and at such global scales, and we can really use these advancements in technology to better interpret these images and handle all of the data coming in. There are lots of different applications, from hydrological to economic, but one of my personal favorites is environmental justice—remote sensing allows us to level the playing field. There are countries that can afford to directly measure a lot of different environmental parameters and there are countries that cannot—but we would all agree that everyone should be able to know what their water quality is, or how they are going to be impacted by a flood. Satellite-based remote sensing allows us to provide that equally.

I want to leave with a quote from Louis Pasteur that I thought could apply to science or art:

“Science knows no country, because knowledge belongs to humanity, and it is the torch which illuminates the world. Science or art is the highest personification of the nation, because that nation will remain the first which carries the furthest the works of thought and intelligence.”

I think it connects with remote sensing because there are no naturally occurring borders on earth, and doing earth observation from satellites allows us, again, to provide information to humanity more equally. Thank you.