
AI model harnesses physics to autocorrect remote sensing data


Credit: Animation by Sara Levine | Pacific Northwest National Laboratory

Turbulence, temperature changes, water vapor, carbon dioxide, ozone, methane, and other gases absorb, reflect, and scatter sunlight as it passes through the atmosphere, bounces off the Earth's surface, and is collected by a sensor on a remote sensing satellite. As a result, the spectral data received at the sensor is distorted.

Scientists know this and have devised several ways to account for the atmosphere's corrupting influence on remote sensing data.

"This problem is as old as overhead imagery," said James Koch, a data scientist at Pacific Northwest National Laboratory (PNNL) who developed a new way to address the problem that uses a branch of artificial intelligence called physics-informed machine learning and along the way enhances remote sensing capabilities.

Koch presented a paper describing his physics-informed machine learning framework last week at the International Geoscience and Remote Sensing Symposium in Athens, Greece. This work is part of PNNL's remote exploitation capability and was supported by PNNL's Laboratory Directed Research and Development portfolio.

Scientists can solve the atmospheric corruption problem because they understand the physics of how the atmosphere distorts sunlight passing through it. That understanding allows them to remove the atmosphere's influence from the data collected at the sensor, a process called atmospheric correction. Performing the correction generally requires prior knowledge in the form of an atmospheric transmission profile: a representation of the properties and composition of the atmosphere at different altitudes that shows how light at different wavelengths interacts with it.
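In its simplest textbook form (a single-pass simplification, not the specific model in Koch's paper), the correction inverts an at-sensor radiance equation: the sensor sees the surface reflectance scaled by solar irradiance and by the downward and upward transmittances from the profile, plus radiance scattered into the view by the atmosphere itself. Here is a minimal sketch in Python, with every numeric value assumed purely for illustration:

import numpy as np

# Illustrative five-band example; all atmospheric terms are assumed values.
E0      = np.array([1900.0, 1500.0, 1000.0, 400.0, 80.0])  # solar irradiance per band
t_down  = np.array([0.80, 0.85, 0.90, 0.75, 0.60])         # downward transmittance
t_up    = np.array([0.82, 0.87, 0.91, 0.78, 0.65])         # upward transmittance
L_path  = np.array([12.0, 8.0, 4.0, 1.0, 0.2])             # path radiance from scattering
cos_sza = np.cos(np.deg2rad(35.0))                          # solar zenith angle

def to_sensor_radiance(reflectance):
    # Forward model: surface reflectance -> radiance recorded at the sensor.
    return reflectance * E0 * cos_sza * t_down * t_up / np.pi + L_path

def atmospheric_correction(at_sensor_radiance):
    # Inverse model: radiance at the sensor -> estimated surface reflectance.
    return np.pi * (at_sensor_radiance - L_path) / (E0 * cos_sza * t_down * t_up)

true_reflectance = np.array([0.05, 0.10, 0.25, 0.30, 0.20])
measured = to_sensor_radiance(true_reflectance)
print(atmospheric_correction(measured))  # recovers true_reflectance when the profile is known

The catch, and the reason the profile matters, is that the transmittance and path-radiance terms are exactly the quantities that must be known or estimated beforehand.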

The process of creating an atmospheric transmission profile without prior knowledge is where Koch's AI technique is a potential game changer.

Today, many atmospheric correction applications rely on off-the-shelf tools that use generic, statistics-based atmospheric profiles. These tools are sufficient for time-sensitive tasks such as disaster response monitoring and are cost-efficient when mapping a large area. Applications where high accuracy is paramount, such as target detection, require the data-intensive and computationally expensive creation of high-fidelity profiles.

"I've taken some of what the subject matter experts do on this high end and wrapped that into a machine learning pipeline so that I can do that process in a data-informed way," Koch said. "This is a meet-in-the-middle approach when higher fidelity is required but we don't necessarily have all the resources to identify all the properties associated with the atmosphere. We use the available data."

To train and evaluate the machine learning pipeline, Koch used a dataset of labeled overhead imagery of Cooke City, Montana, that includes cars and pieces of fabric with known spectral signatures. He used 112 of them, or 0.05% of those available in the scene, and performed the training runs on a mid-range laptop computer.

The trained model can take pixels from any spectral scene to infer an atmospheric transmission profile and automatically perform atmospheric correction. At the core of the approach is a suite of differential equations that describe how sunlight changes as it passes through the atmosphere, bounces off a target, goes back up through the atmosphere, and hits a sensor.
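The paper is the authority on the exact equations used; as a stand-in, the sketch below integrates the simplest such differential equation, Beer-Lambert attenuation dL/dz = -k(lambda, z) L, through a layered atmosphere to produce a wavelength-dependent transmittance, which is the kind of quantity a transmission profile encodes. The layer values are assumptions for illustration only.

import numpy as np

# Coarse altitude grid (km) and a made-up absorption coefficient k(altitude, band) in 1/km.
altitudes_km = np.linspace(0.0, 20.0, 41)
k = np.outer(np.exp(-altitudes_km / 8.0),              # absorption decays with altitude
             np.array([0.08, 0.05, 0.03, 0.10, 0.15])) # and varies across five bands

def column_transmittance(k, altitudes_km):
    # Solving dL/dz = -k(lambda, z) * L over the column gives
    # T(lambda) = exp(-integral of k dz), the fraction of light surviving one pass.
    dz = np.diff(altitudes_km)[:, None]
    optical_depth = np.sum(0.5 * (k[1:] + k[:-1]) * dz, axis=0)  # trapezoid rule
    return np.exp(-optical_depth)

print(column_transmittance(k, altitudes_km))  # one transmittance per band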

"The constraint of the differential equation, that physics-informed machine learning, is the secret sauce for making sure that this works well," Koch said. "By construction, what this model can issue is a prediction that will satisfy the first-order physics."

In addition to performance that hits the middle range between the off-the-shelf models and the high-fidelity approach, Koch's framework is bidirectional -- it can both remove the influence of the atmosphere from a spectral scene collected by a remote sensor and infer how a material on the ground would appear if imaged through a particular atmosphere.
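With the simplified radiance equation from the earlier sketch, the two directions are the same relation run forward and backward (all numbers assumed, single band for brevity):

import numpy as np

# Same simplified radiance equation as before, with assumed atmospheric terms.
E0, t_down, t_up, L_path = 1500.0, 0.85, 0.87, 8.0
cos_sza = np.cos(np.deg2rad(35.0))
gain = E0 * cos_sza * t_down * t_up / np.pi

# Inverse direction: strip the atmosphere from a measured band radiance.
measured_radiance = 45.0                      # assumed at-sensor value
surface_reflectance = (measured_radiance - L_path) / gain

# Forward direction: predict how a known material would appear through this atmosphere.
lab_reflectance = 0.18                        # assumed library value for one band
predicted_radiance = lab_reflectance * gain + L_path

print(surface_reflectance, predicted_radiance)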

"Some things are highlighted or hidden depending on where things are observed," Koch explained. "It's not a one-stop shop. You've got to poke and prod at where things are most fruitful."

Remote sensing is used for tasks that run the gamut from drought and vegetation indices that track changes in photosynthetic activity and water content over time to the detection of methane plumes, activity at foreign military bases, and human traffic at border crossings.

Different approaches for atmospheric correction are applied to different scenarios, depending on factors such as time, cost, and available data.

PNNL intern Luis Cedillo, an undergraduate at the University of Texas at El Paso, presented a conference poster at SPIE Defense and Commercial Sensing 2024 in National Harbor, Maryland, about using the physics-informed machine learning technique for coastal ecosystem health monitoring. He used the machine learning pipeline to jointly learn the profile of the atmosphere and coastal waters, unlocking a new capability to track the health of coral reefs from satellites.

The researchers are currently refining their approach with an eye toward applications where data is limited but high fidelity is required, such as target detection.

"The key benefit here is we can get good accuracy with a limited amount of data while not having to rely on a lot of prior knowledge in the sense of where the sensor was, or where the sun was," said Koch. "We're learning those things on the fly."
