Research Areas
Wildfire emissions
Estimating the impact of prescribed fire on later wildfire burn severity using Sentinel-2A remote sensing data and land management records.

Wildfire and Prescribed Fire Smoke Modeling and Mitigation

Due to a warming climate, a legacy of fire suppression, and expanding development into the wildland-urban interface (WUI), the western US has experienced a recent rise in extreme wildfire seasons. Wildfires not only damage ecosystems and infrastructure but also degrade air quality and pose serious public health risks from smoke exposure. Prescribed ("Rx") fire is often promoted as a policy solution in the western US, yet its use remains limited in practice and few studies have evaluated its effectiveness against wildfire impacts. My research is motivated by key gaps in our understanding: (1) we lack observational and modeling systems to accurately project how scaling Rx fire use would affect air quality and health outcomes in the western US; and (2) the efficacy of past Rx fire treatments remains poorly quantified across varied landscapes and fire seasons. It remains unclear whether expanding Rx burning will reduce wildfire risk or simply add to the smoke burden without preventing future fires. Some of my recent work shows that Rx fire treatments, while modestly effective, are frequently least successful in the WUI, a central focus of wildfire policy. Such findings highlight the limitations of current wildfire strategies and underscore the need for data-driven, policy-relevant approaches to guide the proposed expansion of Rx fire.
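Burn severity in this line of work is commonly estimated from the differenced Normalized Burn Ratio (dNBR) computed from pre- and post-fire satellite imagery. The sketch below is a minimal, generic illustration of that metric, not the analysis pipeline from the paper: it assumes surface reflectance in a near-infrared and a shortwave-infrared band (for Sentinel-2, B8 and B12 are a common choice) and uses the standard USGS dNBR breakpoints, simplified here to five classes.

```python
import numpy as np

def nbr(nir, swir):
    """Normalized Burn Ratio: (NIR - SWIR) / (NIR + SWIR)."""
    nir, swir = np.asarray(nir, dtype=float), np.asarray(swir, dtype=float)
    return (nir - swir) / (nir + swir)

def dnbr(nir_pre, swir_pre, nir_post, swir_post):
    """Differenced NBR: pre-fire NBR minus post-fire NBR (higher = more severe)."""
    return nbr(nir_pre, swir_pre) - nbr(nir_post, swir_post)

def severity_class(d):
    """Map a dNBR value to a simplified USGS burn-severity class."""
    bins = [0.1, 0.27, 0.44, 0.66]  # standard USGS dNBR breakpoints
    labels = ["unburned/low", "low", "moderate-low", "moderate-high", "high"]
    return labels[np.searchsorted(bins, d)]
```

Comparing dNBR inside and outside past Rx treatment footprints is one straightforward way to quantify treatment efficacy against later wildfire.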


Publications: Kelp et al. (2023), Earth's Future

Co-Authored Wildfire Publications: Qiu et al. (2024), Liu et al. (2024)


Machine-learned solver
Deep learning chemical solver embedded in a 3-D global chemical transport model.

Deep Learning Atmospheric Chemistry

Ozone is a harmful surface pollutant to human health and an important greenhouse gas in the free troposphere. Despite being the longest- and most-measured trace gas in our observational record, global atmospheric models show wide disagreement in ozone’s spatial and temporal patterns and trends. Modeling tropospheric ozone is perhaps the ultimate test of a global model's skill, as bias can arise from any of its physical processes: tropical emissions, nonlinear NOx-VOC chemistry, stratosphere-troposphere exchange, boundary layer mixing, or missing chemical mechanisms. Deep learning is well-suited to atmospheric chemistry modeling because it can learn complex, non-obvious interactions in air pollution data and accelerate computations of chemical simulations. My past work explored how deep learning can emulate and replace computationally expensive components of global models for fast, stable simulations. My current research focuses on developing transformer-based architectures for improved spatial and temporal generalization, applying transfer learning from AI foundation models, and leveraging self-supervised training methods specialized for remote sensing data. These efforts aim to build accurate, stable, and physically consistent ML models to enable the simulation of comprehensive atmospheric chemistry in Earth System models and better understand the drivers of ozone bias.
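The core idea of emulating an expensive solver can be sketched in a few lines. This is a toy illustration, not the architecture from the papers: the "chemistry" is a single species with first-order loss (a stand-in for a stiff multi-species mechanism), and scikit-learn's MLPRegressor stands in for the deep networks actually used. The emulator learns the solver's one-step map c(t) → c(t+Δt) and is then rolled out autoregressively, which is exactly where stability over many steps becomes the hard problem.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Toy "chemistry": first-order loss, so the exact solver step is
# c(t + dt) = c(t) * exp(-k * dt). Stand-in for a costly stiff ODE solve.
K, DT = 0.5, 1.0
def solver_step(c):
    return c * np.exp(-K * DT)

# Training data: concentrations sampled over the range of interest,
# labeled with the solver's output one step later.
rng = np.random.default_rng(0)
c0 = rng.uniform(0.0, 10.0, size=2000)
X, y = c0.reshape(-1, 1), solver_step(c0)

# Small MLP surrogate of the one-step solver map.
emulator = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=3000,
                        random_state=0).fit(X, y)

# Roll the emulator forward in time, as a transport model would each step;
# small one-step errors compound, which is why stability is the key metric.
c = 8.0
for _ in range(5):
    c = float(emulator.predict([[c]])[0])
```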


Publications: Kelp et al. (2022), JAMES; Kelp et al. (2020), JGR: Atmospheres; Kelp et al. (2018), arXiv

Future Outlook on Priorities of ML in Atmospheric Chemistry: Applications of Machine Learning and Artificial Intelligence in Tropospheric Ozone Research


PM2.5 sensor locations
Distribution of sensor locations in the EPA monitoring network compared to those identified as optimal by the compressed sensing (mrDMD) algorithm in the western US.

Data-Driven Air Pollution Sensing

Despite major investments in air quality (AQ) monitoring, existing sensor networks often fail to capture extreme air pollution events. My research uses data-driven methods to improve the design of sensor networks, air quality forecasts, and environmental early warning systems. In one national-scale study, I applied a compressed sensing algorithm, a signal-processing method that uncovers the key spatiotemporal patterns in data, to determine optimal AQ sensor locations based on recent pollution trends. This analysis revealed major gaps in the EPA's current monitoring network across the western US, particularly in regions affected by wildfire smoke. In a related study, I incorporated equity constraints into the sensor network optimization to improve coverage in historically segregated neighborhoods in cities such as St. Louis and Houston. These approaches provide a foundation for rethinking how we design air quality monitoring networks to better capture extreme events and ensure more equitable coverage.

At the same time, commercial platforms increasingly deliver AQ forecasts through proprietary systems, raising concerns about transparency and public accessibility. These systems are likely to become more prevalent in the coming decade due to the rapid commercialization of environmental data and advances in AI and cloud computing. In response, my research is guided by a set of core questions: What are the early warning signals of extreme air pollution (fires, inversions, smog)? Can open data outperform commercial forecasts? And can we learn more from air quality measurements than what is directly observed?
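The sensor-placement idea can be sketched compactly. The published work uses multiresolution DMD (mrDMD); the sketch below substitutes the closely related POD/QR-pivoting approach to sparse sensor placement on synthetic low-rank "pollution" data, so the specific modes and data here are illustrative assumptions. The pivoted QR factorization of the mode matrix picks the monitoring sites that best condition reconstruction of the full field from a handful of sensors.

```python
import numpy as np
from scipy.linalg import qr, svd

# Synthetic "pollution" data: n_sites locations over n_days, generated
# from a few latent spatiotemporal patterns (i.e., low-rank structure).
rng = np.random.default_rng(1)
n_sites, n_days, r = 200, 100, 5
modes = rng.standard_normal((n_sites, r))   # spatial patterns
coeffs = rng.standard_normal((r, n_days))   # daily amplitudes
data = modes @ coeffs

# 1. Learn the dominant spatial patterns (POD/SVD here; mrDMD in the paper).
U, s, Vt = svd(data, full_matrices=False)
Psi = U[:, :r]                              # leading r modes

# 2. QR with column pivoting on the modes selects r near-optimal sites.
_, _, piv = qr(Psi.T, pivoting=True)
sensors = piv[:r]                           # indices of chosen locations

# 3. Reconstruct the full field from only those r sensor readings.
C = np.zeros((r, n_sites)); C[np.arange(r), sensors] = 1.0
recon = Psi @ np.linalg.solve(C @ Psi, data[sensors, :])
```

Comparing the selected indices against the locations of existing EPA monitors is the kind of gap analysis described above; equity constraints can be layered on by restricting or re-weighting which columns are eligible for pivoting.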


Publications: Kelp et al. (2023), GeoHealth; Kelp et al. (2022), ERL

Co-Authored Sensor Publications: Kawano et al. (2025), Yang et al. (2022)