
Jared Willard


Mini-Batch Learning Strategies for modeling long term temporal dependencies: A study in environmental applications

Oct 15, 2022
Shaoming Xu, Ankush Khandelwal, Xiang Li, Xiaowei Jia, Licheng Liu, Jared Willard, Rahul Ghosh, Kelly Cutler, Michael Steinbach, Christopher Duffy, John Nieber, Vipin Kumar

In many environmental applications, recurrent neural networks (RNNs) are used to model physical variables with long temporal dependencies. However, due to mini-batch training, temporal relationships between training segments within a batch (intra-batch) as well as between batches (inter-batch) are not considered, which can limit performance. Stateful RNNs aim to address this issue by passing hidden states between batches. Since Stateful RNNs still ignore intra-batch temporal dependencies, there exists a trade-off between training stability and capturing temporal dependency. In this paper, we provide a quantitative comparison of different Stateful RNN modeling strategies and propose two strategies to enforce both intra- and inter-batch temporal dependency. First, we extend Stateful RNNs by defining a batch as a temporally ordered set of training segments, which enables intra-batch sharing of temporal information. While this approach significantly improves performance, it leads to much longer training times because training becomes highly sequential. To address this issue, we further propose a new strategy that augments a training segment with an initial value of the target variable from the timestep immediately before the start of the training segment. In other words, we provide an initial value of the target variable as an additional input so that the network can focus on learning changes relative to that initial value. With this strategy, samples can be passed in any order (standard mini-batch training), which significantly reduces training time while maintaining performance. In demonstrating our approach on hydrological modeling, we observe that the most significant gains in predictive accuracy occur when these methods are applied to state variables whose values change slowly, such as soil water and snowpack, rather than to continuously varying flux variables such as streamflow.
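
The initial-value augmentation described above can be illustrated with a short sketch. The snippet below is a minimal, assumed illustration in PyTorch (not the authors' code): it attaches the target value from the timestep just before each training segment as an extra input feature, so segments can then be shuffled into ordinary mini-batches. Segment length, tensor shapes, and the LSTM backbone are placeholders.

```python
import torch
import torch.nn as nn

def make_segments(x, y, seg_len):
    """Split a long series x (T, F) and target y (T,) into segments, attaching
    the target value from the timestep just before each segment as a feature."""
    segments = []
    for start in range(1, x.shape[0] - seg_len + 1, seg_len):
        y0 = y[start - 1]                              # initial target value
        x_seg = x[start:start + seg_len]               # (seg_len, F)
        y_seg = y[start:start + seg_len]               # (seg_len,)
        y0_feat = y0.expand(seg_len, 1)                # broadcast y0 over the segment
        segments.append((torch.cat([x_seg, y0_feat], dim=-1), y_seg))
    return segments

class SegmentLSTM(nn.Module):
    """LSTM that takes the augmented input (original features + initial target)."""
    def __init__(self, n_features, hidden=32):
        super().__init__()
        self.lstm = nn.LSTM(n_features + 1, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, x_aug):                          # (B, seg_len, F+1)
        h, _ = self.lstm(x_aug)
        return self.head(h).squeeze(-1)                # (B, seg_len)
```

Because each segment carries its own initial condition, the segments can be fed to a standard shuffled data loader rather than being processed in strict temporal order.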

* submitted to SIAM International Conference on Data Mining (SDM23) 

Physics-Guided Recurrent Graph Networks for Predicting Flow and Temperature in River Networks

Sep 26, 2020
Xiaowei Jia, Jacob Zwart, Jeffery Sadler, Alison Appling, Samantha Oliver, Steven Markstrom, Jared Willard, Shaoming Xu, Michael Steinbach, Jordan Read, Vipin Kumar

This paper proposes a physics-guided machine learning approach that combines advanced machine learning models and physics-based models to improve the prediction of water flow and temperature in river networks. We first build a recurrent graph network model to capture the interactions among multiple segments in the river network. Then we present a pre-training technique that transfers knowledge from physics-based models to initialize the machine learning model and learn the physics of streamflow and thermodynamics. We also propose a new loss function that balances performance over different river segments. We demonstrate the effectiveness of the proposed method in predicting temperature and streamflow in a subset of the Delaware River Basin. In particular, we show that the proposed method brings a 33%/14% improvement over the state-of-the-art physics-based model and a 24%/14% improvement over traditional machine learning models (e.g., a Long Short-Term Memory network) in temperature/streamflow prediction using very sparse (0.1%) observation data for training. The proposed method also produces better performance when generalized to different seasons or to river segments with different streamflow ranges.
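
As one concrete, assumed illustration of balancing the loss across river segments, the sketch below averages the squared error within each segment before averaging across segments, so that densely observed segments do not dominate training. The paper's exact loss may differ; function names and tensor shapes are placeholders.

```python
import torch

def segment_balanced_mse(pred, target, segment_id, mask):
    """pred, target: (N,) values over all timesteps and river segments.
    segment_id: (N,) integer id of the river segment for each value.
    mask: (N,) boolean, True where an observation exists (observations are sparse)."""
    per_segment = []
    for sid in segment_id.unique():
        sel = (segment_id == sid) & mask
        if sel.any():
            per_segment.append(((pred[sel] - target[sel]) ** 2).mean())
    # Each observed segment contributes equally, regardless of how many
    # observations it has.
    return torch.stack(per_segment).mean()
```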

Integrating Physics-Based Modeling with Machine Learning: A Survey

Apr 01, 2020
Jared Willard, Xiaowei Jia, Shaoming Xu, Michael Steinbach, Vipin Kumar

In this manuscript, we provide a structured and comprehensive overview of techniques to integrate machine learning with physics-based modeling. First, we summarize the application areas in which these approaches have been used. Then, we describe the classes of methodologies used to construct physics-guided machine learning models and hybrid physics-machine learning frameworks from a machine learning standpoint. With this foundation, we systematically organize these existing techniques and discuss ideas for future research.

* 11 pages, 4 figures, submitted to IJCAI 

Physics-Guided Machine Learning for Scientific Discovery: An Application in Simulating Lake Temperature Profiles

Jan 28, 2020
Xiaowei Jia, Jared Willard, Anuj Karpatne, Jordan S Read, Jacob A Zwart, Michael Steinbach, Vipin Kumar

Physics-based models of dynamical systems are often used to study engineering and environmental systems. Despite their extensive use, these models have several well-known limitations due to simplified representations of the physical processes being modeled or challenges in selecting appropriate parameters. While state-of-the-art machine learning models can sometimes outperform physics-based models given ample training data, they can produce results that are physically inconsistent. This paper proposes a physics-guided recurrent neural network model (PGRNN) that combines RNNs and physics-based models to leverage their complementary strengths and improve the modeling of physical processes. Specifically, we show that a PGRNN can improve prediction accuracy over that of physics-based models, while generating outputs consistent with physical laws. An important aspect of our PGRNN approach lies in its ability to incorporate the knowledge encoded in physics-based models. This allows training the PGRNN model with very little observed data while still ensuring high prediction accuracy. Although we present and evaluate this methodology in the context of modeling the dynamics of temperature in lakes, it is applicable more widely to a range of scientific and engineering disciplines where physics-based (also known as mechanistic) models are used, e.g., climate science, materials science, computational chemistry, and biomedicine.
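
The knowledge-transfer idea can be sketched as a simple two-stage loop: fit the recurrent model to the physics-based model's simulated outputs first, then fine-tune on the sparse true observations. The code below is an assumed, minimal illustration, not the authors' implementation; the model, data loaders, and hyperparameters are placeholders.

```python
import torch
import torch.nn as nn

def pretrain_then_finetune(model, sim_loader, obs_loader,
                           pretrain_epochs=50, finetune_epochs=20, lr=1e-3):
    loss_fn = nn.MSELoss()
    opt = torch.optim.Adam(model.parameters(), lr=lr)

    # Stage 1: learn the physics encoded in the simulator's outputs.
    for _ in range(pretrain_epochs):
        for x, y_sim in sim_loader:          # y_sim: physics-model simulations
            opt.zero_grad()
            loss_fn(model(x), y_sim).backward()
            opt.step()

    # Stage 2: fine-tune on the (few) real observations.
    for _ in range(finetune_epochs):
        for x, y_obs in obs_loader:          # y_obs: sparse observed data
            opt.zero_grad()
            loss_fn(model(x), y_obs).backward()
            opt.step()
    return model
```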

* arXiv admin note: text overlap with arXiv:1810.13075 

Physics Guided RNNs for Modeling Dynamical Systems: A Case Study in Simulating Lake Temperature Profiles

Oct 31, 2018
Xiaowei Jia, Jared Willard, Anuj Karpatne, Jordan Read, Jacob Zwart, Michael Steinbach, Vipin Kumar

This paper proposes a physics-guided recurrent neural network model (PGRNN) that combines RNNs and physics-based models to leverage their complementary strengths and improve the modeling of physical processes. Specifically, we show that a PGRNN can improve prediction accuracy over that of physics-based models, while generating outputs consistent with physical laws and achieving good generalizability. Standard RNNs, even when producing superior prediction accuracy, often produce physically inconsistent results and lack generalizability. We further enhance this approach with a pre-training method that leverages simulated data from a physics-based model to address the scarcity of observed data. The PGRNN also has the flexibility to incorporate additional physical constraints; here we incorporate a density-depth relationship. Both enhancements further improve PGRNN performance. Although we present and evaluate this methodology in the context of modeling the dynamics of temperature in lakes, it is applicable more widely to a range of scientific and engineering disciplines where mechanistic (also known as process-based) models are used, e.g., power engineering, climate science, materials science, computational chemistry, and biomedicine.
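
The density-depth constraint mentioned above can be expressed as a penalty on predictions in which an upper water layer is denser than the layer below it. The sketch below is an assumed illustration, not the paper's exact formulation; the density function is a standard freshwater approximation relating density to temperature.

```python
import torch

def water_density(temp_c):
    """Approximate freshwater density (kg/m^3) from temperature (deg C)."""
    return 1000.0 * (1.0 - (temp_c + 288.9414) * (temp_c - 3.9863) ** 2
                     / (508929.2 * (temp_c + 68.12963)))

def density_depth_penalty(pred_temp):
    """pred_temp: (batch, n_depths) predicted temperatures ordered from
    shallow to deep. Penalize any decrease of density with depth."""
    rho = water_density(pred_temp)
    violation = torch.relu(rho[:, :-1] - rho[:, 1:])   # > 0 where density decreases with depth
    return violation.mean()
```

A weighted version of such a penalty can be added to the usual prediction loss during training.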
