Michael F. Zimmer

Comment on "Machine learning conservation laws from differential equations"

Apr 03, 2024
Michael F. Zimmer

Constants of Motion for Conserved and Non-conserved Dynamics

Mar 28, 2024
Michael F. Zimmer

Extracting Dynamical Models from Data

Oct 27, 2021
Michael F. Zimmer

2nd-order Updates with 1st-order Complexity

May 27, 2021
Michael F. Zimmer

Neograd: Gradient Descent with a Near-Ideal Learning Rate

Oct 25, 2020
Michael F. Zimmer

Neograd: gradient descent with an adaptive learning rate

Oct 15, 2020
Michael F. Zimmer

Speedup from a different parametrization within the Neural Network algorithm

Jun 02, 2017
Michael F. Zimmer
