Lucas Liebenwein

End-to-End Sensitivity-Based Filter Pruning

Apr 15, 2022
Zahra Babaiee, Lucas Liebenwein, Ramin Hasani, Daniela Rus, Radu Grosu

Compressing Neural Networks: Towards Determining the Optimal Layer-wise Decomposition

Jul 23, 2021
Lucas Liebenwein, Alaa Maalouf, Oren Gal, Dan Feldman, Daniela Rus

Closed-form Continuous-Depth Models

Jun 25, 2021
Ramin Hasani, Mathias Lechner, Alexander Amini, Lucas Liebenwein, Max Tschaikowski, Gerald Teschl, Daniela Rus

Sparse Flows: Pruning Continuous-depth Models

Jun 24, 2021
Lucas Liebenwein, Ramin Hasani, Alexander Amini, Daniela Rus

Low-Regret Active Learning

Apr 06, 2021
Cenk Baykal, Lucas Liebenwein, Dan Feldman, Daniela Rus

Lost in Pruning: The Effects of Pruning Neural Networks beyond Test Accuracy

Mar 04, 2021
Lucas Liebenwein, Cenk Baykal, Brandon Carter, David Gifford, Daniela Rus

Deep Latent Competition: Learning to Race Using Visual Control Policies in Latent Space

Feb 19, 2021
Wilko Schwarting, Tim Seyde, Igor Gilitschenski, Lucas Liebenwein, Ryan Sander, Sertac Karaman, Daniela Rus

Machine Learning-based Estimation of Forest Carbon Stocks to increase Transparency of Forest Preservation Efforts

Dec 17, 2019
Björn Lütjens, Lucas Liebenwein, Katharina Kramer

Provable Filter Pruning for Efficient Neural Networks

Nov 18, 2019
Lucas Liebenwein, Cenk Baykal, Harry Lang, Dan Feldman, Daniela Rus

SiPPing Neural Networks: Sensitivity-informed Provable Pruning of Neural Networks

Oct 11, 2019
Cenk Baykal, Lucas Liebenwein, Igor Gilitschenski, Dan Feldman, Daniela Rus
