Itamar Zimerman

Efficient Decoding Methods for Language Models on Encrypted Data
Sep 10, 2025

Differential Mamba
Jul 08, 2025

Overclocking LLM Reasoning: Monitoring and Controlling Thinking Path Lengths in LLMs
Jun 08, 2025

Overflow Prevention Enhances Long-Context Recurrent LLMs
May 12, 2025

On the Expressivity of Selective State-Space Layers: A Multivariate Polynomial Approach
Feb 04, 2025

Power-Softmax: Towards Secure LLM Inference over Encrypted Data
Oct 12, 2024

DeciMamba: Exploring the Length Extrapolation Potential of Mamba
Jun 20, 2024

A Unified Implicit Attention Formulation for Gated-Linear Recurrent Sequence Models
May 26, 2024

The Hidden Attention of Mamba Models
Mar 03, 2024

On the Long Range Abilities of Transformers
Nov 28, 2023