Hofit Bata

Jamba: A Hybrid Transformer-Mamba Language Model

Mar 28, 2024
Opher Lieber, Barak Lenz, Hofit Bata, Gal Cohen, Jhonathan Osin, Itay Dalmedigos, Erez Safahi, Shaked Meirom, Yonatan Belinkov, Shai Shalev-Shwartz, Omri Abend, Raz Alon, Tomer Asida, Amir Bergman, Roman Glozman, Michael Gokhman, Avashalom Manevich, Nir Ratner, Noam Rozen, Erez Shwartz, Mor Zusman, Yoav Shoham

MRKL Systems: A modular, neuro-symbolic architecture that combines large language models, external knowledge sources and discrete reasoning

May 01, 2022
Ehud Karpas, Omri Abend, Yonatan Belinkov, Barak Lenz, Opher Lieber, Nir Ratner, Yoav Shoham, Hofit Bata, Yoav Levine, Kevin Leyton-Brown, Dor Muhlgay, Noam Rozen, Erez Schwartz, Gal Shachaf, Shai Shalev-Shwartz, Amnon Shashua, Moshe Tenenholtz

Limits to Depth Efficiencies of Self-Attention

Jun 22, 2020
Yoav Levine, Noam Wies, Or Sharir, Hofit Bata, Amnon Shashua
