
Paul Smolensky

Scalable knowledge base completion with superposition memories

Oct 24, 2021

Enriching Transformers with Structured Tensor-Product Representations for Abstractive Summarization

Jun 02, 2021

Compositional Processing Emerges in Neural Networks Solving Math Problems

May 19, 2021

Neuro-Symbolic Representations for Video Captioning: A Case for Leveraging Inductive Biases for Vision and Language

Nov 18, 2020

Universal linguistic inductive biases via meta-learning

Jun 29, 2020

Discovering the Compositional Structure of Vector Representations with Role Learning Networks

Nov 17, 2019

HUBERT Untangles BERT to Improve Transfer across NLP Tasks

Oct 25, 2019

Enhancing the Transformer with Explicit Relational Encoding for Math Problem Solving

Oct 15, 2019

Natural- to formal-language generation using Tensor Product Representations

Oct 05, 2019

RNNs Implicitly Implement Tensor Product Representations

Dec 20, 2018