
Paul Smolensky

Implicit Chain of Thought Reasoning via Knowledge Distillation

Nov 02, 2023

Differentiable Tree Operations Promote Compositional Generalization

Jun 01, 2023

Uncontrolled Lexical Exposure Leads to Overestimation of Compositional Generalization in Pretrained Models

Dec 21, 2022

Structural Biases for Improving Transformers on Translation into Morphologically Rich Languages

Aug 11, 2022

Neurocompositional computing: From the Central Paradox of Cognition to a new generation of AI systems

May 02, 2022

How much do language models copy from their training data? Evaluating linguistic novelty in text generation using RAVEN

Nov 18, 2021

Distributed neural encoding of binding to thematic roles

Oct 24, 2021

Scalable knowledge base completion with superposition memories

Oct 24, 2021

Enriching Transformers with Structured Tensor-Product Representations for Abstractive Summarization

Jun 02, 2021

Compositional Processing Emerges in Neural Networks Solving Math Problems

May 19, 2021