
Idan Szpektor


RefVNLI: Towards Scalable Evaluation of Subject-driven Text-to-image Generation

Apr 24, 2025

Latent Beam Diffusion Models for Decoding Image Sequences

Mar 26, 2025

Gemma 3 Technical Report

Mar 25, 2025

ECLeKTic: a Novel Challenge Set for Evaluation of Cross-Lingual Knowledge Transfer

Feb 28, 2025

Bridging the Visual Gap: Fine-Tuning Multimodal Models with Knowledge-Adapted Captions

Nov 13, 2024

MDCure: A Scalable Pipeline for Multi-Document Instruction-Following

Oct 30, 2024

Distinguishing Ignorance from Error in LLM Hallucinations

Oct 29, 2024

Are LLMs Better than Reported? Detecting Label Errors and Mitigating Their Effect on Model Performance

Oct 24, 2024

Localizing Factual Inconsistencies in Attributable Text Generation

Oct 09, 2024

LLMs Know More Than They Show: On the Intrinsic Representation of LLM Hallucinations

Oct 03, 2024