
Neha Nagaraja

Goal-Driven Risk Assessment for LLM-Powered Systems: A Healthcare Case Study

Mar 04, 2026

Image-based Prompt Injection: Hijacking Multimodal LLMs through Visually Embedded Adversarial Instructions

Mar 04, 2026

To Protect the LLM Agent Against the Prompt Injection Attack with Polymorphic Prompt

Jun 06, 2025