I am a doctoral researcher at SFB TRR318 “Constructing Explainability”, a Collaborative Research Center at Paderborn University and Bielefeld University. I am supervised by Prof. Dr. Katharina J. Rohlfing, with Prof. Dr.-Ing. Britta Wrede as co-PI in Project A05. My work lies at the interdisciplinary intersection of Cognitive Science and Psycholinguistics, drawing on statistical modeling and computational cognitive science. Before joining SFB TRR318, I completed an M.Sc. in Cognitive Science and a Bachelor’s degree in German Studies at JNU.

I investigate multimodal processes in both humans and artificial agents to understand the desiderata of successful explanations. The core idea across my research is to integrate XAI techniques with interpretable cognitive/neurosymbolic models to enable holistic explanation generation.

This goes beyond applying ‘cognitively agnostic’ classical XAI techniques (such as LIME/SHAP or contrastive explanations) in isolation; instead, it embeds XAI methods within theories and cognitive models of (language) processing.

Methodologically, I study human–human explanation using eye-tracking and psycholinguistic experimentation.

For human–AI teaming, I work with foundation models (prompting, feature extraction, fine-tuning), human–robot interaction (HRI), and statistical modeling (mostly Bayesian) to quantify explanation practices.
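As a flavor of what “Bayesian modeling of explanation practices” can mean in its simplest form, here is a minimal sketch of a conjugate Beta–Binomial update. The scenario and numbers are invented for illustration (e.g., how often an explainee solves a task after an explanation); real analyses would use richer hierarchical models.

```python
import random

def beta_binomial_posterior(successes, trials, a_prior=1.0, b_prior=1.0):
    """Conjugate update: Beta(a, b) prior + Binomial data -> Beta posterior."""
    a = a_prior + successes
    b = b_prior + (trials - successes)
    return a, b

# Hypothetical data: in 14 of 20 dialogues the explainee solved the task.
a, b = beta_binomial_posterior(14, 20)
posterior_mean = a / (a + b)  # 15 / 22, roughly 0.68

# Monte Carlo 95% credible interval drawn from the Beta posterior.
random.seed(0)
draws = sorted(random.betavariate(a, b) for _ in range(10_000))
ci_low, ci_high = draws[249], draws[9749]
print(f"mean={posterior_mean:.2f}, 95% CI=({ci_low:.2f}, {ci_high:.2f})")
```

The conjugate form keeps the example self-contained; in practice one would fit such models with a probabilistic programming framework and include predictors for the explanation strategy.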

Updates

A few recent highlights from my work:

TeaP 2026, Tübingen

I recently presented my work at TeaP 2026 in Tübingen. Read the abstract

IEEE ICDL 2025, Prague

At IEEE ICDL 2025 in Prague, I gave a talk on language–vision interaction in dialogical human–robot interaction. Watch the talk on YouTube

WinGaze

WinGaze: A tool for exploring multimodal interaction in HRI. View on GitHub

Scenario illustration from multimodal interaction studies