I am a doctoral researcher at SFB TRR318 “Constructing Explainability”, a Collaborative Research Center at Paderborn University and Bielefeld University. I am supervised by Prof. Dr. Katharina J. Rohlfing, with Prof. Dr.-Ing. Britta Wrede as co-PI in Project A05. My work lies at the interdisciplinary intersection of cognitive science and psycholinguistics, drawing on statistical modeling and computational cognitive science. Before joining SFB TRR318, I completed an M.Sc. in Cognitive Science and a Bachelor’s degree in German Studies (Germanistik) at JNU.

I investigate multimodal processes in both humans and artificial agents to better understand the desiderata of successful explanations. In particular, I am keen to bring established methods from psycholinguistics and cognitive science, such as online language processing, language–vision interaction, and the visual world paradigm, into HRI and explainability research.

Methodologically, I study human–human explanation using eye-tracking and psycholinguistic experimentation.

For human–AI teaming, I work with foundation models, human–robot interaction, and statistical modeling (mostly Bayesian).

Updates

A few recent highlights from my work:

The open-access book on Social Explainable AI (sXAI) is now out. Read

I contributed a chapter on Explanation Goals. Link

TeaP 2026, Tübingen

I recently presented my work at TeaP 2026 in Tübingen. Read the abstract

IEEE ICDL 2025, Prague

At IEEE ICDL 2025 in Prague, I gave a talk on language–vision interaction in dialogical human–robot interaction. Watch the talk on YouTube

WinGaze

WinGaze: A tool for exploring multimodal interaction in HRI. View on GitHub

Scenario illustration from multimodal interaction studies