Dementia Care, Cueing, and Everyday Support

This research area explores how interactive systems can support people living with dementia in everyday life, communication, and cognitively complex activities. Our work develops embodied, self-cueing, and multimodal prompting systems that scaffold memory, task comprehension, confidence, and relational engagement.

Key Research Themes

  • Embodied cueing and movement-based interaction
  • Self-cueing and adaptive prompting systems
  • Multimodal interaction (visual, auditory, object-based)
  • Relational communication and co-regulation
  • Everyday activities and ritual continuity

Projects

CogniPrompt

A conversational AI system designed to support everyday planning, engagement, and communication through a non-human agent interface. It explores relational interaction and sustained engagement in dementia care.

Cue-D

A multimodal prompting system that supports task completion through contextual cues and adaptive interaction, grounded in participatory design with people living with dementia.

Self-Cueing

Systems that enable individuals to generate and use their own cues, supporting autonomy, agency, and continuity in everyday activities.

Mixed Reality Interactions

Early work exploring embodied cueing through AR/MR systems, informing later developments in prompting architectures and interaction design.

Team and Trainees

This work is led by Dr. Shital Desai and involves PhD, Master’s, and undergraduate researchers working across interaction design, health, and emerging technologies.

Partners and Collaborators

  • Memory & Company
  • Memory Lane Home Living
  • Alzheimer Society
  • Care communities and clinical partners

Selected Outputs

  • Publications in Frontiers in Psychology and the International Journal of Human-Computer Studies (IJHCS)
  • Conference papers at CHI, DIS, and ISMAR
  • Prototype systems and design frameworks