My current research interests lie in human-centric multimedia signal processing. My work concentrates on two major areas in which the data is inherently multimodal.


Emotion & Behavior Analysis: Understanding human emotion and behavior is critical to many human-centric systems involving human-human and human-machine interaction. Human emotion and behavior are inherently multimodal, i.e., they are perceived and expressed through different verbal and non-verbal cues. We use image/video, audio and physiological signals to capture various behavioral and emotional cues, and analyze them using a combination of signal processing and machine learning techniques.

Multimedia Analytics: Media content is another source of rich, complex multimodal data (video, audio, metadata, text). Given the huge impact media has on our lives, content analysis can help generate insights, improve experiences and interactions, and uncover social trends. Beyond these application possibilities, media data is extremely challenging to process. We combine computer vision, machine learning and signal processing to develop efficient algorithms for multimedia information analysis.


Out of personal interest in music, I also conduct research on music signal processing, particularly on understanding emotion in music.

Ongoing projects

Role of head motion in affect

Behavior analysis in autism

Behavior in public speaking

Movie content analysis

Affect analysis in image, video and audio