Vignesh Prasad

I am now a Post-Doc at the PEARL Lab at TU Darmstadt with Prof. Georgia Chalvatzaki.

Vignesh Prasad joined TU Darmstadt in July 2019 as a Ph.D. student working on Learning Physically Interactive Human-Robot Interaction for Humanoid Social Robots. Vignesh is jointly supervised by Jan Peters, Georgia Chalvatzaki, Dorothea Koert and Ruth Stock-Homburg. His current areas of research include Human-Robot Interaction, Learning from Demonstrations, Human Motion Prediction, and Social Robotics.

Prior to this, Vignesh worked as a researcher in the Machine Vision Group at TCS Innovation Labs, Kolkata under Dr. Brojeshwar Bhowmick, where he worked on Deep Learning for Monocular 3D Reconstruction and Computer Vision. During this time, Vignesh's work won the Best Paper Award at the 2018 Indian Conference on Computer Vision, Graphics and Image Processing (ICVGIP). Vignesh pursued his Bachelor's and Master's in Computer Science and Engineering at IIIT Hyderabad, India. His Master's thesis, titled "Learning Effective Navigational Strategies for Active Monocular Simultaneous Localization and Mapping", was done at the Robotics Research Center under Dr. K. Madhava Krishna in collaboration with Prof. Balaraman Ravindran.

Publications

  •   Prasad, V.; Heitlinger, L.; Koert, D.; Stock-Homburg, R.; Peters, J.; Chalvatzaki, G. (submitted). Learning Multimodal Latent Dynamics for Human-Robot Interaction, submitted to the IEEE Transactions on Robotics (T-RO).
  •   Prasad, V.; Kshirsagar, A.; Koert, D.; Stock-Homburg, R.; Peters, J.; Chalvatzaki, G. (in press). MoVEInt: Mixture of Variational Experts for Learning Human-Robot Interactions from Demonstrations, IEEE Robotics and Automation Letters (RA-L).
  •   Goeksu, Y.; Almeida-Correia, A.; Prasad, V.; Kshirsagar, A.; Koert, D.; Peters, J.; Chalvatzaki, G. (2024). Kinematically Constrained Human-like Bimanual Robot-to-Human Handovers, ACM/IEEE International Conference on Human-Robot Interaction (HRI), Late Breaking Report.
  •   Gassen, M.; Metzler, F.; Prescher, E.; Prasad, V.; Scherf, L.; Kaiser, F.; Koert, D. (2023). I^3: Interactive Iterative Improvement for Few-Shot Action Segmentation, 32nd IEEE International Conference on Robot and Human Interactive Communication (RO-MAN).
  •   Prasad, V. (2023). Learning Human-Robot Interaction: A Case Study on Human-Robot Handshaking, Ph.D. Thesis.
  •   Prasad, V.; Stock-Homburg, R.; Peters, J. (2022). Human-Robot Handshaking: A Review, International Journal of Social Robotics (IJSR), 14(1), pp. 277-293.
  •   Prasad, V.; Koert, D.; Stock-Homburg, R.; Peters, J.; Chalvatzaki, G. (2022). MILD: Multimodal Interactive Latent Dynamics for Learning Human-Robot Interaction, IEEE-RAS International Conference on Humanoid Robots (Humanoids).
  •   Prasad, V.; Stock-Homburg, R.; Peters, J. (2021). Learning Human-like Hand Reaching for Human-Robot Handshaking, Proceedings of the IEEE International Conference on Robotics and Automation (ICRA).
  •   Stock-Homburg, R.; Peters, J.; Schneider, K.; Prasad, V.; Nukovic, L. (2020). Evaluation of the Handshake Turing Test for Anthropomorphic Robots, Proceedings of the ACM/IEEE International Conference on Human-Robot Interaction (HRI), Late Breaking Report.
  •   Prasad, V.; Stock-Homburg, R.; Peters, J. (2020). Advances in Human-Robot Handshaking, International Conference on Social Robotics (ICSR), Springer.

For a full list of his publications, please see his Google Scholar page.

Supervised Theses and Projects

Year | Thesis/Project | Student(s) | Topic | Together with
Ongoing | Bachelor Thesis | Fabian Hahne | Hierarchical Hidden Markov Models for Interaction Segmentation and Learning |
Ongoing | Bachelor Thesis | Arne Backstein | Learning Human-Robot Interaction using Normalizing Flows |
2023 | Integrated Project | Frederic Metzler, Martina Gassen, Erik Prescher | I^3: Interactive Iterative Improvement for Few-Shot Action Segmentation | Lisa Scherf, Felix Kaiser
2023 | Integrated Project | Antonio De Almeida Correia, Yasemin Göksu | Learning Action Representations For Primitives-Based Motion Generation | Alap Kshirsagar
2023 | Master Thesis | Ruiyong Pi | Bluetooth Low Energy Localization for the Social Robot Zenbo | Sven Schultze
2023 | Master Thesis | Rukang Xu | SLAM-itation: SLAM-based Robotic Teleoperation | Suman Pal
2023 | Bachelor Thesis | Hongzhe Gao | Understanding Haptic Emotions for Human-Robot Handshaking |
2022 | Bachelor Thesis | Erik Prescher | Visual Hierarchical Interaction Recognition and Segmentation |
2022 | Bachelor Thesis | Louis Sterker | Social Affordance Segmentation and Learning using Hidden Semi-Markov Models |
2022 | Master Thesis | Oriol Hinojo Comellas | Binaural Sound Localisation with Spiking Neural Networks | Sven Schultze
2022 | Master Thesis | Yannik Frisch | Analysis of Self-supervised Keypoint Detection Methods for Robot Learning | Ali Younes, Georgia Chalvatzaki
2022 | Master Thesis | Zhicheng Yang | Exploring Gripping Behaviours and Haptic Emotions for Human-Robot Handshaking |
2021 | Bachelor Thesis | Martina Gassen | Learning a Library of Physical Interactions for Social Robots | Dorothea Koert
2021 | Master Thesis | Maxim Redkin | Personalizing Customer Interactions with Service Robots using Hand Gestures |
2021 | Bachelor Thesis | Lennard Scherbring | Analyzing the Role of Physical Interactions on Robot Acceptance |
2021 | Master Thesis | Michel Kohl | Learning Latent Interaction Models using Interaction Primitives |
2020 | Bachelor Thesis | Yug Ajmera | Learning Movement Primitives for Handshaking Behaviours |
2020 | Bachelor Thesis | Mark Baierl | Learning Action Representations For Primitives-Based Motion Generation |