For this research project, focused on enhancing haptic navigation and tactile feedback systems, we are seeking a highly motivated and skilled PhD student. This interdisciplinary project combines elements of joint-action models, sensory substitution, and human-computer interaction to develop innovative solutions for individuals with visual disabilities and prosthetic users, as well as applications in virtual reality (VR) environments.
Job Description
In this project you will develop a joint-action model for haptic navigation, based on existing Bayesian decision-making or dynamic neural field theories. You will design and implement vibrotactile feedback in a working prototype, initially for goal-directed hand movements in VR environments and later extended to whole-body movements. To evaluate the effectiveness of the tactile feedback system, you will plan and conduct lab experiments with human participants. You will apply psychophysical models and methods and analyze time series of movement trajectories to assess the performance and impact of the haptic navigation cues. You will use Unity, Unreal Engine, C#, or Python to program and refine the VR-based prototype.
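To give a flavor of the kind of modeling involved (this is an illustrative sketch, not part of the project specification), a Bayesian approach might combine two noisy estimates of hand position by reliability weighting and then map the hand-to-target distance onto a vibrotactile amplitude. All function and parameter names below are hypothetical.

```python
import numpy as np

def fuse_gaussian_cues(mu_a, sigma_a, mu_b, sigma_b):
    """Reliability-weighted (Bayesian) fusion of two Gaussian position estimates."""
    w_a = sigma_b**2 / (sigma_a**2 + sigma_b**2)          # weight of the more reliable cue
    mu = w_a * mu_a + (1 - w_a) * mu_b                    # posterior mean
    sigma = np.sqrt(sigma_a**2 * sigma_b**2 / (sigma_a**2 + sigma_b**2))  # posterior SD
    return mu, sigma

def vibration_amplitude(hand_pos, target_pos, max_dist=0.5):
    """Map hand-to-target distance onto a normalized vibrotactile amplitude.
    Amplitude grows as the hand approaches the target (hypothetical mapping)."""
    dist = np.linalg.norm(np.asarray(target_pos) - np.asarray(hand_pos))
    return float(np.clip(1.0 - dist / max_dist, 0.0, 1.0))

# Example: fuse a motion-capture estimate with a noisier model prediction,
# then compute the feedback amplitude for the fused hand position.
mu, sigma = fuse_gaussian_cues(mu_a=0.12, sigma_a=0.01, mu_b=0.10, sigma_b=0.03)
amp = vibration_amplitude(hand_pos=[mu, 0.0, 0.0], target_pos=[0.0, 0.0, 0.0])
print(f"fused estimate: {mu:.3f} +/- {sigma:.3f} m, vibration amplitude: {amp:.2f}")
```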
As a PhD researcher in the Human Technology Interaction group, you will work in a setting with extensive expertise in, and facilities for, conducting research with human participants. We have VR setups and an OptiTrack position sensing system. Moreover, you will work with wearables incorporating vibrotactile feedback.