Postdoctoral position – Dijon

The aviation industry must implement new technological means of capturing and processing human behavior, adapted to functional human factors demonstrations. Capture tools (eye tracking, motion tracking, physiological measures, etc.) have developed considerably in recent years. Similarly, the analysis of the resulting data and its interpretation in terms of human factors concepts (raw data processing, data fusion, etc.) constitutes a major and constantly evolving line of research. Given the very specific needs and requirements of human factors demonstrations (study period, efficiency and robustness of the results, short iteration loops), the data processing tools to be developed for handling the collected data and producing results for cockpit evaluations still need to be defined and implemented.
In this perspective, we are looking for a researcher familiar with biomechanical models and with how to implement them using motion capture (MOCAP) data.
The candidate should have:
• Programming skills (Matlab (preferred) and/or C++ and/or Python);
• Good knowledge of biomechanics (including the ability to set up models that can be integrated into software such as OpenSim or Mokka);
• Good knowledge of human motion capture devices would be a plus.
The work will consist of identifying a complete biomechanical model of the upper body (for an adult human, i.e. an airplane pilot), integrating it into software, and conducting biomechanical analyses with motion capture data (ART system) recorded on pilots. The ultimate goal is to extract metrics corresponding to Airbus human factors (HF) needs, essentially joint angles.
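Purely as an illustration (not part of the official call), the sketch below shows the kind of joint-angle metric involved, assuming 3-D marker positions (shoulder, elbow, wrist) have already been exported from the ART recordings; the marker names and values are hypothetical.

    import numpy as np

    def joint_angle(proximal, joint, distal):
        """Angle (degrees) at `joint` between the segments joint->proximal
        and joint->distal, e.g. elbow flexion from shoulder/elbow/wrist markers."""
        u = proximal - joint
        v = distal - joint
        cos_a = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
        return np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0)))

    # Hypothetical 3-D marker positions (metres) for one recorded frame
    shoulder = np.array([0.10, 0.35, 1.40])
    elbow    = np.array([0.12, 0.30, 1.15])
    wrist    = np.array([0.30, 0.28, 1.05])

    print(f"Elbow angle: {joint_angle(shoulder, elbow, wrist):.1f} deg")

In practice, the marker trajectories would be processed frame by frame and mapped onto the chosen upper-body model (for example via OpenSim), which goes beyond this sketch.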
Expected profile:
• PhD in Biomechanics, Human Movement Science, Cognitive Science, Cognitive Psychology, Physiological Ergonomics, Computer Science, or equivalent. The candidate must have solid knowledge of Biomechanics, (physiological) Ergonomics, and Computer Science, as well as good knowledge of data analysis (statistics). Experience with the MOCAP technique used (ART system) would be appreciated. The lab (LEAD) is equipped with several eye trackers (remote and mobile), a motion capture system, and various physiological recording devices (EDA, EKG, EEG, etc.).
• Fluency in written English (and possibly French).
• Genuine autonomy and availability to travel between Dijon and Toulouse.

The position is scheduled to start on 1 September 2019, for a period of 12 months. Salary: €2,100 net per month. The research is supervised by Prof. Thierry Baccino and Dr. Véronique Drai-Zerbib. The place of work is CNRS-LEAD, located at the University of Burgundy (Dijon).
To apply, please send a recent CV, a brief statement of research interests, and the names of two references to Thierry Baccino at Thierry.Baccino@u-bourgogne.fr and Véronique Drai-Zerbib at Veronique.Drai-Zerbib@u-bourgogne.fr.

Review of applications will continue until the position is filled.

Véronique Drai-Zerbib, Université de Bourgogne Franche-Comté, LEAD – CNRS UMR5022, Institut Marey, 64 rue de Sully, 21000 Dijon. Tel: +33 (0)3 80 39 57 32, E-mail: Veronique.Drai-Zerbib@u-bourgogne.fr

Author of the message
Thierry Baccino
E-mail
Thierry.Baccino@u-bourgogne.fr
Scientific discipline
Biomechanics - motion capture
Location and host institution
CNRS-LEAD / Dijon