AAP R&S 2019/MOVCAP (Motion Capture on the Move) (Q3680013)

From EU Knowledge Graph

Revision as of 16:52, 18 November 2021

Project Q3680013 in France
Language: English
Label: AAP R&S 2019/MOVCAP (Motion Capture on the Move)
Description: Project Q3680013 in France
Also known as: —

    Statements

    EU contribution: 489,343.89 Euro
    Total budget: 724,635.80 Euro
    Co-financing rate: 67.53 percent
    Start date: 1 November 2019
    End date: 31 October 2022
    Beneficiary: Université de Montpellier
    Le projet MOVCAP vise à développer un outil de capture de la cinématique du mouvement humain (MOCAP), capable de suivre sa cible au cours de ses déplacements – le projet propose donc de libérer la MOCAP de ces contraintes physiques en la rendant portable par un drone ce qui implique que le capteur de mouvement soit, lui-même, en mouvement. Les systèmes de capture du mouvement reposent actuellement sur des caméras fixes dont la multiplicité des angles de vue permet de reconstruire en 3 dimensions la position de marqueurs. La combinaison d'une mesure optique de grande précision avec un système mobile asservi à la position de la cible permettra l'acquisition inédite de données cinématiques en conditions écologiques, c'est-à-dire libérée des contraintes physiques du laboratoire. Cet outil robotique permettra de repousser les limites actuelles de la capture du mouvement en fournissant une mesure continue, fiable et reproductible de la cinématique du mouvement humain. Le projet associe : - trois partenaires académiques : l’Université de Montpellier, l’IMT - Mines d’Alès et le CNRS - à travers 3 laboratoires : • EuroMov, • LIRMM - Laboratoire d'Informatique, de Robotique et de Microélectronique de Montpellier et • LGI2P Laboratoire de Génie Informatique et d'Ingénierie de Production et - un partenaire privé : l’entreprise GAMBI-M, jeune entreprise innovante dans le secteur de l’ingénierie 3D pour les industriels (French)
    The MOVCAP project aims to develop a tool for capturing human movement kinematics (MOCAP) that can track its target as the target moves. The project therefore proposes to free MOCAP from the physical constraints of the laboratory by making it portable by drone, which implies that the motion sensor is itself in motion. Motion capture systems currently rely on fixed cameras whose multiple viewing angles make it possible to reconstruct the positions of markers in three dimensions. Combining high-precision optical measurement with a mobile system servo-controlled to the target's position will allow the unprecedented acquisition of kinematic data under ecological conditions, i.e. free from the physical constraints of the laboratory. This robotic tool will push back the current limits of motion capture by providing a continuous, reliable and reproducible measurement of human movement kinematics. The project brings together three academic partners (the Université de Montpellier, IMT Mines Alès and the CNRS) through three laboratories (EuroMov; LIRMM, the Montpellier Laboratory of Computer Science, Robotics and Microelectronics; and LGI2P, the Laboratory of Computer Engineering and Production Engineering) and one private partner, GAMBI-M, a young innovative company in 3D engineering for industry. (English)
    18 November 2021
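The fixed-camera reconstruction the summary describes is ordinary multi-view triangulation: each camera observes a marker along a viewing ray, and the marker's 3-D position is the point closest to all rays. A minimal two-camera sketch of that principle, in pure Python with hypothetical helper names (this is an illustration of the general technique, not MOVCAP project code):

```python
# Multi-view principle behind marker-based MOCAP: two cameras each see
# a marker along a viewing ray; the 3-D position is recovered as the
# midpoint of the shortest segment between the two rays.

def sub(a, b): return [a[i] - b[i] for i in range(3)]
def add(a, b): return [a[i] + b[i] for i in range(3)]
def dot(a, b): return sum(a[i] * b[i] for i in range(3))
def scale(a, s): return [a[i] * s for i in range(3)]

def normalize(v):
    n = dot(v, v) ** 0.5
    return [x / n for x in v]

def triangulate(o1, d1, o2, d2):
    """Midpoint of the shortest segment between two viewing rays.

    o1, o2: camera centres; d1, d2: unit viewing directions.
    """
    w = sub(o1, o2)
    a, b, c = dot(d1, d1), dot(d1, d2), dot(d2, d2)
    d, e = dot(d1, w), dot(d2, w)
    denom = a * c - b * b            # ~0 when the rays are parallel
    s = (b * e - c * d) / denom      # parameter along ray 1
    t = (a * e - b * d) / denom      # parameter along ray 2
    p1 = add(o1, scale(d1, s))       # closest point on ray 1
    p2 = add(o2, scale(d2, t))       # closest point on ray 2
    return scale(add(p1, p2), 0.5)

# A marker at (1, 2, 3) seen from two camera centres:
marker = [1.0, 2.0, 3.0]
cam1, cam2 = [0.0, 0.0, 0.0], [10.0, 0.0, 0.0]
ray1 = normalize(sub(marker, cam1))
ray2 = normalize(sub(marker, cam2))
print(triangulate(cam1, ray1, cam2, ray2))  # ≈ [1.0, 2.0, 3.0] (up to float error)
```

Real systems solve the same geometry with calibrated projection matrices and more than two cameras; the mobile (drone-borne) variant the project proposes additionally has to estimate the sensor's own pose at every instant.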

    Identifiers

    LR0021769