Project Description
How can we model building data so that robots can query the information they need to navigate a building autonomously? How can we keep the data up to date with respect to changes in structural elements (e.g., walls, columns) as well as dynamic elements (e.g., doors that can be closed or open)? How can multiple robots with different sensing modalities contribute to a safe and trustworthy update of building digital twin data? These are just some of the questions that arise from research at the intersection of building digital twins and robotics.
Our group has started to investigate some of these questions. In particular, we have shown that graph-based environmental descriptions can be effectively derived from data in Building Information Models (BIM) [1-3]. We have also demonstrated that world models can be automatically generated with data representations tailored to what a particular localization [2] or navigation algorithm [3] expects.
In this project we want to further develop and validate innovative methods for generating (semantic) world models from BIM/digital twins that are suitable for robot navigation and localization algorithms.
In addition, we pose the question of how to dynamically maintain a correspondence between reality, as perceived by multiple robots through different sensing modalities (e.g., 2D and 3D lidar), and the live data stored in the digital twin in the form of a graph.
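To give a flavor of what a graph-based world model with live state updates can look like, here is a minimal, hypothetical sketch. It is not the project's actual data model: all class names, room names, and door identifiers are invented for illustration. Rooms are nodes, doors are edges carrying an open/closed state that robot observations can update, and a navigation query asks which rooms are currently reachable.

```python
from dataclasses import dataclass, field

# Hypothetical minimal semantic world model (illustration only):
# rooms are graph nodes, doors are edges whose open/closed state
# can be updated live from robot observations.

@dataclass
class Door:
    id: str
    rooms: tuple          # the two rooms this door connects
    is_open: bool = True

@dataclass
class WorldModel:
    rooms: set = field(default_factory=set)
    doors: dict = field(default_factory=dict)   # door id -> Door

    def add_door(self, door: Door) -> None:
        self.rooms.update(door.rooms)
        self.doors[door.id] = door

    def observe(self, door_id: str, is_open: bool) -> None:
        """A robot reports the perceived state of a door."""
        self.doors[door_id].is_open = is_open

    def traversable_neighbors(self, room: str) -> set:
        """Rooms reachable from `room` through currently open doors."""
        return {r for d in self.doors.values() if d.is_open and room in d.rooms
                for r in d.rooms if r != room}

wm = WorldModel()
wm.add_door(Door("d1", ("corridor", "lab")))
wm.add_door(Door("d2", ("corridor", "office"), is_open=False))
print(wm.traversable_neighbors("corridor"))  # {'lab'}
wm.observe("d2", True)                       # a robot perceives d2 as open
print(wm.traversable_neighbors("corridor"))  # {'lab', 'office'}
```

The point of the sketch is the separation of concerns the project targets: the structural layer (rooms, doors) comes from the BIM, while the dynamic layer (door states) is maintained from robot perception at run time.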
For this position, we are looking for a candidate who is fascinated by interdisciplinary research (i.e., robotics and digital construction) and can communicate effectively with an interdisciplinary supervision team. We also value an independent and analytical mindset. This project is in collaboration with TU Munich, and the candidate is expected to spend a cumulative period of 6 months at TU Munich.
The researcher will be part of the Robotics Section of TU/e, which possesses cutting-edge robotics lab facilities, including multiple TurtleBots, ROSbots, Clearpath robots and state-of-the-art localization systems. The Section comprises a team of 13 faculty members, 26 PhD students/postdocs and 11 support staff. Digital twins of several buildings on the TU/e campus are also available.
References:
[1] Pauwels, P., de Koning, R., Hendrikx, B., & Torta, E. (2023). Live semantic data from building digital twins for robot navigation: Overview of data transfer methods. Advanced Engineering Informatics, 56, 101959.
[2] Hendrikx, R. W. M., Pauwels, P., Torta, E., Bruyninckx, H. P., & van de Molengraft, M. J. G. (2021, May). Connecting semantic building information models and robotics: An application to 2D LiDAR-based localization. In 2021 IEEE International Conference on Robotics and Automation (ICRA) (pp. 11654-11660). IEEE.
[3] de Vos, K., Torta, E., Bruyninckx, H., Martinez, C. A., & van de Molengraft, M. J. G. (2023). Automatic Configuration of Multi-Agent Model Predictive Controllers based on Semantic Graph World Models. arXiv preprint arXiv:2311.01180.
The Robotics Section (RBT), part of the department of Mechanical Engineering (ME), has an internationally recognized reputation in robotics, including perception, control, and knowledge representation and reasoning. RBT targets application areas in robotics for care and cure, construction, logistics, services, agrifood, manufacturing and entertainment. The group has a track record in European projects: it coordinated the FP7 ROBOEARTH project and has participated in the H2020 projects EUREYECASE, ROPOD, AUTOPILOT and SAFE-UP. More recently, it is a partner in the Horizon Europe projects AI Matters and MUSIT. RBT also has a large portfolio of nationally funded projects (INTEREG, OPZUID, FAST, TOWR, AMBER, HER+).
Since 2005, RBT has been the main contributor to the TU/e RoboCup team, Tech United, six-time world champion ('12, '14, '16, '18, '19, '23) in the Middle Size League and 2019 world champion in the @Home Domestic Standard Platform League with the TOYOTA Human Support Robot. RBT is led by Prof. René van de Molengraft.