Semester Project Topics

Topic title | Supervisor | Type | Capacity | Supervisor's department
Efficient exploration of the body surface with tactile sensors in humanoid robots | Mgr. Matěj Hoffmann, Ph.D. | BM | 1/2 | 13133

Description
The goal of this project is to develop a simulation environment for the iCub humanoid robot with tactile sensors on large areas of its body and tune it for a self-touch scenario.
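
The actual project targets the iCub simulator with artificial-skin emulation; as a rough, hypothetical illustration of the core mechanism (detecting self-contact in a physics engine and treating it as a tactile event), the following PyBullet sketch uses a stock KUKA iiwa URDF from pybullet_data as a stand-in for iCub. The joint indices and targets are placeholders chosen only to fold the arm onto itself.

```python
import pybullet as p
import pybullet_data

p.connect(p.DIRECT)
p.setAdditionalSearchPath(pybullet_data.getDataPath())
p.setGravity(0, 0, -9.81)
# Stand-in arm for iCub; self-collision must be enabled to register self-touch
robot = p.loadURDF("kuka_iiwa/model.urdf", useFixedBase=True,
                   flags=p.URDF_USE_SELF_COLLISION)

# Fold the arm onto itself to provoke self-contact (indices/targets are placeholders)
for _ in range(1000):
    p.setJointMotorControl2(robot, 1, p.POSITION_CONTROL, targetPosition=2.0)
    p.setJointMotorControl2(robot, 3, p.POSITION_CONTROL, targetPosition=-2.0)
    p.stepSimulation()
    # Self-contacts stand in for tactile events from artificial skin
    for c in p.getContactPoints(bodyA=robot, bodyB=robot):
        link_a, link_b, pos_on_a = c[3], c[4], c[5]
        print(f"self-touch: link {link_a} vs link {link_b} at {pos_on_a}")
```

In the project itself, the same contact events would be mapped onto the layout of the iCub skin taxels rather than printed per link.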

Study program
EECS EEK LK EI KyR EEM EK SIT BIO OI OES KME BII IB

Literature
Gama, F.; Shcherban, M.; Rolf, M. & Hoffmann, M. (2020), Active exploration for body model learning through self-touch on a humanoid robot with artificial skin, in 'Joint IEEE 10th International Conference on Development and Learning and Epigenetic Robotics (ICDL-EpiRob)'.

Mannella, Francesco, et al. "Know your body through intrinsic goals." Frontiers in Neurorobotics 12 (2018): 30.

Yamada, Y., Kanazawa, H., Iwasaki, S., Tsukahara, Y., Iwata, O., Yamada, S., & Kuniyoshi, Y. (2016). An embodied brain model of the human foetus. Scientific Reports, 6, 27893.

Haptic exploration and categorization of objects using robotic grippers | Mgr. Matěj Hoffmann, Ph.D. | BM | 0/3 | 13133

Description
The goal of this project is to use different robotic arms and grippers (KUKA LBR iiwa with Barrett Hand, UR10 with QB Soft Hand or OnRobot RG6) to explore different objects and collect data from proprioceptive, tactile, and force feedback. Different clustering or classification algorithms will be employed on these data to differentiate between the objects, focusing in particular on properties that can only be extracted from haptic exploration (manipulating the objects), such as elasticity, surface properties, etc. In a second step, the choice of grasping actions that aid recognition will be studied. Finally, priors extracted from vision or other sources (e.g., linguistic descriptions from the Internet) can also be employed. This work is part of the newly starting European project IPALM (Interactive Perception-Action-Learning for Modelling Objects).
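
As a minimal sketch of the clustering step (point 3 of the requirements below), the snippet clusters synthetic per-grasp feature vectors with scikit-learn. The feature choice (final gripper aperture, peak squeeze force, a stiffness proxy) and all numbers are invented for illustration; in the project they would come from the real proprioceptive, tactile, and force logs.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Hypothetical per-grasp features: [final aperture (m), peak force (N), stiffness proxy (N/m)]
rng = np.random.default_rng(0)
soft  = rng.normal([0.03, 5.0,  200.0],  [0.005, 1.0, 50.0],  size=(20, 3))
rigid = rng.normal([0.05, 20.0, 5000.0], [0.005, 3.0, 500.0], size=(20, 3))

# Standardize so no single feature dominates the Euclidean distance
X = StandardScaler().fit_transform(np.vstack([soft, rigid]))
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
print(labels)
```

A supervised classifier would replace KMeans once labeled pilot data are available.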

Study program
EECS EEK LK EI KyR EEM EK SIT BIO OI OES KME BII IB

Requirements
1. Familiarization with robotic platforms.
2. Pilot data collection: grasping different objects with different grippers.
3. First clustering / categorization of objects from the collected data.
4. Optimizing actions (grasps) to improve categorization.
5. [Optional] Combining with priors from other sources (see the sketch after this list).
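
The optional final step can be read as Bayesian fusion: a categorical prior over object types from vision or a linguistic description is multiplied by a haptic measurement likelihood and renormalized. A minimal sketch, with the category names and all probabilities invented for illustration:

```python
import numpy as np

categories = ["sponge", "mug", "ball"]    # hypothetical object categories
prior = np.array([0.2, 0.5, 0.3])         # e.g., from vision / linguistic description
haptic_lik = np.array([0.7, 0.05, 0.25])  # P(measured stiffness | category), made up

# Bayes' rule: posterior proportional to prior times likelihood
posterior = prior * haptic_lik
posterior /= posterior.sum()
print(dict(zip(categories, posterior.round(3))))
```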

Literature
Bajcsy, R., Aloimonos, Y., & Tsotsos, J. K. (2018). Revisiting active perception. Autonomous Robots, 42(2), 177-196.
Bohg, J., Hausman, K., Sankaran, B., Brock, O., Kragic, D., Schaal, S., & Sukhatme, G. S. (2017). Interactive perception: Leveraging action in perception and perception in action. IEEE Transactions on Robotics, 33(6), 1273-1291.
Hoffmann, M., Stepanova, K. & Reinstein, M. (2014), 'The effect of motor action and different sensory modalities on terrain classification in a quadruped robot running with multiple gaits', Robotics and Autonomous Systems 62, 1790-1798.
Nikandrova, E., & Kyrki, V. (2015). Category-based task specific grasping. Robotics and Autonomous Systems, 70, 25-35.

Exploration of objects through robotic manipulation | Mgr. Matěj Hoffmann, Ph.D. | BM | 1/2 | 13133

Description
The goal of this project is to explore the properties of deformable objects through robot manipulation, focusing in particular on properties that can only be extracted from haptic exploration, such as elasticity. The project concentrates on simulating this interaction in physics-based simulators (e.g., MuJoCo).
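
As a minimal MuJoCo sketch of the kind of probing involved: the solref contact parameters of a geom make its contacts compliant, a crude stand-in for elasticity, and the resulting contact normal force can be read back after stepping the simulation. Truly deformable bodies would use MuJoCo's composite or flex objects; this illustrative example keeps to a rigid box with softened contact.

```python
import numpy as np
import mujoco

XML = """
<mujoco>
  <worldbody>
    <geom type="plane" size="1 1 0.1"/>
    <body pos="0 0 0.2">
      <freejoint/>
      <!-- solref makes the contact compliant: a crude stand-in for elasticity -->
      <geom type="box" size="0.05 0.05 0.05" mass="0.2" solref="0.05 0.6"/>
    </body>
  </worldbody>
</mujoco>
"""

model = mujoco.MjModel.from_xml_string(XML)
data = mujoco.MjData(model)

# Let the box settle onto the plane
for _ in range(500):
    mujoco.mj_step(model, data)

# Read back the per-contact force in the contact frame (index 0 = normal component)
force = np.zeros(6)
for i in range(data.ncon):
    mujoco.mj_contactForce(model, data, i, force)
    print("contact normal force [N]:", force[0])
```

Sweeping solref and recording the force/penetration response is one simple way to emulate objects of varying stiffness before moving to full deformable models.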

Study program
EECS EEK LK EI KyR EEM EK SIT BIO OI OES KME BII IB

Literature
Sanchez, J., Corrales, J. A., Bouzgarrou, B. C., & Mezouar, Y. (2018). Robotic manipulation and sensing of deformable objects in domestic and industrial applications: a survey. The International Journal of Robotics Research, 37(7), 688-716.

Li, Q., Kroemer, O., Su, Z., Veiga, F. F., Kaboli, M., & Ritter, H. J. (2020). A Review of Tactile Information: Perception and Action Through Touch. IEEE Transactions on Robotics.

Spiers, A. J., Liarokapis, M. V., Calli, B., & Dollar, A. M. (2016). Single-grasp object classification and feature extraction with simple robot hands and tactile sensors. IEEE Transactions on Haptics, 9(2), 207-220.

Object recognition using vision and touch with robotic hands | Mgr. Matěj Hoffmann, Ph.D. | BM | 1/3 | 13133

Description
The goal of this project is to use different robotic arms and grippers (KUKA LBR iiwa with Barrett Hand, UR10 with OnRobot RG6 or QB Soft Hand, Kinova Gen3 with Robotiq 2F-85) to aid visual perception, focusing in particular on object material properties (stiffness, surface roughness, etc.). For visual perception, state-of-the-art algorithms from our European partners (https://sites.google.com/view/ipalm/) will be employed to obtain estimates of object pose, shape, and material. Based on these priors, the goal is to develop an object exploration strategy to verify these hypotheses. The actions may involve: (1) manipulation (e.g., squeezing, pushing) and (2) visual exploration using a moving RGB-D camera (Intel RealSense D410 in the wrist of the Kinova Gen3). The grippers / hands have different feedback signals available; the Barrett Hand, for example, has 96 tactile sensors and 3 fingertip torque sensors.
Video illustration: Barrett Hand grasping a soft object: https://youtu.be/J6YXZgbDjBw
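
To give a feel for extracting a rough pose and shape prior from the RGB-D camera (point 2 of the requirements below), here is a hypothetical Open3D sketch: it segments away the dominant table plane and takes an oriented bounding box of the remaining points as a crude pose/extent estimate. The file name is a placeholder, and the project partners' actual pipeline is far more sophisticated.

```python
import open3d as o3d

# Placeholder capture, e.g., exported from the wrist-mounted RealSense
pcd = o3d.io.read_point_cloud("scene.pcd")

# Remove the dominant table plane with RANSAC, keep the object points
plane_model, inliers = pcd.segment_plane(distance_threshold=0.01,
                                         ransac_n=3, num_iterations=1000)
obj = pcd.select_by_index(inliers, invert=True)

# Rough pose/extent prior from an oriented bounding box
obb = obj.get_oriented_bounding_box()
print("center:", obb.center, "extent:", obb.extent)
```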

Study program
EECS EEK LK EI KyR EEM EK SIT BIO OI OES KME BII IB

Requirements
(a subset of the following, depending on the project type)
1. Familiarization with robotic platforms.
2. Getting object pose, shape, and material priors from RGB-D camera (code from project partners).
3. Development of action repertoire for manipulation actions - squeezing, poking, pushing, etc.
4. Using haptic (tactile and proprioceptive) sensory feedback to learn about object properties and verify priors from vision.
5. Action selection algorithm to pick the actions that can reduce uncertainty about certain object properties the most (see the sketch after this list).
6. If time permits: active visual exploration using the wrist camera of the Kinova Gen3.
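
A minimal sketch of the action-selection idea in point 5: maintain a categorical belief over object hypotheses, model each candidate action by an observation likelihood, and greedily pick the action with the lowest expected posterior entropy (equivalently, the highest expected information gain). All probabilities below are illustrative, not measured.

```python
import numpy as np

def entropy(p):
    p = p[p > 0]
    return -(p * np.log2(p)).sum()

belief = np.array([0.4, 0.4, 0.2])  # P(hypothesis), e.g., a prior from vision

# P(observation | hypothesis) per candidate action; rows = hypotheses, cols = outcomes.
# The two actions and their likelihoods are made up for illustration.
actions = {
    "squeeze": np.array([[0.8, 0.2], [0.2, 0.8], [0.5, 0.5]]),
    "push":    np.array([[0.5, 0.5], [0.5, 0.5], [0.9, 0.1]]),
}

def expected_posterior_entropy(belief, lik):
    h = 0.0
    for o in range(lik.shape[1]):
        p_o = belief @ lik[:, o]             # marginal probability of outcome o
        if p_o > 0:
            post = belief * lik[:, o] / p_o  # Bayes update given outcome o
            h += p_o * entropy(post)
    return h

best = min(actions, key=lambda a: expected_posterior_entropy(belief, actions[a]))
print("most informative action:", best)
```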

Literature
Bajcsy, R., Aloimonos, Y., & Tsotsos, J. K. (2018). Revisiting active perception. Autonomous Robots, 42(2), 177-196.

Bohg, J., Hausman, K., Sankaran, B., Brock, O., Kragic, D., Schaal, S., & Sukhatme, G. S. (2017). Interactive perception: Leveraging action in perception and perception in action. IEEE Transactions on Robotics, 33(6), 1273-1291.

Davis, A., Bouman, K. L., Chen, J. G., Rubinstein, M., Durand, F., & Freeman, W. T. (2015). Visual vibrometry: Estimating material properties from small motion in video. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (pp. 5335-5343).

Nikandrova, E., Laaksonen, J., & Kyrki, V. (2014). Towards informative sensor-based grasp planning. Robotics and Autonomous Systems, 62(3), 340-354.

Pumarola, A., Agudo, A., Porzi, L., Sanfeliu, A., Lepetit, V., & Moreno-Noguer, F. (2018). Geometry-aware network for non-rigid shape prediction from a single view. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (pp. 4681-4690).
