This is the Hamlyn Special Session of an online-only series for sharing within the common platform community. The overall theme is experimental minimally-invasive surgical robotics research, typically, though not limited to, work on the Raven and dVRK platforms.
This session includes presentations that focus on two primary themes for the use of computer assistance in robotic surgery: (1) to improve surgeon task performance, and (2) to provide autonomous capabilities.
One method to improve surgeon performance is through better training platforms, such as the presented sensorized physical simulator, which provides feedback during training (in this case, force feedback) that is not available during surgery. Another presented approach is to use machine learning to estimate the missing feedback (e.g., force) from other sensor data and to study how this estimated feedback affects surgeon performance.
Automation of surgical tasks relies on perception of the surgical environment, which can be enabled by the presented method for 3D reconstruction from multiple viewpoints. It also relies on the ability to manipulate soft tissue, as discussed in the context of deformable object manipulation. Furthermore, it may be necessary to teach surgical tasks to the robot, as in the presented method for learning from demonstration for autonomous bimanual suturing. Finally, automation of assistive tasks, such as the presented system for applying suction, can reduce the surgeon's cognitive load.
By completing this workshop, participants will gain knowledge about open platforms used for medical robotics research, as well as the following current research topics:
- Perception of the surgical environment via 3D reconstruction from multiple viewpoints
- Methods for estimating interaction forces in minimally-invasive robotic surgery, and haptic performance using these methods
- Learning from demonstration for autonomous bimanual suturing
- Improving surgeon training through use of a realistic sensorized physical simulator
- An autonomous robotic assistant for suction during robotic surgery
- A control framework for autonomous deformable object manipulation using non-fixed contact
This session will be 1.5 hours in length, with six presentations. Each presentation will be followed by a brief question-and-answer period.
All times are in BST.
| Time | Title | Presenter(s) | Affiliation |
|---|---|---|---|
| 15:00 | Non-fixed Contact Manipulation Control Framework for Deformable Objects with Active Contact Adjustment | Jing Huang | Chinese Univ. of Hong Kong |
| 15:15 | Multicamera 3D Viewpoint Adjustment for Robotic Surgery via Deep Reinforcement Learning | Heidi Zhang, Melody Su | Mt. Holyoke College |
| 15:30 | Characterization of Haptic Feedback from Multimodal Neural Network-based Force Estimates during Teleoperation | Zonghe Chua | Stanford University |
| 15:45 | Toward Autonomous Suturing Using MOPS: A Modular and Open Platform for Surgical Robotics Research | Kim Lindberg Schwaner | Univ. of Southern Denmark |
| 16:00 | Controlling Cognitive Demands With Semi-Autonomous Suction Framework for Robotic-Assisted Surgery | Juan Antonio Barragan | Purdue University |
| 16:15 | A Sensorized Physical Simulator for Training in Robot-Assisted Lung Lobectomy | Dario Galeazzi | Politecnico di Milano |
Best Presentation Award
Intuitive Surgical is sponsoring a $500 award for the best presentation.
This workshop is worth 1.5 CPD points; please register to qualify for certification.