The Hamlyn Symposium on Medical Robotics 2022: 26th-29th June 2022

Image Registration between Different Views in Laparoscopic Image

In cancer surgery, reliable intraoperative visualization remains a technological challenge. Recently, a novel tethered laparoscopic gamma detector was introduced to localize tracer activity and thereby help identify lymph nodes.

However, neither the location of the probe ('SENSEI®') nor the tissue surface it points to is clearly indicated. To better track the probe's sensing area, a miniaturized camera and a structured-light source will be integrated into the probe. The aim of this study is therefore to propose a fast method for image registration between the laparoscopic view and the attached miniaturized camera, and to locate the probe's sensing area within the laparoscopic view. We designed a structure to connect the hardware: the camera, the structured-light source, and the probe. A self-supervised convolutional network (AMIRNet) was designed to learn discriminative image features and perform registration directly. The structured light was then used to determine the probe's sensing area in the laparoscopic image.
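The final step of such a pipeline is a coordinate mapping: once the registration yields a transform (e.g. a planar homography) from the miniaturized camera's image plane to the laparoscopic view, the structured-light spot detected in the miniature camera image can be projected into the laparoscopic frame. A minimal sketch of that mapping step, assuming a homography model with purely illustrative values (not from the paper):

```python
def warp_point(H, pt):
    """Map a 2D pixel through a 3x3 homography using homogeneous coordinates."""
    x, y = pt
    xh = H[0][0] * x + H[0][1] * y + H[0][2]
    yh = H[1][0] * x + H[1][1] * y + H[1][2]
    w = H[2][0] * x + H[2][1] * y + H[2][2]
    return (xh / w, yh / w)

# Illustrative homography from the miniature camera frame to the
# laparoscopic frame (values are made up for this example; in practice
# the registration network would supply the transform).
H = [[1.2,    0.05,  30.0],
     [0.02,   1.1,  -12.0],
     [0.0001, 0.0,    1.0]]

# Centre of the sensing area, as detected via the structured-light
# spot in the miniature camera image (hypothetical pixel coordinates).
spot_cam = (320.0, 240.0)
spot_lap = warp_point(H, spot_cam)  # position in the laparoscopic view
```

How the registration itself is learned (the AMIRNet architecture and its self-supervision signal) is not detailed in the abstract; the sketch only illustrates how a recovered transform would carry the sensing area between the two views.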

