A fitting analogy for the modern operating theatre is that of a multiverse formed by a cluster of smaller universes: those of the patient, the surgeon, the staff, the equipment, and so on. These universes exist in parallel, and although each can be considered in isolation, in reality they are interconnected. Therefore, if we wish to fully appreciate all aspects governing surgical performance, in and around the operating theatre, we need to adopt a holistic approach and consider all universes in parallel. Inspired by the dichotomy between classical and quantum physics, we postulate that events, or ripples, in one universe can be detected in other, parallel universes.

At the same time, the causes of events in the operating theatre multiverse can be traced back to the individual universes. Adopting this approach essentially implies unifying the human, physical, and digital entities present in the operating theatre under a single framework.

One way to achieve this unification is through the integration of four key elements:

  1. Holistic sensing of patient, staff, operating theatre environment and equipment, with emphasis on wearable and distributed human-centric sensing;
  2. AI for perception and cognition through data fusion from multiple sensors;
  3. Robotics and allied technologies leveraged by perceptual user interfaces for augmenting task performance;
  4. A multipoint-to-multipoint communication and device interoperability platform, to allow full integration of all above key elements.

This workshop will bring together researchers from diverse areas and will establish how these key elements can be fused more effectively to help create a just, safe, and optimised surgical environment, with a direct positive impact on patient outcomes.

Date: Wednesday 29th June 2022

Time: 09:00 – 17:00



Dan Hashimoto, Surgical Translation of Computer Vision: Integrating Data Streams for Augmented Decisions
Pietro Mascagni, Computer vision for intraoperative assistance: from proof of concept to clinical value
Marco Zenati, OpenICE: An Operating Theatre Integrated Clinical Environment
Gernot Kronreif, Surgical Data for Optimized Therapy
Riccardo Muradore, Trade-off between AI and Supervisory Control in the Next Generation of Semi-Autonomous Surgical Robots
Paul Stretton, Human Factors: Leaving our monochrome perspective of safe systems for a technicolour future
Dan Stoyanov, Building visual perception of the surgical site
Stefanie Speidel, Human-machine collaboration in surgery – bridging the gap between robotics and data science
Juan Margalo, Wearable technology for biomechanical analysis in minimally invasive surgery
Felipe Orihuela Espina, Ad-hoc data analysis strategies for surgical neuroergonomics: a historical perspective
Ravi Naik, Measuring cognitive workload in surgery
Fabio Cuzzolin, The SARAS surgical action detection challenges
Dan Leff, Clinical Neuroergonomics – From Expertise Development to Cognitive Burden Detection
James Kinross, The digital surgeon – the realities of digital surgery in an analogue world
George Mylonas, Multi-sensed AI environments for surgical task and role optimisation


Learning Outcomes

Currently, most AI research in the operating theatre takes a narrow view and considers data obtained from isolated sensing modalities. The proposed multiverse concept instead promotes a holistic approach. The workshop will offer a unique opportunity to examine the state of the art and unite under the same roof the different efforts towards human-centric sensing and AI.

Another major outcome of this workshop is the expected cross-fertilisation of ideas among a diverse spectrum of relevant stakeholders. To this effect, the invited speakers represent the entire spectrum of the parallel universes forming the operating theatre multiverse, including engineers, clinicians, scientists, and industry.

More specifically, the invited speakers will bring in knowledge and experience from the following domains:

– Computer Vision
– Perceptual Interfaces
– Robotics
– Integrated and Smart Theatres
– Wearables
– Multi-sensor fusion
– Human Factors


This workshop is worth 6 CPD points; please register to qualify for certification.