ImmersiveTouch review

ImmersiveTouch is a surgical simulation platform that leverages advanced Virtual and Augmented Reality technologies to help surgeons train their skills and plan complex surgeries.

Video overview:


This technology was born with the aim of training surgeons, but the wide range of possibilities it offers makes it suitable for many applications in the surgical field:

  • Involving the patient: explaining what the surgery is and how the surgeons will operate makes the patient aware of the chosen approach and reduces anxiety. Furthermore, it educates patients, giving them the possibility to participate in making the final decision.
  • Planning the most critical and delicate surgeries in every detail, to find the best path and methodologies that minimize the side effects of the intervention.
  • Letting surgeons practice as many times as they want, without risk, with the possibility to try new tools and strategies in a realistic way. As a result, they will be more confident and assured when working with real patients.

There are two features that make this technology so effective. First, the possibility to use models from real patients, converting MRI (Magnetic Resonance Imaging) or CT (Computed Tomography) images into Virtual Reality objects. In this way, the models used in the system refer to real patients, giving doctors the possibility to study solutions that suit each individual situation.
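The conversion step can be sketched in miniature: an isosurface extractor walks the scanned density volume and keeps the voxels lying on the boundary between tissue at the chosen density and everything below it. The snippet below is a hypothetical, heavily simplified stand-in for what dedicated tools (such as marching-cubes extraction in VTK) actually do; all names and values are illustrative:

```python
# Simplified sketch: turning a CT-like density volume into surface voxels
# by thresholding -- a stand-in for real isosurface extraction.

def extract_surface_voxels(volume, iso_level):
    """Return coordinates of voxels at or above iso_level that touch
    at least one voxel below it (i.e. they lie on the isosurface)."""
    nz, ny, nx = len(volume), len(volume[0]), len(volume[0][0])
    surface = []
    for z in range(nz):
        for y in range(ny):
            for x in range(nx):
                if volume[z][y][x] < iso_level:
                    continue
                # Check the six face neighbours for a value below the level
                # (or the edge of the scan).
                for dz, dy, dx in ((1, 0, 0), (-1, 0, 0), (0, 1, 0),
                                   (0, -1, 0), (0, 0, 1), (0, 0, -1)):
                    zz, yy, xx = z + dz, y + dy, x + dx
                    if not (0 <= zz < nz and 0 <= yy < ny and 0 <= xx < nx) \
                            or volume[zz][yy][xx] < iso_level:
                        surface.append((z, y, x))
                        break
    return surface

# A tiny 3x3x3 "scan": one dense voxel surrounded by empty space.
volume = [[[0] * 3 for _ in range(3)] for _ in range(3)]
volume[1][1][1] = 100
print(extract_surface_voxels(volume, iso_level=50))  # [(1, 1, 1)]
```

A real pipeline would emit triangles rather than voxel coordinates, but the core idea is the same: find where the scanned density crosses the chosen level.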

Figure 1. Example of object virtualization

Each patient is different from the others for many reasons, and the possibility to customize the surgical approach can be very important to minimize risk. Second, the user can interact with the system through haptic controllers, which gives two big advantages:

  • Simulation of surgical tools in dimension and usage, allowing the user to hold them as if they really were surgical devices.
  • Realistic feedback depending on the tissue the user is interacting with. We know how critical hands are for surgeons and how important the information retrieved through them is. For this reason, recreating realistic tactile sensations is as important as recreating a realistic visual experience.

In 2015 ImmersiveTouch obtained FDA (Food and Drug Administration) clearance for the developed system, meaning that the technology has been recognized as an improvement to the health of patients.

Real case usage

A few months ago, this technology was used in New Delhi to plan the separation surgery of conjoined twins, a very complex operation that has been attempted only a few times in the last 50 years, with a success rate below 25%.
A team of roughly forty doctors from different medical fields studied the case using ImmersiveTouch, and the surgery was simulated several times using an HTC Vive to plan the most effective surgical path. The first step of the operation was successfully completed, and the doctor leading the team underlined the importance of practicing before such a delicate operation, expressing gratitude for the possibility to use this technology.


The project was born in the Electronic Visualization Laboratory (EVL) at the University of Illinois at Chicago in 2005 as an integrated system, including both hardware and software, based on augmented reality. The system now also supports virtual reality through the HTC Vive or Oculus Rift, providing a simulation of an operating room environment.


The hardware of the original prototype consists of three main components: the display, the hand tracking system, and the head tracking system.

An augmented reality display was chosen because it is the best suited to haptic applications: the user’s hands are behind the screen and are incorporated into the virtual image without interfering with it. This is possible because the image on the high-resolution monitor passes through a half-silvered mirror and is projected onto a virtual plane that lies between the user and the haptic controllers.
With traditional screens, where the hands of the user and the hand tracking system would cover the image, it would be difficult for the user to interact with the virtual object.

Figure 2. Model of the system with tracking transmitter highlighted

To make the system accessible and user-friendly, the haptic device rests on a desk, and a wrist and elbow support is provided to guarantee a comfortable posture for the user. For the same reason, the virtual projection plane should be at 45° to the table. Given that, the display and the mirror must be positioned accordingly: the mirror lies on the bisector of the angle between the projection plane and the display, and since the user should not stand between the mirror and the screen, the best arrangement is the one depicted in Figure 2.


The head tracking system is important to orient the virtual object according to the movements and tilting of the user’s head. In this way, the two images building the 3D stereo visualization are always aligned with the user’s eyes. The system used is pciBIRD, an electromagnetic tracking system consisting of a transmitter and several sensors. The transmitter is positioned on the desk: this is the ideal position to avoid interference with the display and signal loss from the receivers, which would decrease accuracy. The receivers that track head movement are mounted on the glasses.
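The stereo-alignment idea can be illustrated with a small sketch: from the tracked head position and orientation, the two virtual camera positions are offset by half the interpupillary distance (IPD) along the head's right axis, so the stereo pair stays aligned as the head turns. The function name, axis convention, and default IPD below are illustrative assumptions, not part of the actual system:

```python
import math

# Sketch: placing the two virtual cameras of a stereo pair from a
# tracked head pose. Each eye sits half an IPD away from the head
# centre along the head's right axis.

def eye_positions(head_pos, yaw_deg, ipd=0.064):
    """Return (left_eye, right_eye) positions for a head at head_pos
    (x, y, z) rotated yaw_deg about the vertical axis."""
    hx, hy, hz = head_pos
    yaw = math.radians(yaw_deg)
    # Right axis of the head in the horizontal plane for this yaw.
    rx, rz = math.cos(yaw), -math.sin(yaw)
    half = ipd / 2.0
    left = (hx - rx * half, hy, hz - rz * half)
    right = (hx + rx * half, hy, hz + rz * half)
    return left, right

left, right = eye_positions((0.0, 1.6, 0.0), yaw_deg=0.0)
print(left, right)  # eyes 64 mm apart, level with the head centre
```

In the real system the tracked orientation would be a full rotation (including pitch and roll), but the principle is the same: the rendered viewpoints follow the head so the two images never drift out of alignment with the eyes.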

The hand tracking system is composed of two parts, because each hand is tracked separately. One hand is tracked through the stylus of the haptic device, which guarantees precision and tactile feedback; the device used in the system is the PHANTOM, produced by SensAble. In the other hand, the user holds a device used to move the virtual object (a SpaceGrips grip, holding pciBIRD receivers for tracking and four buttons to interact with the scene).

Figure 3. PHANTOM haptic device



On the software side, the system uses only open source libraries, which have been integrated into the ImmersiveTouch API. They are:

  • The Visualization ToolKit (VTK): a cross-platform C++ library used to manage and process volumetric data; in this case it processes data coming from MRI and CT scans, generating isosurfaces of homogeneous density.
  • Coin: a high-level 3D graphics library used to render real-time graphics.
  • The General Haptic Open Software Toolkit (GHOST): a cross-platform library used to associate a material with each object by modifying a set of coefficients. Given these parameters, the library computes the force the haptic controller should oppose to the user’s movement, recreating the feeling of interacting with human tissue.
  • pciBIRD API: a library to manage tracking with the pciBIRD system.
  • Fast Light ToolKit (FLTK): a library used to create the graphical user interface.
  • Open Audio Library (OpenAL): a library used to manage the audio of the system.
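The material-to-force step can be sketched with the classic spring ("penalty") model of haptic rendering: the feedback force grows with how deep the stylus has penetrated the virtual tissue, scaled by a per-tissue stiffness coefficient and clamped to the device's maximum output. This is a minimal illustrative sketch, not GHOST's actual API; all names and values are made up:

```python
# Sketch: deriving the opposing force from material coefficients using
# a spring (penalty) law: F = stiffness * penetration depth, clamped to
# what the haptic device can physically output.

TISSUE_STIFFNESS = {   # N/m -- illustrative values only
    "skin":   300.0,
    "muscle": 120.0,
    "bone":   900.0,
}

def opposing_force(tissue, penetration_m, max_force=6.0):
    """Spring-law feedback force (N), clamped to the device limit."""
    force = TISSUE_STIFFNESS[tissue] * penetration_m
    return min(force, max_force)

print(opposing_force("bone", 0.002))   # about 1.8 N for a 2 mm dip
print(opposing_force("bone", 0.020))   # clamped to the 6.0 N limit
```

Stiffer materials (bone) push back harder for the same penetration than softer ones (muscle), which is exactly the difference the surgeon's hand is supposed to feel.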


The modularity of the system allows easy integration of new features and upgrades of the technologies used, such as the display and the controllers. Considering how fast VR/AR technologies are evolving, this is essential to keep the system up to date.
Obviously, the precision and the feedback given by the system must be very accurate, and nothing should be neglected when building and upgrading the system, considering the delicate application of this technology.

A doctor trained in an inaccurate environment would expect an erroneous response when operating on real patients, exposing the latter to several risks. Bugs in the software or undesired hardware behavior can be critical for the health of patients. Given that, the use of open source libraries guarantees the possibility to react quickly to flaws.

Despite this, such a technology is a great leap for surgery and will probably be considered in the future a milestone in the technological enhancement of medicine, given the wide range of opportunities it offers to improve the health of patients.




Slides here

