iPad Mobile Surgical Console

Last updated 05/16/2011


The main goal of this project is to develop a simple yet resourceful application for the Apple iPad that allows remote configuration of surgical robot components. When fully implemented, the application will serve as a mobile, centralized, ergonomically designed tablet console with which the surgical team can dynamically change the robot's settings. The targeted system for testing this device is the EyeRobot.

Student Members: Hanlin Wan, Jonathan Satria

Mentors: Balazs Vagvolgyi

Other Collaborators: Russell Taylor

Background, Specific Aims, and Significance

An operating room that utilizes a robotic surgical assistant often requires several computer workstations, each controlling a different device or aspect of the robot-human interface. This decentralized approach places both ergonomic and efficiency constraints on the surgeon and his or her team. Practically speaking, individuals must walk to the correct component workstation, which is often placed several meters away from the machine if not in another room, and enter the correct settings. A central command module could dramatically improve on these disadvantages. The current input method of keyboard and mouse also has drawbacks: besides being difficult to sterilize and clean, it is poorly suited to the fast-paced operating room, especially now that touchscreens have become widely available.

Through the iPad, our application will provide an easy-to-use interface for controlling several component modules. As a portable device, it will also give several members of the surgical team easy access to the robot components.

Our specific aims are:

  1. Compile the cisst library onto the iOS platform.
  2. Design a GUI system on the iPad.
  3. Control various components using the iPad through the ICE interface.
  4. Write full documentation so others can extend the controls to other devices and components.
  5. Revise GUI layout based on user feedback (time permitting).
  6. Clinical testing of system in a mock OR (time permitting).


Deliverables

  • Minimum: (no longer applicable)
    1. Use iPad as a dummy console to VNC into a computer.
    2. Create a GUI system to control the multiple components through the Scenario Manager.
  • Expected:
    1. Compile and build the cisst library on the iPad using CMake. (completed)
    2. Build a GUI application for the iPad to control the various components. (completed)
    3. Perform clinical tests of the iPad control system in a mock OR setting. (completed)
    4. Perform revisions to the GUI based on user feedback. (completed)
    5. Detailed tutorial for iOS-cisst interface. (completed)
  • Maximum:
    1. Live video display on iPad (partially complete; see Milestone 4 for details)


Milestones

  1. Core Components:
    • Install the core components – CMake, cisst, ICE – onto iOS (completed)
    • Documentation for the interface between these (completed)
    • Status: Milestone 1 completed.
  2. GUI Component:
    • Become familiar with iOS GUI development (completed)
    • Build GUI to control the various components (completed)
    • Code the interfaces between the GUI and the components (completed)
    • Documentation for GUI design and interfaces (completed)
    • Status: Milestone 2 completed.
    • Expected Date: 5/19/11
  3. Testing/Revisions:
    • Test device with EyeRobot (completed)
    • Revise GUI based on user feedback (completed)
    • Status: Milestone 3 completed.
  4. Extended Features:
    • Implement new feature: video streaming (completed)
    • Status: Milestone 4 conditionally completed.
    • We have working code to display BMP images. Balazs still needs to write the code that converts the video feed into BMP images, which will be done after the end of the project.
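The remaining video step above amounts to packing each raw camera frame into a BMP file that the existing display code can show. A stdlib-only C++ sketch of such a frame-to-BMP converter (not the project's actual code; the function name and layout assumptions are illustrative):

```cpp
#include <cstdint>
#include <cstdio>
#include <vector>

// Minimal 24-bit uncompressed BMP writer: a sketch of how a raw RGB video
// frame could be packed into a BMP file. `rgb` holds width*height*3 bytes,
// row-major, top row first, in R,G,B order.
bool writeBMP(const char *path, const std::vector<uint8_t> &rgb,
              int width, int height)
{
    if (rgb.size() < static_cast<size_t>(width) * height * 3) return false;

    const int rowSize = (3 * width + 3) & ~3;   // each row padded to 4 bytes
    const uint32_t dataSize = static_cast<uint32_t>(rowSize) * height;
    const uint32_t fileSize = 54 + dataSize;    // 14 + 40 bytes of headers

    uint8_t h[54] = {0};
    auto put32 = [&h](int off, uint32_t v) {    // little-endian 32-bit field
        h[off]     = v & 0xFF;         h[off + 1] = (v >> 8) & 0xFF;
        h[off + 2] = (v >> 16) & 0xFF; h[off + 3] = (v >> 24) & 0xFF;
    };
    h[0] = 'B'; h[1] = 'M';
    put32(2, fileSize);
    put32(10, 54);                              // offset of pixel data
    put32(14, 40);                              // BITMAPINFOHEADER size
    put32(18, static_cast<uint32_t>(width));
    put32(22, static_cast<uint32_t>(height));   // positive => bottom-up rows
    h[26] = 1;                                  // color planes
    h[28] = 24;                                 // bits per pixel
    put32(34, dataSize);

    FILE *f = std::fopen(path, "wb");
    if (!f) return false;
    std::fwrite(h, 1, 54, f);

    std::vector<uint8_t> row(rowSize, 0);
    for (int y = height - 1; y >= 0; --y) {     // BMP stores bottom row first
        for (int x = 0; x < width; ++x) {
            const uint8_t *p = &rgb[(y * width + x) * 3];
            row[x * 3 + 0] = p[2];              // BMP pixels are B,G,R order
            row[x * 3 + 1] = p[1];
            row[x * 3 + 2] = p[0];
        }
        std::fwrite(row.data(), 1, rowSize, f);
    }
    return std::fclose(f) == 0;
}
```

A converter built on this idea would simply call the writer (or an in-memory variant of it) once per incoming frame before handing the result to the display code.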


Technical Approach

  • Cisst Library: This library is the backbone of the project. All communication to and from the various components uses this library.
  • CMake: CMake allows the compilation of the source code on any platform. While it doesn't officially support iOS, online sources have stated that this is possible, albeit difficult.
  • iOS: Development of the iPad application relies on the iOS framework. Apple's iOS Development Kit will be used for the design and implementation of the GUI.
  • ICE: The Internet Communications Engine manages the connections between multiple computers. This simplifies the programming required to get computers to communicate with each other.
  • Scenario Manager: This tool controls the interactions between various components. On startup, it finds all connected components and forms links between them.
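Cross-compiling with CMake for a platform it does not officially support is typically done through a toolchain file passed via `-DCMAKE_TOOLCHAIN_FILE`. A hypothetical sketch for the iOS build of the time (paths, SDK version, and architecture are illustrative and must match the installed Developer tools):

```cmake
# Illustrative iOS cross-compilation toolchain file for building cisst.
set(CMAKE_SYSTEM_NAME Darwin)
set(CMAKE_SYSTEM_PROCESSOR arm)

# Point CMake at the iOS SDK and cross-compilers (example paths).
set(CMAKE_OSX_SYSROOT
    "/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS4.3.sdk")
set(CMAKE_OSX_ARCHITECTURES "armv7")
set(CMAKE_C_COMPILER
    "/Developer/Platforms/iPhoneOS.platform/Developer/usr/bin/gcc")
set(CMAKE_CXX_COMPILER
    "/Developer/Platforms/iPhoneOS.platform/Developer/usr/bin/g++")

# Search for libraries and headers only inside the SDK, never on the host.
set(CMAKE_FIND_ROOT_PATH "${CMAKE_OSX_SYSROOT}")
set(CMAKE_FIND_ROOT_PATH_MODE_PROGRAM NEVER)
set(CMAKE_FIND_ROOT_PATH_MODE_LIBRARY ONLY)
set(CMAKE_FIND_ROOT_PATH_MODE_INCLUDE ONLY)
```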

This project attempts to use all of these tools to build a functioning iPad Mobile Surgical Console. The iPad first connects to the Scenario Manager to find all the components; these components are then controlled from the iPad through the ICE interface. Before any of this can happen, however, the challenges involved in compiling the cisst library in the iOS environment must be resolved.
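To illustrate this startup flow (not the actual cisst/ICE API; every type and name here is a hypothetical stand-in), the discovery-then-control sequence might be sketched as:

```cpp
#include <map>
#include <string>
#include <vector>

// Hypothetical stand-in for a component reachable over ICE: each one
// exposes named settings the console can read and write.
struct Component {
    std::string name;
    std::map<std::string, double> settings;
};

// Minimal stand-in for the Scenario Manager: it tracks every connected
// component and hands the console a list of them on request.
class ScenarioManager {
    std::vector<Component> components_;
public:
    void registerComponent(const Component &c) { components_.push_back(c); }

    // What the iPad asks for first on startup: the names of all components.
    std::vector<std::string> listComponents() const {
        std::vector<std::string> names;
        for (const auto &c : components_) names.push_back(c.name);
        return names;
    }

    // Look up one component so its settings can be changed remotely;
    // returns nullptr if no component has that name.
    Component *find(const std::string &name) {
        for (auto &c : components_) {
            if (c.name == name) return &c;
        }
        return nullptr;
    }
};
```

In the real system the lookup and the settings writes would travel over ICE proxies rather than direct pointers, but the console's logic follows the same list-then-find-then-set shape.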


We had a successful test with the EyeRobot in a bunny experiment on April 21, 2011. The iPad interface worked perfectly. The users found our device to be much easier to use and a huge time saver. Furthermore, they gave us ideas to improve the GUI, which we are currently working on. Here is a video of it in use.

Below is a screenshot of the visualization console where the user can set various settings for the camera.

Click here to see full documentation and tutorials on how to write programs for iOS using the cisst libraries. This page is currently private and under construction.

ICE Source Code - for compiling ICE and cisst for the iOS environment

XCode Project Source Code - for the iPad application

Reports and presentations


Project Bibliography

  1. A. Deguet, R. Kumar, R. Taylor, and P. Kazanzides, “The cisst libraries for computer assisted intervention systems,” in MICCAI Workshop on Systems and Arch. for Computer Assisted Interventions, Midas Journal, Sep 2008.
  2. M. Henning, M. Spruiell, Distributed Programming with Ice, <http://www.zeroc.com/doc/Ice-3.4.1-IceTouch/manual/>
  3. M.Y. Jung, G. Sevinc, A. Deguet, R. Kumar, R. Taylor, P. Kazanzides, “Surgical Assistant Workstation (SAW) Communication Interfaces for Teleoperation.”
  4. P. Kazanzides, A. Deguet, A. Kapoor, O. Sadowsky, A. LaMora, R. Taylor, “Development of open source software for computer-assisted intervention systems,” In MICCAI Workshop on Open-Source Software, Oct 2005.

