The advanced visualization project attracted seemingly unlikely interest from the National Aeronautics and Space Administration, which felt the technology had potential for its own robotic operations. Instead of a system for human patients, however, NASA needed a tool to help astronauts conduct often-exacting maintenance on satellites and other spacecraft without taking strenuous and dangerous space walks. Essentially, the astronauts were doing much the same work as the surgeons, performing intricate operations from a remote location, and like the surgeons, they reported becoming confused by the limited camera views.
For the surgical augmented-reality project, SSIM researchers wrote software that overlays three-dimensional graphic representations of anatomical structures on the camera's video scene. Combining the real-time view with the synthetically generated graphics yields a broad map that eases navigation, and virtual reality then shows the surgeon exactly where his or her surgical instrument sits on that map. The researchers sent the code to NASA, which modified it before installing it on a space-station robot simulator. The SSIM researchers are now working with NASA to refine the code and make it flight-ready.
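The core of any such overlay is registering 3-D structures to the 2-D video image. As a rough illustration only (a hypothetical sketch, not SSIM's or NASA's actual code), a pinhole-camera projection maps a landmark's position in the camera's coordinate frame to the pixel where its graphic should be drawn:

```python
# Hypothetical sketch of the basic math behind an augmented-reality overlay:
# project a 3-D landmark (in the camera's coordinate frame) onto the 2-D
# video image so a synthetic graphic can be drawn at the matching pixel.

def project_point(point_3d, focal_px, center_px):
    """Pinhole-camera projection of a 3-D point (x, y, z in meters,
    z pointing into the scene) to pixel coordinates (u, v)."""
    x, y, z = point_3d
    if z <= 0:
        raise ValueError("point is behind the camera")
    fx, fy = focal_px          # focal lengths in pixels
    cx, cy = center_px         # image center (principal point)
    u = cx + fx * x / z
    v = cy + fy * y / z
    return u, v

# A landmark 10 cm right and 5 cm down of the optical axis, 50 cm in
# front of an 800-px-focal-length camera with a 640x480 image:
u, v = project_point((0.10, 0.05, 0.50), (800, 800), (320, 240))
print(round(u), round(v))  # pixel where the overlay graphic is drawn
```

Real systems add lens-distortion correction and a tracked camera pose so the overlay stays locked to the scene as the camera moves, but the projection step above is the common foundation.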