Dirty hands and computer devices don't mix, especially if you work as a maintenance engineer in the oil industry. But one way or another, operators need to retrieve digital information to assess when something in the production process needs to be fixed. One way around the problem is to combine eye tracking technology with a gesture camera: the operator navigates the computer screen using both eye movements and hand gestures. Drawing inspiration from the games industry, RISE Interactive (formerly the Interactive Institute) has built such a prototype for ABB Corporate Research.
What might human-machine interaction in the process industry look like in 5 to 10 years? Are there alternatives to the traditional way of doing things? That, in short, was the assignment RISE Interactive was given by ABB Corporate Research.
“We have been working together with ABB for several years, building prototype concepts for the future, and we keep track of current technology. ABB already had connections with the Swedish company Tobii, and ABB knew that we were interested in exploring the use of Microsoft's Kinect camera,” says Ru Zarin, an Interaction Designer at RISE Interactive, explaining the explorative approach of the research project.
Two combined technologies
Kinect is a motion-sensing input device. It uses a depth-sensing 3D camera to detect hand and body gestures, making it possible to steer objects on the screen. Tobii Technology has developed eye tracking technology that enables users to interact with computers using their eyes. Both Tobii and Kinect offer publicly available Application Programming Interfaces (APIs).
Navigate and select without touching
The prototype consists of two flat screens placed a couple of meters in front of the operator. The right-hand screen shows a 3D representation of an oil rig. By swiping vertically with an arm, the operator navigates through the different levels of the oil rig model. Instead of using a mouse to highlight and select objects on screen, the operator uses his or her eyes. When an object is selected, a swipe gesture moves it to the screen on the left. Here, in the process view, the operator can interact in more detail with a specific object, e.g. a compressor, and see how it is performing. Eye tracking can also be used to expand menu items and reveal more information.
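The interaction loop described above can be sketched in a few lines of code. This is a minimal, hypothetical illustration: the event types, object names, and hit regions below are invented for clarity, and the actual prototype was built against the Tobii and Kinect SDKs, whose real APIs differ.

```python
# Hypothetical sketch of the gaze-plus-gesture interaction described above.
# Gaze samples highlight the object the operator is looking at; swipe
# gestures navigate the rig model or move the highlighted object to the
# process view. All names and coordinates here are illustrative.
from dataclasses import dataclass

@dataclass
class GazeSample:
    x: float  # normalized screen coordinates, 0..1
    y: float

@dataclass
class SwipeGesture:
    direction: str  # "left", "right", "up", or "down"

class OperatorUI:
    """Highlights the object under the operator's gaze; a swipe acts on it."""

    def __init__(self, objects):
        # objects: name -> (x, y, width, height) hit region, normalized
        self.objects = objects
        self.highlighted = None
        self.process_view = []  # objects moved to the left-hand screen
        self.level = 0          # current level of the oil rig model

    def on_gaze(self, sample):
        # Highlight whichever object's hit region contains the gaze point.
        self.highlighted = None
        for name, (ox, oy, w, h) in self.objects.items():
            if ox <= sample.x <= ox + w and oy <= sample.y <= oy + h:
                self.highlighted = name
                break

    def on_swipe(self, gesture):
        if gesture.direction in ("up", "down"):
            # Vertical swipes navigate between levels of the rig model.
            self.level += 1 if gesture.direction == "up" else -1
        elif gesture.direction == "left" and self.highlighted:
            # A leftward swipe moves the gazed-at object to the process view.
            self.process_view.append(self.highlighted)

ui = OperatorUI({"compressor": (0.2, 0.3, 0.1, 0.1)})
ui.on_gaze(GazeSample(0.25, 0.35))   # operator looks at the compressor
ui.on_swipe(SwipeGesture("left"))    # swipe sends it to the process view
print(ui.process_view)               # → ['compressor']
```

The key design point is that the two input channels play different roles: gaze is used only for pointing (fast, but imprecise for commands), while deliberate arm gestures confirm actions, so a stray glance never triggers anything on its own.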
“We did this to show what would happen when these two technologies were combined. The project had an explorative approach and we were hoping to initiate a dialogue on how access to pertinent data could be better retrieved in specific environmental conditions with the use of new technology,” says Ru Zarin.
So far, the showcase piece has received good reviews at an internal ABB conference in Texas.
“With a prototype, people start to think: Can we use this in another context? Often they can,” says Ru Zarin.
The research project was conducted by RISE Interactive in only two months and involved a team of seven.