Motion-sensing technologies like Microsoft Kinect won’t be limited to PCs and video game consoles, Microsoft’s chief research and strategy officer Craig Mundie said in a speech at the Massachusetts Institute of Technology on Thursday.
A surgeon in an operating room could move his hands over a screen and update a patient’s information while keeping the environment sterile. Or a person could play a board game with an opponent at another location by displaying the game on a surface and letting the sensors pick up the players’ gestures, Mundie said.
Sensors and display advancements will change how humans interact with computers and lead to more systems in which people gesture to operate machines, Mundie said.
Microsoft will target these HUI (human user interface) technologies first at the gaming space, said Mundie. Its first attempt arrives in November, when the Kinect system for its Xbox 360 gaming console goes on sale globally.
The aim is to eliminate controller-based gaming and “just operate quite normally” on an HUI system, he said. Kinect looks to meet this goal by allowing people to control games by movement and voice commands.
Nintendo popularized the concept of motion-controlled gaming in 2006 when it launched its Wii game console. The first Wii required a wireless controller to play games; moving the peripheral displayed motions, such as hitting a baseball or rolling a bowling ball, on screen. The company has improved the system considerably over the past four years.
Microsoft is also working on avatars, such as a virtual receptionist, that “try to emulate the social experience of interacting with a person,” he said.
3D technology, which every player in the IT industry is adding to its products, also ties in nicely with HUI, according to the demonstration. Attendees donned glasses and shopped with Mundie as he virtually picked up, disassembled, turned around and eventually placed an item in a box for purchase. He also showed a 3D television show that lets viewers participate in the plot, and spoke commands to control and interact with the computer.
“Whether games and TV will emerge in this way is hard to tell” but the technology is there, Mundie said.
Microsoft plans to learn from the initial uses developed for HUI technology and evolve the system accordingly. For example, future developments could account for recurring actions, like writing a signature.
“We’re going to see what emerges from using this thing across a wide range of applications,” Mundie said.
The HUI system Mundie demonstrated combines images from an infrared camera and a regular camera to develop a “skeletal map” of a person, while microphones on the device pick up audio commands. In the case of the Kinect system, this data is sent to the Xbox 360 for processing. Placing the computing power in the HUI system itself would have increased the price, Mundie said.
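To give a rough sense of what building a “skeletal map” from depth data involves, the sketch below segments the nearest blob in a synthetic depth frame and summarizes it as a few joint-like points. This is purely illustrative: the function name, the millimeter depth values and the segmentation threshold are all assumptions, and Kinect’s actual skeletal tracking is far more sophisticated than this.

```python
import numpy as np

def crude_skeletal_map(depth, person_max_depth=2000):
    """Illustrative sketch only: segment the nearest object in a depth
    image (millimeter values, as a depth camera might report) and reduce
    it to a handful of joint-like points. Not Kinect's algorithm."""
    # Foreground mask: valid pixels closer than the assumed person distance.
    mask = (depth > 0) & (depth < person_max_depth)
    ys, xs = np.nonzero(mask)
    if len(ys) == 0:
        return None  # nobody in frame
    # Crude "joints": overall centroid plus the topmost (head) and
    # bottommost (feet) foreground pixels.
    centroid = (xs.mean(), ys.mean())
    head = (xs[ys.argmin()], ys.min())
    feet = (xs[ys.argmax()], ys.max())
    return {"torso": centroid, "head": head, "feet": feet}

# Synthetic 8x8 depth frame: a near "person" blob against a far wall.
depth = np.full((8, 8), 4000)
depth[1:7, 3:5] = 1500
print(crude_skeletal_map(depth))
```

In a real pipeline, per-pixel body-part classification and the fusion of the color image would refine these points into a full joint hierarchy; the point here is only that the raw input is a depth map and the output is a compact set of body coordinates.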
Do not expect cloud computing to handle the data gathered from HUI systems, or at least not those from Microsoft.
During a question-and-answer session following his talk, Mundie was asked if cloud computing, an increasingly popular method of processing data, factored into how an HUI system would compute the input.
“A lot of this HUI interaction will be locally computed,” he said after saying that he sees the future as the “client plus the cloud taken as one thing, not two things.”
Companies and users do not want to pay for the bandwidth to transmit data for remote processing and cloud computing’s latency issues mean a smooth interaction is unlikely, Mundie said.
“Humans are very sensitive to latency,” he said.
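Mundie’s latency point can be made concrete with some back-of-the-envelope arithmetic. The numbers below are assumptions chosen for illustration, not figures from Microsoft: a sensor delivering 30 frames per second leaves roughly 33 ms per frame, which local processing can fit inside but a typical network round trip alone cannot.

```python
# Illustrative latency budget; all figures are assumptions.
FRAME_RATE_HZ = 30                       # a common sensor frame rate
frame_budget_ms = 1000 / FRAME_RATE_HZ   # time available per frame

local_processing_ms = 15    # assumed on-console processing time
cloud_round_trip_ms = 100   # assumed network round trip, before any compute

print(f"per-frame budget: {frame_budget_ms:.1f} ms")
print(f"local processing fits: {local_processing_ms <= frame_budget_ms}")
print(f"cloud round trip fits: {cloud_round_trip_ms <= frame_budget_ms}")
```

Under these assumptions the cloud path misses the per-frame deadline before any remote computation even begins, which is one way to read Mundie’s preference for local computation of HUI input.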