We’re hearing swirling rumors of a successor to Microsoft’s Kinect controller, and what the new system might be able to do. Here is what we would want from Kinect 2, and how close the technology actually is.
Rumors are flying about a sequel to the Kinect controller for Xbox 360. The original device, released about a year ago, was a groundbreaking whole-body controller that relies on infrared sensors and camera detection to read your movements. The device sits near your HDTV and can identify you by body shape, understand voice commands, and map your movements (plus it’s hackable, and PM gave Microsoft a 2011 Breakthrough Award for the Kinect development kit, which encourages people to use the hardware in new, innovative ways).
Read Complex Finger Movements and Gestures
The current Kinect is very good at detecting broad gestures, but several companies are working on more sophisticated gesture-control interfaces, including Russian developer Displair, Germany's Fraunhofer FIT, and Oblong Industries, the company that developed the gesture-interface mockups seen in Minority Report. These systems support complex gestures such as selecting an object in space, turning it around, zooming into a scene, and picking up objects. Next year Microsoft will release Kinect for Windows and has already made the SDK available for those who want to experiment with new forms of gesture control.
And the Kinect 2 may come equipped with a more powerful gesture-detection system. Rob Enderle, a consumer analyst with Enderle Group, suggested one idea: The Kinect 2 would understand sign language or let you create gestures that send messages to other players. Another: The new system could sense that you are holding an object in real life and want to use it in the game. For example, the Kinect 2 might know you’re holding a Nerf gun or a paintball rifle and create a virtual equivalent.
Roger Kay, an analyst with Endpoint Technologies Associates, says the runaway success of the Kinect and the Nintendo Wii has shown that people generally like interacting with games through gestures. More complex control in the Kinect 2 could enhance gameplay. Imagine pointing one finger to show you're using a gun, or making a quick double-tap gesture to reload.
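Gesture logic like that double-tap reload can be surprisingly simple once a skeleton tracker has reduced your motion to discrete events. Here's a minimal, hypothetical Python sketch; the 0.4-second window is an invented value, not a real Kinect SDK parameter:

```python
# Hypothetical sketch of a gesture layer sitting on top of skeleton
# tracking: tap events arrive with timestamps, and two taps close
# together count as a "double-tap reload." The 0.4-second window is
# an invented value, not a real Kinect SDK parameter.

DOUBLE_TAP_WINDOW = 0.4  # max seconds between taps (assumed)

def detect_double_taps(tap_times):
    """Return the timestamps at which a double tap completes."""
    double_taps = []
    for prev, curr in zip(tap_times, tap_times[1:]):
        if curr - prev <= DOUBLE_TAP_WINDOW:
            double_taps.append(curr)
    return double_taps

# Taps at 0.0 and 0.3 form one double tap; 5.0 and 5.2 form another.
print(detect_double_taps([0.0, 0.3, 2.0, 5.0, 5.2]))  # [0.3, 5.2]
```

Real gesture recognizers are far more involved (they have to tolerate noisy joint positions and differences between players), but the idea of turning continuous motion into timed events is the same.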
Be Your Translator
According to the rumors published by Eurogamer, the Kinect 2 will be able to read your lips. This wouldn't be voice-recognition technology that understands spoken commands, but rather motion detection that can tell what you're saying even if you're just silently mouthing the words.
Imagine, though, if your gaming system not only could read your lips, but also could translate what you say on the fly. Already gamers across the world compete over Xbox Live, but instant translation would allow them to cross the language barrier and talk during the game.
Enderle says this could be a brilliant innovation. The Kinect 2 could also make your voice sound different in the game. And the fact that the technology might read your lips rather than listen to your voice could help to get around one problem with voice commands: They are annoying to those around you. Enderle says he can envision a lip-reading system in which multiple players in the same room all mouth their commands to the Kinect—though this might look pretty amusing to a bystander.
Still, lip-reading tech could be tough for Microsoft to pull off. For one thing, Kay says, the Kinect system (at least for now) lacks contextualization: the clues that help a computer understand why you're saying something, or remember what you've said before for hints about your intentions (Apple's iPhone 4S assistant, Siri, does this). And accuracy is important. The system would need to know the difference between mouthing "bull" and "pull," which look nearly identical on the lips.
Sense Your Mood
The Eurogamer report also suggested that the Kinect 2 would be able to read facial movements to sense when you're angry, and judge the volume and tone of your voice. The implication is that the game might know when you are getting upset and suggest taking a break, much as a high-end Mercedes CLS63 can sense when you have been driving for too many hours (it notices your head swaying from side to side and compares that with other data points, such as how long you've been driving and erratic acceleration) and politely ask you to take five.
Kay says there are existing algorithms that can sense when people are getting angry because they say the same word repeatedly, scowl often, or make faster hand gestures. (Of course, there's one gesture that sends a clear message to the developers of the game that a level is too difficult.) But Suran Goonatilake, chairman and founder of Bodymetrics, a company whose system analyzes body sizes, says the resolution of the sensors and cameras in gaming systems like the Kinect needs to improve before they can really tell how you're feeling. There's also the challenge of understanding the science of body language (a particular stance might mean you're angry in one context and happy in another). Yet, he says, programmers could develop a sensing system that recognizes when you cross your arms as a gesture of anger or dissatisfaction.
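To make Kay's point concrete, a mood heuristic could be as crude as weighting a few observable signals. This Python sketch is purely illustrative; the signals come from his examples, but the `frustration_score` function and its weights are invented for this article:

```python
# Illustrative only: combine the signals Kay mentions (repeated words,
# frequent scowls, faster hand gestures) into a rough "frustration"
# score. The weights and the function itself are invented.

from collections import Counter

def frustration_score(words, scowls_per_min, gesture_speed):
    """gesture_speed is relative to the player's baseline (1.0 = normal)."""
    # Count how many times any word is repeated beyond its first use.
    repeats = sum(n - 1 for n in Counter(w.lower() for w in words).values())
    # Weighted sum: word repeats, scowl rate, above-baseline gesture speed.
    return 0.5 * repeats + 0.3 * scowls_per_min + 2.0 * max(0.0, gesture_speed - 1.0)

calm = frustration_score(["nice", "shot"], scowls_per_min=0, gesture_speed=1.0)
angry = frustration_score(["again", "again", "again", "no"], scowls_per_min=4, gesture_speed=1.6)
print(calm < angry)  # True
```

A shipping system would learn these weights from data rather than hard-code them, which is exactly why Goonatilake's caveats about sensor resolution and context matter.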
Enderle says a feature like this in the Kinect 2 could have major implications for gaming: sensing when you have played too long based on your mood (not just on elapsed time, as it does now), or when you are starting to get depressed. Some gamers find such features annoying (the Nintendo Wii also suggests taking a break), but mood-sensing could also mean recognizing when gamers are showing more violent tendencies.
Mood-sensing isn’t just about cutting off your gameplay time before you throw your controller, though. It can also change gameplay. Imagine if a system like the Kinect included programming to sense when you are being deceptive, looking for the kinds of micro-expressions and body language that Transportation Security Administration (TSA) agents are trained to spot. It would certainly make it tougher to beat the AI at Texas Hold ‘em.
Read Your Mind
Here’s one of the most interesting concepts for the Kinect 2 (or maybe the Kinect 3), one that might seem far-fetched at first: using thoughts to directly control a game. In Gears of War, you might think the phrase “switch to sniper rifle” or “jump” to trigger those actions, or think of a move to make in a puzzle game and have the Kinect interpret that command.
NeuroSky has already developed toys like the Mattel Mindflex and Star Wars Force Trainer that sense electroencephalogram (EEG) signals: tiny voltage changes along the scalp. These readings are still a long way from thinking of a command and having it execute in a sophisticated Xbox 360 game. The Force Trainer, for example, simply scans for a pattern of brain signals associated with “concentration,” and when it reads them in the player’s brain, activates a fan that pushes a ball upward in a tube, as though you were using the Force to make it happen. Despite the simplicity of these early games, they show it is possible to capture brain activity and translate those electrical pulses into commands.
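The control loop behind a toy like the Force Trainer can be modeled in a few lines. This Python sketch assumes a NeuroSky-style "attention" value between 0 and 100; the threshold and the linear fan mapping are invented for illustration, not the toy's actual firmware:

```python
# A toy model of Force Trainer-style control: map a headset "attention"
# reading (NeuroSky-style devices report roughly 0-100) to a fan speed.
# The threshold and linear mapping are illustrative, not real firmware.

def fan_speed(attention, threshold=40, max_speed=255):
    """Fan stays off at or below the threshold; above it, speed scales linearly."""
    if attention <= threshold:
        return 0
    return round(max_speed * (attention - threshold) / (100 - threshold))

# Low attention leaves the ball resting; full attention drives the fan flat out.
print(fan_speed(20), fan_speed(40), fan_speed(100))  # 0 0 255
```

The point is how little "mind reading" is actually happening: one scalar, one threshold, one actuator, which is why jumping from this to "switch to sniper rifle" is such a leap.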
The company Haier America is already developing a Brain Wave TV that senses electrical impulses from your brain through a headset. Haier is working with PrimeSense and NeuroSky to develop technology that would let you change channels or start playing a movie just by thinking of the command.
“[This] represents a type of gaming that does not require a person to know how to work a controller, and consumers of all ages are demonstrating that they are willing to quickly and readily embrace this technology,” says Douglas Lane, president of digital products at Haier America. “Americans are becoming increasingly comfortable with sensor control and gesture control, and we want to apply this technology more and more to televisions as the connected TV continues to evolve.”
Judge Your Singing Voice
The ultimate coup for Microsoft might be judging the quality of your singing or speaking voice, allowing millions of Americans to play Xbox 360 American Idol in their living rooms. Shadi Farhangrazi, a biotechnology consultant and speaker, notes that Microsoft already owns Tellme, a company that develops speech-recognition software used primarily for customer-support systems. It’s not a stretch, she says, to take this to the next level and judge the tone and quality of a voice. The current Kinect system can already interpret commands and respond to what you say.
Enderle says another possibility is that the Kinect 2 could judge your playing of a musical instrument. It could detect whether you’re on pitch and playing the right notes, and even use that information in games that teach you how to play. This feature, he says, sounds particularly apt for a system like the Kinect: real, non-Guitar Hero instruments are difficult to connect to game systems, but if the Kinect 2 or 3 could understand how well you’re playing, it could be an ideal teaching tool. Kay agreed, saying the Kinect 2 could even sense when you are playing or singing too loudly, out of rhythm, or out of sync with the backing music.
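Judging pitch, at least, is well-understood signal processing. Once software has detected the fundamental frequency of a note, deciding whether it's "on pitch" is a small calculation, sketched here in Python; the 20-cent tolerance is an assumption for the example, not anything Microsoft has announced:

```python
# Sketch of the pitch-judging idea: measure how far a detected frequency
# is from the nearest equal-tempered note, in cents (hundredths of a
# semitone). The 20-cent tolerance is an assumed value for illustration.

import math

A4 = 440.0  # concert-pitch reference

def cents_from_nearest_note(freq):
    """Distance, in cents, from the nearest equal-tempered note."""
    semitones = 12 * math.log2(freq / A4)
    return 100 * (semitones - round(semitones))

def on_pitch(freq, tolerance_cents=20):
    return abs(cents_from_nearest_note(freq)) <= tolerance_cents

# 440 Hz is a perfect A; 452 Hz is nearly a quarter-tone sharp.
print(on_pitch(440.0), on_pitch(452.0))  # True False
```

The hard part for a living-room console isn't this math; it's cleanly detecting the frequency of a real instrument through a far-field microphone with game audio playing in the background.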
For now, Microsoft isn’t saying anything. It gave a quick “no comment” when asked about upcoming features, and wouldn’t confirm or deny that there will even be a Kinect 2. But every expert we consulted says Kinect could add features that radically alter how we play games, especially in group environments, far beyond the innovations of the original system. Someday, your Kinect might be your translator, your mind-reader, your own private Simon Cowell, and more.