Tuesday 18 December 2012

IBM's next revolution: Computers that see, smell, hear, and taste

Following in Watson's footsteps, computers will let users fondle virtual silk shirts and translate baby talk in the next five years

Computers will be capable of predicting colds, devising healthy food recipes, translating baby talk, and replicating the feel of textures over the next five years, according to IBM's Next 5 in 5, a forecast of how computers will mimic the senses by 2017.
Citing Watson -- IBM's "Jeopardy"-champion supercomputer -- as the first step in developing computers capable of learning from interactions with big data and humans, IBM Chief Innovation Officer Bernard Meyerson writes, "One of the most intriguing aspects of this shift is our ability to give machines some of the capabilities of the right side of the human brain. New technologies make it possible for machines to mimic and augment the senses."
Those senses include touch, sight, hearing, taste, and smell, according to IBM. In terms of touch, IBM researchers noted it's already possible to re-create a sense of texture through vibration, such as with the rumble packs found in game controllers. "Those vibrations haven't been translated into a lexicon, or dictionary of textures that match the physical experience," according to Robyn Schwartz, associate director of IBM Research Retail Analytics; Dhandapani Shanmugam, solutions architect; and Siddique A. Mohammed, software architect of IBM Software Group Industry Solutions. By matching variable-frequency patterns of vibration to physical objects, they wrote, "when a shopper touches what the webpage says is a silk shirt, the screen will emit vibrations that match what our skin mentally translates to the feel of silk."
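To make the "dictionary of textures" idea concrete, here is a minimal sketch of how such a lexicon might be structured as a lookup table. The fabric names and vibration parameters below are hypothetical illustrations, not IBM's data.

# A toy "texture lexicon": fabric names mapped to hypothetical vibration
# parameters (frequency in Hz, amplitude 0-1, pulse length in ms).
TEXTURE_LEXICON = {
    # Smooth fabrics: higher frequency, lower amplitude, shorter pulses.
    "silk":   {"freq_hz": 250, "amplitude": 0.20, "pulse_ms": 10},
    "satin":  {"freq_hz": 220, "amplitude": 0.25, "pulse_ms": 12},
    # Coarse fabrics: lower frequency, higher amplitude, longer pulses.
    "denim":  {"freq_hz": 60,  "amplitude": 0.80, "pulse_ms": 40},
    "burlap": {"freq_hz": 40,  "amplitude": 0.90, "pulse_ms": 60},
}

def vibration_pattern(texture: str) -> dict:
    """Return the parameters a touchscreen actuator would play when the
    shopper's finger rests on a product photo of this texture."""
    try:
        return TEXTURE_LEXICON[texture]
    except KeyError:
        raise ValueError(f"no lexicon entry for texture {texture!r}")

print(vibration_pattern("silk"))  # {'freq_hz': 250, 'amplitude': 0.2, ...}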
Using digital image processing and digital image correlation, the researchers wrote, it will be possible to capture texture qualities in a PIM (product information management) system, which retailers could use to match textures with their products and their products' data, such as sizes, ingredients, and dimensions.
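As a rough illustration of how texture qualities captured from images might live alongside ordinary product data in a PIM system, the sketch below reduces a fabric swatch to a small feature vector and matches it against stored product signatures. The records, the choice of features, and every numeric value are assumptions made for this example.

import numpy as np

def texture_signature(swatch: np.ndarray) -> np.ndarray:
    """Summarize a grayscale swatch as [brightness, contrast, roughness],
    where roughness is the mean gradient magnitude -- coarse weaves score
    higher than smooth ones."""
    gy, gx = np.gradient(swatch.astype(float))
    return np.array([swatch.mean(), swatch.std(), np.hypot(gx, gy).mean()])

# A toy PIM: each product record stores a texture signature alongside
# ordinary attributes such as fabric and size. Signatures are invented.
pim = [
    {"sku": "SHIRT-01", "fabric": "silk",  "size": "M",
     "signature": np.array([180.0, 12.0, 3.0])},
    {"sku": "JEANS-07", "fabric": "denim", "size": "32",
     "signature": np.array([100.0, 35.0, 18.0])},
]

def closest_product(swatch: np.ndarray) -> dict:
    """Match a photographed swatch to the PIM record whose stored
    signature is nearest in feature space."""
    sig = texture_signature(swatch)
    return min(pim, key=lambda rec: np.linalg.norm(rec["signature"] - sig))

swatch = np.random.randint(0, 256, (32, 32))  # stand-in for a real photo
print(closest_product(swatch)["sku"])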
Other applications for this touch technology include gaining a better understanding of our environment. "Take farming, for example. Farmers could use a mobile device to determine the health of their crop by comparing what they're growing to a dictionary of healthy options that they feel through a tablet," they wrote.
There could be health applications as well, such as being able to send an image of an injury to a doctor, who could use the more data-rich image to quickly render a diagnosis.
IBM's John Smith, senior manager of Intelligent Information Management, outlined how computers will be able to learn from seeing to help us understand the 500 billion photos we take every year. For example, by showing a computer thousands of pictures of beaches, it will learn over time to detect the patterns that go into properly identifying seaside photos.
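The pattern learning Smith describes can be illustrated, in vastly simplified form, by a classifier that averages the features of labeled example photos and assigns a new photo to the nearest average. The feature choice (color histograms) and the tiny scale are assumptions of this sketch; IBM's systems are far more sophisticated.

import numpy as np

def color_histogram(image: np.ndarray, bins: int = 8) -> np.ndarray:
    """Concatenate per-channel histograms of an RGB image (H x W x 3,
    values 0-255) into one normalized feature vector."""
    feats = [np.histogram(image[..., c], bins=bins, range=(0, 256))[0]
             for c in range(3)]
    vec = np.concatenate(feats).astype(float)
    return vec / vec.sum()

class NearestCentroid:
    """Average the features of each label's training photos; classify a
    new photo by the closest average -- pattern learning in miniature."""
    def fit(self, features, labels):
        self.centroids = {
            lab: np.mean([f for f, l in zip(features, labels) if l == lab],
                         axis=0)
            for lab in set(labels)}
        return self

    def predict(self, feature):
        return min(self.centroids,
                   key=lambda lab: np.linalg.norm(self.centroids[lab] - feature))

# Real training would use thousands of labeled photos; random arrays
# stand in here purely so the example runs.
photos = [np.random.randint(0, 256, (64, 64, 3)) for _ in range(6)]
labels = ["beach", "beach", "beach", "city", "city", "city"]
model = NearestCentroid().fit([color_histogram(p) for p in photos], labels)
print(model.predict(color_histogram(photos[0])))

Trained on enough labeled seaside and non-seaside photos, even a toy model like this starts to pick up the broad color patterns -- sand, sea, sky -- that distinguish beach shots.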
This sort of technology has health implications, according to Smith. "Take dermatology. Patients often have visible symptoms of skin cancer by the time they see a doctor. By having many images of patients from scans over time, a computer then could look for patterns and identify situations where there may be something pre-cancerous, well before melanomas become visible," he wrote.
On the retail front, this sort of technology will enable companies to glean from photos posted to Facebook and Pinterest which specific products and promotions to market to users. In a similar vein, the technology could be used by public-safety organizations and utility companies during disasters. "Photos of severe storms -- and the damage they cause, such as fires or electrical outages -- uploaded to the Web could help electrical utilities and local emergency services to determine in real time what's happening, what the safety conditions are and where to send crews," Smith wrote.

IBM Master Inventor Dimitri Kanevsky, meanwhile, described how computers over the next five years will be capable of understanding sounds, thanks to algorithms embedded in cognitive systems. "Some of my colleagues and I patented a way to take the data from typical baby sounds, collected at different ages by monitoring brain, heart and lung activity, to interpret how babies feel," he wrote. "Soon, a mother will be able to translate her baby's cries in real time into meaningful phrases, via a baby monitor or smartphone."
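Here is a toy sketch of the classification step: a cry reduced to a few acoustic features, then labeled by the most similar known example. The features, values, and labels are invented for illustration; Kanevsky's patented approach also draws on brain, heart, and lung monitoring.

import numpy as np

# Hypothetical training examples: (mean pitch in Hz, cry burst length in
# seconds, loudness 0-1) paired with an interpreted need.
EXAMPLES = [
    (np.array([450.0, 1.2, 0.9]), "hungry"),
    (np.array([600.0, 0.4, 1.0]), "in pain"),
    (np.array([350.0, 2.5, 0.5]), "tired"),
]

def interpret_cry(features: np.ndarray) -> str:
    """Nearest-neighbor lookup: label a new cry with the need of the
    most acoustically similar known example."""
    return min(EXAMPLES, key=lambda ex: np.linalg.norm(ex[0] - features))[1]

print(interpret_cry(np.array([430.0, 1.0, 0.8])))  # -> "hungry"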

What's more, systems using sound-gathering sensors will be able to monitor the environment, such as determining that a tree might be on the brink of toppling over during a storm based on the sounds it's making.
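At its simplest, a sensor system like that could watch the energy in a characteristic frequency band of an audio feed and alert when it climbs far above a calm-weather baseline. The band, sample rate, and thresholds below are assumptions made for the sketch.

import numpy as np

def band_energy(samples: np.ndarray, rate: int, lo: float, hi: float) -> float:
    """Energy of an audio clip within one frequency band, via FFT."""
    spectrum = np.abs(np.fft.rfft(samples)) ** 2
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / rate)
    mask = (freqs >= lo) & (freqs <= hi)
    return float(spectrum[mask].sum())

def trunk_strain_alert(clip: np.ndarray, rate: int = 8000,
                       baseline: float = 1e6, factor: float = 10.0) -> bool:
    """Flag a tree when low-frequency creaking (assumed 20-200 Hz here)
    rises well above its calm-weather baseline energy."""
    return band_energy(clip, rate, 20.0, 200.0) > factor * baseline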

Kanevsky also described a theoretical device capable of hearing ultrasonic frequencies and translating, say, dolphin or dog chatter. A smartphone paired with an ultrasonic system could turn a speaker's voice into an ultrasonic frequency that cuts through the sounds in the room and is delivered to, and retranslated for, only the intended recipient of the message, he wrote.
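One classical way to move speech into the ultrasonic range is amplitude modulation: multiply the voice signal by an ultrasonic carrier to shift it up, then multiply by the same carrier and low-pass filter on the receiving end to recover it. The sketch below shows that idea with a hypothetical 40 kHz carrier; a real system would need proper filters, carrier synchronization, and ultrasonic-capable hardware.

import numpy as np

RATE = 192_000        # sample rate high enough to represent ultrasound
CARRIER_HZ = 40_000   # hypothetical ultrasonic carrier frequency

def to_ultrasound(voice: np.ndarray) -> np.ndarray:
    """Shift an audible voice signal (sampled at RATE) up around an
    ultrasonic carrier by amplitude modulation, making it inaudible to
    bystanders."""
    t = np.arange(len(voice)) / RATE
    return voice * np.cos(2 * np.pi * CARRIER_HZ * t)

def from_ultrasound(signal: np.ndarray) -> np.ndarray:
    """Demodulate on the recipient's device: multiply by the same carrier,
    then low-pass filter (a crude moving average, cutoff roughly 4 kHz at
    this rate) to recover the speech band."""
    t = np.arange(len(signal)) / RATE
    mixed = signal * np.cos(2 * np.pi * CARRIER_HZ * t)
    kernel = np.ones(48) / 48
    return np.convolve(mixed, kernel, mode="same")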

Also, computers will be capable of detecting taste over the next five years, according to Dr. Lav Varshney, research scientist for IBM Services Research. "Computers will be able to construct never-before-heard-of recipes to delight palates -- even those with health or dietary constraints -- using foods' molecular structure," he wrote.

Varshney's team is designing a learning system that adds creativity to cognitive computing. The result is a system capable of not just coming up with a fixed answer to a question but with something entirely new. "The system analyzes foods in terms of how chemical compounds interact with each other, the number of atoms in each compound, and the bonding structure and shapes of compounds," he wrote. "Coupled with psychophysical data and models on which chemicals produce perceptions of pleasantness, familiarity and enjoyment, the end result is a unique recipe, using combinations of ingredients that are scientifically flavorful."
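A small sketch of the chemistry-driven pairing idea: if each ingredient is represented by the flavor compounds it contains, a system can score candidate combinations by their shared chemistry. The compound table below is a toy stand-in for real food-chemistry data, and shared-compound counting is only one heuristic among the many factors Varshney describes.

from itertools import combinations

# Toy flavor-compound table: which aroma compounds each ingredient
# contains. The compound lists are illustrative, not measured chemistry.
FLAVOR_COMPOUNDS = {
    "strawberry": {"furaneol", "linalool", "hexanal"},
    "basil":      {"linalool", "eugenol", "estragole"},
    "chocolate":  {"furaneol", "linalool", "vanillin"},
    "tomato":     {"hexanal", "geranial", "pyrazine"},
}

def pairing_score(a: str, b: str) -> int:
    """Food-pairing heuristic: ingredients that share more flavor
    compounds are hypothesized to combine well."""
    return len(FLAVOR_COMPOUNDS[a] & FLAVOR_COMPOUNDS[b])

# Rank every ingredient pair by shared chemistry to surface candidates.
ranked = sorted(combinations(FLAVOR_COMPOUNDS, 2),
                key=lambda pair: pairing_score(*pair), reverse=True)
print(ranked[0])  # -> ('strawberry', 'chocolate'), sharing two compounds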

Finally, Dr. Hendrik F. Hamann, research manager of physical analytics for IBM Research, wrote that within the next five years, mobile devices will likely be able to tell you, through a computer version of smell, that you're getting a cold before your very first sneeze.

"Tiny sensors that 'smell' can be integrated into cell phones and other mobile devices, feeding information contained on the biomarkers to a computer system that can analyze the data," he wrote. "Similar to how a breathalyzer can detect alcohol from a breath sample, sensors can be designed to collect other specific data from the biomarkers. Potential applications could include identifying liver and kidney disorders, diabetes, and tuberculosis, among others."

This article, "IBM's next revolution: Computers that see, smell, hear, and taste," was originally published at InfoWorld.com.