If you’re lucky enough to be the proud owner of a shiny new smartphone, tablet or PC, you’re probably feeling pretty pleased with its capabilities. We like to think that today’s devices can do pretty much everything – for example, paying our bills, plotting our route to work, and telling us what the weather will be like for the next five days.
But try to get your PC to tell you what a particular odor smells like, or how soft something feels, and you soon realize that perhaps our technology isn’t as good as all that.
This is all set to change soon though, with IBM boldly predicting that in just five years, our devices will develop ‘senses’ that allow them to touch, feel, see, hear and taste the environment around them.
Big Blue says that its researchers are already working on a range of practical applications to take advantage of these new developments. One of its major projects involves touchscreens: IBM is currently developing screens that will be able to simulate the “feel” of various materials using a combination of haptic feedback, pressure and infrared. For example, a user might be able to ask their smartphone what ‘silk’ feels like, then run their finger over the touchscreen and experience a soft, smooth sensation as they do so.
The use of haptic feedback is already fairly widespread in computer games, adding an extra element to the user’s experience by producing vibrations and jolts to coincide with the on-screen action. IBM says that this can also be applied to smartphone tech by altering its vibrations to recreate unique sensations that accurately represent materials such as linen, cotton or silk.
Sensitive touchscreens would immediately be of interest to retailers, says IBM. Shoppers would be able to browse through dresses on a site like Amazon.com and ‘feel’ the texture of the material before choosing whether or not to buy, while the technology could also work the other way round. Using image correlation and digital image processing technology, it would be possible to take a photo of a particular material, have your smartphone identify it, and then show you how it feels.
Another use of the technology could be in the medical field. Although this is probably further away, IBM says that one day it may even be possible for patients to take a photo of an injury they are concerned about, so that their doctor can remotely “feel” the affected area for any signs of serious damage.
The ability to touch through our smartphones is an intriguing one, but it’s just one sense-ational development of many that we can expect in the near future. Researchers are hoping to program future computers with all five sensory skills in the next five years.
Among the uses suggested by IBM, computers will be able to ‘see’ visual data and make sense of it in the same way that humans can, with big implications for the medical industry, where a computerized eye might be able to spot subtle differences in tissue that could indicate potential disease.
The ability to hear things sounds even more intriguing. IBM suggests a wide range of applications here, the most compelling being that computers will soon be able to understand “baby talk” (crying) and tell mothers what their baby is asking for.
Should the little tot be hungry, your computer might just be able to help out with that too – using its sense of taste, it will be able to break down different ingredients to the molecular level and use the data it gathers to come up with novel food pairings, suited to your own particular tastes.
Finally, future computers will also develop a super-human sense of smell that could be used in hospitals to sniff out unwanted antibiotic-resistant bacteria, or by city authorities to keep pollution in check before it becomes a problem.