Pranav Mistry is the inventor of SixthSense,
a wearable device that enables new interactions
between the real world and the world of data.
We grew up interacting with the physical objects around us. There is an enormous number of them that we use every day, and these objects are much more fun to use than most of our computing devices. When you talk about objects, one other thing automatically comes attached to them, and that is gestures: how we manipulate these objects, how we use these
objects in everyday life. We use gestures not only to interact with these objects,
but we also use them to interact with each other.
A gesture of "Namaste!",
maybe, to show respect to someone, or -- in India I don't need to teach
a kid that this means "four runs" in cricket. It comes as a part of our everyday learning.
So, from the beginning, I have been very interested in how our knowledge
of everyday objects and gestures, and of how we use these objects, can be leveraged for our interactions
with the digital world. Rather than using a keyboard and mouse,
why can't I use my computer in the same way that I interact in the physical world?