Think of a situation where you want to reach out for a cup of coffee located somewhere on a table in front of you. Which senses do you use to pick up the coffee mug?
As a first step, you will use your vision to identify and locate the cup. Perhaps you can even see the cup handle, perhaps not. Next, you will stretch out your arm to reach for the cup. As soon as your hand is in the vicinity of the cup, you no longer necessarily rely on your eyes to help you get your coffee. In fact, you may look away and still be able to pick up the cup. Your nervous system registers your hand's contact with the cup through tactile sensing. It is tactile sensing that identifies the orientation of the cup and ensures that the hand has a firm grip on it.
From an early age, people train the skill of picking up objects using a combination of vision and tactile sensing, to the point where it becomes an ability that anyone can exercise without even thinking about it.
Robots with tactile sensing can mimic the human ability to handle objects, and this has inspired Graspian to build better robots using tactile sensing.