
Encouraged by developments in haptic technology, we have undertaken ‘Project Braille Tab’ – a prototype that renders images of keywords extracted from audio. This brings stories and images to your fingertips in real time and brings the community one step closer to digital equality. Any keyword extracted from audio can be made tactile for all users. We have taken great inspiration from children’s rebus stories, which represent crucial words in a story pictorially. Our model does exactly this, but uses raised dots to display a shape or picture. The Braille Tab is a tactile display that pushes through boundaries and expands the possibilities of digital accessibility.

The hardware part of our project comprises an 8x8 tablet driven by 64 SG90 servo motors. The tablet runs on a 12 V SMPS power supply through a buck-boost converter and consists of 64 pins in a pixel-like grid that can quickly be set to up or down positions by the motors, forming easily identifiable shapes. The servos are controlled by an Arduino Mega; each servo has a rod attached that sweeps through an angle of 0°–180°, raising or lowering its pin to render the object.
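The pin-grid logic described above can be sketched as follows. This is a minimal illustration, not the project's firmware: it assumes a shape arrives as an 8x8 bitmap and maps each cell to a servo angle (0° for a lowered pin, 180° for a raised pin), row by row, in the order a controller such as the Arduino Mega might address the 64 servos. All names and the angle convention are illustrative assumptions.

```python
# Illustrative sketch of the 8x8 pin-grid logic (not the actual firmware).
# Assumption: 0 deg = pin down, 180 deg = pin up; addressing is row-major.

GRID_SIZE = 8
PIN_DOWN_DEG = 0    # servo angle for a lowered pin
PIN_UP_DEG = 180    # servo angle for a raised pin

def bitmap_to_angles(bitmap):
    """Convert an 8x8 bitmap (1 = raised pin) into a flat list of 64
    servo angles, row by row."""
    if len(bitmap) != GRID_SIZE or any(len(row) != GRID_SIZE for row in bitmap):
        raise ValueError("expected an 8x8 bitmap")
    return [PIN_UP_DEG if cell else PIN_DOWN_DEG
            for row in bitmap for cell in row]

# Example: a simple "plus" shape (row 3 and column 3 raised)
plus = [[1 if r == 3 or c == 3 else 0 for c in range(GRID_SIZE)]
        for r in range(GRID_SIZE)]
angles = bitmap_to_angles(plus)  # 64 angles, one per servo
```

In a real build, each angle would then be written to the corresponding servo (e.g. via the Arduino Servo library's `write()` call); the sketch only shows the bitmap-to-angle mapping.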


Stage 1- We initially selected 100 objects that are the most common keywords in childhood stories. We collected at least 100 images of each object and divided the dataset in the ratio of 80:20: 80% of the data is used for training the CNN model and the remaining 20% for testing it.
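The 80:20 split described in Stage 1 can be sketched as below. This is a minimal, hedged example using only the standard library; the file names and the `train_test_split` helper are illustrative assumptions, not the project's actual pipeline (which may well use a library utility such as scikit-learn's splitter instead).

```python
import random

def train_test_split(paths, train_frac=0.8, seed=42):
    """Shuffle image paths deterministically, then split them so that
    train_frac of the items go to training and the rest to testing."""
    rng = random.Random(seed)
    shuffled = paths[:]
    rng.shuffle(shuffled)
    cut = int(len(shuffled) * train_frac)
    return shuffled[:cut], shuffled[cut:]

# Example: 100 images of one object class (hypothetical file names)
images = [f"ball/img_{i:03d}.jpg" for i in range(100)]
train, test = train_test_split(images)
# len(train) == 80, len(test) == 20
```

Applying this per object class keeps the 80:20 ratio uniform across all 100 classes.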
Stage 2- Our prototype hardware was originally driven by electromagnets and solenoids, but that approach proved infeasible because its power consumption was too high.
2023 All Rights Reserved.