12 months ago by SignIt
Which problem are you trying to solve?
SignIt aims to improve the everyday lives of the deaf/mute community. It does so with the help of a glove that
translates hand gestures into spoken words.
How are you going to solve the problem?
Using sensors along with microcontrollers, our gloves will translate specific hand movements into spoken English,
a form understood by the majority of society.
What is the impact of your project?
Our team conducted a survey and interviews, which showed that most respondents have never interacted with a deaf
person. SignIt aims to make the deaf feel less excluded from society.
How can the project be manufactured in the OpenLab?
Using the FabLab utilities will help us customize our glove to make it more wearable, for example by swapping some
of our sensors or reducing the number of wires.
Describe your project in detail
Nowadays the media and television have a great influence on people's knowledge and opinions; sometimes this is for the
greater good, and at other times it is just a distraction from the reality around us. In our case, the media has brought
to light the significance of the deaf/mute community's struggles, such as their inability to communicate with others as
naturally as we do. According to the World Federation of the Deaf, there are approximately 70 million deaf people whose
first language is sign language; that is a large number of people who have a hard time communicating their thoughts and
needs.
Technology is being applied to a wide variety of areas, such as medicine and education. However, inventors have
focused more on the everyday lives of the majority and have paid less attention to minorities. The minority we aim to
focus on is the deaf/mute community. We have come to realize that this community receives a smaller share of society's
attention than most of us do, largely because its members effectively speak another language. Taking this reality into
consideration, we noticed that we rarely see deaf/mute individuals in public places that are open to everybody, and
that is due to the lack of means of communication with the world around them. Our team believes that sign language
should be seen as another language rather than as a disability, or even an obstacle, standing in the way of equal
opportunity between this minority group and whatever they would like to take part in.

SignIt is a project that focuses on helping the deaf/mute community interact with other individuals in a simple and
efficient manner. It does so through a glove with diverse types of sensors mounted on top of it, each serving a
different function, that aid in detecting the movements made by the user. The system then compares the readings
against the ranges and measurements we have defined in order to identify which gesture is being performed and what it
corresponds to, and translates the intended gesture into speech.
SignIt is a technology meant for everyday contexts, whether simple tasks like making a trip to the grocery store or
grabbing some coffee. It is designed to prevent the deaf/mute's disability from acting as a barrier between them and
society; this disability should be seen as something easy to adapt to, not something hard to live with. All in all,
SignIt acts as a common ground between the deaf/mute and the individual to whom they are speaking, allowing the
deaf/mute to perform everyday tasks more easily with a glove that conveys their messages in a manner the surrounding
individuals can understand.

Describing the design in more detail, it makes use of two gloves connected together, carrying two small
microcontrollers and several sensors. The first glove holds flex sensors, pressure sensors, an accelerometer, and an
Emic 2 module, which produces the speech output for the gestures made. The second glove holds just one accelerometer,
used for additional accuracy. The sensors detect changes in movement, producing changes in readings that are then
compared with the ranges we have identified for each gesture; the matched gesture is then output in vocal form.
Here is a link for SignIt: