Easy Tasks AI

By Jia Ruparel and Veer Ruparel

The Problem

Do you have elderly people or people with special needs in your house? If you do, then you know they face many problems in their daily lives, such as getting tired quickly or having trouble moving around. We have created a solution to some of these basic problems using Machine Learning.

About Our Device

We have created a prototype that performs specific tasks based on hand gestures.

Based on hand gestures, our device can do things everyone wishes they could do while just sitting down:

- It can turn the lights and the TV on and off.

- It can also increase and decrease the volume.

Our device can also help in other situations, like:

- Calling 911 in emergencies.

- Helping people who can't talk by translating sign language into text we can understand.

Imagine doing all these things with just a simple hand gesture!



(Look for the Light_On, Light_Off, Tv_On, Tv_Off, and Emergency commands triggered through IFTTT, and the Volume Up and Volume Down commands powered by the "pycaw" library.)
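For anyone curious how these two back-ends can be driven from Python, here is a minimal sketch. It assumes a Windows machine for pycaw and an IFTTT Webhooks key (IFTTT_KEY is a placeholder); it is an illustration, not the exact project code.

```python
# Minimal sketch of the two action back-ends: an IFTTT Webhooks trigger for the
# smart-home/emergency commands and pycaw for the volume commands (Windows only).
# IFTTT_KEY is a placeholder; replace it with a real Webhooks key.
from ctypes import POINTER, cast

import requests
from comtypes import CLSCTX_ALL
from pycaw.pycaw import AudioUtilities, IAudioEndpointVolume

IFTTT_KEY = "YOUR_IFTTT_WEBHOOKS_KEY"


def trigger_ifttt(event: str) -> None:
    """Fire an IFTTT Webhooks event such as 'Light_On' or 'Emergency'."""
    url = f"https://maker.ifttt.com/trigger/{event}/with/key/{IFTTT_KEY}"
    requests.post(url, timeout=5)


def change_volume(step: float) -> None:
    """Raise or lower the system master volume by `step` (e.g. +0.1 or -0.1)."""
    speakers = AudioUtilities.GetSpeakers()
    interface = speakers.Activate(IAudioEndpointVolume._iid_, CLSCTX_ALL, None)
    volume = cast(interface, POINTER(IAudioEndpointVolume))
    current = volume.GetMasterVolumeLevelScalar()  # value between 0.0 and 1.0
    volume.SetMasterVolumeLevelScalar(min(1.0, max(0.0, current + step)), None)


# Example: turn the lights on and nudge the volume up one step.
trigger_ifttt("Light_On")
change_volume(+0.1)
```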





SOFTWARE (ML)

To make our device, we used Machine Learning. We trained the computer to label specific hand gestures; for example, a thumbs up means 'Light On'. To do this, we used a Machine Learning platform called Lobe: we took pictures of ourselves giving a thumbs up, and the computer trained itself to recognize a thumbs up as 'Light On'. This is why it is called Machine Learning: if you give the computer examples as input, it learns and trains itself from that input.
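As a rough sketch of how a Lobe-trained model can classify a webcam frame, the example below assumes the model was exported in TensorFlow Lite format; the file names "gesture_model.tflite" and "labels.txt" and the 0-1 float preprocessing are assumptions, since the real Lobe export layout may differ.

```python
# Sketch: classify one webcam frame with a Lobe-trained model exported to
# TensorFlow Lite. File names and preprocessing are assumptions.
import cv2
import numpy as np
import tensorflow as tf

interpreter = tf.lite.Interpreter(model_path="gesture_model.tflite")
interpreter.allocate_tensors()
input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()
labels = [line.strip() for line in open("labels.txt")]


def classify(frame):
    """Return the gesture label predicted for a BGR webcam frame."""
    height, width = (int(d) for d in input_details[0]["shape"][1:3])
    image = cv2.resize(frame, (width, height))
    image = cv2.cvtColor(image, cv2.COLOR_BGR2RGB).astype(np.float32) / 255.0
    interpreter.set_tensor(input_details[0]["index"], image[np.newaxis, ...])
    interpreter.invoke()
    scores = interpreter.get_tensor(output_details[0]["index"])[0]
    return labels[int(np.argmax(scores))]


if __name__ == "__main__":
    cap = cv2.VideoCapture(0)      # default webcam
    ok, frame = cap.read()
    if ok:
        print(classify(frame))     # e.g. "Light_On" for a thumbs up
    cap.release()
```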

HARDWARE (Jetson Nano/Raspberry Pi)

We used a Jetson Nano to create our device. The Jetson Nano is a small computer with a lot of processing power, and it is well suited for AI. We created a 3D-printed case and put our Jetson Nano inside it to build the hardware. We exported our program onto the Jetson Nano, but since our project is image based we needed a camera. We had one, but unfortunately it broke! Without a camera our device cannot work. We also tried a Raspberry Pi, but it didn't work either, and it has less processing power, so the tasks would have taken much longer even if it had worked. We then figured out a way to connect our computer directly to the lights and TV so we could do all the tasks using our computer's webcam. This was one of the biggest challenges we faced!
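Below is a hedged sketch of how the webcam loop on the computer could tie everything together, reusing the classify(), trigger_ifttt(), and change_volume() helpers from the sketches above (assumed to be defined in the same file); the exact gesture label names are assumptions.

```python
# Sketch of the main loop on the laptop: read a webcam frame, classify the
# gesture, and dispatch the matching action. Helper functions come from the
# sketches above; label strings are assumed, not taken from the project code.
import time

import cv2

ACTIONS = {
    "Light_On":    lambda: trigger_ifttt("Light_On"),
    "Light_Off":   lambda: trigger_ifttt("Light_Off"),
    "Tv_On":       lambda: trigger_ifttt("Tv_On"),
    "Tv_Off":      lambda: trigger_ifttt("Tv_Off"),
    "Emergency":   lambda: trigger_ifttt("Emergency"),
    "Volume_Up":   lambda: change_volume(+0.1),
    "Volume_Down": lambda: change_volume(-0.1),
}

cap = cv2.VideoCapture(0)      # the computer webcam standing in for the broken camera
while True:
    ok, frame = cap.read()
    if not ok:
        break
    label = classify(frame)    # predicted gesture label
    action = ACTIONS.get(label)
    if action is not None:
        action()
        time.sleep(2)          # simple debounce so one gesture fires only once
cap.release()
```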

FUTURE APPLICATIONS

Our device can become even better in the future! Over time, as Machine Learning improves, our device can become even more accurate and turn into a real product. If our device becomes 100% accurate, it will really be able to help people who can't talk and make their lives much easier. Using ML, many other basic tasks could be done with hand gestures in the future. Our device can pave the way for creating different types of AI. There are a lot of voice-based AIs, like Amazon Alexa, Google Home, etc. We can build an image-based AI that does image classification. In the future, if the camera for our Jetson Nano is working, you could have a portable device in your house that performs tasks based on your hand gestures!

* We built both the hardware and the software for this device, but the hardware's camera was broken. Our device is all about analyzing hand gestures, which requires a camera, so we had to use our computer's webcam to get the project working and could not use our hardware. *

CHECK OUT THE DEMO VIDEO BELOW!



(Demo video: mlDemo)

About Us

This device was made by Veer and Jia; we wanted to help our grandfather by making it!

Your input: Give suggestions on how to improve our device by filling out the form below.

https://forms.gle/1L9cACsmcw72v8kM8

Things we used: Lobe, Teachable Machine, Jetson Nano, Raspberry Pi