Are you surprised? Don’t be. Once again, technology has proved its worth. A team of computer science researchers from the University of Washington, Kyle Rector, Cynthia Bennett, and Julie Kientz, has designed a software program that helps blind people do yoga and stay healthy.
The team built software that watches a user’s movements and gives audible feedback, telling the user what to change to achieve a correct yoga pose. The program, named “Eyes-Free Yoga,” uses the Microsoft Kinect to track the user’s body movements. It provides audible feedback for six yoga poses, including Tree, Chair, Warrior I, and Warrior II. Each pose has roughly 30 different correction commands, based on a set of rules describing what each position requires. The technology relies on simple geometry and the law of cosines to calculate the angles a user’s body forms during each pose.
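The article does not publish the team’s implementation, but the law-of-cosines angle check it describes can be sketched as follows. The joint coordinates and the `joint_angle` helper below are illustrative assumptions, not the project’s actual code:

```python
import math

def joint_angle(a, b, c):
    """Angle at joint b (in degrees) formed by segments b-a and b-c.

    a, b, c are (x, y) or (x, y, z) coordinates, e.g. the shoulder,
    elbow, and wrist positions reported by a skeleton tracker.
    Law of cosines: cos(B) = (ab² + bc² − ac²) / (2·ab·bc).
    """
    def dist(p, q):
        return math.sqrt(sum((pi - qi) ** 2 for pi, qi in zip(p, q)))

    ab, bc, ac = dist(a, b), dist(b, c), dist(a, c)
    cos_b = (ab ** 2 + bc ** 2 - ac ** 2) / (2 * ab * bc)
    # Clamp to guard against floating-point drift outside [-1, 1].
    cos_b = max(-1.0, min(1.0, cos_b))
    return math.degrees(math.acos(cos_b))

# A fully extended arm: shoulder, elbow, wrist in a straight line.
print(round(joint_angle((0, 0), (1, 0), (2, 0))))  # 180
# A right-angle bend at the elbow.
print(round(joint_angle((0, 0), (1, 0), (1, 1))))  # 90
```

Comparing such measured angles against the target angle for each pose is what lets the program decide whether a limb needs correcting.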
Kyle Rector, the project lead and a UW doctoral student in computer science and engineering, wrote the code that instructs the Kinect to read the user’s body and determine the angles formed during each yoga position. The program then gives verbal feedback, telling the user how to adjust an arm, leg, neck, or torso to complete the pose correctly. The result is an accessible “exergame”: an exercise video game that lets people who are visually impaired hold a full conversation with a simulated yoga instructor, which guides them into an accurate pose. Rector started the project with a vision of technology that lets blind or low-vision people learn the basics of yoga poses in a comfortable setting.
Julie Kientz, an assistant professor in Human Centered Design and Engineering at the University of Washington, and Cynthia Bennett, a research assistant in computer science and engineering, are Rector’s collaborators. They believe this activity can be a source of enjoyment and encouragement for visually impaired people. Rector and collaborators plan to make the technology available online, so users can download the program, plug in a Kinect, and start doing yoga. If the launch is successful, many people will be able to download it and exercise with the audio yoga instructor.
The team worked with a number of yoga instructors to establish the alignment criteria for each yoga position. The Kinect software first observes the user’s pose from the head and neck, then the arms, and finally the legs. It then gives audible suggestions in the same order: head and neck first, then arms, then legs. The software also gives positive feedback when the user holds a pose correctly. For example, the program might say “Rotate your left leg,” “Lean sideways to your right,” “Rotate your shoulder left,” or simply “Good job.”
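The head-to-legs feedback order described above suggests a prioritized rule list: check rules top-down and speak only the first correction that fails, so the user fixes one thing at a time. This is a sketch under that assumption; the joint names, target angles, tolerances, and phrases below are illustrative, not the team’s actual rules:

```python
# Hypothetical prioritized rules: (region, joint, target degrees,
# tolerance, spoken correction). Ordered head -> arms -> legs, matching
# the feedback order the article describes.
RULES = [
    ("head", "neck", 180, 15, "Straighten your neck"),
    ("arms", "left_elbow", 180, 20, "Straighten your left arm"),
    ("arms", "right_elbow", 180, 20, "Straighten your right arm"),
    ("legs", "left_knee", 90, 20, "Bend your left knee more"),
]

def feedback(angles):
    """Return the first spoken correction whose joint angle is out of
    tolerance, or positive feedback when every rule passes.
    `angles` maps joint names to measured angles in degrees."""
    for _region, joint, target, tol, message in RULES:
        if abs(angles[joint] - target) > tol:
            return message
    return "Good job"

# Neck and right arm are fine, but the left elbow is 40 degrees off,
# so that correction is spoken first.
print(feedback({"neck": 178, "left_elbow": 140,
                "right_elbow": 179, "left_knee": 92}))
# -> Straighten your left arm
```

Returning only one correction at a time keeps the audio feedback from overwhelming the user, which matters when speech is the only channel available.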
Rector tested the program with 16 blind and low-vision people in and around Washington. Many of the participants had never done yoga before; some had tried it a few times, and others had taken regular yoga classes. Thirteen of the 16 were happy and satisfied with the software and its feedback, and recommended that others use it regularly.
Thus, with this unique combination of computer and Kinect, “Eyes-Free Yoga” can act as a virtual yoga instructor, providing spoken feedback for visually impaired people.