Scientists have developed an innovative software programme that watches a user's movements and gives spoken feedback on what to change to accurately complete a yoga pose.
"My hope for this technology is for people who are blind or low-vision to be able to try it out, and help give a basic understanding of yoga in a more comfortable setting," said project lead Kyle Rector, a University of Washington doctoral student in computer science and engineering.
The programme, called Eyes-Free Yoga, uses Microsoft Kinect software to track body movements and offer auditory feedback in real time for six yoga poses.
Rector wrote programming code that instructs the Kinect to read a user's body angles and then give verbal feedback on how to adjust his or her arms, legs, neck or back to complete the pose.
For example, the programme might say: "Rotate your shoulders left" or "Lean sideways toward your left".
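The article does not say how those commands are voiced. Purely as an illustration, a prototype could hand each correction string to an off-the-shelf text-to-speech library such as pyttsx3; this is an assumption for the sketch, not the library the researchers used:

```python
# Illustrative only: speaking a correction aloud with the pyttsx3
# text-to-speech library (an assumed stand-in, not Eyes-Free Yoga's code).
import pyttsx3

engine = pyttsx3.init()

def speak(command: str) -> None:
    """Read a single correction aloud and block until it finishes."""
    engine.say(command)
    engine.runAndWait()

speak("Rotate your shoulders left")
speak("Lean sideways toward your left")
```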
The result is an accessible yoga 'exergame' (a video game used for exercise) that lets people without sight interact verbally with a simulated yoga instructor.
Rector and collaborators Julie Kientz, a UW assistant professor in Human Centered Design & Engineering, and Cynthia Bennett, a research assistant in computer science and engineering, believe this can transform a typically visual activity into something that blind people can also enjoy.
Each of the six poses has about 30 different commands for improvement based on a dozen rules deemed essential for each yoga position. Rector worked with a number of yoga instructors to put together the criteria for reaching the correct alignment in each pose.
The Kinect first checks a person's core and suggests alignment changes, then moves to the head and neck area, and finally the arms and legs. It also gives positive feedback when a person is holding a pose correctly.
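That ordering (core first, then head and neck, then arms and legs) amounts to a prioritised list of rule checks. Here is a minimal sketch of how such rules might be represented; the joints, target angles, tolerances and messages are hypothetical examples, not the project's actual criteria:

```python
# Sketch of prioritised, rule-based pose feedback. All rule values and
# messages are illustrative assumptions, not the Eyes-Free Yoga rules.
from dataclasses import dataclass

@dataclass
class Rule:
    body_part: str    # "core", "head/neck" or "limbs"
    joint: str        # which tracked angle the rule checks
    target: float     # desired angle in degrees
    tolerance: float  # allowed deviation in degrees
    correction: str   # spoken command when the rule fails

# Core rules come first so they are checked and corrected first.
WARRIOR_II_RULES = [
    Rule("core", "hips", 180.0, 10.0, "Rotate your hips forward"),
    Rule("head/neck", "neck", 180.0, 10.0, "Look over your front hand"),
    Rule("limbs", "front_knee", 90.0, 10.0, "Bend your front knee more"),
    Rule("limbs", "arm_spread", 160.0, 10.0, "Raise your arms to shoulder height"),
]

def next_feedback(measured_angles: dict) -> str:
    """Return the highest-priority correction, or praise if all rules pass."""
    for rule in WARRIOR_II_RULES:
        if abs(measured_angles[rule.joint] - rule.target) > rule.tolerance:
            return rule.correction
    return "Great job, hold the pose"

angles = {"hips": 150.0, "neck": 178.0, "front_knee": 92.0, "arm_spread": 158.0}
print(next_feedback(angles))  # -> "Rotate your hips forward"
```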
Rector worked with 16 blind and low-vision people around Washington to test the programme and get feedback. Several of the participants had never done yoga before, while others had tried it a few times or attended yoga classes regularly.
Thirteen of the 16 people said they would recommend the programme, and nearly all said they would use it again.
The technology uses simple geometry and the law of cosines to calculate angles created during yoga. For example, in some poses a bent leg must be at a 90-degree angle, while the arm spread must form a 160-degree angle.
The Kinect reads the angle of the pose using cameras and skeletal-tracking technology, then tells the user how to move to reach the desired angle.
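Concretely, the angle at a joint can be recovered from the 3-D positions of three tracked skeleton points using the law of cosines, as the researchers describe. In the sketch below, the coordinates, the 90-degree target and the 10-degree tolerance are illustrative assumptions:

```python
import math

def distance(p, q):
    """Euclidean distance between two 3-D points (x, y, z)."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

def joint_angle(a, b, c):
    """Angle at joint b, in degrees, formed by points a-b-c, via the
    law of cosines: cos(B) = (ab^2 + bc^2 - ac^2) / (2 * ab * bc)."""
    ab, bc, ac = distance(a, b), distance(b, c), distance(a, c)
    cos_b = (ab**2 + bc**2 - ac**2) / (2 * ab * bc)
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_b))))

# Hypothetical hip, knee and ankle positions for a front leg.
hip, knee, ankle = (0.0, 1.0, 0.0), (0.0, 0.5, 0.3), (0.0, 0.0, 0.3)
angle = joint_angle(hip, knee, ankle)
if abs(angle - 90.0) > 10.0:
    print("Bend your front knee more" if angle > 90.0
          else "Straighten your front knee slightly")
else:
    print("Your front knee looks good")
```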
Rector and collaborators plan to make the technology available online so that users can download the programme, plug in their Kinect and start doing yoga.
"My hope for this technology is for people who are blind or low-vision to be able to try it out, and help give a basic understanding of yoga in a more comfortable setting," said project lead Kyle Rector, a University of Washington doctoral student in computer science and engineering.
The programme, called Eyes-Free Yoga, uses Microsoft Kinect software to track body movements and offer auditory feedback in real time for six yoga poses.
Rector wrote programming code that instructs the Kinect to read a user's body angles, then gives verbal feedback on how to adjust his or her arms, legs, neck or back to complete the pose.
For example, the programme might say: "Rotate your shoulders left" or "Lean sideways toward your left".
The result is an accessible yoga 'exergame', a video game used for exercise that allows people without sight to interact verbally with a simulated yoga instructor.
Rector and collaborators Julie Kientz, a UW assistant professor in Human Centered Design & Engineering, and Cynthia Bennett, a research assistant in computer science and engineering, believe this can transform a typically visual activity into something that blind people can also enjoy.
Each of the six poses has about 30 different commands for improvement based on a dozen rules deemed essential for each yoga position. Rector worked with a number of yoga instructors to put together the criteria for reaching the correct alignment in each pose.
The Kinect first checks a person's core and suggests alignment changes, then moves to the head and neck area, and finally the arms and legs. It also gives positive feedback when a person is holding a pose correctly.
Rector worked with 16 blind and low-vision people around Washington to test the programme and get feedback. Several of the participants had never done yoga before, while others had tried it a few times or took yoga classes regularly.
Thirteen of the 16 people said they would recommend the programme and nearly everyone would use it again.
The technology uses simple geometry and the law of cosines to calculate angles created during yoga. For example, in some poses a bent leg must be at a 90-degree angle, while the arm spread must form a 160-degree angle.
The Kinect reads the angle of the pose using cameras and skeletal-tracking technology, then tells the user how to move to reach the desired angle.
Rector and collaborators plan to make this technology available online so users could download the programme, plug in their Kinect and start doing yoga.