
Vocalytics, one of the many projects that came out of our Disrupt SF hackathon this weekend, wants to make you a better public speaker. The project uses machine learning to analyze videos of your performances and gives you feedback on your body language. The team trained the system to look for your hand gestures and pose, but the plan is to expand this project to also look at your eye gaze,…
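The write-up doesn't say which tooling the team used for the gesture-and-pose analysis, but to make the idea concrete, here is a minimal sketch of a per-frame pose and hand detector over a recorded talk, assuming Python with OpenCV and MediaPipe (both illustrative choices, not confirmed by the source; the filename is a placeholder):

```python
# Rough sketch: count frames in which a body pose and hands are detected.
# OpenCV + MediaPipe are assumptions for illustration only.
import cv2
import mediapipe as mp

mp_pose = mp.solutions.pose
mp_hands = mp.solutions.hands

def analyze_video(path: str) -> dict:
    """Return simple detection counts for a recorded performance."""
    cap = cv2.VideoCapture(path)
    stats = {"frames": 0, "pose_detected": 0, "hands_detected": 0}
    with mp_pose.Pose() as pose, mp_hands.Hands() as hands:
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)  # MediaPipe expects RGB
            stats["frames"] += 1
            if pose.process(rgb).pose_landmarks:
                stats["pose_detected"] += 1
            if hands.process(rgb).multi_hand_landmarks:
                stats["hands_detected"] += 1
    cap.release()
    return stats

print(analyze_video("talk.mp4"))  # placeholder path to a recorded talk
```

A real system would go further than detection counts, for example tracking how landmarks move over time to score gesture variety, but this gives a sense of the kind of signal such a tool can extract from video.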
via Zero Tech Blog