Abstract presentation

Sensory input and its application to machine learning in robotics (TWIP)
Titus Mickley

Perform Laboratory

Advisor: Kamran Binaee

Humans use their eyes to perform various actions with little effort. Currently, even advanced robots have trouble correctly predicting quantities such as an object's location, its position over time, and the force to apply when interacting with it.

By recording a human subject's gaze movements and hand/eye coordination in a controlled virtual space, using VR (virtual reality), as the subject repeatedly attempted to catch a ball, we were able to bypass uncontrollable variables that would be encountered in reality, accurately control experimental parameters, and simplify the collection of the subject's performance data.

From these recordings we analyzed target-interception paradigms (patterns of intercepting or missing the ball) based on the sensory input from the subject's hand/eye coordination within the virtual space where the ball was repeatedly caught. The outcome of the analysis could assist in the development of applications using machine learning technology, by informing mathematical parameters designed for programming, and technological implementations specific to robotics coordination software.
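To make the interception-paradigm idea concrete, here is a minimal, hypothetical sketch of how a recorded trial might be labeled as a catch or a miss. The function names, data layout, and the 0.1 m catch radius are illustrative assumptions, not the lab's actual pipeline.

```python
import math

def min_hand_ball_distance(hand_positions, ball_positions):
    """Smallest Euclidean distance between hand and ball across a trial.

    Each argument is a list of (x, y, z) samples in meters, aligned in time.
    """
    return min(math.dist(h, b) for h, b in zip(hand_positions, ball_positions))

def label_trial(hand_positions, ball_positions, catch_radius=0.1):
    """Label a trial 'catch' if the hand ever came within catch_radius meters."""
    # Hypothetical threshold rule: a trial counts as a catch when the hand
    # gets close enough to the ball at any sampled time step.
    d = min_hand_ball_distance(hand_positions, ball_positions)
    return "catch" if d <= catch_radius else "miss"

# Example: two short trials sampled at a few time steps (x, y, z in meters).
trial_hit = ([(0.0, 1.0, 0.0), (0.5, 1.0, 0.0)],
             [(1.0, 1.0, 0.0), (0.55, 1.0, 0.0)])
trial_miss = ([(0.0, 1.0, 0.0), (0.2, 1.0, 0.0)],
              [(1.0, 1.0, 0.0), (0.8, 1.0, 0.0)])
print(label_trial(*trial_hit))   # catch
print(label_trial(*trial_miss))  # miss
```

A per-trial label like this, paired with the synchronized gaze and hand trajectories, is the kind of structured data a machine learning model for robotic interception could be trained on.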

Purpose
Method 
Scope
