Posts from July, 2017

Day 18

Today I hastily finished a portion of my outline draft in time to hand it in, but it is far from complete. The majority of the day consisted of more paper summaries, but later I finally switched it up by running an experiment on Emily, where we successfully collected the data we need. Unfortunately, the data we reviewed was biased, so it will be of little to no use in any final analysis. For the rest of the day I sat down with Kamran and began to decipher OpenAI code and how we can use it in our 3D simulation.

Outline DRAFT

Modelling Human Hand-Eye Coordination Using Recurrent Neural Networks

Introduction (what the presentation is going to discuss / introduce the project)
  - Rationale
      - The goal of this project is to …
  - Background
      A. Recurrent neural network (RNN) (a minimal code sketch follows this outline)
          - An RNN is a class of artificial neural network where connections between units form a directed cycle, allowing dynamic temporal behaviour.
          - Dynamic temporal behaviour: the trajectory of states, in a state space, followed by a system during a certain time interval.
      B. Machine / reinforcement / deep learning
      C. Eye movements and their uses for navigation and coordination
      D. Virtual Reality simulation
          - Explain VR first (present the uses of VR in the experiment, e.g. controlled variables and easy data collection)
  - Method
      - Collect hand, head, and gaze data from various subjects age ?-? (<= too specific?) using VR (go into more
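For my own reference, here is a minimal sketch of what a single RNN step looks like in plain numpy, just to illustrate the "directed cycle" and "trajectory of states" wording above; the sizes and weight names are made up and this is not the project's actual model.

import numpy as np

# A minimal vanilla RNN cell: the hidden state h feeds back into itself at
# every time step, which is what gives the network dynamic temporal behaviour.
# All sizes and weight names here are illustrative only.

rng = np.random.default_rng(0)
input_size, hidden_size = 3, 8          # e.g. a 3D hand/gaze sample per time step

W_xh = rng.normal(scale=0.1, size=(hidden_size, input_size))   # input -> hidden
W_hh = rng.normal(scale=0.1, size=(hidden_size, hidden_size))  # hidden -> hidden (the cycle)
b_h  = np.zeros(hidden_size)

def rnn_step(x_t, h_prev):
    """One time step: the new state depends on the current input and the previous state."""
    return np.tanh(W_xh @ x_t + W_hh @ h_prev + b_h)

# Run a short fake sequence; the list of hidden states is the "trajectory of
# states in a state space" mentioned in the outline.
h = np.zeros(hidden_size)
trajectory = []
for x_t in rng.normal(size=(5, input_size)):   # 5 made-up time steps
    h = rnn_step(x_t, h)
    trajectory.append(h)

print(len(trajectory), trajectory[-1].shape)   # 5 states, each of dimension 8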

Day 17

Today was a far shorter day than usual. The day consisted of more research summaries, but I did run into a snag: the sites the papers are posted on require payment to access them from outside RIT, so I could only analyse the abstracts for now. That's it for this week; now I just have to do the outline over the weekend and catch up on Game of Thrones.

Day 16

Couldn't do the blog yesterday, as by the time I got home it was already the next day anyway. The day consisted of a few hours of work at the lab and helping the other lab with a few experiments, then going to the CEK observatory, where unfortunately no astronomical sights could be viewed due to cloud coverage. I did get a lot of cool photos of the observatory and everything relating to the scope. We did not get back to RIT until very late in the night, and I did not reach my residence until the next day. While waiting, I contacted my adviser and asked if I could come in later the next day, to which he replied that I could work from home if I wanted, which I accepted.

Day 15

Couldn't get the blog done yesterday because the internet was down where I live. Anyway, yesterday was a far more interactive day in that most of it was allotted to eye tracking data collection, both on myself and others. After some successful, then unsuccessful, attempts at collecting data, the rest of the day was mainly spent helping Kamran perform an experiment on himself, then sitting down and figuring out how to get the code to work and how to improve the visualization with a vector-based trace from the "cyclopeaneyenode" to the ball.
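For the record, here is a rough sketch of how that kind of trace could be drawn, assuming Vizard's on-the-fly line primitives (viz.startLayer / viz.endLayer); the two spheres below stand in for the real eye node and ball, which the experiment script creates elsewhere under its own names, so this is not the lab's actual code.

import viz
import vizact
import vizshape

# Rough sketch of the vector trace: a single line segment redrawn every frame
# from the cyclopean eye node to the ball. The spheres are placeholders for
# the real scene nodes.

viz.go()

cyclopeanEyeNode = vizshape.addSphere(radius=0.02)   # placeholder for the eye node
ball = vizshape.addSphere(radius=0.05)               # placeholder for the ball
cyclopeanEyeNode.setPosition([0, 1.6, 0])
ball.setPosition([0, 1.2, 2])

viz.startLayer(viz.LINES)
viz.vertexColor(viz.RED)
viz.vertex(0, 0, 0)   # start point (eye), updated each frame
viz.vertex(0, 0, 0)   # end point (ball), updated each frame
gazeLine = viz.endLayer()

def updateGazeLine():
    eyePos = cyclopeanEyeNode.getPosition(viz.ABS_GLOBAL)
    ballPos = ball.getPosition(viz.ABS_GLOBAL)
    gazeLine.setVertex(0, eyePos[0], eyePos[1], eyePos[2])
    gazeLine.setVertex(1, ballPos[0], ballPos[1], ballPos[2])

vizact.ontimer(0, updateGazeLine)   # redraw the trace every frame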

Day 14

[Image: simplified 2D force balance chart from the 1989 paper]
Today was the basic 9-to-5 at the office, pushing papers. The entire day I analysed various papers, all relating to differing methods for robot navigation and hand control. After reading 5 of them, I created summaries of the information that was new and important to my understanding. I will do the same thing tomorrow with the last 5 papers. The attached chart explains the simplified 2D force balance which a robotic hand with n fingers must maintain in order to avoid dropping or crushing an object. This only applies to one method of control, but so far this report seems to be the most accurate I've seen and offers much that someone could expand upon, especially since it is from 1989 and had nearly none of the technology available today for robotics.
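To jot down the core idea in my own words (this is only a toy restatement, not the paper's actual model): the friction produced by the fingers' squeeze forces has to at least cancel gravity, while each squeeze force stays below the level that would damage the object. All of the numbers below are made up.

# Toy 2D grip-force window: with n fingers gripping an object, the friction
# generated by the normal (squeeze) forces must support the weight, while each
# normal force stays below the crushing limit.

n = 2            # number of fingers in contact
mu = 0.5         # assumed friction coefficient at each contact
m = 0.3          # object mass in kg
g = 9.81         # gravity, m/s^2
F_crush = 15.0   # assumed maximum normal force before the object is damaged (N)

# Minimum normal force per finger so total friction supports the weight:
#   n * mu * F_normal >= m * g
F_min = m * g / (n * mu)

print(f"Grip window per finger: {F_min:.2f} N to {F_crush:.2f} N")
if F_min > F_crush:
    print("No feasible grip: the object slips before it can be held safely.")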

Day 13

Today was research day. For the entire duration of the day I looked up and read over 10 research papers, all relating to robot navigation and the differing paradigms that can be used in different situations, such as global methods ("environment surrounding the robot is known and the path which avoids the obstacle is selected") or local methods ("environment surrounding the robot is unknown, and sensors are used to detect the obstacles and avoid collision").
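To keep the distinction straight in my head, here is a toy contrast of the two paradigms in Python; the grid, the simulated sensor, and the greedy local rule are all made up for illustration and are not taken from the papers.

from collections import deque

GRID = [  # 0 = free, 1 = obstacle (fully known only to the global planner)
    [0, 0, 0, 0],
    [1, 1, 0, 1],
    [0, 0, 0, 0],
    [0, 1, 1, 0],
]
MOVES = [(1, 0), (-1, 0), (0, 1), (0, -1)]

def global_plan(grid, start, goal):
    """Global method: the environment is known, so plan the whole path up front (BFS)."""
    queue, came_from = deque([start]), {start: None}
    while queue:
        cell = queue.popleft()
        if cell == goal:
            path = []
            while cell is not None:
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        for dr, dc in MOVES:
            r, c = cell[0] + dr, cell[1] + dc
            if 0 <= r < len(grid) and 0 <= c < len(grid[0]) \
                    and grid[r][c] == 0 and (r, c) not in came_from:
                came_from[(r, c)] = cell
                queue.append((r, c))
    return None

def local_step(position, goal, sense_obstacle):
    """Local method: pick one step at a time using only nearby sensor readings."""
    candidates = sorted(
        MOVES,
        key=lambda m: abs(goal[0] - position[0] - m[0]) + abs(goal[1] - position[1] - m[1]),
    )
    for dr, dc in candidates:
        step = (position[0] + dr, position[1] + dc)
        if not sense_obstacle(step):
            return step
    return position  # boxed in: stay put

print("Global path:", global_plan(GRID, (0, 0), (3, 3)))

# Simulated range sensor: reports whether a neighbouring cell is blocked.
sensor = lambda cell: not (0 <= cell[0] < 4 and 0 <= cell[1] < 4) or GRID[cell[0]][cell[1]] == 1
pos, goal = (0, 0), (3, 3)
for _ in range(12):
    if pos == goal:
        break
    pos = local_step(pos, goal, sensor)
print("Local method ends at:", pos)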

Day 12

Today was the slowest day of them all. After arriving, I attempted to edit my abstract to clean it up a bit, and realised I was missing some crucial information needed to explain it to a lay audience. I then spent the majority of the day researching methods for implementing the new code into the simulation, to no avail. Later in the day, Aditi and I had a teleconference with Rakshit to analyse data; he then told us to collect data on a non-biased individual, which was funny, as three of our fellow interns arrived minutes before the instruction. Unfortunately only one session could be recorded, as the left eye of the gaze tracker became dysfunctional for reasons unknown. The day ended with a relaxing hour of testing VR games with most of my fellow interns, which helped me unwind from the week's fatigue. I will travel back to RIT on the weekend to hopefully finish all of my high school summer assignments and possibly mess around with more VR and DOOM modding if I can find the time.

Day 11

Image
After spending about an hour and a half getting the laptop set up with everything necessary for the project, I spent the rest of the day coding, save for the hour-long meeting at 2:00 where the entire floor involved in eye tracking research met up and shared their projects and progress.

Day 10

I have officially made it 10 days without failing anyone's expectations. Today was a very simple day, as Aditi and I worked on the exact same code the entire day to correctly determine separate rotational velocities for the right, left, and cyclopean eye. I also learned that it is possible to put pictures into the blogs; I will revise them all and put the appropriate code for each day into them via screenshot, beginning tomorrow.
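The basic calculation behind a rotational velocity is just the angle between consecutive gaze direction vectors divided by the sample interval; the sketch below is a generic version of that idea with made-up sample data, not the code we actually wrote, and the same function would be run on the left, right, and cyclopean eye separately.

import numpy as np

def angular_velocity(gaze_dirs, dt):
    """gaze_dirs: (N, 3) array of gaze direction vectors; returns deg/s per consecutive pair."""
    dirs = gaze_dirs / np.linalg.norm(gaze_dirs, axis=1, keepdims=True)
    dots = np.clip(np.sum(dirs[:-1] * dirs[1:], axis=1), -1.0, 1.0)
    angles = np.degrees(np.arccos(dots))     # rotation between consecutive samples
    return angles / dt

dt = 1.0 / 120.0                              # e.g. a 120 Hz eye tracker (assumed rate)
right_eye = np.array([[0.00, 0.00, 1.0],      # made-up gaze samples
                      [0.02, 0.00, 1.0],
                      [0.05, 0.01, 1.0]])
print(angular_velocity(right_eye, dt))        # deg/s for each consecutive pair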

Day 9

I began the day by successfully modding my copy of DOOM with Project Brutality v2.0.3, which I can't wait to play around with. When actually beginning work, Aditi and I spent the entire day coding on Rakshit's and Kamran's projects. We found far more luck with Kamran's code; Rakshit's unfortunately was not functional for unknown reasons, which I believed to be Spyder. Other than that, we read about more AI development and machine learning.

Day 8

A steady start to the week; many loose ends were tied up today. After finally finishing Rakshit's homework, Aditi and I spent the majority of the day coding on several projects, one of which was retyping a file of lost code. The file was successfully reconstructed and sent to Kamran for analysis and extra modifications. For once, Aditi was the one to instigate a "walk around break" at the end of the day; maybe I'm rubbing off on her a bit?

Abstract presentation

Sensory input and its application to machine learning in robotics (TWIP)
Titus Mickley
Perform Laboratory
Advisor: Kamran Binaee

Humans use their eyes to perform various actions with little effort. Currently, advanced robots have trouble correctly predicting (location, position, force applied, etc.?). By recording a human's gaze movements and hand/eye coordination in a controlled virtual space, using VR (virtual reality), as the subject repeatedly attempted to catch a ball, we were able to bypass uncontrollable variables such as ... which would be encountered in reality, to accurately control parameters, and to allow for simplified collection of the subject's performance data. It is possible to ... by recording target interception paradigms (patterns of intercepting or missing the ball), using sensory input from hand/eye coordination, within a virtual space where a ball was repeatedly caught. Experimentation was performed in a virtual space to accurately control parameters, and allo

Day 7

Today was the last day before the weekend. Not much was accomplished in terms of quantity, but what little was done had great quality to it. The entirety of the day was spent programming data sets and vector sets into Vizard. After devising a working method over several hours, it felt amazing to finally have an approach that not only worked, but could be applied to other projects currently facing the same problem. For the most part it was Aditi who did the research and method creation for the code's setup; I was mainly the one typing, and I came up with other methods, most unsuccessful, but some were in fact exactly what we were missing. After my long drive home I had some time to think back on the week and realised that I have not only done more work than I have since Australia, but that I've learned more about what could deeply affect my future, if not the rest of my life. I owe an immense amount to my mentors Kamran and Rakshit, along with everyone else who enters the lab.

Day 6

Today I was dropped off at RIT, which allowed me to refresh my mind with some extra sleep. Today was a notably slow day: first Aditi and I did more vector math programming, then we attempted to work on Kamran's code more but ran into a problem and moved on, and finally we recorded and analysed more eye tracking video data. Nothing truly notable about today, but I hope tomorrow has more in store for me.

Day 5

Today was a very productive day. Aditi and I worked on the Vizard code for half the day and implemented various functions for the animation, all bound to specific keyDown commands. This includes the ability to pause, play, go forward by a frame, go backwards by a frame, raise and lower the view by 1 metre, and finally change from a fixed-position camera view to a free-cam mode. If not for Aditi's knowledge of code vocabulary, it would have taken me forever to achieve the same results. The second half of the day consisted of analysing a small amount of eye-tracking footage I captured last week. After hours of analysis, I came to understand many concepts and methods relating to eye travel and reflexes, as well as a fundamental concept of how the human body actually uses data to move and turn, prompting me to think more in-depth about possible research tasks for the future. I also found out that other interns can already see the blogs; I do hope my content isn't too boring and formal.
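As a reminder of how those bindings work, here is a stripped-down sketch assuming Vizard's vizact.onkeydown; the playback functions only print and count frames here, and the key choices and function names are placeholders rather than our actual script.

import viz
import vizact

viz.go()

state = {'frame': 0, 'playing': False}

def togglePlay():
    state['playing'] = not state['playing']
    print('playing' if state['playing'] else 'paused')

def stepForward():
    state['frame'] += 1
    print('frame', state['frame'])

def stepBackward():
    state['frame'] = max(0, state['frame'] - 1)
    print('frame', state['frame'])

def raiseView():
    x, y, z = viz.MainView.getPosition()
    viz.MainView.setPosition(x, y + 1.0, z)   # raise the viewpoint by 1 metre

vizact.onkeydown(' ', togglePlay)      # space: pause/play
vizact.onkeydown('.', stepForward)     # step forward one frame
vizact.onkeydown(',', stepBackward)    # step backward one frame
vizact.onkeydown('r', raiseView)       # raise the viewpoint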

Day 4

After my usual morning routine of bussing and the morning meeting, Aditi and I began work on deciphering Mr. Binaee's virtual experiment, cutting out some unnecessary code, and learning how it all functioned. This task was supposed to take two days; we achieved it in just a couple of hours. After we hit a bug in the code, which once resolved turned out to be an inaccuracy in Vizard's file detection system and the improper placement of vital files, the rest of the day was a mixture of vector mathematics, which Aditi taught me to solve using Python, a small meeting allowing Kamran to see how we were both faring in the new environment, and finally tampering with more code. Our task for tomorrow should be simple, but I will wait and see.
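For anyone curious, the vector math was the standard numpy toolkit; below is the kind of basic operation we practised, shown with made-up vectors rather than real experiment data.

import numpy as np

a = np.array([1.0, 2.0, 2.0])
b = np.array([0.0, 3.0, 4.0])

length_a = np.linalg.norm(a)              # magnitude: 3.0
unit_a = a / length_a                     # direction only (unit vector)
dot_ab = np.dot(a, b)                     # 1*0 + 2*3 + 2*4 = 14
cross_ab = np.cross(a, b)                 # vector perpendicular to both
angle = np.degrees(np.arccos(dot_ab / (length_a * np.linalg.norm(b))))

print(length_a, dot_ab, cross_ab, round(angle, 1))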

Day 3

After a long drive back from my house in Athens, PA, I arrived a little later than I usually would, at around 8 in the morning. After the morning meeting, Aditi and I returned to the lab and finished all the Vizard tutorials we would need to do. We then waited about an hour with no duties until Mr. Binaee arrived, minutes before lunch break. After lunch, Aditi and I visited another lab on our floor with a group working toward similar goals with similar technology to ours. We then brought back any new knowledge to the team and continued until the end of the day doing a variety of coding and modelling activities. On a final note, it does not feel like it has been only 3 days; this summer is going to be one to remember.

Day 2

Upon arrival at RIT early in the morning, I entered the interactive lab and proceeded to work on Vizard programming for an hour and a half. I then attended the daily meeting and returned to my work. I brought Aditi up to speed on my progress and we continued to program until noon. Come noon, Aditi and I were introduced to Rakshit Kothari, another of our mentors. After brief greetings with Mr. Rakshit, we all drove to Chipotle and ordered a meal, all while recording Mr. Rakshit's eye tracking movements on his laptop. When we returned to the lab, Rakshit went into great depth about the nature of his experiments and his passion for the technology, something I deeply related to. We then analysed the recording and noted multiple spatial patterns of the eye and its erratic, yet precise, movements. At 2:30pm most of the researchers convened in a meeting and shared how they were faring in their studies. For the rest of the day, I conducted more eye tracking experiments.

Day 1

My first day at RIT was certainly an interesting one. Arriving early in the morning, I waited for my fellow interns to arrive at the meeting room. I was unfamiliar with most of the interns and set about befriending them. When the first intern arrived, I quickly realised we knew each other from my days at Tioga High. After briefly catching up, many other interns swiftly arrived, and I introduced myself to them and asked their names and other small-talk information. I don't believe I will remember their names very well; I've never been too good at that. After the introductory meeting with Mr. Pow, we travelled to the Red Barn and participated in a variety of team-building exercises. Through these trials I got to better understand and cooperate with my newfound colleagues, as well as assess their skills. After travelling back to Building 76, we were treated to a free pizza lunch, and I took the opportunity to better acquaint myself with my colleagues. Next I was sent to do my in-p