Last summer I worked with Dr. Raja Kushalnagar on a project exploring the idea of “closed interpreting,” which is similar to closed captioning but displays a sign language interpreter instead of English text. Such a system could enhance deaf and hard of hearing individuals’ understanding of online videos. It would be especially useful for online learning, where professors upload videos of their lectures. These lecture videos are often not fully accessible to deaf students. The aim of this project is to improve accessibility and give deaf and hard of hearing students an equal opportunity to succeed.
For my project, I created a webpage that displayed a lecture video with an interpreter video alongside it, which could be toggled on and off. I created three different versions: 1) the interpreter video was static and remained next to the lecture video; 2) the interpreter video moved along with the information in the lecture video, to help the viewer more easily switch their gaze between the interpreter and the lecture materials; 3) the viewer could adjust and move the interpreter video at their discretion. We recruited deaf and hard of hearing participants from the Rochester, NY area to take part in the experiment and answer survey questions. Eye-tracking data was collected during the experiments to see whether it supported the conclusions drawn from the surveys.
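The three placement modes can be sketched as a single positioning function. This is only an illustrative sketch, not the actual study code; the function name `placeInterpreter` and all coordinates are assumptions made for the example.

```javascript
// Sketch of the three interpreter-video placement modes (illustrative only):
//   "static"    -> fixed slot beside the lecture video
//   "follow"    -> offset next to the lecture region currently being discussed
//   "draggable" -> wherever the viewer has dragged it
function placeInterpreter(mode, { focus, userPos } = {}) {
  switch (mode) {
    case "static":
      // Fixed position to the right of a 640px-wide lecture video (assumed size).
      return { x: 640, y: 0 };
    case "follow":
      // Track the active lecture content with a small horizontal offset.
      return { x: focus.x + 20, y: focus.y };
    case "draggable":
      // Position chosen by the viewer.
      return userPos;
    default:
      throw new Error(`unknown mode: ${mode}`);
  }
}
```

For example, `placeInterpreter("follow", { focus: { x: 100, y: 50 } })` would position the interpreter just to the right of the lecture content at (100, 50).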
My work during the last two summers was part of the Research Experience for Undergraduates program in Accessible Multimodal Interfaces (REU AMI). I found out about this opportunity in 2015 from a friend in software engineering. I was accepted and participated in the REU AMI during the summer of 2015, which is when I started working on this project. Thanks to AccessComputing, I received funding to continue my work through the summer of 2016.
I submitted my work to the 18th International ACM (Association for Computing Machinery) SIGACCESS (Special Interest Group on Accessible Computing) Conference on Computers and Accessibility (ASSETS 2016), and was accepted to the ACM Student Research Competition (SRC) portion of the conference. I flew to Reno, NV in October 2016 to present my project. My presentation won first place in the undergraduate SRC division.
I thoroughly enjoyed working for the REU AMI during the past two summers. It was a great and enriching experience. The work environment was friendly, accessible, and diverse. Many of my co-workers were deaf or hard of hearing and were fluent in sign language. We also had hearing co-workers, who were all friendly and open to learning about Deaf culture. The AMI site is located at the Rochester Institute of Technology (RIT) in Rochester, an area with a large deaf population. The REU AMI was my first experience working in the field of human-centered computing (HCC). I learned a lot about HCC and how to recruit and work with participants. Additionally, the program provided me with valuable research experience, which will bolster my applications to Ph.D. programs. I plan to enter a doctoral program, in either computer science or human-centered computing, within the next one or two years.