This page contains links to my thesis research, publications, and technical reports.

MS THESES


Babcock, J.S. (2005). Yarbus 1.0: Eyetracking Software for Mac OS X Tiger. Master's Thesis. New York, NY: New York University.   [PDF coming soon]  [HTML]

Abstract:
The technology to record eye movements has been around for over a hundred years, yet despite this rich history, eye tracking has not reached the mainstream. One of the most fundamental questions in eye movement research is how eye-tracking systems should evolve as interfaces between humans and computers. That question is difficult to answer, however, because current eye-tracking solutions are expensive (~$20K), difficult to use, and rely on poorly designed user interfaces. One of the goals of this thesis was to develop my own custom pupil-tracking software so that I could explore the benefits and limitations of eye-tracking technology. This thesis describes Yarbus 1.0, an eye-tracking system developed for the Macintosh.


Babcock, J.S. (2002). Eye Tracking Observers During Color Image Evaluation Tasks. Master's Thesis. Rochester, NY: Rochester Institute of Technology.   [PDFs]

Abstract:
This thesis investigated the eye movement behavior of subjects during image-quality evaluation and chromatic adaptation tasks. Specifically, the objectives focused on learning where people center their attention during color preference judgments, examining the differences between paired comparison, rank order, and graphical rating tasks, and determining what strategies are adopted when selecting or adjusting achromatic regions on a soft-copy display.

In judging the most preferred image, measures of fixation duration showed that observers spend about 4 seconds per image in the rank order task, 1.8 seconds per image in the paired comparison task, and 3.5 seconds per image in the graphical rating task. Spatial distributions of fixations across the three tasks showed high correlation in four of the five images. Peak areas of attention gravitated toward faces and semantic features. Introspective report was not always consistent with where people foveated, implying broader regions of importance than the eye movement plots suggest. Psychophysical results across these tasks generated similar, but not identical, scale values for three of the five images. The differences in scales are likely related to statistical treatment and image confusability, rather than eye movement behavior.

In adjusting patches to appear achromatic, about 95% of the total adjustment time was spent fixating only on the patch. This result shows that even when participants are free to move their eyes in this kind of task, central adjustment patches can discourage normal image viewing behavior. When subjects did look around (less than 5% of the time), they did so early in the trial. Foveations were consistently directed toward semantic features, not shadows or achromatic surfaces. This result shows that viewers do not seek out near-neutral objects to ensure that their patch adjustments appear achromatic in the context of the scene. They also do not scan the image in order to adapt to a gray-world average. As demonstrated in other studies, the mean chromaticity of the image influenced observers' patch adjustments. Adaptation to the D93 white point was about 65% complete from D65. This result agrees reasonably well with the time course of adaptation occurring over a 20- to 30-second exposure to the adapting illuminant.

In selecting the most achromatic regions in the image, viewers spent 60% of the time scanning the scene. Unlike the achromatic patch adjustment task, foveations were consistently directed toward achromatic regions and near-neutral objects, as would be expected. Eye movement records show behavior similar to what is expected from a visual search task.
 
 

PUBLICATIONS

Adolph, K. E., Franchak, J. M., Badaly, D., Smith, M. T., & Babcock, J. S. (2008, May). Head-mounted eye-tracking with children: Visual guidance of motor action. Paper/Poster presented at the meeting of the Vision Sciences Society, Naples, FL.   [PDF coming soon]  

Li, D., Babcock, J., & Parkhurst, D.J. (2006). openEyes: A low-cost head-mounted eye-tracking solution. ETRA 2006: Eye Tracking Research and Applications Symposium, 95-100.   [PDF]

Babcock, J.S., & Pelz, J.B. (2004). Building a lightweight eyetracking headgear. ETRA 2004: Eye Tracking Research and Applications Symposium, 109-113.   [PDF]

Eaddy, M., Blaskó, G., Babcock, J., & Feiner, S. (2004). My Own Private Kiosk: Privacy-Preserving Public Displays. Proceedings of the Eighth IEEE International Symposium on Wearable Computers (ISWC 2004), Arlington, VA, USA, October 31 - November 3, 2004, pp. 132-135.   [PDF]

Babcock, J.S., Pelz, J.B., & Fairchild, M.D. (2003). Eye Tracking Observers During Rank Order, Paired Comparison, and Graphical Rating Tasks. PICS 2003: The PICS Conference, International Technical Conference on The Science and Systems of Digital Photography, Rochester, NY, May 13, pp. 10-15.   [PDF]

Babcock, J.S., Pelz, J.B., & Fairchild, M.D. (2003). Eye tracking observers during color image evaluation tasks. In B.E. Rogowitz and T.N. Pappas (Eds.), Human Vision and Electronic Imaging VIII, SPIE Proceedings.   [PDF]

Babcock, J.S., Pelz, J.B., & Peak, J.F. (2003). The Wearable Eyetracker: A Tool for the Study of High-level Visual Tasks. Proceedings of the Military Sensing Symposia Specialty Group on Camouflage, Concealment, and Deception, Tucson, Arizona.   [PDF]

Babcock, J.S., Lipps, M., & Pelz, J.B. (2002). How people look at pictures before, during and after scene capture: Buswell revisited. In B.E. Rogowitz and T.N. Pappas (Eds.), Human Vision and Electronic Imaging VII, SPIE Proceedings, 4662, 34-47.   [PDF]

Pelz, J.B., Canosa, R.L., & Babcock, J.S. (2000). Extended Tasks Elicit Complex Eye Movement Patterns. ETRA 2000: Eye Tracking Research and Applications Symposium, 37-43.   [PDF]

Pelz, J.B., Canosa, R.L., Kucharczyk, D., Babcock, J., Silver, A., & Konno, D. (2000). Portable eyetracking: A study of natural eye movements. In B.E. Rogowitz and T.N. Pappas (Eds.), Human Vision and Electronic Imaging V, SPIE Proceedings, 3659.   [PDF]

Babcock, J., Krause, K., Embrey, C., & Hill, S. (1997). Enclosed balloon mechanism: A microgravity-based liquid behavior experiment. Journal of Biological Photography, 65(3), 42-47.   [PDF coming soon]
 
 

TECHNICAL REPORTS

Winfield, D., Li, D., Babcock, J., & Parkhurst, D.J. (2005). Towards an open-hardware open-software toolkit for robust low-cost eye tracking in HCI applications. Iowa State University Human Computer Interaction Technical Report ISU-HCI-2005-04.   [PDF]

Fairchild, M. D., Johnson, G. M., Babcock, J., & Pelz, J. B. (2001). Is your eye on the ball?: Eye tracking golfers while putting. Unpublished manuscript, Rochester Institute of Technology.   [PDF]  
 
     
     
Copyright © 2006 Jason Babcock. All rights reserved.