David Coombs, Ph.D.
NIST, Intelligent Systems Division
MET B124, Gaithersburg, MD 20899 USA
Tel: (301) 975-2865, FAX: (301) 990-9688

18973 Abbotsford Circle
Germantown, MD 20876
(301) 515-0349

Research Objectives
______________________________________________________________________________
To design and prototype innovative systems that use computer vision to change the way computers are used: improving the robustness of existing applications, expanding into new areas, and improving computer interactions with the people who use them.

Dr. Coombs's interests span computer and biological vision, human-computer interaction, multimedia communications, real-time systems, and robotics, with a focus on achieving robust performance. One effort is immersive visualization of complex data and simulations. In addition, he developed a prototype binocular telepresence system with a wide field of view and a high-resolution inset for teleoperating a robot crane. He has developed real-time active vision and gaze control of camera movements to aid visual perception, enabling a moving robot to keep its cameras smoothly following a nearby moving object. The same broad principle has been applied to a robot that uses low-resolution motion perception to steer between obstacles in a laboratory.

Education
______________________________________________________________________________
Ph.D., Computer Science, University of Rochester (1992). Thesis: "Real-time Gaze Holding in Binocular Robot Vision," published in part as David Coombs and Christopher Brown, "Real-time binocular smooth pursuit," International Journal of Computer Vision, 11(2):147-164, 1993.

M.S., Computer Science, University of Rochester (1988)

B.S. with Honor, Computer Science, Michigan State University (1986)

Computer Experience
______________________________________________________________________________
Skills: designing, developing, integrating, and testing real-time systems for vision, control, and visualization.
Languages: C++, C, Perl, Ada, LISP, Prolog, FORTRAN, Modula-2, Pascal, assemblers

Packages and Tools: Reg Willson's version of Roger Tsai's camera calibration, RCSlib, CAVELib, make, FrameMaker, Microsoft Office, Claris HomePage, ClarisWorks, Quadralay WebWorks, Photoshop

Systems: Solaris, SunOS, BSD UNIX, Irix, VxWorks, Windows, Mac OS (System 7), VMS, DOS, CP/M

Networking and Communications: TCP sockets, NML, RPC

Platforms: SPARC, SGI, MC680x0, Macs, PCs, ADSP, VAX, PDP-11, CYBER 750, MC6800

Motor Controllers: Delta Tau PMAC, Directed Perception, TRC Labmate, Denning, Compumotor

Image Processing and Modeling: WiT, VRML, Performer, Axxess, CAVELib, Mathematica, Matlab, Coreco Ultra II, Datacube MaxVideo, Aspex PIPE, Pixar, Khoros, ImagCalc

Professional Experience
______________________________________________________________________________
Computer Scientist (1992-present)
Intelligent Systems Division (formerly Robot Systems Division), National Institute of Standards and Technology, Gaithersburg, Maryland.

Visualization and web presence: Exploring representation and navigation in immersive environments for visualizing data (e.g., from the unmanned ground vehicle) and simulations (e.g., manufacturing processes and operations within a factory, or the Hexapod machine tool). Investigated motorized stereo cameras as a visual telepresence system for crane teleoperation and vehicle driving: wide- and narrow-field cameras simultaneously provide wide-field awareness and high-resolution precision. Took a lead role in developing the division's presence on the web, installing several of our papers and advertising postdoctoral research opportunities, resulting in tenures by two excellent postdoctoral researchers in our lab.

Intelligent Vehicles: Improving autonomous mobility competence in the Army's Demo III experimental unmanned ground vehicle program.
Taking a lead role in understanding user needs, developing algorithms, rapidly prototyping systems, integrating systems, collecting sensor data, and evaluating algorithm performance.

- Collected ground video data annotated by vehicle motion parameters (GPS, INS, timestamp) and calibrated the camera.

- With indoor robots, used flow features as the sole cues for robot mobility. Low-resolution motion vision over large fields of view enabled the robot to steer safely between obstacles; active camera control simplified the motion interpretation. Motion vision also enabled the robot to represent its environment for planning efficient mobility. Minimal representations achieved robust vehicle mobility. Camera fixation improved negotiation of a cluttered field. These techniques resulted in autonomous runs lasting up to 26 minutes in a cluttered laboratory.

Unmanned Ground Vehicle Performance Evaluation: Recommended methods for evaluating UGV performance, and developed proposed procedures for evaluating sensing systems for navigation and driving and for steering subsystems.

Senior Engineer (ATR) and Guest Researcher (NIST) (1991-1992)
Advanced Technology and Research Corporation, Laurel, Maryland, and Robot Systems Division, National Institute of Standards and Technology, Gaithersburg, Maryland.

The Sensory Processing and World Modeling Project developed competence in autonomous and cooperative robot applications with a focus on robot mobility. Computer vision provided the robot's perceptual needs, and deliberate control of the robot's cameras aided real-time performance in dynamic environments. Empirical laboratory work for this project required the integration of a robot, cameras, and camera motors; real-time image processing used the PIPE.

Research and Teaching Assistant (1986-1991)
Department of Computer Science, University of Rochester, Rochester, New York.

Teaching: Teaching assistant for two semesters for graduate courses on compilers and operating systems.
Supported, administered, and graded class programming projects and lectured in recitations.

Research in vision for robots: Visual processing and motor control for controlling the gaze of binocular robots. The thesis work focused on holding the robot's gaze on a moving object while the robot moves (i.e., tracking the object with the robot's cameras). The approach exploits control of camera movements to simplify the necessary visual processing, enabling real-time performance of the system. The real-time vision processing was performed on Datacube MaxVideo hardware.

Research Intern (summer 1988)
Image Understanding Group, General Electric Corporate Research and Development, Schenectady, New York.

Helped install a Pixar and displayed models on it ported from the ImagCalc system. Also worked on using predicted object motion to speed matching a model to a sequence of images.

Programmer/Analyst (1982-1986)
Information Processing, College of Education, Michigan State University, East Lansing, Michigan.

Designed and developed software and configured hardware to support office systems in a networked PC environment.

Consultant (July 1985)
Client: Prof. Evelyn Oka, Department of CEPSE, College of Education, Michigan State University, East Lansing, Michigan.

Designed and developed a system to administer a test of reading comprehension span to children.

Programmer (summer 1984)
Free Shear Flow Laboratory, Division of Engineering Research, Michigan State University, East Lansing, Michigan.

Developed software that identified appropriate regions of data and statistically analyzed those data.
Honors and Awards
______________________________________________________________________________
NIST Cash-in-a-Flash for excellence in leading data collection, 1997
NIST Cash-in-a-Flash for excellence in virtual collaboration, 1997
2000 Outstanding People of the 20th Century
Who's Who in the World (1998--)
Dictionary of International Biography (1998--)
Who's Who in the East (1997--)
Phi Kappa Phi
Pi Mu Epsilon (Mathematics)
Tau Beta Pi (Engineering)
Honors College of Michigan State University
National Science Foundation Graduate Fellowship Honorable Mention
National Merit Scholarship
Michigan Competitive Scholarship
General Electric Information Services Company Scholarship
Bechtel Power Corporation Scholarship
Society of Mechanical Engineers Scholarship
Society of Die-Casting Engineers Scholarship

Professional Activities
______________________________________________________________________________
Co-Organizer
  AAAI 1994 Spring Symposium on Physical Interaction and Manipulation

Session Chair
  ISAS 1997 Conference

Publications Chair
  ECVNet Active Vision Hardware Workshop, 1995
  OSA Topical Meeting on Image Understanding and Machine Vision, 1989

Reviewer
  National Science Foundation
  Department of Commerce
  Prentice Hall
  International Journal of Computer Vision
  IEEE Journal of Robotics and Automation
  IEEE Transactions on Pattern Analysis and Machine Intelligence
  IEEE Transactions on Robotics and Automation
  IEEE Computational Science and Engineering
  CVGIP: Image Understanding
  Computer Vision and Image Understanding
  CVPR, the IEEE Conference on Computer Vision and Pattern Recognition
  ICRA, the International Conference on Robotics and Automation
  OSA Topical Meeting on Image Understanding and Machine Vision

Committee Service
  University of Rochester Computer Science Admissions Committee, 1988
  University of Rochester Computer Science Laboratory Committee, 1988

Affiliations
  Behavioral and Brain Sciences, Associate
  ACM, Member
  IEEE, Member

Selected Publications
______________________________________________________________________________
Refereed Articles

[1] Theodore Camus, David Coombs, Martin Herman, and Tsai-Hong Hong. Real-time single-workstation obstacle avoidance using only wide-field flow divergence. Videre, accepted for publication pending revision, 1998.

[2] David Coombs, Martin Herman, Tsai-Hong Hong, and Marilyn Nashman. Real-time obstacle avoidance using central flow divergence and peripheral flow. IEEE Transactions on Robotics and Automation, 14(1):49-59, February 1998.

[3] Sandor Szabo, David Coombs, Martin Herman, Ted Camus, and Hongche Liu. A real-time computer vision platform for mobile robot applications. Journal of Real-Time Imaging, 2(5):315-327, October 1996.

[4] David Coombs. Sensor fusion in motion perception. Behavioral and Brain Sciences, 17(2):317-318, June 1994.

[5] David Coombs and Christopher Brown. Real-time binocular smooth pursuit. International Journal of Computer Vision, 11(2):147-164, 1993.

[6] Thomas Olson and David Coombs. Real-time vergence control for binocular robots. International Journal of Computer Vision, 7(1):67-89, November 1991.

Invited Articles

[7] Martin Herman, David Coombs, Tsai-Hong Hong, and Marilyn Nashman. Vision-based mobility using optical flow. Robotics and Machine Perception, SPIE's International Technical Working Group Newsletter, 3(2):4-5, September 1994.

[8] David Coombs and Steven Whitehead. Report on the AAAI 1994 spring symposium on physical interaction and manipulation. AI Magazine, Summer 1994.

[9] David Coombs and Christopher Brown. Cooperative gaze holding in binocular vision. IEEE Control Systems, June 1991.

Book Chapters

[10] Martin Herman, Marilyn Nashman, Tsai-Hong Hong, Henry Schneiderman, David Coombs, Gin-Shu Young, Dani Raviv, and Albert Wavering. Minimalist vision for navigation. In Yiannis Aloimonos, editor, Visual Navigation: From Biological Systems to Unmanned Ground Vehicles, pages 275-316. Lawrence Erlbaum Associates, 1997.
[11] Christopher Brown, David Coombs, and John Soong. Real-time smooth pursuit tracking. In Andrew Blake and Alan Yuille, editors, Active Vision, chapter 8, pages 123-136. MIT Press, 1992.

Refereed Conference Papers

[12] Ted Camus, David Coombs, Martin Herman, and Tsai-Hong Hong. Real-time single-workstation obstacle avoidance using only wide-field flow divergence. In Proc. of ICPR 1996, the International Conference on Pattern Recognition, Vienna, Austria, August 1996.

[13] David Coombs, Martin Herman, Tsai-Hong Hong, and Marilyn Nashman. Real-time obstacle avoidance using central flow divergence and peripheral flow. In Proc. of ICCV 1995, the Fifth International Conference on Computer Vision, Cambridge, Massachusetts, June 1995.

[14] David Coombs and Karen Roberts. Centering behavior using peripheral vision. In Proc. of CVPR'93, the IEEE Conference on Computer Vision and Pattern Recognition, New York, June 15-17, 1993.

[15] David Coombs and Christopher Brown. Real-time smooth pursuit tracking for a moving binocular head. In Proc. of CVPR'92, the IEEE Conference on Computer Vision and Pattern Recognition, Champaign, Illinois, June 15-18, 1992.

Invited Conference Papers

[16] David Coombs and Karen Roberts. "Bee-bot": using peripheral optical flow to avoid obstacles. In Proc. of the SPIE Conf. on Intelligent Robots and Computer Vision XI: Algorithms, Techniques, and Active Vision, Boston, Massachusetts, November 15-20, 1992.

[17] David Coombs, Ian Horswill, and Peter vonKaenel. Disparity filtering: Proximity detection and segmentation. In Proc. of the SPIE Conf. on Intelligent Robots and Computer Vision XI: Algorithms, Techniques, and Active Vision, Boston, Massachusetts, November 15-20, 1992.

[18] David Coombs and Christopher Brown. Intelligent gaze control in binocular vision. In Proc. of the Fifth IEEE International Symposium on Intelligent Control, Philadelphia, Pennsylvania, September 1990.

Other Reports

[19] David Coombs, Sandor Szabo, and Martin Herman.
The Annotated Ground Video Data Collection Project. DARPA project report, National Institute of Standards and Technology, Intelligent Systems Division, August 1997.

[20] Elena Messina, David Coombs, Tom Kramer, John Michaloski, Fred Proctor, Will Shackleford, Keith Stouffer, and Tsung-Ming Tsai. Findings and Recommendations for a Software Development Process. NISTIR 5989, National Institute of Standards and Technology, Gaithersburg, MD, March 1997.

[21] David Coombs. Visual sensing for navigation and driving. In Recommendations for Performance Evaluation of Unmanned Ground Vehicle Technologies, chapter 3, pages 32-50. Available as NISTIR 5244, National Institute of Standards and Technology (NIST), August 1993.

[22] Dana Ballard, Christopher Brown, David Coombs, and Brian Marsh. Eye movements and computer vision. In 1987-88 Computer Science and Engineering Research Review. University of Rochester, Computer Science Department, Rochester, New York, September 1987.

Selected Presentations
______________________________________________________________________________
Invited Workshop Presentations

"Mobility Technology for Demo III."
Demo III CTT Mobility White Paper presentation, Demo III Inaugural Workshop, Towson, Maryland, March 1998.

"What is Active Vision good for?"
Invited talk, ECVNet Active Vision Hardware Workshop, Le Sappey, France, February 1995.

"Lessons from experience with robot heads at Rochester and NIST."
Invited talk, ECVNet Active Vision Hardware Workshop, Le Sappey, France, February 1995.

"Mechanical structure issues in robot head design."
Invited talk, ECVNet Active Vision Hardware Workshop, Le Sappey, France, February 1995.

"Coordinated control of Active Vision systems."
Invited talk, ECVNet Active Vision Hardware Workshop, Le Sappey, France, February 1995.

"Animate Vision at The University of Rochester."
Invited talk, Unmanned Ground Vehicle Workshop, Pittsburgh, Pennsylvania, May 1991.
Conference and Workshop Presentations

"Real-time obstacle avoidance using central flow divergence and peripheral flow."
ICCV 1995, the Fifth International Conference on Computer Vision, Cambridge, Massachusetts, June 1995.

"Chairs are no obstacles."
AAAI 1994 Spring Symposium on Physical Interaction and Manipulation, Stanford, California, March 1994.

"Exploiting gaze holding for visual following and robot mobility."
Sigma Xi Postdoctoral Poster Presentation, NIST chapter, Gaithersburg, Maryland, February 1994.

"RoboVac and the cat will get along famously."
AAAI 1993 Fall Symposium on Instantiating Real-World Agents, Raleigh, North Carolina, October 1993.

"Centering Behavior Using Peripheral Vision."
CVPR'93, the IEEE Conference on Computer Vision and Pattern Recognition, New York, New York, June 1993.

"'Bee-bot' takes the middle of the road."
SPIE Conference on Intelligent Robots and Computer Vision XI: Algorithms, Techniques, and Active Vision, Boston, Massachusetts, November 1992.

"Disparity filtering: proximity detection and segmentation."
SPIE Conference on Intelligent Robots and Computer Vision XI: Algorithms, Techniques, and Active Vision, Boston, Massachusetts, November 1992.

"Smooth pursuit for binocular robots."
CVPR'92, the IEEE Conference on Computer Vision and Pattern Recognition, Champaign, Illinois, June 1992.

"How do eye movements influence vision?"
AAAI 1992 Spring Symposium on the Control of Selective Perception, Stanford, California, March 1992.

"Intelligent gaze control in binocular vision."
Fifth IEEE International Symposium on Intelligent Control, Philadelphia, Pennsylvania, September 1990.

"Gaze control and segmentation."
AAAI-90 Workshop on Qualitative Vision, Boston, Massachusetts, July 1990.

"Gaze control and segmentation."
University of Rochester Industrial Affiliates, Rochester, New York, May 1990.

"Tracking objects with eye movements."
Optical Society of America Topical Meeting on Image Understanding and Machine Vision, North Falmouth, Cape Cod, Massachusetts, June 1989.

Invited Seminar Presentations

"Robot Gaze Control and Mobility Using Minimalist Vision."
Maryland Robotics Seminar, University of Maryland, College Park, Maryland, October 1994.

"Robot Gaze Control and Mobility Using Minimalist Vision."
Artificial Intelligence Lab Seminar, Massachusetts Institute of Technology, Cambridge, Massachusetts, October 1994.

"Exploiting Gaze Holding for Visual Following and Robot Mobility."
GRASP Lab Seminar, University of Pennsylvania, Philadelphia, Pennsylvania, June 1994.

"Exploiting Gaze Holding for Visual Following and Robot Mobility."
Robotics Seminar, Yale University, New Haven, Connecticut, May 1994.

"Biologically-inspired Active Vision."
Robotics Seminar, Columbia University, New York, New York, February 1994.

"Biologically-inspired Active Vision."
Computer Science Seminar, Colorado School of Mines, Golden, Colorado, December 1993.

"Biologically-Inspired Robot Vision."
Computer Vision Guest Lecture, The Johns Hopkins University, Baltimore, Maryland, April 1993.

"Real-time gaze holding in binocular robot vision."
Computer Science Seminar, University of Rochester, Computer Science Department, December 1991.

"Real-time gaze holding in binocular robot vision."
Robotics Seminar, Hughes Research Laboratories, A.I. Center, Malibu, California, June 1991.

"Real-time gaze holding in binocular robot vision."
Computer Vision Seminar, David Sarnoff Research Center, Princeton, New Jersey, June 1991.

"Real-time gaze holding in binocular robot vision."
Robotics Seminar, Martin-Marietta Corporation Space Systems, Denver, Colorado, June 1991.

"Real-time gaze holding in binocular robot vision."
Computer Vision Seminar, NIST, Robot Systems Division, Gaithersburg, Maryland, June 1991.

"Real-time gaze holding in binocular robot vision."
Robotics Seminar, The MITRE Corporation, Autonomous Systems Group, McLean, Virginia, June 1991.

"Real-time gaze holding in binocular robot vision."
Computer Vision Seminar, Alfred I. duPont Institute, Wilmington, Delaware, May 1991.

"Real-time gaze holding in binocular robot vision."
Computer Science Seminar, University of Toronto, Toronto, Canada, May 1991.

"Real-time gaze holding in binocular robot vision."
Computer Science Seminar, Michigan State University, East Lansing, Michigan, April 1991.

Advisors
______________________________________________________________________________
Christopher Brown, Dana Ballard, and Randal Nelson (Department of Computer Science, University of Rochester); W. Michael King (Department of Neurology, University of Mississippi Medical Center; previously Department of Physiology, University of Rochester)

References available on request
______________________________________________________________________________