
Biographical Sketch

Dr. Kenyon received his B.S. degree in Electrical Engineering from the University of Rhode Island in 1970, an M.S. degree in Bioengineering from the University of Illinois, Chicago, in 1972, and a Ph.D. in Physiological Optics from the University of California, Berkeley, in 1978. From 1979 to 1986, he was a faculty member of the Department of Aeronautics and Astronautics at the Massachusetts Institute of Technology, Cambridge, and of the Harvard Medical School - Whitaker Health Sciences and Technology Joint Programs. He is currently a Professor of Computer Science at the University of Illinois at Chicago. From 2000 to 2002 he was a visiting Associate Professor at the University of Washington, Seattle. In 2006, he was a visiting Research Associate at the Collège de France, Laboratoire de Physiologie de la Perception et de l'Action, working with Prof. Alain Berthoz. His research has spanned sensory-motor adaptation, the effects of micro-gravity on vestibular development, visuo-motor and posture control, flight simulation, virtual environments, computer graphics, tele-immersion, and sensory/motor integration for navigation and wayfinding.

While at MIT he was a collaborator on several Space Shuttle experiments that studied the effects of micro-gravity on human and animal orientation: Spacelab-1, the German Spacelab (D-1), and STS-29 (''Chix in Space''). He also developed and delivered an interactive visual display system that produced simulator-like experiences for Air Force pilots training on the Brooks AFB centrifuge and disorientation trainers. For this Air Force-funded research, both the hardware and the software were designed to present pilots with interactive, wide-field-of-view computer-generated imagery superior to the head-mounted displays of the day. He was also the originator, director, and one of the three instructors of one of the first flight simulator courses in the country designed for professionals (MIT's summer session program).

His work at UIC has concentrated on virtual environments (VEs) through his involvement with the CAVE. He was co-PI on two NSF grants that were instrumental in the development of the CAVE, and he has been a major contributor to understanding how the limitations of a VE system such as the CAVE can affect human behavior. Other work has examined human performance in VEs and how to quantify the use of VEs for training and collaboration. This work was performed using stand-alone CAVE applications and also in networked (i.e., tele-immersive) applications over a variety of networks, from ISDN to the latest international networks (STARTAP). Some of this work has been aimed specifically at analyzing and improving the performance of distributed VEs themselves by characterizing the connecting networks and modeling both the CAVE and the network using Petri nets. Other modeling work focused on humans: he and his students developed a system identification tool based on Kalman filters that estimates, in real time, the delay and model coefficients of a human operator, and tracks how these characteristics change as the operator's environment changes.
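The Kalman-filter approach mentioned above can be illustrated with a minimal sketch. This is not the actual tool; it is a generic random-walk Kalman filter that tracks the time-varying coefficients of a linear operator model, with all function names, noise values, and the simulated "operator" being illustrative assumptions:

```python
import numpy as np

def kalman_parameter_estimator(xs, ys, q=1e-4, r=0.1):
    """Track time-varying coefficients theta_k of y_k = theta_k . x_k + noise.

    A random-walk process model (theta drifts slowly) lets the filter follow
    an operator whose characteristics change over time. q and r are assumed
    process- and measurement-noise variances, chosen for illustration.
    """
    n = xs.shape[1]
    theta = np.zeros(n)      # current coefficient estimate
    P = np.eye(n)            # estimate covariance
    Q = q * np.eye(n)        # random-walk (process) noise covariance
    history = []
    for x, y in zip(xs, ys):
        P = P + Q                            # predict: coefficients drift
        S = x @ P @ x + r                    # innovation variance
        K = P @ x / S                        # Kalman gain
        theta = theta + K * (y - x @ theta)  # correct with prediction error
        P = P - np.outer(K, x) @ P           # covariance update
        history.append(theta.copy())
    return np.array(history)

# Simulate an "operator" whose gain on the first input switches mid-run.
rng = np.random.default_rng(0)
xs = rng.normal(size=(400, 2))
ys = xs @ np.array([1.5, -0.5]) + 0.05 * rng.normal(size=400)
ys[200:] = xs[200:] @ np.array([0.5, -0.5]) + 0.05 * rng.normal(size=200)

est = kalman_parameter_estimator(xs, ys)
```

In this toy run the estimate converges to the first coefficient set, then re-converges after the switch, which is the essence of estimating in real time how an operator's characteristics change with the environment.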

His work on applications of VEs to biocybernetics, carried out at the Rehabilitation Institute of Chicago, involves coupling robots to VEs and integrating visual and motion information for maintaining erect posture. The VE-robot systems are being used to explore new methods to aid the rehabilitation of stroke survivors; specifically, they apply visual and haptic information in combinations that help the stroke patient regain control of affected limbs (arm motion). The combination of VE and posture-platform motion has been used to examine how young healthy individuals, the elderly, and those with a loss of vestibular function combine visual and motion information to maintain erect posture. The use of complex visual scenes with physical motion has allowed exploration of how these individuals integrate information from these senses in the physical world. His most recent work explores the use of visual and kinesthetic information in navigation, including emerging research on how these cues are integrated to find locations in unfamiliar environments. By manipulating visual and haptic information during navigation tasks, this work aims to clarify how navigation is processed.

Topic revision: r1 - 2009-09-07 - 14:32:42 - Main.kenyon