On February 14, the National Science Foundation funded a three-year collaborative project titled "Towards Life-like Computer Interfaces that Learn." The project combines computer animation, virtual reality, artificial intelligence, and natural language processing to develop a methodology for creating intelligent digital humans that can process human voice and gesture input and respond with similarly natural voice and gestures. The technology can be used in a variety of applications, such as creating 3D archival recordings of important historical figures, virtual-reality learning environments, and intelligent characters for next-generation video games.

Project collaborators include Jason Leigh, Andrew Johnson, Luc Renambot, Maxine Brown, and Tom DeFanti from EVL; Steve Jones from UIC's Communication Department; and Avelino Gonzalez and Ron Demarr from the University of Central Florida (UCF).

The Chicago team will develop the graphics component and build a new state-of-the-art motion-capture studio in support of the project; the studio will also be used by the CS department's Computer Animation, Virtual Reality, and Video Game Programming classes. Project members at UCF will work on the database and natural language processing components.









































 
Copyright 2016 The Board of Trustees of the University of Illinois.