Cambridge University Press, 1998, 360 pp.

Recent advances in the field of computer vision are leading to novel and radical changes in the way we interact with computers. It will soon be possible to enable a computer linked to a video camera to detect the presence of users, track faces, arms and hands in real time, and analyse expressions and gestures. The implications for interface design are immense and are expected to have major repercussions for all areas where computers are used, from the workplace to recreation. This book collects the ideas and algorithms of the world's leading scientists, offering a glimpse of the radical changes that are around the corner and that will transform the way we interact with computers in the near future.

Roberto Cipolla was a Toshiba Fellow at the company's Research and Development Centre in Kawasaki in 1991-92 before joining the Department of Engineering at the University of Cambridge in 1992 as a Lecturer and a Fellow of Jesus College. He became a Reader in Information Engineering in 1997. His research interests are in computer vision and robotics. He has authored more than 100 papers, written one book and edited three others, and won two international prizes and four national prizes for scientific contributions in these areas.

Alex Paul Pentland is the Academic Head of the MIT Media Laboratory and the Toshiba Professor of Media Arts and Sciences. In 1987 he founded the Perceptual Computing Section of the Media Laboratory, a group that now includes over fifty researchers in computer vision, graphics, speech, music, and human-machine interaction. He has published more than 180 scientific articles in these areas. He has won awards from the AAAI for his research into fractals, from the IEEE for his research into face recognition, and from Ars Electronica for his work on computer vision interfaces to virtual environments.
Newsweek magazine recently named him one of the 100 Americans most likely to shape the next century.

Contents:
Foreword: Out of Sight, Out of Mind
Part one: New Interfaces and Novel Applications
  Smart Rooms: Machine Understanding of Human Behavior
  GestureComputer - History, Design and Applications
  Human Reader: A Vision-Based Man-Machine Interface
  Visual Sensing of Humans for Active Public Interfaces
  A Human-Robot Interface using Pointing with Uncalibrated Stereo Vision
Part two: Tracking Human Action
  Tracking Faces
  Towards Automated, Real-time, Facial Animation
  Interfacing through Visual Pointers
  Monocular Tracking of the Human Arm in 3D
  Looking at People in Action - An Overview
Part three: Gesture Recognition and Interpretation
  A Framework for Gesture Generation and Interpretation
  Model-Based Interpretation of Faces and Hand Gestures
  Recognition of Hand Signs from Complex Backgrounds
  Probabilistic Models of Verbal and Body Gestures
  Looking at Human Gestures