Academic Press, 2015. 207 pp.

Traditional multimedia content is typically consumed via audio-visual (AV) devices such as displays and speakers. Recent advances in 3D video and spatial audio allow deeper user immersion in digital AV content, and thus a richer user experience. The norm, however, is that only two of our five senses – sight and hearing – are exercised, while the other three (touch, smell, and taste) are neglected. A recent multitude of new sensors maps the data they capture onto our five senses and enables us to better perceive the environment both locally and remotely; in the literature, the former is referred to as Augmented Reality and the latter as Immersive Experience. In parallel, new types of actuators produce different kinds of multi-sensory effects. Such effects were initially used mostly in dedicated installations in amusement parks equipped with motion chairs, lighting sources, liquid sprays, etc., but it is increasingly common to see multi-sensory effects produced in more familiar environments such as the home. Recognizing the need to represent, compress, and transmit this kind of contextual data captured by sensors, and to synthesize effects that stimulate all human senses in a holistic fashion, the Moving Picture Experts Group (MPEG, formally ISO/IEC JTC 1/SC 29/WG 11) ratified in 2011 the first version of the MPEG-V standard (officially known as ISO/IEC 23005 – Media context and control). MPEG-V provides the architecture and specifies the associated information representations that enable interoperable multimedia and multimodal communication within Virtual Worlds (VWs), but also with the real world, paving the way to a Metaverse, i.e. an online shared space created by the convergence of virtually enhanced reality and physically persistent virtual space that includes the sum of all Virtual Worlds and Augmented Realities.
For example, MPEG-V may be used to provide multi-sensorial content associated with traditional AV data, enriching multimedia presentations with sensory effects created by lights, winds, sprays, tactile sensations, scents, etc.; to interact with a multimedia scene through more advanced interaction paradigms such as hand/body gestures; or to access different VWs with an avatar of similar appearance in all of them. In the MPEG-V vision, a piece of digital content is not limited to an AV asset: it may be a collection of multimedia and multimodal objects forming a scene, having their own behaviour, capturing their context, producing effects in the real world, interacting with one or several users, and so on. In other words, a digital item can be as complex as an entire VW. Since standardizing a VW representation is technically possible but not aligned with industry interests, MPEG-V instead offers interoperability between VWs (and between any of them and the real world) by describing virtual objects, and specifically avatars, so that they can move from one VW to another.

This book on MPEG-V draws a global picture of the features made possible by the MPEG-V standard. It is divided into seven chapters, covering all aspects from the global architecture to the technical details of key components – sensors, actuators, multi-sensorial effects – and to application examples. At the time this text was written (November 2014), three editions of MPEG-V had been published and the technical community developing the standard was still very active. As the core MPEG-V philosophy is not expected to change in future editions, this book is a good starting point for understanding the principles on which the standard is based.
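As an illustration of the kind of sensory-effect metadata MPEG-V enables, the fragment below sketches a single light effect synchronized to a media timestamp, generated with Python's standard xml.etree library. The namespace URIs, element names (SEM, Effect), and attributes (color, intensity-value, pts) follow the general shape of the MPEG-V Part 3 Sensory Effect Description Language, but this is an illustrative sketch, not a normative instance of the schema.

```python
# Sketch of an MPEG-V-style sensory-effect description (illustrative only;
# consult ISO/IEC 23005-3 for the normative schema).
import xml.etree.ElementTree as ET

# Assumed namespace URIs, modeled on the MPEG-V 2010 edition
SEDL_NS = "urn:mpeg:mpeg-v:2010:01-SEDL-NS"
XSI_NS = "http://www.w3.org/2001/XMLSchema-instance"
ET.register_namespace("sedl", SEDL_NS)
ET.register_namespace("xsi", XSI_NS)

def light_effect(color: str, intensity: int, start_pts: int) -> ET.Element:
    """Build a description holding one light effect for a media timeline."""
    sem = ET.Element(f"{{{SEDL_NS}}}SEM")
    ET.SubElement(sem, f"{{{SEDL_NS}}}Effect", {
        # xsi:type would select the concrete effect type (here: a light)
        f"{{{XSI_NS}}}type": "sev:LightType",
        "color": color,                  # e.g. an RGB hex value
        "intensity-value": str(intensity),
        "pts": str(start_pts),           # presentation timestamp for sync
        "activate": "true",
    })
    return sem

doc = light_effect(color="#FF0000", intensity=50, start_pts=0)
print(ET.tostring(doc, encoding="unicode"))
```

Such a description would travel alongside the AV stream, letting a rendering device (e.g. an ambient lamp) fire the effect at the given timestamp.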
Readers interested in the latest technical details can consult the MPEG-V Web site (http://wg11.sc29.org/mpeg-v/).

Contents:
- Introduction to MPEG-V Standards
- Adding Sensorial Effects to Media Content
- Standard Interfacing Format for Actuators and Sensors
- Adapting Sensory Effects and Adapted Control of Devices
- Interoperable Virtual World
- Common Tools for MPEG-V and MPEG-V Reference SW with Conformance
- Applications of MPEG-V Standard
- Terms, Definitions, and Abbreviated Terms