
Publisher: Springer, 1997, 278 pp.

Being still in its toddlerhood, image analysis has not yet found a balance between theorisation, experimentation and engineering. But the syncretistic role of experimentation is clear from the fact that it straddles the other fields, and aims to reconcile two opposing tenets: that of the theoretician employing abstractions for reasons of genericity, versus that of the engineer sharply focused on specificity. Relative to the existing image literature this monograph is clearly biased towards theorisation. As such it does not claim to be a review covering conventional models and techniques, but rather reflects many of the ideas on image structure and front-end vision pioneered by Koenderink, and pursued by many others (the "Koenderink school"). For this reason it will appeal to the mathematician, who will appreciate that image analysis triggers many challenging problems of a fundamental nature. However, practically minded readers should not put this book aside prima facie. On the contrary, as the book deals with basic matters of image structure it lies at the core of virtually all (potential) applications that require an analysis of image content.

Like any observable, an image comprises a "physical picture" (pixel and header data, say, in a digital image) as well as a "mental picture" (a model). The physical picture is a matter of public evidence, but there is no guarantee that mental pictures coincide. Indeed, the very term "image analysis" owes its currency to the fact that "image synthesis", i.e. an operational definition of an "image" more sophisticated than its mere physical format, remains unstressed. This is quite unfortunate, as conceptual mistakes at the axiomatic stage may well conspire to produce wrong results in the final analysis with one hundred per cent confidence. For instance, a naive application of results from standard analysis or differential geometry (arguably the tools for handling image structure) will almost certainly lead to failure.

Apart from mathematical rigour, the need for robustness despite noise poses an additional, equally fundamental demand. However, the latter is usually of little or no concern in pure mathematics, simply because the objects of investigation are not imposed as facts of life, but are rather arbitrarily defined. The principle of duality allows one to manifestly combine these requirements by stating/solving problems in terms of robust machine concepts defined within a rigorous mathematical framework. It is the guiding principle adopted in this book, and is more constrained than, for instance, the traditional "Marr paradigm", which allows more leeway at the algorithmic stage. For example, Marr's philosophy does not prohibit us from formulating an ill-posed problem as a viable theoretical model, as long as one accounts for some kind of regularisation in the algorithms. The Marr paradigm is thus somewhat indirect, because clearly one has to regularise ill-posed problems. However, a regularised model is a new model, different from the original as well as from any alternatively regularised one. This leaves no room for regularisation as a degree of freedom in algorithmic design. Regularisation must be an integral part of the theory; "incorrectly formulated problems" must be rejected from the outset, or considered incomplete.
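The canonical illustration of this point (not taken from the book, but standard in the scale-space literature it surveys) is differentiation of noisy data: naive finite differencing is ill-posed and amplifies noise without bound, whereas differentiation via a Gaussian aperture at a finite scale is well-posed by construction. A minimal sketch, assuming NumPy/SciPy and an arbitrarily chosen scale:

```python
import numpy as np
from scipy.ndimage import gaussian_filter1d

# A noisy 1-D "image": sin(x) plus small Gaussian noise (seeded for reproducibility).
rng = np.random.default_rng(0)
x = np.linspace(0.0, 2.0 * np.pi, 1000)
dx = x[1] - x[0]
signal = np.sin(x) + rng.normal(scale=0.05, size=x.size)

# Ill-posed route: naive finite differences amplify the noise by ~1/dx.
naive = np.gradient(signal, x)

# Well-posed route: derivative-of-Gaussian at scale sigma (regularisation built
# into the operator itself, not bolted on afterwards). sigma is an arbitrary
# choice for this sketch, expressed in samples.
sigma_px = 20
regularised = gaussian_filter1d(signal, sigma=sigma_px, order=1) / dx

# Compare both against the true derivative cos(x).
true = np.cos(x)
err_naive = np.sqrt(np.mean((naive - true) ** 2))
err_regularised = np.sqrt(np.mean((regularised - true) ** 2))
print(err_naive, err_regularised)
```

The finite-difference error is orders of magnitude larger than that of the Gaussian-derivative operator, and it grows as the grid is refined; the regularised operator trades a small, controlled bias (blur at scale sigma) for stability, which is precisely the sense in which regularisation is part of the model rather than an algorithmic afterthought.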

As for the technical content, most of the mathematics used in this book is fairly elementary. However, a basic skill in analysis (differentiation, integration, etc.) and algebra (linear spaces, linear operators, etc.) is an absolute prerequisite. Geometrical expertise is helpful but not necessary. Many mathematical concepts are explained in this book to make it self-contained. Two levels of reading are supported. Sections marked with * contain technical details, and should cause no difficulties if skipped. Numbered boxes are provided as standalone figures; these contain side material that may be of interest, but likewise does not interfere with the main text. Problem sections are included at the end of each chapter, except the first, as an aid to acquiring active skill; again these can be safely ignored on first reading. Solutions to a selection of problems have been provided. Finally, there is a glossary one may consult to get the gist of a few central concepts as they are used in this book.

Introduction

Basic Concepts

Local Samples and Images

The Scale-Space Paradigm

Local Image Structure

Multiscale Optic Flow

