Russ J.C. The Image Processing Handbook

CRC Press, 2011, 853 pp.
Sixth Edition
Image processing is used in a wide variety of applications, for two somewhat different purposes:
  • improving the visual appearance of images to a human observer, including their printing and transmission, and
  • preparing images for the measurement of the features and structures which they reveal.
The techniques that are appropriate for each of these tasks are not always the same, but there is considerable overlap. This book covers methods that are used for both tasks.
To do the best possible job, it is important to know about the uses to which the processed images will be put. For visual enhancement, this means having some familiarity with the human visual process and an appreciation of what cues the viewer responds to in images. A chapter on human vision addresses those issues. It also is useful to know about the printing or display process, since many images are processed in the context of reproduction or transmission. Printing technology for images has advanced significantly with the consumer impact of digital cameras, and up-to-date information is provided.
The measurement of images is often a principal method for acquiring scientific data and generally requires that features or structure be well defined, either by edges or unique brightness, color, texture, or some combination of these factors. The types of measurements that can be performed on entire scenes or on individual features are important in determining the appropriate processing steps. Several chapters deal with measurement in detail. Measurements of size, position, and brightness deal with topics that humans largely understand, although human vision is not quantitative and is easily fooled. Shape is a more difficult concept, and a separate chapter added in this edition summarizes a variety of ways that shape may be described by numbers. Measurement data may be used for classification or recognition of objects. There are several different strategies that can be applied, and examples are shown.
It may help to recall that image processing, like food processing or word processing, does not reduce the amount of data present but simply rearranges it. Some arrangements may be more appealing to the senses, and some may convey more meaning, but these two criteria may not be identical nor call for identical methods.
This handbook presents an extensive collection of image processing tools, so that the user of computer-based systems can both understand those methods provided in packaged software and program those additions which may be needed for particular applications. Comparisons are presented for different algorithms that may be used for similar purposes, using a selection of representative pictures from various microscopy techniques, as well as macroscopic, remote sensing, and astronomical images. It is very important to emphasize that the scale of an image matters very little to the techniques used to process or analyze it: microscopes with nanometer resolution and telescopes whose images span light-years produce data that require many of the same algorithms.
The emphasis throughout the book continues to be on explaining and illustrating methods so that they can be clearly understood, rather than providing dense mathematics. With the advances in computer speed and power, tricks and approximations in search of efficiency are less important, so that examples based on exact implementation of methods with full precision can generally be implemented on desktop systems. The topics covered are generally presented in the same order in which the methods would be applied in a typical workflow.
For many years, in teaching this material to students I have described achieving mastery of these techniques as being much like becoming a skilled journeyman carpenter. The number of distinct woodworking tools — saws, planes, drills, etc. — is relatively small, and although there are some variations — slotted vs. Phillips-head screwdrivers, for example — knowing how to use each type of tool is closely linked to understanding what it does. With a set of these tools, the skilled carpenter can produce a house, a boat, or a piece of furniture. So it is with image processing tools, which are conveniently grouped into only a few classes, such as histogram modification, neighborhood operations, Fourier-space processing, and so on, and can be used to accomplish a broad range of purposes. Visiting your local hardware store and purchasing the appropriate tools do not provide the skills to use them. Understanding their use requires practice, which develops the ability to visualize beforehand what each will do. The same is true of the tools for image processing.
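As an illustration of two of the tool classes named above, the sketch below implements histogram equalization (a histogram-modification tool) and a mean filter (a neighborhood operation) in NumPy. The function names and the synthetic low-contrast test image are my own illustration, not taken from the book.

```python
import numpy as np

def equalize_histogram(img):
    """Histogram modification: remap gray levels so the cumulative
    histogram becomes (approximately) linear, spreading contrast."""
    hist = np.bincount(img.ravel(), minlength=256)
    cdf = hist.cumsum()
    cdf_min = cdf[cdf > 0][0]                       # first occupied level
    lut = np.round((cdf - cdf_min) / (cdf[-1] - cdf_min) * 255)
    return lut.astype(np.uint8)[img]                # apply lookup table

def mean_filter(img, k=3):
    """Neighborhood operation: replace each pixel by the mean of its
    k x k neighborhood (edges padded by replication)."""
    pad = k // 2
    padded = np.pad(img.astype(float), pad, mode="edge")
    out = np.zeros(img.shape, dtype=float)
    for dy in range(k):
        for dx in range(k):
            out += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return (out / (k * k)).astype(np.uint8)

# A low-contrast test image confined to gray levels 100-150
rng = np.random.default_rng(0)
img = rng.integers(100, 151, size=(64, 64)).astype(np.uint8)

eq = equalize_histogram(img)      # contrast stretched toward 0-255
smooth = mean_filter(img)         # noise reduced by local averaging
print(img.min(), img.max(), "->", eq.min(), eq.max())
```

As the carpentry analogy suggests, knowing *that* these tools exist is the easy part; predicting what equalization or smoothing will do to a particular image before applying it is the skill that takes practice.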
In revising the book for this new edition, I have again tried to respond to some of the comments and requests of readers and reviewers. New chapters on the measurement of images and the subsequent interpretation of the data were added in the second edition, and a section on surface images in the third. The fourth edition added the stereological interpretation of measurements on sections through three-dimensional structures and the various logical approaches to feature classification. The fifth edition brought expanded sections on deconvolution, extended dynamic range images, and multichannel imaging, including principal components analysis. In this sixth edition, a new chapter on the meaning of shape has been added, as well as additional material on imaging in more than two dimensions. The sections on the ever-advancing hardware for image capture and printing have been expanded and information added on the newest hardware and software technologies.
As in past editions, I have resisted suggestions to put more of the math into the book. There are excellent texts on image processing, compression, mathematical morphology, etc., that provide as much rigor and as many derivations as may be needed. Many of them are referenced here. But the thrust of this book remains teaching by example. Few people learn the principles of image processing from the equations. Just as we use images to communicate ideas and to do science, so most of us use images to learn about many things, including imaging itself. The hope is that by seeing and comparing what various operations do to representative images, you will discover how and why to use them. Then, if you need to look up the mathematical foundations, they will be easier to understand.
A very real concern for everyone involved in imaging, particularly in scientific and forensic fields, is the question of what constitutes proper and appropriate processing and what constitutes unethical or even fraudulent manipulation. The short answer is that anything that alters an image so as to create a false impression on the part of the viewer is wrong. The problem with that answer is that it does not take into account the fact that different viewers will tend to see different things in the image anyway, and that what constitutes a false impression for one person may not for another.
The first rule is always to store a permanent copy of the original image along with relevant data on its acquisition. The second rule is to carefully document whatever steps are taken to process the image and generally to report those steps when the processed image is published. Most scientific publications and the editors who review submitted papers have become more aware in recent years of the ease with which image processing can be performed and the dangers of inadequate documentation. For example, see M. Rossner and K. M. Yamada (2004; J. Cell Biology) for that journal’s policy on image ethics and examples of improper manipulation.
For forensic purposes, there is an additional responsibility to fully record the entire step-by-step procedures that are used and to make sure that those methods are acceptable in court according to the U.S. Supreme Court's Daubert ruling (Daubert v. Merrell Dow Pharmaceuticals (92-102), 509 U.S. 579, 1993), which generally means that not only are the methods widely accepted by professionals, but also that they have been rigorously tested and have known performance outcomes. In a forensic setting, there will often be a need to explain a procedure, step by step, to a non-technical jury. This frequently requires showing that the details obtained from the image are really present in the original but only became visually evident with the processing.
Some procedures, such as rearranging features or combining them within a single image, or differently adjusting the contrast of several images to make them appear more alike, are clearly misleading and generally wrong. Some, such as using copy-and-paste to duplicate a portion of an image, or selectively erasing portions of an image, are out-and-out fraudulent. Even selective cropping of an image (or choosing which field of view to record) can create a false impression.
The general guideline to be considered is that it is never acceptable to add anything to an image, but it may be acceptable to suppress or remove some information if it makes the remaining details more accessible, either visually for presentation and communication or to facilitate measurement. Of course, the procedures used must be documented and reported. Any of the procedures shown here may be appropriate in a particular instance. But they can also be misused and should in any case never be used without understanding and careful documentation. The heart of the scientific method is replicability. If adequate information is provided on the processing steps applied and the original image data are preserved, then the validity of the results can be independently verified.
An important but often overlooked concern is the need to avoid using programs that alter the image without the user being aware of it. For example, carefully correcting the colors in an image using Photoshop® and then placing it in PowerPoint® for presentation will cause changes even on the same computer screen (as well as discarding pixels and reducing resolution if copy-and-paste is used for the transfer). In addition, the image may appear different on another computer monitor or when using a projector. Pasting an image into Microsoft® Word will reduce the resolution and the color or grayscale dynamic range. This may not affect the printed document, which has less gamut than the computer screen anyway, but the image cannot be subsequently retrieved from the document in its original form. Saving an image with a lossy compression method such as JPEG will discard potentially important information that cannot be recovered.
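To make that loss concrete, the short sketch below (my own illustration, assuming the NumPy and Pillow packages are installed) round-trips the same image through PNG and JPEG entirely in memory: the lossless PNG copy is bit-identical to the original, while the JPEG copy is not.

```python
from io import BytesIO

import numpy as np
from PIL import Image  # assumes the Pillow package is installed

# A synthetic grayscale image with fine detail (noise is hard to compress)
rng = np.random.default_rng(1)
original = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)

def roundtrip(fmt, **kwargs):
    """Save to an in-memory buffer in the given format, then reload."""
    buf = BytesIO()
    Image.fromarray(original).save(buf, format=fmt, **kwargs)
    buf.seek(0)
    return np.asarray(Image.open(buf))

png_copy = roundtrip("PNG")                 # lossless: pixels survive exactly
jpeg_copy = roundtrip("JPEG", quality=75)   # lossy: pixel values change

print("PNG identical: ", np.array_equal(original, png_copy))   # True
print("JPEG identical:", np.array_equal(original, jpeg_copy))  # False
```

This is why the first rule above matters: once the only surviving copy has passed through a lossy format, the discarded information cannot be reconstructed, no matter what processing is applied afterward.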
The reader is encouraged to use this book in concert with a real source of images and a computer-based system and to freely experiment with different methods to determine which are most appropriate for his or her particular needs. Selection of image processing tools to explore images when you don’t know the contents beforehand is a much more difficult task than using tools to make it easier for another viewer or a measurement program to see the same things you have discovered. It places greater demand on computing speed and the interactive nature of the interface. But it particularly requires that you become a very analytical observer of images. If you can learn to see what the computer sees and predict what various algorithms will do, you will become a better viewer and obtain the best possible images, suitable for further processing and analysis.
Acquiring Images
Human Vision
Printing and Storage
Correcting Imaging Defects
Image Enhancement in the Spatial Domain
Processing Images in Frequency Space
Segmentation and Thresholding
Processing Binary Images
Global Image Measurements
Feature-Specific Measurements
Characterizing Shape
Feature Recognition and Classification
Tomographic Imaging
3D Visualization
Imaging Surfaces