Springer, 1990. 324 pp.

Image data compression has been the subject of extensive research for the past two decades. The results of this research have provided the basis for efficient image communication and archival systems that are now being introduced at an increasingly rapid rate. This research and development continues as we strive to optimize the classical tradeoff between the amount of compression and the reconstructed image quality. Clearly, as we increase the compression ratio in these controlled-quality image coding schemes, we should expect some degradation in the reconstructed image, at least beyond some point. Depending on the application, the challenge is either to maximize the image quality at a fixed rate, or to minimize the rate required for a given quality.

We have developed a new technique, namely Recursive Block Coding (RBC), that has its roots in noncausal models for 1-D and 2-D signals. The underlying theory provides a multitude of compression algorithms encompassing two-source coding, transform coding, quadtree coding, hybrid coding, and so on. Since the noncausal models provide a fundamentally different image representation, they lead to new approaches to many existing algorithms, and in particular to useful asymmetric, progressive, and adaptive coding techniques.

On the theoretical front, our basic result shows that a random field (i.e., an ensemble of images) can be coded block by block such that the interblock redundancy is completely removed while the individual blocks are transform coded. For the chosen image models (used commonly in the literature), the optimum (KL) transform for the blocks is shown to be the (fast) discrete sine transform. On the practical side, the major artifact of tiling, the block boundary effect present in conventional block-by-block transform coding techniques, has been greatly suppressed.
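The DST/KL result can be illustrated numerically. A minimal sketch follows, with the block size `N` and correlation parameter `alpha` chosen for illustration (they are not taken from the book): for a first-order noncausal model with the boundary contribution removed, the relevant inverse-covariance structure is symmetric tridiagonal Toeplitz, and the DST-I basis diagonalizes any such matrix, which is exactly the Karhunen-Loeve property claimed above.

```python
import numpy as np

N = 8          # block size (illustrative)
alpha = 0.95   # inter-pixel correlation parameter (illustrative)

# Symmetric tridiagonal Toeplitz matrix of the kind that arises as the
# inverse covariance of a boundary-corrected first-order noncausal model.
Q = (1 + alpha**2) * np.eye(N) - alpha * (np.eye(N, k=1) + np.eye(N, k=-1))

# Orthonormal DST-I basis: S[k, n] = sqrt(2/(N+1)) * sin(pi*k*n/(N+1)),
# with k, n running from 1 to N.
k = np.arange(1, N + 1)
S = np.sqrt(2.0 / (N + 1)) * np.sin(np.pi * np.outer(k, k) / (N + 1))

# If S is the KL transform for this model, S @ Q @ S.T must be diagonal.
D = S @ Q @ S.T
off_diag = np.max(np.abs(D - np.diag(np.diag(D))))
print(off_diag)  # effectively zero (round-off only)
```

The diagonal entries of `D` are the eigenvalues 1 + alpha^2 - 2*alpha*cos(k*pi/(N+1)), so the sine basis plays the role of the (signal-independent, fast) KL transform for every block, regardless of `alpha`.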
At the same time, the underlying algorithms remain efficient even when the block size is reduced to 4×4 or 8×8, which is certainly an important factor in practical, high-speed implementations. This monograph contains not only a theoretical discussion of the algorithms, but also exhaustive simulations and suggested methodologies for ensemble design. Each of the resulting algorithms has been applied to two image ensembles drawn from a total of twelve 512×512 monochrome images, mainly from the USC image database, over a range of image data rates. The results are reported in several ways, namely: subjective descriptions, photographs, numerical MSE values, and h-plots, a recently proposed graphical representation showing a high level of agreement with image quality as judged subjectively.

Contents:
Introduction
RBC—The Theory behind the Algorithms
Bit Allocation and Quantization for Transform Coding
Zonal Coding
Adaptive Coding Based on Activity Classes
Adaptive Coding Based on Quad Tree Segmentation
QVQ—Vector Quantization of the Quad Tree Residual
Conclusions
A. Ordinary and Partial Differential and Difference Equations
B. Properties of the Discrete Sine Transform
C. Transform Domain Variance Distributions
D. Coding Parameters for Adaptive Coding Based on Activity Classes