FINDING AREAS OF TYPICAL ARTIFACTS OF IMAGE ENHANCEMENT METHODS

A.V. Nasonov, A.S. Krylov

Laboratory of Mathematical Methods of Image Processing, Faculty of Computational Mathematics and Cybernetics, Lomonosov Moscow State University, 119991, Russia, Moscow, Leninskie gory, (495) 939-11-29, nasonov@cs.msu.ru, kryl@cs.msu.ru
* The work was supported by the federal target program "Scientific and scientific-pedagogical personnel of innovative Russia" in 2009-2013 and by RFBR grant 10-01-00535.
The paper presents a new method to find areas related to typical artifacts of image enhancement methods. Two artifacts are analyzed: edge blur and the overshooting effect. The method is based on the analysis of basic edges -- the edges which remain after image processing algorithms are performed.

Introduction

Quality estimation of image enhancement methods is important for image processing. Image metrics [1] are usually used to numerically evaluate and compare the results of image resampling (interpolation), image deblurring (sharpening), image deringing and other image enhancement algorithms. Image quality can differ from one image area to another. For example, bilinear interpolation blurs the edges, while bicubic interpolation introduces the overshooting artifact -- false edges near strong edges. Existing metrics operate on entire images and provide an overall image quality estimate. There are methods for blur [2] and ringing [3] estimation, but they are designed for specific image enhancement algorithms.

In this work, we propose a method to find the areas of an image related to two artifacts of image enhancement methods: edge blur and the overshooting effect. These artifacts are typical of image interpolation, image deblurring and image deringing methods and usually appear near sharp edges that are distant from other edges.

Basic edges

We use only basic edges -- the edges which are not displaced and do not disappear after image quality degradation such as downsampling or blurring. The following restrictions are applied to image edges [4]:

1. An edge with a low gradient value is not masked by nearby edges with high gradient values. The following rule is applied to edge points:
$$g_{i_0, j_0} \ge \max_{i,j} \, g_{i,j} \, \varphi\!\left(\sqrt{(i - i_0)^2 + (j - j_0)^2}\right),$$

where $g_{i,j}$ is the gradient modulus field and $\varphi$ is the mask function. We use

$$\varphi(x) = h\, e^{-\frac{x^2}{2d^2}}, \qquad h = \frac{1}{d^2}.$$

2. The distance from the edge point to the closest edge is greater than a predefined threshold $r_T = 2d$. If the distance between two edges is less than $r_T$, the edges will be displaced. We use mathematical morphology to find the edges which pass this condition [4].


3. The gradient value is greater than a given threshold.

The parameter $d$ depends on the image degradation method. For image interpolation we set $d$ equal to the scale factor; for ringing suppression, $d$ is the width of a single ringing oscillation; for image blur by a Gaussian filter
with radius $\sigma$, we use $d = \sigma$.
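A minimal NumPy/SciPy sketch of the three conditions above is given below. It is an assumed illustration, not the authors' implementation: the Sobel gradient estimate, the local-maximum edge detector and the function name basic_edge_mask are our own choices, and condition 2 is checked with a plain distance test instead of the mathematical morphology used in the paper.

```python
import numpy as np
from scipy import ndimage

def basic_edge_mask(image, d, grad_threshold):
    """Hypothetical sketch of the three basic-edge conditions.

    image          -- 2D grayscale array
    d              -- degradation-dependent parameter (e.g. the scale factor)
    grad_threshold -- minimal gradient modulus (condition 3)
    """
    # Gradient modulus field g[i, j] (Sobel is an assumption; any estimate works).
    img = image.astype(float)
    g = np.hypot(ndimage.sobel(img, axis=1), ndimage.sobel(img, axis=0))

    # Candidate edge points: local maxima of g above the threshold (condition 3).
    candidates = (g == ndimage.maximum_filter(g, size=3)) & (g > grad_threshold)

    h = 1.0 / d ** 2                      # amplitude of the mask function
    r_T = 2.0 * d                         # distance threshold of condition 2
    ii, jj = np.nonzero(candidates)
    keep = np.zeros_like(candidates)

    for i0, j0 in zip(ii, jj):
        dist2 = (ii - i0) ** 2 + (jj - j0) ** 2

        # Condition 1: g[i0, j0] >= max g[i, j] * phi(distance), with
        # phi(x) = h * exp(-x^2 / (2 d^2)).  For brevity the maximum is taken
        # over the candidate points only; the rule in the text maximizes over
        # the whole gradient field.
        phi = h * np.exp(-dist2 / (2.0 * d ** 2))
        not_masked = g[i0, j0] >= np.max(g[ii, jj] * phi)

        # Condition 2: no other edge point closer than r_T = 2d (a plain
        # distance test here, mathematical morphology in the paper).
        others = dist2 > 0
        isolated = (not others.any()) or (dist2[others].min() > r_T ** 2)

        keep[i0, j0] = not_masked and isolated

    return keep
```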

Detection of the areas of interest

We consider the areas related to the following artifacts of image enhancement algorithms: the blur artifact, which appears in basic edge areas, and the ringing (overshooting) artifact, which appears in the edge neighborhood. We define the basic edge points (BEP) as the set of points whose closest edge is a basic edge and whose distance to it is less than $d/2$. The basic edge neighborhood (BEN) is formed by the points whose distance to the closest edge point is between $d/2$ and $d s/2$, where the parameter $s$ is defined by the number of ringing oscillations. We use $s = 2$.
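Both regions can be computed from Euclidean distance transforms of the edge maps. The sketch below is one possible realization under that assumption; the function name bep_ben_masks, the boolean-mask interface and the use of the distance to the nearest basic edge for BEN are our own choices.

```python
import numpy as np
from scipy.ndimage import distance_transform_edt

def bep_ben_masks(all_edges, basic_edges, d, s=2):
    """Hypothetical sketch of BEP/BEN area detection.

    all_edges   -- boolean mask of all detected edge points
    basic_edges -- boolean mask of basic edge points (a subset of all_edges)
    d           -- degradation-dependent parameter
    s           -- number of ringing oscillations (the paper uses s = 2)
    """
    # Distance from every pixel to the nearest basic edge point and to the
    # nearest edge point of any kind (distance_transform_edt measures the
    # distance to the nearest zero, hence the negation).
    dist_basic = distance_transform_edt(~basic_edges)
    dist_any = distance_transform_edt(~all_edges)

    # BEP: the closest edge is a basic edge and it lies closer than d/2.
    bep = (dist_basic <= dist_any) & (dist_basic < d / 2)

    # BEN: a band around basic edges at distances between d/2 and d*s/2,
    # where the ringing/overshooting artifact is expected.
    ben = (dist_basic >= d / 2) & (dist_basic <= d * s / 2)

    return bep, ben
```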

The results of basic edges and areas of interest detection are shown in Fig. 1.

Conclusion

A new method to find areas related to typical artifacts of image enhancement methods has been developed. It can be used together with image metrics to estimate the quality of image enhancement methods.


a) Original image.
b) Finding basic edges: white edges are basic edges, gray edges are non-basic non-masked edges.
c) BEP (white) and BEN (gray) areas.

Fig. 1. The result of basic edges and areas of interest detection.

References

1. A. J. Ahumada. Computational image quality metrics: a review // SID Digest, 1993, pp. 305-308.
2. R. Ferzli, L. J. Karam. Human Visual System Based No-Reference Objective Image Sharpness Metric // 2006 IEEE International Conference on Image Processing (ICIP), pp. 2949-2952.
3. P. Marziliano, F. Dufaux, S. Winkler, T. Ebrahimi. Perceptual Blur and Ringing Metrics: Application to JPEG2000 // Signal Processing: Image Communication, 2004, vol. 19, no. 2, pp. 163-172.
4. A. V. Nasonov, A. S. Krylov. Basic Edges Metrics for Image Deblurring // Proceedings of the 10th Conference on Pattern Recognition and Image Analysis: New Information Technologies, St. Petersburg, 2010, vol. 1, pp. 243-246.