Image enhancement quality metrics
Andrey Nasonov, Andrey Krylov, Laboratory of Mathematical Methods of Image Processing, Department of Computational Mathematics and Cybernetics, Moscow State University, Moscow, Russia, {nasonov, kryl}@cs.msu.ru

Abstract
The paper presents new adaptive full-reference metrics for the quality measurement of image enhancement algorithms. The idea of the proposed metrics is to find the areas related to typical artifacts of image enhancement algorithms. Two types of artifacts are considered: blur and the ringing effect. The concept of basic edges is used to find the areas of these artifacts which are invariant to image corruption and image enhancement methods. The metrics are illustrated with an application to image resampling and image deblurring.

Keywords: image metrics, image enhancement, blur artifact, ringing artifact.

1. INTRODUCTION

Figure 1: Deblurring of a noisy blurred image by the unsharp mask with two different parameters. The PSNR values are the same, but the edges are sharper in the left result image, while the non-edge area is better in the right image. (Panels: blurred image; results of image deblurring.)

Restoration of high-frequency information of an image is a common problem in image processing. High-frequency information is corrupted or even lost during various image corruption and degradation procedures like downsampling or blurring. It is not possible to completely reconstruct the lost high-frequency information, therefore artifacts appear in restored images. The typical artifacts of image enhancement algorithms caused by the loss of high-frequency information are blur and the ringing effect near sharp edges.

Development of image metrics is important for the objective analysis of image resampling, deringing, deblurring, denoising and other image enhancement algorithms. Image metrics compare the ground truth image with the restored image. Since the ground truth image is unavailable in most cases, the simulation approach is used. In this approach, artifact-free images are corrupted to simulate the effect which the evaluated image enhancement algorithm aims to suppress. Then the corrupted images are restored using the given algorithm and compared to the corresponding reference images using image metrics.

There exists a large variety of image metrics ranging from simple but fast approaches like MSE and PSNR to more complicated metrics based on modeling the human visual system [1]. Most image metrics can provide an estimation of perceptual image quality, but they cannot be used to develop effective image enhancement algorithms because they do not focus on the typical artifacts caused by the corruption of high-frequency information. Two image enhancement algorithms can give the same metrics values while their results are very different: the first algorithm processes edges well but corrupts the non-edge area, while the second one corrupts only edges. Such an example for image deblurring is shown in Fig. 1.

There also exist no-reference quality estimation algorithms that measure specific artifacts like blur and ringing for certain image restoration algorithms like image compression [2, 3, 4], but they are not applicable to the general case.
The work was supported by the federal target program "Research and Development in Priority Fields of S&T Complex of Russia in 2007-2013".

In this paper, we develop metrics for image enhancement algorithms. The proposed metrics are focused on finding the areas related to the considered typical image enhancement artifacts: edge blur and the ringing effect. Given the parameters of the image corruption and the image enhancement method, it is possible to find the areas related to these artifacts and to calculate image quality metrics in these areas separately. This information can help find the most problematic areas of the given image enhancement algorithm.

An algorithm to find the area related to the ringing effect was proposed in [5], but this algorithm has limitations and cannot be applied to most image enhancement algorithms. Our proposed method is based on the concept of basic edges: sharp edges which are distant from other edges and thus survive image corruption. Perceptual metrics for these areas are suggested. The proposed metrics estimate the quality of different image enhancement methods by analyzing the image quality in the areas of blur and the ringing effect. We use the simulation approach, so the image degradation type and its parameters are considered to be known.

In section 2, we analyze blur and the ringing effect for the enhancement of low-resolution images, blurred images and images with the ringing effect. In section 3, we find the edges suitable for image quality estimation. In section 4, we introduce our metrics to estimate the quality of image enhancement methods. Applications of the proposed metrics to image resampling and image deblurring are shown in section 5.

2. ARTIFACT ANALYSIS

Since both blur and the ringing effect are results of the loss of high-frequency information, these effects should be considered together. If all frequencies above 1/(2p) are truncated in the Fourier transform, ringing oscillations appear and the edges become blurred. The length of a single ringing oscillation and the edge width are both equal to p pixels. An example of high-frequency truncation is shown in Fig. 2. Although the number of ringing oscillations is unlimited for the high-frequency cut-off, usually no more than 1-2 oscillations are noticeable.

Figure 2: Appearance of blur and the ringing effect after the high-frequency information cut-off for p = 4. (Panels: the original image and the image after the high-frequency cut-off, their edge profiles, and their 2D Fourier transform moduli.)

In practice, the high-frequency information is usually corrupted but not completely absent, and the cut-off frequency cannot be obtained directly from the Fourier transform. In this case additional investigation is required to estimate the blur and ringing effect parameter. This parameter can be predicted a priori from the image degradation type.

Low-resolution images are constructed using a downsampling procedure which includes low-pass anti-aliasing filtering followed by decimation. During decimation with scale factor s, the frequencies greater than 1/(2s) are discarded. The cut-off is not ideal because of the two-dimensionality of the image. For any linear image resampling method producing blur and ringing, the parameter p depends only on the scale factor s, and p = s. For non-linear image resampling methods we use p = s as well. In image deringing the parameter p is already known from the definition of the problem.

Blurred images are the results of low-pass filtering followed by adding noise. We consider Gaussian blur with a known radius σ and noise with a Gaussian distribution and standard deviation equal to d. There is no frequency cut-off, and the parameter p depends on the image deblurring method. For the unsharp mask we use p = kσ, where 2.5 ≤ k ≤ 3. In the appendix, these results are confirmed experimentally.
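To make the relation between the cut-off frequency 1/(2p) and the width of the blur and ringing area concrete, the following minimal sketch (our own illustration, not code from the paper) truncates the spectrum of a 1D step edge:

```python
import numpy as np

# Minimal sketch (not from the paper): hard truncation of all frequencies
# above 1/(2p) applied to a 1D step edge. The result shows a blurred edge
# and ringing oscillations, both with characteristic length ~p pixels.
p = 4                                   # blur/ringing parameter
n = 256
step = np.zeros(n)
step[n // 2:] = 1.0                     # ideal step edge

spectrum = np.fft.fft(step)
freqs = np.fft.fftfreq(n)               # frequencies in cycles per pixel
spectrum[np.abs(freqs) > 1.0 / (2 * p)] = 0.0   # cut off high frequencies

truncated = np.real(np.fft.ifft(spectrum))      # blurred + ringing edge
```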


3. BASIC EDGES

Blur and ringing effects appear near sharp edges, but not every sharp edge can be used for image quality analysis. Some edges can disappear or be displaced after image corruption, and if such edges are used for blur and ringing analysis, the results will be incorrect. There are two effects observed in images with corrupted high-frequency information:

1. Masking effect. If an edge with a low gradient value is located near an edge with a high gradient value, it will disappear after image blurring.

2. Edge displacement. If two edges with the same or close gradient values are located near each other, they will be displaced after image blurring.

To find the edges which do not suffer from the masking effect and edge displacement during image corruption, we impose the following conditions (a code sketch of the first and third conditions is given after this list):

1. The edge point (i0, j0) is not masked by nearby edges:

$$g_{i_0,j_0} > \max_{i,j}\, g_{i,j}\, \psi\!\left(\sqrt{(i-i_0)^2+(j-j_0)^2}\right), \qquad (1)$$

where $g_{i,j}$ is the image gradient modulus in pixel $(i,j)$ and $\psi(d) = \exp\!\left(-\frac{d^2}{2p^2}\right)$.

2. The distance from the edge point to the nearest edge is greater than a threshold R. This check is performed using mathematical morphology [6]. We use R = 3p.

3. The gradient modulus $g_{i,j}$ is greater than a threshold $g_0$. This condition is used to reduce the influence of noise on the blur and ringing effect.

We call the edges that pass all these conditions basic edges, and the edges that pass only the first condition non-masked edges.
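The sketch below is our own reading of conditions (1) and 3 (the gradient operator, the 3p window radius and all names are our choices, not the authors' code); the morphological distance check of condition 2 is omitted for brevity:

```python
import numpy as np

# Sketch of the non-masked edge test, condition (1), plus the gradient
# threshold of condition 3. Condition 2 (distance to the nearest other
# edge greater than R = 3p) is not implemented here.
def edge_maps(image, p, g0):
    gy, gx = np.gradient(image.astype(np.float64))
    g = np.hypot(gx, gy)                      # gradient modulus g_{i,j}

    # Maximum of psi(distance) * g over the neighbours of every pixel;
    # psi decays fast, so a window of radius ~3p is enough in practice.
    r = int(np.ceil(3 * p))
    weighted_max = np.zeros_like(g)
    for di in range(-r, r + 1):
        for dj in range(-r, r + 1):
            if di == 0 and dj == 0:
                continue
            psi = np.exp(-(di * di + dj * dj) / (2.0 * p * p))
            neighbour = np.roll(np.roll(g, di, axis=0), dj, axis=1)  # wraps at borders
            weighted_max = np.maximum(weighted_max, psi * neighbour)

    non_masked = g > weighted_max             # condition (1)
    strong = g > g0                           # condition 3: suppress noise
    # Basic edges additionally require condition 2, not implemented here.
    return non_masked, non_masked & strong
```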

4. IMAGE QUALITY METRICS

After the detection of basic edges, we perform image segmentation. According to the analysis of the profile of a step edge after the high-frequency cut-off with parameter p (see Fig. 2), we divide the image into three sets:

1. The set M1 containing all pixels for which the nearest non-masked edge is a basic edge and the distance to this edge is less than or equal to p/2. The blur effect is the most likely to appear in this set.

2. The set M2 containing all pixels for which the nearest non-masked edge is a basic edge and the distance to this edge is greater than p/2 and less than or equal to 2p. The ringing effect is the most likely to appear in this set.

3. The set M3 of all pixels for which the distance to the nearest non-masked edge is greater than 2p. This set contains no non-masked edges and corresponds to flat and textured areas in the image.

An example of finding these sets is shown in Fig. 3; a code sketch of this segmentation is given after the SSIM definition below. To measure image quality, we calculate the metrics values in the sets M1, M2 and M3. Any metrics can be used here. We use SSIM [7] due to its simplicity and good correlation with perceptual image quality:

$$\mathrm{SSIM}(M, u, v) = \frac{(2\mu_u \mu_v + c_1)(2\sigma_{uv} + c_2)}{(\mu_u^2 + \mu_v^2 + c_1)(\sigma_u^2 + \sigma_v^2 + c_2)},$$

where µu, µv are the averages of u and v respectively, σu², σv² are the variances, σuv is the covariance of u and v, c1 = (k1 L)², c2 = (k2 L)², L is the dynamic range of the pixel values (typically 255), k1 = 0.01 and k2 = 0.03. The values µu, µv, σu², σv², σuv are calculated only in the set M.
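The segmentation into M1, M2 and M3 can be implemented with a Euclidean distance transform. The sketch below is our own helper (the function name and the use of scipy are our assumptions, not the authors' code); it takes boolean maps of non-masked and basic edge pixels computed for the reference image:

```python
import numpy as np
from scipy import ndimage

# Sketch: build the sets M1, M2, M3 from boolean maps of non-masked and
# basic edge pixels of the reference image.
def build_sets(non_masked, basic, p):
    # Distance from every pixel to the nearest non-masked edge pixel and
    # the coordinates of that nearest edge pixel.
    dist, (ni, nj) = ndimage.distance_transform_edt(~non_masked,
                                                    return_indices=True)
    nearest_is_basic = basic[ni, nj]
    m1 = nearest_is_basic & (dist <= p / 2.0)                   # blur area
    m2 = nearest_is_basic & (dist > p / 2.0) & (dist <= 2 * p)  # ringing area
    m3 = dist > 2 * p                                           # flat/texture area
    return m1, m2, m3
```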


Now we are ready to introduce the image quality value vector for image u with ground truth image v and given blur-ringing parameter p:

$$QV(u, v, p) = (Q_1, Q_2, Q_3, Q_4) = \bigl(\mathrm{SSIM}(M_1, u, v),\ \mathrm{SSIM}(M_2, u, v),\ \mathrm{SSIM}(M_3, u, v),\ \mathrm{SSIM}(u, v)\bigr). \qquad (2)$$
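A direct reading of SSIM restricted to a set and of equation (2), with the statistics computed globally over each set (a simplification of the usual windowed SSIM), might look as follows; the function names and defaults are our own:

```python
import numpy as np

# Sketch of SSIM restricted to a pixel set M and of the quality vector QV
# from equation (2). Statistics are computed globally over the set, which
# is a simplification; the constants follow c1 = (k1*L)^2, c2 = (k2*L)^2.
def ssim_on_set(mask, u, v, L=255.0, k1=0.01, k2=0.03):
    c1, c2 = (k1 * L) ** 2, (k2 * L) ** 2
    uu = u[mask].astype(np.float64)
    vv = v[mask].astype(np.float64)
    mu_u, mu_v = uu.mean(), vv.mean()
    var_u, var_v = uu.var(), vv.var()
    cov_uv = ((uu - mu_u) * (vv - mu_v)).mean()
    return ((2 * mu_u * mu_v + c1) * (2 * cov_uv + c2)) / \
           ((mu_u ** 2 + mu_v ** 2 + c1) * (var_u + var_v + c2))

def quality_vector(u, v, m1, m2, m3):
    # m1, m2, m3 are the boolean masks for M1, M2, M3 built from the
    # ground truth image v (see the build_sets sketch above).
    whole = np.ones(u.shape, dtype=bool)
    return (ssim_on_set(m1, u, v), ssim_on_set(m2, u, v),
            ssim_on_set(m3, u, v), ssim_on_set(whole, u, v))
```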



Figure 3: The result of basic edge detection for p = 4. In the edge maps, white edges are the edges that passed condition (1) and blue edges are the edges that did not pass it; white edges are basic edges and red edges are non-masked, non-basic edges. The yellow area is the set M1, the green area is M2, the blue area is M3. (Panel labels: original image; low-resolution image; blurred (σ = 3) and noisy.) The QV value is a vector containing the SSIM values calculated in the sets M1, M2, M3 and in the entire image; higher values mean better image quality. The sets M1, M2, M3 are constructed for the image v with the given parameter p.

5. APPLICATIONS

The proposed metrics are demonstrated by applying them to construct combined algorithms for image resampling and image deblurring. We consider the case when two image enhancement algorithms give approximately the same values of some general-purpose metric but produce different artifacts: the first algorithm has a strong blur artifact, while the second one has a strong ringing artifact. This difference is detected by the proposed metrics. The combined algorithm is constructed as a linear combination of the two image enhancement algorithms u and v:

$$w_{i,j} = a(d_{i,j})\, u_{i,j} + (1 - a(d_{i,j}))\, v_{i,j},$$

where a(d) is the weight coefficient depending on the distance d_{i,j} to the closest non-masked edge in the blurred image. Consider u and v such that QV1(u) < QV1(v) and QV2(u) > QV2(v). In this case we use

$$a(d) = \begin{cases} 0, & d < p/2, \\ \dfrac{2d - p}{p}, & p/2 \le d < p, \\ 1, & d \ge p. \end{cases}$$
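A minimal sketch of this blending (our own implementation of the formula above; the distance map is obtained with a distance transform of the non-masked edge map of the degraded image):

```python
import numpy as np
from scipy import ndimage

# Sketch of the linear combination w = a(d)*u + (1 - a(d))*v with the
# piecewise-linear weight a(d) defined above.
def combine(u, v, non_masked_edges, p):
    # Distance of every pixel to the closest non-masked edge.
    d = ndimage.distance_transform_edt(~non_masked_edges)
    a = np.clip((2.0 * d - p) / p, 0.0, 1.0)   # 0 for d < p/2, 1 for d >= p
    return a * u + (1.0 - a) * v
```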

The result of combining bicubic interpolation and sinc interpolation for the problem of image resampling is shown in Fig. 4. For the problem of image deblurring, the result of combining the unsharp mask and regularized total variation (TV) deconvolution in the low-frequency domain is shown in Fig. 5. In both cases the combined method shows a better SSIM calculated over the whole image than the two given methods. Also, the Q1, Q2 and Q3 values of the combined method are better than the corresponding best values of the given methods. This makes it possible to say that the results of the combination based on the proposed metrics are better than the results of the methods used for the combination.

Figure 4: Application of the proposed metrics to improve the results of image resampling methods. Bicubic interpolation: QV = (0.9255, 0.9987, 0.9996, 0.9831); sinc (ideal) interpolation: QV = (0.9423, 0.9978, 0.9989, 0.9837); combined method: QV = (0.9424, 0.9986, 0.9996, 0.9862).

6. CONCLUSION

New full-reference metrics for the quality measurement of image enhancement algorithms were developed. The metrics were tested on image resampling and image deblurring problems. The approach also looks promising for combining two different image enhancement algorithms to obtain a better result.

7. REFERENCES

[1] W. S. Lin, Digital Video Image Quality and Perceptual Coding, chapter "Computational Models for Just-Noticeable Difference," pp. 281-303, CRC Press, 2006.
[2] R. Ferzli and L. J. Karam, "Human visual system based no-reference objective image sharpness metric," ICIP'06, pp. 2949-2952, 2006.
[3] M. Balasubramanian, S. S. Iyengar, J. Reynaud, and R. W. Beuerman, "A ringing metric to evaluate the quality of images restored using iterative deconvolution algorithms," Proc. of the 18th Int. Conf. on Systems Engineering (ICSENG'05), pp. 483-488, 2005.
[4] A. Punchihewa and A. Keerl, "Test pattern based evaluation of ringing and blur in JPEG and JPEG2000 compressed images," 4th International Conference on Signal Processing and Communication Systems (ICSPCS 2010), pp. 1-7, 2010.
[5] P. Marziliano, F. Dufaux, S. Winkler, and T. Ebrahimi, "Perceptual blur and ringing metrics: Application to JPEG2000," Signal Processing: Image Communication, vol. 19, no. 2, pp. 163-172, 2004.
[6] A. V. Nasonov and A. S. Krylov, "Basic edges metrics for image deblurring," Proceedings of the 10th Conference on Pattern Recognition and Image Analysis: New Information Technologies, vol. 1, pp. 243-246, 2010.
[7] Z. Wang, A. Bovik, H. Sheikh, and E. Simoncelli, "Image quality assessment: from error visibility to structural similarity," IEEE Transactions on Image Processing, vol. 13, no. 4, pp. 600-612, 2004.
[8] A. S. Krylov, A. S. Lukin, and A. V. Nasonov, "Edge-preserving nonlinear iterative image resampling method," ICIP'09, pp. 385-388, 2009.


8. APPENDIX

We have performed a frequency analysis of different image enhancement algorithms to confirm the proposition from Section 2 that the parameter p can be estimated from the image degradation method. For every image we calculate the cumulative spectrum function (CSF) A(ω):

$$A(\omega) = \int_0^{2\pi} \left|\hat{f}(\omega \cos\theta,\, \omega \sin\theta)\right|^2 d\theta,$$


where $\hat{f}(\omega_1, \omega_2)$ is the linearly interpolated discrete Fourier transform of the image f. The analysis consists in calculating the difference between the CSFs A(ω) of the reference images from a set of standard images (baboon, cameraman, house, goldhill, lena, peppers) and the CSFs of the enhanced images. The frequency power functions for the different methods of image resampling, deringing and deblurring are shown in Fig. 6. It can be seen that the change of the curve shape happens at the expected point ω = 1/(2p) = 1/4.
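As an illustration, a discretized version of A(ω) can be computed as in the following sketch (our own discretization; nearest-neighbour sampling of the spectrum stands in for the linear interpolation mentioned above):

```python
import numpy as np

# Sketch: cumulative spectrum function A(w), the spectral energy integrated
# over directions at a fixed radial frequency w (cycles per pixel).
def cumulative_spectrum(image, n_radii=128, n_angles=360):
    f = np.fft.fftshift(np.fft.fft2(image.astype(np.float64)))
    h, w = f.shape
    cy, cx = h / 2.0, w / 2.0
    radii = np.linspace(0.0, 0.5, n_radii)
    thetas = np.linspace(0.0, 2 * np.pi, n_angles, endpoint=False)
    A = np.empty(n_radii)
    for k, r in enumerate(radii):
        # Sample |f_hat|^2 on the circle of radial frequency r.
        ys = np.clip(np.round(cy + r * h * np.sin(thetas)).astype(int), 0, h - 1)
        xs = np.clip(np.round(cx + r * w * np.cos(thetas)).astype(int), 0, w - 1)
        A[k] = np.mean(np.abs(f[ys, xs]) ** 2) * 2 * np.pi
    return radii, A
```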

Figure 5 panels: original image; blurred (σ = 3) and noisy, QV = (0.8168, 0.9747, 0.9861, 0.9828); regularized deconvolution, QV = (0.9520, 0.9754, 0.9843, 0.9844); unsharp mask with parameter 2, QV = (0.8791, 0.9852, 0.9788, 0.9833); combined method, QV = (0.9409, 0.9901, 0.9984, 0.9889).

Figure 5: Application of the proposed metrics to improve the results of deblurring methods.

Figure 6: Cumulative spectrum function differences for different image corruption and enhancement methods. Image resampling (s = 2): ideal interpolation (zero filling); Lanczos3; bicubic; regularization-based interpolation with a low regularization parameter [8]. Image deringing (p = 2): Gaussian blur with σ = 1; TV projection. Image deblurring (σ = 0.7): TV regularization in the low-frequency domain; unsharp mask with parameter 6.

ABOUT THE AUTHORS
Andrey Nasonov is a member of the scientific staff at the Laboratory of Mathematical Methods of Image Processing, Faculty of Computational Mathematics and Cybernetics, Lomonosov Moscow State University. His contact email is nasonov@cs.msu.ru.

Andrey Krylov is a professor and the head of the Laboratory of Mathematical Methods of Image Processing, Faculty of Computational Mathematics and Cybernetics, Lomonosov Moscow State University. His contact email is kryl@cs.msu.ru.