Latin American applied research

Print version ISSN 0327-0793

Lat. Am. appl. res. vol.40 no.1 Bahía Blanca Jan. 2010

 

ARTICLES

A robust algorithm for binarization of objects

R. Rodríguez

Digital Signal Processing Group
Institute of Cybernetics, Mathematics & Physics (ICIMAF)
Calle 15 No. 551 e/ C y D CP 10400,
La Habana, Cuba. rrm@icmf.inf.cu

Abstract - Binarization is widely recognized as one of the most important steps in high-level image analysis systems, particularly for object recognition. Experience has shown that the most effective methods continue to be the iterative ones. In this work, entropy is used as a stopping criterion when segmenting an image by recursively applying mean shift filtering. In this way, a new algorithm for the binarization of objects is introduced.

Keywords - Image Segmentation; Mean Shift; Algorithm; Binarization; Entropy; Otsu's Method

I. INTRODUCTION

Many binarization methods have been proposed, particularly for medical-image data (Kenong et al., 1995; Sijbers et al., 1997; Chin-Hsing et al., 1998; Shareef et al., 1999; Schmid, 1999; Koss et al., 1999). Unfortunately, binarization using traditional low-level image processing techniques, such as thresholding, region growing and other classical operations, requires a considerable amount of interactive guidance in order to attain satisfactory results. Automating these model-free approaches is difficult because of complexity, shadows, and variability within and across individual objects.

Today, the most robust segmentation algorithms are the iterative methods, which cover a variety of techniques, for example mathematical morphology, deformable models and thresholding methods. However, one of the problems of these iterative techniques is the stopping criterion, for which many methods have been proposed (Chenyang et al., 2000; Vincent and Soille, 1991; Cheriet et al., 1998).

Mean shift is a non-parametric and versatile tool for feature analysis that can provide reliable solutions for many vision tasks (Comaniciu, 2000; Comaniciu et al., 2002). The mean shift was proposed in Fukunaga and Hostetler (1975) and largely forgotten until Cheng's paper (Cheng, 1995) renewed interest in it. Segmentation by means of the mean shift method carries out, as a first step, a smoothing filtering before the segmentation itself is performed (Comaniciu, 2000).

Entropy is not a new concept in the field of information theory. It has been used in image restoration, edge detection and, recently, as an objective evaluation method for image segmentation (Zhang et al., 2003).

In this work a new binarization strategy based on the computation of the mean shift is proposed. The proposal uses entropy as a stopping criterion, and the binarization is carried out after the segmented image is obtained.

The results obtained with this algorithm are compared with those of another algorithm developed by the author and collaborators, and also with Otsu's method (Rodríguez et al., 2005; Rodríguez et al., 2002a; Rodríguez et al., 2003). In other words, our interest in this research is to determine which algorithm is the most suitable and robust for object binarization; in this case, blood vessels. In this work the important information to be extracted from the images is just the number of objects.

The remainder of the paper is organized as follows: In Section 2, we provide the most significant theoretical aspects of the mean shift. In Section 3, we briefly introduce the entropy concept. In Section 4, we describe our binarization algorithm, which is based on the mean shift and takes entropy as a stopping criterion. In Section 5, the experimental results, comparisons and discussion are presented; a quantitative verification of the obtained results is also carried out in this section. Finally, in Section 6 the conclusions are presented.

II. THEORETICAL ASPECTS

The iterative procedure to compute the mean shift is introduced as a normalized estimate of the density gradient. By employing a differentiable kernel, an estimate of the density gradient can be defined as the gradient of the kernel density estimate; that is,

\hat{\nabla} f(x) = \frac{1}{n h^d} \sum_{i=1}^{n} \nabla K\!\left(\frac{x - x_i}{h}\right)    (1)

where n is the number of data points.

The kernel function K(x) is a function defined for d-dimensional x, satisfying

\int_{R^d} K(x)\, dx = 1    (2)

Other conditions on the kernel K(x) and the window radius h are derived in order to guarantee asymptotic unbiasedness, mean-square consistency, and uniform consistency of the estimate in Eq. (1) (Fukunaga and Hostetler, 1975). For example, for the Epanechnikov kernel:

K_E(x) = \tfrac{1}{2}\, c_d^{-1} (d + 2)\,(1 - x^T x) \ \text{if } x^T x < 1, \quad 0 \ \text{otherwise}    (3)

\hat{\nabla} f(x) = \frac{n_x}{n \left(h^d c_d\right)} \, \frac{d + 2}{h^2} \left( \frac{1}{n_x} \sum_{x_i \in S_h(x)} (x_i - x) \right)    (4)

where the region S_h(x) is a hypersphere of radius h having volume h^d c_d (c_d being the volume of the unit d-dimensional sphere), centered at x and containing n_x data points; that is, the uniform kernel. In addition, in this case d = 3, since x is a three-dimensional vector: two dimensions for the spatial domain and one for the range domain (gray levels). The last factor in expression (4) is called the sample mean shift,

M_h(x) = \frac{1}{n_x} \sum_{x_i \in S_h(x)} (x_i - x) = \frac{1}{n_x} \sum_{x_i \in S_h(x)} x_i \; - \; x    (5)

The quantity \hat{f}_U(x) = n_x / (n h^d c_d) is the kernel density estimate (where U denotes the uniform kernel) computed with the hypersphere S_h(x), and thus we can write expression (4) as:

\hat{\nabla} f(x) = \hat{f}_U(x)\, \frac{d + 2}{h^2}\, M_h(x)    (6)

which yields,

M_h(x) = \frac{h^2}{d + 2}\, \frac{\hat{\nabla} f(x)}{\hat{f}_U(x)}    (7)

Equation (7) shows that an estimate of the normalized gradient can be obtained by computing the sample mean shift in a uniform kernel centered on x. In addition, the mean shift has the direction of the gradient of the density estimate at x when this estimate is obtained with the Epanechnikov kernel. Since the mean shift vector always points towards the direction of the maximum increase in the density, it can define a path leading to a local density maximum; that is, to a mode of the density.
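
To make this property concrete, the following sketch (in Python with NumPy) iterates the sample mean shift of expression (5) with a uniform kernel until it reaches a local mode; the one-dimensional sample data, the bandwidth h = 1.0 and the starting point are illustrative assumptions, not values taken from the paper.

import numpy as np

def mean_shift_mode(x, samples, h, tol=1e-4, max_iter=100):
    # Follow the sample mean shift M_h(x) (expression (5)) with a uniform
    # kernel of radius h until the shift magnitude falls below tol.
    for _ in range(max_iter):
        window = samples[np.abs(samples - x) <= h]   # points inside S_h(x)
        if window.size == 0:
            break
        shift = window.mean() - x                    # M_h(x)
        x = x + shift                                # move towards higher density
        if abs(shift) < tol:
            break
    return x

# Illustrative data: two clusters; the procedure climbs to the nearest mode.
rng = np.random.default_rng(0)
samples = np.concatenate([rng.normal(2.0, 0.3, 200), rng.normal(6.0, 0.3, 200)])
print(mean_shift_mode(1.0, samples, h=1.0))          # converges near 2.0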

Other works have proven that, in the case of unimodal histograms, the mean shift vector points towards the mode (Fukunaga and Hostetler, 1975; Comaniciu, 2000; Comaniciu et al., 2002). Another recent and interesting result for the unimodal-histogram case relies on the fractal dimension (Salvatelli et al., 2007).

In Comaniciu et al. (2002) it was proven that the mean shift procedure, obtained by successively:

  • computing the mean shift vector Mh (x)
  • translating the window Sh (x) by Mh (x),

guarantees convergence.

Therefore, if the individual mean shift procedure is guaranteed to converge, a recursive application of the mean shift converges too. In other words, if one considers the recursive procedure as the sum of many individual mean shift procedures, and each individual procedure converges, then the recursive procedure converges too. This claim was already proven in Grenier et al. (2006). The question that remains open is when to stop the recursive procedure. The answer lies in the use of entropy, as will be shown in Section 3.

A digital image can be represented as a two-dimensional array of p-dimensional vectors (pixels), where p = 1 in the gray-level case, p = 3 for color images, and p > 3 in the multispectral case.

As was pointed out in Comaniciu et al. (2002), when the location and range vectors are concatenated in the joint spatial-range domain of dimension d = p + 2, their different nature has to be compensated by proper normalization of the parameters hs and hr. Thus, the multivariable kernel is defined as the product of two radially symmetric kernels, and the Euclidean metric allows a single bandwidth for each domain; that is:

K_{h_s, h_r}(x) = \frac{C}{h_s^2\, h_r^p} \, k\!\left(\left\| \frac{x^s}{h_s} \right\|^2\right) k\!\left(\left\| \frac{x^r}{h_r} \right\|^2\right)    (8)

where xs is the spatial part and xr is the range part of a feature vector, k(x) is the common profile used in both domains, hs and hr are the employed kernel bandwidths, and C is the corresponding normalization constant.

In this work, all the experiments were carried out by using a uniform kernel. In this case, the comparison between both algorithms was carried out using the same parameters (hr = 15 and hs = 12).
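
As an illustration of how the uniform kernel acts in the joint spatial-range domain with these bandwidths, the fragment below gathers the neighbourhood of a gray-level pixel; the function name and the array img are hypothetical, and only the window selection is shown, not the full filtering.

import numpy as np

def joint_domain_window(img, row, col, hs=12, hr=15):
    # Triples (row, col, gray) inside the joint spatial-range window centred
    # on pixel (row, col): spatial distance <= hs and gray-level difference <= hr.
    r0, r1 = max(0, row - hs), min(img.shape[0], row + hs + 1)
    c0, c1 = max(0, col - hs), min(img.shape[1], col + hs + 1)
    rr, cc = np.mgrid[r0:r1, c0:c1]
    patch = img[r0:r1, c0:c1].astype(float)
    spatial = (rr - row) ** 2 + (cc - col) ** 2 <= hs ** 2   # uniform spatial kernel
    in_range = np.abs(patch - float(img[row, col])) <= hr    # uniform range kernel
    mask = spatial & in_range
    return np.column_stack([rr[mask], cc[mask], patch[mask]])

The mean of the triples returned by this window gives one mean shift step for the pixel in the joint domain.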

III. ENTROPY

From the point of view of digital image processing the entropy of an image is defined as:

E = -\sum_{i=0}^{2^B - 1} p(x_i) \log_2 p(x_i)    (9)

where B is the number of bits of the digitized image, by convention 0 \cdot \log_2 0 = 0, and p(x_i) is the probability of occurrence of the gray-level value x_i. Within a totally uniform region, entropy reaches its minimum value. Theoretically speaking, the probability of occurrence of a gray-level value within a uniform region is always one. In practice, when one works with real images, the entropy value does not, in general, reach zero; this is due to the noise in the image. Therefore, if we consider entropy as a measure of the disorder within a system, it can be used as a good stopping criterion for an iterative process based on mean shift filtering. Entropy diminishes within each region as the regions become more homogeneous, and also in the whole image, until a stable value is reached. When convergence is reached, a totally segmented image is obtained, because the mean shift filtering is not idempotent. In addition, as was pointed out in Comaniciu et al. (2002), the mean shift based image segmentation procedure is a straightforward extension of the discontinuity-preserving smoothing algorithm, and the segmentation step does not add a significant overhead to the filtering process.
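
As a concrete reading of expression (9), the following sketch estimates the entropy of a B-bit gray-level image from its normalized histogram, using the convention 0·log 0 = 0; it assumes the image is stored as an unsigned-integer NumPy array.

import numpy as np

def image_entropy(img, bits=8):
    # E = -sum_i p(x_i) * log2 p(x_i) over the 2^bits possible gray levels;
    # empty bins are dropped, which realizes the convention 0*log(0) = 0.
    hist = np.bincount(img.ravel(), minlength=2 ** bits).astype(float)
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())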

The choice of entropy as a measure of goodness deserves several observations. Entropy reduces the randomness in corrupted probability density functions and tends to counteract noise. Following this analysis, since the segmented image is a simplified version of the original image, the entropy of the segmented image should be smaller. Recently, it was empirically found that the entropy of the noise diminishes faster than that of the signal (Zhang et al., 2003). Therefore, an effective stopping criterion is to stop when the relative rate of change of entropy, from one iteration to the next, falls below a given threshold. This is the essential part of this work.

IV. ALGORITHMS

In this section two algorithms are given: one for the mean shift filtering step and the other for the segmentation (binarization) step.

Algorithm No. 1. Filtering algorithm using the mean shift:

Let Xi and Zi, i = 1, ..., n, be the input and filtered images in the joint spatial-range domain. Each pixel is p ∈ Xi, p = (x, y, z) ∈ R³, where (x, y) ∈ R² and z ∈ [0, 2^β − 1], β being the number of bits per pixel in the image. The filtering algorithm comprises the following steps (Comaniciu et al., 2002):

  1. Initialize j = 1 and yi,1 = pi.

  2. Compute through the mean shift (see expression (3)) the value yi,j+1 and, with it, the mode to which the pixel converges; that is, the calculation of the mean shift is carried out until convergence, y = yi,c.

  3. Store in Zi the gray-level component of the calculated value: Zi = (xi^s, yi,c^r), where xi^s is the spatial component and yi,c^r is the range component.
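
A minimal sketch of Algorithm 1 for a gray-level image follows, assuming uniform kernels in both domains; it mirrors the three steps above, but it is an illustrative reconstruction rather than the author's implementation.

import numpy as np

def mean_shift_filter(img, hs=12, hr=15, tol=0.5, max_iter=20):
    # Move each pixel in the joint spatial-range domain to its mode and
    # store the range (gray-level) component of the point of convergence.
    rows, cols = img.shape
    data = img.astype(float)
    out = data.copy()
    for i in range(rows):
        for j in range(cols):
            y = np.array([float(i), float(j), data[i, j]])        # y_{i,1} = p_i
            for _ in range(max_iter):
                r0, r1 = int(max(0, y[0] - hs)), int(min(rows, y[0] + hs + 1))
                c0, c1 = int(max(0, y[1] - hs)), int(min(cols, y[1] + hs + 1))
                rr, cc = np.mgrid[r0:r1, c0:c1]
                patch = data[r0:r1, c0:c1]
                mask = (((rr - y[0]) ** 2 + (cc - y[1]) ** 2 <= hs ** 2)
                        & (np.abs(patch - y[2]) <= hr))           # uniform kernels
                if not mask.any():
                    break
                y_new = np.array([rr[mask].mean(), cc[mask].mean(),
                                  patch[mask].mean()])            # one mean shift step
                converged = np.linalg.norm(y_new - y) < tol
                y = y_new                                         # y_{i,j+1}
                if converged:                                     # y_{i,c}
                    break
            out[i, j] = y[2]            # Z_i keeps the gray-level component
    return out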

Algorithm No. 2. Binarization algorithm by recursively applying the mean shift filtering:

Let ent1 be the entropy value obtained in the current iteration and ent2 the entropy value of the previous iteration. Let errabs be the absolute value of the difference between these two entropy values. Let parlog be the parameter used to carry out the parametric logarithm. Let edsEnt be the threshold to stop the iterations; that is, the iterations stop when the relative rate of change of entropy, from one iteration to the next, falls below this threshold. Then, the segmentation algorithm comprises the following steps:

  1. Initialize ent2 = 1, errabs = 1, edsEnt = 0.001.

  2. While errabs > edsEnt do:

    1. Filter the image according to the steps of the previous algorithm; store the filtered image in Z[k].

    2. Calculate the entropy of the filtered image according to expression (9); store it in ent1.

    3. Calculate the absolute difference with the entropy value of the previous iteration: errabs = |ent1 - ent2|.

    4. Update the values: ent2 = ent1; Z[k+1] = Z[k].

  3. Carry out a parametric logarithm (parlog = 70).

  4. Binarization: assign white to the background and black to the objects.
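
The loop of Algorithm 2 can be sketched as follows, building on the mean_shift_filter and image_entropy sketches given earlier; the threshold edsEnt = 0.001 follows the text, while the final thresholding is only a placeholder, since the parametric-logarithm step is not specified in enough detail to reproduce it here.

import numpy as np

def binarize_by_mean_shift(img, hs=12, hr=15, eds_ent=0.001, max_rounds=50):
    # Filter recursively until the absolute change of entropy between two
    # consecutive iterations falls below eds_ent, then binarize.
    z = img
    ent2, errabs, rounds = 1.0, 1.0, 0
    while errabs > eds_ent and rounds < max_rounds:
        z = mean_shift_filter(np.rint(z).astype(np.uint8), hs=hs, hr=hr)
        ent1 = image_entropy(np.rint(z).astype(np.uint8))   # expression (9)
        errabs = abs(ent1 - ent2)                            # |ent1 - ent2|
        ent2 = ent1
        rounds += 1
    # Placeholder binarization: white background, black objects (dark vessels);
    # the paper's parametric logarithm (parlog = 70) is not reproduced here.
    return np.where(z < z.mean(), 0, 255).astype(np.uint8)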

It is possible to observe that, in this case, the proposed segmentation algorithm is a direct extension of the filtering algorithm, which finishes when entropy reaches stability. Note the simplification of this algorithm compared with the one proposed in Rodríguez et al. (2005).

Some comments on this algorithm follow. In Christoudias et al. (2002) it was stated that the recursive application of the mean shift property yields a simple mode detection procedure; the modes are the local maxima of the density. Therefore, with the new segmentation algorithm, which applies the mean shift recursively, convergence is guaranteed. Indeed, the proposed algorithm is a straightforward extension of the filtering process. In Comaniciu (2000) it was proven that the mean shift procedure converges. In other words, one can consider the new segmentation algorithm as a concatenated application of individual mean shift filtering operations; therefore, if the whole process is considered linear, the recursive algorithm converges. Binarization is carried out after the segmented image is obtained.

V. EXPERIMENTAL RESULTS. ANALYSIS AND DISCUSSION

A. The method of evaluation

Manual segmentation generally gives the best and most reliable results when identifying structures for a particular clinical task. Up to now, and due to the lack of ground truth, the quantitative evaluation of a binarization method has been difficult to achieve. An alternative is to use manual binarization results as the ground truth.

In order to evaluate the performance of both techniques, we calculated the percentage of false negatives (FN, objects which are not found by the strategy) and of false positives (FP, noise which is classified as objects). These were defined according to the following expressions,

FN(\%) = \frac{fn}{V_p} \times 100, \qquad FP(\%) = \frac{fp}{V_p} \times 100    (10)

where Vp is the actual number of objects identified by the physician, fn is the number of objects which were not marked by the strategy, and fp is the number of spurious regions which were marked as objects.
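
Expression (10) translates directly into code; the counts used in the example are hypothetical.

def error_rates(vp, fn, fp):
    # FN% = 100*fn/Vp and FP% = 100*fp/Vp, as in expression (10).
    return 100.0 * fn / vp, 100.0 * fp / vp

# Example: 40 objects marked by the physician, 0 missed, 3 spurious regions.
print(error_rates(40, 0, 3))   # -> (0.0, 7.5)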

B. Experimental Results

In order to clarify the nature of the objects to be isolated (blood vessels), some details on the original images are given. The studied images are biopsies of malignant tumors with an angiogenesis process. These were embedded in paraffin by using the immunohistochemical technique with the avidin-biotin complex method. Finally, monoclonal antibody CD34 was contrasted with methyl green in order to accentuate the formation of new blood vessels. The biopsies were obtained from soft tissue of human bodies. This analysis was carried out for more than 80 patients. The images were captured via the MADIP system with a resolution of 512x512 pixels and 8 bits/pixel (Rodríguez et al., 2001).

In Fig. 1, a first segmentation example carried out with both algorithms is shown. Although Otsu's method was already applied in a previous work (Rodríguez et al., 2001), the result obtained with this technique can also be observed here.


Fig. 1. (a) Original image, (b) Binarized image by using the new algorithm, (c) Binarized image by using graph, (d) Binarized image via Otsu's method.

In Fig. 1, one can note that the image binarized with the new algorithm is cleaner and does not accentuate the spurious information that appears in the original image (see arrows in Fig. 1 (a)). The result obtained with Otsu's method is evidently noisier; one can observe that a lot of noise was produced. The better result obtained with the new algorithm stems from the fact that it is a direct extension of the filtering process. The parameter used to carry out the parametric logarithm was equal to 70, and this value was the same for all the binarized images. Experimentation showed that the final result is not very sensitive to this parameter, since any variation in the range from 40 to 70 led to the same result. Compare the results obtained with the new algorithm with those attained in Rodríguez et al. (2002a) and Rodríguez et al. (2003). In Table I one can see that, iteration by iteration, entropy diminishes until reaching a stable value; that is, until convergence.

Table I: Decrease of the entropy in each one of the iterations

In Fig. 2, another binarization example carried out with both algorithms is shown.


Fig. 2. (a) Original image, (b) Binarized image by using the new algorithm, (c) Binarized image by using graph, (d) Binarized image via Otsu's method.

Comparing both images, one can observe that the image binarized with the new algorithm has a better appearance than the one obtained with the algorithm that uses graphs. Note that the objects binarized with the new algorithm (see Fig. 2 (b)) are more similar to the original objects (see the interior part of the blood vessels). On the other hand, the image binarized with Otsu's method was very noisy.

From Table II, one can see that, iteration by iteration, convergence was reached.

Table II: Decrease of the entropy in each one of the iterations

It is interesting to observe that at iteration 4 an increase in entropy is noticed, but from this iteration on entropy falls quickly.

Fig. 3 shows another example of binarization carried out with both algorithms.


Fig. 3. (a) Original image, (b) Binarized image by using the new algorithm, (c) Binarized image by using graph, (d) Binarized image via Otsu's method.

One can observe that although, apparently, the result of the new algorithm is noisier, according to the criterion of the specialists the result of Fig. 3 (b) is the correct one. In the case of the result obtained with the algorithm that uses graphs, some objects were eliminated (see arrows in the original image). This behavior is due to the parameter M, whose goal is to eliminate irrelevant objects. With the new algorithm it is not necessary to use this parameter. The result obtained with Otsu's method is very noisy. In Rodríguez et al. (2002a), we proposed a variant in order to work with Otsu's method. If a morphological filter (for example, an opening) is carried out after the result obtained with Otsu's method, many objects (blood vessels) can be eliminated. This issue was proven in a previous work (Rodríguez et al., 2002b).

Finally, in Table III one can appreciate the behavior of the entropy until the algorithm reaches convergence.

Table III: Decrease of the entropy in each one of the iterations

Other examples of binarization appear in Figures 4, 5 and 6.


Fig. 4. (a) Original image, (b) Binarized image by using the new algorithm, (c) Binarized image by using graph, (d) Binarized image via Otsu's method.


Fig. 5. (a) Original image, (b) Binarized image by using the new algorithm, (c) Binarized image by using graph, (d) Binarized image via Otsu's method.


Fig. 6. (a) Original image, (b) Binarized image by using the new algorithm, (c) Binarized image by using graph, (d) Binarized image via Otsu's method.

In this case, the most significant differences between the results obtained with both algorithms can be observed in the images of Figs. 4 and 6, where they are indicated with arrows. The differences of both algorithms with respect to Otsu's method are evident: in all cases Otsu's method produced a lot of noise. It was proven that even when the images were smoothed with a Gaussian filter, the results obtained with Otsu's method were not good. The better result obtained with the mean shift algorithm is due to the fact that the mean shift is a good low-pass filter. On the other hand, observe in Fig. 6 that in the result obtained with the algorithm that uses graphs at least one object was eliminated (see circle in Fig. 6 (c)). However, this problem did not occur in the image binarized with the new algorithm (see Fig. 6 (b)).

C. Quantitative comparison among all algorithms

Since much of the motivation for object binarization is to automate all or part of the binarization process, it is perhaps more important to compare the results obtained by automatic binarization against those obtained by manual binarization. Figure 7 shows three examples of the XOR computed between the manual binarization and the binarization obtained with each algorithm. In this validation only three of the analyzed images are presented; nevertheless, the comparison was carried out with all the images.



Fig. 7. (a) Manual binarization images, (b) XOR with the binarized images by recursively applying the mean shift filtering, (c) XOR with the binarized images by using graph, (d) XOR with the binarized images via Otsu's method.
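
The XOR comparison shown in Fig. 7 can be computed directly on the binary masks; a minimal sketch, assuming both binarizations are arrays of the same size in which non-zero pixels mark objects.

import numpy as np

def binarization_xor(manual, automatic):
    # White (255) where the manual and automatic binarizations disagree.
    disagree = (manual > 0) ^ (automatic > 0)
    return (disagree * 255).astype(np.uint8)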

The results reported in this investigation have been confirmed by qualitative and quantitative comparisons carried out by specialists, since they know the objects that they want to isolate. Therefore, in order to evaluate the performance of all the algorithms, the percentages of false positives and false negatives were calculated between the result of each binarization and the manual binarization. The numerical results of the comparison, obtained by using Eq. (10), are summarized in Tables IV, V and VI.

Table IV: Numerical results of comparison between manual and automatic binarization (by recursively applying the mean shift filtering)

Table V: Numerical results of comparison between manual and automatic binarization (by using graph)

Table VI: Numerical results of comparison between manual and automatic binarization (by using the Otsu's method)

In Table IV one can observe that, in the case of recursively applying the mean shift filtering, the error percentage for false negatives was equal to 0%; that is, all regions belonging to objects were detected. This denotes the correct performance of the new algorithm. However, in Table V it is possible to note that with the algorithm that uses graphs the FN was, on two occasions, different from zero; that is, the algorithm was not able to detect all objects. This is due to the aggressiveness of the parameter M in that algorithm. In Table VI one can see that Otsu's method did not produce FN; that is, all objects were detected. However, this method produced a lot of FP, which is related to noise. This evidences the advantage of the new algorithm with regard to the previous one. Nevertheless, both algorithms that use the mean shift were superior to Otsu's method.

VI. CONCLUSIONS

In this work a new algorithm that recursively applies the mean shift filtering and uses entropy as a stopping criterion was proposed. With this new algorithm, binarization is carried out after the image has been segmented. It was demonstrated that the new binarization algorithm, based on recursively applying the mean shift, was more effective and more robust than the algorithm that uses graphs. It was also proven that, in all cases, the new algorithm was superior to Otsu's method.

REFERENCES

1. Cheng, Y., "Mean Shift, Mode Seeking, and Clustering", IEEE Trans. Pattern Analysis and Machine Intelligence, 17, 790-799 (1995).

2. Cheriet, M., J.N. Said and C.Y. Suen, "A Recursive Thresholding Technique for Image Segmentation", IEEE Transactions on Image Processing, 7, 918-921 (1998).

3. Chin-Hsing, C., J. Lee, J. Wang and C.W. Mao, "Color image segmentation for bladder cancer diagnosis", Math. Comput. Modeling, 27, 103-120 (1998).

4. Chenyang, X., P. Dzung and P. Jerry, "Image Segmentation Using Deformable Models", SPIE Handbook on Medical Imaging, Medical Image Analysis, edited by J.M. Fitzpatrick and M. Sonka, III, 129-174 (2000).

5. Comaniciu, D.I., Nonparametric Robust Methods for Computer Vision, Ph.D. thesis, Rutgers, The State University of New Jersey, New Brunswick (2000).

6. Comaniciu, D. and P. Meer, "Mean Shift: A Robust Approach Toward Feature Space Analysis", IEEE Transactions on Pattern Analysis and Machine Intelligence, 24, 603-619 (2002).

7. Christoudias, C.M., B. Georgescu and P. Meer, "Synergism in Low Level Vision", 16th International Conference on Pattern Recognition, Quebec City, Canada, IV, 150-155 (2002).

8. Fukunaga, K. and L.D. Hostetler, "The Estimation of the Gradient of a Density Function", IEEE Transactions on Information Theory, 21, 32-40 (1975).

9. Grenier, T., C. Revol-Muller, F. Davignon and G. Gimenez, "Hybrid Approach for Multiparametric Mean Shift Filtering", IEEE International Conference on Image Processing, Atlanta, GA, 1541-1544 (2006).

10. Kenong, W., D. Gauthier and M.D. Levine, "Live Cell Image Segmentation", IEEE Transactions on Biomedical Engineering, 42, 1-12 (1995).

11. Koss, J.E., F.D. Newman, T.K. Johnson and D.L. Kirch, "Abdominal organ segmentation using texture transforms and a Hopfield neural network", IEEE Trans. Med. Imag., 18, 640-648 (1999).

12. Rodríguez, R., T.E. Alarcón and L. Sánchez, "MADIP: Morphometrical Analysis by Digital Image Processing", Proceedings of the IX Spanish Symposium on Pattern Recognition and Image Analysis, ISBN 84-8021-349-3, I, 291-298 (2001).

13. Rodríguez, R., T.E. Alarcón and I. Castellanos, "A strategy for reduction of noise in segmented images. Its use in the study of angiogenesis", Journal of Intelligent and Robotic Systems, 33, 99-112 (2002a).

14. Rodríguez, R., T.E. Alarcón, R. Wong and L. Cuello, "Color Segmentation Applied to Study of the Angiogenesis", Journal of Intelligent and Robotic Systems, 34, 83-97 (2002b).

15. Rodríguez, R., T.E. Alarcón and J.J. Abad, "Blood vessel segmentation via neural network in histological images", Journal of Intelligent and Robotic Systems, 36, 451-465 (2003).

16. Rodríguez, R., A.G. Suárez and P.J. Castillo, "Utilización de la media desplazada para la segmentación de imágenes", Boletín de la Sociedad Cubana de Matemática y Computación, 3 (2005).

17. Salvatelli, A., J. Caropresi, C. Delrieux, M.F. Izaguirre and V. Caso, "Cellular Outline Segmentation using Fractal Estimators", Journal of Computer Science and Technology, 7, 14-22 (2007).

18. Shareef, N., D.L. Wang and R. Yagel, "Segmentation of medical images using LEGION", IEEE Trans. Med. Imag., 18, 74-91 (1999).

19. Schmid, P., "Segmentation of digitized dermatoscopic images by two-dimensional colour clustering", IEEE Trans. Med. Imag., 18, 164-171 (1999).

20. Sijbers, J., P. Scheunders, M. Verhoye, A. Van der Linden, D. Van Dyck and E. Raman, "Watershed-based segmentation of 3D MR data for volume quantization", Magnetic Resonance Imaging, 15, 679-688 (1997).

21. Vincent, L. and P. Soille, "Watersheds in digital spaces: An efficient algorithm based on immersion simulations", IEEE Transactions on Pattern Analysis and Machine Intelligence, 13, 583-593 (1991).

22. Zhang, H., J.E. Fritts and S.A. Goldman, "An Entropy-based Objective Evaluation Method for Image Segmentation", Storage and Retrieval Methods and Applications for Multimedia 2004, edited by M. Yeung, R. Lienhart and C.-S. Li, Proceedings of the SPIE, 5307, 38-49 (2003).

Received: April 25, 2008.
Accepted: December 30, 2008.
Recommended by Subject Editor Jorge Solsona.
