Part 2 of the book Dermoscopy Image Analysis covers, among other topics, dermoscopy image assessment based on perceptible color regions, an accurate and scalable system for automatic detection of malignant melanoma, and the path from dermoscopy to mobile teledermatology.
FIGURE 8.1 (See color insert.) An example of the DMB system and lesion segmentation. (a) The original image. (b) The average color distance ratio image for determining the region of interest (ROI). (c) The result of automated segmentation for the pigmented lesion, where the lesion border is shown by the white line. (d–f) The red, green, and blue channel images, respectively. (g–i) Three degrees of brightness images obtained from (d–f), respectively. (From Lee, G. et al., Skin Research and Technology, vol. 18, pp. 462–470, 2012.)

After identifying the lesion, three diagnostic parameters were measured on 150 dermoscopy images (75 malignant melanomas and 75 benign pigmented lesions): the dominant color regions (DCRs), the bluish dominant regions (BDRs), and the number of minor color regions (MCRs). With the DMB system, nine color regions (ddd, ddm, dmm, mdd, mdm, mmd, mmm, bmd, and bmm) were present in more than 1% of the dermoscopy images, whereas the remaining 18 color regions (ddb, dmd, dmb, dbd, dbm, dbb, mdb, mmb, mbd, mbm, mbb, bdd, bdm, bdb, bmb, bbd, bbm, and bbb) were detected in fewer than 1% of both the malignant melanoma and the benign pigmented lesion images [28]. The ddd, mdd, mmd, mmm, and bmm regions were present in more than 70% of all images and, because they were so common, were selected as the five dominant color regions (5-DCRs). The percentage of the lesion area occupied by the 5-DCRs is shown in Figure 8.2.

TABLE 8.1 Presence Rate and Occupying Rate of the DMB System in Malignant Melanoma and Benign Pigmented Lesions on 150 Dermoscopy Images Comprising 75 Malignant Melanomas and 75 Benign Pigmented Lesions. For each of the 27 DMB color regions (ddd through bbb), the table lists the presence rate (%) and the occupying rate (%) separately for malignant melanoma and benign pigmented lesions. Note: The DMB group incidences in each lesion were counted only if the region occupied more than 1% of the total ROI area.

The 5-DCRs made up a larger percentage of the total lesion region of interest (ROI) for the benign pigmented lesions than for the malignant melanomas. The 5-DCRs occupied more than 90% of the ROI area in 93.33% of the benign pigmented lesions; in contrast, they comprised more than 90% of the area in only 52.0% of the malignant melanoma lesions. The 5-DCRs were commonly present in both malignant melanoma and benign pigmented lesions, and most benign lesions consisted largely of 5-DCRs. The occupying rate in melanoma, however, was lower than in the benign pigmented lesions, which is relevant because the number of colors is considered one of the important factors in melanoma diagnosis.
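To make the DMB construction concrete, the sketch below assigns each pixel a three-letter d/m/b code from its R, G, and B values and measures how much of the lesion ROI a given region occupies. It is a minimal illustration rather than the authors' implementation: the per-channel thresholds stand in for the multithresholding step described in the chapter, and the array layout and function names are assumptions.

```python
import numpy as np

def dmb_label(image, t_low, t_high):
    """Assign each pixel a three-letter DMB code (d/m/b per R, G, B channel).

    image  : H x W x 3 RGB array (assumed layout)
    t_low  : per-channel lower thresholds, e.g. (85, 85, 85)    # assumed values
    t_high : per-channel upper thresholds, e.g. (170, 170, 170) # assumed values
    """
    letters = np.array(["d", "m", "b"])  # dark, middle, bright
    levels = np.zeros(image.shape[:2] + (3,), dtype=int)
    for ch in range(3):  # R, G, B
        channel = image[..., ch]
        levels[..., ch] = (channel >= t_low[ch]).astype(int) + (channel >= t_high[ch]).astype(int)
    codes = np.empty(image.shape[:2], dtype="<U3")
    for i, j in np.ndindex(codes.shape):
        codes[i, j] = "".join(letters[levels[i, j]])  # e.g. "mdd", "bmm"
    return codes

def occupying_rate(codes, lesion_mask, region):
    """Fraction of the lesion ROI covered by one DMB region (e.g. 'mmd')."""
    return float(np.mean(codes[lesion_mask] == region))
```

Presence and occupying rates such as those in Table 8.1 then follow by checking, image by image, whether occupying_rate exceeds the 1% cutoff mentioned in the table note.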
FIGURE 8.2 The percentage of the lesion occupied by the 5-DCRs in each malignant melanoma (black bars) and benign pigmented lesion (white bars); the horizontal axis groups lesions into 0–90% and 90–100% of the lesion area, and the vertical axis shows the incidence of presence.

FIGURE 8.3 The number of MCRs in each malignant melanoma (black bars) and benign pigmented lesion (white bars); the vertical axis shows the incidence of presence.

Of the nine color regions present in more than 1% of the dermoscopy images, five (ddd, mdd, mmd, mmm, and bmm) were selected as the 5-DCRs because they are commonly present in lesions. The remaining four color regions (ddm, dmm, mdm, and bmd) were defined as minor color regions (MCRs). The number of MCRs in the lesions is shown in Figure 8.3. One MCR or fewer was detected in 94.67% of the benign pigmented lesion group, and no MCR at all was detected in 52%. In contrast, more than two MCRs were detected in 58.46% of the malignant melanoma group.

FIGURE 8.4 The incidence of BDRs in malignant melanomas (black bars) and benign pigmented lesions (white bars), according to the absence or presence of a BDR in the lesion.

Bluish colors such as the blue–white veil are another important color feature in melanoma diagnosis; these colors appear when the B channel is brighter than the other channels in RGB color space. We defined the BDRs (ddm, ddb, dmb, mdb, and mmb) as regions having higher brightness in the B channel than in the R and G channels. The ddb and mdb regions were not detected in any images, and therefore only the ddm, dmb, and mmb regions were included as BDRs in this study (Table 8.1). The ratio of BDRs in the lesions is shown in Figure 8.4. The BDRs were present (at more than 1% of the lesion) in only 6.67% of the benign pigmented lesion group, compared to 61.33% of the malignant melanoma group.

The diagnostic accuracy was then calculated using three diagnostic parameters derived from the 5-DCRs, the BDRs, and the number of MCRs. The DCR parameter was considered positive when the 5-DCRs occupied less than 80% of the lesion; the BDR parameter was considered positive when any BDR area was detected within the lesion; and the MCR parameter was considered positive when the lesion contained more than two MCRs. A melanoma diagnosis was made according to how many of these three parameters were positive, and the diagnostic accuracy was expressed in terms of sensitivity and specificity (Table 8.2). With one positive diagnostic parameter, the sensitivity was 73.33% and the specificity 92.00%; with two positive parameters, 53.33% and 96.00%; and with three positive parameters, 30.67% and 98.67%.

TABLE 8.2 Diagnostic Accuracy of Melanoma Based on the Three Diagnostic Parameters

                Positive in Single Parameter   Positive in Two Parameters   Positive in Three Parameters
Sensitivity     73.33%                         53.33%                       30.67%
Specificity     92.00%                         96.00%                       98.67%

Note: The sensitivity and specificity were calculated for combinations of the diagnostic parameters derived from the 5-DCRs, BDRs, and the number of MCRs.
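The three-parameter rule summarized in Table 8.2 maps directly onto a short decision function. The sketch below is illustrative only; the function and variable names are placeholders, and the thresholds are the ones stated in the text (80% 5-DCR coverage, any BDR area, more than two MCRs).

```python
def melanoma_call(dcr_occupancy, bdr_area_fraction, n_mcr, required_positives=1):
    """Combine the three DMB-based diagnostic parameters into a melanoma decision.

    dcr_occupancy     : fraction of the lesion ROI covered by the 5-DCRs (0..1)
    bdr_area_fraction : fraction of the lesion ROI covered by BDRs (0..1)
    n_mcr             : number of minor color regions detected in the lesion
    required_positives: how many positive parameters are required for a melanoma call
    """
    dcr_positive = dcr_occupancy < 0.80     # 5-DCRs cover less than 80% of the lesion
    bdr_positive = bdr_area_fraction > 0.0  # any bluish dominant region detected
    mcr_positive = n_mcr > 2                # more than two minor color regions

    n_positive = sum([dcr_positive, bdr_positive, mcr_positive])
    return n_positive >= required_positives, n_positive

# Requiring two positive parameters corresponds to the middle column of Table 8.2.
is_melanoma, n_positive = melanoma_call(0.72, 0.03, 3, required_positives=2)
```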
8.3.5 PERCEPTIBLE COLOR DIFFERENCE

The colors of the color regions are based on the three gray levels in each channel, and these colors differ from the colors observed in the original image. Hence, we approximated each color region to the color of the original image by using the average color of that region. In order to assess the number of colors, every color used in the assessment must have a perceptible color difference from the other colors; if two colors have only a slight or imperceptible difference, they have to be treated as one color. We used the National Bureau of Standards (NBS) unit to calculate the color difference between the approximated colors. The NBS unit was established to better approximate human color perception and is closely related to the value of human color perception [29]. However, the NBS unit is based on the CIE 1994 color difference model (ΔE*_94), which is calculated in the CIE L*a*b* color space, so a color space conversion is required. The definition of CIE L*a*b* is based on the CIEXYZ color space, which is derived from the RGB color space as follows:

\begin{bmatrix} X \\ Y \\ Z \end{bmatrix} =
\begin{bmatrix}
0.4124564 & 0.3575761 & 0.1804375 \\
0.2126729 & 0.7151522 & 0.0721750 \\
0.0193339 & 0.1191920 & 0.9503041
\end{bmatrix}
\begin{bmatrix} R \\ G \\ B \end{bmatrix}    (8.8)

Then, the XYZ color space is transformed into L*a*b* using the CIE L*a*b* formulas given in Equation 8.9:

L^* = 116\, f\!\left(\tfrac{Y}{100}\right) - 16, \quad
a^* = 500\left[ f\!\left(\tfrac{X}{95.05}\right) - f\!\left(\tfrac{Y}{100}\right) \right], \quad
b^* = 200\left[ f\!\left(\tfrac{Y}{100}\right) - f\!\left(\tfrac{Z}{108.88}\right) \right]    (8.9)

with

f(q) = \begin{cases} q^{1/3}, & \text{if } q > 0.008856 \\ 7.787\, q + \tfrac{16}{116}, & \text{otherwise} \end{cases}    (8.10)

In this study we set the white reference to D65 for the two transformations. In the CIE L*a*b* color space, L* correlates with the perceived lightness, while a* and b* correlate approximately with the red–green and yellow–blue chroma perceptions. a* and b* can also be represented in terms of C*_ab (chroma) and H*_ab (hue), as expressed in Equations 8.11 and 8.12, respectively [30]:

C^*_{ab} = \sqrt{a^{*2} + b^{*2}}    (8.11)

H^*_{ab} = \tan^{-1}\!\left(\frac{b^*}{a^*}\right)    (8.12)

In the transformed CIE L*a*b* color space, the CIE 1994 color difference ΔE*_94 is calculated using Equations 8.13 and 8.14:

\Delta E^*_{94} = \sqrt{ \left(\frac{\Delta L^*}{k_L S_L}\right)^{2} + \left(\frac{\Delta C^*_{ab}}{k_C S_C}\right)^{2} + \left(\frac{\Delta H^*_{ab}}{k_H S_H}\right)^{2} }    (8.13)

S_L = 1, \qquad S_C = 1 + 0.045\, C^*_{ab}, \qquad S_H = 1 + 0.015\, C^*_{ab}    (8.14)

The parametric factors k_L, k_C, and k_H adjust the relative weighting of the lightness, chroma, and hue components of the color difference for various experimental conditions [30]. To calculate ΔE*_94, these three parametric factors are set as follows:

k_L = k_C = k_H = 1    (8.15)

The NBS unit is used as the unit representing the color difference. It can be roughly classified into five levels according to the degree of color difference perceived by humans, as given in Table 8.3.

TABLE 8.3 Correspondence between Human Color Perception and the NBS Unit

NBS Unit       Human Perception
0–1.5          Almost the same
1.5–3.0        Slightly different
3.0–6.0        Remarkably different
6.0–12.0       Very different
12.0 and up    Different color

From the table, two colors are almost the same or only slightly different when the NBS unit is smaller than 3.0 and remarkably different when the NBS unit is between 3.0 and 6.0. In this study, we define two colors to be imperceptibly different at each of three grades (NBS unit less than 3, 4.5, and 6), and the color regions whose color difference is less than the chosen grade are merged.
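The conversion chain and the merging rule above can be collected into a short sketch. It follows Equations 8.8 through 8.15 and then merges representative colors whose pairwise difference falls below a chosen grade; treating the NBS grade directly as a ΔE*_94 threshold and using a simple union-style merge are assumptions made here for illustration, not the authors' exact procedure.

```python
import numpy as np
from itertools import combinations

M_RGB2XYZ = np.array([[0.4124564, 0.3575761, 0.1804375],
                      [0.2126729, 0.7151522, 0.0721750],
                      [0.0193339, 0.1191920, 0.9503041]])

def _f(q):
    # Equation 8.10
    return np.where(q > 0.008856, np.cbrt(q), 7.787 * q + 16.0 / 116.0)

def rgb_to_lab(rgb):
    """RGB -> CIE L*a*b* via CIEXYZ (Equations 8.8-8.10), D65 white reference.

    rgb is assumed scaled so that white maps to XYZ = (95.05, 100, 108.88).
    """
    X, Y, Z = M_RGB2XYZ @ np.asarray(rgb, dtype=float)
    fx, fy, fz = _f(X / 95.05), _f(Y / 100.0), _f(Z / 108.88)
    return np.array([116.0 * fy - 16.0, 500.0 * (fx - fy), 200.0 * (fy - fz)])

def delta_e94(lab1, lab2):
    """CIE 1994 color difference (Equations 8.13-8.15 with kL = kC = kH = 1).

    dH^2 is obtained from the identity dH^2 = da^2 + db^2 - dC^2.
    """
    L1, a1, b1 = lab1
    L2, a2, b2 = lab2
    C1, C2 = np.hypot(a1, b1), np.hypot(a2, b2)
    dL, dC = L1 - L2, C1 - C2
    dH_sq = max((a1 - a2) ** 2 + (b1 - b2) ** 2 - dC ** 2, 0.0)
    SC, SH = 1.0 + 0.045 * C1, 1.0 + 0.015 * C1
    return np.sqrt(dL ** 2 + (dC / SC) ** 2 + dH_sq / SH ** 2)

def count_perceptible_colors(region_rgbs, nbs_grade):
    """Merge representative colors whose difference is below the grade, then count."""
    labs = [rgb_to_lab(c) for c in region_rgbs]
    parent = list(range(len(labs)))  # simple union-find for merging
    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i
    for i, j in combinations(range(len(labs)), 2):
        if delta_e94(labs[i], labs[j]) < nbs_grade:
            parent[find(i)] = find(j)
    return len({find(i) for i in range(len(labs))})
```

For a given lesion, region_rgbs would hold the average color of each detected color region, and the returned count feeds the color-count criteria evaluated in Section 8.3.6.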
8.3.6 COLOR ASSESSMENT BASED ON PERCEPTIBLE GRADE

Imperceptible colors are determined on the basis of the three NBS grades (3, 4.5, and 6), and the number of colors is assessed as the number of perceptible colors in the lesion. Table 8.4 shows the sensitivity, specificity, and diagnostic accuracy values obtained using the three NBS grades.

TABLE 8.4 Sensitivity (SE), Specificity (SP), and Diagnostic Accuracy (ACC) Values at Three Grades of the NBS Unit. The table reports SE, SP, and ACC at 3, 4.5, and 6 NBS units for color-count criteria ranging from more than 2 to more than 7 colors; the best operating points are stated in the text.

In the case of 3 NBS units, the highest diagnostic accuracy of 92.67%, with 92.00% sensitivity and 93.33% specificity, is obtained at more than six colors. In the case of 4.5 NBS units, the highest diagnostic accuracy of 87.33%, with 78.67% sensitivity and 96.00% specificity, is obtained at more than six colors. In the case of 6 NBS units, the highest diagnostic accuracy of 88.67%, with 82.67% sensitivity and 94.67% specificity, is obtained at more than five colors.

In the color assessment, each color region is formed by three discernible gray levels in each channel. The three gray levels are regarded as the lesion, the doubtful area, and the surrounding skin from the perspective of extraction, and as dark, middle, and bright from the perspective of color. Each color region therefore refers to a distinct region constructed by a perceptible classification, and the approximated colors of the color regions can be taken as the representative colors of the dermoscopic image. However, the number of representative colors is not equal to the number of colors in a lesion, because some representative colors can be only slightly different; the number of colors is therefore estimated by counting the number of different perceptible colors.

The NBS unit is useful for judging the perceptible color difference. It is based on the CIE 1994 color difference model and is closely related to human color perception: two colors are almost the same or only slightly different when the NBS value is less than 3, remarkably different when the value is between 3 and 6, and very different when the value is more than 6. In this study, we defined the imperceptible color difference on the basis of three grades (3, 4.5, and 6 NBS units) within the remarkably different range and counted the number of colors in the lesion, assessing each choice in terms of sensitivity, specificity, and diagnostic accuracy. The best operating points at the three grades are those listed above, and the highest diagnostic accuracy overall was obtained with the 3 NBS unit grade.
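Table 8.4 is generated by sweeping a color-count criterion over the lesions and scoring the resulting melanoma calls against the ground truth. A generic sketch of that scoring is shown below; the data layout (per-lesion perceptible color counts plus labels) is an assumption.

```python
def sensitivity_specificity_accuracy(color_counts, is_melanoma, min_colors):
    """Score the rule 'call melanoma when more than `min_colors` perceptible colors'.

    color_counts : per-lesion numbers of perceptible colors (e.g. from Section 8.3.5)
    is_melanoma  : ground-truth labels (True for malignant melanoma)
    """
    tp = fp = tn = fn = 0
    for count, truth in zip(color_counts, is_melanoma):
        predicted = count > min_colors
        if predicted and truth:
            tp += 1
        elif predicted and not truth:
            fp += 1
        elif not predicted and truth:
            fn += 1
        else:
            tn += 1
    se = tp / (tp + fn) if tp + fn else 0.0
    sp = tn / (tn + fp) if tn + fp else 0.0
    acc = (tp + tn) / len(color_counts)
    return se, sp, acc

# Sweeping min_colors over 1..6 at each NBS grade reproduces the layout of Table 8.4.
```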
8.4 CONCLUSION

In this chapter, we have presented a new method for color assessment in melanocytic lesions based on 27 color regions, called the DMB system, which simplifies the color information in dermoscopy images. We classified each color channel into three degrees of brightness using a multithresholding method and performed the color assessment with the DMB system, which is constructed from the three perceptible degrees of brightness in each RGB channel. The five dominant color regions (5-DCRs), the bluish dominant regions (BDRs), and the number of minor color regions (MCRs) were calculated as diagnostic parameters, and the diagnostic accuracy was calculated according to the number of positive parameters.

ACKNOWLEDGMENT

This work was supported by the Ministry of Commerce, Industry and Energy through a grant from the Strategic National R&D Program (Grant 10028284), by a Korea University grant (K0717401), and by the National Research Foundation of Korea (NRF) (Grant 2012R1A1A2006556). The Seoul Research and Business Development Program also supported this study financially (Grant 10574).
PH2

... 5.15%, and 2.05%, respectively. These results demonstrate that the implemented segmentation algorithm provides good results for the majority of the tested images. However, there are three main groups of images in which the algorithm may show limitations: (1) images in which the lesion is fragmented, (2) lesions presenting a great variety of colors and textures, and (3) images with very low contrast between the lesion and the skin.

13.3.2 COLOR LABELING

Color plays a major role in the analysis of dermoscopy images [1]. For example, the ABCD rule of dermoscopy considers six clinically significant colors (blue–gray, black, white, dark brown, light brown, and red) that medical doctors try to detect in melanocytic lesions. The number of colors observed in an image is a malignancy measure that is combined with other criteria (border, symmetry, differential structures) [5]. The detection of colors is a subjective task that requires considerable training; nonetheless, the development of computational methods for this task is a desired goal. This problem is addressed in [13] by assuming that the feature vector y, associated with a small patch of size 12 × 12 and a color c, is a random variable described by a mixture of Gaussians:

p(y \mid c, \theta_c) = \sum_{m=1}^{k_c} \alpha_{cm}\, p(y \mid c, \theta_{cm})    (13.7)

where k_c is the number of Gaussians in the mixture and θ_cm = (α_cm, μ_cm, R_cm) denotes the set of parameters (weight, mean, covariance matrix) of the mth Gaussian mode. The number of Gaussians, k_c, and their parameters, θ_cm, must be estimated from a set of dermoscopy images annotated by an expert. This is possible because the PH2 database contains 29 images with color segmentations performed by an experienced dermatologist. The estimation of the mixture order and parameters was carried out using the algorithm proposed in [14]. This procedure is repeated for each of the five colors considered in this chapter (blue–gray, black, white, dark brown, and light brown); the red color was excluded due to the lack of examples associated with it in the annotated images.

After estimating the five mixtures, each pixel is classified into the most probable color. This is achieved by Bayes' law:

p(c \mid y) = \frac{p(y \mid c, \theta_c)\, p(c)}{p(y \mid \theta)}, \qquad c = 1, \ldots, 5    (13.8)

where θ_c = (θ_{c1}, ..., θ_{ck_c}) and θ = (θ_1, ..., θ_5) are the estimates of the mixture parameters.
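A schematic version of this color model, using scikit-learn Gaussian mixtures, is sketched below. It is illustrative only: the original work estimates the mixture order with the algorithm of [14], whereas here a fixed number of components per color is assumed, and the patch features are left abstract.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

COLORS = ["blue-gray", "dark brown", "light brown", "black", "white"]

def fit_color_mixtures(training_patches, n_components=3):
    """Fit one Gaussian mixture per clinically significant color (cf. Equation 13.7).

    training_patches : dict mapping color name -> (N, d) array of patch features
    n_components     : fixed mixture order, assumed here for simplicity
    """
    mixtures = {}
    for color in COLORS:
        gmm = GaussianMixture(n_components=n_components, covariance_type="full",
                              random_state=0)
        mixtures[color] = gmm.fit(training_patches[color])
    return mixtures

def color_posteriors(y, mixtures, priors=None):
    """Posterior p(c | y) for a single patch feature vector y (Equation 13.8)."""
    y = np.atleast_2d(y)
    priors = priors or {c: 1.0 / len(COLORS) for c in COLORS}
    log_joint = np.array([mixtures[c].score_samples(y)[0] + np.log(priors[c])
                          for c in COLORS])
    joint = np.exp(log_joint - log_joint.max())  # stabilized exponentiation
    return dict(zip(COLORS, joint / joint.sum()))
```

color_posteriors returns the per-color posteriors consumed by the decision rule described next.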
A label c is assigned to the pixel y using the following decision rule. First, the degrees of membership to each of the five colors are sorted; the highest and second-highest values are denoted c1 and c2, respectively. This information is then used to either label the patch or reject it, as follows: if p(c1 | y) ≥ δ and p(c1 | y) − p(c2 | y) > ε, where δ and ε are empirically determined thresholds, the patch is labeled with color c1; if p(c1 | y) ≥ δ and p(c1 | y) − p(c2 | y) ≤ ε, the patch receives a label that expresses doubt between c1 and c2; and if p(c1 | y) < δ, the patch is rejected.

Finally, we assign color labels to the test images. This task is performed using the patches previously labeled with one of the five colors; patches with doubt labels are not considered in this process. For each color, we compute the area of the patches carrying its label (i.e., the number of pixels) and compare it with an empirically determined area ratio threshold. Each color is validated only if its area ratio is above a color-specific threshold, where the area ratio λ_c for color c is defined as

\lambda_c = \frac{A_{\mathrm{patches},c}}{A_{\mathrm{lesion}}}    (13.9)

For each color we use a different area ratio threshold, determined empirically.

We now present experimental results obtained on the PH2 database. The five mixture models were estimated using 29 annotated images from the PH2 database (see Section 13.2); each of these images is associated with five binary masks defining examples of each color. We then selected another 123 images (test set) to evaluate the performance of the algorithm. The test images were classified according to the learned models, and a set of labels was automatically computed for each image and compared with the colors annotated by a medical specialist. The following metrics were used to evaluate the algorithm: each color correctly identified in an image is counted as a correct detection (CD), a missed color as a detection failure (DF), a detected color that has no corresponding annotation as a false alarm (FA), and each color that is correctly not detected as a correct nondetection (CND). From these we also compute three statistics: sensitivity (SE), specificity (SP), and accuracy (ACC). Table 13.2 compares the performance of the proposed system with the labeling performed by the specialist; a good match was obtained on this dataset. Figure 13.8 shows two examples of color detection.

TABLE 13.2 Evaluation of the Color Labeling Algorithm on the PH2 Database. For each color (blue–gray, dark brown, light brown, black, and white) and for the total, the table reports the counts of correct detections (CD), detection failures (DF), correct nondetections (CND), and false alarms (FA), together with sensitivity (SE), specificity (SP), and accuracy (ACC).

FIGURE 13.8 Examples of color detection in a melanoma (top) and benign lesions (bottom); the panel legends indicate the detected labels (blue, dark brown, light brown, black, white, and doubt).
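The patch-labeling rule and the area-ratio validation described above translate into the following sketch. The threshold names (delta, eps, and the per-color area-ratio thresholds) are placeholders for the empirically determined values mentioned in the text.

```python
def label_patch(posteriors, delta, eps):
    """Apply the decision rule: label, doubt, or reject a patch.

    posteriors : dict color -> p(c | y) for one patch
    """
    ranked = sorted(posteriors.items(), key=lambda kv: kv[1], reverse=True)
    (c1, p1), (c2, p2) = ranked[0], ranked[1]
    if p1 < delta:
        return None                      # rejected
    if p1 - p2 > eps:
        return c1                        # confident single-color label
    return ("doubt", c1, c2)             # doubt between the two best colors

def validate_colors(patch_labels, patch_area, lesion_area, area_ratio_thresholds):
    """Keep a color only if its labeled area fraction exceeds its threshold (Eq. 13.9)."""
    areas = {}
    for label in patch_labels:
        if isinstance(label, str):       # ignore rejected (None) and doubt labels
            areas[label] = areas.get(label, 0) + patch_area
    detected = []
    for color, area in areas.items():
        if area / lesion_area > area_ratio_thresholds[color]:
            detected.append(color)
    return detected
```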
13.3.3 MELANOMA DETECTION

Melanoma is the most aggressive type of skin cancer, and its early detection is therefore a major goal in dermoscopy analysis. Several methods have been proposed in the literature to perform this task [15–19]. Most of them fit into a three-step structure: (1) lesion segmentation, (2) feature extraction, and (3) feature classification. First, the lesion boundary is estimated in the input image. Then, a set of features (color, texture, shape, symmetry) is extracted from each lesion. Finally, a classifier (e.g., a support vector machine (SVM)) is trained to distinguish benign lesions from malignant ones.

We developed computer diagnosis systems for the detection of melanomas based on two classification strategies: (1) a global strategy, using global features extracted from the whole lesion [19], and (2) a local strategy, representing the lesion by a set of local patches, each described by local features, with the decision obtained using a bag-of-features (BoF) approach [19, 20]. Several types of features were considered in the global approach: color, texture, shape, and symmetry features. The BoF approach considered color and texture features associated with the lesion patches; the descriptors are listed in Table 13.3. In each approach, the color features were computed in multiple color spaces (RGB, HSV, CIE L*a*b*, and opponent).

TABLE 13.3 Considered Features

Feature type   Global                                                 Local
Color          Color histograms; mean color vectors                   Color histograms; mean color vectors
Texture        Gradient histograms; gray-level co-occurrence          Gradient histograms; GLCM; Laws' masks;
               matrix (GLCM); Laws' masks; Gabor filters              Gabor filters
Shape          Simple shape descriptors; Fourier descriptors;         not used
               wavelet descriptors; moment invariants
Symmetry       Shape symmetry; color symmetry (histograms and         not used
               mean color vector)

The global system used a k-nearest neighbor (kNN) classifier with k ∈ {3, 5, ..., 25} neighbors. The BoF system used a dictionary of {100, 200, 300} words, with the decision again made by a kNN classifier with k ∈ {3, 5, ..., 25}. Details can be found in [19, 20]. The algorithms were evaluated by 10-fold cross-validation: the set of 200 images is split into 10 subsets (folds) of 20 images each, nine folds are used for training and one for testing, and the procedure is repeated 10 times with different test folds. The results are shown in Table 13.4 for both classification strategies and the different types of features. Very good performance is achieved by several configurations; the best-performing features are the color and color symmetry features. Both strategies (global and local) lead to very good results. We prefer the local classifiers, since they take into account local properties of the dermoscopy image, much as the examination performed by medical experts does; the global methods, however, are much faster during the training phase.

TABLE 13.4 Performance of Melanoma Detection Algorithms for Different Types of Features and Classification Strategies. For each feature type (color, texture, shape, symmetry), the table reports the sensitivity, specificity, and best descriptor of the global system and, for the color and texture features, of the local (BoF) system.

FIGURE 13.9 Examples of benign (top) and malignant (bottom) lesions classified by the global system (the symbol × denotes a misclassification).

FIGURE 13.10 Examples of benign (top) and malignant (bottom) lesions classified by the local system (the symbol × denotes a misclassification).

Figures 13.9 and 13.10 show examples of melanocytic lesions extracted from the PH2 database, together with the decisions provided by the two systems described in this section. Color features were used in both cases to characterize the lesion or its patches. We observe that the systems provide correct decisions in most cases and resolve some difficult examples in which the classification of the skin lesion is not simple.
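A minimal version of the global evaluation protocol (features, a kNN classifier, and 10-fold cross-validation) is sketched below with scikit-learn. The feature matrix X and label vector y stand in for the global color/texture/shape/symmetry descriptors, and the specific settings shown are assumptions rather than the configuration used in the chapter.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import StratifiedKFold

def evaluate_global_system(X, y, k=15, n_folds=10, seed=0):
    """10-fold cross-validation of a kNN melanoma classifier.

    X : (n_lesions, n_features) global feature matrix (numpy array)
    y : (n_lesions,) labels, 1 for melanoma and 0 for benign
    """
    folds = StratifiedKFold(n_splits=n_folds, shuffle=True, random_state=seed)
    tp = fp = tn = fn = 0
    for train_idx, test_idx in folds.split(X, y):
        clf = KNeighborsClassifier(n_neighbors=k).fit(X[train_idx], y[train_idx])
        pred = clf.predict(X[test_idx])
        truth = y[test_idx]
        tp += int(np.sum((pred == 1) & (truth == 1)))
        fp += int(np.sum((pred == 1) & (truth == 0)))
        tn += int(np.sum((pred == 0) & (truth == 0)))
        fn += int(np.sum((pred == 0) & (truth == 1)))
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return sensitivity, specificity
```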
13.4 CONCLUSIONS

This chapter presents a dataset of 200 dermoscopic images with medical annotations, publicly available at www.fc.up.pt/addi. The images and medical annotations were provided by the dermatology service of Hospital Pedro Hispano, Matosinhos, Portugal, and comprise both benign nevi (160) and malignant melanomas (40).

The PH2 dataset aims to provide a benchmarking tool for the comparison of computer-aided diagnosis systems for the analysis of dermoscopic images. This chapter includes examples of such systems trained with the PH2 database and tested using 10-fold cross-validation; the results presented here for illustrative purposes allow a direct comparison with other systems. The PH2 database was made available online in September 2013, and since its release it has been downloaded 51 times by research groups from 21 countries. These download statistics are a strong indicator of the impact of the PH2 database in the research community, given the lack of public ground truth databases of dermoscopic images.

ACKNOWLEDGMENTS

This work was partially funded by grant SFRH/BD/84658/2012 and by the FCT (Fundação para a Ciência e Tecnologia) projects PTDC/SAUBEB/103471/2008 and UID/EEA/50009/2013.

REFERENCES

1. G. Argenziano, H. P. Soyer, V. De Giorgi, D. Piccolo, P. Carli, M. Delfino, A. Ferrari et al., Interactive Atlas of Dermoscopy, EDRA Medical Publishing and New Media, 2000.
2. G. C. Carmo and M. R. e Silva, Dermoscopy: Basic concepts, International Journal of Dermatology, vol. 47, pp. 712–719, 2008.
3. H. Kittler, H. Pehamberger, K. Wolff, and M. Binder, Diagnostic accuracy of dermoscopy, Lancet Oncology, vol. 3, pp. 159–165, 2002.
4. ADDI project: Automatic computer-based diagnosis system for dermoscopy images, http://www.fc.up.pt/addi.
5. W. Stolz, A. Riemann, and A. B. Cognetta, ABCD rule of dermatoscopy: A new practical method for early recognition of malignant melanoma, European Journal of Dermatology, vol. 4, pp. 521–527, 1994.
6. G. Argenziano, G. Fabbrocini, P. Carli, V. D. Giorgi, E. Sammarco, and M. Delfino, Epiluminescence microscopy for the diagnosis of doubtful melanocytic skin lesions: Comparison of the ABCD rule of dermatoscopy and a new 7-point checklist based on pattern analysis, Archives of Dermatology, vol. 134, no. 12, pp. 1563–1570, 1998.
7. S. Menzies, C. Ingvar, K. Crotty, and W. H. McCarthy, Frequency and morphologic characteristics of invasive melanomas lacking specific surface microscopic features, Archives of Dermatology, vol. 132, pp. 1178–1182, 1996.
8. M. E. Celebi, G. Schaefer, H. Iyatomi, and W. V. Stoecker, Lesion border detection in dermoscopy images, Computerized Medical Imaging and Graphics, vol. 33, pp. 148–153, 2009.
9. C. Xu and J. L. Prince, Snakes, shapes, and gradient vector flow, IEEE Transactions on Image Processing, vol. 7, no. 3, 1998.
10. C. Barata, J. S. Marques, and J. Rozeira, A system for the detection of pigment network in dermoscopy images using directional filters, IEEE Transactions on Biomedical Engineering, vol. 59, no. 10, pp. 2744–2754, 2012.
11. M. Kass, A. Witkin, and D. Terzopoulos, Snakes: Active contour models, International Journal of Computer Vision, pp. 321–331, 1988.
12. J. Canny, A computational approach to edge detection, IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 8, no. 6, pp. 679–714, 1986.
13. C. Barata, M. A. T. Figueiredo, M. E. Celebi, and J. S. Marques, Color identification in dermoscopy images using Gaussian mixture models, in IEEE International Conference on Acoustics, Speech, and Signal Processing, pp. 3611–3615, Florence, Italy, 2014.
14. M. A. T. Figueiredo and A. K. Jain, Unsupervised learning of finite mixture models, IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 24, no. 3, pp. 381–396, 2002.
15. H. Ganster, P. Pinz, R. Rohrer, E. Wildling, M. Binder, and H. Kittler, Automated melanoma recognition, IEEE Transactions on Medical Imaging, vol. 20, no. 3, pp. 233–239, 2001.
16. M. E. Celebi, H. Kingravi, B. Uddin, H. Iyatomi, Y. Aslandogan, W. Stoecker, and R. Moss, A methodological approach to the classification of dermoscopy images, Computerized Medical Imaging and Graphics, vol. 31, pp. 362–373, 2007.
17. H. Iyatomi, H. Oka, M. E. Celebi, M. Hashimoto, M. Hagiwara, M. Tanaka, and K. Ogawa, An improved Internet-based melanoma screening system with dermatologist-like tumor area extraction algorithm, Computerized Medical Imaging and Graphics, vol. 32, pp. 566–579, 2008.
18. Q. Abbas, M. E. Celebi, I. F. Garcia, and W. Ahmad, Melanoma recognition framework based on expert definition of ABCD for dermoscopic images, Skin Research and Technology, vol. 19, pp. e93–e102, 2013.
19. C. Barata, M. Ruela, M. Francisco, T. Mendonça, and J. S. Marques, Two systems for the detection of melanomas in dermoscopy images using texture and color features, IEEE Systems Journal, vol. 8, no. 3, pp. 965–979, 2014.
20. C. Barata, M. Ruela, T. Mendonça, and J. S. Marques, A bag-of-features approach for the classification of melanomas in dermoscopy images: The role of color and texture descriptors, in Computer Vision Techniques for the Diagnosis of Skin Cancer (J. Scharcanski and M. E. Celebi, eds.), pp. 49–69, Springer, Berlin, 2014.