
A Computational Model of the Infant Pain Impressions with Gaussian and Nearest Mean Classifier

Muhammad Naufal Mansor
School of Mechatronic Engineering
Universiti Malaysia Perlis
Perlis, Malaysia
[email protected]

Mohd Nazri Rejab
Department of Mechanical
Politeknik Tuanku Syed Sirajuddin
Perlis, Malaysia
[email protected]

Abstract—In recent years, non-invasive methods based on facial image analysis have proved to be excellent and reliable tools for pain recognition. This paper proposes a new feature vector based on the Local Binary Pattern (LBP) for pain detection. Different sampling points and radius weightings are examined to assess the discriminative performance of the proposed features. In this work, the Infant COPE database is used with added illumination, and Multi Scale Retinex (MSR) is applied to remove the resulting shading. Two supervised classifiers, a Gaussian classifier and a Nearest Mean classifier, are employed to test the proposed features. The experimental results show that the proposed features give a promising classification accuracy of 90% on the Infant COPE database.

Index Terms—Infant Pain, MSR, LBP, Gaussian, Nearest Mean Classifier.

I. INTRODUCTION

In recent years, pain recognition has become a critical need in the medical field [1-8]. Over the last two decades, many pain assessment tools and monitoring protocols, such as NIPS, CRIES, PIPP and N-PASS, have been developed and are regarded as standard practice for infant pain assessment [17-22]. However, [17-19] report that lack of experience, insufficient assessment skill, and fatigue from routine duty can compromise caregivers' ability to distinguish infant pain cues, and thus the reliability of these assessment tools.

Thus, we propose a robust, stand-alone infant pain monitoring system that detects the condition of the infant by continuously monitoring the infant's face. Different illumination levels were applied to test the robustness of the system. We adopted the Multi Scale Retinex (MSR) algorithm to extract illumination-invariant facial features [9], and LBP [6] was employed for feature extraction. The proposed method is tested with the Gaussian and Nearest Mean classifiers on the Infant COPE database. The results demonstrate the effectiveness of our method, which achieves satisfactory recognition rates under different illumination levels.

The organization of this paper is as follows. Section II briefly describes the proposed method. Sections III and IV present the pre-processing steps and the LBP feature extraction method, Section V presents the Gaussian and Nearest Mean classifiers, and Sections VI and VII present the experimental results and our conclusions, respectively.

II. PROPOSED METHOD

As illustrated in Fig. 1, the experimental procedure can be divided into the following stages: pre-processing, feature extraction, and classification. The original images are taken from the Classification of Pain Expressions (COPE) database and are reduced to 100 x 120 pixels to shorten the processing time. In the pre-processing stage, we apply our new combination of pre-processing methods for infant pain recognition. In the feature extraction stage, LBP features are computed and used as the feature vectors. Finally, the Gaussian and Nearest Mean classifiers are used to classify the feature vectors into the categories pain/non-pain. All experiments were run in the MATLAB environment under the Windows XP operating system on an Intel Core 2 Duo CPU at 2.80 GHz.

Fig. 1. Infant pain experimental design: Original Image -> Pre-Processing (Grayscale Image, Add Illumination, Multi Scale Retinex (MSR)) -> Feature Extraction (LBP) -> Gaussian Classifier / Nearest Mean Classifier.



III. PRE-PROCESSING

This stage involves four steps: (a) image subjects, (b) grayscale conversion, (c) illumination adding, and (d) Multi Scale Retinex (MSR).

A. Image Subjects

The Infant COPE database, shown in Fig. 2, contains a total of 204 color photographs of 26 Caucasian neonates (13 boys and 13 girls) ranging in age from 18 hours to 3 days. Six males had been circumcised the day before the photographs were taken, and the last feeding time before the photography session ranged from 45 minutes to 5 hours [1-3].

Fig. 2. Infant COPE Database

B. Grayscale Image

We convert the true-color images of the subjects to grayscale intensity images. The grayscale conversion eliminates the hue and saturation information while retaining the luminance [11], as shown in Fig. 3.
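As an illustration, a minimal NumPy sketch of this conversion is given below. It assumes the standard ITU-R BT.601 luminance weights (the weighting also used by MATLAB's rgb2gray); the paper does not state the exact weights, and the original experiments were run in MATLAB rather than Python.

    import numpy as np

    def to_grayscale(rgb):
        """Convert an H x W x 3 RGB image to a grayscale intensity image
        using the BT.601 luminance weights (an assumption)."""
        rgb = np.asarray(rgb, dtype=np.float64)
        return 0.299 * rgb[..., 0] + 0.587 * rgb[..., 1] + 0.114 * rgb[..., 2]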

Fig. 3. Grayscale Infant COPE Database Image

C. Illumination Adding

To simulate the problems associated with testing in different environments, illumination varying across the image was added to the test images of the COPE database. This was done by changing the intensity of the image's pixels in proportion to their column displacement from the image centre [12]. An example of the effect of this transform (for different values of $\delta$) on an image is shown in Fig. 4. The illumination transform follows [13]:

$I_{new}(y,x) = I_{old}(y,x) + m\,x + \delta \qquad (1)$

where

$m = \dfrac{-\delta}{(N_X - 1)/2}, \quad x = 0, 1, \dots, N_X - 1, \quad y = 0, 1, \dots, N_Y - 1,$

$N_X$ and $N_Y$ are the image width and height in pixels, and $\delta$ controls the strength of the illumination change.
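A minimal NumPy sketch of this transform, under the reading of Eq. (1) given above, follows; with the slope $m = -\delta/((N_X-1)/2)$ the centre column is left unchanged and the edge columns are shifted by roughly $\pm\delta$. This is an illustrative reconstruction, not the paper's MATLAB code.

    import numpy as np

    def add_illumination(gray, delta):
        """Add a horizontal illumination gradient to a grayscale image, Eq. (1).

        gray  : H x W array of intensities.
        delta : illumination level; larger values give a stronger gradient.
        """
        n_y, n_x = gray.shape
        m = -delta / ((n_x - 1) / 2.0)      # slope assumed from Eq. (1)
        x = np.arange(n_x)                  # column indices 0 .. N_X - 1
        return gray + m * x[np.newaxis, :] + delta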

Fig. 4. Illumination for Infant COPE Database Image

D. Multi Scale Retinex

Rahman et al. [9], [10] improved the earlier single-scale retinex by estimating the illumination as a weighted combination (weights $\omega_n$) of $N$ Gaussian surround filters with different scales. The reflectance image is defined by:

$R_{MSR}(x,y) = \sum_{n=1}^{N} \omega_n \left\{ \log I(x,y) - \log\left[ F_n(x,y) * I(x,y) \right] \right\} \qquad (2)$

where $I(x,y)$ is the input image, $F_n(x,y)$ is the $n$-th Gaussian surround function and $*$ denotes convolution. Six sample face images obtained with the multi scale retinex are shown in Fig. 5 below.
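A compact NumPy/SciPy sketch of Eq. (2) is given below. The three scales (sigma = 15, 80, 250) and the equal weights are common choices from the retinex literature [9], not values reported in this paper, so they should be read as assumptions.

    import numpy as np
    from scipy.ndimage import gaussian_filter

    def multi_scale_retinex(image, sigmas=(15, 80, 250), weights=None, eps=1.0):
        """Multi Scale Retinex, Eq. (2): weighted sum of log(I) - log(F_n * I),
        where F_n is a Gaussian surround of scale sigma_n."""
        image = np.asarray(image, dtype=np.float64) + eps   # avoid log(0)
        if weights is None:
            weights = [1.0 / len(sigmas)] * len(sigmas)     # equal omega_n
        r = np.zeros_like(image)
        for w, sigma in zip(weights, sigmas):
            surround = gaussian_filter(image, sigma)        # F_n * I
            r += w * (np.log(image) - np.log(surround))
        return r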

Fig. 5. Multi Scale Retinex for Infant COPE Database Image

IV. LBP FEATURE EXTRACTION

Ojala et al. [23] introduced the original LBP operator. The operator labels the pixels of an image by thresholding the 3x3 neighborhood of each pixel against the center value and interpreting the result as a binary number (see Fig. 6 for an illustration). The labels are then accumulated into an LBP histogram; concatenating such histograms yields a texture descriptor.

Fig. 6. Left: the basic LBP operator [23]. Right: two examples of the extended LBP [24]: a circular (8, 1) neighborhood and a circular (12, 1.5) neighborhood.
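To make the operator concrete, here is a small NumPy sketch of the basic 3x3 LBP labeling. The clockwise bit ordering starting from the top-left neighbour is an assumption; any fixed ordering yields an equivalent descriptor. Again, this is illustrative Python, not the paper's MATLAB implementation.

    import numpy as np

    def basic_lbp(gray):
        """Basic 3x3 LBP (Ojala et al. [23]): threshold the 8 neighbours of each
        pixel against the centre and pack the results into an 8-bit label."""
        gray = np.asarray(gray, dtype=np.float64)
        # neighbour offsets in a fixed (assumed) clockwise order
        offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
                   (1, 1), (1, 0), (1, -1), (0, -1)]
        h, w = gray.shape
        center = gray[1:-1, 1:-1]
        labels = np.zeros(center.shape, dtype=int)
        for bit, (dy, dx) in enumerate(offsets):
            neighbour = gray[1 + dy:h - 1 + dy, 1 + dx:w - 1 + dx]
            labels += (neighbour >= center).astype(int) * (1 << bit)
        return labels   # (H-2) x (W-2) array of LBP codes in [0, 255]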

The basic LBP operator has a limitation: because its neighborhood is small, it cannot capture dominant features at large spatial scales. The operator was therefore extended to neighborhoods of different sizes by using circular neighborhoods [24]. Examples of the extended LBP are shown on the right of Fig. 6, where (P, R) denotes P sampling points on a circle of radius R.

LBP has undergone a series of extensions, each addressing a shortcoming of the earlier ones. One of these is the uniform LBP pattern [25]. A local binary pattern is called uniform if it contains at most two bitwise transitions from 0 to 1 or vice versa when the binary string is considered circular. For example, 00000000, 00111000 and 11100001 are uniform patterns. It has been observed that uniform patterns account for nearly 90% of all patterns in the (8, 1) neighborhood and for about 70% in the (16, 2) neighborhood in texture images [26].
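The uniformity test is easy to state in code; the following helper counts circular 0/1 transitions in a P-bit pattern and is a small illustrative sketch rather than part of the paper's implementation.

    def is_uniform(pattern, p=8):
        """Return True if the p-bit LBP code has at most two circular 0/1
        transitions (the u2 criterion of [25])."""
        bits = [(pattern >> i) & 1 for i in range(p)]
        transitions = sum(bits[i] != bits[(i + 1) % p] for i in range(p))
        return transitions <= 2

    # e.g. is_uniform(0b00111000) -> True, is_uniform(0b01010101) -> False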

Let us now define the uniform LBP operator more precisely. It is denoted $LBP_{P,R}^{u2}$, where $(P, R)$ gives the number of sampling points and the radius of the neighborhood and $u2$ indicates that only uniform patterns are used, all other patterns being collected under a single label. A histogram of a labeled image $f_l(x, y)$ can be defined as

$H_i = \sum_{x,y} I\{ f_l(x,y) = i \}, \quad i = 0, \dots, n-1 \qquad (3)$

where $n$ is the number of different labels produced by the LBP operator and

$I\{A\} = \begin{cases} 1, & A \text{ is true} \\ 0, & A \text{ is false} \end{cases} \qquad (4)$

The histogram defined above contains information about the distribution of edges, flat areas and spots over the whole image. To create a good face representation, however, spatial information must also be retained. The face image is therefore divided into $m$ small regions $R_0, R_1, \dots, R_{m-1}$, and a spatially enhanced histogram is defined as

$H_{i,j} = \sum_{x,y} I\{ f_l(x,y) = i \}\, I\{ (x,y) \in R_j \}, \quad i = 0, \dots, n-1, \; j = 0, \dots, m-1 \qquad (5)$

The resulting histogram carries information at three levels: the pixel level (the individual labels), the regional level (the histogram over each region $R_j$) and the global level. The pixel-level labels are accumulated into regional histograms, and the regional histograms are concatenated to produce a global description of the image [27].
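The sketch below builds on the basic_lbp helper shown earlier to form the spatially enhanced histogram of Eq. (5): the label image is split into a grid of regions and the per-region histograms are concatenated. The 256-label (non-uniform) code set and the uniform 4x4 grid are simplifying assumptions for illustration; the paper does not specify the region layout.

    import numpy as np

    def spatial_lbp_histogram(labels, grid=(4, 4), n_labels=256):
        """Concatenate per-region LBP histograms, Eq. (5).

        labels   : 2-D array of LBP codes (e.g. the output of basic_lbp).
        grid     : regions along (rows, cols); m = grid[0] * grid[1].
        n_labels : number of distinct LBP labels n.
        """
        feature = []
        for row_block in np.array_split(labels, grid[0], axis=0):
            for region in np.array_split(row_block, grid[1], axis=1):
                hist, _ = np.histogram(region, bins=n_labels, range=(0, n_labels))
                feature.append(hist / max(region.size, 1))   # normalised H_{i,j}
        return np.concatenate(feature)                        # length n * m vector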

V. CLASSIFIER

A. Gaussian Classifier

The Gaussian classifier is a parametric classifier in which each class is modeled by a multivariate normal (Gaussian) density estimated from the training samples of that class [14]; a test sample is assigned to the class whose density gives the highest value. The class-conditional density is given by Equation (6):

$p(x) = \dfrac{1}{(2\pi)^{d/2} |\Sigma|^{1/2}} \exp\!\left[ -\dfrac{1}{2} (x-\mu)^T \Sigma^{-1} (x-\mu) \right] \qquad (6)$

where:
$x$ is a d-component feature vector,
$\mu$ is the d-component vector containing the mean of each feature,
$d$ is the dimension of the feature vector,
$\Sigma$ is the d-by-d covariance matrix and $|\Sigma|$ is its determinant,
$p(x)$ is the probability density evaluated at $x$.
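A minimal sketch of such a classifier follows, assuming one Gaussian per class with its own mean and covariance estimated from the labelled training data; the small regularisation term added to the covariance is an assumption to keep it invertible for high-dimensional LBP features, and the code is illustrative Python rather than the paper's MATLAB implementation.

    import numpy as np

    class GaussianClassifier:
        """One multivariate Gaussian per class, Eq. (6); a sample is assigned
        to the class with the largest log-density."""

        def fit(self, X, y, reg=1e-3):
            X, y = np.asarray(X, dtype=float), np.asarray(y)
            self.classes_ = np.unique(y)
            d = X.shape[1]
            self.params_ = {}
            for c in self.classes_:
                Xc = X[y == c]
                mu = Xc.mean(axis=0)
                cov = np.cov(Xc, rowvar=False) + reg * np.eye(d)  # regularised
                self.params_[c] = (mu, np.linalg.inv(cov),
                                   np.linalg.slogdet(cov)[1])
            return self

        def _log_density(self, x, mu, inv_cov, logdet):
            diff = x - mu
            return -0.5 * (diff @ inv_cov @ diff + logdet
                           + len(x) * np.log(2 * np.pi))

        def predict(self, X):
            X = np.asarray(X, dtype=float)
            scores = np.array([[self._log_density(x, *self.params_[c])
                                for c in self.classes_] for x in X])
            return self.classes_[np.argmax(scores, axis=1)]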

B. Nearest Mean Classifier

In machine learning, a nearest mean (or nearest prototype) classifier is a classification model that assigns to an observation the label of the class whose training-sample mean is closest to the observation. The nearest mean classifier is also known as the Rocchio classifier because of its similarity to the Rocchio algorithm for relevance feedback [15]. The algorithm is as follows.

Given training samples $\{(x_1, y_1), \dots, (x_n, y_n)\}$ with labels $y_i \in Y$, compute the per-class mean

$\mu_\ell = \dfrac{1}{|C_\ell|} \sum_{i \in C_\ell} x_i$

where $C_\ell$ is the set of indices of the samples belonging to class $\ell \in Y$.

Prediction function: the class assigned to an observation $x$ is

$\hat{y} = \arg\min_{\ell \in Y} \| \mu_\ell - x \|$
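A direct NumPy sketch of these two equations follows, using the Euclidean norm as the distance, as in the formulas above.

    import numpy as np

    class NearestMeanClassifier:
        """Nearest mean (Rocchio) classifier: one mean vector per class,
        prediction by the closest mean in Euclidean distance."""

        def fit(self, X, y):
            X, y = np.asarray(X, dtype=float), np.asarray(y)
            self.classes_ = np.unique(y)
            self.means_ = np.array([X[y == c].mean(axis=0) for c in self.classes_])
            return self

        def predict(self, X):
            X = np.asarray(X, dtype=float)
            # distances from each sample to each class mean: (n_samples, n_classes)
            dists = np.linalg.norm(X[:, np.newaxis, :] - self.means_[np.newaxis, :, :],
                                   axis=2)
            return self.classes_[np.argmin(dists, axis=1)]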



VI. EXPERIMENTAL RESULTS

The proposed algorithm was evaluated on the COPE database. All subjects were tested over ten trials with the Gaussian and Nearest Mean classifiers, using P = 8 and P = 16 sampling points. The average accuracy and time consumption of the two classifiers over all subjects are shown in Fig. 7 and Fig. 8, respectively.

Fig. 7. Average accuracy vs. different illumination levels (delta = 20 to 100) for the Gaussian and Nearest Mean classifiers with P = 8 and P = 16.

Fig. 7 shows the average accuracy of the Gaussian and Nearest Mean classifiers as a function of the illumination level (delta). Across the five values of delta for which accuracy results were obtained, the accuracy of all trials decreases significantly as delta increases, and the different values of the P parameter give different views of the weighted accuracy over the whole test set. The highest accuracy levels lie in the range 87.74% to 88.00%, and accuracy falls as delta increases from 20 to 100. This confirms that the strongest illumination levels give the lowest accuracy and vice versa (L. Nanni et al., 2004). The highest accuracy for the Gaussian classifier is 83.0% at delta = 20 with P = 16, while the Nearest Mean classifier achieves 80.0% and 76.0% at delta = 20 with P = 8 and P = 16, respectively. Based on these experiments, the Gaussian classifier distinguishes the pain and no-pain conditions better than the Nearest Mean classifier.

Fig. 8. Time taken vs. different illumination levels for the Gaussian and Nearest Mean classifiers with P = 8 and P = 16.

Fig. 8 illustrates the time consumption of the Gaussian and Nearest Mean classifiers versus the illumination level for the different P values; the results cover the whole test set. The Gaussian classifier with P = 8 takes between 0.02 s and 0.12 s, showing that it is fast and consistent. The Nearest Mean classifier with P = 16 is apparently the slowest, taking around 0.7 s to 2.6 s, longer than the Nearest Mean classifier with P = 8. Over all trials, the Gaussian classifier has the best time consumption, being both fast and consistent.

VII. CONCLUSION

With deeper analysis and the introduction of the new pre-processing and classification steps, a decision can be made about the proposed LBP feature extraction combined with the Gaussian and Nearest Mean classifiers. Several experiments were conducted to evaluate the performance of the proposed method under different illumination levels. The classification results indicate that the suggested method could be used as a valuable tool for classifying infants as normal or in pain with the Gaussian and Nearest Mean classifiers.

ACKNOWLEDGMENT

This research was conducted under the Fundamental Research Grant Scheme (FRGS) funded by the Ministry of Higher Education, Malaysia.

REFERENCES

[1] S. C. Chuang, F. Y. Shih, and M. R. Slack, "Machine recognition and representation of neonatal facial displays of acute pain," Artificial Intelligence in Medicine, 36(2), 2006, pp. 211-222.

[2] S. Brahnam, C. Chuang, R. S. Sexton, and F. Y. Shih, "Machine assessment of neonatal facial expressions of acute pain," Decision Support Systems, 43, 2007, pp. 1242-1254.

[3] S. Brahnam, L. Nanni, and R. S. Sexton, "Introduction to neonatal facial pain detection using common and advanced face classification techniques," Advanced Computational Intelligence Paradigms in Healthcare, 1, Studies in Computational Intelligence (SCI) Series, Springer-Verlag, Berlin, 48, 2004, pp. 225-253.

[4] S. Brahnam, C. F. Chuang, F. Y. Shih, and M. R. Slack, "SVM classification of neonatal facial images of pain," Proceedings of the 6th International Workshop on Fuzzy Logic and Applications (WILF05).

[5] S. Brahnam and L. Nanni, "Neonatal facial pain detection using NNSOA and LSVM," Proceedings of the International Conference on Image Processing, Computer Vision, and Pattern Recognition (IPCV08), Las Vegas, vol. 2, 2008, pp. 352-357.

[6] L. Nanni, S. Brahnam, and A. Lumini, "A local approach based on a local binary patterns variant texture descriptor for classifying pain states," Expert Systems with Applications, 37(11), pp. 7888-7894.

[7] S. Brahnam, C. F. Chuang, F. Y. Shih, and M. R. Slack, "SVM classification of neonatal facial images of pain," Fuzzy Logic and Applications, I. Bloch, A. Petrosino, and A. G. B. Tettamanzi, editors, Lecture Notes in Computer Science, 3849, pp. 111-115.

[8] K. Pun and Y. Moon, "Recent advances in ear biometrics," pp. 144-149, 2004.

[9] D. J. Jobson, Z. Rahman, and G. A. Woodell, "A multiscale retinex for bridging the gap between color images and the human observation of scenes," IEEE Transactions on Image Processing, vol. 6, no. 7, pp. 965-976, 1997.

[10] D. J. Jobson, Z. Rahman, and G. A. Woodell, "Properties and performance of a center/surround retinex," IEEE Transactions on Image Processing, vol. 6, no. 3, pp. 451-462, 1997.

[11] R. C. Gonzalez, R. E. Woods, and S. L. Eddins, Digital Image Processing Using MATLAB, Pearson Education, 2004, ISBN 978-81-7758-898-9.

[12] B. Schwerin and K. K. Paliwal, "Local-DCT features for facial recognition," in Proc. Intern. Conf. Signal Processing and Communication Systems, Gold Coast, Australia, Dec. 2008.

[13] C. Sanderson, "Automatic Person Verification Using Speech & Face Information," dissertation, School of Microelectronic Engineering, Griffith University, 2002.

[14] R. O. Duda and P. E. Hart, Pattern Classification and Scene Analysis, John Wiley and Sons, Inc., 1973.

[15] C. Manning, P. Raghavan, and H. Schuetze, "Vector space classification," in Introduction to Information Retrieval, Cambridge University Press, 2008.

[16] K. Spence, D. Gillies, D. Harrison, L. Johnston, and S. Nagy, "A reliable pain assessment tool for clinical assessment in the neonatal intensive care unit," J Obstet Gynecol Neonatal Nurs, 2003; 34:80-86.

[17] E. Cignacco, R. Mueller, J. P. H. Hamers, and P. Gessler, "Pain assessment in the neonate using the Bernese Pain Scale for Newborns," Early Hum Dev, 2004; 78:115-121.

[18] J. Lawrence, D. Alcock, P. McGrath, S. Kay, S. B. MacMurray, and D. Dulberg, "The development of a tool to assess neonatal pain," Neonatal Netw, 1993; 11:59-66.

[19] S. W. Krechel and J. Bildner, "CRIES: a new neonatal postoperative pain measurement score. Initial testing of validity and reliability," Paediatr Anaesthesiol, 1995; 5:53-61.

[20] B. Stevens, C. Johnston, P. Petryshen, and A. Taddio, "Premature Infant Pain Profile: development and initial validation," Clin J Pain, 1996; 11:12-22.

[21] P. Hummel, P. Lawlor-Klean, and M. G. Weiss, "Validity and reliability of the N-PASS assessment tool with acute pain," J Perinatol, 2009:1-5.

[22] P. Hummel, M. Puchalski, S. D. Creech, and M. G. Weiss, "Clinical reliability and validity of the N-PASS: neonatal pain, agitation and sedation scale with prolonged pain," J Perinatol, 2008; 16:55-60.

[23] T. Ojala, M. Pietikainen, and D. Harwood, "A comparative study of texture measures with classification based on featured distribution," Pattern Recognition, 29(1), 1996, pp. 51-59.

[24] H. Lianghua, Z. Cairing, Z. Li, and H. Die, "An enhanced LBP feature based on facial expression recognition," Shanghai, China, September 1-4, 2005.

[25] Z. Shiqing, Z. Xiaoming, and L. Bicheng, "Facial expression recognition using local Fisher discriminant analysis," 2011.

[26] G. Littlewort, M. Bartlett, I. Fasel, J. Susskind, and J. Movellan, "Dynamics of facial expression extracted automatically from video," in IEEE Workshop on Face Processing in Video, 2004.

[27] M. J. Lyons, J. Budynek, and S. Akamatsu, "Automatic classification of single facial images," IEEE PAMI, 1999.
