Cutting Edge Robotics Part 12
EmotionRecognitionthroughPhysiologicalSignalsforHuman-MachineCommunication 321 Fig. 1. Physiological signals acquisition system 3.2 Acquisition of physiological signals The physiological signals were acquired using the PROCOMP Infiniti system [16]. The sampling rate was fixed at 256 samples per second for all the channels. Appropriate amplification and bandpass filtering were performed. One session of experiments took approximately 5 min. The subjects were requested to be as relaxed as possible during this period. Subsequently, emotional stimulus was applied, and physiological signals were recorded. The participant was asked to self assess the valence and the arousal of his/her emotion using a Self Assessment Manikin (SAM [17]), with 9 possible numerical judgments for each dimension (arousal and valence), which will be used in future works. The used sensors are described in the following. 3.2.1 Blood Volume Pulse (BVP) The Blood Volume pulse sensor uses photoplethysmography to detect the blood pressure in the extremities. Photoplethysmography is a process of applying a light source and measuring the light reflected by the skin. At each contraction of the heart, blood is forced through the peripheral vessels, producing engorgement of the vessels under the light source-thereby modifying the amount of light to the photosensor. The resulting pressure waveform is recorded. Fig. 2. BVP sensor 3.2.2 Electromyography (EMG) The electromyographic sensors measure the electromyographic activity of the muscle (the electrical activity produced by a muscle when it is being contracted), amplify the signal and send it to the encoder. In the encoder, a band pass filter is applied to the signal. For all our experiments, the sensor has used the 0-400 microvolt range and the 20-500 Hz filter, which is the most commonly used position. (Figure 3) Fig. 3. 
EMG sensor

3.2.3 Electrodermal activity (EDA)
Electrodermal activity (EDA) is another signal that can easily be measured from the body surface and represents the activity of the autonomic nervous system. It is also called galvanic skin response [18]. It characterizes changes in the electrical properties of the skin due to the activity of the sweat glands and is physically interpreted as a conductance. The sweat glands distributed over the skin receive input from the sympathetic nervous system only, so EDA is a good indicator of the arousal level due to external sensory and cognitive stimuli.

Fig. 4. Skin Conductivity sensor

3.2.4 Skin Temperature (SKT)
Variations in skin temperature (SKT) mainly come from localized changes in blood flow caused by vascular resistance or arterial blood pressure. Local vascular resistance is modulated by smooth muscle tone, which is mediated by the sympathetic nervous system. The mechanism of arterial blood pressure variation can be described by a complicated model of cardiovascular regulation by the autonomic nervous system. It is thus evident that SKT variation reflects autonomic nervous system activity and is another effective indicator of emotional status.

Fig. 5. Skin Temperature sensor

3.2.5 Respiration (Resp)
The respiration sensor can be placed either over the sternum for thoracic monitoring or over the diaphragm for diaphragmatic monitoring (Figure 6). The sensor consists mainly of a large Velcro belt that extends around the chest cavity and a small elastic that stretches as the subject's chest cavity expands. The amount of stretch in the elastic is measured as a voltage change and recorded. From the waveform, the depth of the subject's breathing and the rate of respiration can be derived.

Fig. 6. Respiration sensor

4. Features extraction
Having established a set of signals that may be used for recognizing emotion, it is necessary to define a methodology that enables the system to translate the signals coming from these sensors into specific emotions. The first necessary step is the extraction of useful, information-bearing features for pattern classification. For emotion recognition training or testing, the features of each bio-potential record must be extracted. In this study, for each record, we compute the six parameters proposed by Picard [10] over the N samples (5 seconds at 256 samples per second gives N = 1280): the mean of the raw signal (Eq. 1), the standard deviation of the raw signal (Eq. 2), the mean of the absolute values of the first differences of the raw signal (Eq. 3), the mean of the absolute values of the first differences of the normalized signal (Eq. 4), the mean of the absolute values of the second differences of the raw signal (Eq. 5) and the mean of the absolute values of the second differences of the normalized signal (Eq. 6), as defined in [10]:

  μ = (1/N) Σ_{t=1}^{N} X_t                              (1)
  σ = [ (1/(N-1)) Σ_{t=1}^{N} (X_t − μ)² ]^{1/2}         (2)
  δ = (1/(N-1)) Σ_{t=1}^{N-1} |X_{t+1} − X_t|            (3)
  δ̄ = δ / σ                                              (4)
  γ = (1/(N-2)) Σ_{t=1}^{N-2} |X_{t+2} − X_t|            (5)
  γ̄ = γ / σ                                              (6)

where t is the sampling number and N is the total number of samples.
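The six statistics of Eqs. (1)-(6) are straightforward to compute. Below is a minimal NumPy sketch; the function name `picard_features` is our own, as the chapter does not name one:

```python
import numpy as np

def picard_features(x):
    """Compute the six statistical features of Eqs. (1)-(6) for one
    5-second window of a physiological signal (N = 1280 samples at 256 Hz)."""
    x = np.asarray(x, dtype=float)
    mu = x.mean()                          # Eq. (1): mean of the raw signal
    sigma = x.std(ddof=1)                  # Eq. (2): standard deviation
    delta = np.abs(np.diff(x)).mean()      # Eq. (3): mean |first difference|
    delta_norm = delta / sigma             # Eq. (4): same, on the normalized signal
    gamma = np.abs(x[2:] - x[:-2]).mean()  # Eq. (5): mean |second difference|
    gamma_norm = gamma / sigma             # Eq. (6): same, on the normalized signal
    return np.array([mu, sigma, delta, delta_norm, gamma, gamma_norm])

# Example: features of a 1 Hz sine sampled at 256 Hz for 5 seconds
t = np.arange(1280) / 256.0
feats = picard_features(np.sin(2 * np.pi * t))
```

Applied to every sensor channel of a record, this yields six features per signal, which are then concatenated into the feature vector used for classification.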
5. Emotion recognition methods

5.1 Classification methods
After having extracted the features as described in the previous section, we then trained a statistical classifier, with the goal of learning the corresponding emotion for a set of features with which it is presented. A number of different classification algorithms have been proposed in the literature: Fernandez [19], for example, used HMMs, while Healey [20] used Fisher linear discriminant projection. This paper focuses its attention on two of them: the SVM algorithm and the Fisher linear discriminant. We chose to test and compare both methods. Feature vectors extracted from multiple subjects under the same emotional stimulus form a distribution in high-dimensional space.
- In the SVM method, without dimensionality reduction, our system directly gives the extracted feature vectors to the support vector machine (SVM) classifier.
- In the second method, we reduce the dimension of the feature vector by Fisher projection and use a subsequent quadratic classifier for recognition.
Both methods will now be briefly described.

5.2 Support vector machine
Machine learning algorithms receive input data during a training phase, build a model of the input and output a hypothesis function that can be used to predict future data. Given a set of labeled training examples

  S = ((x_1, y_1), ..., (x_l, y_l)),  y_i ∈ {-1, 1},     (5.1)

learning systems typically try to find a decision function of the form

  f(x) = sgn(⟨w · x⟩ + b)                                (5.2)

that yields a label in {-1, 1} (for the basic case of binary classification) for a previously unseen example x. Support Vector Machines are based on results from statistical learning theory, pioneered by Vapnik [21], instead of heuristics or analogies with natural learning systems. SVM algorithms separate the training data in feature space by a hyperplane defined by the type of kernel function used. They find the hyperplane of maximal margin, defined as the sum of the distances of the hyperplane from the nearest data point of each of the two classes.
The size of the margin bounds the complexity of the hyperplane function and hence determines its generalization performance on unseen data. The SVM methodology learns nonlinear functions of the form

  f(x) = sgn( Σ_{i=1}^{l} α_i y_i K(x_i, x) + b ),       (5.3)

where the α_i are Lagrange multipliers of a dual optimization problem. Once a decision function is obtained, classification of an unseen example x amounts to checking on which side of the hyperplane the example lies.

5.3 Fisher linear discriminant
Fisher's discriminant is a technique used to reduce a high-dimensional feature set x to a lower-dimensional feature set y, such that the classes can be more easily separated in the lower-dimensional space. The Fisher discriminant seeks the projection matrix w such that when the original features are projected onto the new space according to

  y = wᵀ x,                                              (5.4)

the means of the projected classes are maximally separated and the scatter within each class is minimized. This matrix w is the linear function for which the criterion function

  J(w) = (wᵀ S_B w) / (wᵀ S_W w)                         (5.5)

is maximized. In this equation, S_B and S_W represent the between-class scatter and the within-class scatter, respectively. This expression is well known in mathematical physics as the generalized Rayleigh quotient. The equation is most intuitively understood in the two-class case, where it reduces to

  J(w) = |m̄_1 − m̄_2|² / (s̄_1² + s̄_2²),                  (5.6)

where m̄_1 and m̄_2 are the projected means of the two classes and s̄_1² and s̄_2² are the projected scatters of the two classes. This function is maximized when the distance between the means of the classes is maximized in the projected space and the scatter within each class is minimized [22].

6. Experimental results
Figure 7 shows an example of the five physiological signals recorded during the induction of six emotions (Amusement, Contentment, Disgust, Fear, Neutrality and Sadness) for subject1 (male) and subject2 (female), respectively. It can be seen that each physiological signal varies widely across emotions and also across subjects. For emotion recognition, we implemented the SVM method with a linear kernel and Fisher's discriminant classifier. A set of six examples for each basic emotion was used for training, followed by classification of 4 unseen examples per emotion. Table 1 gives the percentage of correctly classified examples for ten subjects using the SVM method and Fisher's discriminant. Using a linear classifier, we are able to correctly classify the 6 emotions of 10 subjects. As can be observed, the Fisher and SVM classifiers give good results (92% and 90%, respectively) for emotion recognition.

Fig. 7. An example of five physiological signals (BVP, EMG, SC, SKT and Resp) acquired during the induction of the six emotions (left: Subject 1, right: Subject 2)
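Once the Lagrange multipliers α_i and the bias b have been found by the dual optimization, evaluating the decision function of Eq. (5.3) is just a weighted sum over the training examples. A minimal sketch, using a linear kernel (as in the experiments) with hand-picked, purely illustrative support vectors and multipliers:

```python
import numpy as np

def svm_decision(x, support_x, support_y, alphas, b, kernel):
    """Evaluate f(x) = sgn(sum_i alpha_i * y_i * K(x_i, x) + b), Eq. (5.3)."""
    s = sum(a * y * kernel(xi, x)
            for a, y, xi in zip(alphas, support_y, support_x))
    return int(np.sign(s + b))

linear = lambda u, v: float(np.dot(u, v))  # linear kernel

# Two illustrative support vectors on either side of the hyperplane x0 = 0
support_x = [np.array([1.0, 0.0]), np.array([-1.0, 0.0])]
support_y = [+1, -1]
alphas = [0.5, 0.5]  # illustrative Lagrange multipliers, not from a real solver
b = 0.0

label = svm_decision(np.array([2.0, 1.0]), support_x, support_y, alphas, b, linear)
```

Swapping `linear` for a polynomial or RBF kernel changes only the `kernel` argument, which is why the kernel choice is such a convenient point of customization.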
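The two-class Fisher direction that maximizes the criterion of Eqs. (5.5)-(5.6) has the standard closed form w ∝ S_W⁻¹(m_1 − m_2). A small sketch on synthetic data (the data and the function name are our own illustration):

```python
import numpy as np

def fisher_direction(X1, X2):
    """Projection direction w maximizing the criterion of Eqs. (5.5)-(5.6)
    for two classes given as (n_samples, n_features) arrays."""
    m1, m2 = X1.mean(axis=0), X2.mean(axis=0)
    # Within-class scatter S_W: sum of the two class scatter matrices
    S_W = (X1 - m1).T @ (X1 - m1) + (X2 - m2).T @ (X2 - m2)
    w = np.linalg.solve(S_W, m1 - m2)  # standard result: w ∝ S_W^{-1}(m1 - m2)
    return w / np.linalg.norm(w)

# Synthetic two-class data, separated along the first feature axis
rng = np.random.default_rng(0)
X1 = rng.normal(loc=[+2.0, 0.0], scale=0.5, size=(50, 2))
X2 = rng.normal(loc=[-2.0, 0.0], scale=0.5, size=(50, 2))

w = fisher_direction(X1, X2)
y1, y2 = X1 @ w, X2 @ w                # projections, Eq. (5.4)
J = (y1.mean() - y2.mean()) ** 2 / (   # two-class criterion, Eq. (5.6)
    ((y1 - y1.mean()) ** 2).sum() + ((y2 - y2.mean()) ** 2).sum())
```

For more than two classes, as in the six-emotion experiments, the same criterion is maximized over a projection matrix whose columns are generalized eigenvectors of (S_B, S_W).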
Table 1. Recognition accuracy of Fisher Linear Discriminant and SVM classification for 10 subjects

Since the kernel choice is among the most important customizations that can be made when adjusting an SVM classifier to a particular application domain, it would be interesting to test other kernels, such as the polynomial, sigmoid or Gaussian radial basis function (RBF) kernels, and to choose the best one in order to improve the SVM recognition results.

Figures 8 and 9 present the results of the feature-signal separation. The features were projected down to a two-dimensional space (Fisher features); the Fisher transformation is often used to get a good representation of multidimensional class data in a two-dimensional space. As expected, there were significant variations in the positions of the data points for each emotion, and the data are separated into well-defined clusters. Obviously, merging the features of all subjects does not refine the information related to the target emotions (Figure 10). We can see that the data are not separated, which explains the Fisher rate of 0% obtained in the all-subjects case (each emotion varies widely across subjects).

Tables 2, 3, 4, 5, 6 and 7 give the confusion matrices for the original training sets of subject1, subject2 and all subjects, respectively, using the Fisher discriminant and the SVM method. It can be seen that the highest recognition ratio is always obtained for the corresponding, correct emotion for both methods, even when we mixed the feature signals. On the other hand, the best classification results were achieved by the SVM method, especially in the all-subjects case. These results are very promising in the sense that the application of Fisher linear discriminant analysis or the SVM method to the task of emotion classification seems to provide very good results.

7. Conclusions and future works
In this paper we presented an approach to emotion recognition based on the processing of physiological signals. Physiological data were acquired in six different affective states, and two pattern recognition methods were tested: the SVM method and the Fisher linear discriminant. Recognition rates of about 90% were achieved for both classifiers. However, the SVM classifier gives better results than the Fisher discriminant when using mixed feature signals of different subjects. This study has shown that specific emotion patterns can be automatically recognized by a computer using physiological features.

In future work, the arousal and valence assessments will be used in order to identify the emotion in the valence/arousal space.
We intend to use wireless sensor in order to ensure a natural and without constraints interaction between human and machine. There is also much scope to improve our system to incorporate other means of emotion recognition. Currently we are working on a facial expression system which can be integrated with physiological signal features. 8. References [1] B. Reeves and C. Nass, The Media Equation. Cambridge University Press, 1996. [2] H.R. Kim, K.W. Lee, and D.S. Kwon, Emotional Interaction Model for a Service Robot, in Proceedings of the IEEE International Workshop on Robots and Human Interactive Communication, 2005, pp. 672-678. [3] R. Cowie, E. Douglas-Cowie, N. Tsapatsoulis, G. Votsis, S.Kollias, W. Fellenz and J. G. Taylor, Emotion recognition in human-computer interaction, IEEE Signal Process. Magazine, Vol. 18, 2001, pp. 32-80. [4] R. C. Arkin, M. Fujita, T. Takagi and R. Hasegawa, Ethological modeling and architecture for An entertainment robot, IEEE Int. Conf. Robotics & Automation, 2001, pp. 453- 458. [5] H. Hirsh, M. H. Coen, M. C. Mozer, R. Hasha and J. L. Flanagan, Room service, AI-style, IEEE Intelligent Systems and their application, Vol. 14, 1999, pp. 8-19. [6] R. W. Picard, Affective computing, MIT Press, Cambridge, 1995. [7] P. Lang, The emotion probe: Studies of motivation and attention, American Psychologist, vol. 50(5), 1995, pp. 372-385. [8] R. J. Davidson, Parsing affective space: Perspectives from neuropsychology and psychophysiology, Neuropsychology, Vol. 7, no. 4, 1993, pp. 464—475. [9] R. W. Levenson, Emotion and the autonomic nervous system: A prospectus for research on autonomic specificity, In H. L. Wagner, editor, Social Psychophysiology and Emotion: Theory and Clinical Applications, 1988, pp. 17-42. [10] R. W. Picard, E. Vyzas, and J. Healey, Toward machine emotional intelligence: analysis of affective physiological state, IEEE Trans. Pattern Anal. Mach. Intell., Vol. 23, 2001, pp. 1175-1191. [11] F. Nasoz, K. Alvarez, C. Lisetti,& N. 
Finkelstein, Emotion recognition from physiological signals for presence technologies, International Journal of Cognition, Technology, and Work - Special Issue on Presence, Vol. 6(1),2003. [12] J. WagnerJ. Kim & E. Andre, From physiological signals to emotions: Implementing and comparing selected methods for feature extraction and classification, IEEE Internation Conference on Multimedia and Expo, Amsterdam, 2005, pp. 940-943. [13] B. Herbelin, P. Benzaki, F. Riquier, O. Renault, and D. Thalmann, Using physiological measures for emotional assessment: a computer-aided tool for cognitive and behavioural therapy, in 5th Int. Conf on Disability, Virtual Reality, 2004, pp. 307- 314. [14] C.L. Lisetti and F. Nasoz, Using Noninvasive Wearable Computers to Recognize Human Emotions from Physiological Signals, Journal on applied Signal Processing, Hindawi Publishing Corporation, 2004, pp. 1672-1687. EmotionRecognitionthroughPhysiologicalSignalsforHuman-MachineCommunication 327 T a su b K n m a to ch o Fi g pr o us e sp a e m of se e su b T a su b se e e m it m e th e cl a 7. in p h t w di s S V di f re c a ble 1. Reco g niti o bj ects n owin g that SV M a de when ad j ust i test other kern e o ose the best ker n g ures 8 and 9 pr oj ected down to e d to g et a g oo d a ce. As expected , m otion. The data a all sub j ects does e that the data a bj ects case (each e a bles 2,3,4,5,6 an d bj ect2 and all su b e , that the hi g he r m otion for both m can be observe d e thod, especiall y, e application of F a ssification seem s Conclusions a this paper we p r hy siolo g ical si g n a w o pattern reco g s criminant. Reco g V M classifier g iv e f ferent sub j ects. 
T c o g nized b y a co m o n accurac y of F i M kernel choice i s i n g an SVM clas s e l as: pol y nomia l n el in order to i m esent the results a two dimensio n d representation , there were si gn a re separated int not refine the in f a re unseparated, e motion, varies w d 7 g ive the con f bj ects, respective l r reco g nition rat i m ethods, also, w h d , the best resul , for all sub j ects F isher linear disc r s to provide ver y a nd future wo r r esented an app r a ls. Ph y siolo g ica l g nition methods g nition rates of a e s best results t h T his stud y has s h m puter usin g ph y i sher Linear Dis c s amon g the mo s s ifier to a partic u l , si g moid or g a u m prove SVM reco g of the f eatures s n al space (Fisher f of multidimen s n ificant variation s o well-defined c l f ormation relate d which explains w idel y across su b f usion matrixes f ly , usin g Fisher d i o is alwa y s obt a h en we mixed th e ts classification case. These resu l r iminant anal y si s g ood results. r ks r oach to emotion l data was acqui r have been te s a bout 90% were h an Fisher discri m h own that specif i y siolo g ical featur c riminant and S V s t important cus t u lar application d u ssian radial ba s g nition results. s i g nals separatio f eatures). Fisher s ional class data s in the positions l usters. Obviousl y d to tar g et emoti o the obtained Fi s bj ects). f or the ori g inal t d iscriminant and a ined for the cor r e features si g nals achievement w a l ts are ver y pro m s or SVM metho d reco g nition bas e r ed in six differ e s ted: SVM met h achieved for bo t m inant usin g mi i c emotion patte r es. V M classification t omizations that d omain. It is inte r s is function (RB F n. These feature s transformation i s in a two dime n of data points f o y , mer g in g the f e o ns (Fi g ure 10). W s her rate as 0% t rainin g set of s u SVM method. It r espondin g and c . 
On the other h a a s g ained b y th e m isin g in the sen s d to the task of e m e d on the proces s e nt affective stat e h od and Fisher t h classifiers. Ho w xed features si gn r n can be autom a for 10 can be r estin g F ) and s were s often n sional o r each e atures W e can for all u b j ect1, can be c orrect a nd, as e SVM s e that m otion s in g of e s and linear w ever, n als of a ticall y Future work on arousal, valence assessment will be used in order to identify the emotion in the valence / arousal space. We intend to use wireless sensor in order to ensure a natural and without constraints interaction between human and machine. There is also much scope to improve our system to incorporate other means of emotion recognition. Currently we are working on a facial expression system which can be integrated with physiological signal features. 8. References [1] B. Reeves and C. Nass, The Media Equation. Cambridge University Press, 1996. [2] H.R. Kim, K.W. Lee, and D.S. Kwon, Emotional Interaction Model for a Service Robot, in Proceedings of the IEEE International Workshop on Robots and Human Interactive Communication, 2005, pp. 672-678. [3] R. Cowie, E. Douglas-Cowie, N. Tsapatsoulis, G. Votsis, S.Kollias, W. Fellenz and J. G. Taylor, Emotion recognition in human-computer interaction, IEEE Signal Process. Magazine, Vol. 18, 2001, pp. 32-80. [4] R. C. Arkin, M. Fujita, T. Takagi and R. Hasegawa, Ethological modeling and architecture for An entertainment robot, IEEE Int. Conf. Robotics & Automation, 2001, pp. 453- 458. [5] H. Hirsh, M. H. Coen, M. C. Mozer, R. Hasha and J. L. Flanagan, Room service, AI-style, IEEE Intelligent Systems and their application, Vol. 14, 1999, pp. 8-19. [6] R. W. Picard, Affective computing, MIT Press, Cambridge, 1995. [7] P. Lang, The emotion probe: Studies of motivation and attention, American Psychologist, vol. 50(5), 1995, pp. 372-385. [8] R. J. 
Davidson, Parsing affective space: Perspectives from neuropsychology and psychophysiology, Neuropsychology, Vol. 7, no. 4, 1993, pp. 464—475. [9] R. W. Levenson, Emotion and the autonomic nervous system: A prospectus for research on autonomic specificity, In H. L. Wagner, editor, Social Psychophysiology and Emotion: Theory and Clinical Applications, 1988, pp. 17-42. [10] R. W. Picard, E. Vyzas, and J. Healey, Toward machine emotional intelligence: analysis of affective physiological state, IEEE Trans. Pattern Anal. Mach. Intell., Vol. 23, 2001, pp. 1175-1191. [11] F. Nasoz, K. Alvarez, C. Lisetti,& N. Finkelstein, Emotion recognition from physiological signals for presence technologies, International Journal of Cognition, Technology, and Work - Special Issue on Presence, Vol. 6(1),2003. [12] J. WagnerJ. Kim & E. Andre, From physiological signals to emotions: Implementing and comparing selected methods for feature extraction and classification, IEEE Internation Conference on Multimedia and Expo, Amsterdam, 2005, pp. 940-943. [13] B. Herbelin, P. Benzaki, F. Riquier, O. Renault, and D. Thalmann, Using physiological measures for emotional assessment: a computer-aided tool for cognitive and behavioural therapy, in 5th Int. Conf on Disability, Virtual Reality, 2004, pp. 307- 314. [14] C.L. Lisetti and F. Nasoz, Using Noninvasive Wearable Computers to Recognize Human Emotions from Physiological Signals, Journal on applied Signal Processing, Hindawi Publishing Corporation, 2004, pp. 1672-1687. CuttingEdgeRobotics2010328 [15] P. Lang, M. Bradley, B. Cuthbert, International affective picture system (IAPS): Digitized photographs, instruction manual and affective ratings. Technical report A-6. University of Florida (2005). [16] www.thoughttechnology.com/proinf.htm [17] J.D. Morris, SAM: The Self-Assessment Manikin, An Efficient Cross-Cultural Measurement of Emotional Response, Journal of Advertising Research, 1995. [18] W. 
Boucsein, Electrodermal activity, New York: Plenum Press, 1992. [19] R. Fernandez, R. Picard, Signal Processing for Recognition of Human Frustration; IEEE Int. Conf. on Acoustics, Speech and Signal Processing, Vol. 6, 1998, pp. 3773-3776. [20] J. Healey and R. Picard, Digital processing of Affective Signals, ICASSP, 1988. [21] V. N. Vapnik, An overview of statistical learning theory, IEEE Trans. Neural Network., Vol. 10, 1999, pp. 988-999. [22] R. Duda, P. Hart, Pattern Classification and Scene Analysis. Bayes Decision Theory, John Wiley & Sons, 1973. Fig. 8. Points representing emotion episodes are projected onto the first two Fisher features for subject! Fig. 9. Points representing emotion episodes are projected onto the first two Fisher features for subject2 EmotionRecognitionthroughPhysiologicalSignalsforHuman-MachineCommunication 329 [15] P. Lang, M. Bradley, B. Cuthbert, International affective picture system (IAPS): Digitized photographs, instruction manual and affective ratings. Technical report A-6. University of Florida (2005). [16] www.thoughttechnology.com/proinf.htm [17] J.D. Morris, SAM: The Self-Assessment Manikin, An Efficient Cross-Cultural Measurement of Emotional Response, Journal of Advertising Research, 1995. [18] W. Boucsein, Electrodermal activity, New York: Plenum Press, 1992. [19] R. Fernandez, R. Picard, Signal Processing for Recognition of Human Frustration; IEEE Int. Conf. on Acoustics, Speech and Signal Processing, Vol. 6, 1998, pp. 3773-3776. [20] J. Healey and R. Picard, Digital processing of Affective Signals, ICASSP, 1988. [21] V. N. Vapnik, An overview of statistical learning theory, IEEE Trans. Neural Network., Vol. 10, 1999, pp. 988-999. [22] R. Duda, P. Hart, Pattern Classification and Scene Analysis. Bayes Decision Theory, John Wiley & Sons, 1973. Fig. 8. Points representing emotion episodes are projected onto the first two Fisher features for subject! Fig. 9. 
Cutting Edge Robotics 2010, 330

Table 2. Confusion matrix for emotion recognition using Fisher discriminant (subject 1); 1: Amusement, 2: Contentment, 3: Disgust, 4: Fear, 5: Neutral, 6: Sadness

Fig. 10. Points representing emotion episodes are projected onto the first two Fisher features for all subjects

Table 3. Confusion matrix for emotion recognition using SVM method (subject 1); 1: Amusement, 2: Contentment, 3: Disgust, 4: Fear, 5: Neutral, 6: Sadness

Table 4. Confusion matrix for emotion recognition using Fisher discriminant (subject 2); 1: Amusement, 2: Contentment, 3: Disgust, 4: Fear, 5: Neutral, 6: Sadness

Table 5. Confusion matrix for emotion recognition using SVM (subject 2); 1: Amusement, 2: Contentment, 3: Disgust, 4: Fear, 5: Neutral, 6: Sadness

Table 6. Confusion matrix for emotion recognition using Fisher discriminant (all subjects); 1: Amusement, 2: Contentment, 3: Disgust, 4: Fear, 5: Neutral, 6: Sadness

[...] http://www.bpsmedicine.com/content/1/1/18 (p. 350) ... Mark May and Barry M. Schaitkin, Facial Paralysis: Rehabilitation Techniques, Thieme, 2003. ... Z. Dingguo and Z. Kuanyi, "Neural network control for leg rhythmic movements via functional electrical stimulation", Proc. of IEEE International Joint Conference on Neural Networks, Vol. 2, pp. 1245-1248, 2004. ... J. C. Borod, E. Koff, S. Yecker, C. Santschi, ...
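The confusion matrices reported for the Fisher discriminant and SVM classifiers above tabulate, for each true emotion class, how episodes were distributed over the predicted classes. A minimal sketch of how such a matrix is built (the twelve labels below are hypothetical, not the chapter's data):

```python
import numpy as np

CLASSES = ["Amusement", "Contentment", "Disgust", "Fear", "Neutral", "Sadness"]

def confusion_matrix(y_true, y_pred, n_classes=len(CLASSES)):
    """Row i, column j counts episodes of true class i predicted as class j."""
    cm = np.zeros((n_classes, n_classes), dtype=int)
    for t, p in zip(y_true, y_pred):
        cm[t, p] += 1
    return cm

# Hypothetical labels for twelve emotion episodes (indices into CLASSES).
y_true = [0, 0, 1, 1, 2, 2, 3, 3, 4, 4, 5, 5]
y_pred = [0, 1, 1, 1, 2, 3, 3, 3, 4, 4, 5, 0]

cm = confusion_matrix(y_true, y_pred)
accuracy = np.trace(cm) / cm.sum()  # fraction of episodes on the diagonal
```

The diagonal of `cm` holds the correctly recognized episodes per class; off-diagonal cells show which emotions are confused with which, the information the per-subject and all-subject tables compare.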
... specially designed SMA-based actuator called "Silent Actuation Unit", or SIAC unit in short. The basic principle is the contraction of SMA coil elements when undergoing Joule heating. (p. 344)

Fig. 12. Artificial facial displacement through actuation on the subject's right side. (a) Compared 3-dimensionally to the neutral face. (b) Actuation direction and corresponding muscles. ...

... Two actuators have been used to generate the movement of the zygomaticus major. (p. 346)

Fig. 14. Performance of SMA coil-type actuation elements under various controlled conditions. (a) Low-voltage actuation characteristics, and (b) time-displacement characteristics during heating and cooling.

Fig. 15. Part of the oral group of facial muscles under consideration, actuator positioning, ...

... (Quinn and Jr., 1996). Facial nerve paralysis is a fairly common issue that involves the paralysis of any parts stimulated by the facial nerve. Due to the lengthy and relatively convoluted nature of the pathway of the facial nerve, several situations can result in ... (p. 334)

Table 1. Bell's Palsy statistics: Bell's Palsy, 0.5/1000 per year; recurrence rate, 7.0%; lifetime prevalence, 0.64-2.0%.

... 9 and 10 show the histograms for the difference in the displacement angles when compared with the natural smile for grid-line and muscle markers, respectively. The values are given in degrees. (p. 340)

[Bar charts: Deflection [mm] vs. marker position 1-10, right and left sides, panels (a) and (b)]

... markers and (b) muscle markers between the neutral and natural smile, artificial risorius simulator ("Art Bottom"), artificial zygomaticus major simulator ("Art Top"), and both simultaneously. (p. 342)

Fig. 9. Histogram of the error between the deviation angles of artificial and natural expressions for the artificial risorius simulator (1 Art Muscle), artificial zygomaticus major simulator (2 ...

Table 7. Confusion matrix for emotion recognition using SVM (all subjects); 1: Amusement, 2: Contentment, 3: Disgust, 4: Fear, 5: Neutral, 6: Sadness. (p. 332)

... Robot Assisted ... expressions are recreated. Because of the importance and universality of facial expressions, it is necessary for artificially reconstructed expressions to be as close to the natural ones as possible. (p. 336)

2. Analysis of Facial Morphology. Even though in case of illness or accident muscle movement can be lost, the functions of natural muscles can be recreated artificially. In the case of ...

... of its sample number) of bioelectrical signal captured at a rate of 200 Hz and the corresponding actuation logic derived by the master controller. The data corresponds to a time period of 40 s. (p. 348)

Fig. 17. Formation of artificial expressions through the Robot Mask.

3.3 Evaluation of Robot Mask Performances. Figure 17 shows three different sets of artificially generated ...

... designated with A shows the greatest displacement while smiling, while B shows little displacement.

Fig. 3. Artificial facial skin displacement using 3 artificial muscles.

Fig. 4. Muscles on the face. (p. 338)

Fig. 5. Position of markers on the face. Left: grid-line markers. Right: muscle markers.

Figure 5 shows the position of both the grid-line and muscle markers. The grid-line markers are white ...

... theory, pioneered by Vapnik [21], instead of heuristics or analogies with ... (p. 324)

... Resp) acquired during the induction of the six emotions (left: Subject 1, right: Subject 2). (p. 326)
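One excerpt above describes a bioelectrical signal sampled at 200 Hz and the corresponding actuation logic derived by the master controller. The chapter's actual controller is not reproduced here; as a hedged sketch only, a simple threshold rule on a moving RMS envelope could look like the following (the window length, threshold, and all names are hypothetical):

```python
import numpy as np

FS = 200  # Hz: sampling rate quoted in the excerpt

def actuation_logic(emg, window_s=0.2, threshold=0.3):
    """Hypothetical rule: drive the actuator wherever the moving RMS
    of the bioelectrical signal exceeds a fixed threshold."""
    n = int(window_s * FS)
    power = np.convolve(np.asarray(emg, float) ** 2, np.ones(n) / n, mode="same")
    return np.sqrt(power) > threshold  # boolean drive command per sample

# Hypothetical trace: low-level background plus one burst of muscle activity.
t = np.arange(0, 2.0, 1 / FS)                                # 2 s of samples
emg = 0.05 * np.sin(2 * np.pi * 50 * t)                      # background
emg[200:300] += 0.8 * np.sin(2 * np.pi * 80 * t[200:300])    # burst at 1.0-1.5 s
drive = actuation_logic(emg)
```

The boolean `drive` array goes high only during the burst, which is the kind of on/off command a master controller could translate into Joule heating of an SMA coil.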
