Major code: 8.48.01.01
MASTER'S THESIS
HO CHI MINH CITY, 2021
THIS WORK WAS COMPLETED AT:
VIETNAM NATIONAL UNIVERSITY - HO CHI MINH CITY
HO CHI MINH CITY UNIVERSITY OF TECHNOLOGY
SOCIALIST REPUBLIC OF VIETNAM
Independence - Freedom - Happiness
III. DATE OF ASSIGNMENT: 21/03/2021
IV. DATE OF COMPLETION: 31/05/2021
V. SUPERVISOR: Assoc. Prof. Dr. Quản Thành Thơ
ACKNOWLEDGEMENTS
First of all, I would like to express my deep gratitude to Assoc. Prof. Dr. Quản Thành Thơ, who supervised me throughout the work on this thesis as well as on its proposal. It is thanks to his guidance and feedback that I was able to complete this thesis topic well. I would like to thank the lecturers of the Faculty of Computer Science and Engineering, who passed on valuable knowledge and experience to me over the past two years. My thanks also go to all the members of Professor Thơ's Language Model group for their help and support throughout the work on this thesis. Finally, I would like to sincerely thank my family and friends, who have always encouraged and supported me during my graduate studies.
Ho Chi Minh City, May 31
THESIS SUMMARY
Current approaches to building bilingual dictionaries automatically usually have to rely on bilingual datasets for training. Several recent studies show that bilingual corpus data may not be needed for training: a deep learning model can instead be trained to produce a set of parameters that maps the word space of the source language to the word space of the target language fully automatically. Put differently, the model tries to align the distribution of the source-language space with the distribution of the target-language space, and the model's parameters then become the mapping matrix between the languages. This thesis takes the approach of using a generative adversarial network (GAN) combined with solving the orthogonal Procrustes problem to build such a model. The datasets used in the thesis are monolingual corpora of English, French, and Vietnamese taken from Wikipedia. The word embeddings used for training are Word2Vec and FastText. The dictionary generated by the model is then evaluated from multiple angles.
ABSTRACT
The state-of-the-art methods for learning cross-lingual word embeddings have relied on parallel corpora. Recent studies showed that the need for parallel data supervision can be alleviated. In this work, I show that we can build a bilingual dictionary between two languages without using any parallel corpora, by aligning monolingual word embedding spaces in an unsupervised way. To this end, I applied a Generative Adversarial Network (GAN) and solved the orthogonal Procrustes problem to implement these solutions. The dataset used for this thesis consists of monolingual corpora of English, French, and Vietnamese collected from Wikipedia. The word embeddings used for training are Word2Vec and FastText. Finally, I also present an evaluation of the dictionaries generated by these models.
DECLARATION
I declare that the thesis "Applying deep learning to word translation without bilingual data" is the result of my own research, carried out under the supervision and with the feedback of Assoc. Prof. Dr. Quản Thành Thơ. Information drawn from other related works has been clearly cited in the thesis. The research content and the results are entirely my own work, not copied or taken from any other source. I take full responsibility for this declaration.
Ho Chi Minh City, May 31
Student
Trần Quân
TABLE OF CONTENTS
MASTER'S THESIS TASK SHEET I
ACKNOWLEDGEMENTS II
THESIS SUMMARY III
ABSTRACT IV
DECLARATION V
TABLE OF CONTENTS VI
LIST OF FIGURES VIII
LIST OF TABLES IX
LIST OF CODE LISTINGS IX
LIST OF ABBREVIATIONS IX
2 RELATED WORKS 4
2.1 T. Mikolov, Q. V. Le, and I. Sutskever, "Exploiting similarities among languages for machine translation," arXiv preprint arXiv:1309.4168, 2013b. [1] 4
2.2 C. Xing, D. Wang, C. Liu, and Y. Lin, "Normalized word embedding and orthogonal transform for bilingual word translation," Proceedings of NAACL, 2015. [2] 4
2.3 W. Ammar, G. Mulcaire, Y. Tsvetkov, G. Lample, C. Dyer, N. A. Smith, "Massively multilingual word embeddings," arXiv preprint arXiv:1602.01925, 2016. [3] 4
2.4 A. Conneau, G. Lample, M. Ranzato, L. Denoyer, H. Jégou, "Word translation without parallel data," arXiv preprint arXiv:1710.04087, 2018. [4] 5
LIST OF FIGURES
Figure 1: Illustration of the mapping between the vector spaces of two languages 1
Figure 2: Illustration of a multi-layer neural network 6
Figure 3: Graph of the tanh function 7
Figure 4: Graph of the sigmoid function 8
Figure 5: Graph of the ReLU function 8
Figure 6: Graph of the Leaky ReLU function 9
Figure 7: Illustration of the dropout technique 11
Figure 8: Illustration of the organization of one-hot vectors 12
Figure 9: Illustration of semantic relatedness in word2vec 13
Figure 10: Illustration of the architecture of the word2vec model 14
Figure 11: Illustration of the CBOW model 15
Figure 12: Illustration of the neural network architecture of the Skip-gram model 16
Figure 13: Illustration of the out-of-vocabulary problem of word2vec 17
Figure 14: Illustration of the distribution of set B 20
Figure 15: Illustration of the distribution of set A 20
Figure 16: Result of aligning the two distributions RA and B 22
Figure 17: Illustration of the GAN architecture 23
Figure 18: Illustration of two initially fully separated distributions 25
Figure 19: The Discriminative network's task of distinguishing the two distributions 25
Figure 20: Initially, the Discriminative network easily distinguishes the two distributions 25
Figure 21: Illustration of updating the Generative Model's weights to produce a better distribution 26
Figure 22: The Discriminative Model still detects a difference between the two distributions, so it keeps backpropagating to the Generative Model to update its weights 26
Figure 23: Illustration of the models stopping once the two distributions match 26
Figure 24: Illustration of rotating distribution X with matrix W to match distribution Y 28
Figure 25: Illustration of the components and operating flow of the model used in the thesis 32
Figure 26: Illustration of the dataset processing pipeline 33
Figure 27: The Discriminator model distinguishing real and fake distributions 33
Figure 28: Illustration of the operation of the Mapper model 34
Figure 29: Illustration of how the loss function of the model is built 34
Figure 30: Illustration of optimizing W by solving the Procrustes problem 35
Figure 31: Detailed illustration of the operation of the model 35
Figure 32: Illustration of the dictionary generation process 36
Figure 33: Loss values of the GAN model after 25 epochs 45
Figure 34: Comparison of the results of the different dictionaries 46
LIST OF TABLES
Table 1: Illustration of CBOW using the surrounding (context) words to predict the center word 14
Table 2: Illustration of skip-gram using the center word to predict the surrounding words 16
Table 3: Illustration of FastText splitting words into sub-words 18
Table 4: Illustration of the FastText training process 18
Table 5: English - Vietnamese dictionary 39
Table 6: English - French dictionary (Word2vec) 41
Table 7: English - French dictionary (FastText) 43
LIST OF CODE LISTINGS
Listing 1: Decoding the wikipedia data 36
Listing 2: Building the word2vec and fasttext models 36
Listing 3: Building the Discriminator model 37
Listing 4: Building the Mapper model 37
Listing 5: Implementing the Procrustes computation 38
1 INTRODUCTION
1.1 Overview
In this thesis, I propose building a system that generates a dictionary automatically without using a bilingual corpus. By extracting language features, I create a word vector space for each language, then build a model that maps the vector space of the source language onto that of the target language. The words of the source language are thereby mapped to their corresponding words in the target language. The figure below depicts the model I intend to build.
Figure 1: Illustration of the mapping between the vector spaces of two languages
Figure 1 illustrates how the thesis model operates:
- First, the distributions of the words of two language spaces are given: English (red) and Vietnamese (purple).
- The model's task is to transform the red distribution by multiplying it with an orthogonal matrix W (producing a rotation) so that it matches the purple distribution.
- Then the distances between the words of the rotated distribution are measured to find the closest pairs; at that point a word such as the English "cat" will coincide with its Vietnamese counterpart "con_mèo".
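The steps above can be sketched with made-up 2-D vectors; the word list, the embeddings, and the matrix W here are all illustrative assumptions, not values from the thesis:

```python
import numpy as np

# Hypothetical word lists: assume we already have an orthogonal matrix W
# that rotates the English embedding space onto the Vietnamese one.
en_words = ["cat", "dog"]
vi_words = ["con_mèo", "con_chó"]

theta = np.pi / 6
W = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])   # orthogonal: a pure rotation

X = np.array([[1.0, 0.0], [0.0, 1.0]])            # made-up English vectors
Y = X @ W.T                                       # Vietnamese vectors sit where the rotation sends them

mapped = X @ W.T                                  # step 2: rotate the source space
# Step 3: cosine similarity between every mapped source word and every target word,
# then pick the nearest neighbor as the translation.
sim = (mapped @ Y.T) / (np.linalg.norm(mapped, axis=1, keepdims=True)
                        * np.linalg.norm(Y, axis=1))
pairs = {en_words[i]: vi_words[int(np.argmax(sim[i]))] for i in range(len(en_words))}
print(pairs)   # {'cat': 'con_mèo', 'dog': 'con_chó'}
```

Here the target vectors were constructed to lie exactly on the rotated source vectors, so the nearest-neighbor step recovers the intended pairs; with real embeddings the match is only approximate.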
The results of this project will help build bilingual dictionaries automatically without any bilingual corpus. This approach makes translation between less common languages, such as ethnic minority languages, easier. In addition, such a dictionary can also support several stages of training machine translation models.
1.2 Applications of the thesis
Starting from a solution for translating words between languages without bilingual data, the thesis aims to generate a bilingual dictionary automatically. For less common languages, such as ethnic minority languages, this is all the more meaningful. Generating such dictionaries would greatly help officials working in remote areas who have no dictionary materials to consult.
1.3 Objectives and scope of the thesis
The objectives of this thesis include:
- Collecting and processing corpus datasets from various sources. The corpora used in the thesis include English, Vietnamese, and French Wikipedia, and baomoi.com.
- Drawing conclusions and future directions for the thesis. The thesis obtained promising results, with the generated dictionary reaching fairly high accuracy; based on these results, it also proposes directions for future work.
1.4 Structure of the thesis
Chapter 1: Overview of the content, objectives, and structure of the thesis.
Chapter 2: Research works related to the thesis.
Chapter 3: Background knowledge related to the thesis, such as word embeddings, neural networks, the orthogonal Procrustes problem, and GANs.
Chapter 4: The methods used to implement the thesis.
Chapter 5: Description of the actual system implementation and evaluation of the results.
Chapter 6: Summary of the achieved results and directions for future work.
2 RELATED WORKS
2.1 T. Mikolov, Q. V. Le, and I. Sutskever, "Exploiting similarities among languages for machine translation," arXiv preprint arXiv:1309.4168, 2013b.
In this work, Mikolov and colleagues observed that word embeddings have similar distributions across many languages, even languages from different cultures such as English and Vietnamese. They also proposed that a mapping between these word embeddings could serve bilingual translation: word pairs are provided for each language as anchor points, and the mapping matrix is then rotated accordingly. This mapping between languages still relies on a bilingual dictionary to fix equivalent words and map them onto each other. The approach therefore still needs bilingual data as anchors; for less common languages lacking bilingual datasets, this model is hard to apply.
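The anchor-based mapping boils down to a least-squares problem: given anchor pairs (x_i, y_i) from a seed dictionary, find the W minimizing ||XW - Y||². A minimal sketch on synthetic vectors (all values here are illustrative, not from the cited work):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2))                # source-language anchor vectors (synthetic)
W_true = np.array([[0.6, -0.8],
                   [0.8,  0.6]])             # hidden "translation" map to be recovered
Y = X @ W_true                               # target vectors for the anchor pairs (noise-free toy)

# Learn the map from the seed dictionary pairs: min_W ||X W - Y||^2
W, *_ = np.linalg.lstsq(X, Y, rcond=None)    # closed-form least-squares solution
```

On this noise-free toy the recovered W equals the hidden map exactly; with real embeddings the fit is approximate and its quality depends on the anchor dictionary.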
2.2 C. Xing, D. Wang, C. Liu, and Y. Lin, "Normalized word embedding and orthogonal transform for bilingual word translation," Proceedings of NAACL, 2015.
This work proposes a solution for normalizing word vectors and for the linear transformation between word embeddings via an orthogonal matrix. Chao Xing and colleagues forced every update of the language mapping matrix to yield an orthogonal matrix. The goal is to guarantee that the vector transformation is only a rotation or a reflection. This solution produced impressive results when translating word spaces from English to Spanish. I also use this solution to normalize the mapping matrix W, helping the model produce the best possible results.
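The orthogonality constraint has a closed-form solution (the orthogonal Procrustes solution, which reappears later in the thesis): the best orthogonal W minimizing ||XW - Y||_F is W = UVᵀ, where UΣVᵀ is the SVD of XᵀY. A minimal sketch on synthetic data:

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 3))                 # "source" vectors (synthetic)
theta = 0.5
R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0,            0.0,           1.0]])   # a known rotation
Y = X @ R                                     # "target" vectors: rotated source

# Closed-form orthogonal Procrustes solution: W = U @ Vt from SVD(X^T Y).
U, _, Vt = np.linalg.svd(X.T @ Y)
W = U @ Vt                                    # orthogonal by construction
```

Because W is built from the SVD factors, W Wᵀ = I holds exactly, so the learned map can only rotate or reflect the source space, as Xing et al. require.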
2.3 W. Ammar, G. Mulcaire, Y. Tsvetkov, G. Lample, C. Dyer, N. A. Smith, "Massively multilingual word embeddings," arXiv preprint arXiv:1602.01925, 2016. [3]
This research work proposes a solution for creating a single shared word embedding that represents many different languages. It uses only monolingual corpora of different languages around the world. This method needs a large number of languages to compose the shared embedding, and the embedding only represents properties common to the languages rather than characteristics specific to any language pair, so it is not suitable for the goal of building a dedicated bilingual dictionary.
2.4 A. Conneau, G. Lample, M. Ranzato, L. Denoyer, H. Jégou, "Word translation without parallel data," arXiv preprint arXiv:1710.04087, 2018. [4]
This research work proposed a solution for building an unsupervised learning model. The authors use only two monolingual sets, one for the source language and one for the target language. Their method builds a special network that can by itself linearly map the source-language space to the target-language space, based on a model called a generative adversarial network (GAN), without needing bilingual data for training. The work uses the adversarial model to generate a dictionary by automatically aligning the distributions through the properties of the GAN. I use this solution in my thesis, combined with the matrix orthogonalization of the work in Section 2.2, for the English - Vietnamese and French - Vietnamese dictionaries.
Figure 2: Illustration of a multi-layer neural network
The general architecture of an ANN consists of three components: the input layer, the hidden layers, and the output layer. Figure 2 illustrates a basic neural network with two hidden layers. Each circle is a neuron; the incoming arrows are its inputs and the outgoing arrows are its outputs. The neurons are arranged in layers, representing the flow of information through the network. The bottom layer has no incoming arrows and is the input of the network. Similarly, the top layer has no outgoing arrows and is the output of the network. The other layers are called "hidden" layers.
The symbol inside each neuron denotes the nonlinear activation function, here the sigmoid:

σ(x) = 1 / (1 + e^(-x))
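As a quick sketch of the network described above (the layer sizes and weights are made-up values, not from the thesis), the sigmoid and a forward pass through two hidden layers look like this:

```python
import numpy as np

def sigmoid(x):
    """Logistic activation: squashes any real input into (0, 1)."""
    return 1.0 / (1.0 + np.exp(-x))

# Forward pass: 2 inputs, two hidden layers, one output, sigmoid throughout.
rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(4, 2)), np.zeros(4)   # input -> hidden 1
W2, b2 = rng.normal(size=(3, 4)), np.zeros(3)   # hidden 1 -> hidden 2
W3, b3 = rng.normal(size=(1, 3)), np.zeros(1)   # hidden 2 -> output

x = np.array([0.5, -1.0])
h1 = sigmoid(W1 @ x + b1)      # first hidden layer
h2 = sigmoid(W2 @ h1 + b2)     # second hidden layer
y = sigmoid(W3 @ h2 + b3)      # network output, guaranteed to lie in (0, 1)

print(sigmoid(0.0))   # 0.5, the midpoint of the curve
```

Because every layer ends in a sigmoid, each activation (including the output) is squashed into (0, 1), which is what makes the function useful both as a hidden-layer nonlinearity and as a probability-like output.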