PPt6 - Hopfield pdf


Document information

 1    !!"  Recurrent Network có các hidden neuron: phần tử làm trễ z -1 được dùng - Đầu ra của Neural được feedback về tất cả các Neural z -1 z -1 z -1     2    !!" #$$%&#$$' Input: Pattern (thưng c nhiu hoc xung cp) Output: Corresponding pattern (hon hảo/xét m$t c%ch tương đi l ko c nhiu) Process – $()*+, /&'012345)-6 () – 7-(-8--9414*(-0: 1;- – <=1>?(-0?0/       3    !!" @ABC$% Input: Pattern (thưng c nhiu hoc xung cp) Output: Corresponding pattern (hon hảo/xét m$t c%ch tương đi l ko c nhiu) Process – $()*+, /&'012345)-6 () – 7-(-8--9414*(-0: 1;- – <=1>?(-0?0/       4    !!" 70(@A$%     !"#$ %  !& '"()$ *+ , -./01 23 * !4  &5#& ' !526  5    !!" 
70(@A$%&'   %  !7."()$ *'8   !#)   %  !7."()$ *'8   !#)    6    !!" DDE#  9 7:.;7'  <<:'' '6 4. = >4 4''!'6 ?' .246 ' 6> 2 4''@ %??=AA    ! '=42  2 .>'2 6  4 6 48'6   2 = .? '?! ?44 =@@@ @. ! 5 >526 B.must store ! .' ? ''  2  ''    '   '     7    !!" 77)  C D E 2 2 ''4 4'. >' D   ! '= 4' 2 D F!6 42=! ' 2  4 ! '  D G. ?42; !.'  HH HH ? ? !.7'  ?  ?  8    !!" 
FDDE#  I 72  ' . . 4'  '  ∑ = ∝ P p pjpkjk iiw 1 pjpkjk iiw ∝∆ %?? J'4 ''  ! '2= .  *:2  2 44=   .  ''6=.'' ! .'   ''  ∑ = ≡ P p pjpkjk ii P w 1 1 :.#! .7'  pjpkjk oiw ∝∆   %  ∑ = ∝ P p pjpkjk oiw 1  %  ∑ = ≡ P p pjpkjk oi P w 1 1 % #2 #2  9    !!" )$%  Auto-Association Network  Fully-connected (clique) with symmetric weights  State of node = f(inputs)  Weight values based on Hebbian principle  Performance: Must iterate a bit to converge on a pattern, but generally much less computation than in back-propagation networks.  A   42 6  ))(sgn()1( 1 ∑ = =+ n j pjkjpk txwtx K '  10    !!" )$%  The Hopfield network implements a so-called associative (also called content addressable) memory.  A collection of patterns called fundamental memories is stored in the NN by means of weights.  Each neuron represents an attribute (dimension, feature) of the input.  The weight of the link between two neurons measures the correlation between the two corresponding attributes over the fundamental memories. If the weight is high then the corresponding attributes are often equal over the fundamental memories.  John Hopfield (Physicist, at Caltech in 1982, proposed the model, now at Princeton)   [...]... 
Execution
  • Figure: state-transition diagram of a 3-neuron network over the eight states (±1, ±1, ±1); the states fall into two attraction basins, each draining into a stable state.

Faculty of Electronics and Telecommunications, HUT. NN 5. Bangkok, Jun 14 – 23, 2006.

Example of Training
  • Consider the two fundamental memories (-1 -1 -1) and (1 1 1).
  • Find the weights of the Hopfield network:
      w12 = (1*1 + (-1)*(-1))/2 = 1
      w23 = 1
      w13 = 1
  • Network behaviour (figure): each of the eight states converges to one of the two fundamental memories.

Properties of Hopfield Nets
  • Distributed representations: memories are stored across the whole network, not in individual units.

Input and output of a Hopfield net
  • Stored: prototype patterns (bit maps).
  • Input: an arbitrary pattern (e.g. a picture with noise).
  • Output: the best prototype for that pattern.

Hopfield NN architecture: recurrent
  • A multiple-loop feedback system with no self-feedback.
  • Figure: example of a Hopfield NN for 3-dimensional input data (x1, x2, x3); neurons X1, X2, X3 with pairwise weights +1 and -1; each neuron holds one attribute of the input.
  • Execution: the input pattern's attributes are the initial states of the neurons; repeatedly update the states of the neurons asynchronously until the states do not change.

Discrete Hopfield NN
  • Input vector values are in {-1, 1} (or {0, 1}).
  • The number of neurons ...
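The asynchronous execution procedure (initialise the neurons with the input pattern, then update one neuron at a time until a full sweep changes nothing) can be sketched as follows, using the weights from the training example. Keeping the current state when the local field is exactly zero is an assumed tie-breaking convention, not one stated on the slides.

```python
# Sketch of asynchronous execution. Weights are from the training example
# (fundamental memories (-1,-1,-1) and (1,1,1): w12 = w13 = w23 = 1).
# Assumption: a neuron keeps its current state when its local field is 0.

def run_async(w, pattern):
    """Update one neuron at a time until a full sweep changes nothing."""
    x = list(pattern)
    n = len(x)
    changed = True
    while changed:
        changed = False
        for k in range(n):
            h = sum(w[k][j] * x[j] for j in range(n) if j != k)  # local field
            new = x[k] if h == 0 else (1 if h > 0 else -1)
            if new != x[k]:
                x[k] = new
                changed = True
    return x

w = [[0, 1, 1],
     [1, 0, 1],
     [1, 1, 0]]
print(run_async(w, (-1, -1, 1)))  # -> [-1, -1, -1], basin of the first memory
print(run_async(w, (1, 1, -1)))   # -> [1, 1, 1], basin of the second memory
```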
Auto-Associative Patterns to Remember
  • Figure: four bit-map patterns over units 1–4. Node value legend: dark (blue) with x ⇒ +1; dark (red) without x ⇒ -1; light (green) ⇒ 0.
  • One node per pattern unit.
  • Fully connected: clique.
  • Weights = average correlations of the corresponding units across all patterns.

Distributed Storage of All Patterns
  • Figure: the same fully connected network with all patterns superimposed in the weights.

Properties of Hopfield Nets (cont.)
  • Local asynchronous control: each unit makes its own local decision (based on locally available information) about its activation, asynchronously from any other unit.
  • Content-addressable memory (also known as associative memory): given a partial or noisy input, the network will automatically find the closest stored pattern.
  • Fault tolerant: the network will continue to function reasonably if a few of the nodes or connections are damaged.

Example: image retrieval
  • Illustrates the behaviour of the discrete Hopfield network as a content-addressable memory.
  • n = 120 neurons (⇒ n² − n = 14,280 weights).
  • The network is trained to retrieve 8 black-and-white patterns; each pattern contains 120 pixels.
  • The inputs of the net assume the value +1 for black pixels and -1 for white pixels.
  • Implemented in MATLAB, available at: ftp://ftp.mathworks.com/pub/books/haykin...

Hopfield Networks
  • Can we formalise what they do, and why they do what they do?
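The fault-tolerance property can be illustrated by zeroing out a connection and checking that recall still succeeds. This is a constructed toy example (one stored pattern, an arbitrary choice of damaged weight and noisy probe), not taken from the slides.

```python
# Toy illustration of fault tolerance: store one pattern via the Hebbian
# rule, damage one symmetric connection, and recall from a noisy probe.
# The damaged index and the probe are arbitrary illustrative choices.

def run_async(w, pattern):
    """Asynchronous updates until stable; keep state on a zero field."""
    x = list(pattern)
    n = len(x)
    changed = True
    while changed:
        changed = False
        for k in range(n):
            h = sum(w[k][j] * x[j] for j in range(n) if j != k)
            new = x[k] if h == 0 else (1 if h > 0 else -1)
            if new != x[k]:
                x[k] = new
                changed = True
    return x

memory = (1, 1, -1, -1)
n = len(memory)
# Hebbian weights for a single stored pattern, with no self-feedback.
w = [[0 if j == k else memory[k] * memory[j] for j in range(n)]
     for k in range(n)]

w[0][1] = w[1][0] = 0        # damage one connection
noisy = (-1, 1, -1, -1)      # one attribute flipped
print(run_async(w, noisy))   # still recalls [1, 1, -1, -1]
```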
… They minimize an energy function.

Hopfield Network Energy
  • The energy of the associative memory should be low when pairs of node values are consistent with the correlations stored in the weights that connect them.

Hopfield Nets and Energy Minimisation
  • At time zero, the state of the network = the initial stimulus.
  • Figure: energy landscape with attractor states ξ(1), ξ(2) and their negations −ξ(1), −ξ(2).
  • If the network is started off in a state which is a perturbed version of one of the patterns ξ(µ), it will settle into a state with its activation levels given by ξ(µ).
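The energy alluded to above is, in the standard bias-free discrete form, E = -(1/2) Σ_j Σ_k w_jk x_j x_k. Each asynchronous update can only leave E unchanged or lower it, which is why the network settles into a stable state. A sketch, using the weights from the training example:

```python
# Energy of a state and one asynchronous sweep, recording the energy after
# every single-neuron update. E = -1/2 * sum_jk w_jk * x_j * x_k (no biases).

def energy(w, x):
    n = len(x)
    return -0.5 * sum(w[j][k] * x[j] * x[k]
                      for j in range(n) for k in range(n))

def sweep_with_energies(w, pattern):
    """One asynchronous sweep; returns the final state and the energy trace."""
    x = list(pattern)
    n = len(x)
    energies = [energy(w, x)]
    for k in range(n):
        h = sum(w[k][j] * x[j] for j in range(n) if j != k)  # local field
        if h != 0:
            x[k] = 1 if h > 0 else -1
        energies.append(energy(w, x))
    return x, energies

w = [[0, 1, 1],
     [1, 0, 1],
     [1, 1, 0]]
x, energies = sweep_with_energies(w, (1, -1, 1))
print(x)         # -> [1, 1, 1]
print(energies)  # -> [1.0, 1.0, -3.0, -3.0], never increasing
```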

Date posted: 01/07/2014, 15:20

Contents

  • Recurrent network

  • Recurrent Neural Network (RNN)

  • Associative-Memory Networks

  • Types of Associative Networks

  • Types of Associative Networks (cont.)

  • Hebb’s Rule

  • Correlated Field Components

  • Quantifying Hebb’s Rule

  • Hopfield Networks

  • Slide 10

  • Example: image retrieval

  • Hopfield NN architecture: recurrent

  • Discrete Hopfield NN

  • How do we compute the weights?

  • Training

  • Rationale

  • Execution

  • Execution: Pictorially

  • Example of Execution

  • Example of Training
