Single Layer Perceptron as Linear Classifier

By Kanasz Robert, 7 Nov 2010

Download bin - 19.33 KB
Download source - 28.41 KB

Introduction

The perceptron is the simplest type of feed-forward neural network. It was designed by Frank Rosenblatt as a dichotomic classifier for two classes that are linearly separable; in other words, the problems the network can solve must be linearly separable. A basic perceptron consists of 3 layers:

Sensor layer
Associative layer
Output neuron

There are a number of inputs (x_n) in the sensor layer, weights (w_n) and an output. Sometimes w_0 is called the bias and x_0 = +1/-1 (in this case, x_0 = -1).

For every input of the perceptron (including the bias), there is a corresponding weight. To calculate the output of the perceptron, every input is multiplied by its corresponding weight, the weighted sum of all inputs is computed, and this sum is fed through a limiter function that produces the final output. The output of the neuron is therefore the activation of the output neuron, which is a function of the total input:

u = w_0*x_0 + w_1*x_1 + ... + w_n*x_n,   y = F(u)   (1)

The activation function F can be linear, so that we have a linear network, or nonlinear. In this example, I decided to use the threshold (signum) function:

F(u) = +1 if u >= 0, otherwise -1   (2)

The output of the network in this case is either +1 or -1, depending on the input. If the total input (the weighted sum of all inputs) is positive, the pattern belongs to class +1, otherwise to class -1. Because of this behavior, we can use the perceptron for classification tasks.

Let's consider a perceptron with 2 inputs that should separate the input patterns into 2 classes. In this case, the boundary between the classes is a straight line, given by the equation:

w_1*x_1 + w_2*x_2 + w_0*x_0 = 0   (3)

When we set x_0 = -1 and denote the threshold w_0 = θ, we can rewrite equation (3) in the form:

w_1*x_1 + w_2*x_2 = θ   (4)

Here I will describe the learning method for the perceptron. It is an iterative procedure that adjusts the weights. A learning sample is presented to the network and, for each weight, the new value is computed by adding a correction to the old value. The threshold is updated in the same way:

w_i(new) = w_i(old) + η * (d - y) * x_i / 2   (5)

where y is the output of the perceptron, d is the desired output and η is the learning parameter (learning rate). The factor 1/2 simply normalizes (d - y), which is either -2 or +2 whenever the two outputs disagree.

Using the Program

When you run the program, you see an area where you can input samples. Clicking with the left mouse button on this area adds a first-class sample (blue cross); clicking with the right mouse button adds a second-class sample (red cross). Samples are added to the samples list. You can also set the learning rate and the number of iterations. When you have set all these values, click the Learn button to start learning.

Using the Code

All samples are stored in the generic list samples, which holds only Sample class objects.

public class Sample
{
    double x1;
    double x2;
    double cls;

    public Sample(double x1, double x2, int cls)
    {
        this.x1 = x1;
        this.x2 = x2;
        this.cls = cls;
    }

    public double X1
    {
        get { return x1; }
        set { this.x1 = value; }
    }

    public double X2
    {
        get { return x2; }
        set { this.x2 = value; }
    }

    public double Class
    {
        get { return cls; }
        set { this.cls = value; }
    }
}

Before running the perceptron learning, it is important to set the learning rate and the number of iterations. The perceptron has one great property: if a solution exists, it will always find it. The problem occurs when no solution exists; in that case the perceptron would keep searching in an infinite loop, so to avoid this it is better to set a maximum number of iterations.
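As a quick illustration only (this snippet is not part of the article's source), the same kind of training set that the mouse clicks produce could be built in code like this; the coordinates, the local alpha and maxIterations variables, and their values are arbitrary assumptions for the sketch (in the program they come from the UI controls):

// Minimal setup sketch, assuming the article's Sample class shown above.
// Requires: using System.Collections.Generic;
// The coordinates and parameter values are arbitrary illustrative choices.
List<Sample> samples = new List<Sample>();
samples.Add(new Sample(1.0, 2.0, 1));    // first class (+1, blue cross)
samples.Add(new Sample(2.0, 1.5, 1));    // first class (+1)
samples.Add(new Sample(6.0, 5.0, -1));   // second class (-1, red cross)
samples.Add(new Sample(7.0, 6.5, -1));   // second class (-1)

double alpha = 0.1;        // learning rate
int maxIterations = 100;   // upper bound in case the classes are not separable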
The next step is to assign random values to the weights (w_0, w_1 and w_2).

Random rnd = new Random();
w0 = rnd.NextDouble();
w1 = rnd.NextDouble();
w2 = rnd.NextDouble();

Once random values have been assigned to the weights, we can loop through the samples, compute the output for every sample and compare it with the desired output.

double x1 = samples[i].X1;
double x2 = samples[i].X2;
int y;
if (((w1 * x1) + (w2 * x2) - w0) < 0)
{
    y = -1;
}
else
{
    y = 1;
}

I decided to set x_0 = -1, so the total input of the perceptron is w_1*x_1 + w_2*x_2 - w_0, and its sign determines the output y. When the perceptron output and the desired output do not match, we must compute new weights:

if (y != samples[i].Class)
{
    error = true;
    w0 = w0 + alpha * (samples[i].Class - y) * x0 / 2;
    w1 = w1 + alpha * (samples[i].Class - y) * x1 / 2;
    w2 = w2 + alpha * (samples[i].Class - y) * x2 / 2;
}

Here y is the output of the perceptron and samples[i].Class is the desired output. The last 2 steps (looping through the samples and computing new weights) are repeated while the error flag is still set and the current number of iterations (iterations) is less than maxIterations.

int i;
int iterations = 0;
bool error = true;
maxIterations = int.Parse(txtIterations.Text);
Random rnd = new Random();
w0 = rnd.NextDouble();
w1 = rnd.NextDouble();
w2 = rnd.NextDouble();
alpha = (double)trackLearningRate.Value / 1000;

while (error && iterations < maxIterations)
{
    error = false;
    for (i = 0; i <= samples.Count - 1; i++)
    {
        double x1 = samples[i].X1;
        double x2 = samples[i].X2;
        int y;
        if (((w1 * x1) + (w2 * x2) - w0) < 0)
        {
            y = -1;
        }
        else
        {
            y = 1;
        }
        if (y != samples[i].Class)
        {
            error = true;
            w0 = w0 + alpha * (samples[i].Class - y) * x0 / 2;
            w1 = w1 + alpha * (samples[i].Class - y) * x1 / 2;
            w2 = w2 + alpha * (samples[i].Class - y) * x2 / 2;
        }
    }
    objGraphics.Clear(Color.White);
    DrawSeparationLine();
    iterations++;
}

The function DrawSeparationLine draws the separation line between the 2 classes.
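The article does not list DrawSeparationLine (it ships with the downloadable source). A minimal sketch of the idea, assuming the line is obtained by solving equation (4) for x_2 and that the form exposes a drawing surface named pictureBox1, might look like the following; the actual implementation in the download may differ:

// Hypothetical sketch, not the article's actual implementation.
// Assumes the form's fields w0, w1, w2 and objGraphics from the listing above,
// plus a picture box named pictureBox1 (placeholder name).
// From equation (4): w1*x1 + w2*x2 = w0, so x2 = (w0 - w1*x1) / w2.
private void DrawSeparationLine()
{
    if (w2 == 0)
        return;                               // this sketch skips the vertical-line case

    float x1Start = 0;
    float x1End = pictureBox1.Width;          // assumed drawing surface width

    float x2Start = (float)((w0 - w1 * x1Start) / w2);
    float x2End = (float)((w0 - w1 * x1End) / w2);

    objGraphics.DrawLine(Pens.Black, x1Start, x2Start, x1End, x2End);
}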
History

07 Nov 2010 - Original version posted

License

This article, along with any associated source code and files, is licensed under The Code Project Open License (CPOL).

About the Author

Kanasz Robert
Architect, The Staffing Edge & Marwin Cassovia Soft, Slovakia

My name is Robert Kanasz and I have been working with ASP.NET, WinForms and C# for several years.
MCTS - .NET Framework 3.5, ASP.NET Applications; SQL Server 2008, Database Development; SQL Server 2008, Implementation and Maintenance; .NET Framework 4, Data Access; .NET Framework 4, Service Communication Applications; .NET Framework 4, Web Applications
MCPD - ASP.NET Developer 3.5; Web Developer 4
MCITP - Database Administrator 2008; Database Developer 2008
Open source projects: DBScripter - Library for scripting SQL Server database objects

Article Copyright 2010 by Kanasz Robert

Comments and Discussions

34 messages have been posted for this article. Visit http://www.codeproject.com/Articles/125346/Single-Layer-Perceptron-as-Linear-Classifier to post and view comments.
