FORECASTING DAILY LOAD ON THE BASIS OF THE PREVIOUS DAY

Excerpted from Pauli Murto (1998), Neural network models for short-term load forecasting (pages 54–60).

Basic idea

In this section, forecasting rests on the idea that conditions similar to those at the forecasting moment have existed before. The conditions are recognized simply from the load behavior on the day preceding the target day; in other words, yesterday's load defines the forecast.

The use of yesterday's load curve as an indicator of the load conditions assumes that the load conditions change relatively slowly, and that the basic factors shaping the load of, say, a certain Thursday can somehow be seen in the load of the preceding Wednesday.

An attraction of the idea lies in the robustness of the model. The forecast is obtained more or less directly from past load. Therefore, if all abnormal behavior is removed from the load history, the forecasts obtained using that data will always be sensible to some degree. This is not necessarily the case with the previous models of this chapter: when forecasting with feed-forward neural networks, an input combination very different from any experienced during training may produce illogical forecasts.

Two models will be considered in testing the idea. First, Kohonen's self-organizing feature map (SOM) is used to form network weights representing different load types. The map is trained on one year of load data. The forecast is obtained by associating the load of the previous day with a weight vector, which contains the forecast for the target day. Then, a simplification of the model is tested; this model does not use neural network techniques at all.

Even though the forecasting is performed for one whole day at a time, the idea could just as well be used for a shorter part of the day: the load of a few hours could be predicted on the basis of the few preceding hours, in the same manner as the whole daily load curve.

Using Kohonen's self organizing feature map

Overview of the model

The model uses the self-organizing feature map proposed by Kohonen. The self-organizing map (SOM) uses unsupervised learning and is described in, for instance, Kohonen (1987) and Kohonen (1997). The idea of using the SOM directly in forecasting the daily load curve was proposed by Baumann and Germond (1993).

The forecasting consists of the training phase and the auto-association phase. In the training phase the weight vectors of the SOM organize themselves. The actual forecasting takes place in the auto-association phase. There, the forecast for a day is obtained by auto-associative memory. The network is provided with vectors half the length of the ones used in the training phase, and the network reconstructs the vectors providing at the same time the load forecast.

The weight vectors contain 48 scalars. A SOM consisting of n × n weight vectors is trained with load curves covering two consecutive days (48 hourly values). These training cases are taken from one year of load history. As the network organizes itself, the weight vectors specialize in the characteristics of different seasons of the year.

To make the model perform efficiently, seven separate networks are used in order to take the differences between the days of the week into account. For example, one network is specialized in forecasting Tuesday load curves, and this network is trained with vectors consisting of the load values of Monday–Tuesday pairs throughout the year.
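As a sketch of the training-set construction described above, the 48-element vectors for one weekday-specific network could be assembled as follows. The flat hourly load list, the function name, and the assumption that day 0 of the data is a Monday are all illustrative, not part of the original model description.

```python
# Sketch: build 48-element training vectors [previous day | target day]
# for one weekday-specific SOM. Assumes `loads` is a flat list of hourly
# values and that day 0 of the data is a Monday (both are assumptions).

def training_vectors(loads, target_weekday):
    """Return 48-element vectors for every occurrence of one weekday."""
    n_days = len(loads) // 24
    vectors = []
    for day in range(1, n_days):
        if day % 7 == target_weekday:  # weekday index with day 0 = Monday
            prev = loads[(day - 1) * 24 : day * 24]
            curr = loads[day * 24 : (day + 1) * 24]
            vectors.append(prev + curr)
    return vectors

# Example: two weeks of synthetic data give one vector per matching weekday.
loads = [float(h % 24) for h in range(14 * 24)]
tuesdays = training_vectors(loads, target_weekday=1)  # "Tuesday" network
print(len(tuesdays), len(tuesdays[0]))
```

Each such vector pairs a day with its predecessor, which is exactly the structure the auto-association phase later exploits.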

Once the training of all seven networks is completed, the load can be forecast in a simple way. The load curve of the day preceding the target day is considered as a vector of 24 elements. This is compared to the first 24 elements in all network weight vectors. The weight vector with the smallest Euclidean distance to the load curve is chosen, and the remaining elements of this best matching weight vector constitute the forecast.

Extreme load curves are problematic for the model. The spreading of the network weights over the whole space formed by the training examples takes a long time. For instance, in winter, when the load is at its highest, no neuron in the network will have weight values large enough to match the actual load values. Therefore, the forecast values will be too low.

To overcome the problem, a correction to the forecasts is needed. Baumann and Germond (1993) propose a trend correction. It is intended to take care of annual growth, but it also addresses the problem caused by extreme load curves. The forecast for a certain hour is adjusted as follows: the difference between the load of the corresponding hour on the previous day and the weight vector element of that hour in the best-matching neuron is taken, multiplied by a constant δ, and added to the preliminary forecast. From here on, this correction will be called the delta correction. Forecasting with the delta correction can be expressed as follows:

L̂_i = W_{i+24} + δ(L_{i−24} − W_i),  (4.6)

where

L̂_i = corrected load forecast for the i:th hour of the day
L_{i−24} = load of the i:th hour on the previous day
W_i = the i:th element of the weight vector of the best-matching neuron
δ = the constant multiplier used in the correction
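As a numeric sketch of the delta correction, assuming the best-matching weight vector is already known (the function name and the example values are illustrative):

```python
# Sketch of the delta correction of equation (4.6): the forecast for hour
# i is the weight element for the target-day hour, plus delta times the
# gap between yesterday's load and the matched half of the weight vector.

def delta_corrected_forecast(w, prev_day, delta):
    """w: best-matching 48-element weight vector; prev_day: 24 loads."""
    return [w[i + 24] + delta * (prev_day[i] - w[i]) for i in range(24)]

w = [100.0] * 24 + [110.0] * 24  # matched neuron (illustrative values)
prev_day = [120.0] * 24          # yesterday ran 20 MW above the neuron
fc = delta_corrected_forecast(w, prev_day, delta=0.5)
print(fc[0])  # 110 + 0.5 * 20
```

With δ = 0 the preliminary forecast is returned unchanged; with δ = 1 the full gap between yesterday and the matched neuron is carried over, which is how the correction compensates for neurons whose weights lag behind extreme load levels.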

Test results

The training and test sets are the same as used earlier in this chapter: May 24, 1996 – May 23, 1997 and May 24, 1997 – August 18, 1997. The performance of the model was tested with five different network sizes, and with eleven values for δ. The average percentage forecasting errors are given in figures 4.18 and 4.19. In figure 4.18, the network was trained by feeding the whole training set 50 times into each network (one network for each day of the week). In figure 4.19, 500 training epochs were used.

Figure 4.18: The average errors with different network sizes and delta values.

The training consisted of 50 epochs.

Figure 4.19: The average errors with different network sizes and delta values.

The training consisted of 500 epochs.

The results are better with 500 training epochs. Especially for the large network sizes, the long training appears to be necessary. The smallest forecasting errors are obtained


A simple selection model

The model can be dramatically simplified. The use of the self-organizing feature map is here replaced with a simple selection model. The load data of the day preceding the target day is compared to all days of the same day type in the history data. The one with the smallest Euclidean distance is selected, and the forecast is obtained by taking the next day in the load history.

The model is best explained through an example: suppose the load of a normal Thursday is to be predicted. The load of the Wednesday just before the target day is taken, and all normal Wednesdays in the database are compared with it. The Wednesday whose load is closest is selected, and the preliminary forecast for the target day is the load of the following Thursday.

The preliminary forecast is corrected with a delta correction similar to the one used with the self-organizing map in the previous section. If the load curve of the day before the target day (call this day i−1) is:

L(i−1, 1), L(i−1, 2), ..., L(i−1, 24)

and the load curve of the selected day with the smallest Euclidean distance to it (call this day s−1) is:

L(s−1, 1), L(s−1, 2), ..., L(s−1, 24)

then the forecast load curve is:

L̂(i, j) = L(s, j) + δ[L(i−1, j) − L(s−1, j)], for j = 1, ..., 24  (4.7)
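The whole selection model fits in a few lines of code; a minimal sketch, assuming the load history is stored as a mapping from day index to 24-hour curve and the candidate forecast days are supplied by the caller (the day numbering and example values are illustrative):

```python
import math

# Sketch of the selection model with the delta correction of equation
# (4.7). `history` maps day index -> 24-hour load curve; `candidates`
# lists the past days of the same day type as the target day.

def selection_forecast(history, target_day, candidates, delta):
    prev = history[target_day - 1]
    # pick the candidate whose preceding day is closest to yesterday
    s = min(candidates, key=lambda d: math.dist(prev, history[d - 1]))
    return [history[s][j] + delta * (prev[j] - history[s - 1][j])
            for j in range(24)]

history = {
    6: [100.0] * 24, 7: [105.0] * 24,    # one past Wednesday/Thursday pair
    13: [130.0] * 24, 14: [128.0] * 24,  # another pair
    20: [131.0] * 24,                    # yesterday (a Wednesday)
}
fc = selection_forecast(history, target_day=21, candidates=[7, 14], delta=0.5)
print(fc[0])  # 128 + 0.5 * (131 - 130)
```

Note the asymmetry: the distance is computed between the *preceding* days (i−1 and s−1), but the curve that is read out and corrected is that of day s itself, exactly as in equation (4.7).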

The model was tested on the same test period as the SOM-model. The average forecasting errors with different delta parameters are shown in figure 4.20:


Figure 4.20: The average forecasting errors for the test period May 24 – August 18, 1997.

It can be seen that this model achieves greater accuracy than the more complicated SOM model. The best result is obtained with δ = 0.5, where the average percentage error is 4.55 %. This is only slightly worse than the result obtained by forecasting the daily average load with an MLP network and combining it with the load shape prediction (section 4.2).

As an illustration, the actual load and the forecast with δ = 0.5 for the two-week period May 25 – June 7, 1997 are shown in figure 4.21. The average error is 4.04 %.

Figure 4.21: Actual load and forecast obtained with the selection model over the period May 25 – June 7, 1997. The average forecasting error is 4.04 %.

To illustrate the ability of this model to forecast the mere shape of the load, the forecasting was also performed under the assumption that the average load of the target day is known in advance. In that case, the forecast load curve is:

L̂(i, j) = L(s, j) + [L_ave(i) − L_ave(s)], for j = 1, ..., 24  (4.8)

where L_ave(i) is the known average load of the target day and L_ave(s) is the average load of day s.

The average error over the test period was 3.20 %. This is slightly worse than with the model described in section 4.1.
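The shift in equation (4.8) amounts to recentring the matched curve on the known average; a minimal sketch, with the function name and values chosen for illustration:

```python
# Sketch of equation (4.8): with the target day's average load known in
# advance, the matched day's curve is shifted by the difference of the
# two daily averages, so only the load *shape* comes from the history.

def shape_forecast(matched_curve, target_avg):
    """Shift the matched curve so its mean equals the known average."""
    matched_avg = sum(matched_curve) / len(matched_curve)
    return [v + (target_avg - matched_avg) for v in matched_curve]

matched = [90.0] * 12 + [110.0] * 12  # matched day s, average 100 MW
fc = shape_forecast(matched, target_avg=120.0)
print(fc[0])  # 90 + (120 - 100)
```

After the shift the forecast curve has exactly the prescribed daily average, so any remaining error measures purely how well the shape of the matched day transfers to the target day.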

