C++ Neural Networks and Fuzzy Logic




bnrn array and get the corresponding output vector as a binary pattern. If this is the Y in the exemplar pair, the network has made the desired association in one direction, and we go on to the next step. Otherwise we have a potential associated pair, one of which is X and the other is what we just got as the output vector in the opposite layer. We say potential associated pair because we have a further step to confirm the association.

  We run the bnrn array through the transpose of the weight matrix and calculate the outputs of the anrn array elements. If, as a result, we get the vector X back in the anrn array, we have found an associated pair, (X, Y). Otherwise, we repeat the two steps just described until we find an associated pair.
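The encoding and the two-step recall just described can be sketched standalone in modern C++ (a hedged illustration using the chapter's fit vectors; the helper names encode and recall are mine, not the book's):

```cpp
#include <algorithm>

#define XDIM 6
#define YDIM 4

// Fuzzy Hebb encoding of one fit-vector pair: W[i][j] = min(x_i, y_j).
void encode(const float *x, const float *y, float w[XDIM][YDIM]) {
    for (int i = 0; i < XDIM; ++i)
        for (int j = 0; j < YDIM; ++j)
            w[i][j] = std::min(x[i], y[j]);
}

// Forward recall by max-min composition: out_j = max_i min(in_i, W[i][j]).
// Recall in the reverse direction runs the same composition through the
// transpose of W.
void recall(const float *in, const float w[XDIM][YDIM], float *out) {
    for (int j = 0; j < YDIM; ++j) {
        out[j] = 0.0f;
        for (int i = 0; i < XDIM; ++i)
            out[j] = std::max(out[j], std::min(in[i], w[i][j]));
    }
}
```

Feeding the X exemplar to recall reproduces its Y partner, which is exactly the confirmation the procedure above looks for before declaring an associated pair.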



  We now work with the next pair of exemplar vectors in the same manner as above, to find an

associated pair.



  We assign serial numbers, denoted by the variable idn, to the associated pairs so we can print them all together at the end of the program. The pair is called (X, Y), where X produces Y through the weight matrix W, and Y produces X through the transpose of W.

  A flag holds the value 0 until an association is confirmed, at which point its value changes to 1.



  Functions compr1 and compr2 in the network class verify if the potential pair is indeed an

associated pair and set the proper value of the flag mentioned above.



  Functions comput1 and comput2 in the network class carry out the calculations to get the

activations and then find the output vector, in the respective directions of the fuzzy associative

memory network.

A lot of the code from the bidirectional associative memory (BAM) is used for the FAM. Here are the listings,

with comments added where there are differences between this code and the code for the BAM of Chapter 8.

Copyright © IDG Books Worldwide, Inc.


C++ Neural Networks and Fuzzy Logic

by Valluru B. Rao

MTBooks, IDG Books Worldwide, Inc.



ISBN: 1558515526   Pub Date: 06/01/95




Header File

Listing 9.1 fuzzyam.h

//fuzzyam.h   V. Rao, H. Rao

#include <iostream.h>

#define MXSIZ 10

class fzneuron

{

protected:



       int nnbr;

       int inn,outn;

       float output;

       float activation;

       float outwt[MXSIZ];

       char *name;

       friend class network;

public:


       fzneuron() { };

       void getnrn(int,int,int,char *);

};

class exemplar



{

protected:

       int xdim,ydim;

       float v1[MXSIZ],v2[MXSIZ];   // this is different from BAM

       friend class network;

public:


       exemplar() { };

       void getexmplr(int,int,float *,float *);

       void prexmplr();

};

class asscpair



{

protected:

       int xdim,ydim,idn;

       float v1[MXSIZ],v2[MXSIZ];

       friend class network;

public:


       asscpair() { };

       void getasscpair(int,int,int);

       void prasscpair();

};



class potlpair

{

protected:



       int xdim,ydim;

       float v1[MXSIZ],v2[MXSIZ];

       friend class network;

public:


       potlpair() { };

       void getpotlpair(int,int);

       void prpotlpair();

};

class network



{

public:


       int  anmbr,bnmbr,flag,nexmplr,nasspr,ninpt;

       fzneuron (anrn)[MXSIZ],(bnrn)[MXSIZ];

       exemplar (e)[MXSIZ];

       asscpair (as)[MXSIZ];

       potlpair (pp)[MXSIZ];

       float outs1[MXSIZ],outs2[MXSIZ];   // change from BAM to floats

       double mtrx1[MXSIZ][MXSIZ],mtrx2[MXSIZ][MXSIZ]; // change from BAM to doubles

       network() { };

       void getnwk(int,int,int,float [][6],float [][4]);

       void compr1(int,int);

       void compr2(int,int);

       void prwts();

       void iterate();

       void findassc(float *);

       void asgninpt(float *);

       void asgnvect(int,float *,float *);

       void comput1();

       void comput2();

       void prstatus();

};

Source File

Listing 9.2 fuzzyam.cpp

//fuzzyam.cpp   V. Rao, H. Rao

#include "fuzzyam.h"

float max(float x,float y)   //new for FAM

{

float u;


u = ((x>y) ? x : y );

return u;

}

float min(float x,float y)      // new for FAM



{

float u;


u =( (x>y) ? y : x) ;

return u;

}



void fzneuron::getnrn(int m1,int m2,int m3,char *y)

{

int i;



name = y;

nnbr = m1;

outn = m2;

inn  = m3;

for(i=0;i<outn;++i){
       outwt[i] = 0 ;
       }

output = 0;

activation = 0;

}

void exemplar::getexmplr(int k,int l,float *b1,float *b2)    // changed from BAM
{
int i2;

xdim = k;
ydim = l;
for(i2=0;i2<xdim;++i2){
       v1[i2] = b1[i2]; }
for(i2=0;i2<ydim;++i2){
       v2[i2] = b2[i2]; }
}

void exemplar::prexmplr()
{
int i;

cout<<"\nX vector you gave is:\n";
for(i=0;i<xdim;++i){
       cout<<v1[i]<<"  ";}
cout<<"\nY vector you gave is:\n";
for(i=0;i<ydim;++i){
       cout<<v2[i]<<"  ";}
cout<<"\n";
}

void asscpair::getasscpair(int i,int j,int k)



{

idn = i;


xdim = j;

ydim = k;

}

void asscpair::prasscpair()
{
int i;

cout<<"\nX vector in the associated pair no. "<<idn<<" is:\n";
for(i=0;i<xdim;++i){
       cout<<v1[i]<<"  ";}
cout<<"\nY vector in the associated pair no. "<<idn<<" is:\n";
for(i=0;i<ydim;++i){
       cout<<v2[i]<<"  ";}
cout<<"\n";
}

void potlpair::getpotlpair(int k,int j)



{

xdim = k;

ydim = j;

}

void potlpair::prpotlpair()
{
int i;

cout<<"\nX vector in possible associated pair is:\n";
for(i=0;i<xdim;++i){
       cout<<v1[i]<<"  ";}
cout<<"\nY vector in possible associated pair is:\n";
for(i=0;i<ydim;++i){
       cout<<v2[i]<<"  ";}
cout<<"\n";
}

void network::getnwk(int k,int l,int k1,float b1[][6],float b2[][4])
{
anmbr = k;
bnmbr = l;
nexmplr = k1;
nasspr = 0;
ninpt = 0;
int i,j,i2;
float tmp1,tmp2;
flag =0;
char *y1="ANEURON", *y2="BNEURON" ;

for(i=0;i<nexmplr;++i){
       e[i].getexmplr(anmbr,bnmbr,b1[i],b2[i]);
       e[i].prexmplr();
       cout<<"\n";
       }
for(i=0;i<anmbr;++i){
       anrn[i].fzneuron::getnrn(i,bnmbr,0,y1);}
for(i=0;i<bnmbr;++i){
       bnrn[i].fzneuron::getnrn(i,0,anmbr,y2);}
for(i=0;i<anmbr;++i){
       for(j=0;j<bnmbr;++j){
              tmp1 = 0.0;
              for(i2=0;i2<nexmplr;++i2){
                      tmp2 = min(e[i2].v1[i],e[i2].v2[j]);
                      tmp1 = max(tmp1,tmp2);
              }
              mtrx1[i][j] = tmp1;
              mtrx2[j][i] = mtrx1[i][j];
              anrn[i].outwt[j] = mtrx1[i][j];
              bnrn[j].outwt[i] = mtrx2[j][i];
              }
       }
prwts();
cout<<"\n";
}

void network::asgninpt(float *b)



{

int i,j;


cout<<"\n";

for(i=0;i<anmbr;++i){
       anrn[i].output = b[i];
       outs1[i] = b[i];
       }

}

void network::compr1(int j,int k)
{
int i;

for(i=0;i<pp[j].xdim;++i){
       if(pp[j].v1[i] != pp[k].v1[i]) flag = 1;
       break;
       }
}

void network::compr2(int j,int k)

{

int i;


for(i=0;i<pp[j].ydim;++i){
       if(pp[j].v2[i] != pp[k].v2[i]) flag = 1;
       break;}

}

void network::comput1()   //changed from BAM
{
int j;

for(j=0;j<bnmbr;++j){
       int ii1;
       float c1 =0.0,d1;
       cout<<"\n";
       for(ii1=0;ii1<anmbr;++ii1){
              d1 = min(outs1[ii1],mtrx1[ii1][j]);
              c1 = max(c1,d1);
              }
       bnrn[j].activation = c1;
       cout<<"\n output layer neuron  "<<j<<" activation is "
              <<bnrn[j].activation<<"\n";
       bnrn[j].output = bnrn[j].activation;
       outs2[j] = bnrn[j].output;
       cout<<"\n output layer neuron  "<<j<<" output is "
              <<bnrn[j].output<<"\n";
       }
}

void network::comput2()             //changed from BAM
{
int i;

for(i=0;i<anmbr;++i){
       int ii1;
       float c1=0.0,d1;
       for(ii1=0;ii1<bnmbr;++ii1){
              d1 = min(outs2[ii1],mtrx2[ii1][i]);
              c1 = max(c1,d1);}
       anrn[i].activation = c1;
       cout<<"\ninput layer neuron "<<i<<" activation is "
              <<anrn[i].activation<<"\n";
       anrn[i].output = anrn[i].activation;
       outs1[i] = anrn[i].output;
       cout<<"\n input layer neuron  "<<i<<" output is "
              <<anrn[i].output<<"\n";
       }
}

void network::asgnvect(int j1,float *b1,float *b2)

{

int  j2;


for(j2=0;j2<j1;++j2){
       b2[j2] = b1[j2];}

}

void network::prwts()
{
int i3,i4;

cout<<"\n  weights--  input layer to output layer: \n\n";
for(i3=0;i3<anmbr;++i3){
       for(i4=0;i4<bnmbr;++i4){
              cout<<anrn[i3].outwt[i4]<<"  ";}
       cout<<"\n"; }
cout<<"\n";
cout<<"\nweights--  output layer to input layer: \n\n";
for(i3=0;i3<bnmbr;++i3){
       for(i4=0;i4<anmbr;++i4){
              cout<<bnrn[i3].outwt[i4]<<"  ";}
       cout<<"\n";  }
cout<<"\n";
}

void network::iterate()
{
int i1;

for(i1=0;i1<nexmplr;++i1){
       findassc(e[i1].v1);
       }
}

void network::findassc(float *b)
{
int j;

flag = 0;
asgninpt(b);
ninpt ++;
cout<<"\nInput vector is:\n" ;
for(j=0;j<6;++j){
       cout<<b[j]<<" ";}
cout<<"\n";
pp[0].getpotlpair(anmbr,bnmbr);
asgnvect(anmbr,outs1,pp[0].v1);
comput1();
if(flag>=0){
       asgnvect(bnmbr,outs2,pp[0].v2);
       cout<<"\n";
       pp[0].prpotlpair();
       cout<<"\n";
       comput2(); }
for(j=1;j<MXSIZ;++j){
       pp[j].getpotlpair(anmbr,bnmbr);
       asgnvect(anmbr,outs1,pp[j].v1);
       comput1();
       asgnvect(bnmbr,outs2,pp[j].v2);
       pp[j].prpotlpair();
       cout<<"\n";
       compr1(j,j-1);
       compr2(j,j-1);
       if(flag == 0) {
              int j2;
              nasspr += 1;
              j2 = nasspr;
              as[j2].getasscpair(j2,anmbr,bnmbr);
              asgnvect(anmbr,pp[j].v1,as[j2].v1);
              asgnvect(bnmbr,pp[j].v2,as[j2].v2);
              cout<<"\nPATTERNS ASSOCIATED:\n";
              as[j2].prasscpair();
              j = MXSIZ ;
              }
       else
              if(flag == 1)
                     {
                     flag = 0;
                     comput1();
                     }
       }
}

void network::prstatus()



{

int j;


cout<<"\nTHE FOLLOWING ASSOCIATED PAIRS WERE FOUND BY FUZZY AM\n\n";

for(j=1;j<=nasspr;++j){

       as[j].prasscpair();

       cout<<"\n";}

}

void main()
{
int ar = 6, br = 4, nex = 1;
float inptv[][6]={0.1,0.3,0.2,0.0,0.7,0.5,0.6,0.0,0.3,0.4,0.1,0.2};
float outv[][4]={0.4,0.2,0.1,0.0};

cout<<"\n\nTHIS PROGRAM IS FOR A FUZZY ASSOCIATIVE MEMORY NETWORK. THE NETWORK \n";
cout<<"IS SET UP FOR ILLUSTRATION WITH "<<ar<<" INPUT NEURONS, AND "<<br;
cout<<" OUTPUT NEURONS.\n"<<nex<<" exemplars are used to encode \n";
static network famn;
famn.getnwk(ar,br,nex,inptv,outv);
famn.iterate();
famn.findassc(inptv[1]);
famn.prstatus();
}






Output

The illustrative run of the previous program uses the fuzzy sets with fit vectors (0.1, 0.3, 0.2, 0.0, 0.7, 0.5) and (0.4, 0.2, 0.1, 0.0). As you can expect from the earlier discussion, recall is not perfect in the reverse direction, and the fuzzy associative memory consists of the pairs (0.1, 0.3, 0.2, 0.0, 0.4, 0.4) with (0.4, 0.2, 0.1, 0.0) and (0.1, 0.2, 0.2, 0, 0.2, 0.2) with (0.2, 0.2, 0.1, 0). The computer output is detailed enough to be self-explanatory.

THIS PROGRAM IS FOR A FUZZY ASSOCIATIVE MEMORY NETWORK. THE NETWORK IS

SET UP FOR ILLUSTRATION WITH SIX INPUT NEURONS, AND FOUR OUTPUT NEURONS.

1 exemplars are used to encode

X vector you gave is:

0.1  0.3  0.2  0  0.7  0.5

Y vector you gave is:

0.4  0.2  0.1  0

  weights--input layer to output layer:

0.1  0.1  0.1  0

0.3  0.2  0.1  0

0.2  0.2  0.1  0

0    0    0    0

0.4  0.2  0.1  0

0.4  0.2  0.1  0

weights--output layer to input layer:

0.1  0.3  0.2  0  0.4  0.4

0.1  0.2  0.2  0  0.2  0.2

0.1  0.1  0.1  0  0.1  0.1

0    0    0    0  0    0

Input vector is:

0.1 0.3 0.2 0 0.7 0.5

output layer neuron  0 activation is 0.4

output layer neuron  0 output is 0.4

output layer neuron  1 activation is 0.2

output layer neuron  1 output is 0.2

output layer neuron  2 activation is 0.1

output layer neuron  2 output is 0.1

output layer neuron  3 activation is 0


 output layer neuron  3 output is 0

X vector in possible associated pair is:

0.1  0.3  0.2  0  0.7  0.5

Y vector in possible associated pair is:

0.4  0.2  0.1  0

input layer neuron 0 activation is 0.1

input layer neuron  0 output is 0.1

input layer neuron 1 activation is 0.3

input layer neuron  1 output is 0.3

input layer neuron 2 activation is 0.2

input layer neuron  2 output is 0.2

input layer neuron 3 activation is 0

input layer neuron  3 output is 0

input layer neuron 4 activation is 0.4

input layer neuron  4 output is 0.4

input layer neuron 5 activation is 0.4

input layer neuron  5 output is 0.4

output layer neuron  0 activation is 0.4

output layer neuron  0 output is 0.4

output layer neuron  1 activation is 0.2

output layer neuron  1 output is 0.2

output layer neuron  2 activation is 0.1

output layer neuron  2 output is 0.1

output layer neuron  3 activation is 0

output layer neuron  3 output is 0

X vector in possible associated pair is:

0.1  0.3  0.2  0  0.4  0.4

Y vector in possible associated pair is:

0.4  0.2  0.1  0

PATTERNS ASSOCIATED:

X vector in the associated pair no. 1 is:

0.1  0.3  0.2  0  0.4  0.4

Y vector in the associated pair no. 1 is:

0.4  0.2  0.1  0

Input vector is:

0.6 0 0.3 0.4 0.1 0.2


 output layer neuron  0 activation is 0.2

 output layer neuron  0 output is 0.2

 output layer neuron  1 activation is 0.2

 output layer neuron  1 output is 0.2

 output layer neuron  2 activation is 0.1

 output layer neuron  2 output is 0.1

 output layer neuron  3 activation is 0

 output layer neuron  3 output is 0

X vector in possible associated pair is:

0.6  0  0.3  0.4  0.1  0.2

Y vector in possible associated pair is:

0.2  0.2  0.1  0

input layer neuron 0 activation is 0.1

 input layer neuron  0 output is 0.1

input layer neuron 1 activation is 0.2

 input layer neuron  1 output is 0.2

input layer neuron 2 activation is 0.2

 input layer neuron  2 output is 0.2

input layer neuron 3 activation is 0

 input layer neuron  3 output is 0

input layer neuron 4 activation is 0.2

 input layer neuron  4 output is 0.2

input layer neuron 5 activation is 0.2

 input layer neuron  5 output is 0.2

 output layer neuron  0 activation is 0.2

 output layer neuron  0 output is 0.2

 output layer neuron  1 activation is 0.2

 output layer neuron  1 output is 0.2

 output layer neuron  2 activation is 0.1

 output layer neuron  2 output is 0.1

 output layer neuron  3 activation is 0

 output layer neuron  3 output is 0

X vector in possible associated pair is:



0.1  0.2  0.2  0  0.2  0.2

Y vector in possible associated pair is:

0.2  0.2  0.1  0

 output layer neuron  0 activation is 0.2

 output layer neuron  0 output is 0.2

 output layer neuron  1 activation is 0.2

 output layer neuron  1 output is 0.2

 output layer neuron  2 activation is 0.1

 output layer neuron  2 output is 0.1

 output layer neuron  3 activation is 0

 output layer neuron  3 output is 0

 output layer neuron  0 activation is 0.2

 output layer neuron  0 output is 0.2

 output layer neuron  1 activation is 0.2

 output layer neuron  1 output is 0.2

 output layer neuron  2 activation is 0.1

 output layer neuron  2 output is 0.1

 output layer neuron  3 activation is 0

 output layer neuron  3 output is 0

X vector in possible associated pair is:

0.1  0.2  0.2  0  0.2  0.2

Y vector in possible associated pair is:

0.2  0.2  0.1  0

PATTERNS ASSOCIATED:

X vector in the associated pair no. 2 is:

0.1  0.2  0.2  0  0.2  0.2

Y vector in the associated pair no. 2 is:

0.2  0.2  0.1  0

THE FOLLOWING ASSOCIATED PAIRS WERE FOUND BY FUZZY AM

X vector in the associated pair no. 1 is:

0.1  0.3  0.2  0  0.4  0.4

Y vector in the associated pair no. 1 is:

0.4  0.2  0.1  0

X vector in the associated pair no. 2 is:

0.1  0.2  0.2  0  0.2  0.2

Y vector in the associated pair no. 2 is:

0.2  0.2  0.1  0



Summary

In this chapter, bidirectional associative memories are presented for fuzzy subsets. The development of these is largely due to Kosko. They share with Adaptive Resonance Theory the feature of resonance between the two layers in the network. Even though there are connections in both directions between neurons in the two layers, only one weight matrix is involved. You use the transpose of this weight matrix for the connections in the opposite direction. When an input at one end leads to some output at the other, which in turn leads to an output that is the same as the previous input, resonance is reached and an associated pair is found. In the case of bidirectional fuzzy associative memories, one pair of fuzzy sets determines one fuzzy associative memory system. Fit vectors are used in max-min composition. Perfect recall in both directions is not obtained unless the heights of both fit vectors are equal. Fuzzy associative memories can improve the performance of an expert system by allowing fuzzy rules.
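The height condition can be checked numerically with the chapter's own fit vectors. This is my sketch, not the book's code: height(X) = 0.7 exceeds height(Y) = 0.4, so every component of X recalled back through the transpose is capped at 0.4.

```cpp
#include <algorithm>

// Reverse (Y -> X) max-min recall through W^T, where the single-pair
// encoding gives W[i][j] = min(x_i, y_j); returns component i of the
// recalled X vector.
float reverse_recall(const float *x, const float *y, int ydim, int i) {
    float out = 0.0f;
    for (int j = 0; j < ydim; ++j) {
        float w = std::min(x[i], y[j]);          // W[i][j]
        out = std::max(out, std::min(y[j], w));  // max-min composition
    }
    return out;
}
```

With x = (0.1, 0.3, 0.2, 0, 0.7, 0.5) and y = (0.4, 0.2, 0.1, 0), components at or below 0.4 come back unchanged, while 0.7 and 0.5 both come back as 0.4, reproducing the pair (0.1, 0.3, 0.2, 0, 0.4, 0.4) found in the run above.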





Chapter 10

Adaptive Resonance Theory (ART)

Introduction

Grossberg’s Adaptive Resonance Theory, developed further by Grossberg and Carpenter, is for the

categorization of patterns using the competitive learning paradigm. It introduces a gain control and a reset to

make certain that learned categories are retained even while new categories are learned and thereby addresses

the plasticity–stability dilemma.

Adaptive Resonance Theory makes much use of a competitive learning paradigm. A criterion is developed to facilitate the occurrence of the winner-take-all phenomenon. A single node with the largest value for the set criterion is declared the winner within its layer, and it is said to classify a pattern class. If there is a tie for the winning neuron in a layer, then an arbitrary rule, such as taking the first of them in serial order, can decide the winner.
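Such a winner selection can be sketched in a few lines (my own illustration of the tie rule, not the book's code):

```cpp
// Winner-take-all: index of the largest activation; on a tie, the
// first node in serial order wins.
int winner(const float *act, int n) {
    int w = 0;
    for (int i = 1; i < n; ++i)
        if (act[i] > act[w]) w = i;  // strict > keeps the earlier node on ties
    return w;
}
```

The strict comparison implements the "first in serial order" tie rule mentioned above: a later node must strictly exceed the current winner to displace it.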


The neural network developed for this theory establishes a system that is made up of two subsystems, one

being the attentional subsystem, and this contains the unit for gain control. The other is an orienting

subsystem, and this contains the unit for reset. During the operation of the network modeled for this theory,

patterns emerge in the attentional subsystem and are called traces of STM (short−term memory). Traces of

LTM (long−term memory) are in the connection weights between the input layer and output layer.

The network uses processing with feedback between its two layers until resonance occurs. Resonance occurs when the output in the first layer, after feedback from the second layer, matches the original pattern used as input for the first layer in that processing cycle. A match of this type does not have to be perfect. What is required is that the degree of match, measured suitably, exceeds a predetermined level, termed the vigilance parameter. Just as a photograph matches the likeness of the subject to a greater degree when the granularity is higher, the pattern match gets finer when the vigilance parameter is closer to 1.
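For binary patterns, a common form of this test (a hedged sketch of the standard ART1 criterion, not code from this book) counts how many of the input's 1-bits survive the AND with a category's top-down weights, and compares that fraction with the vigilance parameter:

```cpp
// Degree of match for binary ART1: |input AND weights| / |input|.
// Accept the category when the ratio reaches the vigilance level;
// otherwise the orienting subsystem issues a reset.
bool vigilance_test(const int *input, const int *weights, int n,
                    float vigilance) {
    int both = 0, inbits = 0;
    for (int i = 0; i < n; ++i) {
        inbits += input[i];
        both   += input[i] & weights[i];
    }
    if (inbits == 0) return false;  // empty input matches nothing
    return (float)both / (float)inbits >= vigilance;
}
```

For input (1, 1, 0, 1) and weights (1, 0, 0, 1), the degree of match is 2/3, so a vigilance of 0.6 accepts the category while a vigilance of 0.9 forces a reset.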


