C++ Neural Networks and Fuzzy Logic




G

Gain

A numerical factor used to enhance the activation, or sometimes a connection serving the same purpose.

Generalized Delta rule

A rule used in training networks such as backpropagation networks, in which hidden-layer weights are modified with backpropagated error.

Global minimum

A point where the value of a function is no greater than the value at any other point in the domain of

the function.

H

Hamming distance

The number of places in which two binary vectors differ from each other.



Hebbian learning

A learning algorithm in which Hebb’s rule is used. The change in connection weight between two

neurons is taken as a constant times the product of their outputs.

Heteroassociative

Making an association between two distinct patterns or objects.



Hidden layer

An array of neurons positioned in between the input and output layers.



Hopfield network

A single layer, fully connected, autoassociative neural network.



I

Inhibition

The attempt by one neuron to diminish another neuron's chances of firing.



Input layer

An array of neurons to which an external input or signal is presented.



C++ Neural Networks and Fuzzy Logic:Preface

Glossary

425

Instar

A neuron that has no connections going from it to other neurons.

L

Lateral connection

A connection between two neurons that belong to the same layer.



Layer

An array of neurons positioned similarly in a network for its operation.



Learning

The process of finding an appropriate set of connection weights to achieve the goal of the network

operation.

Linearly separable

Two subsets of points having a linear barrier (a hyperplane) between them.



LMS rule

Least mean squared error rule, with the aim of minimizing the average of the squared error. Same as

the Delta rule.

Local minimum

A point where the value of the function is no greater than the value at any other point in its

neighborhood.

Long-term memory (LTM)

Encoded information that is retained for an extended period.



Lyapunov function

A function, bounded below, of the state of a system, whose value decreases with every change in the state of the system.

M

Madaline

A neural network in which the input layer has units that are Adalines; the name is short for multiple Adaline.



Mapping

A correspondence between elements of two sets.



N

Neural network

A collection of processing elements arranged in layers, and a collection of connection edges between

pairs of neurons. Input is received at one layer, and output is produced at the same or at a different

layer.


Noise

Distortion of an input.



Nonlinear optimization

Finding the best solution for a problem that has a nonlinear function in its objective or in a constraint.



O

On center off surround

Assignment of excitatory weights to connections to nearby neurons and inhibitory weights to

connections to distant neurons.

Orthogonal vectors

Vectors whose dot product is 0.


Outstar

A neuron that has no incoming connections.



P

Perceptron

A neural network for linear pattern matching.



Plasticity

Ability to be stimulated by new inputs and learn new mappings or modify existing ones.



R

Resonance

The mutual responsiveness of two neurons in different layers in categorizing an input; an equilibrium in two directions.

S

Saturation

A condition that limits the frequency with which a neuron can fire.



Self-organization

A process of partitioning the output layer neurons to correspond to individual patterns or categories,

also called unsupervised learning or clustering.

Short-term memory (STM)

The storage of information that does not endure long after removal of the corresponding input.



Simulated annealing

An algorithm in which random changes of state are accepted under a gradually decreasing temperature, so that energy or cost decreases while avoiding entrapment in local minima.



Stability

Convergence of a network operation to a steady−state solution.



Supervised learning

A learning process in which the exemplar set consists of pairs of inputs and desired outputs.



T

Threshold value

A value against which the activation of a neuron is compared to determine whether the neuron fires. Sometimes a bias value is added to the activation so that the threshold can be taken as zero in determining the neuron’s output.



Training

The process of helping a neural network learn, either by providing input/output stimuli (supervised training) or by providing input stimuli alone (unsupervised training), and allowing weight change updates to occur.



U

Unsupervised learning

Learning in the absence of external information on outputs, also called self−organization or

clustering.



V

Vigilance parameter

A parameter used in Adaptive Resonance Theory. It is used to selectively prevent the activation of a

subsystem in the network.

W

Weight

A number associated with a neuron or with a connection between two neurons, which is used in

aggregating the outputs to determine the activation of a neuron.

Copyright © IDG Books Worldwide, Inc.

C++ Neural Networks and Fuzzy Logic

by Valluru B. Rao

MTBooks, IDG Books Worldwide, Inc.



ISBN: 1558515526   Pub Date: 06/01/95



Index

A

ABAM see Adaptive Bi−directional Associative Memory

abstract class, 138

abstract data type, 22

accumulation-distribution, 402, 404

activation, 3, 18, 89

zero, 182

Adaline, 81, 82, 102, 103, 112

adaptation, 77, 120

Adaptive Bi−directional Associative Memory, 181, 212

Competitive, 212

Differential

Competitive, 212

Hebbian, 212

adaptive filters, 120

adaptive linear element, 102

adaptive models, 120

adaptive parameters, 373

adaptive resonance, 118

Adaptive Resonance Theory I, 104, 107, 108, 115, 117, 243

algorithm for calculations, 246

equations for, 246

F1 layer calculations, 247

F2 layer calculations, 247

modifying connection weights, 248

Top−down inputs, 247

two−thirds rule, 244, 245

Adaptive Resonance Theory II, 248

Adaptive Resonance Theory III, 248

additive composition method, 506

advancing issues, 387

aggregation, 82, 87, 98

Ahmadian, 513

Aiken, 405

algorithm, 1

backpropagation, 7, 103, 271, 373, 375

constructive, 121

data clustering, 245

encoding, 94, 96

C++ Neural Networks and Fuzzy Logic:Preface

Index

429


learning algorithm, 61, 79, 102, 118

adaptive steepest descent, 410

alpha, 273, 274, 330, 372, 373, 384

alphabet classifier system, 320

Amari, 104

analog, 73, 74, 322

signal, 8

AND operation, 64

Anderson, 105

annealing

process, 430

schedule, 113

simulated annealing, 113, 114

Anzai, 456, 472

application, 102, 374, 511

nature of, 74

approximation, 109

architecture, 7, 81, 120, 373

Artificial Intelligence, 6, 34

artificial neural network, 1

ART1 see Adaptive Resonance Theory I

ART2 see Adaptive Resonance Theory II

ART3 see Adaptive Resonance Theory III

artneuron class, 249

ASCII, 305, 306, 307, 329

graphic characters, 306

assocpair class

in BAM network, 186

association, 218

asynchronously, 1, 14

asynchronous update, 13, 62

attentional subsystem, 107, 243

Augusteijn, 512

autoassociation, 7, 8, 82, 92, 102, 180

autoassociative network, 7, 64, 97, 375

average winner distance, 296

Azoff, 410

B

backpropagated errors, 144

Backpropagation, 10, 103, 104, 120, 123, 302, 325, 329, 374

algorithm, 7, 103, 271, 373, 375

beta, 330

calculating error, 396

calculations, 128, 130

changing beta while training, 337

choosing middle layer size, 372

convergence, 372

momentum term, 330



noise factor, 336

self−supervised, 375

simulator, 134, 173, 337, 375, 377, 396

training and testing, 396

training law, 333

variations of, 373

Baer, 516

BAM see Bi−directional Associative Memory

bar

chart, 403, 404



features, 513

Barron’s, 377, 378, 388

base class, 25, 28, 138

beta, 136, 337, 372, 373, 384

bias, 16, 77, 125, 128, 378

Bi−directional Associative Memory, 81, 92, 104, 115, 117, 179, 185, 215

connection weight matrix, 212

continuous, 211

inputs, 180

network, 104

outputs, 180

training, 181

Unipolar Binary, 212

bin, 325


binary, 8, 15, 51, 65, 73, 74, 104

input pattern, 51, 98

patterns, 11, 97

string, 16, 62

binary to bipolar mapping, 62, 63

binding


dynamic binding, 24, 139

late binding, 24

static binding, 139

bipolar, 15, 17, 104

mapping, 97

string, 16, 62, 180

bit pattern, 13

Blending problem, 418

block averages, 393

bmneuron class, 186

Boltzmann distribution, 113

Boltzmann machine, 92, 112, 113, 118, 419, 512

Booch, 21

boolean logic, 50

bottom−up

connection weight, 248

connections, 107, 244

Box−Jenkins methodology, 406

Brain−State−in−a−Box, 81, 82, 105

breadth, 387, 389

Buckles, 484



buy/sell signals, 409, 410

C

cache, 137

Cader, 406

CAM see Content−Addressable−Memory

Carpenter, 92, 107, 117, 243, 269, 517

car navigation, 374

cartesian product, 479

cascade−correlation, 512

Cash Register game, 3, 65

categorization of inputs, 261

category, 37

Cauchy distribution, 113

Cauchy machine, 112, 113, 419

cells


complex cells, 106

simple cells, 106

center of area method, 504

centroid, 507, 508

character recognition, 305

characters

alphabetic, 305

ASCII, 306, 307

garbled, 322

graphic, 306, 307

handwritten, 320

Chawla, 514

Chiang, 513

cin, 25, 58, 71

clamping probabilities, 114

Clinton, Hillary, 405

C language, 21

class, 22

abstract class, 138

base class, 25, 28, 138, 139

derived class, 23, 25, 26, 144

friend class, 23

hierarchy, 27, 138, 139

input_layer class, 138

iostream class, 71

network class, 53, 66

output_layer class, 138

parent class, 26

classification, 322

C layer, 106

codebook vectors, 116

Cohen, 212

Collard, 405



column vector, 97

combinatorial problem, 422

comparison layer, 244

competition, 9, 94, 97

competitive learning, 243

compilers

C++ compilers, 27

compilation error messages, 27

complement, 33, 185, 201, 202

complex cells, 106

composition

max−min, 220

compressor, 375

Computer Science, 34

conditional fuzzy

mean, 491

variance, 491

conjugate gradient methods, 373

conjunction operator, 505

connections, 2, 93, 102

bottom−up, 107

excitatory, 272

feedback, 82

inhibitory, 272

lateral, 93, 97, 107, 272, 276

recurrent, 82, 107, 179

top−down, 107

connection weight matrix, 220

connection weights, 89, 98

conscience factor, 302

constraints, 417

constructor, 23, 28, 55, 66

default constructor, 23

Consumer Price Index, 387

Content−Addressable−Memory, 5

continuous

Bi−directional Associative Memory, 211

models, 98

convergence, 78, 79, 96, 118, 119, 132, 323, 372, 373, 375

cooperation, 9, 94

correlation matrix, 9, 63

correlation−product encoding, 220

cost function, 124, 373

Cottrell, 374

counterpropagation, 106

network, 92, 93, 302

cout, 25, 58

C++, 21


classes, 138

code, 36


comments, 58



compilers, 27

implementation, 185

crisp, 31, 73, 74

data sets, 475

rules, 48, 217

values, 50

cube, 84, 87, 89, 90

cum_deltas, 331

cycle, 78, 125

learning cycle, 103

cyclic information, 380

D

data biasing, 378

data hiding, 21, 22

data clustering, 109, 245

data completion, 102

data compression, 102, 302

Deboeck, 406

Davalo, 457

de la Maza, 410

Decision support systems, 75

declining issues, 387

decompressor, 375

decorrelation, 384

default


constructor, 23, 66

destructor, 23

defuzzification, 504, 506

degenerate tour, 424

degree of

certainty, 31

membership, 32, 477

degrees of freedom, 383

delete, 24, 144

delta rule, 110−113

derived class, 23, 25, 26, 144

descendant, 139, 143

DeSieno, 302

destructor, 23, 24

digital signal processing boards, 325

dimensionality, 381, 382, 384

directed graph, 65

discount rate, 35

discretization of a character, 98

discrete models, 98

discriminator, 517

disjunction operator, 506

display_input_char function, 308



display_winner_weights function, 308

distance


Euclidean, 13

Hamming, 13

DJIA see Dow Jones Industrial Average

DOF see degrees of freedom

domains, 479, 484

dot product, 11, 12, 51, 64, 65

Dow Jones Industrial Average, 378, 386

dual confirmation trading system, 408

dynamic allocation of memory, 24

dynamic binding, 24, 139

dynamics

adaptive, 74

nonadaptive, 74

E

EMA see exponential moving average

encapsulate, 29

encapsulation, 21, 22

encode, 375

encoding, 7, 81, 220

algorithm, 94, 96

correlation−product, 220

phase, 94

thermometer, 380

time, 120

energy, 422

function, 119

level, 113, 422

surface, 177

Engineering, 34

epoch, 125

Ercal, 514

error signal, 103

error surface, 113

error tolerance, 136

Euclidean distance, 13, 280, 296

excitation, 94, 98, 276, 303

excitatory connections, 244, 272

exclusive or, 83

exemplar, 181−183, 201

class in BAM network, 186

pairs, 135, 177

patterns, 135, 177

exemplar pattern, 16, 64

exemplars, 64, 65, 74, 75, 115, 181

Expert systems, 48, 75, 217

exponential moving average, 399



extended (database) model, 486

extended−delta−bar−delta, 406



F

factorial, 420

FAM see Fuzzy Associative Memories

Fast Fourier Transform, 374

fault tolerance, 374

feature detector, 328

feature extraction, 7, 513

Federal Reserve, 388

feedback, 4, 5, 93

connections, 123, 179

feed forward

Backpropagation, 81, 92, 112, 115, 123, 384

network, 145, 406, 409, 511

architecture, 124

layout, 124

network, 10

operation, 185

type, 2


field, 82

filter, 322

financial forecasting, 377

fire, 3, 71, 87, 89

first derivative, 380

fiscal policy, 36

fit values, 32, 217

fit vector, 32, 217, 221

floating point calculations, 519

compilation for, 519

F1 layer, 245

calculations, 247

Fogel, 76, 77

forecasting, 102

model, 378

T−Bill yields, 405

T−Note yields, 405

forward, 93, 104

Fourier transform, 380

Frank, 515

Freeman, 246, 248

frequency

component, 380

signature, 374

frequency spikes, 380

friend class, 23, 66, 68

F2 layer, 245

calculations, 247


Fukushima, 92, 106

function


constructor function, 28

energy function, 119

evaluation, 109

fuzzy step function, 101

hyperbolic tangent function, 100

linear function, 99, 102

logistic function, 86, 100

Lyapunov function, 118, 119

member function, 28, 308

objective, 417

overloading, 25, 139

ramp function, 99, 101, 102

reference function, 493

sigmoid


function, 99, 100, 126, 129, 133, 164, 177, 395

logistic function, 100

step function, 99, 101

threshold function, 52, 95, 99, 101

XOR function, 83−85, 87

fuzzifier, 35, 36, 47

program, 50

fuzziness, 48, 50

fuzzy adaptive system, 49

fuzzy ARTMAP, 517

fuzzy association, 217

Fuzzy Associative Memories, 49, 50, 81, 92, 104, 115, 117, 217, 218, 473

encoding, 219, 220

Fuzzy Cognitive Map, 48, 49

fuzzy conditional expectations, 490, 509

fuzzy control, 497, 509

system, 47, 48, 473

fuzzy controller, 47

fuzzy database, 473, 475, 509

fuzzy expected value, 490

fuzzy equivalence relation, 481

fuzzy events, 488, 509

conditional probability of, 491

probability of, 490

fuzzy inputs, 47, 73

fuzzy logic, 31, 34, 50, 473

controller, 473, 497, 509

fuzzy matrix, 217

fuzzy means, 488, 490, 509

fuzzy numbers, 493, 496

triangular, 496

fuzzy OR method, 505

fuzzy outputs, 47, 74

fuzzy quantification, 488

fuzzy queries, 483, 488



fuzzy relations, 479, 509

matrix representation, 479

fuzzy rule base, 502−504

fuzzy rules, 47, 50

fuzzy set, 32, 50, 218, 477, 488

complement, 218

height, 218

normal, 218

operations, 32, 217

fuzzy systems, 50, 217

fuzzy−valued, 34

fuzzy values, 477

fuzzy variances, 488, 490, 509

fzneuron class, 221



G

Gader, 513

gain, 107

constant, 273

parameter, 429, 467

gain control, 243, 248

unit, 244

Gallant, 117

Ganesh, 514

Gaussian density function, 458, 459, 524

generalization, 121, 320, 382

ability, 121, 336

generalized delta rule, 112, 176

genetic algorithms, 75, 76, 385

global

minimum, 113, 177



variable, 28

Glover, 471

gradient, 112, 113

grandmother cells, 117

gray scale, 305, 306, 322, 374

grid, 214, 305

Grossberg, 19, 92, 93, 107, 117, 243, 269

Grossberg layer, 9, 19, 82, 92, 106, 302



H

Hamming distance, 13, 201, 202

handwriting analysis, 98

handwriting recognition, 102

handwritten characters, 92

heap, 144

Hebb, 110

Hebbian



conditioned learning, 302

learning, 105, 110

Hebb’s rule, 110, 111

Hecht−Nielsen, 93, 106, 302

Herrick Payoff Index, 401

heteroassociation, 7, 8, 82, 92, 102, 104, 180, 181

heteroassociative network, 7, 97

hidden layer, 2, 4, 75, 86, 89

hierarchical neural network, 407

hierarchies of classes, 27, 29

Hinton, 114

Hoff, 102, 112

holographic neural network, 408

Honig, 515

Hopfield, 422, 427, 429

memory, 73, 115, 117, 181

Hopfield network, 9, 11−14, 16, 19, 51, 79, 81, 82, 93, 111, 119, 120, 181, 472

Hotelling transform, 384

Housing Starts, 387

hybrid models, 75

hyperbolic tangent function, 429

hypercube, 218

unit, 218

hyperplane, 84

hypersphere, 273

I

image, 106, 302

compression, 374

processing, 98, 102

five−way transform, 516

recognition, 375

resolution, 322

implementation of functions, 67

ineuron class, 66

inference engine, 47

inheritance, 21, 25, 26, 138

multiple inheritance, 26

inhibition, 9, 94, 98

global, 428, 456

lateral, 272, 276

inhibitory connection, 272

initialization of

bottom−up weights, 250

parameters, 246

top−down weights, 250

weights, 94

inline, 165


input, 98

binary input, 98

bipolar input, 98

layer, 2, 10

nature of, 73

number of, 74

patterns, 51, 65

signals, 65

space, 124

vector, 53, 71, 112, 272

input/output, 71

inqreset function, 251

instar, 93

interactions, 94

interconnections, 7

interest rate, 387

internal activation, 3

intersection, 32, 33

inverse mapping, 62, 182

Investor’s Business Daily, 388

iostream, 54, 71

istream, 58

iterative process, 78

