C++ Neural Networks and Fuzzy Logic


Matrices and Some Arithmetic Operations on Matrices



A real matrix is a rectangular array of real numbers. A matrix with m rows and n columns is referred to as an
m x n matrix. The element in the ith row and jth column of the matrix is referred to as the ij element of the
matrix and is denoted by a_ij.

The transpose of a matrix M is denoted by M^T. The element in the ith row and jth column of M^T is the same as
the element of M in its jth row and ith column. M^T is obtained from M by interchanging the rows and columns
of M. For example, if

            2  7 −3                    2  4
       M =           , then M^T =      7  0
            4  0  9                   −3  9

If X is a vector with m components, x_1, …, x_m, then it can be written as a column vector with the components
listed one below another, or as a row vector, X = (x_1, …, x_m). The transpose of a row vector is
the column vector with the same components, and the transpose of a column vector is the corresponding row
vector.


The addition of matrices is possible if they have the same size, that is, the same number of rows and same

number of columns. Then you just add the ij elements of the two matrices to get the ij elements of the sum

matrix. For example,

       3 −4 5    5 2 −3      8 −2  2
              +          =
       2  3 7    6 0  4      8  3 11

Multiplication is defined for a given pair of matrices only if a condition on their respective sizes is satisfied.
Even then, it is not a commutative operation: if you exchange the matrix on the left with the
matrix on the right, the multiplication may not be defined, and even if it is, the result may not be the
same as before the exchange.

The condition to be satisfied for multiplying the matrices A and B as AB is that the number of columns in A is
equal to the number of rows in B. Then, to get the ij element of the product matrix AB, you take the ith row of
A as one vector and the jth column of B as a second vector and form the dot product of the two. For example, the
two matrices given previously to illustrate addition are not compatible for multiplication
in either order, because each has three columns, which differs from the
number of rows, which is 2 in each. Another example is given as follows.

                 3 −4 5              5  6
       Let A =            and   B =  2  0
                 2  3 7             −3  4

Then AB and BA are both defined; AB is a 2x2 matrix, whereas BA is 3x3.

                 −8  38              27 −2 67
       Also AB =         and BA  =    6 −8 10
                 −5  40              −1 24 13

Lyapunov Function

A Lyapunov function is a function that takes on non-negative values and decreases with time. It maps the
state variables of a system to real numbers. The state of the system changes as time
changes, and the function decreases. Thus, the Lyapunov function decreases with each change of state of the
system.

We can construct a simple example of a function with the property of decreasing with each change of state as
follows. Suppose a real number, x, represents the state of a dynamic system at time t. Also suppose that x is
bounded for any t by a positive real number M; that is, |x| is less than M for every value of t.
Then the function

       f(x,t) = exp(−|x|/(M + |x| + t))

is non-negative and decreases with increasing t.

Local Minimum

A function f(x) is defined to have a local minimum at y, with a value z, if

       f(y) = z, and there exists a positive real number h such
       that f(x) ≥ z for each x satisfying y − h ≤ x ≤ y + h.

In other words, there is no value of x in a neighborhood of y where the value of the function is smaller
than z.

There can be more than one local minimum for a function in its domain. A step function (with a graph
resembling a staircase) is a simple example of a function with an infinite number of points in its domain that
are local minima.

Global Minimum

A function f(x) is defined to have a global minimum at y, with a value z, if

       f(y) = z, and f(x) ≥ z for each x in the domain of
       the function f.



In other words, there is no other value of x in the domain of the function f, where the value of the function is

smaller than z. Clearly, a global minimum is also a local minimum, but a local minimum may not be a global

minimum.

There can be more than one global minimum for a function in its domain. The trigonometric function f(x) =
sin x is a simple example of a function with an infinite number of points with global minima. You may recall
that sin(3π/2), sin(7π/2), and so on are all −1, the smallest value of the sine function.



Kronecker Delta Function

The Kronecker delta function is a function of two variables. It has a value of 1 if the two arguments are equal,

and 0 if they are not. Formally,

                       1 if x = y
       δ(x,y) =
                       0 if x ≠ y



Gaussian Density Distribution

The Gaussian density distribution, also called the normal distribution, has a density function of the following
form. There is a constant parameter c, which can have any positive value.
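The formula itself did not survive the conversion of this page. For reference, the standard form of the normal density, with mean μ and standard deviation σ, is shown below; how the book's single parameter c maps onto these constants cannot be recovered from the text, so this is the textbook-standard expression rather than necessarily the book's exact notation.

```latex
f(x) = \frac{1}{\sigma\sqrt{2\pi}}\, e^{-\frac{(x-\mu)^{2}}{2\sigma^{2}}}
```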



C++ Neural Networks and Fuzzy Logic

by Valluru B. Rao

MTBooks, IDG Books Worldwide, Inc.



ISBN: 1558515526   Pub Date: 06/01/95




References

Ahmadian, Mohamad, and Pimmel, Russell, “Recognizing Geometrical Features of Simulated Targets

with Neural Networks,” Conference Proceedings of the 1992 Artificial Neural Networks in

Engineering Conference, V.3, pp. 409–411.

Aleksander, Igor, and Morton, Helen, An Introduction to Neural Computing, Chapman and Hall,

London, 1990.

Aiken, Milam, “Forecasting T−Bill Rates with a Neural Net,” Technical Analysis of Stocks and

Commodities, May 1995, Technical Analysis Inc., Seattle.

Anderson, James, and Rosenfeld, Edward, eds., Neurocomputing: Foundations of Research, MIT

Press, Cambridge, MA, 1988.

Anzai, Yuichiro, Pattern Recognition and Machine Learning, Academic Press, Englewood Cliffs, NJ,

1992.


Azoff, E. Michael, Neural Network Time Series Forecasting of Financial Markets, John Wiley &

Sons, New York, 1994.

Bauer, Richard J., Genetic Algorithms and Investment Strategies, John Wiley & Sons, New York,

1994.


Booch, Grady, Object Oriented Design with Applications, Benjamin−Cummings, Redwood City, CA,

1991.


Carpenter, Gail A., and Ross, William D., “ART−EMAP: A Neural Network Architecture for Object

Recognition by Evidence Accumulation,” IEEE Transactions on Neural Networks, Vol. 6., No. 4,

July 1995, pp. 805–818.

Colby, Robert W., and Meyers, Thomas A., The Encyclopedia of Technical Market Indicators,

Business One Irwin, Homewood, IL, 1988.

Collard, J. E., “Commodity Trading with a Three Year Old,” Neural Networks in Finance and



Investing, pp. 411–420, Probus Publishing, Chicago, 1993.

Cox, Earl, The Fuzzy Systems Handbook, Academic Press, Boston, 1994.

Dagli, Cihan, et al., eds., Intelligent Engineering Systems through Artificial Neural Networks, Volume

2, ASME Press, New York, 1992.

Davalo, Eric, and Naim, Patrick, Neural Networks, MacMillan, New York, 1991.

Ercal, F., et al., “Diagnosing Malignant Melanoma Using a Neural Network,” Conference

Proceedings of the 1992 Artificial Neural Networks in Engineering Conference, V.3, pp. 553–555.

Frank, Deborah, and Pletta, J. Bryan, “Neural Network Sensor Fusion for Security Applications,”

Conference Proceedings of the 1992 Artificial Neural Networks in Engineering Conference, V.3, pp.

745–748.


Freeman, James A., and Skapura, David M., Neural Networks Algorithms, Applications, and

Programming Techniques, Addison−Wesley, Reading, MA, 1991.

Gader, Paul, et al., “Fuzzy and Crisp Handwritten Character Recognition Using Neural Networks,”

Conference Proceedings of the 1992 Artificial Neural Networks in Engineering Conference, V.3, pp.

421–424.


Ganesh, C., et al., “A Neural Network−Based Object Identification System,” Conference Proceedings

of the 1992 Artificial Neural Networks in Engineering Conference, V.3, pp. 471–474.



Glover, Fred, ORSA CSTS Newsletter Vol 15, No 2, Fall 1994.

Goldberg, David E., Genetic Algorithms in Search, Optimization and Machine Learning,

Addison−Wesley, Reading, MA, 1989.

Grossberg, Stephen, et al., Introduction and Foundations, Lecture Notes, Neural Network Courses and

Conference, Boston University, May 1992.

Hammerstrom, Dan, “Neural Networks at Work,” IEEE Spectrum, New York, June 1993.

Hertz, John, Krogh, Anders, and Palmer, Richard, Introduction to the Theory of Neural Computation,

Addison−Wesley, Reading, MA, 1991.

Jagota, Arun, “Word Recognition with a Hopfield−Style Net,” Conference Proceedings of the 1992

Artificial Neural Networks in Engineering Conference, V.3, pp. 445–448.

Johnson, R. Colin, “Accuracy Moves OCR into the Mainstream,” Electronic Engineering Times,

CMP Publications, Manhasset, NY, January 16, 1995.

Johnson, R. Colin, “Making the Neural−Fuzzy Connection,” Electronic Engineering Times, CMP

Publications, Manhasset, NY, September 27, 1993.

Johnson, R. Colin, “Neural Immune System Nabs Viruses,” Electronic Engineering Times, CMP

Publications, Manhasset, NY, May 8, 1995.

Jurik, Mark, “The Care and Feeding of a Neural Network,” Futures Magazine, Oster

Communications, Cedar Falls, IA, October 1992.

Kimoto, Takashi, et al., “Stock Market Prediction System with Modular Neural Networks,” Neural

Networks in Finance and Investing, pp. 343–356, Probus Publishing, Chicago, 1993.

Klir, G.J., and Folger, T.A., Fuzzy Sets, Uncertainty and Information, Prentice Hall, New York, 1988.

Konstenius, Jeremy G., “Trading the S&P with a Neural Network,” Technical Analysis of Stocks and

Commodities, October 1994, Technical Analysis Inc., Seattle.

Kosaka, M., et al., “Applications of Fuzzy Logic/Neural Network to Securities Trading Decision

Support System,” Conference Proceedings of the 1991 IEEE International Conference on Systems,

Man and Cybernetics, V.3, pp. 1913–1918.

Kosko, Bart, and Isaka, Satoru, Fuzzy Logic, Scientific American, New York, July 1993.

Kosko, Bart, Neural Networks and Fuzzy Systems: A Dynamical Systems Approach to Machine

Intelligence, Prentice−Hall, New York, 1992.

Laing, Jonathan, “New Brains: How Smart Computers are Beating the Stock Market,” Barron’s,

February 27, 1995.

Lederman, Jess, and Klein, Robert, eds., Virtual Trading, Probus Publishing, Chicago, 1995.

Lin, C.T., and Lee, C.S.G., “A Multi-Valued Boltzmann Machine,” IEEE Transactions on Systems,
Man, and Cybernetics, Vol. 25, No. 4, April 1995, pp. 660–668.

MacGregor, Ronald J., Neural and Brain Modeling, Academic Press, Englewood Cliffs, NJ, 1987.

Mandelman, Avner, “The Computer’s Bullish! A Money Manager’s Love Affair with Neural

Network Programs,” Barron’s, December 14, 1992.

Maren, Alianna, Harston, Craig, and Pap, Robert, Handbook of Neural Computing Applications,

Academic Press, Englewood Cliffs, NJ, 1990.

Marquez, Leorey, et al., “Neural Network Models as an Alternative to Regression,” Neural Networks



in Finance and Investing, pp. 435–449, Probus Publishing, Chicago, 1993.

Mason, Anthony et al., “Diagnosing Faults in Circuit Boards—A Neural Net Approach,” Conference

Proceedings of the 1992 Artificial Neural Networks in Engineering Conference, V.3, pp. 839–843.

McNeill, Daniel, and Freiberger, Paul, Fuzzy Logic, Simon & Schuster, New York, 1993.

McNeill, F. Martin, and Thro, Ellen, Fuzzy Logic: A Practical Approach, Academic Press, Boston,

1994.


Murphy, John J., Intermarket Technical Analysis, John Wiley & Sons, New York, 1991.

Murphy, John J., Technical Analysis of the Futures Markets, NYIF Corp., New York, 1986.

Nellis, J., and Stonham, T.J., “A Neural Network Character Recognition System that Dynamically

Adapts to an Author’s Writing Style,” Conference Proceedings of the 1992 Artificial Neural

Networks in Engineering Conference, V.3, pp. 975–979.



Peters, Edgar E., Fractal Market Analysis: Applying Chaos Theory to Investment and Economics,

John Wiley & Sons, New York, 1994.

Ressler, James L., and Augusteijn, Marijke F., “Weapon Target Assignment Accessibility Using

Neural Networks,” Conference Proceedings of the 1992 Artificial Neural Networks in Engineering

Conference, V.3, pp. 397–399.

Rao, Satyanarayana S., and Sethuraman, Sriram, “A Modified Recurrent Learning Algorithm for

Nonlinear Time Series Prediction,” Conference Proceedings of the 1992 Artificial Neural Networks in

Engineering Conference, V.3, pp. 725–730.

Rao, Valluru, and Rao, Hayagriva, Power Programming Turbo C++, MIS:Press, New York, 1992.

Ruggiero, Murray, “How to Build an Artificial Trader,” Futures Magazine, Oster Communications,

Cedar Falls, IA, September 1994.

Ruggiero, Murray, “How to Build a System Framework,” Futures Magazine, Oster Communications,

Cedar Falls, IA, November 1994.

Ruggiero, Murray, “Nothing Like Net for Intermarket Analysis,” Futures Magazine, Oster

Communications, Cedar Falls, IA, May 1995.

Ruggiero, Murray, “Putting the Components before the System,” Futures Magazine, Oster

Communications, Cedar Falls, IA, October 1994.

Ruggiero, Murray, “Training Neural Networks for Intermarket Analysis,” Futures Magazine, Oster

Communications, Cedar Falls, IA, August 1994.

Rumbaugh, James, et al., Object-Oriented Modeling and Design, Prentice-Hall Inc., Englewood
Cliffs, New Jersey, 1991.

Sharda, Ramesh, and Patil, Rajendra, “A Connectionist Approach to Time Series Prediction: An

Empirical Test,” Neural Networks in Finance and Investing, pp. 451–463, Probus Publishing,

Chicago, 1993.

Sherry, Clifford, “Are Your Inputs Correlated?” Technical Analysis of Stocks and Commodities,

February 1995, Technical Analysis Inc., Seattle.

Simpson, Patrick K., Artificial Neural Systems: Foundations, Paradigms, Applications and

Implementations, Pergamon Press, London, 1990.

Soucek, Branko, and Soucek, Marina, Neural and Massively Parallel Computers: The Sixth



Generation, John Wiley & Sons, New York, 1988.

Stein, Jon, “The Trader’s Guide to Technical Indicators,” Futures Magazine, Oster Communications,

Cedar Falls, IA, August 1990.

Terano, Toshiro, et al., Fuzzy Systems Theory and Its Applications, Academic Press, Boston, 1993.

Trippi, Robert, and Turban, Efraim, eds., Neural Networks in Finance and Investing, Probus

Publishing, Chicago, 1993.

Wasserman, Gary S., and Sudjianto, Agus, Conference Proceedings of the 1992 Artificial Neural

Networks in Engineering Conference, V.3, pp. 901–904.

Wasserman, Philip D., Advanced Methods in Neural Computing, Van Nostrand Reinhold, New York,

1993.


Wasserman, Philip D., Neural Computing, Van Nostrand Reinhold, New York, 1989.

Wu, Fred, et al., “Neural Networks for EEG Diagnosis,” Conference Proceedings of the 1992

Artificial Neural Networks in Engineering Conference, V.3, pp. 559–565.

Yager, R., ed., Fuzzy Sets and Applications: Selected Papers by L.Z. Zadeh, Wiley−Interscience, New

York, 1987.

Yan, Jun, Ryan, Michael, and Power, James, Using Fuzzy Logic, Prentice−Hall, New York, 1994.

Yoon, Y., and Swales, G., “Predicting Stock Price Performance: A Neural Network Approach,”

Neural Networks in Finance and Investing, pp. 329–339, Probus Publishing, Chicago, 1993.






Glossary

A

Activation

The weighted sum of the inputs to a neuron in a neural network.



Adaline

Adaptive linear element machine.



Adaptive Resonance Theory

Theory developed by Grossberg and Carpenter for categorization of patterns, and to address the

stability–plasticity dilemma.

Algorithm

A step−by−step procedure to solve a problem.



Annealing

A process for preventing a network from being drawn into a local minimum.



ART

(Adaptive Resonance Theory) ART1 is the result of the initial development of this theory for binary

inputs. Further developments led to ART2 for analog inputs. ART3 is the latest.

Artificial neuron

The primary object in an artificial neural network to mimic the neuron activity of the brain. The

artificial neuron is a processing element of a neural network.

Associative memory

Activity of associating one pattern or object with itself or another.



Autoassociative

Making a correspondence of one pattern or object with itself.



B

Backpropagation

A training algorithm for feedforward neural networks in which the errors at the output layer are
propagated back to the preceding layer and used in learning. If the preceding layer is not the input
layer, then the errors at this hidden layer are propagated back to the layer before it.



BAM

Bidirectional Associative Memory network model.



Bias

A value added to the activation of a neuron.



Binary digit

A value of 0 or 1.



Bipolar value

A value of –1 or +1.



Boltzmann machine



A neural network in which the outputs are determined with probability distributions. Trained and

operated using simulated annealing.



Brain−State−in−a−Box

Anderson’s single−layer, laterally connected neural network model. It can work with inputs that have

noise in them or are incomplete.

C

Cauchy machine

Similar to the Boltzmann machine, except that a Cauchy distribution is used for probabilities.



Cognitron

The forerunner to the Neocognitron. A network developed to recognize characters.



Competition

A process in which a winner is selected from a layer of neurons by some criterion. Competition

suggests inhibition reflected in some connection weights being assigned a negative value.

Connection

A means of passing inputs from one neuron to another.



Connection weight

A numerical label associated with a connection and used in a weighted sum of inputs.



Constraint

A condition expressed as an equation or inequality, which has to be satisfied by the variables.



Convergence

Termination of a process with a final result.



Crisp

The opposite of fuzzy—usually a specific numerical quantity or value for an entity.



D

Delta rule

A rule for modification of connection weights, using both the output and the error obtained. It is also

called the LMS rule.

E

Energy function

A function of outputs and weights in a neural network to determine the state of the system, e.g.,

Lyapunov function.

Excitation

Providing positive weights on connections to enable outputs that cause a neuron to fire.



Exemplar

An example of a pattern or object used in training a neural network.



Expert system

A set of formalized rules that enable a system to perform like an expert.



F

FAM

Fuzzy Associative Memory network. Makes associations between fuzzy sets.



Feedback

The process of relaying information in the opposite direction to the original.


Fit vector

A vector of values of degree of membership of elements of a fuzzy set.



Fully connected network

A neural network in which every neuron has connections to all other neurons.



Fuzzy

As related to a variable, the opposite of crisp. A fuzzy quantity represents a range of values as opposed
to a single numeric value, e.g., “hot” vs. 89.4°.

Fuzziness

Different concepts having an overlap to some extent. For example, descriptions of fair and cool

temperatures may have an overlap of a small interval of temperatures.

Fuzzy Associative Memory

A neural network model to make association between fuzzy sets.



Fuzzy equivalence relation

A fuzzy relation (relationship between fuzzy variables) that is reflexive, symmetric, and transitive.



Fuzzy partial order

A fuzzy relation (relationship between fuzzy variables) that is reflexive, antisymmetric, and transitive.


