Who's Who

Example entry:

surname, other name [hyperlinks]
Anderson, James [text]
involved in neural net and expert system development.
Babbage, Charles [text]
the first to try to automate computation.
Cajal, Santiago Ramon [text]
elaborated on Caton's work.
Caton, Richard [text]
discoverer of brain waves.
Cauchy, Augustin-Louis [text]
19th C. mathematician who developed a type of probability calculation.
Cavendish, Henry [text]
British physicist. Discoverer of hydrogen, he also showed that water was a compound.
Descartes, Rene [text]
elaborated on Galenus's ideas.
Du Bois-Reymond, Emil [text]
the first to measure electricity in nerves.
Galenus, Claudius [text]
performed the first proper research into the human nervous system, using injured gladiators.
Galvani, Luigi [text]
Italian physician who discovered that electricity could make frog muscles contract.
Hammerstrom, Dan [text]
involved in research with large neural net systems.
Hinton, Geoffrey [text]
developer of the Boltzmann Machine.
Hoff, Ted [text]
Widrow's student.
Hopfield, John [text, text]
developer of energy relaxation.
McClelland, James [text]
extended the delta-rule.
McCulloch, Warren [text]
discovered that a frog's eye recognises certain objects before signalling the brain. Originator of the threshold input-output function.
Mead, Carver [text]
involved in modelling sensory systems, neural chip development.
Muller, Johannes [text]
Newton, Isaac [text]
a leading scientist in his time, he contributed greatly to physics and mathematics.
Pitts, Walter [text]
partner of McCulloch; all their work was joint.
Plato [text]
did no true research on the brain but produced various ideas.
Psaltis, Demetri [text]
developing a parallel read-head for optical storage systems.
Purkinje, Jan [text]
Rosenblatt, Frank [text]
inventor of the perceptron.
Rumelhart, David [text, text]
worked with McClelland.
Sejnowski, Terrence [text, text]
worked with Hinton. Developed NETtalk.
Szu, Harold [text]
involved in developing optical nets.
Turing, Alan [text]
inventor of the "Turing Machine".
Von Neumann, John [text, text]
"father" of the digital computer.
Widrow, Bernard [text]
extended the Hebbian learning rule.


Example entry:

word (similar words) [hyperlinks]
delta-rule [text]
see text.
activity level [text, text]
the weighted sum of the inputs into a neuron.
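As a minimal sketch of this definition (the variable names and values here are illustrative, not taken from the text):

```python
# Activity level: the weighted sum of the signals arriving at a neuron.
inputs = [1.0, 0.0, 1.0]     # signals from connected neurons
weights = [0.5, -0.3, 0.8]   # one weight per incoming connection

activity = sum(x * w for x, w in zip(inputs, weights))
print(activity)              # 0.5 + 0.0 + 0.8 = 1.3
```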
AI system (Artificial Intelligence system) [text]
a man-made system that is meant to behave intelligently.
anti-Hebbian rule [text]
the weight between two neurons is decreased when they fire together. See text.
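One common form of such a rule, sketched below; the update function and learning rate are illustrative assumptions, not the book's notation:

```python
def anti_hebbian_update(w, pre, post, rate=0.1):
    # Anti-Hebbian rule (sketch): the weight shrinks when the
    # connected neurons are active together (pre and post both fire).
    return w - rate * pre * post

w = 0.5
w = anti_hebbian_update(w, pre=1.0, post=1.0)  # both fire: weight drops
print(w)  # 0.4
```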
auto-associative memory [text]
when some learning rules operate, they adapt the weights so that for a given input the output matches a "teacher" signal. Auto-associative systems assume the "teacher" is the same as the input. These systems are used to correct data.
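A tiny sketch of the idea, using Hebbian outer-product storage on a ±1 pattern (this particular storage scheme is an assumption chosen to show how such a memory can correct data):

```python
# Auto-associative memory sketch: the "teacher" is the input itself,
# so the net learns to reproduce, and hence repair, its input.
pattern = [1, -1, 1, -1]
n = len(pattern)

# store: w[i][j] = pattern[i] * pattern[j], no self-connections
w = [[pattern[i] * pattern[j] if i != j else 0 for j in range(n)]
     for i in range(n)]

noisy = [1, 1, 1, -1]          # one element corrupted
recalled = [1 if sum(w[i][j] * noisy[j] for j in range(n)) >= 0 else -1
            for i in range(n)]
print(recalled)                # the stored pattern is restored
```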
binary [text]
a number system with only two digits, 0 and 1.
biological computation [text]
the processing carried out in the brain.
Boltzmann machine [text, text, text]
see text.
chaos theory [text]
simple rules can give rise to very complex systems. It is impossible to predict the behaviour of such a system accurately unless its components and rules are known perfectly.
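A classic illustration (an assumption here, not from the text) is the logistic map: one line of arithmetic, yet two starting points differing by a billionth soon produce completely different behaviour:

```python
# Logistic map: a very simple rule that is chaotic for r = 4.
def logistic(x, r=4.0):
    return r * x * (1.0 - x)

a, b = 0.2, 0.2 + 1e-9   # two almost identical starting states
for _ in range(40):
    a, b = logistic(a), logistic(b)
print(abs(a - b))        # no longer tiny: prediction needed perfect knowledge
```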
cortex [text]
the brain is divided into many sections; this is its outer layer.
CPU (Central Processing Unit) [text]
this is the part of the computer that runs the programs.
current of injury [text]
the name given to the current a neuron emits after it has been injured.
energy relaxation [text, text]
see text.
excitatory [text]
a connection which increases the activity level of a neuron.
expert system [text, text, text]
a particular type of AI system that uses a knowledge base and a set of instructions to answer questions.
feedback [text]
where information at a later part of a system can affect information in an earlier part of the system.
generalised delta-rule [text]
see text.
heat buildup [text]
the resistance in any electrical component causes electrical energy to turn into heat. This may cause damage if the temperature becomes too high.
Hebbian rule (Hebb's law) [text, text, text]
see text.
hidden layer [text, text]
see text.
Hopfield's principle [text]
energy relaxation. See text.
inhibitory [text]
a connection which decreases the activity level of a neuron.
input layer [text, text]
see text.
input-output function [text, text, text]
the process through which the activity level of a neuron goes before being sent as a signal to other neurons.
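Two common input-output functions, the threshold and the sigmoid (both also glossary entries), can be sketched as follows; the threshold value and input are illustrative:

```python
import math

def threshold(activity, theta=0.0):
    # threshold function: fire (1) only when activity exceeds theta
    return 1 if activity > theta else 0

def sigmoid(activity):
    # sigmoid function: smooth, graded output between 0 and 1
    return 1.0 / (1.0 + math.exp(-activity))

print(threshold(1.3))          # 1
print(round(sigmoid(1.3), 3))  # roughly 0.786
```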
linear [text, text]
see text.
main-frames [text]
large, powerful computers that allow more than one user to access them at the same time.
microprocessor [text]
a chip that contains an entire functional unit of a computer.
minicomputer [text]
smaller, cheaper machines than main-frames but limited to more specific tasks. Digital Equipment Corp. pioneered these and this led to the introduction of computers into technical and scientific markets.
neural network (neural net, network, net) [text, text, text]
a circuit or simulation built to model or simulate the way the brain works.
output layer [text, text]
see text.
phonemes [text]
the units of speech out of which all words are made.
RAM (Random Access Memory) [text]
a memory storage device for computers built on a chip.
RAM, dynamic [text]
this uses capacitors, so the memory has to be "refreshed" every so often, which takes up some of the computer's time.
RAM, static [text]
this uses transistors, so it does not have to be "refreshed" and therefore does not take up the computer's time.
sigmoid [text, text]
see text.
supercomputer [text, text]
very large, powerful computers used for large amounts of processing.
synapse [text]
the minute gap between two "joined" neurons through which chemical signals from the neurons pass.
three-dimensional energy surface [text]
a graph where the x and y axes represent the states of the circuit and the z axis represents the "energy" of the circuit.
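The "energy" of a state can be sketched with the standard Hopfield-style formula E = -1/2 Σ w[i][j]·s[i]·s[j] (the weights below are illustrative); during energy relaxation the circuit moves downhill on this surface:

```python
# Energy of a tiny two-neuron circuit; symmetric, illustrative weights.
w = [[0, 1], [1, 0]]

def energy(s):
    return -0.5 * sum(w[i][j] * s[i] * s[j]
                      for i in range(2) for j in range(2))

print(energy([1, 1]))    # -1.0  (agreeing neurons: low energy)
print(energy([1, -1]))   #  1.0  (disagreeing neurons: high energy)
```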
threshold [text]
see text.
weight [text, text, text]
the name given to the factor by which a signal is multiplied before being given to a neuron.