Welcome to this artful virtual gallery.
A quick encapsulation:
Artificial neural nets (or as they are often simply called, `neural nets')
are composed of units or nodes designed to represent neurons,
which are connected by links designed to represent dendrites, each
of which has a numeric weight. It is usually assumed that some of
the units work in symbiosis with the external environment; these units
form the sets of input and output units. Each unit has a
current activation level, which is its output, and can compute,
based on its inputs and the weights on those inputs, its activation level at
the next moment in time. This computation is entirely local: a unit takes
account of only its neighbors in the net. The local computation is calculated
in two stages. First, the input function, $in_i$, gives the weighted
sum of the unit's input values, that is, the sum of the input activations
multiplied by their weights:

$in_i = \sum_j W_{j,i} \, a_j$
In the second
stage, the activation function, $g$, takes the input from the
first stage as argument and generates the output, or activation level:

$a_i \leftarrow g(in_i) = g\left(\sum_j W_{j,i} \, a_j\right)$

One common (and
confessedly elementary) choice for the activation function (which usually
governs all units in a given net) is the step function, which usually has
a threshold $t$ such that 1 is output when the input
is greater than $t$, and 0 is output otherwise. This is supposed
to be ``brain-like'' to some degree, given that 1 represents the firing
of a pulse from a neuron through an axon, and 0 represents no firing. As
you might imagine, there are many different kinds of neural nets. The main
distinction is between feed-forward and recurrent
nets. In feed-forward nets, as their name suggests, links move information
in one direction, and there are no cycles; recurrent nets allow for cycling
back, and can become rather complicated. Recurrent nets underlie the MONA-LISA
system we describe below.
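The two-stage unit computation above can be sketched in a few lines of Python. This is a minimal illustration only: the weights, inputs, and threshold below are invented for the example, and are not drawn from MONA-LISA or any particular net.

```python
# Minimal sketch of the two-stage unit computation described above.
# All weights, inputs, and the threshold are illustrative assumptions.

def input_function(activations, weights):
    """Stage 1: weighted sum of input activations, in_i = sum_j W_ji * a_j."""
    return sum(w * a for w, a in zip(weights, activations))

def step(x, threshold=0.5):
    """Stage 2: step activation -- output 1 ('firing') above the threshold, else 0."""
    return 1 if x > threshold else 0

def unit(activations, weights, threshold=0.5):
    """A single unit: apply the activation function to the weighted input sum."""
    return step(input_function(activations, weights), threshold)

# A tiny feed-forward arrangement: two input units feeding one output unit.
# Links carry information in one direction only; there are no cycles.
inputs = [1, 1]
weights = [0.4, 0.4]          # one weight per incoming link
print(unit(inputs, weights))  # weighted sum 0.8 > 0.5, so the unit fires: 1
```

A recurrent net would differ only in wiring: some unit outputs would feed back as inputs on a later time step, which is what makes such nets harder to analyze.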
As Ned Block has
recently pointed out to one of us (Bringsjord), since at least all mammals
are probably P-conscious, the accident would have had to happen quite
a while ago.
Information can be
If you want
to get the plain truth
Be not concerned with right and wrong.
The conflict between right and wrong
is the sickness of the mind.
p.281, commentary/reflections (on Riddle of Universe):
"This sentence is false"
"Thiss sentence contains threee errors."
"This sentence contains one error."
To the philosopher
J. R. Lucas, by C. H. Whitely:
"Lucas cannot consistently assert this sentence."
or "Lucas cannot consistently believe this sentence."
of nature is very, very complicated. How could one describe
a cloud? A cloud is not a sphere. It is like a ball, but very irregular.
A mountain? A mountain is not a cone... If you want to speak of clouds,
of mountains, of rivers, of lightning, the geometric language of school
is inadequate. -- Mandelbrot
For a complete treatment
of super-computation and related matters, including literary creativity,
see [Bringsjord & Zenzen, 1997] and [Bringsjord, ...].
series of rule governed state transitions whose rules can be altered
There are numerous
competing definitions of computation. Along with the initial definition
provided here, the following three definitions are often encountered:

1. Rule governed state transitions.
2. Discrete rule governed state transitions.
3. Rule governed state transitions between interpretable states.

The difficulties with
these definitions can be summarized as follows:

1. Admits all physical systems
into the class of computational systems, making the definition somewhat
vacuous.
2. Excludes all forms of
analog computation, perhaps including the sorts of processing taking place
in the brain.
3. Casts all computational systems as representational systems. In other words,
there is no computation without representation on this definition.
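The idea of rule governed state transitions can be made concrete with a tiny finite-state machine. The states and rules below are invented purely for illustration; they stand in for whatever discrete dynamics a given definition of computation has in mind.

```python
# Illustrative sketch of 'rule governed state transitions':
# a tiny finite-state machine tracking the parity of 1s seen so far.
# States and transition rules are invented for this example.

# Rules map (current state, input symbol) -> next state.
rules = {
    ("even", 1): "odd",
    ("odd", 1): "even",
    ("even", 0): "even",
    ("odd", 0): "odd",
}

def run(state, inputs):
    """Apply the transition rules to each input symbol in turn."""
    for symbol in inputs:
        state = rules[(state, symbol)]
    return state

print(run("even", [1, 0, 1, 1]))  # three 1s seen -> "odd"
```

Note that the states here are *interpretable* (they represent the parity of the input stream), so this toy system also satisfies the third, representational definition; a physical system drifting through states with no such interpretation would satisfy only the first.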