<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 3.2 Final//EN">
<HTML>
<HEAD>
<META NAME="GENERATOR" CONTENT="LinuxDoc-Tools 0.9.21">
<TITLE>GNU/Linux AI & Alife HOWTO: Connectionism</TITLE>
<LINK HREF="AI-Alife-HOWTO-4.html" REL=next>
<LINK HREF="AI-Alife-HOWTO-2.html" REL=previous>
<LINK HREF="AI-Alife-HOWTO.html#toc3" REL=contents>
</HEAD>
<BODY>
<A HREF="AI-Alife-HOWTO-4.html">Next</A>
<A HREF="AI-Alife-HOWTO-2.html">Previous</A>
<A HREF="AI-Alife-HOWTO.html#toc3">Contents</A>
<HR>
<H2><A NAME="Connectionism"></A> <A NAME="s3">3.</A> <A HREF="AI-Alife-HOWTO.html#toc3">Connectionism</A></H2>
<P>Connectionism is a technical term for a group of related
techniques, including artificial neural networks, semantic
networks and a few other similar ideas. My present focus is on
neural networks (though I am looking for resources on the other
techniques). Neural networks are programs designed to simulate
the workings of the brain. They consist of a network of small
mathematical nodes, which work together to form patterns of
information. They have tremendous potential and currently seem
to be having a great deal of success with image processing and
robot control.</P>
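<P>To make the idea concrete, here is a minimal plain-Python sketch
(not taken from any of the packages below) of such a node: a weighted
sum of inputs passed through a squashing function, with two nodes
chained together to form a tiny "network".</P>

```python
import math

def neuron(inputs, weights, bias):
    """A single artificial neuron: weighted sum of inputs + sigmoid squashing."""
    total = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-total))  # sigmoid keeps output in (0, 1)

# Two neurons wired in sequence form a minimal two-layer "network".
hidden = neuron([0.5, 0.9], [1.2, -0.8], bias=0.1)
output = neuron([hidden], [2.0], bias=-1.0)
```

<P>Real networks differ only in scale and in how the weights are
learned; the libraries below automate both.</P>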
<H2><A NAME="ss3.1">3.1</A> <A HREF="AI-Alife-HOWTO.html#toc3.1">Connectionist class/code libraries</A>
</H2>

<P>These are libraries of code or classes for use in programming within
the Connectionist field. They are not meant as stand-alone
applications, but rather as tools for building your own applications.</P>
<P>
<DL>
<P>
<A NAME="Baysian Modeling"></A> </P>
<DT><B>Software for Flexible Bayesian Modeling</B><DD><P>
<UL>
<LI>Web site:
<A HREF="http://www.cs.utoronto.ca/~radford/fbm.software.html">www.cs.utoronto.ca/~radford/fbm.software.html</A></LI>
</UL>
</P>
<P>This software implements flexible Bayesian models for regression
and classification applications that are based on multilayer
perceptron neural networks or on Gaussian processes. The
implementation uses Markov chain Monte Carlo methods. Software
modules that support Markov chain sampling are included in the
distribution, and may be useful in other applications.</P>
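<P>The core idea of Markov chain Monte Carlo can be sketched in a few
lines of plain Python (a toy Metropolis sampler, not FBM's actual
code): propose random moves and accept them with a probability based
on the posterior ratio, so the chain's visits approximate the
posterior distribution.</P>

```python
import math
import random

random.seed(0)

# Toy problem: posterior over the mean mu of Gaussian data with known
# sigma=1 and a flat prior, so the log-posterior is just the log-likelihood.
data = [1.8, 2.1, 2.4, 1.9, 2.2]

def log_post(mu):
    return -0.5 * sum((x - mu) ** 2 for x in data)

# Metropolis sampling: random-walk proposals, accepted with prob min(1, ratio).
mu, samples = 0.0, []
for _ in range(20000):
    prop = mu + random.gauss(0.0, 0.5)
    diff = log_post(prop) - log_post(mu)
    if random.random() < math.exp(min(0.0, diff)):
        mu = prop
    samples.append(mu)

# Discard burn-in, then average; this approximates the posterior mean.
posterior_mean = sum(samples[5000:]) / len(samples[5000:])
```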
<P>
<A NAME="BELIEF"></A> </P>
<DT><B>BELIEF</B><DD><P>
<UL>
<LI>Web site:
<A HREF="http://www.cs.cmu.edu/afs/cs/project/ai-repository/ai/areas/reasonng/probabl/belief/">www.cs.cmu.edu/afs/cs/project/ai-repository/ai/areas/reasonng/probabl/belief/</A></LI>
</UL>
</P>

<P>BELIEF is a Common Lisp implementation of the Dempster and Kong
fusion and propagation algorithm for Graphical Belief Function
Models and the Lauritzen and Spiegelhalter algorithm for
Graphical Probabilistic Models. It includes code for
manipulating graphical belief models such as Bayes Nets and
Relevance Diagrams (a subset of Influence Diagrams) using both
belief functions and probabilities as basic representations of
uncertainty. It uses the Shenoy and Shafer version of the
algorithm, so one of its unique features is that it supports
both probability distributions and belief functions. It also
has limited support for second-order models (probability
distributions on parameters).</P>

<P>
<A NAME="bpnn.py"></A> </P>
<DT><B>bpnn.py</B><DD><P>
<UL>
<LI>Web site:
<A HREF="http://arctrix.com/nas/python/bpnn.py">http://arctrix.com/nas/python/bpnn.py</A></LI>
</UL>
</P>
<P>A simple back-propagation ANN in Python.</P>
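<P>The algorithm such a script implements can be sketched independently
(this toy example is not the bpnn.py code itself): a 2-2-1 network
trained on XOR, where each weight is nudged along the error gradient
propagated back from the output layer.</P>

```python
import math
import random

random.seed(1)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# A 2-2-1 network: input->hidden weights and biases, hidden->output weights.
w_ih = [[random.uniform(-1, 1) for _ in range(2)] for _ in range(2)]
b_h = [random.uniform(-1, 1) for _ in range(2)]
w_ho = [random.uniform(-1, 1) for _ in range(2)]
b_o = random.uniform(-1, 1)

def forward(x):
    h = [sigmoid(w_ih[j][0] * x[0] + w_ih[j][1] * x[1] + b_h[j]) for j in range(2)]
    o = sigmoid(w_ho[0] * h[0] + w_ho[1] * h[1] + b_o)
    return h, o

data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 0)]  # XOR

def total_error():
    return sum((t - forward(x)[1]) ** 2 for x, t in data)

error_before = total_error()
lr = 0.5
for _ in range(10000):
    for x, t in data:
        h, o = forward(x)
        # Output delta, then hidden deltas: the "back-propagated" error.
        d_o = (t - o) * o * (1 - o)
        d_h = [d_o * w_ho[j] * h[j] * (1 - h[j]) for j in range(2)]
        # Gradient-descent weight updates.
        for j in range(2):
            w_ho[j] += lr * d_o * h[j]
            w_ih[j][0] += lr * d_h[j] * x[0]
            w_ih[j][1] += lr * d_h[j] * x[1]
            b_h[j] += lr * d_h[j]
        b_o += lr * d_o
error_after = total_error()
```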
<P>
<A NAME="brain"></A> </P>
<DT><B>brain</B><DD><P>
<UL>
<LI>Web site:
<A HREF="http://harthur.github.com/brain/">http://harthur.github.com/brain/</A></LI>
</UL>
</P>
<P>Brain is a lightweight JavaScript library for neural networks. It
implements the standard feedforward multi-layer perceptron neural
network trained with backpropagation.</P>

<P>
<A NAME="brain-simulator"></A> </P>
<DT><B>Brian simulator</B><DD><P>
<UL>
<LI>Web site:
<A HREF="http://www.briansimulator.org/">http://www.briansimulator.org/</A></LI>
</UL>
</P>
<P>Brian is a clock-driven simulator for spiking neural networks. It is
designed with an emphasis on flexibility and extensibility, for rapid
development and refinement of neural models. Neuron models are
specified by sets of differential equations, threshold conditions and
reset conditions (given as strings). The focus is primarily on
networks of single-compartment neuron models (e.g. leaky
integrate-and-fire or Hodgkin-Huxley type neurons). It is written in
Python and is easy to learn and use, highly flexible and easily
extensible. Features include:</P>
<P>
<UL>
<LI>a system for specifying quantities with physical dimensions</LI>
<LI>exact numerical integration for linear differential equations</LI>
<LI>Euler, Runge-Kutta and exponential Euler integration for
nonlinear differential equations</LI>
<LI>synaptic connections with delays</LI>
<LI>short-term and long-term plasticity (spike-timing dependent
plasticity)</LI>
<LI>a library of standard model components, including
integrate-and-fire equations, synapses and ionic currents</LI>
<LI>a toolbox for automatically fitting spiking neuron models to
electrophysiological recordings</LI>
</UL>
</P>
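<P>The clock-driven approach can be illustrated with a plain-Python
sketch (independent of Brian's actual API): Euler integration of a
single leaky integrate-and-fire neuron on a fixed time step, with a
spike threshold and reset.</P>

```python
# Leaky integrate-and-fire neuron, Euler-integrated on a fixed clock:
#   dv/dt = (-(v - v_rest) + R*I) / tau,  spike and reset when v >= v_thresh.
# All constants below are illustrative values.
v_rest, v_thresh, v_reset = -70.0, -50.0, -70.0   # membrane potentials, mV
tau, R, I = 20.0, 10.0, 2.5                        # ms, MOhm, nA
dt, t_end = 0.1, 200.0                             # time step and run time, ms

v, spikes = v_rest, []
steps = int(t_end / dt)
for step in range(steps):
    dv = (-(v - v_rest) + R * I) / tau
    v += dv * dt                     # explicit Euler update
    if v >= v_thresh:
        spikes.append(step * dt)     # record spike time in ms
        v = v_reset                  # reset condition

spike_count = len(spikes)
```

<P>Brian generalizes exactly this loop to whole networks, with the
equations, thresholds and resets supplied as strings.</P>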
<P>
<A NAME="CNNs"></A> </P>
<DT><B>CNNs</B><DD><P>
<UL>
<LI>Web site:
<A HREF="http://www.isiweb.ee.ethz.ch/haenggi/CNNsim.html">http://www.isiweb.ee.ethz.ch/haenggi/CNNsim.html</A></LI>
<LI>Newer Version:
<A HREF="http://www.isiweb.ee.ethz.ch/haenggi/CNNsim_adv_manual.html">http://www.isiweb.ee.ethz.ch/haenggi/CNNsim_adv_manual.html</A></LI>
<LI>Old Page:
<A HREF="http://www.ce.unipr.it/research/pardis/CNN/cnn.html">http://www.ce.unipr.it/research/pardis/CNN/cnn.html</A></LI>
</UL>
</P>
<P>Cellular Neural Networks (CNN) are a massively parallel computing
paradigm defined in discrete N-dimensional spaces. This is a
visualizing CNN simulator that lets you track the way in which the
state trajectories evolve, giving an insight into the behavior of CNN
dynamics. This may be useful for forming an idea of how a CNN 'works',
especially for people who are not experienced in CNN theory.</P>
<P>
<A NAME="CONICAL"></A> </P>
<DT><B>CONICAL</B><DD><P>
<UL>
<LI>Web site:
<A HREF="http://strout.net/conical/">strout.net/conical/</A></LI>
</UL>
</P>
<P>CONICAL is a C++ class library for building simulations common
in computational neuroscience. Currently its focus is on
compartmental modeling, with capabilities similar to GENESIS and
NEURON. A model neuron is built out of compartments, usually
with a cylindrical shape. When small enough, these open-ended
cylinders can approximate nearly any geometry. Future classes
may support reaction-diffusion kinetics and more. A key feature
of CONICAL is its cross-platform compatibility; it has been
fully co-developed and tested under Unix, DOS, and Mac OS.</P>
<P>
<A NAME="Encog"></A> </P>
<DT><B>Encog</B><DD><P>
<UL>
<LI>Web site:
<A HREF="http://www.heatonresearch.com/">http://www.heatonresearch.com/</A></LI>
</UL>
</P>
<P>Encog is an advanced neural network and machine learning framework.
Encog contains classes to create a wide variety of networks, as well as
support classes to normalize and process data for these neural
networks. Encog trains using multithreaded resilient propagation, and
can also make use of a GPU to further reduce processing time. A
GUI-based workbench is also provided to help model and train neural
networks. Encog has been in active development since 2008 and is
available for Java, .Net and Silverlight.</P>
<P>
<A NAME="FANN"></A> </P>
<DT><B>FANN</B><DD><P>
<UL>
<LI>Web site:
<A HREF="http://leenissen.dk/fann/">http://leenissen.dk/fann/</A></LI>
</UL>
</P>
<P>Fast Artificial Neural Network Library is a free open source neural
network library, which implements multilayer artificial neural networks
in C with support for both fully connected and sparsely connected
networks. Cross-platform execution in both fixed and floating point is
supported. It includes a framework for easy handling of training data
sets. It is easy to use, versatile, well documented, and fast. PHP,
C++, .NET, Ada, Python, Delphi, Octave, Ruby, Prolog, Pure Data and
Mathematica bindings are available. A reference manual accompanies the
library with examples and recommendations on how to use it, and a
graphical user interface is also available for the library.</P>
<P>
<A NAME="ffnet"></A> </P>
<DT><B>ffnet</B><DD><P>
<UL>
<LI>Web site:
<A HREF="http://ffnet.sourceforge.net/">http://ffnet.sourceforge.net/</A></LI>
</UL>
</P>
<P>ffnet is a fast and easy-to-use feed-forward neural network training
solution for Python. Many nice features are implemented: arbitrary
network connectivity, automatic data normalization, very efficient
training tools, and network export to Fortran code.</P>
<P>
<A NAME="Joone"></A> </P>
<DT><B>Joone</B><DD><P>
<UL>
<LI>Web site:
<A HREF="http://sourceforge.net/projects/joone/">http://sourceforge.net/projects/joone/</A></LI>
</UL>
</P>
<P>Joone is a neural net framework to create, train and test neural nets.
The aim is to create a distributed environment based on JavaSpaces, for
both enthusiasts and professional users, built on the newest Java
technologies. Joone is composed of a central engine that is the
fulcrum of all applications that already exist or will be developed.
The neural engine is modular, scalable, multitasking and extensible.
Everyone can write new modules to implement new algorithms or new
architectures, starting from the simple components distributed with the
core engine. The main idea is to create a basis that promotes a
multitude of AI applications revolving around the core framework.</P>
<P>
<A NAME="Matrix Class"></A> </P>
<DT><B>Matrix Class</B><DD><P>
<UL>
<LI>FTP site:
<A HREF="ftp://ftp.cs.ucla.edu/pub/">ftp.cs.ucla.edu/pub/</A></LI>
</UL>
</P>
<P>A simple, fast, efficient C++ Matrix class designed for
scientists and engineers. The Matrix class is well suited for
applications with complex math algorithms. As a demonstration
of the Matrix class, it was used to implement the backward error
propagation algorithm for a multi-layer feed-forward artificial
neural network.</P>
<P>
<A NAME="NEAT"></A> </P>
<DT><B>NEAT</B><DD><P>
<UL>
<LI>Web site:
<A HREF="http://nn.cs.utexas.edu/project-view.php?RECORD_KEY(Projects)=ProjID&ProjID(Projects)=14">http://nn.cs.utexas.edu/project-view.php?RECORD_KEY(Projects)=ProjID&ProjID(Projects)=14</A></LI>
<LI>Web site:
<A HREF="http://www.cs.ucf.edu/~kstanley/neat.html">http://www.cs.ucf.edu/~kstanley/neat.html</A></LI>
</UL>
</P>
<P>Many neuroevolution methods evolve fixed-topology networks. Some
methods evolve topologies in addition to weights, but these usually
have a bound on the complexity of networks that can be evolved and
begin evolution with random topologies. This project is based on a
neuroevolution method called NeuroEvolution of Augmenting Topologies
(NEAT) that can evolve networks of unbounded complexity from a minimal
starting point.</P>
<P>The research has the broader goal of showing that evolving topologies is
necessary to achieve three major goals of neuroevolution: (1) Continual
coevolution: Successful competitive coevolution can use the evolution
of topologies to continuously elaborate strategies. (2) Evolution of
Adaptive Networks: The evolution of topologies allows neuroevolution to
evolve adaptive networks with plastic synapses by designating which
connections should be adaptive and in what ways. (3) Combining Expert
Networks: Separate expert neural networks can be fused through the
evolution of connecting neurons between them.</P>
<P>
<A NAME="NeuroLab"></A> </P>
<DT><B>NeuroLab</B><DD><P>
<UL>
<LI>Web site:
<A HREF="http://packages.python.org/neurolab/">http://packages.python.org/neurolab/</A></LI>
</UL>
</P>
<P>NeuroLab is a library of basic neural network algorithms for Python,
with flexible network configurations and learning algorithms. To
simplify use of the library, its interface is similar to MATLAB's
Neural Network Toolbox (NNT). The library is based on the numpy
package (http://numpy.scipy.org); some learning algorithms use
scipy.optimize (http://scipy.org).</P>
<P>
<A NAME="NuPIC"></A> </P>
<DT><B>NuPIC</B><DD><P>
<UL>
<LI>Web site:
<A HREF="http://www.numenta.org/">http://www.numenta.org/</A></LI>
<LI>Web site:
<A HREF="https://github.com/numenta/nupic">https://github.com/numenta/nupic</A></LI>
</UL>
</P>
<P>The Numenta Platform for Intelligent Computing (NuPIC) is built around
cortical learning algorithms, a new variation of HTM (Hierarchical
Temporal Memory) networks, based on Jeff Hawkins's ideas as laid out
in his book On Intelligence. NuPIC consists of the Numenta Tools
Framework and the Numenta Runtime Engine.</P>
<P>
<A NAME="Pulcinella"></A> </P>
<DT><B>Pulcinella</B><DD><P>
<UL>
<LI>Web site:
<A HREF="http://iridia.ulb.ac.be/pulcinella/">iridia.ulb.ac.be/pulcinella/</A></LI>
</UL>
</P>
<P>Pulcinella is written in Common Lisp, and appears as a library of
Lisp functions for creating, modifying and evaluating valuation
systems. Alternatively, the user can choose to interact with
Pulcinella via a graphical interface (only available in Allegro
CL). Pulcinella provides primitives to build and evaluate
uncertainty models according to several uncertainty calculi,
including probability theory, the possibility theory of Zadeh,
Dubois and Prade, and Dempster-Shafer's theory of belief
functions. A User's Manual is available on request.</P>
<P>
<A NAME="scnANNlib"></A> </P>
<DT><B>scnANNlib</B><DD><P>
<UL>
<LI>Web site:
<A HREF="http://www.sentinelchicken.org/projects/scnANNlib/">www.sentinelchicken.org/projects/scnANNlib/</A></LI>
</UL>
</P>
<P>SCN Artificial Neural Network Library provides a programmer with a
simple object-oriented API for constructing ANNs. Currently, the
library supports non-recursive networks with an arbitrary number of
layers, each with an arbitrary number of nodes. Facilities exist for
training with momentum, and there are plans to gracefully extend the
functionality of the library in later releases.</P>
<P>
<A NAME="UTCS"></A> </P>
<DT><B>UTCS Neural Nets Research Group Software</B><DD><P>
<UL>
<LI>Web site:
<A HREF="http://nn.cs.utexas.edu/soft-list.php">http://nn.cs.utexas.edu/soft-list.php</A></LI>
</UL>
</P>
<P>A bit different from the other entries, this is a reference to a
collection of software rather than one application. It was all
developed by the
<A HREF="http://nn.cs.utexas.edu/">UTCS Neural Net Research Group</A>. Here's a summary of some of the packages
available:</P>
<P>
<UL>
<LI>Natural Language Processing
<UL>
<LI>MIR - Tcl/Tk-based rapid prototyping for sentence
processing</LI>
<LI>SPEC - Parsing complex sentences</LI>
<LI>DISCERN - Processing script-based stories, including
<UL>
<LI>PROC - Parsing, generation, question answering</LI>
<LI>HFM - Episodic memory organization</LI>
<LI>DISLEX - Lexical processing</LI>
<LI>DISCERN - The full integrated model</LI>
</UL>
</LI>
<LI>FGREPNET - Learning distributed representations</LI>
</UL>
</LI>
<LI>Self-Organization
<UL>
<LI>LISSOM - Maps with self-organizing lateral connections</LI>
<LI>FM - Generic Self-Organizing Maps</LI>
</UL>
</LI>
<LI>Neuroevolution
<UL>
<LI>Enforced Sub-Populations (ESP) for sequential decision
tasks
<UL>
<LI>Non-Markov Double Pole Balancing</LI>
</UL>
</LI>
<LI>Symbiotic, Adaptive NeuroEvolution (SANE; predecessor of
ESP)
<UL>
<LI>JavaSANE - Java software package for applying SANE to
new tasks</LI>
<LI>SANE-C - C version, predecessor of JavaSANE</LI>
<LI>Pole Balancing - Neuron-level SANE on the Pole
Balancing task</LI>
</UL>
</LI>
<LI>NeuroEvolution of Augmenting Topologies (NEAT) -
software for evolving neural networks using structure</LI>
</UL>
</LI>
</UL>
</P>
<P>
<A NAME="C++ ANNs"></A> </P>
<DT><B>Various (C++) Neural Networks</B><DD><P>
<UL>
<LI>Web site:
<A HREF="http://www.dontveter.com/nnsoft/nnsoft.html">www.dontveter.com/nnsoft/nnsoft.html</A></LI>
</UL>
</P>
<P>Example neural net code from the book
<A HREF="http://www.dontveter.com/basisofai/basisofai.html">The Pattern Recognition Basics of AI</A>.
These are simple example implementations (in C++) of various
neural nets. They work well as a starting point for simple
experimentation and for learning what the code behind the
simulators is like. The types of networks available on this site
are:</P>
<P>
<UL>
<LI>The Backprop Package</LI>
<LI>The Nearest Neighbor Algorithms</LI>
<LI>The Interactive Activation Algorithm</LI>
<LI>The Hopfield and Boltzmann Machine Algorithms</LI>
<LI>The Linear Pattern Classifier</LI>
<LI>ART I</LI>
<LI>Bi-Directional Associative Memory</LI>
<LI>The Feedforward Counter-Propagation Network</LI>
</UL>
</P>
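<P>As a taste of what one of these classic algorithms looks like, here
is a small plain-Python sketch (not the book's C++ code) of
Hopfield-style associative recall: store a pattern with Hebbian
outer-product weights, then recover it from a corrupted cue.</P>

```python
# Hopfield-style associative memory: store one +/-1 pattern via the
# Hebbian outer product, then recall it from a corrupted version.
pattern = [1, -1, 1, 1, -1, -1, 1, -1]
n = len(pattern)

# Hebbian weights: w[i][j] = p_i * p_j, with no self-connections.
w = [[pattern[i] * pattern[j] if i != j else 0 for j in range(n)]
     for i in range(n)]

def recall(state, steps=5):
    # Synchronous updates: each unit takes the sign of its weighted input.
    for _ in range(steps):
        state = [1 if sum(w[i][j] * state[j] for j in range(n)) >= 0 else -1
                 for i in range(n)]
    return state

corrupted = list(pattern)
corrupted[0] = -corrupted[0]   # flip two bits of the stored pattern
corrupted[3] = -corrupted[3]
recalled = recall(corrupted)   # converges back to the stored pattern
```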
</DL>
</P>

<H2><A NAME="ss3.2">3.2</A> <A HREF="AI-Alife-HOWTO.html#toc3.2">Connectionist software kits/applications</A>
</H2>
<P>These are various applications, software kits, etc. meant for research
in the field of Connectionism. Their ease of use will vary, as they
were designed to meet some particular research interest more than to
be easy-to-use commercial packages.
<DL>

<P>
<A NAME="Aspirin-MIGRANES"></A> </P>
<DT><B>Aspirin - MIGRAINES</B><DD><P>(am6.tar.Z on ftp site)
<UL>
<LI>FTP site:
<A HREF="ftp://sunsite.unc.edu/pub/academic/computer-science/neural-networks/programs/Aspirin/">sunsite.unc.edu/pub/academic/computer-science/neural-networks/programs/Aspirin/</A></LI>
</UL>
</P>

<P>This software is for creating and evaluating feed-forward networks
such as those used with the backpropagation learning algorithm. It is
aimed both at the expert programmer/neural network researcher who may
wish to tailor significant portions of the system to his or her
precise needs, and at casual users who wish to use the system with an
absolute minimum of effort.</P>
<P>
<A NAME="DDLab"></A> </P>
<DT><B>DDLab</B><DD><P>
<UL>
<LI>Web site:
<A HREF="http://www.ddlab.com/">http://www.ddlab.com/</A></LI>
</UL>
</P>
<P>DDLab is an interactive graphics program for research into the
dynamics of finite binary networks, relevant to the study of
complexity, emergent phenomena, neural networks, and aspects of
theoretical biology such as gene regulatory networks. A network
can be set up with any architecture between regular CA (1d or
2d) and "random Boolean networks" (networks with arbitrary
connections and heterogeneous rules). The network may also have
heterogeneous neighborhood sizes.</P>
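<P>A random Boolean network of the kind DDLab explores can be sketched
in a few lines of plain Python (an illustration, not DDLab code): each
node gets random wiring and a random rule table, and because the state
space of a finite network is finite, the deterministic dynamics must
eventually fall onto a repeating attractor cycle.</P>

```python
import random

random.seed(42)

N, K = 8, 3  # 8 binary nodes, each reading K = 3 (possibly repeated) inputs

# Random wiring and a random Boolean rule table (truth table) per node.
inputs = [[random.randrange(N) for _ in range(K)] for _ in range(N)]
rules = [[random.randrange(2) for _ in range(2 ** K)] for _ in range(N)]

def step(state):
    # Each node looks up its next value from its rule table.
    nxt = []
    for i in range(N):
        idx = 0
        for j in inputs[i]:
            idx = (idx << 1) | state[j]
        nxt.append(rules[i][idx])
    return tuple(nxt)

# Iterate from a random start until a state repeats, closing the attractor.
state = tuple(random.randrange(2) for _ in range(N))
seen = {}
t = 0
while state not in seen:
    seen[state] = t
    state = step(state)
    t += 1
cycle_length = t - seen[state]
```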
<P>
<A NAME="Emergent"></A> </P>
<DT><B>Emergent</B><DD><P>
<UL>
<LI>Web site:
<A HREF="http://grey.colorado.edu/emergent/index.php/Main_Page">http://grey.colorado.edu/emergent/index.php/Main_Page</A></LI>
</UL>
</P>
<P>Note: this is a descendant of
<A HREF="#PDP++">PDP++</A>
</P>
<P>emergent is a comprehensive, full-featured neural network simulator
that allows for the creation and analysis of complex, sophisticated
models of the brain in the world. With an emphasis on qualitative
analysis and teaching, it also supports the workflow of professional
neural network researchers. The GUI environment allows users to quickly
construct basic networks, modify the input/output patterns,
automatically generate the basic programs required to train and test
the network, and easily utilize several data processing and network
analysis tools. In addition to the basic preset network train and test
programs, there is a high-level drag-and-drop programming interface,
built on top of a scripting language with full introspective access to
all aspects of networks and the software itself, which lets one write
programs that seamlessly weave together the training of a network and
the evolution of its environment without ever typing out a line of
code. Networks and all of their state variables are visually inspected
in 3D, allowing for a quick "visual regression" of network dynamics and
robot behavior.</P>
<P>
<A NAME="GENESIS"></A> </P>
<DT><B>GENESIS</B><DD><P>
<UL>
<LI>Web site:
<A HREF="http://genesis-sim.org/">http://genesis-sim.org/</A></LI>
</UL>
</P>
<P>GENESIS (short for GEneral NEural SImulation System) is a
general purpose simulation platform which was developed to
support the simulation of neural systems ranging from complex
models of single neurons to simulations of large networks made
up of more abstract neuronal components. GENESIS has provided
the basis for laboratory courses in neural simulation at both
Caltech and the Marine Biological Laboratory in Woods Hole, MA,
as well as several other institutions. Most current GENESIS
applications involve realistic simulations of biological neural
systems. Although the software can also model more abstract
networks, other simulators are more suitable for backpropagation
and similar connectionist modeling.</P>
<P>
<A NAME="JavaBayes"></A> </P>
<DT><B>JavaBayes</B><DD><P>
<UL>
<LI>Web site:
<A HREF="http://www.cs.cmu.edu/~javabayes/">http://www.cs.cmu.edu/~javabayes/</A></LI>
</UL>
</P>
<P>The JavaBayes system is a set of tools, containing a
graphical editor, a core inference engine and a parser.
JavaBayes can produce:
<UL>
<LI>the marginal distribution for any variable in a network.</LI>
<LI>the expectations for univariate functions (for example,
expected value for variables).</LI>
<LI>configurations with maximum a posteriori probability.</LI>
<LI>configurations with maximum a posteriori expectation for
univariate functions.</LI>
</UL>
</P>
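<P>The first of these operations, computing a marginal, can be
illustrated with a plain-Python sketch of a two-node Bayes net (this
shows the underlying arithmetic, not JavaBayes itself): marginalize
Rain out of P(Rain) * P(WetGrass | Rain).</P>

```python
# Tiny Bayes net: Rain -> WetGrass. Marginalize out Rain by enumeration:
#   P(WetGrass = w) = sum over r of P(Rain = r) * P(WetGrass = w | Rain = r)
p_rain = {True: 0.2, False: 0.8}
p_wet_given_rain = {
    (True, True): 0.9, (False, True): 0.1,    # keyed (wet, rain)
    (True, False): 0.2, (False, False): 0.8,
}

p_wet = {
    w: sum(p_rain[r] * p_wet_given_rain[(w, r)] for r in (True, False))
    for w in (True, False)
}
# p_wet[True] = 0.2*0.9 + 0.8*0.2 = 0.34
```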
<P>
<A NAME="Jbpe"></A> </P>
<DT><B>Jbpe</B><DD><P>
<UL>
<LI>Web site:
<A HREF="http://cs.felk.cvut.cz/~koutnij/studium/jbpe.html">cs.felk.cvut.cz/~koutnij/studium/jbpe.html</A></LI>
</UL>
</P>
<P>Jbpe is a back-propagation neural network editor/simulator.</P>
<P>Features
<UL>
<LI>Standard back-propagation network creation.</LI>
<LI>Saving the network as a text file, which can be edited and loaded
back.</LI>
<LI>Saving/loading the network as a binary file.</LI>
<LI>Learning from a text file (with structure specified below); the
number of learning periods / desired network energy can be
specified as a criterion.</LI>
<LI>Network recall.</LI>
</UL>
</P>
<P>
<A NAME="Nengo"></A> </P>
<DT><B>Nengo</B><DD><P>
<UL>
<LI>Web site:
<A HREF="http://www.nengo.ca/">http://www.nengo.ca/</A></LI>
</UL>
</P>
<P>Nengo (the Nengo Neural Simulator) is a graphical and scripting based
software package for simulating large-scale neural systems.</P>
<P>To use it, you define groups of neurons in terms of what they
represent, and then form connections between neural groups in terms of
what computation should be performed on those representations. Nengo
then uses the Neural Engineering Framework (NEF) to solve for the
appropriate synaptic connection weights to achieve this desired
computation. Nengo also supports various kinds of learning, and helps
make detailed spiking neuron models that implement complex high-level
cognitive algorithms.</P>
<P>Among other things, Nengo has been used to implement motor control,
visual attention, serial recall, action selection, working memory,
attractor networks, inductive reasoning, path integration, and planning
with problem solving.</P>
<P>The Spaun neural simulator
<A HREF="http://models.nengo.ca/spaun">http://models.nengo.ca/spaun</A>
is implemented in Nengo and its source is available as well.</P>
<P>
<A NAME="NN Generator"></A> </P>
<DT><B>Neural Network Generator</B><DD><P>
<UL>
<LI>FTP site:
<A HREF="ftp://ftp.idsia.ch/pub/rafal/">ftp.idsia.ch/pub/rafal</A></LI>
</UL>
</P>
<P>The Neural Network Generator is a genetic algorithm for the
topological optimization of feedforward neural networks. It
implements the Semantic Changing Genetic Algorithm and the
Unit-Cluster Model. The Semantic Changing Genetic Algorithm is
an extended genetic algorithm that allows fast dynamic
adaptation of the genetic coding through population
analysis. The Unit-Cluster Model is an approach to the
construction of modular feedforward networks with a "backbone"
structure.</P>
<P>NOTE: Compiling this on Linux requires one change in the Makefiles:
change '-ltermlib' to '-ltermcap'.</P>
<P>
<A NAME="NEURON"></A> </P>
<DT><B>NEURON</B><DD><P>
<UL>
<LI>Web site:
<A HREF="http://www.neuron.yale.edu/">www.neuron.yale.edu/</A></LI>
</UL>
</P>
<P>NEURON is an extensible nerve modeling and simulation
program. It allows you to create complex nerve models by
connecting multiple one-dimensional sections together to form
arbitrary cell morphologies, and allows you to insert multiple
membrane properties into these sections (including channels,
synapses, ionic concentrations, and counters). The interface was
designed to present the neural modeler with an intuitive
environment and hide the details of the numerical methods used
in the simulation.</P>
<P>
<A NAME="Neuroph"></A> </P>
<DT><B>Neuroph</B><DD><P>
<UL>
<LI>Web site:
<A HREF="http://neuroph.sourceforge.net/">http://neuroph.sourceforge.net/</A></LI>
</UL>
</P>
<P>Neuroph is a lightweight Java neural network framework for developing
common neural network architectures. It contains a well-designed, open
source Java library with a small number of basic classes that
correspond to basic NN concepts, and a nice GUI neural network editor
for quickly creating Java neural network components.</P>
<P>
<A NAME="PDP++"></A> </P>
<DT><B>PDP++</B><DD><P>
<UL>
<LI>Web site:
<A HREF="http://archive.cnbc.cmu.edu/Resources/PDP++/PDP++.html">http://archive.cnbc.cmu.edu/Resources/PDP++/PDP++.html</A></LI>
<LI>FTP mirror (US):
<A HREF="ftp://grey.colorado.edu/pub/oreilly/pdp++/">ftp://grey.colorado.edu/pub/oreilly/pdp++/</A></LI>
</UL>
</P>
<P>NOTE: Renamed to
<A HREF="#Emergent">Emergent</A>
</P>
<P>As the field of Connectionist modeling has grown, so has the need
for a comprehensive simulation environment for the development and
testing of Connectionist models. Our goal in developing PDP++ has been
to integrate several powerful software development and user interface
tools into a general purpose simulation environment that is both user
friendly and user extensible. The simulator is built in the C++
programming language, and incorporates a state of the art script
interpreter with the full expressive power of C++. The graphical user
interface is built with the Interviews toolkit, and allows full access
to the data structures and processing modules out of which the
simulator is built. We have constructed several useful graphical
modules for easy interaction with the structure and the contents of
neural networks, and we've made it possible to change and adapt many
things. At the programming level, we have set things up in such a way
as to make user extensions as painless as possible. The programmer
creates new C++ objects, which might be new kinds of units or new
kinds of processes; once compiled and linked into the simulator, these
new objects can then be accessed and used like any other.</P>
<P>
<A NAME="RNS"></A> </P>
<DT><B>RNS</B><DD><P>
<UL>
<LI>Web site:
<A HREF="http://www.cs.cmu.edu/afs/cs/project/ai-repository/ai/areas/neural/systems/rns/">www.cs.cmu.edu/afs/cs/project/ai-repository/ai/areas/neural/systems/rns/</A></LI>
</UL>
</P>
<P>RNS (Recurrent Network Simulator) is a simulator for recurrent
neural networks. Regular neural networks are also supported. The
program uses a derivative of the back-propagation algorithm, but
also includes other (not that well tested) algorithms.</P>
<P>Features include
<UL>
<LI>freely choosable connections, with no restrictions besides memory
or CPU constraints</LI>
<LI>delayed links for recurrent networks</LI>
<LI>fixed values or thresholds can be specified for weights</LI>
<LI>(recurrent) back-propagation, Hebb, differential Hebb,
simulated annealing and more</LI>
<LI>patterns can be specified with bits, floats, characters or
numbers, and random bit patterns with chosen Hamming
distances can be generated for you</LI>
<LI>user definable error functions</LI>
<LI>output results can be used without modification as input</LI>
</UL>
</P>
<P>
<A NAME="Python Smantic Nets"></A> </P>
<DT><B>Semantic Networks in Python</B><DD><P>
<UL>
<LI>Web site:
<A HREF="http://strout.net/info/coding/python/ai/index.html">strout.net/info/coding/python/ai/index.html</A></LI>
</UL>
</P>
<P>The semnet.py module defines several simple classes for
building and using semantic networks. A semantic network is a
way of representing knowledge, and it enables the program to
do simple reasoning with very little effort on the part of the
programmer.</P>
<P>The following classes are defined:
<UL>
<LI><B>Entity</B>: This class represents a noun; it is
something which can be related to other things, and about
which you can store facts.</LI>
<LI><B>Relation</B>: A Relation is a type of relationship
which may exist between two entities. One special relation,
"IS_A", is predefined because it has special meaning (a sort
of logical inheritance).</LI>
<LI><B>Fact</B>: A Fact is an assertion that a relationship
exists between two entities.</LI>
</UL>
</P>
<P>With these three object types, you can very quickly define knowledge
about a set of objects, and query them for logical conclusions.</P>
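<P>A minimal plain-Python sketch of the same idea (the functions here
are illustrative, not semnet.py's actual API): facts are
subject-relation-object triples, and queries follow "IS_A" links so
facts about a class apply to its members.</P>

```python
# Minimal semantic network: facts are (subject, relation, object) triples,
# and IS_A gives logical inheritance when answering queries.
facts = set()

def tell(subj, rel, obj):
    facts.add((subj, rel, obj))

def ask(subj, rel, obj):
    # True if the fact is stored directly, or inherited via an IS_A parent.
    if (subj, rel, obj) in facts:
        return True
    parents = [o for (s, r, o) in facts if s == subj and r == "IS_A"]
    return any(ask(p, rel, obj) for p in parents)

tell("canary", "IS_A", "bird")
tell("bird", "IS_A", "animal")
tell("bird", "has", "wings")
tell("animal", "can", "breathe")
```

<P>A query like <CODE>ask("canary", "can", "breathe")</CODE> succeeds by
walking up two IS_A links, which is exactly the "simple reasoning with
very little effort" described above.</P>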
<P>
<A NAME="SNNS"></A> </P>
<DT><B>SNNS</B><DD><P>
<UL>
<LI>Web site:
<A HREF="http://www-ra.informatik.uni-tuebingen.de/SNNS/">http://www-ra.informatik.uni-tuebingen.de/SNNS/</A></LI>
</UL>
</P>
<P>Stuttgart Neural Net Simulator (version 4.1). An awesome neural
net simulator, better than any commercial simulator I've seen. The
simulator kernel is written in C (it's fast!). It supports over 20
different network architectures, has 2D and 3D X-based graphical
representations, the 2D GUI has an integrated network editor, and it
can generate a separate NN program in C. SNNS is very powerful, though
a bit difficult to learn at first. To help with this it comes with
example networks and tutorials for many of the architectures.
ENZO, a supplementary system, allows you to evolve your networks with
genetic algorithms.</P>
<P>
<A NAME="TOOLDIAG"></A> </P>
<DT><B>TOOLDIAG</B><DD><P>
<UL>
<LI>Web site:
<A HREF="http://www.inf.ufes.br/~thomas/home/soft.html">www.inf.ufes.br/~thomas/home/soft.html</A></LI>
<LI>Alt site:
<A HREF="http://www.cs.cmu.edu/afs/cs/project/ai-repository/ai/areas/neural/systems/tooldiag/0.html">http://www.cs.cmu.edu/afs/cs/project/ai-repository/ai/areas/neural/systems/tooldiag/0.html</A></LI>
</UL>
</P>
<P>TOOLDIAG is a collection of methods for statistical pattern
recognition. The main area of application is classification. The
application area is limited to multidimensional continuous
features, without any missing values. No symbolic features
(attributes) are allowed. The program is implemented in the 'C'
programming language and was tested in several computing
environments.</P>
<P>
<A NAME="XNBC"></A> </P>
<DT><B>XNBC</B><DD><P>
<UL>
<LI>Web site:
<A HREF="http://www.b3e.jussieu.fr/xnbc/">www.b3e.jussieu.fr/xnbc/</A></LI>
</UL>
</P>
<P>XNBC v8 is a software package for simulating biological neural
networks, aimed at neuroscientists who want a user-friendly
simulation tool.</P>
<P>Four neuron models are available: three phenomenological models (xnbc,
leaky integrator and conditional burster) and an ion-conductance based
model. Inputs to the simulated neurons can be provided by experimental
data stored in files, allowing the creation of 'hybrid' networks.</P>
</DL>
</P>

<HR>
<A HREF="AI-Alife-HOWTO-4.html">Next</A>
<A HREF="AI-Alife-HOWTO-2.html">Previous</A>
<A HREF="AI-Alife-HOWTO.html#toc3">Contents</A>
</BODY>
</HTML>