<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 3.2 Final//EN">
<HTML>
<HEAD>
<META NAME="GENERATOR" CONTENT="LinuxDoc-Tools 0.9.21">
<TITLE>GNU/Linux AI & Alife HOWTO: Symbolic Systems (GOFAI)</TITLE>
<LINK HREF="AI-Alife-HOWTO-3.html" REL=next>
<LINK HREF="AI-Alife-HOWTO-1.html" REL=previous>
<LINK HREF="AI-Alife-HOWTO.html#toc2" REL=contents>
</HEAD>
<BODY>
<A HREF="AI-Alife-HOWTO-3.html">Next</A>
<A HREF="AI-Alife-HOWTO-1.html">Previous</A>
<A HREF="AI-Alife-HOWTO.html#toc2">Contents</A>
<HR>
<H2><A NAME="Symbolic Systems (GOFAI)"></A> <A NAME="s2">2.</A> <A HREF="AI-Alife-HOWTO.html#toc2">Symbolic Systems (GOFAI)</A> </H2>
<P>Traditionally, AI was based on the ideas of logic, rule systems,
linguistics, and the concept of rationality. At its roots are programming
languages such as Lisp and Prolog, though newer systems tend to use more
popular procedural languages. Expert systems are the largest successful
example of this paradigm. An expert system consists of a detailed
knowledge base and a complex rule system to utilize it. Such systems have
been used for tasks such as medical diagnosis support and credit
checking.</P>
<H2><A NAME="ss2.1">2.1</A> <A HREF="AI-Alife-HOWTO.html#toc2.1">AI class/code libraries</A>
</H2>
<P>These are libraries of code or classes for use in programming within
the artificial intelligence field. They are not meant as stand-alone
applications, but rather as tools for building your own applications.</P>
<P>
<DL>
<P>
<A NAME="ACL2"></A> </P>
<DT><B>ACL2</B><DD><P>
<UL>
<LI>Web site:
<A HREF="http://www.cliki.net/ACL2">www.cliki.net/ACL2</A></LI>
</UL>
</P>
<P>ACL2 (A Computational Logic for Applicative Common Lisp) is a theorem
prover for industrial applications. It is both a mathematical logic and
a system of tools for constructing proofs in the logic. ACL2 works
with GCL (GNU Common Lisp).</P>
<P>
<A NAME="AI Kernel"></A> </P>
<DT><B>AI Kernel</B><DD><P>
<UL>
<LI>Web site:
<A HREF="http://aikernel.sourceforge.net/">aikernel.sourceforge.net</A></LI>
<LI>Sourceforge site:
<A HREF="http://sourceforge.net/projects/aikernel/">sourceforge.net/projects/aikernel/</A></LI>
</UL>
</P>
<P>The AI Kernel is a reusable artificial intelligence engine that uses
natural language processing and an Activator/Context model to allow
multitasking between installed cells.</P>
<P>
<A NAME="AI Search II"></A> </P>
<DT><B>AI Search II</B><DD><P>
<UL>
<LI>Web site:
<A HREF="http://www.neiu.edu/~kwtracy/ooai-book/">http://www.neiu.edu/~kwtracy/ooai-book/</A></LI>
</UL>
</P>
<P>The library offers the programmer a set of search algorithms that may
be used to solve all kinds of problems. The idea is that, when
developing problem-solving software, the programmer should be able to
concentrate on the representation of the problem to be solved and
should not need to bother with the implementation of the search
algorithm that will actually conduct the search. This idea has been
realized by the implementation of a set of search classes that may be
incorporated into other software through <B>C++</B>'s features of
derivation and inheritance. The following search algorithms have been
implemented:</P>
<P>
<UL>
<LI>depth-first tree and graph search.</LI>
<LI>breadth-first tree and graph search.</LI>
<LI>uniform-cost tree and graph search.</LI>
<LI>best-first search.</LI>
<LI>bidirectional depth-first tree and graph search.</LI>
<LI>bidirectional breadth-first tree and graph search.</LI>
<LI>AND/OR depth tree search.</LI>
<LI>AND/OR breadth tree search.</LI>
</UL>
</P>
<P>This library has a corresponding book, "
<A HREF="http://www.neiu.edu/~kwtracy/ooai-book/">Object-Oriented Artificial Intelligence, Using C++</A>".</P>
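<P>The search strategies listed above differ mainly in the order in which
nodes are expanded. As a rough illustration of the idea (a Python sketch
over a made-up graph, not the library's C++ classes), uniform-cost graph
search expands the cheapest frontier node first:</P>
<PRE>
import heapq

def uniform_cost_search(graph, start, goal):
    """Expand nodes in order of path cost; graph maps node to [(neighbor, cost), ...]."""
    frontier = [(0, start, [start])]   # priority queue ordered by accumulated cost
    explored = set()
    while frontier:
        cost, node, path = heapq.heappop(frontier)
        if node == goal:
            return cost, path
        if node in explored:
            continue
        explored.add(node)
        for neighbor, step in graph.get(node, []):
            if neighbor not in explored:
                heapq.heappush(frontier, (cost + step, neighbor, path + [neighbor]))
    return None

# Hypothetical example graph
graph = {"A": [("B", 1), ("C", 4)], "B": [("C", 1), ("D", 5)], "C": [("D", 1)]}
# uniform_cost_search(graph, "A", "D") finds the cheapest path A-B-C-D at cost 3
</PRE>
<P>Breadth-first search is the special case where every step costs the same;
best-first variants order the queue by a heuristic instead of the path cost.</P>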
<P>
<A NAME="Alchemy"></A> </P>
<DT><B>Alchemy</B><DD><P>
<UL>
<LI>Web site:
<A HREF="http://alchemy.cs.washington.edu/">http://alchemy.cs.washington.edu/</A></LI>
</UL>
</P>
<P>Alchemy is a software package providing a series of algorithms for
statistical relational learning and probabilistic logic inference,
based on the Markov logic representation. Alchemy allows you to easily
develop a wide range of AI applications, including:</P>
<P>
<UL>
<LI>Collective classification</LI>
<LI>Link prediction</LI>
<LI>Entity resolution</LI>
<LI>Social network modeling</LI>
<LI>Information extraction</LI>
</UL>
</P>
<P>
<A NAME="Aleph"></A> </P>
<DT><B>Aleph</B><DD><P>
<UL>
<LI>Web site:
<A HREF="http://www.cs.ox.ac.uk/activities/machlearn/Aleph/">http://www.cs.ox.ac.uk/activities/machlearn/Aleph/</A></LI>
</UL>
</P>
<P>Aleph (A Learning Engine for Proposing Hypotheses) is an Inductive
Logic Programming (ILP) system, intended as a prototype for exploring
ideas. It is implemented in Prolog by Dr Ashwin Srinivasan at the
Oxford University Computing Laboratory and is written specifically for
compilation with the YAP Prolog compiler.</P>
<P>
<A NAME="CBR Microprograms"></A> </P>
<DT><B>Microprograms</B><DD><P>
<UL>
<LI>Web site:
<A HREF="http://www.cs.indiana.edu/~leake/cbr/code/">http://www.cs.indiana.edu/~leake/cbr/code/</A></LI>
</UL>
</P>
<P>A collection of case-based reasoning "micro" versions of dissertation
programs that were developed for pedagogical purposes. These programs
are meant to distill key aspects of the original programs into a form
that can be easily understood, modified, and extended.</P>
<P>
<A NAME="Chess In List"></A> </P>
<DT><B>Chess In Lisp (CIL)</B><DD><P>
<UL>
<LI>Web site: found as part of the CLOCC archive at
<A HREF="http://clocc.sourceforge.net/">clocc.sourceforge.net</A></LI>
</UL>
</P>
<P>The CIL (Chess In Lisp) foundation is a Common Lisp
implementation of all the core functions needed for development
of chess applications. The main purpose of the CIL project is
to get AI researchers interested in using Lisp to work in the
chess domain.</P>
<P>
<A NAME="clasp"></A> </P>
<DT><B>clasp</B><DD><P>
<UL>
<LI>Web site:
<A HREF="http://www.cs.uni-potsdam.de/clasp/">http://www.cs.uni-potsdam.de/clasp/</A></LI>
</UL>
</P>
<P>clasp is an answer set solver for (extended) normal logic programs. It
combines the high-level modeling capabilities of answer set programming
(ASP) with state-of-the-art techniques from the area of Boolean
constraint solving. The primary clasp algorithm relies on
conflict-driven nogood learning, a technique that has proved very
successful for satisfiability checking (SAT). Unlike other learning ASP
solvers, clasp does not rely on legacy software, such as a SAT solver
or any other existing ASP solver. Rather, clasp has been genuinely
developed for answer set solving based on conflict-driven nogood
learning. clasp can be applied as an ASP solver (on LPARSE output
format), as a SAT solver (on simplified DIMACS/CNF format), or as a PB
solver (on OPB format).</P>
<P>
<A NAME="ConceptNet"></A> </P>
<DT><B>ConceptNet</B><DD><P>
<UL>
<LI>Web site:
<A HREF="http://conceptnet.media.mit.edu/">http://conceptnet.media.mit.edu/</A></LI>
<LI>Old Web site:
<A HREF="http://web.media.mit.edu/~hugo/conceptnet/">http://web.media.mit.edu/~hugo/conceptnet/</A></LI>
</UL>
</P>
<P>ConceptNet aims to give computers access to common-sense knowledge, the
kind of information that ordinary people know but usually leave
unstated. The data in ConceptNet was collected from ordinary people who
contributed it over the Web. ConceptNet represents this data in the
form of a semantic network, and makes it available to be used in
natural language processing and intelligent user interfaces.</P>
<P>This API provides Python code with access to both ConceptNet 3 and the
development database that will become ConceptNet 4, and the natural
language tools necessary to work with it. It uses Django for
interacting with the database.</P>
<P>
<A NAME="ERESYE"></A> </P>
<DT><B>ERESYE</B><DD><P>
<UL>
<LI>Web site:
<A HREF="http://sourceforge.net/projects/eresye/">http://sourceforge.net/projects/eresye/</A></LI>
<LI>Tutorial:
<A HREF="http://www.trapexit.org/Artificial_Intelligence_with_Erlang:_the_Domain_of_Relatives">http://www.trapexit.org/Artificial_Intelligence_with_Erlang:_the_Domain_of_Relatives</A></LI>
</UL>
</P>
<P>ERESYE means ERlang Expert SYstem Engine. It is a library for writing
expert systems and rule processing engines in the Erlang programming
language. It allows the creation of multiple engines, each with its own
facts and rules to be processed.</P>
<P>
<A NAME="FFLL"></A> </P>
<DT><B>FFLL</B><DD><P>
<UL>
<LI>Web site:
<A HREF="http://ffll.sourceforge.net/">ffll.sourceforge.net</A></LI>
</UL>
</P>
<P>The Free Fuzzy Logic Library (FFLL) is an open source fuzzy logic class
library and API that is optimized for speed-critical applications, such
as video games. FFLL is able to load files that adhere to the IEC
61131-7 standard.</P>
<P>
<A NAME="FLiP"></A> </P>
<DT><B>FLiP</B><DD><P>
<UL>
<LI>Web site:
<A HREF="http://staff.washington.edu/jon/flip/www/">http://staff.washington.edu/jon/flip/www/</A></LI>
</UL>
</P>
<P>Flip is a logical framework written in Python. A logical framework is a
library for defining logics and writing applications such as theorem
provers. The checker can use different logics; Flip comes with several.
You can add another logic, or add axioms and derived rules, by writing
a module in Python. Python is both the object language and the
metalanguage. Formulas, inference rules, and entire proofs are Python
expressions. Prover commands are Python functions.</P>
<P>
<A NAME="Fuzzy sets for Ada"></A> </P>
<DT><B>Fuzzy sets for Ada</B><DD><P>
<UL>
<LI>Web site:
<A HREF="http://www.dmitry-kazakov.de/ada/fuzzy.htm">http://www.dmitry-kazakov.de/ada/fuzzy.htm</A></LI>
<LI>Freshmeat:
<A HREF="http://freshmeat.net/projects/fuzzy/">http://freshmeat.net/projects/fuzzy/</A></LI>
</UL>
</P>
<P>Fuzzy sets for Ada is a library providing implementations of:
<UL>
<LI>confidence factors with the operations not, and, or, xor, +, and *;</LI>
<LI>classical fuzzy sets with the set-theoretic operations and the
operations of possibility theory;</LI>
<LI>intuitionistic fuzzy sets with the operations on them;</LI>
<LI>fuzzy logic based on the intuitionistic fuzzy sets and possibility
theory;</LI>
<LI>fuzzy numbers, both integer and floating-point, with conventional
arithmetic operations;</LI>
<LI>linguistic variables and sets of linguistic variables with
operations on them.</LI>
</UL>
String-oriented I/O is supported.</P>
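<P>To give a flavor of the confidence-factor operations such libraries
provide, here is a minimal Python sketch (not the Ada API; the min/max
semantics shown is one common choice, sometimes called the
Goedel/Zadeh operators):</P>
<PRE>
def fuzzy_not(a):
    return 1.0 - a

def fuzzy_and(a, b):
    # Zadeh t-norm: conjunction as the minimum of the two confidences
    return min(a, b)

def fuzzy_or(a, b):
    # dual t-conorm: disjunction as the maximum
    return max(a, b)

def fuzzy_xor(a, b):
    # one common definition: (a or b) and not (a and b)
    return fuzzy_and(fuzzy_or(a, b), fuzzy_not(fuzzy_and(a, b)))

# Confidence that a patient has a fever (0.8) and a cough (0.3):
both = fuzzy_and(0.8, 0.3)    # 0.3
either = fuzzy_or(0.8, 0.3)   # 0.8
</PRE>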
<P>
<A NAME="HTK"></A> </P>
<DT><B>HTK</B><DD><P>
<UL>
<LI>Web site:
<A HREF="http://htk.eng.cam.ac.uk/">htk.eng.cam.ac.uk</A></LI>
</UL>
</P>
<P>The Hidden Markov Model Toolkit (HTK) is a portable toolkit for
building and manipulating hidden Markov models. HTK consists of a set
of library modules and tools available in C source form. The tools
provide sophisticated facilities for speech analysis, HMM training,
testing and results analysis. The software supports HMMs using both
continuous density mixture Gaussians and discrete distributions and can
be used to build complex HMM systems. The HTK release contains
extensive documentation and examples.</P>
<P>
<A NAME="JCK"></A> </P>
<DT><B>JCK</B><DD><P>
<UL>
<LI>Web site:
<A HREF="http://www.pms.informatik.uni-muenchen.de/software/jack/">www.pms.informatik.uni-muenchen.de/software/jack/</A></LI>
</UL>
</P>
<P>JCK is a library providing constraint programming and search for
Java. It consists of three components:
<UL>
<LI>JCHR: Java Constraint Handling Rules, a high-level language to
write constraint solvers.</LI>
<LI>JASE: Java Abstract Search Engine, a generic search engine for
JCHR to solve constraint problems.</LI>
<LI>VisualCHR: an interactive tool to visualize JCHR computations.</LI>
</UL>
Source and documentation are available from the link above.</P>
<P>
<A NAME="KANREN"></A> </P>
<DT><B>KANREN</B><DD><P>
<UL>
<LI>Web site:
<A HREF="http://kanren.sourceforge.net/">kanren.sourceforge.net</A></LI>
</UL>
</P>
<P>KANREN is a declarative logic programming system with first-class
relations, embedded in a pure functional subset of Scheme. The system
has a set-theoretical semantics, true unions, fair scheduling,
first-class relations, lexically-scoped logical variables, depth-first
and iterative deepening strategies. The system achieves high
performance and expressivity without cuts.</P>
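<P>Unification, that is, binding logic variables so that two terms become
equal, is the core operation of any such logic programming system. A
minimal Python sketch (with a made-up term representation where
variables are strings starting with "?", not KANREN's Scheme API):</P>
<PRE>
def walk(term, subst):
    """Follow variable bindings to the term's current value."""
    while isinstance(term, str) and term.startswith("?") and term in subst:
        term = subst[term]
    return term

def unify(a, b, subst):
    """Return an extended substitution making a and b equal, or None on failure."""
    a, b = walk(a, subst), walk(b, subst)
    if a == b:
        return subst
    if isinstance(a, str) and a.startswith("?"):
        return {**subst, a: b}
    if isinstance(b, str) and b.startswith("?"):
        return {**subst, b: a}
    if isinstance(a, tuple) and isinstance(b, tuple) and len(a) == len(b):
        for x, y in zip(a, b):
            subst = unify(x, y, subst)
            if subst is None:
                return None
        return subst
    return None

s = unify(("father", "?x", "bob"), ("father", "alice", "?y"), {})
# s == {"?x": "alice", "?y": "bob"}
</PRE>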
<P>
<A NAME="LK"></A> </P>
<DT><B>LK</B><DD><P>
<UL>
<LI>Web site:
<A HREF="http://www.cs.utoronto.ca/~neto/research/lk/">www.cs.utoronto.ca/~neto/research/lk/</A></LI>
</UL>
</P>
<P>LK is an implementation of the Lin-Kernighan heuristic for the
Traveling Salesman Problem and for the minimum-weight perfect matching
problem. It is tuned for 2-d geometric instances, and has been applied
to certain instances with up to a million cities. Also included are
instance generators and Perl scripts for munging TSPLIB instances.</P>
<P>This implementation introduces "efficient cluster compensation", an
experimental algorithmic technique intended to make the Lin-Kernighan
heuristic more robust in the face of clustered data.</P>
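<P>Lin-Kernighan generalizes simpler edge-exchange moves such as 2-opt,
which repeatedly reverses a segment of the tour whenever that shortens
it. A Python sketch of 2-opt on a toy instance (hypothetical
coordinates, not this library's code):</P>
<PRE>
import math
import itertools

def tour_length(points, tour):
    """Total length of the closed tour given as indices into points."""
    total = 0.0
    for i in range(len(tour)):
        x1, y1 = points[tour[i]]
        x2, y2 = points[tour[(i + 1) % len(tour)]]
        total += math.hypot(x2 - x1, y2 - y1)
    return total

def two_opt(points, tour):
    """Reverse tour segments while doing so shortens the tour."""
    best = list(tour)
    improved = True
    while improved:
        improved = False
        for i, j in itertools.combinations(range(len(best)), 2):
            candidate = best[:i] + best[i:j + 1][::-1] + best[j + 1:]
            if tour_length(points, best) > tour_length(points, candidate) + 1e-12:
                best = candidate
                improved = True
    return best

# Four corners of a unit square, deliberately visited in a crossing order;
# 2-opt uncrosses the tour down to the perimeter of length 4.
points = [(0, 0), (1, 1), (1, 0), (0, 1)]
tour = two_opt(points, [0, 1, 2, 3])
</PRE>
<P>Lin-Kernighan's contribution is to chain many such exchanges into deeper,
variable-depth moves, which is what makes it competitive on large
instances.</P>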
<P>
<A NAME="LingPipe"></A> </P>
<DT><B>LingPipe</B><DD><P>
<UL>
<LI>Web site:
<A HREF="http://alias-i.com/lingpipe/">http://alias-i.com/lingpipe/</A></LI>
</UL>
</P>
<P>LingPipe is a state-of-the-art suite of natural language processing
tools written in Java that performs tokenization, sentence detection,
named entity detection, coreference resolution, classification,
clustering, part-of-speech tagging, general chunking, and fuzzy
dictionary matching.</P>
<P>
<A NAME="Logfun"></A> </P>
<DT><B>Logfun</B><DD><P>
<UL>
<LI>Web site:
<A HREF="http://www.irisa.fr/lande/ferre/logfun/">http://www.irisa.fr/lande/ferre/logfun/</A></LI>
</UL>
</P>
<P>Logfun is a library of logic functors. A logic functor is a
function that can be applied to zero, one or several logics so as
to produce a new logic as a combination of argument logics. Each
argument logic can itself be built by combination of logic
functors. The signature of a logic is made of a parser and a
printer of formulas, logical operations such as a theorem prover
for entailment between formulas, and more specific operations
required by Logical Information Systems (LIS). Logic functors can
be concrete domains like integers, strings, or algebraic
combinators like product or sum of logics.</P>
<P>Logic functors are coded as Objective Caml modules. A logic
semantics is associated with each of these logic functors. This
makes it possible to define properties of logics, such as the
consistency and completeness of the entailment prover, and to
prove under which conditions a generated entailment prover
satisfies these properties, given the properties of the argument
logics.</P>
<P>
<A NAME="Loom"></A> </P>
<DT><B>Loom</B><DD><P>
<UL>
<LI>Web site:
<A HREF="http://www.isi.edu/isd/LOOM/">http://www.isi.edu/isd/LOOM/</A></LI>
</UL>
</P>
<P>Note: Loom has been succeeded by
<A HREF="#PowerLoom">PowerLoom</A>.</P>
<P>Loom is a language and environment for constructing intelligent
applications. The heart of Loom is a knowledge representation system
that is used to provide deductive support for the declarative portion
of the Loom language. Declarative knowledge in Loom consists of
definitions, rules, facts, and default rules. A deductive engine called
a classifier utilizes forward-chaining, semantic unification and
object-oriented truth maintenance technologies in order to compile the
declarative knowledge into a network designed to efficiently support
on-line deductive query processing.</P>
<P>The Loom system implements a logic-based pattern matcher that drives a
production rule facility and a pattern-directed method dispatching
facility that supports the definition of object-oriented methods. The
high degree of integration between Loom's declarative and procedural
components permits programmers to utilize logic programming, production
rule, and object-oriented programming paradigms in a single
application. Loom can also be used as a deductive layer that overlays
an ordinary CLOS network. In this mode, users can obtain many of the
benefits of using Loom without impacting the function or performance of
their CLOS-based applications.</P>
<P>
<A NAME="maxent"></A> </P>
<DT><B>maxent</B><DD><P>
<UL>
<LI>Python/C++ version:
<A HREF="http://homepages.inf.ed.ac.uk/lzhang10/maxent_toolkit.html">http://homepages.inf.ed.ac.uk/lzhang10/maxent_toolkit.html</A></LI>
<LI>Java version:
<A HREF="http://maxent.sourceforge.net/">maxent.sourceforge.net</A></LI>
</UL>
</P>
<P>The Maximum Entropy Toolkit provides a set of tools and a library for
constructing maximum entropy (maxent) models in either Python or C++.
The maximum entropy model is a general-purpose machine learning
framework that has proved to be highly expressive and powerful in
statistical natural language processing, statistical physics, computer
vision and many other fields.</P>
<P>It features conditional maximum entropy models, L-BFGS and GIS
parameter estimation, Gaussian Prior smoothing, a C++ API, a Python
extension module, a command line utility, and good documentation. A
Java version is also available.</P>
<P>
<A NAME="Nyquist"></A> </P>
<DT><B>Nyquist</B><DD><P>
<UL>
<LI>Web site:
<A HREF="http://www-2.cs.cmu.edu/~music/nyquist/">www-2.cs.cmu.edu/~music/nyquist/</A></LI>
</UL>
</P>
<P>The Computer Music Project at CMU is developing computer music
and interactive performance technology to enhance human musical
experience and creativity. This interdisciplinary effort draws
on Music Theory, Cognitive Science, Artificial Intelligence and
Machine Learning, Human Computer Interaction, Real-Time Systems,
Computer Graphics and Animation, Multimedia, Programming
Languages, and Signal Processing. A paradigmatic example of
these interdisciplinary efforts is the creation of interactive
performances that couple human musical improvisation with
intelligent computer agents in real-time.</P>
<P>
<A NAME="OpenCyc"></A> </P>
<DT><B>OpenCyc</B><DD><P>
<UL>
<LI>Web site:
<A HREF="http://www.opencyc.org/">http://www.opencyc.org/</A></LI>
<LI>Alt Web site:
<A HREF="http://sourceforge.net/projects/opencyc/">http://sourceforge.net/projects/opencyc/</A></LI>
</UL>
</P>
<P>OpenCyc is the open source version of Cyc, the largest and most
complete general knowledge base and commonsense reasoning engine. It
includes an ontology based on 6,000 concepts and 60,000 assertions
about them.</P>
<P>
<A NAME="Pattern"></A> </P>
<DT><B>Pattern</B><DD><P>
<UL>
<LI>Web site:
<A HREF="http://www.clips.ua.ac.be/pages/pattern">http://www.clips.ua.ac.be/pages/pattern</A></LI>
</UL>
</P>
<P>Pattern is a web mining module for the Python programming language. It
bundles tools for data retrieval (Google + Twitter + Wikipedia API, web
spider, HTML DOM parser), text analysis (rule-based shallow parser,
WordNet interface, syntactic + semantic n-gram search algorithm,
tf-idf + cosine similarity + LSA metrics) and data visualization (graph
networks).</P>
<P>
<A NAME="PowerLoom"></A> </P>
<DT><B>PowerLoom</B><DD><P>
<UL>
<LI>Web site:
<A HREF="http://www.isi.edu/isd/LOOM/PowerLoom/">http://www.isi.edu/isd/LOOM/PowerLoom/</A></LI>
</UL>
</P>
<P>PowerLoom is the successor to the
<A HREF="#Loom">Loom</A>
knowledge representation system. It provides a language and environment
for constructing intelligent, knowledge-based applications. PowerLoom
uses a fully expressive, logic-based representation language (a variant
of KIF). It uses a natural deduction inference engine that combines
forward and backward chaining to derive what logically follows from the
facts and rules asserted in the knowledge base. While PowerLoom is not
a description logic, it does have a description classifier which uses
technology derived from the Loom classifier to classify descriptions
expressed in full first order predicate calculus (see paper). PowerLoom
uses modules as a structuring device for knowledge bases, and
ultra-lightweight worlds to support hypothetical reasoning.</P>
<P>To implement PowerLoom we developed a new programming language called
STELLA, which is a Strongly Typed, Lisp-like LAnguage that can be
translated into Lisp, C++ and Java. PowerLoom is written in STELLA and
therefore available in Common Lisp, C++ and Java versions.</P>
<P>
<A NAME="PyCLIPS"></A> </P>
<DT><B>PyCLIPS</B><DD><P>
<UL>
<LI>Web site:
<A HREF="http://pyclips.sourceforge.net/web/">http://pyclips.sourceforge.net/web/</A></LI>
</UL>
</P>
<P>PyCLIPS is an extension module for the Python language that embeds full
CLIPS functionality in Python applications. This means that you can
provide Python with a strong, reliable, widely used and well documented
inference engine.</P>
<P>
<A NAME="Pyke"></A> </P>
<DT><B>Pyke</B><DD><P>
<UL>
<LI>Web site:
<A HREF="http://pyke.sourceforge.net/">http://pyke.sourceforge.net/</A></LI>
</UL>
</P>
<P>Pyke is a knowledge-based inference engine (expert system) written in
100% Python that can:</P>
<P>
<UL>
<LI> Do both forward-chaining (data driven) and backward-chaining
(goal directed) inferencing.
<UL>
<LI> Pyke may be embedded into any Python program.</LI>
</UL>
</LI>
<LI> Automatically generate Python programs by assembling
individual Python functions into complete call graphs.
<UL>
<LI> This is done through a unique design where the individual
Python functions are attached to backward-chaining rules.
</LI>
<LI> Unlike other approaches to code reuse (e.g. Zope adapters
and generic functions), this allows the inference engine to ensure
that all of a function's requirements are completely satisfied,
by examining the entire call graph down to the leaves, before any
of the functions are executed.
</LI>
<LI> This is an optional feature. You don't need to use it if you
just want the inferencing capability by itself.
</LI>
</UL>
</LI>
</UL>
</P>
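<P>Forward chaining of the kind Pyke performs can be illustrated with a
tiny generic rule engine (a Python sketch with made-up facts and rules,
not Pyke's actual rule syntax or API):</P>
<PRE>
def forward_chain(facts, rules):
    """Apply rules (premises, conclusion) until no new facts are derived."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            # fire the rule when all premises hold and it adds something new
            if set(premises).issubset(facts) and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

# Hypothetical knowledge base
rules = [
    (["parent", "male"], "father"),
    (["father"], "ancestor"),
]
derived = forward_chain(["parent", "male"], rules)
# derived now also contains "father" and "ancestor"
</PRE>
<P>Backward chaining runs the same rules in the other direction: it starts
from a goal and searches for premises that would establish it.</P>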
<P>
<A NAME="python-dlp"></A> </P>
<DT><B>python-dlp</B><DD><P>
<UL>
<LI>Web site:
<A HREF="http://code.google.com/p/python-dlp/">http://code.google.com/p/python-dlp/</A></LI>
</UL>
</P>
<P>python-dlp aims to be a contemporary expert system based on
Semantic Web technologies. Traditionally, expert systems are an
application of computing and artificial intelligence with the aim
of supporting software that attempts to reproduce the deterministic
behavior of one or more human experts in a specific problem domain.
python-dlp utilizes the efficient RETE_UL algorithm as the 'engine' for
the expert system.</P>
<P>
<A NAME="Reverend"></A> </P>
<DT><B>Reverend</B><DD><P>
<UL>
<LI>Web site:
<A HREF="http://sourceforge.net/projects/reverend/">http://sourceforge.net/projects/reverend/</A></LI>
</UL>
</P>
<P>Reverend is a general purpose Bayesian classifier written in Python. It
is designed to be easily extended to any application domain.</P>
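<P>The idea behind such a classifier fits in a few lines of Python. The
sketch below is a toy multinomial naive Bayes with made-up training
data, not Reverend's API:</P>
<PRE>
import math
from collections import Counter, defaultdict

class NaiveBayes:
    """Toy multinomial naive Bayes with add-one (Laplace) smoothing."""

    def __init__(self):
        self.word_counts = defaultdict(Counter)
        self.class_counts = Counter()
        self.vocab = set()

    def train(self, label, words):
        self.class_counts[label] += 1
        self.word_counts[label].update(words)
        self.vocab.update(words)

    def classify(self, words):
        def log_score(label):
            total = sum(self.word_counts[label].values())
            score = math.log(self.class_counts[label])
            for w in words:
                # add-one smoothing keeps unseen words from zeroing the score
                score += math.log((self.word_counts[label][w] + 1) /
                                  (total + len(self.vocab)))
            return score
        return max(self.class_counts, key=log_score)

nb = NaiveBayes()
nb.train("spam", ["cheap", "pills", "now"])
nb.train("ham", ["meeting", "agenda", "notes"])
# nb.classify(["cheap", "pills"]) favors "spam"
</PRE>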
<P>
<A NAME="Screamer"></A> </P>
<DT><B>Screamer</B><DD><P>
<UL>
<LI>Latest version is part of CLOCC:
<A HREF="http://clocc.sourceforge.net/">clocc.sourceforge.net</A></LI>
</UL>
</P>
<P>Screamer is an extension of Common Lisp that adds support for
nondeterministic programming. Screamer consists of two
levels. The basic nondeterministic level adds support for
backtracking and undoable side effects. On top of this
nondeterministic substrate, Screamer provides a comprehensive
constraint programming language in which one can formulate and
solve mixed systems of numeric and symbolic
constraints. Together, these two levels augment Common Lisp with
practically all of the functionality of both Prolog and
constraint logic programming languages such as CHiP and CLP(R).
Furthermore, Screamer is fully integrated with Common
Lisp. Screamer programs can coexist and interoperate with other
extensions to Common Lisp such as CLOS, CLIM and Iterate.</P>
<P>
<A NAME="SimpleAI"></A> </P>
<DT><B>SimpleAI</B><DD><P>
<UL>
<LI>Web site:
<A HREF="https://github.com/simpleai-team/simpleai">https://github.com/simpleai-team/simpleai</A></LI>
</UL>
</P>
<P>SimpleAI is a Python library that implements many of the artificial
intelligence algorithms described in the book "Artificial Intelligence,
A Modern Approach", by Stuart Russell and Peter Norvig. The emphasis is
on creating a stable, modern, and maintainable version: most of the
library is covered by tests, it is installable via pip, it has a
standard repository and library architecture, it is well documented, it
respects the Python PEP 8 guidelines, and it provides only working code
(no placeholders for future features). Even the internal code is
written with readability in mind, not only the external API.</P>
<P>There is also
<A HREF="https://code.google.com/p/aima-python/">https://code.google.com/p/aima-python/</A>
which implements these algorithms as well, though it hasn't seen
activity in a while.</P>
<P>
<A NAME="SPASS"></A> </P>
<DT><B>SPASS</B><DD><P>
<UL>
<LI>Web site:
<A HREF="http://www.spass-prover.org/">http://www.spass-prover.org/</A></LI>
</UL>
</P>
<P>SPASS: An Automated Theorem Prover for First-Order Logic with Equality</P>
<P>If you are interested in first-order logic theorem proving, the formal
analysis of software, systems, or protocols, formal approaches to AI
planning, decision procedures, or modal logic theorem proving, SPASS
may offer you the right functionality.</P>
<P>
<A NAME="Torch"></A> </P>
<DT><B>Torch</B><DD><P>
<UL>
<LI>Web site:
<A HREF="http://www.torch.ch/">www.torch.ch</A></LI>
<LI>Successor:
<A HREF="AI-Alife-HOWTO-7.html#Torch5">Torch5</A>
</LI>
</UL>
</P>
<P>Torch is a machine-learning library, written in C++. Its aim is to
provide state-of-the-art implementations of the best algorithms, and it
is under continual development.</P>
<P>
<UL>
<LI>Many gradient-based methods, including multi-layered
perceptrons, radial basis functions, and mixtures of experts. Many
small "modules" (Linear module, Tanh module, SoftMax module, ...)
can be plugged together.
</LI>
<LI>Support Vector Machines, for classification and regression.
</LI>
<LI>A distribution package, which includes K-means, Gaussian Mixture
Models, Hidden Markov Models, and a Bayes Classifier, plus classes for
speech recognition with embedded training.
</LI>
<LI>Ensemble models such as Bagging and Adaboost.
</LI>
<LI>Non-parametric models such as K-nearest-neighbors, Parzen
Regression and Parzen Density Estimator.
</LI>
</UL>
</P>
<P>Torch is an open library whose authors encourage everybody to develop
new packages, to be included in future versions on the official
website.</P>
</DL>
</P>
<H2><A NAME="ss2.2">2.2</A> <A HREF="AI-Alife-HOWTO.html#toc2.2">AI software kits, applications, etc.</A>
</H2>
<P>These are various applications, software kits, etc. meant for research
in the field of artificial intelligence. Their ease of use will vary,
as they were designed to meet a particular research interest rather
than to serve as easy-to-use commercial packages.</P>
<P>
<DL>
<P>
<A NAME="ASA"></A> </P>
<DT><B>ASA - Adaptive Simulated Annealing</B><DD><P>
<UL>
<LI>Web site:
<A HREF="http://www.ingber.com/#ASA-CODE">http://www.ingber.com/#ASA-CODE</A></LI>
</UL>
</P>
<P>ASA (Adaptive Simulated Annealing) is a powerful global
optimization C-code algorithm especially useful for nonlinear and/or
stochastic systems.</P>
<P>ASA is developed to statistically find the best global fit of a
nonlinear non-convex cost-function over a D-dimensional space. The
algorithm permits an annealing schedule for 'temperature' T decreasing
exponentially in annealing-time k, T = T_0 exp(-c k^(1/D)).
The introduction of re-annealing also permits adaptation to changing
sensitivities in the multi-dimensional parameter-space. This annealing
schedule is faster than fast Cauchy annealing, where T = T_0/k,
and much faster than Boltzmann annealing, where T = T_0/ln k.</P>
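<P>The three schedules are easy to compare numerically. A Python sketch
with arbitrary T_0, c, and D (the formulas are exactly those quoted
above):</P>
<PRE>
import math

def asa_schedule(k, t0=1.0, c=1.0, dim=2):
    # ASA: T = T_0 exp(-c k^(1/D))
    return t0 * math.exp(-c * k ** (1.0 / dim))

def cauchy_schedule(k, t0=1.0):
    # fast Cauchy annealing: T = T_0 / k
    return t0 / k

def boltzmann_schedule(k, t0=1.0):
    # Boltzmann annealing: T = T_0 / ln k
    return t0 / math.log(k)

# After 10,000 annealing steps the exponential schedule is far colder:
k = 10_000
temps = {name: f(k) for name, f in [("asa", asa_schedule),
                                    ("cauchy", cauchy_schedule),
                                    ("boltzmann", boltzmann_schedule)]}
# asa ~ exp(-100), cauchy = 1e-4, boltzmann ~ 0.11
</PRE>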
<P>
<A NAME="Babylon"></A> </P>
<DT><B>Babylon</B><DD><P>
<UL>
<LI>Archive:
<A HREF="http://www.cs.cmu.edu/afs/cs/project/ai-repository/ai/areas/expert/systems/babylon/">http://www.cs.cmu.edu/afs/cs/project/ai-repository/ai/areas/expert/systems/babylon/</A></LI>
</UL>
</P>
<P>BABYLON is a modular, configurable, hybrid environment for
developing expert systems. Its features include objects, rules with
forward and backward chaining, logic (Prolog) and constraints. BABYLON
is implemented and embedded in Common Lisp.</P>
<P>
<A NAME="cfengine"></A> </P>
<DT><B>cfengine</B><DD><P>
<UL>
<LI>Web site:
<A HREF="http://www.iu.hio.no/cfengine/">www.iu.hio.no/cfengine/</A></LI>
</UL>
</P>
<P>Cfengine, or the configuration engine, is a very high-level language
for building expert systems which administrate and configure large
computer networks. Cfengine uses the idea of classes and a primitive
form of intelligence to define and automate the configuration of large
systems in the most economical way possible. Cfengine is designed to be
part of a computer immune system.</P>
<P>
|
|
<A NAME="CLIPS"></A> </P>
|
|
<DT><B>CLIPS</B><DD><P>
|
|
<UL>
|
|
<LI>Web site:
|
|
<A HREF="http://clipsrules.sourceforge.net/">http://clipsrules.sourceforge.net/</A></LI>
|
|
</UL>
|
|
</P>
|
|
<P>CLIPS is a productive development and delivery expert system tool
|
|
which provides a complete environment for the construction of rule
|
|
and/or object based expert systems.</P>
|
|
<P>CLIPS provides a cohesive tool for handling a wide variety of
|
|
knowledge with support for three different programming paradigms:
|
|
rule-based, object-oriented and procedural. Rule-based programming
|
|
allows knowledge to be represented as heuristics, or "rules of thumb,"
|
|
which specify a set of actions to be performed for a given
|
|
situation. Object-oriented programming allows complex systems to be
|
|
modeled as modular components (which can be easily reused to model
|
|
other systems or to create new components). The procedural
|
|
programming capabilities provided by CLIPS are similar to capabilities
|
|
found in languages such as C, Pascal, Ada, and LISP.</P>
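<P>The rule-based paradigm described above can be sketched as a toy forward-chaining engine: a rule fires when all of its condition facts are present in working memory, adding its conclusion. This is an illustration of the general technique only, not CLIPS syntax or its Rete implementation:</P>

```python
# Toy forward-chaining engine: rules are (conditions, conclusion) pairs;
# a rule fires when every condition is already a known fact.
def forward_chain(facts, rules):
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for conditions, conclusion in rules:
            if conclusion not in facts and set(conditions) <= facts:
                facts.add(conclusion)  # fire the rule
                changed = True
    return facts

# Hypothetical knowledge base for illustration.
rules = [
    (("animal", "has-feathers"), "bird"),
    (("bird", "flies-fast"), "falcon"),
]
derived = forward_chain({"animal", "has-feathers", "flies-fast"}, rules)
assert "falcon" in derived
```

Chaining happens automatically: deriving "bird" on the first pass enables the "falcon" rule on the next.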
<P>
<A NAME="EMA-XPS"></A> </P>
<DT><B>EMA-XPS - A Hybrid Graphic Expert System Shell</B><DD><P>
<UL>
<LI>Web site:
<A HREF="http://ema-xps.org/">http://ema-xps.org/</A></LI>
</UL>
</P>
<P>EMA-XPS is a hybrid graphic expert system shell based on the
ASCII-oriented shell Babylon 2.3 of the German National Research
Center for Computer Sciences (GMD). In addition to Babylon's AI power
(object-oriented data representation; forward- and backward-chained
rules, collectible into sets; Horn clauses; and constraint networks)
a graphic interface based on the X11 Window System and the OSF/Motif
widget library has been provided.</P>
<P>
<A NAME="Eprover"></A> </P>
<DT><B>Eprover</B><DD><P>
<UL>
<LI>Web site:
<A HREF="http://www.eprover.org/">http://www.eprover.org/</A></LI>
<LI>Web site:
<A HREF="http://www4.informatik.tu-muenchen.de/~schulz/WORK/eprover.html">http://www4.informatik.tu-muenchen.de/~schulz/WORK/eprover.html</A></LI>
</UL>
</P>
<P>E is a purely equational theorem prover.
The core proof procedure operates on formulas in clause normal form,
using a calculus that combines superposition (with selection of negative
literals) and rewriting. No special rules for non-equational literals
have been implemented, i.e., resolution is simulated via paramodulation
and equality resolution. The basic calculus is extended with rules for AC
redundancy elimination, some contextual simplification, and
pseudo-splitting. The latest version of E also supports simultaneous
paramodulation, either for all inferences or for selected inferences.</P>
<P>E is based on the DISCOUNT-loop variant of the given-clause algorithm,
i.e. a strict separation of active and passive facts. Proof search in E
is primarily controlled by a literal selection strategy, a clause
evaluation heuristic, and a simplification ordering. The prover supports
a large number of preprogrammed literal selection strategies, many of
which are only experimental. Clause evaluation heuristics can be
constructed on the fly by combining various parameterized primitive
evaluation functions, or can be selected from a set of predefined
heuristics. Supported term orderings are several parameterized instances
of the Knuth-Bendix Ordering (KBO) and the Lexicographic Path Ordering (LPO).</P>
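<P>The given-clause algorithm with its active/passive separation can be sketched as a small skeleton: the best passive clause (by the evaluation heuristic) is repeatedly selected, checked, and its consequences against the active set are added back to the passive queue. This is the general loop shape only; E's real inference, simplification, and selection machinery is far richer:</P>

```python
import heapq

def given_clause_loop(axioms, infer, evaluate, is_empty_clause, max_steps=1000):
    """Skeleton of the given-clause algorithm: move the best passive
    clause to the active set and derive its consequences."""
    active = []
    # Passive set kept as a priority queue ordered by the heuristic.
    passive = [(evaluate(c), i, c) for i, c in enumerate(axioms)]
    heapq.heapify(passive)
    counter = len(axioms)
    for _ in range(max_steps):
        if not passive:
            return None  # saturated without finding a proof
        _, _, given = heapq.heappop(passive)
        if is_empty_clause(given):
            return given  # contradiction derived: proof found
        for new in infer(given, active):
            heapq.heappush(passive, (evaluate(new), counter, new))
            counter += 1
        active.append(given)
    return None

# Toy instantiation: "clauses" are integers, each inference step
# derives n - 1, and the empty clause is 0.
proof = given_clause_loop(
    [3],
    infer=lambda given, active: [given - 1] if given > 0 else [],
    evaluate=lambda c: c,
    is_empty_clause=lambda c: c == 0,
)
assert proof == 0
```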
<P>
<A NAME="Fool-Fox"></A> </P>
<DT><B>FOOL & FOX</B><DD><P>
<UL>
<LI>Web site:
<A HREF="http://rhaug.de/fool/">rhaug.de/fool/</A></LI>
<LI>FTP site:
<A HREF="ftp://ftp.informatik.uni-oldenburg.de/pub/fool/">ftp.informatik.uni-oldenburg.de/pub/fool/</A></LI>
</UL>
</P>
<P>FOOL stands for the Fuzzy Organizer OLdenburg. It is the result of
a project at the University of Oldenburg. FOOL is a graphical user
interface for developing fuzzy rulebases. FOOL helps you create
and maintain a database that specifies the behavior of a
fuzzy controller or similar system.</P>
<P>FOX is a small but powerful fuzzy engine which reads this database,
reads some input values, and calculates the new control value.</P>
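<P>The cycle a fuzzy engine like FOX performs — fuzzify the inputs, evaluate the rules, defuzzify to a crisp output — can be sketched in a few lines. The membership functions and rules below are invented for illustration and have nothing to do with FOOL's actual database format:</P>

```python
# Minimal fuzzy-control cycle: fuzzify, apply rules, defuzzify.
def triangle(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def control(error):
    # Fuzzification: degree of membership in each input set.
    negative = triangle(error, -2.0, -1.0, 0.0)
    zero     = triangle(error, -1.0,  0.0, 1.0)
    positive = triangle(error,  0.0,  1.0, 2.0)
    # Rules map each input set to a crisp output level (Sugeno-style).
    rules = [(negative, -1.0), (zero, 0.0), (positive, 1.0)]
    # Defuzzification: weighted average of the rule outputs.
    total = sum(w for w, _ in rules)
    return sum(w * out for w, out in rules) / total if total else 0.0

assert control(0.0) == 0.0
assert control(0.5) > 0.0
```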
<P>
<A NAME="FreeHAL"></A> </P>
<DT><B>FreeHAL</B><DD><P>
<UL>
<LI>Web site:
<A HREF="https://freehal.net">https://freehal.net</A></LI>
</UL>
</P>
<P>FreeHAL is a self-learning conversation simulator which uses semantic
nets to organize its knowledge.</P>
<P>FreeHAL uses a semantic network, pattern matching, stemmers,
part-of-speech databases, part-of-speech taggers, and Hidden Markov Models.
Both the online and the download version support text-to-speech (TTS).</P>
<P>
<A NAME="FUF-SURGE"></A> </P>
<DT><B>FUF and SURGE</B><DD><P>
<UL>
<LI>Web site:
<A HREF="http://www.cs.bgu.ac.il/surge/index.html">http://www.cs.bgu.ac.il/surge/index.html</A></LI>
</UL>
</P>
<P>FUF is an extended implementation of the formalism of functional
unification grammars (FUGs) introduced by Martin Kay, specialized to
the task of natural language generation. It adds the following
features to the base formalism:
<UL>
<LI>Types and inheritance.</LI>
<LI>Extended control facilities (goal freezing, intelligent
backtracking).</LI>
<LI>Modular syntax.</LI>
</UL>

These extensions allow the development of large grammars which can be
processed efficiently and can be maintained and understood more
easily. SURGE is a large syntactic realization grammar of English
written in FUF. SURGE encapsulates a rich knowledge of English syntax
and is developed to serve as a black-box syntactic generation component
in a larger generation system. SURGE can also be used as a platform
for exploration of grammar writing with a generation perspective.</P>
<P>
<A NAME="GATE"></A> </P>
<DT><B>GATE</B><DD><P>
<UL>
<LI>Web site:
<A HREF="http://gate.ac.uk/">http://gate.ac.uk/</A></LI>
<LI>Alt site:
<A HREF="http://sourceforge.net/projects/gate">http://sourceforge.net/projects/gate</A></LI>
</UL>
</P>
<P>GATE (General Architecture for Text Engineering) is an architecture,
framework and development environment for developing, evaluating and
embedding Human Language Technology.</P>
<P>GATE is made up of three elements:
<UL>
<LI>An architecture describing how language processing systems are
made up of components.</LI>
<LI>A framework (or class library, or SDK), written in Java and
tested on Linux, Windows and Solaris.</LI>
<LI>A graphical development environment built on the framework.</LI>
</UL>
</P>
<P>
<A NAME="Grammar Workbench"></A> </P>
<DT><B>The Grammar Workbench</B><DD><P>
<UL>
<LI>Web site:
<A HREF="http://www.cs.kun.nl/agfl/">www.cs.kun.nl/agfl/</A></LI>
</UL>
</P>
<P>The Grammar Workbench appears to be obsolete: it is gone from the
site, though its parent project is still ongoing.</P>
<P>The Grammar Workbench, or GWB for short, is an environment for the
comfortable development of Affix Grammars in the AGFL formalism. Its
purposes are:
<UL>
<LI>to allow the user to input, inspect and modify a grammar; </LI>
<LI>to perform consistency checks on the grammar; </LI>
<LI>to compute grammar properties; </LI>
<LI>to generate example sentences; </LI>
<LI>to assist in performing grammar transformations. </LI>
</UL>
</P>
<P>
<A NAME="GSM Suite"></A> </P>
<DT><B>GSM Suite</B><DD><P>
<UL>
<LI>Alt site:
<A HREF="http://www.ibiblio.org/pub/Linux/apps/graphics/draw/">www.ibiblio.org/pub/Linux/apps/graphics/draw/</A></LI>
</UL>
</P>
<P>The GSM Suite is a set of programs for using Finite State
Machines in a graphical fashion. The suite consists of programs
that edit, compile, and print state machines. Included in the
suite are an editor program, gsmedit; a compiler, gsm2cc, that
produces a C++ implementation of a state machine; a PostScript
generator, gsm2ps; and two other minor programs. GSM is licensed
under the GNU General Public License and so is free for your use under
the terms of that license.</P>
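<P>The kind of table-driven state machine a compiler such as gsm2cc emits (in its case as C++) can be sketched briefly. This Python version is purely illustrative and does not reflect gsm2cc's actual output:</P>

```python
# Table-driven finite state machine: (state, event) -> next state.
# Unknown events leave the machine in its current state.
class StateMachine:
    def __init__(self, initial, transitions):
        self.state = initial
        self.transitions = transitions  # {(state, event): next_state}

    def fire(self, event):
        self.state = self.transitions.get((self.state, event), self.state)
        return self.state

# A hypothetical two-state door controller.
door = StateMachine("closed", {
    ("closed", "open"):  "open",
    ("open",   "close"): "closed",
})
assert door.fire("open") == "open"
assert door.fire("close") == "closed"
```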
<P>
<A NAME="Isabelle"></A> </P>
<DT><B>Isabelle</B><DD><P>
<UL>
<LI>Web site:
<A HREF="http://isabelle.in.tum.de/">isabelle.in.tum.de</A></LI>
</UL>
</P>
<P>Isabelle is a popular generic theorem prover developed at Cambridge
University and TU Munich. Existing logics like Isabelle/HOL provide a
theorem proving environment ready to use for sizable applications.
Isabelle may also serve as a framework for rapid prototyping of deductive
systems. It comes with a large library including Isabelle/HOL
(classical higher-order logic), Isabelle/HOLCF (Scott's Logic for
Computable Functions with HOL), Isabelle/FOL (classical and
intuitionistic first-order logic), and Isabelle/ZF (Zermelo-Fraenkel
set theory on top of FOL).</P>
<P>
<A NAME="Jess"></A> </P>
<DT><B>Jess, the Java Expert System Shell</B><DD><P>
<UL>
<LI>Web site:
<A HREF="http://herzberg.ca.sandia.gov/jess/">herzberg.ca.sandia.gov/jess/</A></LI>
</UL>
</P>
<P>Jess is a clone of the popular CLIPS expert system shell written
entirely in Java. With Jess, you can conveniently give your
applets the ability to 'reason'. Jess is compatible with all
versions of Java starting with version 1.0.2. Jess implements
the following constructs from CLIPS: defrules, deffunctions,
defglobals, deffacts, and deftemplates.</P>
<P>
<A NAME="learn"></A> </P>
<DT><B>learn</B><DD><P>
<UL>
<LI>Web site:
<A HREF="http://www.ibiblio.org/pub/Linux/apps/cai/">www.ibiblio.org/pub/Linux/apps/cai/</A></LI>
</UL>
</P>
<P>Learn is a vocabulary learning program with a memory model.</P>
<P>
<A NAME="LISA"></A> </P>
<DT><B>LISA</B><DD><P>
<UL>
<LI>Web site:
<A HREF="http://lisa.sourceforge.net/">lisa.sourceforge.net</A></LI>
</UL>
</P>
<P>LISA (Lisp-based Intelligent Software Agents) is a production-rule
system heavily influenced by JESS (Java Expert System Shell). It has at
its core a reasoning engine based on the Rete pattern-matching
algorithm. LISA also provides the ability to reason over ordinary CLOS
objects.</P>
<P>
<A NAME="Livingstone2"></A> </P>
<DT><B>Livingstone2</B><DD><P>
<UL>
<LI>Web site:
<A HREF="http://ti.arc.nasa.gov/opensource/projects/livingstone2/">http://ti.arc.nasa.gov/opensource/projects/livingstone2/</A></LI>
</UL>
</P>
<P>Livingstone2 (L2) is a reusable artificial intelligence (AI) software
system designed to assist spacecraft, life support systems, chemical
plants or other complex systems in operating robustly with minimal
human supervision, even in the face of hardware failures or unexpected
events.</P>
<P>
<A NAME="NICOLE"></A> </P>
<DT><B>NICOLE</B><DD><P>
<UL>
<LI>Web site:
<A HREF="http://nicole.sourceforge.net/">http://nicole.sourceforge.net/</A></LI>
</UL>
</P>
<P>NICOLE (Nearly Intelligent Computer Operated Language Examiner) is an
experiment testing whether a computer, given enough combinations of
how words, phrases and sentences are related to one another, could
talk back to you. It is an attempt to simulate a conversation by
learning how words are related to other words. A human communicates
with NICOLE via the keyboard and NICOLE responds with its own
sentences, which are automatically generated based on what NICOLE has
stored in its database. Each new sentence that NICOLE does not
already know is added to its database, thus
extending NICOLE's knowledge base.</P>
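<P>Generating replies from learned word relations can be sketched as a tiny bigram chain: record which word follows which, then walk those links to produce a sentence. This illustrates the general technique only, not NICOLE's actual database:</P>

```python
import random
from collections import defaultdict

# Learn which word tends to follow which, then generate a reply.
class WordChain:
    def __init__(self):
        self.followers = defaultdict(list)

    def learn(self, sentence):
        words = sentence.split()
        for a, b in zip(words, words[1:]):
            self.followers[a].append(b)

    def reply(self, start, length=5, seed=0):
        rng = random.Random(seed)  # seeded for reproducibility
        out = [start]
        for _ in range(length - 1):
            nxt = self.followers.get(out[-1])
            if not nxt:
                break  # no known follower: stop the sentence
            out.append(rng.choice(nxt))
        return " ".join(out)

chain = WordChain()
chain.learn("the cat sat on the mat")
assert chain.reply("the").startswith("the")
```

Every new sentence fed to `learn` extends the relation table, which is the self-extending behavior the description above refers to.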
<P>
<A NAME="Otter"></A> </P>
<DT><B>Otter: An Automated Deduction System</B><DD><P>
<UL>
<LI>Web site:
<A HREF="http://www-unix.mcs.anl.gov/AR/otter/">http://www-unix.mcs.anl.gov/AR/otter/</A></LI>
</UL>
</P>
<P>Otter is an automated deduction system designed to prove
theorems stated in first-order logic with equality. Otter's
inference rules are based on resolution and paramodulation, and it
includes facilities for term rewriting, term orderings, Knuth-Bendix
completion, weighting, and strategies for directing and restricting
searches for proofs. Otter can also be used as a symbolic
calculator and has an embedded equational programming system.</P>
<P>
<A NAME="PVS"></A> </P>
<DT><B>PVS</B><DD><P>
<UL>
<LI>Web site:
<A HREF="http://pvs.csl.sri.com/">pvs.csl.sri.com/</A></LI>
</UL>
</P>
<P>PVS is a verification system: that is, a specification language
integrated with support tools and a theorem prover. It is
intended to capture the state of the art in mechanized formal
methods and to be sufficiently rugged that it can be used for
significant applications. PVS is a research prototype: it
evolves and improves as we develop or apply new capabilities,
and as the stress of real use exposes new requirements.</P>
<P>
<A NAME="SNePS"></A> </P>
<DT><B>SNePS</B><DD><P>
<UL>
<LI>Web site:
<A HREF="http://www.cse.buffalo.edu/sneps/">www.cse.buffalo.edu/sneps/</A></LI>
</UL>
</P>
<P>The long-term goal of The SNePS Research Group is the design and
construction of a natural-language-using computerized cognitive
agent, together with the research in artificial intelligence,
computational linguistics, and cognitive science necessary for
that endeavor. The three-part focus of the group is on knowledge
representation, reasoning, and natural-language understanding
and generation. The group is widely known for its development of
the SNePS knowledge representation/reasoning system, and Cassie,
its computerized cognitive agent.</P>
<P>
<A NAME="Soar"></A> </P>
<DT><B>Soar</B><DD><P>
<UL>
<LI>Web site:
<A HREF="http://sitemaker.umich.edu/soar">sitemaker.umich.edu/soar</A></LI>
</UL>
</P>
<P>Soar has been developed to be a general cognitive architecture.
We intend ultimately to enable the Soar architecture to:
<UL>
<LI>work on the full range of tasks expected of an
intelligent agent, from highly routine to extremely difficult,
open-ended problems</LI>
<LI>represent and use appropriate forms of knowledge, such as
procedural, declarative, episodic, and possibly iconic</LI>
<LI>employ the full range of problem solving methods</LI>
<LI>interact with the outside world and</LI>
<LI>learn about all aspects of the tasks and its performance on them.</LI>
</UL>

In other words, our intention is for Soar to support all the
capabilities required of a general intelligent agent.</P>
<P>
<A NAME="TCM"></A> </P>
<DT><B>TCM</B><DD><P>
<UL>
<LI>Web site:
<A HREF="http://wwwhome.cs.utwente.nl/~tcm/">http://wwwhome.cs.utwente.nl/~tcm/</A></LI>
</UL>
</P>
<P>TCM (Toolkit for Conceptual Modeling) is a suite of graphical
editors. TCM contains graphical editors for Entity-Relationship
diagrams, Class-Relationship diagrams, Data and Event Flow
diagrams, State Transition diagrams, Jackson Process Structure
diagrams and System Network diagrams, Function Refinement trees
and various table editors, such as a Function-Entity table
editor and a Function Decomposition table editor. TCM is easy
to use and performs numerous consistency checks, some of them
immediately, some of them upon request.</P>
<P>
<A NAME="Yale"></A> </P>
<DT><B>Yale</B><DD><P>
<UL>
<LI>Web site:
<A HREF="http://yale.sf.net/">yale.sf.net/</A></LI>
<LI>Alt Web site:
<A HREF="http://rapid-i.com/">rapid-i.com/</A></LI>
</UL>
</P>
<P>YALE (Yet Another Learning Environment) is an environment for machine
learning experiments. Experiments can be made up of a large number of
arbitrarily nestable operators, and their setup is described by XML
files which can easily be created with a graphical user interface.
Applications of YALE cover both research and real-world learning tasks.</P>
<P>
<A NAME="WEKA"></A> </P>
<DT><B>WEKA</B><DD><P>
<UL>
<LI>Web site:
<A HREF="http://www.cs.waikato.ac.nz/~ml/">www.cs.waikato.ac.nz/~ml/</A></LI>
</UL>
</P>
<P>WEKA (Waikato Environment for Knowledge Analysis) is a
state-of-the-art facility for applying machine learning
techniques to practical problems. It is a comprehensive software
"workbench" that allows people to analyse real-world data. It
integrates different machine learning tools within a common
framework and a uniform user interface. It is designed to
support a "simplicity-first" methodology, which allows users to
experiment interactively with simple machine learning tools
before looking for more complex solutions.</P>
</DL>
</P>

<HR>
<A HREF="AI-Alife-HOWTO-3.html">Next</A>
<A HREF="AI-Alife-HOWTO-1.html">Previous</A>
<A HREF="AI-Alife-HOWTO.html#toc2">Contents</A>
</BODY>
</HTML>