<!--startcut ==========================================================-->
<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 3.2//EN">
<HTML>
<HEAD>
<title>Linux and AI Issue 19</title>
</HEAD>
<BODY BGCOLOR="#f7f7f7" TEXT="#000000" LINK="#0000FF" VLINK="#007353"
ALINK="#FF0000">
<!--endcut ============================================================-->
<H4>
&quot;Linux Gazette...<I>making Linux just a little more fun!</I>&quot;
</H4>
<P> <HR> <P>
<!--===================================================================-->
<center>
<H2>Linux and Artificial Intelligence</H2>
<H4>By John Eikenberry,
<a href="mailto:jae@ai.uga.edu">jae@ai.uga.edu</a></H4>
</center>
<P><HR>
Three years ago, when I was starting the final year of my master's
degree in philosophy, I found myself asking that eternal question,
"Ok, now what in the hell am I going to do?" Not wanting to
continue on in philosophy, what could a philosopher (and computer
enthusiast) do that would be both fun and profitable? Artificial
Intelligence, of course (but you saw that coming, didn't you?)
<P></P>
I had fallen in love with Linux in late 1993, and after seeing all
the Suns scattered about the AI department, it seemed like the
perfect OS for AI research. Guess what: I was right. I have found
so many resources available for doing AI research on Linux that I
had to write them all down (warning: blatant plug follows), thus my <a
href="http://www.ai.uga.edu/students/jae/ai.html">Linux AI/Alife
mini-HOWTO</a> came into being. <P></P>
Ok, enough of this drivel, now on to the meat of the article.
<P></P>
Modern AI is a many-faceted field of research, dealing with
everything from 'traditional' logic-based systems to
connectionism, evolutionary computing, artificial life, and
autonomous agents. With Unix being the main platform for AI, there
are many excellent resources available for Linux in each of these
areas. In the rest of this article I'll give a brief description
of each of these areas, along with one of the more interesting
resources available to the Linux user. <P></P>
<hr>
<DL>
<DT><b>PROGRAMMING LANGUAGES</B>
<DD>
I know I didn't mention this above, but many programming
languages have been designed specifically with AI
applications in mind.
</DD>
<P></P>
<DT>DFKI OZ<BR>
Web page: <A HREF="http://www.ps.uni-sb.de/oz/">www.ps.uni-sb.de/oz/</A><BR>
FTP site: <A HREF="ftp://ps-ftp.dfki.uni-sb.de/pub/oz2/">ps-ftp.dfki.uni-sb.de/pub/oz2/</A>
<DD>
Oz is a high-level programming language designed for
concurrent symbolic computation. It is based on a new
computation model providing a uniform and simple foundation
for several programming paradigms, including higher-order
functional, constraint logic, and concurrent object-oriented
programming. Oz is designed as a successor to languages such
as Lisp, Prolog and Smalltalk, which fail to support
applications that require concurrency, reactivity, and
real-time control. <P></P>
DFKI Oz is an interactive implementation of Oz featuring a
programming interface based on GNU Emacs, a concurrent
browser, an object-oriented interface to Tcl/Tk, powerful
interoperability features (sockets, C, C++), an incremental
compiler, a garbage collector, and support for stand-alone
applications. Performance is competitive with commercial
Prolog and Lisp systems.
</DD>
</DL>
<hr>
<DL>
<DT><B>TRADITIONAL ARTIFICIAL INTELLIGENCE</B>
<DD>
Traditional AI is based around the ideas of logic, rule
systems, linguistics, and the concept of rationality. At its
roots are programming languages such as Lisp and Prolog.
Expert systems are the largest successful example of this
paradigm. An expert system consists of a detailed knowledge
base and a complex rule system to utilize it. Such systems
have been used for such things as medical diagnosis support
and credit checking systems.
</DD>
<P></P>
<DT>SNePS<BR>
Web site: <A
HREF="http://www.cs.buffalo.edu/pub/sneps/WWW/">www.cs.buffalo.edu/pub/sneps/WWW/</A><BR>
FTP site: <A
HREF="ftp://ftp.cs.buffalo.edu/pub/sneps/">ftp.cs.buffalo.edu/pub/sneps/</A>
<DD>
The long-term goal of The SNePS Research Group is the design
and construction of a natural-language-using computerized
cognitive agent, and carrying out the research in artificial
intelligence, computational linguistics, and cognitive science
necessary for that endeavor. The three-part focus of the group
is on knowledge representation, reasoning, and natural-language
understanding and generation. The group is widely known for its
development of the SNePS knowledge representation/reasoning
system, and Cassie, its computerized cognitive agent.
</DD>
</DL>
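To make the rule-system idea concrete, here is a toy forward-chaining inference loop in Python. The facts, rules, and the diagnosis-support flavor are purely illustrative inventions for this sketch, not how SNePS or any real expert-system shell works.

```python
def forward_chain(facts, rules):
    """Repeatedly fire any rule whose premises are all known facts,
    adding its conclusion, until nothing new can be derived."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if conclusion not in facts and all(p in facts for p in premises):
                facts.add(conclusion)
                changed = True
    return facts

# A tiny, purely illustrative knowledge base: each rule is
# (premises, conclusion), echoing the diagnosis-support example above.
rules = [
    (("fever", "cough"), "flu-suspected"),
    (("flu-suspected", "fatigue"), "recommend-rest"),
]

derived = forward_chain({"fever", "cough", "fatigue"}, rules)
```

Real expert systems add conflict resolution and much larger knowledge bases, but the fire-rules-until-quiescent loop is the same basic shape.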
<hr>
<DL>
<DT><B>CONNECTIONISM</B>
<DD>
Connectionism is a technical term for a group of related
techniques, including areas such as Artificial Neural
Networks, Semantic Networks, and a few other similar
ideas. My present focus is on neural networks (though I am
looking for resources on the other techniques). Neural
networks are programs designed to simulate the workings of the
brain. They consist of a network of simple mathematical
nodes that work together to form patterns of information.
They have tremendous potential and currently seem to be having
a great deal of success with image processing and robot
control.
</DD>
<P></P>
<DT>PDP++<BR>
Web site: <a href="http://www.cnbc.cmu.edu/PDP++/">www.cnbc.cmu.edu/PDP++/</a><br>
FTP site (US): <a href="ftp://cnbc.cmu.edu/pub/pdp++/">cnbc.cmu.edu/pub/pdp++/</a><br>
FTP site (Europe): <a href="ftp://unix.hensa.ac.uk/mirrors/pdp++/">
unix.hensa.ac.uk/mirrors/pdp++/ </a>
<DD>
As the field of connectionist modeling has grown, so has the
need for a comprehensive simulation environment for the
development and testing of connectionist models. Our goal in
developing PDP++ has been to integrate several powerful
software development and user interface tools into a general
purpose simulation environment that is both user friendly and
user extensible. The simulator is built in the C++ programming
language, and incorporates a state-of-the-art script
interpreter with the full expressive power of C++. The
graphical user interface is built with the Interviews toolkit,
and allows full access to the data structures and processing
modules out of which the simulator is built. We have
constructed several useful graphical modules for easy
interaction with the structure and the contents of neural
networks, and we've made it possible to change and adapt many
things. At the programming level, we have set things up in
such a way as to make user extensions as painless as
possible. The programmer creates new C++ objects, which might
be new kinds of units or new kinds of processes; once compiled
and linked into the simulator, these new objects can then be
accessed and used like any other.
</DD>
</DL>
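The "network of simple mathematical nodes" idea can be sketched with the smallest possible case: a single perceptron node trained on the logical AND function. The names and training constants here are illustrative choices for this sketch, not anything taken from PDP++.

```python
def step(x):
    # Threshold activation: the node fires (1) when its weighted
    # input sum is non-negative.
    return 1 if x >= 0 else 0

def train_perceptron(samples, epochs=20):
    w = [0, 0]  # one weight per input
    b = 0       # bias term
    for _ in range(epochs):
        for (x1, x2), target in samples:
            out = step(w[0] * x1 + w[1] * x2 + b)
            err = target - out       # perceptron learning rule:
            w[0] += err * x1         # nudge weights toward the target
            w[1] += err * x2
            b += err
    return w, b

# Truth table for AND: the node should learn to fire only on (1, 1).
and_data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(and_data)
```

A real network wires many such nodes into layers; packages like PDP++ exist precisely because managing that structure by hand quickly becomes unwieldy.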
<hr>
<DL>
<DT><B>EVOLUTIONARY COMPUTING [EC]</B>
<DD>
Evolutionary computing is actually a broad term for a vast
array of programming techniques, including genetic algorithms,
complex adaptive systems, evolutionary programming, etc.
The main thrust of all these techniques is the idea of
evolution: that a program can be written which will
<i>evolve</i> toward a certain goal. This goal can be
anything from solving an engineering problem to winning a
game.
</DD>
<P></P>
<DT>GAGS<BR>
Web site: <A HREF="http://kal-el.ugr.es/gags.html">kal-el.ugr.es/gags.html</A><BR>
FTP site: <A HREF="ftp://kal-el.ugr.es/GAGS/">kal-el.ugr.es/GAGS/</A>
<DD>
A Genetic Algorithm application generator and class library
written mainly in C++.
<BR>
As a class library, and among other things, GAGS includes:
<UL>
<LI>A <em>chromosome hierarchy</em> with variable length
chromosomes. <em>Genetic operators</em>: 2-point crossover,
uniform crossover, bit-flip mutation, transposition (gene
interchange between 2 parts of the chromosome), and
variable-length operators: duplication, elimination, and
random addition.
<LI><em>Population level operators</em> include steady
state, roulette wheel and tournament selection.
<LI><em>Gnuplot wrapper</em>: turns gnuplot into an
<code>iostreams</code>-like class.
<LI>Easy sample file loading and configuration file parsing.
</ul>
As an application generator (written in <code>PERL</code>),
you only need to supply it with an ANSI-C or C++ fitness
function, and it creates a C++ program that uses the above
library to 90% capacity, compiles it, and runs it, saving
results and presenting fitness through <code>gnuplot</code>.
</DD>
</DL>
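The evolve-toward-a-goal idea above fits in a few lines of Python: a simple (1+1) evolutionary algorithm climbing toward an all-ones bit string (the classic "OneMax" toy problem). The fitness function, mutation rate, and loop structure are illustrative choices for this sketch, not taken from GAGS.

```python
import random

def fitness(bits):
    # Goal: maximize the number of 1 bits ("OneMax").
    return sum(bits)

def mutate(bits, rate):
    # Flip each bit independently with probability `rate`.
    return [1 - b if random.random() < rate else b for b in bits]

def evolve(n=16, steps=2000, rate=1.0 / 16, seed=0):
    random.seed(seed)  # fixed seed so the run is repeatable
    parent = [random.randint(0, 1) for _ in range(n)]
    for _ in range(steps):
        child = mutate(parent, rate)
        if fitness(child) >= fitness(parent):  # keep the better string
            parent = child
    return parent

best = evolve()
```

A full genetic algorithm adds a population, crossover, and selection schemes like the roulette-wheel and tournament operators GAGS provides, but the mutate-and-select loop is the evolutionary core.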
<hr>
<DL>
<DT><b>ALIFE</b>
<DD>
Alife takes yet another approach to exploring the mysteries of
intelligence. It has many aspects similar to EC and
connectionism, but takes these ideas and gives them a
meta-level twist. Alife emphasizes the development of
intelligence through <i>emergent</i> behavior of <i>complex
adaptive systems</i>. Alife stresses the social or
group-based aspects of intelligence. It seeks to understand
life and survival. By studying the behaviors of groups of
'beings', Alife seeks to discover how intelligence or
higher-order activity emerges from seemingly simple
individuals. Cellular Automata and Conway's Game of Life are
probably the most commonly known applications of this field.
</DD>
<P></P>
<DT>Tierra<BR>
Web site: <A HREF="http://www.hip.atr.co.jp/~ray/tierra/tierra.html">www.hip.atr.co.jp/~ray/tierra/tierra.html</A> <br>
FTP site: <A HREF="ftp://alife.santafe.edu/pub/SOFTWARE/Tierra/">alife.santafe.edu/pub/SOFTWARE/Tierra/</A><BR>
Alternate FTP site: <a href="ftp://ftp.cc.gatech.edu/ac121/linux/science/biology/">ftp.cc.gatech.edu/ac121/linux/science/biology/</a>
<DD>
Tierra is written in the C programming language. The source
code creates a virtual computer and its operating system,
whose architecture has been designed in such a way that the
executable machine codes are evolvable. This means that the
machine code can be mutated (by flipping bits at random) or
recombined (by swapping segments of code between algorithms),
and the resulting code remains functional enough of the time
for natural (or presumably artificial) selection to be able to
improve the code over time.
</DD>
</DL>
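Conway's Game of Life, named above as the best-known example of the field, can be sketched in Python by storing live cells as a set of (row, column) coordinates. This is a minimal illustration of emergence from simple local rules; it has nothing to do with Tierra's actual implementation.

```python
from itertools import product

def neighbors(cell):
    # The eight cells surrounding a given (row, col) position.
    r, c = cell
    return {(r + dr, c + dc)
            for dr, dc in product((-1, 0, 1), repeat=2)
            if (dr, dc) != (0, 0)}

def step(live):
    """One generation: a dead cell is born with exactly 3 live
    neighbors; a live cell survives with 2 or 3."""
    candidates = live | {n for cell in live for n in neighbors(cell)}
    return {cell for cell in candidates
            if len(neighbors(cell) & live) == 3
            or (cell in live and len(neighbors(cell) & live) == 2)}

# A "blinker": three cells in a row that oscillate with period 2.
blinker = {(1, 0), (1, 1), (1, 2)}
```

Nothing in `step` mentions oscillators, gliders, or any other higher-order structure; they all emerge from the two local rules, which is the point the Alife section is making.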
<hr>
<DL>
<DT><B>AUTONOMOUS AGENTS</B>
<DD>
Also known as intelligent software agents or just agents, this
area of AI research deals with simple applications of small
programs that aid the user in his/her work. They can be mobile
(able to stop their execution on one machine and resume it on
another) or static (live in one machine). They are usually
specific to the task (and therefore fairly simple) and meant
to help the user much as an assistant would. The most widely
known use of this type of application to date is the web
robots that many of the indexing engines
(e.g., WebCrawler) use.
</DD>
<P></P>
<DT>Ara<BR>
Web site: <A HREF="http://www.uni-kl.de/AG-Nehmer/Ara/">www.uni-kl.de/AG-Nehmer/Ara/</A>
<DD>
Ara is a platform for the portable and secure execution of
mobile agents in heterogeneous networks. Mobile agents in this
sense are programs with the ability to change their host
machine during execution while preserving their internal
state. This enables them to handle interactions locally that
would otherwise have to be performed remotely. Ara's specific aim in
comparison to similar platforms is to provide full mobile
agent functionality while retaining as much as possible of
established programming models and languages.
</DD>
</DL>
<!--===================================================================-->
<P> <hr> <P>
<center><H5>Copyright &copy; 1997, John Eikenberry<BR>
Published in Issue 19 of the Linux Gazette, July 1997</H5></center>
<!--===================================================================-->
<P> <hr> <P>
<A HREF="./index.html"><IMG ALIGN=BOTTOM SRC="../gx/indexnew.gif"
ALT="[ TABLE OF CONTENTS ]"></A>
<A HREF="../index.html"><IMG ALIGN=BOTTOM SRC="../gx/homenew.gif"
ALT="[ FRONT PAGE ]"></A>
<A HREF="./hallways.html"><IMG SRC="../gx/back2.gif"
ALT=" Back "></A>
<A HREF="./program.html"><IMG SRC="../gx/fwd.gif" ALT=" Next "></A>
<P> <hr> <P>
<!--startcut ==========================================================-->
</BODY>
</HTML>
<!--endcut ============================================================-->