Newsgroups: sci.physics,sci.skeptic,alt.consciousness,sci.psychology,comp.ai.philosophy,sci.bio,sci.philosophy.meta,rec.arts.books,rec.arts.sf.science
From: ohgs@chatham.demon.co.uk (Oliver Sparrow)
Path: cantaloupe.srv.cs.cmu.edu!das-news2.harvard.edu!news2.near.net!howland.reston.ans.net!pipex!demon!chatham.demon.co.uk!ohgs
Subject: Re: Information And Entropy (models)
Distribution: inet
References: <383svn$js9@galaxy.ucr.edu> <1994Oct20.214734.15940@forte.com> <38ad0j$gc4@scunix2.harvard.edu> <38h8cm$hua@netaxs.com> <39bnop$1d1@news-rocq.inria.fr> <39dme3$afl@netaxs.com> <malloy00.784047972@pentagon.io.com>
Organization: Royal Institute of International Affairs
Reply-To: ohgs@chatham.demon.co.uk
X-Newsreader: Demon Internet Simple News v1.27
Lines: 83
Date: Tue, 15 Nov 1994 13:48:13 +0000
Message-ID: <784907293snz@chatham.demon.co.uk>
Sender: usenet@demon.co.uk
Xref: glinda.oz.cs.cmu.edu sci.physics:100315 sci.skeptic:95292 sci.psychology:29753 comp.ai.philosophy:22061 sci.bio:23111 sci.philosophy.meta:14765

One needs to unbundle the idea of "information" to a degree in order to have  a 
useful discussion. We use the word in three senses:

1: In respect of data heaps.
2: In respect of interpretive structures which make something of data.
3: In respect of systems of understanding which offer recursive self-validation.

[Expand, using both sides of the paper :) ]

People talk about data in a number of ways. Given "pure" data, how much can you 
stuff through a conduit of a given capacity? Given signals which carry data, 
how much extraneous signal corresponds to how much lost data: S/N ratios and 
so forth. Given pure signals which represent the entities in which data are 
encoded - e.g. ASCII - then how much *latent* data is embedded in the data 
heap, given particular ways of digging it out? This last is the random 
generator / monkey pumping out Shakespeare if it pumps hard enough and long 
enough: but how to *detect* Shakespeare in the heap?
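The "conduit of a given capacity" question is Shannon's: capacity grows with 
bandwidth and with the logarithm of the signal-to-noise ratio. A minimal 
sketch (the bandwidth and S/N figures below are illustrative, not taken from 
anything above):

```python
import math

def shannon_capacity(bandwidth_hz, snr_linear):
    """Shannon-Hartley channel capacity in bits per second."""
    return bandwidth_hz * math.log2(1 + snr_linear)

# A hypothetical 3 kHz voice-grade line at 30 dB S/N (linear ratio 1000):
capacity = shannon_capacity(3000, 1000)
print(round(capacity))  # roughly 30,000 bits per second
```

Doubling the bandwidth doubles the capacity, but doubling the S/N ratio only 
adds one bit per symbol: hence the asymmetry between "more conduit" and 
"cleaner conduit".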

These are obviously different issues and not susceptible to a single view. (I 
think!) Consider the second point: that a datum is made into information by a 
framework of interpretation: that "42" becomes information when the datum is 
offered within a context: "What's my room number, please?" "What salary are you 
offering?" Same number, different contexts. The "Shakespeare" issue is just the 
same: there is a discrete but large set of numbers which corresponds to the 
entire works of Shakespeare, differently encoded and in different orders as 
seen by a particular "Shakespeare detector", of which one assumes that there 
could be an effectively infinite number. Thus in principle any large number of 
sufficient complexity has latent in it the Works of Bill, given an 
interpretive frame by which to detect and translate it; and vice versa. (This 
is analogous to the body's antigen-antibody problem.)
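One can make the point brutally concrete: given any data heap and any target 
text, an "interpretive frame" that finds the text latent in the heap always 
exists. The sketch below uses a deliberately trivial frame - an XOR key - as 
a stand-in for the "Shakespeare detector":

```python
import os

target = b"To be, or not to be"          # the Works of Bill, abbreviated
heap = os.urandom(len(target))           # an arbitrary data heap

# Construct the interpretive frame: a key under which the heap *is* the target.
frame = bytes(h ^ t for h, t in zip(heap, target))

decoded = bytes(h ^ f for h, f in zip(heap, frame))
assert decoded == target
print(decoded.decode())  # To be, or not to be
```

The catch, of course, is that the frame here carries all the information: 
constructing it required knowing the target in advance. Detecting Shakespeare 
without already possessing him is the hard part.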

Finally, there are some information structures which are self-supporting within 
a given framework. These are not easily accessible, but consider many economic 
phenomena, which are both a component of the elements from which they are built 
and a contributor to their own perpetuation and that of related systems. A set 
of markets, for example, creates forces and information which enables the 
existence of the media of exchange, and therefore of markets. In ecology, animals 
make forests and forests provide the homes for animals. Emergent systems - 
convection cells, cellular automata - generate broad phenomena from local 
interactions. Coupled to an inheritable memory, "successful" emergent systems 
become better at their task (teleology rules, OK?) and come to dominate the 
scene. They are the product of their interaction with that scene, however, and 
indistinguishable from it.
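The cellular-automaton case is the easiest to exhibit: a one-dimensional rule 
such as Wolfram's rule 30, in which each cell consults only itself and its two 
neighbours, generates broad and intricate structure from nothing but local 
interactions. A minimal sketch:

```python
RULE = 30  # the local update table, encoded as 8 bits

def step(cells):
    """One generation: each cell looks only at itself and its two neighbours."""
    n = len(cells)
    return [
        (RULE >> (cells[(i - 1) % n] * 4 + cells[i] * 2 + cells[(i + 1) % n])) & 1
        for i in range(n)
    ]

# Start from a single live cell and watch structure spread from local rules.
cells = [0] * 31
cells[15] = 1
for _ in range(12):
    print("".join("#" if c else "." for c in cells))
    cells = step(cells)
```

Nothing in the rule mentions the triangular, quasi-random lattice that 
appears; it is a property of the interaction, not of any cell.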

*The* problem with AI is that of knowledge representation. All - well, many - 
of the difficulties raised by critics and which are encountered in what passes 
for reality are to do with the issue of contextuality: how do I represent 
"ball" as something other than an abstract symbol which has a bundle of 
abstract properties set in terms of other abstract properties? How do I inject 
common sense "ballness"? Have I done more than assemble a data heap? If I force 
connections onto a system, what am I doing which is more than Airfix-and-Glue?

My suspicion is that the answer to this lies neither in the direction 
signposted "to cognition" nor in the direction of Huge Dataheaps. Rather, 
we should look at what happens when a network of neural networks comes to learn.
In essence, we are looking for systems which can - gradually, under guidance - 
learn to separate raw percept (video stream data, Reuters text data, global 
climatic data...) into clusters; find ways of managing these clusters such that 
the resulting filtration allows greater discrimination within the afferent 
data stream. We should be looking for this improvement within the context of 
some over-arching goals, which are probably judgemental of the outcome of many 
trials rather than ex ante rules laid down by "interviewing experts". 
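A present-day sketch of that first stage - separating a raw stream into 
clusters with no ex ante rules, only a judgemental criterion (within-cluster 
distance) applied over many trials - is something like k-means. The data 
below are two artificial "percept" blobs, standing in for any raw stream:

```python
import random

def kmeans(points, k, iters=20, seed=0):
    """Cluster 2-D points by alternating assignment and centroid update."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    for _ in range(iters):
        # Assign each point to its nearest centroid...
        clusters = [[] for _ in range(k)]
        for p in points:
            j = min(range(k),
                    key=lambda i: (p[0] - centroids[i][0]) ** 2
                                + (p[1] - centroids[i][1]) ** 2)
            clusters[j].append(p)
        # ...then move each centroid to the mean of its cluster.
        centroids = [
            (sum(x for x, _ in c) / len(c), sum(y for _, y in c) / len(c))
            if c else centroids[i]
            for i, c in enumerate(clusters)
        ]
    return centroids, clusters

rng = random.Random(1)
points = [(rng.gauss(0, 0.5), rng.gauss(0, 0.5)) for _ in range(50)] \
       + [(rng.gauss(5, 0.5), rng.gauss(5, 0.5)) for _ in range(50)]
centroids, clusters = kmeans(points, 2)
print(sorted(len(c) for c in clusters))  # the sizes of the discovered clusters
```

Nobody told the system where the blobs were: the grouping falls out of 
repeated trials judged against an over-arching criterion, which is the shape 
of learning gestured at above.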

There are two interesting papers in this respect in this week's and last 
week's Nature. In last week's issue, researchers were looking at the way in 
which the human brain detected anomalous sounds in a sound stream. How did we 
detect difference: "here is something to note"? In this week's edition, the 
researchers looked at how neural networks learned to detect patterns when 
these were presented in a variety of orientations, as occurs in the natural 
world at the retina of the eye. They found that the networks could cope with 
this task, but that the pattern which best satisfied the weights which evolved 
was typically highly symmetrical in one, two or more planes. In other words, 
the networks had developed a general principle (abstract, detectable only to 
the educated eye of the observer) by which to find complex patterns in any 
orientation. I suggest that nobody would dream of designing such a system ex 
ante, or would know how to do so. Having found how to do it, however, the 
neural networks would be in a position to play a role in a broader system of 
filtration, separation, modeling, symbol extraction and recombination: of 
artificial intelligence. 

---------------------------------------------
  Oliver Sparrow
  ohgs@chatham.demon.co.uk
