Information ecologies and bunnies

Bill Barowy (wbarowy who-is-at lesley.edu)
Sat, 6 Feb 1999 14:13:44 -0500

It is interesting to pursue further the various concepts of information and
meaning within an activity-theoretical framework, and see if there is any
purchase there. I wonder - 'information' is a problematic term, as is
'meaning', as all the other posts of this thread have indicated. It is
really a quagmire - 'information' has even crept into physics, where
exploitation of specially prepared and spatially separated elements of a
physical system (Einstein, Podolsky, and Rosen, 1935) has been speculated
to allow faster-than-light propagation of 'information'. On that
speculation, two humans could send 'information' back and forth to one
another instantly, even though they were 30 light years apart - a distance
light takes 30 years to cross.

On the other hand, it might appear that these and other phenomena can
also be understood without the concept of 'information'. How? At this
moment I am not sure; I'm just writing out a hunch. I'll try to
stick to simple situations, such as light detectors, mousetraps, and
bunnies. The latter I am fond of, and pleased to find in Leontyev's
writing as well.

First, Jay is right about the beginnings of cybernetics in control systems,
and that is where one finds applied cybernetics now, transformed into and
through engineering, although Wiener and his cybernetic followers have been
more interested in understanding the similarities of humans and computing
machines, and this takes cybernetics beyond control systems. Curiously, if
you view computing machines as human artifacts, and in a Peircian sense,
artifacts behave according to nature as do humans, then the exploration of
the similarities (and the differences!) between humans and artifacts qua
computing machines places cybernetics and activity theory close together in
research interests. And then a lot of the work in AI, as a child of
cybernetics with generational mutations, simulating humans with machines,
makes sense, and may resist some anti-cognitivist sieges. But that is
another story.

You may consider the following reductionist. I prefer to think of it as
Peircist. But I could be wrong.

In the first situation, consider a light source and a light detector, and
some apparatus that is designed to do something upon certain conditions of
the light detector having been met. This is the kind of 'system' you might
find at home - it opens your garage door when you drive in the driveway, or
turns on your lights when someone walks in front of the detector - and it
is also the kind of 'system' that you will find in the fiber-optic network
cable that probably has made possible your reception of this email, hence
its link to 'information' technology. The system may have been simply
designed to perform an action if the light impinging on the detector
crosses certain thresholds, and this is the basis of the digital logic
design of electro-optical interfaces for computer networks. Suppose that if
the light is intense enough, the device registers a '1', and if the light
is sufficiently weak, the device registers a '0'.
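The thresholding described above can be sketched in a few lines of code. This is a minimal illustration, not any actual interface specification; the threshold values are made up, and real electro-optical receivers are of course analog circuits rather than programs.

```python
# Illustrative sketch of the detector logic described above: light
# intense enough registers a '1', sufficiently weak light a '0'.
# The threshold values are invented for illustration only.

HIGH_THRESHOLD = 0.7  # above this intensity the detector registers a 1
LOW_THRESHOLD = 0.3   # below this intensity the detector registers a 0

def register(intensity):
    """Map a light intensity (0.0 to 1.0) to a logic level, or None in
    the ambiguous band between the two thresholds."""
    if intensity >= HIGH_THRESHOLD:
        return 1
    if intensity <= LOW_THRESHOLD:
        return 0
    return None  # neither condition met: no state registered

print(register(0.9))  # bright light  -> 1
print(register(0.1))  # weak light    -> 0
```

The point of the ambiguous middle band is the same one made in the text: the device does not 'interpret' everything that reaches it, only what crosses the conditions it was designed around.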

This might be a simple manifestation of some physical phenomenon - the
'interpretation' of a 'signal'. One could say that the device 'interprets'
the light 'signal' and renders a '1' or a '0' accordingly. Then one might
say that the system has 'received' the 'information' of a '1' or a '0'.
It is worth noting that the system must be tuned to this 'signal'. If
the wavelength of the light is not within the range to which the detector
is sensitive, then the system does not 'interpret' the 'signal' at all.
There is a mutuality in both the design and the definition of the system
and the signal. Other lights of different wavelengths impinge on the
detector, but if the system is not designed to interpret them - for
example, if the wavelength is not within the range of the detector's
sensitivity - they are not 'signals', i.e. they do not affect the state of
the system.
Conversely, the light to which the detector of our system is tuned, when
impinging upon another system of different design, may not be interpreted
as a signal.
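This mutuality of system and signal might be sketched as follows. The sensitivity ranges are illustrative stand-ins (roughly visible light for one detector, infrared for another), not real device specifications.

```python
# Sketch of the 'tuning' idea above: a light counts as a signal only
# for a detector whose sensitivity range contains its wavelength.
# The ranges (in nanometres) are rough illustrative values.

def makes_signal(wavelength_nm, sensitive_range):
    """A light is a 'signal' for a detector only if its wavelength
    falls within that detector's sensitivity range."""
    low, high = sensitive_range
    return low <= wavelength_nm <= high

visible_detector = (400, 700)    # tuned roughly to visible light
infrared_detector = (800, 1600)  # a system of different design

# The very same light is a signal for one system and not the other.
print(makes_signal(550, visible_detector))   # True: can change its state
print(makes_signal(550, infrared_detector))  # False: not a 'signal' here
```

Nothing about the light itself makes it a signal; the predicate only holds relative to a particular detector's range, which is the mutual definition the paragraph describes.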

But why do we speak of 'information' in this system? Under what conditions
do we find this term useful? The second situation - a baited mousetrap -
is similar in function to the first. This system is designed so that when
a condition of the detector is met, i.e. the pressing down of the trigger,
the system performs a certain action, i.e. the mouse is caught (or your
finger is snapped). This system is even simpler than the first. If the
pressure on the trigger is intense enough, the system catches the mouse (or
you feel pain). The state of the unsprung trap may re-present a '0' and
that of the sprung trap may re-present a '1'.
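The mousetrap can be sketched as a one-way, two-state system: state 0 (unsprung) passes to state 1 (sprung) once the trigger pressure crosses a threshold, and never passes back. The threshold figure is invented for illustration.

```python
# Sketch of the mousetrap as a two-state system: unsprung (0) becomes
# sprung (1) when the trigger pressure is intense enough, and the
# transition is irreversible. The threshold value is illustrative.

TRIGGER_THRESHOLD = 5.0  # grams of force; an invented figure

class Mousetrap:
    def __init__(self):
        self.state = 0  # 0 = unsprung, 1 = sprung

    def press(self, force):
        """Apply force to the trigger; spring the trap if the force is
        intense enough. Once sprung, the trap stays sprung."""
        if self.state == 0 and force >= TRIGGER_THRESHOLD:
            self.state = 1
        return self.state

trap = Mousetrap()
print(trap.press(1.0))   # 0: too gentle, the trap stays unsprung
print(trap.press(10.0))  # 1: the trap performs its action
```

Seen this way, the trap is the same kind of thing as the light detector - a condition met, a state changed - which is exactly why attaching 'information' to one and not the other feels arbitrary.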

If the use of 'information' in the first situation bothers you, then
perhaps its use in this situation will too. It certainly does me. Although
fundamentally the actions of the two systems are very similar - both
change their states when the right conditions are met - it seems fantastic
to say that the dead mouse, or the sprung mousetrap, or the throbbing
finger, carries 'information'. Rather, it makes more sense to say the trap
'performed an action' - it did something that could affect something else.

The third situation treats my bunny, mentioned on this list before. I am
fairly sure now that our bunny is deaf. It may be that her deafness is a
consequence of the limited genetic pool in which floppy eared bunnies are
bred, especially as the ear phenotype is recessive, and its popularity
drives breeders, who want to maximize their profits, towards inbreeding.
We have found that when the bunny is sitting and facing the stone wall, she
does not detect our approach until she sees us up close. Being startled,
she runs quickly into her hutch, and sometimes bumps the wall, and then
thumps her back feet as if in anger or pain. Noting this, we decided to
give her advance warning. But her response remained the same even as we
approached while talking loudly to her, to announce our arrival. Yet if
the bunny sees us approaching, her response is quite different, often
running excitedly back and forth in the pen, perhaps in expectation of some
vegetable cuttings from our evening meal preparation.

Through talking to the bunny, we try to send 'information' to her, but she
just doesn't get it. (Other bunnies we have owned did get it.) In her
deafness, she is not tuned to audio 'signals'. We have experimented. In
our experiments, loud noises created out of her sight have also failed to
arouse her. The bunny is tuned to visible light, and if she sees our
presence, we can observe the changes in her state, such as hopping back and
forth. Even though bunnies are simpler than humans, they are far more
complex than mousetraps. One could say that the light
signal (my visible approach) interpreted by the bunny means something to
her (a tasty snack). But using 'information' and 'meaning' to describe
bunny's behavior seems a reach.

Rather than saying that bunny interprets my visible approach as meaningful,
as an indication of tasty snacks, I'd prefer to say that bunny's behavior
and my behavior have co-evolved through a history of prior interactions.
We are an ecological activity system - a system that involves many things
and perhaps multiple objects. As I work in my kitchen, I think that bunny
will appreciate scraps of vegetables, because I have seen her behavior upon
these conditions before. So I take some veggies out to her. She responds
in turn. Our behavior is mapped out as an ecological system is mapped out,
the system behaving over time, as it will, through co-evolved
relationships, unless perturbed by other influences. So bunny and I form an
open system, including the pen, hutch, food, etc., and I can arbitrarily and
artificially draw boundaries around the system for the purposes of
analysis. Within this system there are signals, and system and signal are
mutually defined, as is also 'information'.

But I think I can get by without using 'signal' and 'information'. Rather,
bunny and I interact through a variety of processes: visual, artifactual
(food, fence, box, etc.), and tactile (petting, allowing herself to be
petted). And
so I think I find myself coming around to Mike's notions of interaction
used quite generally, and Naoki's views of mutuality.

I think, similarly, information and signals in cultural systems of people
and artifacts are mutually, albeit vaguely, defined within that system.
When signals of another culture impinge on one, they may carry little or no
information - so listening to a foreign language that is truly foreign may
carry no information to the listener. In Jay's example, what is
information to the librarians may not be the same as what is information to
the computer group. Because they do share a great deal by being human, and
American, and literate, etc., there may be a great deal of overlap in what
they do term information. But I'll leave the theorizing about information
and cultural systems, and the translation to and from activity theory to
others. I have a bunny to concentrate on.

Bill Barowy, Associate Professor
Technology in Education
Lesley College, 31 Everett Street, Cambridge, MA 02138-2790
Phone: 617-349-8168 / Fax: 617-349-8169
http://www.lesley.edu/faculty/wbarowy/Barowy.html
_______________________
"One of life's quiet excitements is to stand somewhat apart from yourself
and watch yourself softly become the author of something beautiful."
[Norman Maclean, "A River Runs Through It"]