signed languages

Connie Mayer (cmayer who-is-at oise.on.ca)
Sat, 14 Oct 1995 16:11:33 -0400 (EDT)

As a teacher/researcher in the field of deaf education, I have found the
current discussion on xmca particularly interesting. With respect to the
discussion of signed languages, I would stress the point that although
signed languages (such as ASL or LSQ) are visual/spatial as opposed to
auditory/oral, they are nevertheless languages and not gesture systems.
This may seem an obvious point but it has only been relatively recently
(since the work of William Stokoe in the early 1960s) that ASL has been
acknowledged as a legitimate language. Eugene Matusov also mentioned in
his post of Oct.13 that "sign language was invented and developed within
human culture using oral verbal language." While hearing people
sometimes played a role in promoting the use of signed languages
(in educational settings for example), signed languages are the natural
languages of deaf people around the world and were not "invented" by the
hearing for the deaf. Harlan Lane's book "When the Mind Hears" presents a
comprehensive overview of some of these issues.

Of course, there are some "linguistic overlaps" between the signed language
of a particular deaf community and the oral/written language of the majority
culture in which the signed language must co-exist. For example, ASL uses
initialized signs in which the first letter of the English word determines the
handshape used in forming the sign (the sign for "philosophy" uses a "p"
handshape from the fingerspelled alphabet in the production of the sign).
ASL is influenced by an oral language (English), but it is not born of that
language.

As well, many studies have shown that the acquisition of signed language
and the acquisition of spoken language proceed at a similar pace with
similar stages of development. There has been some discussion that first
signs precede first words; more recently, however, it has been argued that
there is no "sign advantage" for early acquisition but rather that there
is a "gesture advantage" for prelinguistic communication in both deaf and
hearing children (Volterra and Erting, 1994 and Volterra and Iverson, 1995).

The parallel development of signed and spoken language in the deaf
and hearing child respectively assumes, of course, that the deaf child
will have access to a comprehensible visual language. For most deaf children
(more than 90% of deaf children have hearing parents), such access is
lacking. The majority of students I work with arrive at school with no L1 but
only some sort of gesture system that allows them to get across their
basic wants and needs. Gestures can serve to "stand in" for language but
cannot replace it. Gestures are limited in their scope as tools for
communication - you can point at what you want on the grocer's shelf but
you can't point at something that isn't there, something that may be in
the backroom or under the counter.

Many of these deaf children arrive at school not knowing their own names
- without a concept of what a name is. These are children whose cognitive
potential is (or was?) intact but who have been deprived of a viable
language for communication and for thinking. Vygotsky, in his observations
of deaf children, identified this problem and he considered it a powerful
illustration of the critical role language must play in the development of
thought.

Connie Mayer
Ontario Institute for Studies in Education
cmayer who-is-at oise.on.ca