Hi Mike, bb, and All,
Let me explain a little more and see if it makes sense... I could be in
my own world... at this point in doctoral work everything seems related
to my work... :-)
Under NCLB, schools must demonstrate adequate yearly progress (AYP), and
this is done using the scores students achieve on the state-mandated,
standardized, high-stakes assessments. However, for children with
disabilities, for whom learning and development may progress in ways
that vary in pace, depth, and degree relative to "regular ed" standards
of learning, progress may be hidden. In a dynamic assessment - in this
case a Dynamic Standards of Learning Assessment (DSLA) developed for 3rd
grade reading using a hierarchical model of graduated prompts in the
"learning within the test" approach - progress may be revealed, because
the assessment is sensitive to budding areas of development rather than
merely the attainment (or not) of a passing score. There are some
interesting results in the data: data for teachers as they develop
instruction; data for parents that reveals progress relative to regular
ed (the DSLA redesigns existing assessments); and data that reveals
progress for schools and districts when students who are not delayed or
disabled enough to take alternate exams must take the regular ed exams
with accommodations (which don't actually help much if you don't know
the material yet). So the assessment becomes intertwined with a variety
of school-based education activities.
The collaborative nature of dynamic assessment, however, goes against
the grain of the autonomous individual, which has long been the goal in
special education as well as regular education. And while we'd perhaps
like to think of our classrooms as communities, the nature of the
testing suggests that a classroom is a room full of individuals who have
sucked up and can spit out the same answers.
The individual must be able to demonstrate their knowledge (banked or
otherwise) as if they existed in a vacuum.
The teacher who has taught a particular group THAT year receives the
thumbs up or down relative to the demonstration of acquired knowledge...
as if students learned in a vacuum.
So my question is: do our testing practices reflect a particular view of
the individual/collective relation, and is there hope, perhaps through
such activities as dynamic assessment, that by parlaying with attractive
data we can shift the way we think about democratic education practices?
I know I'm wishful, but does the connection make sense?
~ Em
Mike Cole wrote:
> I am unclear how your intervention bears on the individual/collective action
> relation from what you have written, Emily.
> mike
>
> On 8/6/06, duvalleg@adelphia.net <duvalleg@adelphia.net> wrote:
>
>>
>> Hi bb and All,
>> Your question:
>>
>> "Are different processes implicated depending upon how one thinks about
>> the individual/collective relation?"
>>
>> seems to touch on my work.
>> I have taken a state mandated, high stakes assessment of 3rd grade reading
>> and transformed it into a Dynamic Standards of Learning Assessment - for
>> children with learning disabilities. The results of the pilot are very
>> promising - so agree the participants, parents, teachers, and even
>> administrators. The question for me is... can there be a buy-in for the
>> collaborative nature of dynamic assessment in an environment that esteems
>> the autonomous individual?
>>
>> Emily
>> ---- xmca-whoever@comcast.net wrote:
>> > Hi,
>> >
>> > Just a brief response -- need to think a bit more -- but what was
>> > important to me about organizing 'joint activity' was the sentence
>> > "Assessments help to define and articulate the zone between the everyday
>> > actions of the present and new and possible forms of activity."
>> >
>> > It's proleptic AND the statement moves from the level of actions in the
>> > present -- not necessarily coordinated -- to the level of activity in the
>> > future -- definitely coordinated.
>> >
>> > And we are composing a collective subject - it's a thought piece for the
>> > university.
>> >
>> > This question is a tough one: "Are different processes implicated
>> > depending upon how one thinks about the individual/collective relation?"
>> >
>> > My gut response is 'yes', but that response is not easily forthcoming
>> > from my observations. More processing needed...
>> >
>> >
>> > bb
>> >
>> >
>> > -------------- Original message ----------------------
>> > From: "Mike Cole" <lchcmike@gmail.com>
>> > > bb--
>> > >
>> > > Your "sidebar" on the practices associated with assessment at Lesley got me
>> > > thinking about several issues. But I'll comment just
>> > > on one. Perhaps others will be encouraged to comment on other aspects of
>> > > your report, perhaps not. Anyway, here is what caught
>> > > my attention:
>> > >
>> > > Community members make no distinction between their day to day work and
>> > > assessment, but rather identify assessment as the process for collecting
>> > > evidence that will assist them in their continuing and new work.
>> > >
>> > > What struck me is that this characterization of (ideal?) members' normative
>> > > behavior is a lot like what we might term "critical
>> > > thinking" at the level of individuals. This thought, in turn, got me to
>> > > reflect on the issue of the "subject" of activity in various
>> > > activity theory discussions. Is the subject an individual, or a collective
>> > > subject? Are different processes implicated depending upon
>> > > how one thinks about the individual/collective relation?
>> > >
>> > > I assume that, as is true of most people, some of the time community members
>> > > are not thinking of assessment (evaluating
>> > > critically the consequences of their actions); at other times -- some of them
>> > > institutionally mandated by such contingencies
>> > > as progress reports or accreditation deadlines -- assessment becomes the
>> > > leading concern.
>> > >
>> > > Would it be proper to say that a culture of evidence takes the statement in
>> > > red above as an ideal that members value, strive for, and valorize,
>> > > a norm that helps to organize joint activity?
>> > > mike
>> > >
>> > >
>> > >
>> > > On 8/4/06, bb <xmca-whoever@comcast.net> wrote:
>> > > >
>> > > > As a sidebar to the present discussion, I've spent the greater part of
>> > > > this week involved in program assessment and redesign with the goal of
>> > > > supporting my institution's application for a new national
>> > > > accreditation. In this context I reviewed a vision paper on assessment
>> > > > practices at the university, written several years ago, sponsored by the
>> > > > provost, and of which I was a coauthor. We adopted the term "culture of
>> > > > evidence" (which was used heavily this week) and proceeded to adapt it to
>> > > > our circumstances, with the following excerpt providing the core definition
>> > > > -- in which I was pleased to rediscover that the last paragraph has
>> > > > a clearly traceable influence from this forum and several of its
>> > > > participants.
>> > > > bb
>> > > > -----------------------------
>> > > >
>> > > >
>> > > > III Culture of Evidence
>> > > > One outstanding pattern in models of best practices that appear in the
>> > > > literature, and on the Internet, is the systemic weaving of assessment into
>> > > > the fabric of the institution, as a culture of evidence. Assessment is not
>> > > > simply patched onto extant practices, as an adjunct or summative process,
>> > > > but instead is integrated into day to day routines and operations, and
>> > > > thereby is integrated into the totality of work in the
>> > > > institution. Assessment data provides the basis upon which departments,
>> > > > programs, schools, and individuals evaluate their practices in relation to
>> > > > their stated goals and the university mission, and upon which decisions are
>> > > > then made to support the operations of the institution and to make
>> > > > improvements. Culture of evidence specifically refers, in its ideal form, to
>> > > > the systemic coordination of people in an institution who are:
>> > > >
>> > > > Identifying and addressing student, faculty, and staff issues,
>> > > >
>> > > > Consulting about data needs and assessment methodologies,
>> > > >
>> > > > Planning and designing assessments,
>> > > >
>> > > > Ensuring sound assessment methodology using current technologies and
>> > > > techniques,
>> > > >
>> > > > Routinely collecting and analyzing student-oriented data,
>> > > >
>> > > > Organizing to continually address selected needs and demands of the
>> > > > university,
>> > > >
>> > > > Providing institutional support for assessment practices and their
>> > > > improvements,
>> > > >
>> > > > Ensuring that collected data are analyzed, interpreted, and
>> > > > disseminated to all invested decision-makers, who include faculty, advisors,
>> > > > support staff, as well as administrators.
>> > > >
>> > > > Our use of the term culture is to convey an ideal that is not
>> > > > undemocratic: everyone gets involved, the process is not one of mandated
>> > > > changes, and assessment becomes a shared tool. By definition, each member
>> > > > of a culture necessarily enacts the practices that constitute that culture,
>> > > > and the culture of evidence can be thought of in part as the consolidated and
>> > > > collaborative coordination of assessment practices in an
>> > > > institution. Members of the community continuously ask: What do we
>> > > > know? How do we know it? What resources do we have to do something about
>> > > > what we know? Are we constituting and enacting a responsible system?
>> > > >
>> > > > In contrast to thinking of assessment as an external activity, assessment
>> > > > is recognized as an ongoing ethnography of the balance between challenges
>> > > > and capacities. Assessments help to define and articulate the zone between
>> > > > the everyday actions of the present and new and possible forms of
>> > > > activity. Community members make no distinction between their day to day
>> > > > work and assessment, but rather identify assessment as the process for
>> > > > collecting evidence that will assist them in their continuing and new
>> > > > work. This way of thinking about assessment also constitutes the culture of
>> > > > evidence, where decision-making and planning are based on the data and
>> > > > information created during the processes of learning, teaching, and
>> > > > working. The culture of evidence is both a way of doing and a way of
>> > > > thinking.
_______________________________________________
xmca mailing list
xmca@weber.ucsd.edu
http://dss.ucsd.edu/mailman/listinfo/xmca