Hmmm...
Situations of extensive, high-level "decision making" for many
people/organizations/places/times may be helped not only by "high-level
statistics" but also by high-level thinking/collaborating that can be
narrative as well. As a designer, I am involved in making for a context
not of now but of later (the context of design versus the context of
use), for people who are not me but others (designers versus users). The
process of making is on-going and uncertain. There are punctuated areas
of certainty, but often many areas of uncertainty where designers must
make statistically "unwarranted" decisions. Therein lies responsibility
and accountability...
Ka:ren
-----Original Message-----
From: xmca-bounces@weber.ucsd.edu [mailto:xmca-bounces@weber.ucsd.edu] On
Behalf Of Wolff-Michael Roth
Sent: Monday, March 31, 2008 12:34 PM
To: eXtended Mind, Culture, Activity
Subject: Re: [xmca] What new and interesting?
Hi all,
Karen provides some interesting cases for the use of different kinds of
methods.
Although I don't use this or that quantitative method, that does not
mean I have to slam-dunk it. For example, think of a politician
or panel or what not that makes a decision about how to distribute
money involving populations of a few thousand to millions of people.
How do you set up the different pots? How do you make decisions when
large numbers of human beings are involved without having to stop and
listen to the stories of all the thousands or millions involved?
Well, if you did listen, you would never come to the moment of being
able to spend the money.
Situations like this help me understand that there are legitimate
questions that require high level statistics.
And there are other situations where I want to make sure that something
is justified, e.g., treatments dealing with various physiological,
psychological, or sociological illnesses. If I do not want to resort to
some word-of-mouth, let's-have-a-few-hundred-years-of-experience kind of
development, an experimental design can provide us with global pictures
that are more trustworthy than others.
Also, I bet there were many experimental studies done by CHAT
psychologists in the USSR and GDR and other countries. But Mike may
know more about this and can inform us.
Cheers,
Michael
On 31-Mar-08, at 12:22 PM, Karen Wieckert wrote:
Martin,
Using a term Mike Cole introduced a while back, possibly we can say that
methods used are dependent upon context. For example, the Poverty Action
Lab (http://www.povertyactionlab.org/) endorses randomized controlled
studies to evaluate development programs. Under extreme resource
constraints, at the level of whether money should be given to "de-worm"
children or to provide a "high-quality curriculum", can "qualitative"
work only be used to critique the natural, independent entities assumed
in the studies (worms, children, learning, curricula, etc.)? I find
randomized controlled trials less objectionable here, possibly, than,
for example, "testing" a textbook plus web-site for use in algebra
classes in middle schools in Tennessee.
On the other hand, I have just finished working with my daughter's Girl
Scout troop to set up a library in Lesotho through the African Library
Project (http://www.africanlibraryproject.org/index.php). No randomized
controlled studies will likely be done on this project, and if we wanted
to measure its import, where and what would we measure? The changes made
in the lives of the US children and adults involved? The connections
made inside and between countries through individuals? The "slack
resource" of Peace Corps volunteer time taken up with another activity,
e.g., setting up school libraries/disseminating books? The reading
scores of 3rd graders in Lesotho? The reduction in the US waste stream
from books that cannot be resold or reused? The economic impact of using
warehouse space in New Orleans!? Or, another possibility would be the
amount of time/money/person-hours required to identify, track, evaluate,
and publish quantifiable measures and cause/effect relationships.
Sometimes I feel that because we are in such a rich context (US, schools
of higher education) we lose sight of what is important. Maybe that is
why the 5th Dimension is so wonderful, or university schools, or "action
research." We want to not simply make words on paper with or without
tests of significance. We want to be significant...
Ka:ren Wieckert
-----Original Message-----
From: xmca-bounces@weber.ucsd.edu [mailto:xmca-bounces@weber.ucsd.edu] On
Behalf Of Martin Packer
Sent: Monday, March 31, 2008 10:47 AM
To: eXtended Mind, Culture, Activity
Subject: Re: [xmca] What new and interesting?
Hi Michael,
OK, we're in agreement on the point that has to be made. The question
then is how best to make it. In the movement for mixed methods, which
seems to be founded unquestioningly on the kind of division of labor
that Shavelson summarizes, with qualitative research relegated once more
to the descriptive, hypothesis-generating phase, to be used only when
more 'powerful' designs are unavailable? Or in a critique of the dubious
assumptions underlying quantitative research: of a world of natural,
independent entities with purely causal relations? Perhaps I'm wrong,
but I'm inclined to go for the second option. Do you find the first
option more appropriate
strategically/epistemologically/ethically?
Martin
On 3/31/08 11:34 AM, "Wolff-Michael Roth" <mroth@uvic.ca> wrote:
> Hi Martin,
> this is PRECISELY the point the book makes in its final chapter, at
> least one, mine; it is also a point others make, that generalization
> is not relegated to the clinical paradigm but that qualitative
> research (e.g., in the phenomenological work a la Husserl) is making
> very generalized statements about cognition or Merleau-Ponty on
> knowing and learning (now confirmed in neurocognitive studies).
>
>
> I hope this helps,
>
> Cheers,
>
> Michael
>
>
> On 31-Mar-08, at 9:17 AM, Martin Packer wrote:
>
> Hi Michael,
>
> I have mixed reactions to your message! :) Shavelson, one of the
> presenters, has articulated a position that seems similar to yours:
>
> Overall, "It's the question not the method that should drive the
> design of education research or any other scientific research. That
> is, investigators ought to design a study to answer the question that
> they think is important, not fit the question to a convenient or
> popular design" (Shavelson & Towne, 2004).
>
> But then his NRC committee went on to identify the methods most
> appropriate to answer three fundamental types of question: (1) What's
> happening? (2) Is there a systematic (causal) effect? and (3) What is
> the causal mechanism, or how does it work? They concluded that the
> first type of question is asking for a description, which they
> recommended should be provided by a survey, ethnographic methods, or a
> case study. The second type of question is asking: Did X cause Y? Here
> the most desirable method is a randomized clinical trial.
> Quasi-experimental, correlational, or time-series studies may be
> needed when random assignment is either impractical or unethical, but
> "logically randomized trials should be the preferred method if they
> are feasible and ethical to do." The third type of question (how does
> it work?) asks for identification of the causal mechanism that creates
> a described effect. Here it seems mixed methods could do the job. (The
> committee seemed a bit confused here, perhaps because they believe
> that causal mechanisms can never be directly observed.)
>
> A significant problem with these recommendations, well-intended
> though they undoubtedly are, is that they perpetuate a widely held but
> incorrect belief that qualitative research can answer only descriptive
> questions, while quantitative research is able to answer explanatory
> questions, and that such questions are always answered by identifying
> a causal mechanism. If this were so, qualitative research would be
> adequate for generating hypotheses, but measurement and
> experimentation would be needed to test these hypotheses.
> Experimentation, the committee asserts, "is still the single best
> methodological route to ferreting out systematic relations between
> actions and outcomes" (Feuer, Towne & Shavelson, 2002, p. 8). Although
> they say they regret that "the rhetoric of scientifically based
> research in education seems to denigrate the legitimate role of
> qualitative methods in elucidating the complexities of teaching,
> learning, and schooling," they see this "legitimate role" as a limited
> one: "When a problem is poorly understood and plausible hypotheses are
> scant, as is the case in many areas of education, qualitative methods
> such as ethnographies are necessary to describe complex phenomena,
> generate models, and reframe questions" (p. 8).
>
> In my view this is a sadly limited and completely inaccurate
> conception of
> qualitative research, and indeed of research itself.
>
> Feuer, M. J., Towne, L., & Shavelson, R. J. (2002). Scientific
> culture and
> educational research. Educational Researcher, 31(8), 4-14.
>
> Shavelson, R. J., & Towne, L. (2004). What drives scientific
> research in
> education? American Psychological Society Observer, 17(4).
>
>
> Martin
>
>
>
> On 3/31/08 7:14 AM, "Wolff-Michael Roth" <mroth@uvic.ca> wrote:
>
>> Hi Martin,
>> I am a trained statistician and quantitative modeler (physical
>> systems as a physicist, neural networks) who asks questions that
>> require a lot of qualitative categorical work, so I developed
>> competencies in a panoply of methods and have now become a
>> qualitative methodologist. As such, I happened to be asked a few
>> years back to write a chapter with a statistician (Kadriye Ercikan),
>> the co-organizer of the session you are referring to. As we were
>> writing this chapter, we saw that the opposition of
>> quantitative/qualitative does not assist researchers much and that
>> organizing research from a method perspective is not a good one, an
>> understanding I developed through years of experience teaching
>> statistics and qualitative interpretive methods. (I also co-edit an
>> online journal on qualitative methods; it's called FQS: Forum
>> Qualitative Social Research.)
>>
>> Kadriye and I then decided to write an article for Educational
>> Researcher, which was published in 2006. And now we are almost
>> finished editing this book entitled "Generalizing from Educational
>> Research" (Routledge/Taylor&Francis) where people from all sorts of
>> methods backgrounds contribute, including Bachmann (applied ling),
>> Allan Luke, Margaret Eisenhart (anthrop), Jim Gee, Ken Tobin, Rich
>> Shavelson, Pam Moss, Willy Solano, and others. It is an exciting
>> project, as people seem to agree that we need to move away from the
>> polarity of research methods to begin asking questions that matter.
>>
>> I would therefore not force or contest LSV into one or the other
>> camp. I would ask questions along the lines LSV suggested we ask and
>> then pose the subsidiary question, "How do I answer this question?" A
>> well-formed research question tends to IMPLY the method, or so I show
>> my graduate students.
>>
>> You will have noticed that in my Vygotsky talk, I used purely
>> mathematical methods for the analysis of vocal parameters. . .
>>
>> Cheers,
>>
>> Michael
>>
>> On 30-Mar-08, at 8:59 AM, Martin Packer wrote:
>>
>> I am curious about a session I was unable to attend, one on mixed
>> methods which I know Mike attended, and at which Michael Roth
>> presented. One of the other presenters was Pamela Moss from U of
>> Michigan - several years ago Pamela and I designed and co-taught a
>> 2-semester graduate course on integrated research methods, which I
>> think was unique at the time, so I'm curious to discover what is now
>> state of the art. I'm also curious because the AERA session I
>> organized was titled "Vygotsky's Qualitative Methodology," and some
>> questions were raised there about whether this is an appropriate
>> label for CHAT research. Is it qualitative, mixed, or ..?
>>
>> Can people who attended that session share their impressions?
>>
>> Martin
>>
>>
>> On 3/29/08 8:35 AM, "Mike Cole" <lchcmike@gmail.com> wrote:
>>
>>>
>>> I thought it might be interesting to all if everyone took a few
>>> minutes either to report on some interesting talk or paper they have
>>> encountered recently, or on a new idea that they have had that
>>> others might have something to contribute to, and post it here.
>>> (This includes, in my case, ideas that came up from people whose
>>> work we have discussed here!)
>>>
>>> I'll post a couple of such ideas as examples a little later, but
>>> want to float the suggestion while I have a minute.
>>>
>>> mike
>>> _______________________________________________
>>> xmca mailing list
>>> xmca@weber.ucsd.edu
>>> http://dss.ucsd.edu/mailman/listinfo/xmca
Received on Mon Mar 31 14:41 PDT 2008