Re: [xmca] What new and interesting?

From: Martin Packer <packer who-is-at duq.edu>
Date: Wed Apr 02 2008 - 09:20:50 PDT

Karen,

I agree that there are ethical issues, but my point was the simpler one that
clinical trials are actually not very good at answering Why? questions. They
rest shakily on the dubious presumptions that causal relations can't be
seen, that all we can observe are associations (correlation, not causation),
and that causes operate in a hidden realm of forces, social functions, etc.
The much more plausible position is the one Michael touched on with his
reference to the ethnomethodological perspective (though it can be found
elsewhere and I think Vygotsky was proposing something similar) - that
causal processes can be directly observed and described. They are visible in
the details of ongoing social action. Open the black box, as I said, and
describe what is going on. (Were I smarter I would have said in my example
of the TV that what we need to observe is not so much how it works as how it
is *made.* This is the power of the design experiments that Peter brought
up, that they provide the opportunity to study, describe and explain a
phenomenon in the making. Just what Vygotsky said experiments should be
doing!)
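[The contrast between detecting an effect and explaining it can be made concrete with a small simulation. This is an illustrative sketch added in editing, not part of the original exchange; the variable names and numbers are hypothetical.]

```python
import random

random.seed(1)

def outcome(treated: bool) -> float:
    # Hidden mechanism: the "treatment" works only through an unobserved
    # intermediate step (say, schooling reorganizing daily routines).
    # The trial's analysis never sees this step.
    intermediate = 1.0 if treated else 0.0
    return 2.0 * intermediate + random.gauss(0, 1)

n = 10_000
# Random assignment: a coin flip decides who gets the treatment.
treated = [outcome(True) for _ in range(n)]
control = [outcome(False) for _ in range(n)]

effect = sum(treated) / n - sum(control) / n
print(f"estimated average effect: {effect:.2f}")
# The trial answers "did X change Y?" (yes, by about 2 units), but the
# mechanism variable never appears in the analysis. Opening the black
# box means describing the intermediate process itself.
```

The sketch makes the point above: randomization licenses the inference that X mattered, but the "why" lives in the intermediate process, which this design never observes.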

I honestly don't know what is the best strategy when conceptions of research
design are so different. I've chosen both to try to carve out a space for
something different (Duquesne has managed to shape a niche for
non-experimental research; I am one of the editors for the journal
Qualitative Research in Psychology - send us your manuscripts!) and at the
same time work within the mainstream, attending AERA rather than the Human
Science Research conference, for example. We're in the odd situation that
interest in qualitative research seems to be higher than ever, while
insistence on clinical trials is growing. (APA just rejected the proposal
for a new division for qualitative research; APS doesn't even acknowledge
the existence of qualitative research!)

Martin

<http://www.tandf.co.uk/journals/titles/14780887.asp>

On 4/1/08 12:05 PM, "Karen Wieckert" <wieckertk@mail.belmont.edu> wrote:

> Martin,
>
> 1. I agree that it is just wrong to allow a control to not get something
> as important as “schools.” In a “hedged” defense of the group at MIT, I
> believe they are suggesting that given the way that infrastructural projects
> are deployed, there would be a naturally occurring study design to take
> advantage of. Beyond that, I agree with your concerns about what could be
> appropriately inferred about CAUSE. I exhibit a hopeful western bias, I fear,
> in that I tend to view projects that bring contact, resources, etc. as likely
> to have positive possibilities (externalities), even if they “fail” to meet
> their intended goals. This is a view I have developed as the designer of IS.
> Rarely do the design goals match what gets built, but bad results do not
> always follow. I have limited experience of development projects, but I
> have found the same to be true of them, having been involved in a UNIDO
> project in Jordan in the late 1980s.
>
> 2. On the other hand, looking at other projects from the MIT group, I am
> more in agreement with their message. For example, spending money to
> “de-worm” children, feed them, etc. rather than fancy IS infrastructure or
> outside curriculum just seems right. To me it seems to be common sense, not
> research. However, I can be surprised at how nearsighted people can be about
> what they need to show to funding agencies, or in research papers. I prefer
> someone out there showing through a scientifically-branded process that it is
> better to have kids who are fed and not sick when you teach them mathematics!
> This is a group of economists with a development focus, and I hope that they
> can push economics, or at least economists, to be more reasonable. Jeffrey
> Sachs is another common sense kinda economist in my opinion with respect to
> development.
>
> 3. I also agree that experimental research of the simplistic nature of
> Treatment X for Ailment Y is problematic. I found myself years ago stuck as
> the implementer of a decision support system for clinical trials, in which
> the system itself was the TREATMENT for the AILMENT of getting people with
> HIV disease enrolled into clinical trials. The system was to be deployed
> at N different county
> hospitals. As an example of the problems, at the time, none of the hospitals
> had Ethernet cable anywhere (i.e., no network). Hard to do the “treatment.”
> I tried to reason with the statisticians, but of course the funding agency had
> no other “language” to understand a medical study.
>
> 4. I wonder how best to proceed in the face of these sorts of
> experiences. I think at times to give up and work in my garden! Other times,
> work within and hope to make some of those positive externalities happen by
> design rather than by chance. Other times, to do work that is different,
> which I am
> privileged to do with Rogers Hall, my spouse.
>
> Thank you Martin for being willing to "prod."
>
> kew
>
> -----Original Message-----
> From: xmca-bounces@weber.ucsd.edu [mailto:xmca-bounces@weber.ucsd.edu] On
> Behalf Of Martin Packer
> Sent: Monday, March 31, 2008 3:03 PM
> To: eXtended Mind, Culture, Activity
> Subject: Re: [xmca] What new and interesting?
>
> Karen,
>
> If by 'externalities' you mean factors that inevitably escape control, then
> I completely agree. A school may be randomly assigned to a community, and
> one of the effects will be the 'schooling' of the children, but other
> consequences might be changes in the local economy, a new sense of pride in
> the community, and in the longer term things like increased mobility for
> graduates, which may even destroy the community. And the other places where
> schools are *not* built are not simply a control group, because they may
> well respond to the stigma of not getting a school. Very little of this can
> be controlled, and none of it *should* be controlled, because it's all part
> of what schooling is.
>
> And 'internalities' are important too. We need methods and designs (and we
> have them, they're just labelled unscientific!) to open the black box of the
> classroom and study what goes on inside. Let's say I want to explain how
> television sets work. I can set up experiments with a sample of TVs,
> manipulating variables. I may discover that connecting the power has an
> effect. Perhaps that's what causes the picture! Oh, an antenna helps too!
> But if I want to find out how a TV actually works I have to take one of the
> darn things apart and study its structure and function. I can 'experiment'
> with it, poking and prodding - but that's not a randomized trial. Do I then
> know how *all* TVs work? Perhaps not, but I can generalize to all TVs that
> have a similar design, and when one turns up that has a different design
> I'll take that one apart too.
>
> Martin
>
> On 3/31/08 3:39 PM, "Karen Wieckert" <wieckertk@mail.belmont.edu> wrote:
>
>> Martin,
>>
>> I absolutely agree that "The consequences of [NOMINALIZED VERB] are not
>> effects of a simple cause
>> that can be controlled," whether it be "schooling," "exercising," "driving,"
>> or whatever. I think I was attempting to suggest that the context in which
>> the [nominalized verb] is unpacked is important. I used the Poverty Action
>> Lab as an example of a group doing work "branded" as clinical trial-like
>> research (to take up your Coke/Pepsi analogy), but that in general the
>> externalities of the work might be where the real action is, albeit not
>> always positive externalities.
>>
>> I feel when a highly over-quantified infrastructure exists (e.g., US NCLB
>> with high-stakes testing, highly developed capabilities for tracking
>> individuals, highly "polished" variables, etc.) then the focus on clinical
>> trial-like research has virtually no externalities of any positive sort,
>> since there is little else that researchers or research instruments need
>> to bring to the situation besides "special paper" and pencils.
>>
>> I feel a bit like I am taking a "situationally ethical stance" with
>> respect to your question a ways back to Michael.
>>
>> Ka:ren
>>
>> -----Original Message-----
>> From: xmca-bounces@weber.ucsd.edu [mailto:xmca-bounces@weber.ucsd.edu] On
>> Behalf Of Martin Packer
>> Sent: Monday, March 31, 2008 12:44 PM
>> To: eXtended Mind, Culture, Activity
>> Subject: Re: [xmca] What new and interesting?
>>
>> Karen,
>>
>> Coke discovered that taste tests - randomized clinical trials comparing Coke
>> and Pepsi - didn't predict sales, because people don't drink cola blind. The
>> perceived taste stems not just from the composition of the drink but from
>> the design of the bottle and the image of the company.
>>
>> This is from the Poverty Action Lab web site:
>>
>> " On the other hand, it is widely accepted that randomized trials provide a
>> clean and simple way of figuring out what causes what. The basic idea is
>> very simple. Suppose one group of children is randomly allocated an
>> opportunity to go to school that they would not have otherwise had, while
>> another group did not get that chance. This may occur, for example, if a
>> school construction program begins by building schools in a few areas
>> randomly. Now, we can compare apples to apples. The students who got the
>> schooling opportunity are the same as those that did not, except for the
>> flip of a coin that determined whether a school would be built near them.
>> This means that any health outcomes could only be due to the effect of the
>> school."
>>
>> But we won't know whether health outcomes are due to classroom instruction,
>> the excitement of attending school, the redistribution of household chores,
>> the reactions of parents.... As you say, we live in a complex world, and
>> randomized clinical trials do very little to help us understand its
>> complexity. The consequences of schooling are not effects of a simple cause
>> that can be controlled.
>>
>> Martin
>>
>>
>> On 3/31/08 2:22 PM, "Karen Wieckert" <wieckertk@mail.belmont.edu> wrote:
>>
>>> Martin,
>>>
>>> Using a term Mike Cole introduced a while back, possibly we can say that
>>> methods used are dependent upon context. For example, the Poverty Action
>>> Lab (http://www.povertyactionlab.org/) endorses randomized controlled
>>> studies to evaluate development programs. Under extreme resource
>>> constraints, at the level of should money be given to "de-worm" children or
>>> provide "high-quality curriculum", can "qualitative" work only be used to
>>> critique the natural, independent entities assumed in the studies (worms,
>>> children, learning, curricula, etc)? I find randomized controlled trials
>>> less objectionable here possibly, than, for example, "testing" a textbook
>>> plus web-site for use in algebra classes in middle schools in Tennessee.
>>>
>>> On the other hand, I have just finished working with my daughter's Girl
>>> Scout troop to set up a library in Lesotho through the African Library
>>> Project (http://www.africanlibraryproject.org/index.php). No randomized
>>> controlled studies will likely be done on this project, and if we wanted to
>>> measure its import, where and what would we measure? The changes made in
>>> the lives of the US children and adults involved? The connections made
>>> inside and between countries through individuals? The "slack resource" of
>>> Peace Corps volunteer time taken up with another activity, e.g., setting up
>>> school libraries/disseminating books? The reading scores of 3rd graders in
>>> Lesotho? The reduction in the US waste stream from books that
>>> cannot be resold or reused? The economic impact of using warehouse space in
>>> New Orleans!? Or, another possibility would be the amount of
>>> time/money/person hours required to identify, track, evaluate, and
>>> publish quantifiable measures and cause/effect relationships.
>>>
>>> Sometimes I feel that because we are in such a rich context (US, schools of
>>> higher education) we lose sight of what is important. Maybe that is
>>> why the 5th Dimension is so wonderful, or university schools, or "action
>>> research." We want to not simply make words on paper with or without tests
>>> of significance. We want to be significant...
>>>
>>> Ka:ren Wieckert
>>>
>>> -----Original Message-----
>>> From: xmca-bounces@weber.ucsd.edu [mailto:xmca-bounces@weber.ucsd.edu] On
>>> Behalf Of Martin Packer
>>> Sent: Monday, March 31, 2008 10:47 AM
>>> To: eXtended Mind, Culture, Activity
>>> Subject: Re: [xmca] What new and interesting?
>>>
>>> Hi Michael,
>>>
>>> OK, we're in agreement on the point that has to be made. The question then
>>> is how best to make it. In the movement for mixed methods, which seems to be
>>> founded unquestioningly on the kind of division of labor that Shavelson
>>> summarizes, with qualitative research relegated once more to the
>>> descriptive, hypothesis-generating phase, to be used only when more
>>> 'powerful' designs are unavailable? Or in a critique of the dubious
>>> assumptions
>>> underlying quantitative research: of a world of natural, independent
>>> entities with purely causal relations? Perhaps I'm wrong, but I'm inclined
>>> to go for the second option. Do you find the first option more appropriate
>>> strategically/epistemologically/ethically?
>>>
>>> Martin
>>>
>>>
>>> On 3/31/08 11:34 AM, "Wolff-Michael Roth" <mroth@uvic.ca> wrote:
>>>
>>>> Hi Martin,
>>>> this is PRECISELY the point the book makes in its final chapter, at
>>>> least one, mine; it is also a point others make, that generalization
>>>> is not relegated to the clinical paradigm but that qualitative
>>>> research (e.g., in the phenomenological work a la Husserl) is making
>>>> very generalized statements about cognition or Merleau-Ponty on
>>>> knowing and learning (now confirmed in neurocognitive studies).
>>>>
>>>>
>>>> I hope this helps,
>>>>
>>>> Cheers,
>>>>
>>>> Michael
>>>>
>>>>
>>>> On 31-Mar-08, at 9:17 AM, Martin Packer wrote:
>>>>
>>>> Hi Michael,
>>>>
>>>> I have mixed reactions to your message! :) Shavelson, one of the
>>>> presenters, has articulated a position that seems similar to yours:
>>>>
>>>> Overall, "It's the question - not the method - that should drive the
>>>> design of education research or any other scientific research. That is,
>>>> investigators ought to design a study to answer the question that they
>>>> think is important, not fit the question to a convenient or popular
>>>> design" (Shavelson & Towne, 2004).
>>>>
>>>> But then his NRC committee went on to identify the methods most
>>>> appropriate to answer three fundamental types of question: (1) What's
>>>> happening? (2) Is there a systematic (causal) effect? and (3) What is
>>>> the causal mechanism, or how does it work? They concluded that the first
>>>> type of question is asking for a description, which they recommended
>>>> should be provided by a survey, ethnographic methods, or a case study.
>>>> The second type of question is asking: Did X cause Y? Here the most
>>>> desirable method is a randomized clinical trial. Quasi-experimental,
>>>> correlational, or time-series studies may be needed when random
>>>> assignment is either impractical or unethical, but "logically randomized
>>>> trials should be the preferred method if they are feasible and ethical
>>>> to do." The third type of question - how does it work? - asks for
>>>> identification of the causal mechanism that creates a described effect.
>>>> Here it seems mixed methods could do the job. (The committee seemed a
>>>> bit confused here, perhaps because they believe that causal mechanisms
>>>> can never be directly observed.)
>>>>
>>>> A significant problem with these recommendations, well-intended though
>>>> they undoubtedly are, is that they perpetuate a widely held but
>>>> incorrect belief that qualitative research can answer only descriptive
>>>> questions, while quantitative research is able to answer explanatory
>>>> questions, and that such questions are always answered by identifying a
>>>> causal mechanism. If this were so, qualitative research would be
>>>> adequate for generating hypotheses, but measurement and experimentation
>>>> would be needed to test these hypotheses. Experimentation, the committee
>>>> asserts, "is still the single best methodological route to ferreting out
>>>> systematic relations between actions and outcomes" (Feuer, Towne &
>>>> Shavelson, 2002, p. 8). Although they say they regret that "the rhetoric
>>>> of scientifically based research in education seems to denigrate the
>>>> legitimate role of qualitative methods in elucidating the complexities
>>>> of teaching, learning, and schooling," they see this "legitimate role"
>>>> as a limited one: "When a problem is poorly understood and plausible
>>>> hypotheses are scant - as is the case in many areas of education -
>>>> qualitative methods such as ethnographies... are necessary to describe
>>>> complex phenomena, generate models, and reframe questions" (p. 8).
>>>>
>>>> In my view this is a sadly limited and completely inaccurate conception
>>>> of qualitative research, and indeed of research itself.
>>>>
>>>> Feuer, M. J., Towne, L., & Shavelson, R. J. (2002). Scientific
>>>> culture and
>>>> educational research. Educational Researcher, 31(8), 4-14.
>>>>
>>>> Shavelson, R. J., & Towne, L. (2004). What drives scientific research in
>>>> education? American Psychological Society Observer, 17(4).
>>>>
>>>>
>>>> Martin
>>>>
>>>>
>>>>
>>>> On 3/31/08 7:14 AM, "Wolff-Michael Roth" <mroth@uvic.ca> wrote:
>>>>
>>>>> Hi Martin,
>>>>> I am a trained statistician and quantitative modeler (physical
>>>>> systems as a physicist, neural networks) who asks questions that
>>>>> require a lot of qualitative categorical work, so developed
>>>>> competencies in a panoply of methods, and now have become a
>>>>> qualitative methodologist. As such, I happened to be asked a few
>>>>> years back to write a chapter with a statistician (Kadriye Ercikan),
>>>>> the co-organizer of the session you are referring to. As we were
>>>>> writing this chapter, we saw that the opposition of quantitative/
>>>>> qualitative does not assist researchers a lot and that organizing
>>>>> research from a method perspective is not a good one, an
>>>>> understanding I developed through years of experience teaching
>>>>> statistics and qualitative interpretive methods. (I also co-edit an
>>>>> online journal on qual methods; it's called FQS: Forum Qualitative
>>>>> Social Research).
>>>>>
>>>>> Kadriye and I then decided to write an article for Educational
>>>>> Researcher, which was published in 2006. And now we are almost
>>>>> finished editing this book entitled "Generalizing from Educational
>>>>> Research" (Routledge/Taylor&Francis) where people from all sorts of
>>>>> methods backgrounds contribute, including Bachmann (applied ling),
>>>>> Allan Luke, Margaret Eisenhart (anthrop), Jim Gee, Ken Tobin, Rich
>>>>> Shavelson, Pam Moss, Willy Solano, and others. It is an exciting
>>>>> project, as people seem to agree that we need to move away from the
>>>>> polarity of research methods to begin asking questions that matter.
>>>>>
>>>>> I would therefore not force LSV into one or the other camp.
>>>>> I would ask questions along the lines LSV suggested we ask and then
>>>>> pose the subsidiary question, "How do I answer this question?" A well-
>>>>> formed research question tends to IMPLY the method, or so I show my
>>>>> graduate students.
>>>>>
>>>>> You will have noticed that in my Vygotsky talk, I used purely
>>>>> mathematical methods for the analysis of vocal parameters. . .
>>>>>
>>>>> Cheers,
>>>>>
>>>>> Michael
>>>>>
>>>>> On 30-Mar-08, at 8:59 AM, Martin Packer wrote:
>>>>>
>>>>> I am curious about a session I was unable to attend, one on mixed
>>>>> methods which I know Mike attended, and at which Michael Roth
>>>>> presented. One of the other presenters was Pamela Moss from U of
>>>>> Michigan - several years ago Pamela and I designed and co-taught a
>>>>> 2-semester graduate course on integrated research methods, which I
>>>>> think was unique at the time, so I'm curious to discover what is now
>>>>> state of the art. I'm also curious because the AERA session I organized
>>>>> was titled "Vygotsky's Qualitative Methodology," and some questions
>>>>> were raised there about whether this is an appropriate label for CHAT
>>>>> research. Is it qualitative, mixed, or ..?
>>>>>
>>>>> Can people who attended that session share their impressions?
>>>>>
>>>>> Martin
>>>>>
>>>>>
>>>>> On 3/29/08 8:35 AM, "Mike Cole" <lchcmike@gmail.com> wrote:
>>>>>
>>>>>>
>>>>>> I thought it might be interesting to all if everyone took a few
>>>>>> minutes either to report on some interesting talk or paper they have
>>>>>> encountered recently, or a new idea that they have had that others
>>>>>> might have something to contribute to, and post it here. (This
>>>>>> includes, in my case, ideas that came up from people whose work we
>>>>>> have discussed here!).
>>>>>>
>>>>>> I'll post a couple of such ideas as examples a little later, but want
>>>>>> to float the suggestion while I have a minute.
>>>>>>
>>>>>> mike
>>>>>> _______________________________________________
>>>>>> xmca mailing list
>>>>>> xmca@weber.ucsd.edu
>>>>>> http://dss.ucsd.edu/mailman/listinfo/xmca

Received on Wed Apr 2 09:24 PDT 2008

This archive was generated by hypermail 2.1.8 : Sun Apr 06 2008 - 11:20:17 PDT