[Xmca-l] Re: Trying to frame studies of the web through perezhivanie

Edward Wall ewall@umich.edu
Wed Sep 26 15:10:04 PDT 2018


Henry

      Interesting subject. I have always thought Newton somewhat more ‘digital’ and Leibniz somewhat more ‘analog’ (he used infinitesimals, which Robinson much later put on a firm mathematical basis) in how they, in essence, treat something like a point. I’ve seen a few calculus texts that do use Leibniz’s method, and there are some arguments that, mathematically speaking, extensions of his method (due, in part, to Robinson) bring some things into view that may be hard to see otherwise.
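
(To put the contrast in symbols, as an illustrative gloss rather than anything in the original message: the modern limit-based definition of the derivative sits beside Leibniz's infinitesimal quotient, which Robinson's nonstandard analysis makes rigorous via the standard-part function st.)

    f'(x) = \lim_{h \to 0} \frac{f(x+h) - f(x)}{h}
    \qquad \text{versus} \qquad
    f'(x) = \operatorname{st}\!\left( \frac{f(x+\varepsilon) - f(x)}{\varepsilon} \right),
    \quad \varepsilon \ \text{a nonzero infinitesimal}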

Ed


> On Sep 26, 2018, at  4:52 PM, HENRY SHONERD <hshonerd@gmail.com> wrote:
> 
> It took me a long time to understand the calculus, because I couldn’t "get" the limit theorem, which lets digital means arrive quickly at as-precise-as-you-like approximations of rates of change (in differential calculus) and sums (in integral calculus) that would not be possible and/or practical with analog means of counting and measuring. Without such quickly gotten precision, modern engineering would be impossible. I thought that Newton and Leibniz discovered the calculus independently and at the same time, but a quick look at the wiki on the calculus shows a history much more complex than that. It’s a history, it seems, that adds to the issue of concept and a word for the concept.
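
(A minimal numerical sketch of the point above, added purely for illustration and not part of the original message; the function f(x) = x**2 is an arbitrary example.)

    # Illustration only: digital, as-precise-as-you-like approximations of
    # a rate of change and of a sum, for the example function f(x) = x**2.

    def f(x):
        return x * x

    # Rate of change at x = 1: difference quotients with a shrinking step h
    # home in on the derivative f'(1) = 2.
    for h in (0.1, 0.01, 0.001, 0.0001):
        print(f"h = {h:g}: difference quotient = {(f(1 + h) - f(1)) / h:.6f}")

    # Area under f between 0 and 1: Riemann sums with ever more rectangles
    # home in on the integral, 1/3.
    for n in (10, 100, 1000, 10000):
        dx = 1.0 / n
        area = sum(f(i * dx) * dx for i in range(n))
        print(f"n = {n}: Riemann sum = {area:.6f}")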
> Henry
>  
>> On Sep 26, 2018, at 7:53 AM, Glassman, Michael <glassman.13@osu.edu> wrote:
>> 
>> Hi Ed,
>>  
>> This is a kind of interesting topic, including from a cultural perspective. My knowledge on this is relatively superficial. Bruce Robinson made a really good point to me – also in your message – that analog computers were better for things like differential equations and more purely mathematical stuff (I think), but that when it came to information processing digital was far superior. My thinking, though, from the cultural perspective is that analog thinking is more representative of the way humans actually think, at least the way I believe they think. The big argument I have with information processing is its claim that the way the computer works (mostly software) is isomorphic to the human mind. But I wonder, given the direction our society has gone in the last thirty years, with timed testing using multiple-choice questions, whether we are attempting to make the human mind isomorphic to the computer. As a friend who has worked at IBM for a lot of years told me recently, they are beginning to wonder if the computer is not training the human. I had wondered whether, if we had gone the analog route (and right now I think I’m agreeing with Bruce, but I change quickly), we might have gone in another direction, a more purely human-computer symbiosis. Just rambling on a Tuesday morning.
>>  
>> Michael
>>  
>> From: xmca-l-bounces@mailman.ucsd.edu <xmca-l-bounces@mailman.ucsd.edu> On Behalf Of Edward Wall
>> Sent: Tuesday, September 25, 2018 3:11 PM
>> To: eXtended Mind, Culture, Activity <xmca-l@mailman.ucsd.edu>
>> Subject: [Xmca-l] Re: Trying to frame studies of the web through perezhivanie
>>  
>> Michael
>>  
>>      I don’t know if my comments are germane to your discussion of digital and analog, but I was involved in the 60s towards the tail end of the ‘competition’. Your reading makes sense to me; however, from where I was sitting there were some nuances. In those years there were, in effect, two kinds of computing using ‘computers’: information processing and scientific computing. Both of these had an analog history stretching far back. Information processing was, in a sense, initially mechanical, a mechanical process that came to be driven by electronics and eventually, with the advent of various graphic devices (I include printers of various kinds), became what we see today. The situation with scientific computing was a little different, as it has an even richer analog history. Initially, electronic analog devices had the upper hand because they could, in effect, operate in real time. However, as digital devices became faster and faster, it became possible to, in effect, simulate an analog device on a digital machine, and pragmatically the simulation was “good enough.” Thus for, in a sense, economic reasons digital ‘computers’ won the ‘battle.’ In a way the evolution is reminiscent of that of audio reproduction, or of using mathematics to model physical reality; it is amazingly effective. The battle, by the way, is still going on. If I tell the Amazon Alexa to play music a little louder, the increase is done in a digital fashion. If I turn the volume control on one of the original versions, it is done in an analog fashion. So I think you are right, the digital path doesn’t completely reproduce the analog path.
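
(A sketch added for illustration, not part of the original message: a digital machine "simulating" what an analog integrator circuit computes continuously, using the arbitrarily chosen decay equation dx/dt = -x, whose exact solution is exp(-t). Shrinking the step makes the simulation "good enough.")

    # Illustration only: discrete (Euler) steps standing in for a continuous
    # analog integration of dx/dt = -x with x(0) = 1; exact answer is exp(-1).
    import math

    def simulate(dt, t_end=1.0):
        steps = round(t_end / dt)
        x = 1.0
        for _ in range(steps):
            x += dt * (-x)   # one discrete step of dx/dt = -x
        return x

    exact = math.exp(-1.0)
    for dt in (0.1, 0.01, 0.001):
        approx = simulate(dt)
        print(f"dt = {dt:g}: simulated x(1) = {approx:.6f}, "
              f"exact = {exact:.6f}, error = {abs(approx - exact):.6f}")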
>>  
>> Ed
>> 
>> On Sep 22, 2018, at 9:46 AM, Glassman, Michael <glassman.13@osu.edu> wrote:
>>  
>> Hi Greg and Andy,
>>  
>> I wonder if, based on what Andy has said, it might be more worthwhile to focus on the Web as (Dewey’s ideas on) experience rather than perezhivaniye. I don’t really have a good grasp on perezhivaniye, can’t even really spell it. But if you used Dewey’s ideas on experience, the Web becomes both artefact and event in our actions. Dewey makes the argument multiple times, I think, that we cannot really know our tools outside of our experience in using them, and that in attempting to separate them we are diminishing the meaning of both in our lives. So I think experience actually would be a good way to describe what you are trying to do.
>>  
>> Oh, also another take on analog and digital. There was a battle between digital and analog in computing, but my own reading of the history is that it had more to do with how we thought computers should process information and solve problems. I believe the crux of the battle was a bit earlier than the 1960s. Actually Vannevar Bush, who some (me included) consider the father of both the Internet and the Web (well, maybe a more distant father, but the actual name web is based on one of his ideas I think, web of trails), was working on the idea of an analog computer in the late forties. I am sure others were as well. The difference as I understand it is whether we wanted to treat the processing of information as analog (sort of a linear logic), where one piece of information built off another piece working towards an answer, or whether we wanted to treat information as a series of yes/no questions leading to a solution (digital referring to the use of 0 and 1 as yes and no, although I always mix that up). Digital became dominant for a lot of reasons, not the least of which is that it is more precise and efficient, but it is also far more limited. I often wonder what would have happened if we had followed Bush’s intuition. There are analog and digital circuits of course, but at least in the early history of the computer I don’t believe that was the primary discussion in the use of these terms. Of course that’s just my reading.
>>  
>> Michael
>>  
>> From: xmca-l-bounces@mailman.ucsd.edu <xmca-l-bounces@mailman.ucsd.edu> On Behalf Of Andy Blunden
>> Sent: Friday, September 21, 2018 9:46 PM
>> To: xmca-l@mailman.ucsd.edu
>> Subject: [Xmca-l] Re: Trying to frame studies of the web through perezhivanie
>>  
>> A few comments Greg.
>> It seems to me that the web (i.e., the www, yes?) is an artefact, not events; each unit is a trace of perezhivaniya, not a perezhivaniye as such; it is important not to conflate events and artefacts, just as an historian has to know that what they see are traces of real events, not the events as such. What you do with that evidence is something else again.
>> Just by-the-by, "analog" does not mean "original" or "real"; it means the opposite of reality. The terms "digital" and "analog" originate from the 1960s, when there were two types of computer. Analog computers emulate natural processes by representing them in analogous electronic circuits based on the calculus. In the end digital computers won an almost complete victory, but, for example, if I'm not mistaken, the bionic ear uses analog computing to achieve real-time coding of speech, or at least it did when I knew it in the 1980s.
>> Andy
>> Andy Blunden
>> http://www.ethicalpolitics.org/ablunden/index.htm
>> On 22/09/2018 12:57 AM, Greg Mcverry wrote:
>> Hello all,
>>  
>> I have been spending time this summer reading up on the concept of perezhivanie after our article discussion on funds of identity.
>>  
>> I wanted to share a draft of my theoretical perspective for feedback. Granted, due to word count it will probably be reduced to a paragraph or two with drive-by citations, but I am trying to think this through to inform my design.
>>  
>> https://checkoutmydomain.glitch.me/theoretical.html
>>  
>> -I got a little feedback from Russian scholars in other fields (literature mainly) that I missed the meaning by being too neutral and needed to get at "growing from one's misery"; another person said "brooding over the bad stuff that happened that makes you who you are." So I want to make sure I capture the struggle.
>>  
>> -I am not diving into this now, but I am also considering the identity and culture of a local web and how that plays out in how we shape funds of identity as we create online spaces.
>>  
>> -Finally, is applying this lens with adult learners not appropriate? What does it mean when you actively want to tweak the environment of learners to reduce experiencing as struggle and increase experience as contemplation?
> 


