MachineMachine /stream - search for sociology https://machinemachine.net/stream/feed en-us http://blogs.law.harvard.edu/tech/rss LifePress therourke@gmail.com
<![CDATA[Meet The Woman Who Did Everything In Her Power To Hide Her Pregnancy From Big Data | ThinkProgress]]> http://thinkprogress.org/culture/2014/04/29/3432050/can-you-hide-from-big-data/

Janet Vertesi, assistant professor of sociology at Princeton University, had an idea: would it be possible to hide her pregnancy from big data? Thinking about technology—the way we use it and the way it uses us—is her professional life’s work.

]]>
Mon, 04 Jul 2016 04:06:08 -0700 http://thinkprogress.org/culture/2014/04/29/3432050/can-you-hide-from-big-data/
<![CDATA[Benjamin Bratton. The Post-Anthropocene. 2015]]> http://www.youtube.com/watch?v=FrNEHCZm_Sc

http://www.egs.edu Benjamin H. Bratton, born 1968, is an American theorist, sociologist and professor of visual arts, contemporary social and political theory, philosophy, and design.

The Post-Anthropocene: The Turing-incomplete Orchid Mantis Evolves Machine Vision. Public open lecture for the students and faculty of the European Graduate School (EGS), Media and Communication Studies department, Saas-Fee, Switzerland, 2015.

Bratton's research deals with computational media and infrastructure, design research management & methodologies, classical and contemporary sociological theory, architecture and urban design issues, and the politics of synthetic ecologies and biologies.

Bratton completed his doctoral studies in the sociology of technology at the University of California, Santa Barbara, and was the Director of the Advanced Strategies Group at Yahoo! before expanding his cross-disciplinary research and practice in academia. He taught in the Department of Design/Media Art at UCLA from 2003 to 2008, and at SCI-Arc (Southern California Institute of Architecture) for a decade, where he continues to teach as a member of the Visiting Faculty. While at SCI-Arc, Benjamin Bratton and Hernan Diaz-Alonso co-founded the XLAB courses, which placed students in laboratory settings where they could work directly and comprehensively in robotics, scripting, biogenetics, genetic codification, and cellular systems. Currently, in addition to his professorship at EGS, Bratton is an associate professor of Visual Arts at the University of California, San Diego, where he also directs the Center for Design and Geopolitics, partnering with the California Institute of Telecommunications and Information Technology.

In addition to his formal positions, Benjamin H. Bratton is a regular visiting lecturer at numerous universities and institutions including: Columbia University, Yale University, Pratt Institute, Bartlett School of Architecture, University of Pennsylvania, University of Southern California, University of California, Art Center College of Design, Parsons The New School for Design, University of Michigan, Brown University, The University of Applied Arts in Vienna, Bauhaus-University, Moscow State University, Moscow Institute for Higher Economics, and the Architectural Association School of Architecture in London.

Bratton's current projects focus on the political geography of cloud computing, massively granular universal addressing systems, and alternate models of ecological governance. In his most recent book, The Stack: On Software and Sovereignty (MIT Press, 2015), Bratton asks the question, "What has planetary-scale computation done to our geopolitical realities?" and in response, offers the proposition "that smart grids, cloud computing, mobile software and smart cities, universal addressing systems, ubiquitous computing, and other types of apparently unrelated planetary-scale computation can be viewed as forming a coherent whole—an accidental megastructure called The Stack that is both a computational apparatus and a new geopolitical architecture."

Other more recent texts include the following: Some Trace Effects of the Post-Anthropocene: On Accelerationist Geopolitical Aesthetics, On Apps and Elementary Forms of Interfacial Life: Object, Image, Superimposition, Deep Address, What We Do is Secrete: On Virilio, Planetarity and Data Visualization, Geoscapes & the Google Caliphate: On Mumbai Attacks, Root the Earth: On Peak Oil Apophenia and Suspicious Images / Latent Interfaces (with Natalie Jeremijenko), iPhone City, and Logistics of Habitable Circulation (introduction to the 2008 edition of Paul Virilio's Speed and Politics). Recent online lectures include: 2 or 3 Things I Know About The Stack, at Bartlett School of Architecture, University of London, and University of Southampton; Cloud Feudalism at Proto/E/Co/Logics 002, Rovinj, Croatia; Nanoskin at Parsons School of Design; On the Nomos of the Cloud at Berlage Institute, Rotterdam, École Normale Supérieure, Paris, and MOCA, Los Angeles; Accidental Geopolitics at The Guardian Summit, New York; Ambivalence and/or Utopia at University of Michigan and UC Irvine; and Surviving the Interface at Parsons School of Design.

]]>
Tue, 18 Aug 2015 08:42:48 -0700 http://www.youtube.com/watch?v=FrNEHCZm_Sc
<![CDATA[IRL or it Didn't Happen: Why We Still Dismiss the Digital - The Machine Starts]]> http://themachinestarts.com/read/2013-10-irl-or-it-didnt-happen-why-we-still-dismiss-digital

Ten years ago, Stephanie Tuszynski set out on a mission that concerned the internet, sociology and Buffy the Vampire Slayer. Tuszynski wanted to find out what it was that drew members of an online Buffy fan group, The Bronze, together.

]]>
Mon, 21 Oct 2013 18:35:01 -0700 http://themachinestarts.com/read/2013-10-irl-or-it-didnt-happen-why-we-still-dismiss-digital
<![CDATA[Artist Profile: Émilie Gervais]]> http://rhizome.org/editorial/2013/apr/18/artist-profile-emilie-gervais

Animated GIF from the website Parked Domain Girl Tombstone (2013)

DR: On first inspection, a lot of your work appears to be rooted in the 90s, drawing on the low bandwidth aesthetics inherent in GIFs, midi plugins, embedded frames, ASCII art, and forgotten webring hyperlinks. But the 90s comes out in other ways, too. Pop-cultural undercurrents include Nintendo and Leisure Suit Larry; mixtapes and a particular flavor of Europop. How/why do these things speak to you as a contemporary (Web) artist?

EG: The origin of the meaning of most collected n found elements i use in my work is rooted in the 90s. My work itself isn't rooted in the 90s. I've been dragged to use that type of stuff mostly bc i like it n its accurate w the topics im interested in rn. Still tho the source material or what it evokes isn't really important. It jst adds semantic layer/s for some people n so does the aesthetics. Everything linked to that part of my work is treated as game elements (to be inserted) in different contexts of reception w diff codes of conduct. Its about notebooks. All that content is accessory to my work. You could really jst take the whole structure/s n insert totally diff content. It'd still make sense. Maybe Im already doing that but its not linked anywhere rn. Its kinda like people who enjoy playing Canabalt but hate playing Robot Unicorn. The gameplay is literally the same. Jst the content n aesthetic is different. That changes the whole experience. Whats a contemporary web artist?

Blinking Girls Cave (2012)

DR: I love the idea of interchangeable (aesthetic) content, as if Andy Warhol could have changed the contents of a "textures" subfolder and suddenly transformed a Campbell's Soup painting into a Heinz. How is play more than a structural component to your work? I'm thinking about rulemaking and breaking, especially your collaboration with Sarah Weis, Blinking Girls Cave, which the park authorities took a disliking to while it was in progress. [Ed. – Blinking Girls Cave (2012) was a part of Apache Project, a series of artworks installed at Mother Neff State Park in Moody, Texas, in a cave that was once used by the Tonkawa Indians as a shelter as well as a burial site. After an initial proposal for an installation in the cave was rejected by park management (despite having been initially approved), the project ultimately took the form of a photo shoot, in which GIFs—some of them drawn from the imagery in seduction-based adventure game Leisure Suit Larry—were displayed on tablets, smartphones and laptops that were placed within the cave and documented. This scaled-back version also proved unacceptable to park management.]

EG: I think play is a structural component of life. It's related to how i conceptualize, process n think stuff. It opens space for experimentation. To me, its more related to what sociologists do than anything performance art; like how-to approach different types of social dynamics from diff point of view per example. Also, like that Andy Warhol eating a hamburger video; a partly exhibited learning process. Breaking rules wasn't really a thing in ♡ ♥ Blinking Girls ♥ ♡. What happened at Mother Neff is that our first intended installation, which involved light effects n bubble machines, was disapproved at the last minute bc of the damage it could cause to the cave walls. Blinking Girls Cave thus became about hardwares n gifs. During the documentation - that being the installation - Nate Hitchcock, the director n curator n everything at Apache Project, was interrupted by a park ranger who requested him to leave the park because taking pictures n or making videos in the cave wasn't appropriate.

DR: There's a real sense of a partly exhibited learning process in your URL works: an ever growing array of Web 1.0 motifs, exhibited as unique URLs. For me these works expose the Internet as a spatial, material thing, still begging to be explored. You spoke of sociology, is there perhaps something archaeological in your practice?

EG: The internet is def abt spatiality and materiality. One can relate to these notions differently. To me, its really more abt physicality. I wasn't really thinking abt them topics when i made these. It's jst kinda there in all websites. Thats the internet. I wouldnt say that these r really web 1.0. The user in both cases isnt primarily a content consumer. Backdoor trojan girl was exhibited at Domain Gallery in a way that highlighted the urls. Under other circumstances, it'd prob be different. The archaeological in my practice is kinda superficial rn.

DR: Your URL artworks, http://backdoortrojangirl.net (2012) and http://w-h-a-t-e-v-e-r.net (2013), both flicker between female and male signifiers. Do you think the Web is gendered? How would you approach gender differently in work produced for a gallery context?

EG: I don't think the web is gendered. Culture is n adds gendered filter/s to it in some cases. I don't know if i would approach it; maybe i'd dig a hole for feminists/feminism or i'd do a show about postpostpostpostpostpostpost-transexualism. It'd be really fun.

DR: For your ongoing collaborative online exhibition Art Object Culture (2011-), you and Lucy Chinen bring together two artists each month to create a new work based on trinkets that were purchased online. These readily available objects accrue value as they pass through the project. I could ask you about the long shadow cast by Duchamp's readymades, about ownership, exhibition value and artistic identity as they relate to the Web. Instead, I'd really like it if you shared some AOC secrets with us. What criteria do you use to select the artists? Which is your favorite submission so far and why?

EG: Art Object Culture offers a website template for artists to explore art making within one rule: create new art objects from items pre-existing in various online stores. We mainly seek artists that have the ability to bend that rule. I don't really have a favorite submission. I like some more than others but my opinion on this is not important. There is no secret. The current format is a translation of our ideas on AOC related topics from 2011. It might eventually mutate. Hopefully we'll sell all the artworks that were made for it before that n or have a show; some kinda showcase for all of them together w everyone that made stuff for it n other people too.

Émilie Gervais

Age: my age range is 7 to 77.

Location: Paca/FR.

How long have you been working creatively with technology? How did you start?
Since forever. I started by playing games on some used pc and recontextualizing movies, game related stuff as improvised play based on the characters n plot/s with friends at school. I've always spent a lot of time randomly surfing the internet while chatting on microsoft comic chat, mIRC, the palace n was really into customizing anything that was customizable ie. winamp skins, mirc themes, etc... Beside that, my fav drawing thing is Lite Bright n i've been deleting, moving, opening files since ive been typing on a keyboard. I've crashed the home computer a couple of times.

Describe your experience with the tools you use. How did you start using them? Where did you go to school? What did you study?
Experimentation n play! My main tool is the internet or jst even information. In college, ive done a dble cursus in literature n social studies. Then, I dropped out of art school in Mtl n went to Paris. In 2010/2011, i did a dnap/bfa in 1yr at the Ecole d'Art Superieure d'Aix-en-Provence where I'm currently finishing a dnsep/master w a focus in hypermedia. My thesis text thing's title is Fuck Privacy Demo Game Over.

What traditional media do you use, if any? Do you think your work with traditional media relates to your work with technology?
I'm not media based. The traditional/non traditional dichotomy makes no sense to me. I jst use whatever depending on the project im working on. It's more about ideas n processes.

Are you involved in other creative or social activities (i.e. music, writing, activism, community organizing)?
I tweet n play music on my iphone everyday. Before that, i played ice hockey n have done some cycling as a summer training thing. I love dancing. Also, health related stuff; superfoods n other stuff, but i mostly eat pizza n candies. Thats creative. I'm involved with adrenaline, gaming, immersive as non immersive n fun everyday. I'm really concerned about open source n how it affects education/academics. But im not seriously implicated in anything, im jst personally into it rn.

What do you do for a living or what occupations have you held previously? Do you think this work relates to your art practice in a significant way?
I worked at HMV Megastore n Liquid Nutrition in Montreal while being in college. I spent one summer selling autoportraits on the Pont Saint-Louis in Paris w a friend. I worked at some pizza place on bd de Belleville. The boss never slept, ate one fried egg a day and gave us free pizza n drinks everyday. Clients ordered one expresso and remained seated for hrs jst talking abt whatever. Total Belleville cliche. Everything influences the way i process stuff. RN im an art student.

Who are your key artistic influences?
Toru Iwatani, Kassia Meador, Gustav Klimt n the internet.

Have you collaborated with anyone in the art community on a project? With whom, and on what?
I collaborate w Lucy Chinen on Art Object Culture n conducted the Blinking Girls project w Sarah Weis. I work/ed w friends that are mostly into painting n music. I ghostpost alot n collaborate w lots of people actively n passively everyday on everything. Its mostly passive networked collaboration/s.

Do you actively study art history?
Im surrounded by it. I've been into it for as long as i can remember. My dad always brought the family to museums. When i was living in San Francisco, we went to Los Angeles one time mostly jst to go n visit the Getty museum. My college art history teacher was totally awesome. Art history entertains me.

Do you read art criticism, philosophy, or critical theory? If so, which authors inspire you?
I have phases in which i read alot and others in which i dont at all. Most of the time, i try not to remember the authors so it remains jst about the ideas. RN im reading Critical Play by Mary Flanagan.

Are there any issues around the production of, or the display/exhibition of new media art that you are concerned about?
Yes, but no at the same time. It really depends on the whole concept of a project. I kinda hate almst everything that is JUST about representation when it comes to new media related art tho, so i'd say im concerned about that.

This conversation took place between 22 March and 1 April on a Google Drive document.

]]>
Thu, 18 Apr 2013 08:00:05 -0700 http://rhizome.org/editorial/2013/apr/18/artist-profile-emilie-gervais
<![CDATA[The Shadow Scholar]]> http://chronicle.com/article/article-content/125329/

In the past year, I've written roughly 5,000 pages of scholarly literature, most on very tight deadlines. But you won't find my name on a single paper.

I've written toward a master's degree in cognitive psychology, a Ph.D. in sociology, and a handful of postgraduate credits in international diplomacy. I've worked on bachelor's degrees in hospitality, business administration, and accounting. I've written for courses in history, cinema, labor relations, pharmacology, theology, sports management, maritime security, airline services, sustainability, municipal budgeting, marketing, philosophy, ethics, Eastern religion, postmodern architecture, anthropology, literature, and public administration. I've attended three dozen online universities. I've completed 12 graduate theses of 50 pages or more. All for someone else.

You've never heard of me, but there's a good chance that you've read some of my work. I'm a hired gun, a doctor of everything, an academic mercenary. My customers
]]>
Wed, 17 Nov 2010 14:56:00 -0800 http://chronicle.com/article/article-content/125329/
<![CDATA[Bizarre Systems]]> http://vimeo.com/4873884

Reductionism is an amazingly powerful strategy for leveraging the work of scientists and for disseminating the results in the form of re-usable models of structure and causality. But for some of the "remaining hard problem domains", such as Life (biology, psychology, ecology, etc.), the World (world modeling, economies, sociology), Intelligence (understanding the brain, intelligence, and creating Artificial General Intelligences - AGI), and the problem of determining the semantics of language (e.g. text), Reductionism has failed. I claim that reductionist models cannot be created in these domains (which have been named "Bizarre Domains") and that we must use Model Free (Holistic) Methods for these domains. This has important implications for AGI research strategies.

Cast: Monica Anderson

]]>
Thu, 11 Nov 2010 02:08:06 -0800 http://vimeo.com/4873884
<![CDATA[Analysis: Portal and the Deconstruction of the Institution]]> http://www.gamasutra.com/view/news/23960/Analysis_Portal_and_the_Deconstruction_of_the_Institution.php

In this in-depth analysis, Daniel Johnson discusses games, language and sociology with regard to Valve's Portal - please note that the article contains story spoilers for the game.

In 1959 Erving Goffman released The Presentation of Self in Everyday Life, a book that went on to heavily influence future understanding of social interactions within the sociology discipline. In it, he discusses social intercourse under the metaphor of actors performing on a stage. Specifically, in the second chapter he shares the idea of a front and backstage to social interaction.

As with the theater, we have a place where we manage the performance and a place where we give that performance. As social interlocutors engaged in interaction, we are presenting an impression of ourselves to an audience; we're acting out a role that requires constant management at the whim of the interaction.

The front stage is the grounds of the performance. The backstage is a place we rarely ever want to reveal to others,

]]>
Fri, 10 Sep 2010 07:33:00 -0700 http://www.gamasutra.com/view/news/23960/Analysis_Portal_and_the_Deconstruction_of_the_Institution.php
<![CDATA[Is Star Trek A Religion?]]> http://io9.com/5272441/is-star-trek-a-religion

Star Trek has long been described as a cult phenomenon…but is it an actual cult? Some anthropologists think so. Following the example of anthropologist Margaret Mead, they lived among the natives and studied their rituals; that is, they went to Star Trek conventions and fan clubs. Here's what they found.

Their conclusions? Writes cultural anthropologist Michael Jindra in the journal Sociology of Religion:

When I undertook this research, my first intention was to focus on how ST [Star Trek] draws a picture of the future that is attractive to many Americans. But early on I realized I was dealing with something much bigger and more complex than I had anticipated...it had features that paralleled a religious-type movement: an origin myth, a set of beliefs, an organization, and some of the most active and creative members to be found anywhere…Religion often points us to another world; ST does the same.
]]>
Thu, 30 Jul 2009 13:22:00 -0700 http://io9.com/5272441/is-star-trek-a-religion
<![CDATA[Obama’s Address to the State of Non-belief]]> http://www.3quarksdaily.com/3quarksdaily/2009/01/obamas-address-to-the-state-of-nonbelief.html

“We know that our patchwork heritage is a strength, not a weakness. We are a nation of Christians and Muslims, Jews and Hindus - and non-believers.” Barack Hussein Obama, 20th of January, 2009

(Originally published at 3quarksdaily) As a British citizen I watched the inauguration speech of America's 44th President with a warm but distanced interest. But as someone who was brought up in a non-religious family, and has thrived without a belief in a deity, I listened to Barack Obama's words with fascination, concern and hope. Obama's message to his nation and the greater world was one of inclusion: a broad-ranging speech during which America's new leader threw his arms wide around those who believe in America, and even wider around those who perhaps do not. The matter of 'belief' resonated throughout Obama's address: the belief in God, the belief in America and the belief in Obama himself. Yet around that single word a debate among 'non-believers' has sprung up: a debate as to whether Obama's nod to the millions of Americans who call themselves non-theists, atheists or agnostics should have been wrapped up in such a semantically negative phrase.

To pick apart the significance of the phrase 'non-believers' it pays to look at the word 'atheist': a label which is often analysed by theistic and nontheistic communities alike. A common etymological error connects "a", from the ancient Greek for "without", and "theism", denoting a belief in God. Thus, an a-theist is considered to be someone without a belief in God. The true etymology of the word, though, is better derived from the Greek root "atheos", meaning merely "godless". Thus athe(os)ism is closer in kind to a "godless belief system" than to "without a belief in god/gods". This analysis, although tiresome, is worth attending to in regard to Obama's inclusive rhetoric, because, as a minority, non-theists are among the most pilloried in American society. In an infamous 2004 study, conducted by the University of Minnesota's department of sociology, 39.5 percent of those interviewed stated that atheists "did not share their vision of American society":

Asked the same question about Muslims and homosexuals, the figures dropped to a slightly less depressing 26.3 percent and 22.6 percent, respectively. For Hispanics, Jews, Asian-Americans and African-Americans, they fell further to 7.6 percent, 7.4 percent, 7.0 percent and 4.6 percent, respectively. The study contains other results, but these are sufficient to underline its gist: Atheists are seen by many Americans (especially conservative Christians) as alien and are, in the words of sociologist Penny Edgell, the study's lead researcher, "a glaring exception to the rule of increasing tolerance over the last 30 years." - link

The suggestion that an atheist's concern for their country is of a different quality to that of a believer is enormously telling. Has the common misunderstanding of atheism as a lack of belief come to be associated in America not just with God, but with morality, patriotism and an empathy for others? A 1987 interview conducted by Rob Sherman with George Bush senior seems to attest to this. Whilst in the office of Vice President, Mr. Bush stated: "I don't know that atheists should be regarded as citizens, nor should they be regarded as patriotic. This is one nation under God." It is a comment that has rung in the ears of non-theists ever since. It is this apparent misconception about non-belief that makes Obama's comment seem all the more thoughtless.

Surely, in a speech of such fine rhetoric, so minutely crafted to chime with the thoughts and feelings of an entire nation - and of a world beyond - a phrase weighted as strongly as 'non-believers' should have been handled more carefully? It is doubtful that it was included as an afterthought; doubtful indeed that Barack Obama and his team of talented speechwriters did not deliberate over its usage and inclusion in the most important piece of oratory they had ever crafted. How many Presidents of the last century have talked of 'non-believers' in such patriotic tones? How much recent American policy has cited atheists and agnostics as integral to the character of the nation; as a minority worth even calling attention to? A closer look at the phrase is necessary, I believe, to truly grasp its significance as one of the most subtle shifts in political rhetoric the Obama team has yet delivered. Another extract from the inaugural address begins to clarify our semantic quarrel: "On this day, we come to proclaim an end to the petty grievances and false promises, the recriminations and worn-out dogmas that for far too long have strangled our politics." Here Obama asks for the narrative of American life, of American policy, to be redrafted: a call to a young nation to "pick [itself] up, dust [itself] off, and begin again" the work of building its identity.

Obama's call for America to unite under its founding principles is a definitively secular call; a call to the American State to be once again separated from any religion, just as its founding fathers had intended. For too long the identity of America has been infused with a kind of Christian grand-narrative, a sense that if God had placed mankind on the Earth to achieve greatness, and if America was the world's greatest nation, then God must have always intended for the Christian story to also be the American story. This dangerous ethos, often echoed in the rhetoric of the Bush administration, is arguably responsible for the current tension between America, the Islamic world and beyond. This dangerous ethos, once reassessed through the eyes of a secular nation, bears more resemblance to a fundamentalist doctrine than it does to a moral bedrock for American policy.

By placing 'non-believers' at the end of a list of religious denominations Obama and his team were speaking not to the religious beliefs that unite Americans, but to the moral and social bonds that tie them together as communities. When we look at the Christian community, at the Jewish community, at the Muslim and Hindu communities, the sharing of 'beliefs' becomes far less relevant. Two distinct people may call themselves Christian, but as a Protestant and a Catholic their core religious beliefs will be very different. By citing the non-believer community in his "patchwork" identity Obama was talking of the irrelevance of any particular view of God in the constitution of the American nation. His message to the Muslim world to "seek [together] a new way forward, based on mutual interest and mutual respect" was a message to all to put particular beliefs in Gods aside and get on with the common goal of restitching our patchwork world. A message to: "Tie up your camel first, then put your trust in Allah." - link

As a non-American, I can believe in similar ideals. As a proud atheist I can attest to the fact that not believing in a God does not mean I don't have beliefs. After all, every one of us - Atheist or Agnostic, Christian or Muslim, Jewish or Buddhist, Hindu, Sikh, Baha'i, Shinto or Rastafarian - is a non-believer in something.

]]>
Tue, 05 May 2009 08:31:00 -0700 http://www.3quarksdaily.com/3quarksdaily/2009/01/obamas-address-to-the-state-of-nonbelief.html
<![CDATA[The Next Great Discontinuity: The Data Deluge]]> http://www.3quarksdaily.com/3quarksdaily/2009/04/the-next-great-discontinuity-part-two.html

Speed is the elegance of thought, which mocks stupidity, heavy and slow. Intelligence thinks and says the unexpected; it moves with the fly, with its flight. A fool is defined by predictability… But if life is brief, luckily, thought travels as fast as the speed of light. In earlier times philosophers used the metaphor of light to express the clarity of thought; I would like to use it to express not only brilliance and purity but also speed. In this sense we are inventing right now a new Age of Enlightenment… A lot of… incomprehension… comes simply from this speed. I am fairly glad to be living in the information age, since in it speed becomes once again a fundamental category of intelligence. Michel Serres, Conversations on Science, Culture and Time

(Originally published at 3quarksdaily · Link to Part One) Human beings are often described as the great imitators: We perceive the ant and the termite as part of nature. Their nests and mounds grow out of the Earth. Their actions are indicative of a hidden pattern being woven by natural forces from which we are separated. The termite mound is natural, and we, the eternal outsiders, sitting in our cottages, our apartments and our skyscrapers, are somehow not. Through religion, poetry, or the swift skill of the craftsman smearing pigment onto canvas, humans aim to encapsulate that quality of existence that defies simple description. The best art, or so it is said, brings us closer to attaining a higher truth about the world that eludes language, a truth that perhaps the termite itself embodies as part of its nature. Termite mounds are beautiful, but were built without a concept of beauty. Termite mounds are mathematically precise, yet crawling through their intricate catacombs cannot be found one termite in comprehension of even the simplest mathematical constituent. In short, humans imitate and termites merely are.

This extraordinary idea is partly responsible for what I referred to in Part One of this article as The Fallacy of Misplaced Concreteness. It leads us to consider not only the human organism as distinct from its surroundings, but it also forces us to separate human nature from its material artefacts. We understand the termite mound as integral to termite nature, but are quick to distinguish the axe, the wheel, the book, the skyscraper and the computer network from the human nature that bore them. When we act, through art, religion or with the rational structures of science, to interface with the world, our imitative (mimetic) capacity has both subjective and objective consequence. Our revelations, our ideas, stories and models have life only insofar as they have a material through which to become invested. The religion of the dance, the stone circle and the summer solstice is mimetically different to the religion of the sermon and the scripture because the way it interfaces with the world is different. Likewise, it is only with the consistency of written and printed language that the technical arts could become science, and through which our 'modern' era could be built. Dances and stone circles relayed mythic thinking structures, singular, immanent and ethereal in their explanatory capacities. The truth revealed by the stone circle was present at the interface between participant, ceremony and summer solstice: a synchronic truth of absolute presence in the moment. Anyone reading this will find truth and meaning through grapholectic interface. Our thinking is linear, reductive and bound to the page. It is reliant on a diachronic temporality that the pen, the page and the book hold in stasis for us. Imitation alters the material world, which in turn affects the texture of further imitation. If we remove the process from its material interface we lose our objectivity. In doing so we isolate the single termite from its mound and, after much careful study, announce that we have reduced termite nature to its simplest constituent.

The reason for the tantalizing involutions here is obviously that intelligence is relentlessly reflexive, so that even the external tools that it uses to implement its workings become 'internalized', that is, part of its own reflexive process… To say writing is artificial is not to condemn it but to praise it.
Like other artificial creations and indeed more than any other, it is utterly invaluable and indeed essential for the realisation of fuller, interior, human potentials. Technologies are not mere exterior aids but also interior transformations of consciousness, and never more than when they affect the word. Walter J. Ong, Orality and Literacy

Anyone reading this article cannot fail to be aware of the changing interface between eye and text that has taken place over the past two decades or so. New Media – everything from the internet database to the Blackberry – has fundamentally changed the way we connect with each other, but it has also altered the way we connect with information itself. The linear, diachronic substance of the page and the book has given way to a dynamic textuality blurring the divide between authorship and readership, expert testament and the simple accumulation of experience. The main difference between traditional text-based systems and newer, data-driven ones is quite simple: it is the interface. Eyes and fingers manipulate the book, turning over pages in a linear sequence in order to access the information stored in its printed figures. For New Media, for the digital archive and the computer storage network, the same information is stored sequentially in databases which are themselves hidden from the eye. To access them one must submit a search or otherwise run an algorithm that mediates the stored data for us. The most important distinction should be made at the level of the interface, because, although the database as a form has changed little over the past 50 years of computing, the Human Control Interfaces (HCI) through which we access and manipulate that data are always passing from one iteration to another. Stone circles interfacing the seasons stayed the same, perhaps being used in similar rituals over the course of a thousand years of human cultural accumulation. Books, interfacing text, language and thought, stay the same in themselves from one print edition to the next, but as a format, books have changed very little in the few hundred years since the printing press. The computer HCI is most different from the book in that change is integral to its structure. To touch a database through a computer terminal, through a Blackberry or iPhone, is to play with data at incredible speed:

Sixty years ago, digital computers made information readable. Twenty years ago, the Internet made it reachable. Ten years ago, the first search engine crawlers made it a single database. Now Google and like-minded companies are sifting through the most measured age in history, treating this massive corpus as a laboratory of the human condition… Kilobytes were stored on floppy disks. Megabytes were stored on hard disks. Terabytes were stored in disk arrays. Petabytes are stored in the cloud. As we moved along that progression, we went from the folder analogy to the file cabinet analogy to the library analogy to — well, at petabytes we ran out of organizational analogies. At the petabyte scale, information is not a matter of simple three- and four-dimensional taxonomy and order but of dimensionally agnostic statistics… This is a world where massive amounts of data and applied mathematics replace every other tool that might be brought to bear. Out with every theory of human behavior, from linguistics to sociology. Forget taxonomy, ontology, and psychology. Who knows why people do what they do? The point is they do it, and we can track and measure it with unprecedented fidelity. With enough data, the numbers speak for themselves. Wired Magazine, The End of Theory, June 2008
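
To make the book/database contrast concrete, here is a minimal, purely illustrative sketch (not part of the original essay): a book is read by stepping through its pages in order, while a database stays hidden behind a query that an algorithm mediates. The in-memory database, table and column names are hypothetical, invented only for the example.

```python
import sqlite3

# A "book": information is reached by turning pages in a fixed, linear sequence.
book_pages = ["page 1 text", "page 2 text", "page 3 text"]
for page in book_pages:
    print(page)  # eyes and fingers move through the sequence in order

# A "database": the stored records stay hidden; an algorithm mediates access.
# The table and columns are hypothetical; ":memory:" stands in for a real archive.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE items (title TEXT, body TEXT)")
conn.execute("INSERT INTO items VALUES (?, ?)", ("Sarah Palin", "a crowd-written profile"))
conn.commit()

# The reader never leafs through the storage itself; they submit a query instead.
for title, body in conn.execute(
        "SELECT title, body FROM items WHERE title LIKE ?", ("%Palin%",)):
    print(title, body)
conn.close()
```

The point of the sketch is only the difference in access: iteration over a fixed sequence versus a mediated query over storage that is never seen directly.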

And as the amount of data has expanded exponentially, so have the interfaces we use to access that data and the models we build to understand that data. On the day that Senator John McCain announced his Vice Presidential Candidate the best place to go for an accurate profile of Sarah Palin was not the traditional media: it was Wikipedia. In an age of instant, global news, no newspaper could keep up with the knowledge of the cloud. The Wikipedia interface allowed knowledge about Sarah Palin from all levels of society to be filtered quickly and efficiently in real-time. Wikipedia acted as encyclopaedia, as newspaper, as discussion group and as expert all at the same time, and it did so completely democratically and in the absence of a traditional management pyramid. The interface itself became the thinking mechanism of the day, as if the notes every reader scribbled in the margins had been instantly cross-checked and added to the content. In only a handful of years the human has gone from merely dipping into the database to becoming an active component in a human-cloud of data. The interface has begun to reflect back upon us, turning each of us into a node in a vast database bigger than any previous material object. Gone are the days when clusters of galaxies had to be catalogued by an expert and entered into a linear taxonomy. Now, the same job is done by the crowd and the interface, allowing a million galaxies to be catalogued by amateurs in the same time it would have taken a team of experts to classify a tiny percentage of the same amount. This method of data mining is called 'crowdsourcing', and it represents one of the dominant ways in which raw data will be turned into information (and then knowledge) over the coming decades (a minimal sketch of this kind of crowd aggregation follows the quotation below). Here the cloud serves as more than a metaphor for the group-driven interface, becoming a telling analogy for the trans-grapholectic culture we now find ourselves in. To grasp the topological shift in our thought patterns it pays to move beyond the interface and look at a few of the linear, grapholectic models that have undergone change as a consequence of the information age. One of these models is evolution, a biological theory the significance of which we are still in the process of discerning:

If anyone now thinks that biology is sorted, they are going to be proved wrong too. The more that genomics, bioinformatics and many other newer disciplines reveal about life, the more obvious it becomes that our present understanding is not up to the job. We now gaze on a biological world of mind-boggling complexity that exposes the shortcomings of familiar, tidy concepts such as species, gene and organism. A particularly pertinent example [was recently provided in New Scientist] - the uprooting of the tree of life which Darwin used as an organising principle and which has been a central tenet of biology ever since. Most biologists now accept that the tree is not a fact of nature - it is something we impose on nature in an attempt to make the task of understanding it more tractable. Other important bits of biology - notably development, ageing and sex - are similarly turning out to be much more involved than we ever imagined. As evolutionary biologist Michael Rose at the University of California, Irvine, told us: “The complexity of biology is comparable to quantum mechanics.” New Scientist, Editorial, January 2009
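
Returning to the crowdsourcing passage above: this is a minimal, purely illustrative sketch of how many amateur classifications might be reduced to a single catalogue entry, assuming a simple majority vote. The object identifiers and labels are invented for the example and are not drawn from any actual project.

```python
from collections import Counter

# Hypothetical crowd classifications: (object_id, label) pairs from many amateurs.
crowd_votes = [
    ("galaxy-001", "spiral"), ("galaxy-001", "spiral"), ("galaxy-001", "elliptical"),
    ("galaxy-002", "elliptical"), ("galaxy-002", "elliptical"),
]

def aggregate(votes):
    """Reduce many individual judgements to one catalogue label per object (majority vote)."""
    counts_by_object = {}
    for obj, label in votes:
        counts_by_object.setdefault(obj, Counter())[label] += 1
    return {obj: counts.most_common(1)[0][0] for obj, counts in counts_by_object.items()}

print(aggregate(crowd_votes))
# -> {'galaxy-001': 'spiral', 'galaxy-002': 'elliptical'}
```

Majority voting is only one possible aggregation rule; the sketch is meant to show the shape of the workflow, not any particular system.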

As our technologies became capable of gathering more data than we were capable of comprehending, a new topology of thought, reminiscent of the computer network, began to emerge. For the mindset of the page and the book, science could afford to be linear and diachronic. In the era of The Data Deluge science has become more cloud-like, as theories for everything from genetics to neuroscience, particle physics to cosmology have shed their linear constraints. Instead of seeing life as a branching tree, biologists are now speaking of webs of life, where lineages can intersect and interact, where entire species are ecological systems in themselves. As well as seeing the mind as an emergent property of the material brain, neuroscience and philosophy have started to consider the mind as manifest in our extended, material environment. Science has exploded, and picking up the pieces will do no good. Through the topology of the network we have begun to perceive what Michel Serres calls 'The World Object', an ecology of interconnections and interactions that transcends and subsumes the causal links propounded by grapholectic culture. At the limits of science a new methodology is emerging at the level of the interface, where masses of data are mined and modelled by systems and/or crowds which themselves require no individual understanding to function efficiently. Where once we studied events and ideas in isolation we now devise ever more complex, multi-dimensional ways for those events and ideas to interconnect; for data sources to swap inputs and outputs; for outsiders to become insiders. Our interfaces are in constant motion, on trajectories that curve around to meet themselves, diverge and cross-pollinate. Thought has finally been freed from temporal constraint, allowing us to see the physical world, life, language and culture as multi-dimensional, fractal patterns, winding the great yarn of (human) reality:

The advantage that results from it is a new organisation of knowledge; the whole landscape is changed. In philosophy, in which elements are even more distanced from one another, this method at first appears strange, for it brings together the most disparate things. People quickly criticize me for this… But these critics and I no longer have the same landscape in view, the same overview of proximities and distances. With each profound transformation of knowledge come these upheavals in perception. Michel Serres, Conversations on Science, Culture and Time

]]>
Tue, 05 May 2009 07:35:00 -0700 http://www.3quarksdaily.com/3quarksdaily/2009/04/the-next-great-discontinuity-part-two.html
<![CDATA[The Next Great Discontinuity: Part One]]> http://www.3quarksdaily.com/3quarksdaily/2009/03/the-next-great-discontinuity-part-one.html

Grapholectic Thought and The Fallacy of Misplaced Concreteness (Originally published at 3quarksdaily · Link to Part Two) "There are things," Christoph Martin Wieland… contended, "which by their very nature are so dependent upon human caprice that they either exist or do not exist as soon as we desire that they should or should not exist."… We are, at the very least, reminded that seeing is a talent that needs to be cultivated, as John Berger saliently argued in his popular Ways of Seeing (1972): "…perspective makes the single eye the centre of the visible world." John A. McCarthy, Remapping Reality

From the Greco-Roman period onwards humans have perceived themselves at the centre of a grand circle:

The circle is physical: a geocentric vision of the cosmos, where the sun travels around the Earth. The circle is biological: an order of nature, perhaps orchestrated by a benign creator, where the animals and plants exist to satisfy the needs of mankind. And according to Sigmund Freud, in his Introductory Lectures on Psycho-Analysis, the circle is psychological: where a central engine of reason rules over the chaos of passion and emotion.

The history of science maintains that progress – should one be comfortable in using such a term – contracted these perceptual loops. Indeed it was Freud himself (the modest pivot of his own solar system) who suggested that through the Copernican, Darwinian and Freudian "revolutions" mankind had transcended these "three great discontinuities" of thought and "[uttered a] call to introspection". If one were to speculate on the "great discontinuities" that followed, one might consider Albert Einstein's relativistic model of space-time, or perhaps the work carried out by many "introspective" minds on quantum theory. Our position at the centre of the cosmos was offset by Copernicus; our position as a special kind of creature was demolished by Darwin's Theory of Evolution. From Freud we inherited the capacity to see beneath the freedom of the individual; from Einstein and quantum theory we learnt to mistrust the mechanistic clock of space and time. From all of these we learnt, as John Berger so succinctly put it, that "…perspective makes the single eye the centre of the visible world." Of course my mini-history of scientific revolution should not itself be taken as a "truth". I draw it as a parable of progress, as one silken thread leading back through time's circular labyrinth to my very own Ariadne. What I do maintain, though, is that all great moves in human thought have come at the expense of a perceptual circle: that, if science, sociology, economics - or any modern system of knowledge - is to move beyond the constraints of its circle, it must first decentre the "single eye".

Scientific rational inquiry has revelled in the overturning of these “great discontinuities”, positioning each of them as a plotted point on the graph we understand as “progress”. We maintain, without any hint of irony, that we exist at the pinnacle of this irreversible line of diachronic time, that the further up the line we climb, the closer to “truth” we ascend. “…Reason is statistically distributed everywhere; no one can claim exclusive rights to it. [A] division… is [thus] echoed in the image, in the imaginary picture that one makes of time. Instead of condemning or excluding, one consigns a certain thing to antiquity, to archaism. One no longer says “false” but, rather, “out of date,” or “obsolete.” In earlier times people dreamed; now we think. Once people sang poetry; today we experiment efficiently. History is thus the projection of this very real exclusion into an imaginary, even imperialistic time. The temporal rupture is the equivalent of a dogmatic expulsion.” Michel Serres, Conversations on Science, Culture and Time

According to Michel Serres, "time" is the common misconception that pollutes all our models. In the scientific tradition knowledge is located at the present: a summation of all inquiry that has led up to this point. This notion is extraordinarily powerful, bringing all previous data together in one great cataclysm of meaning. It has spawned its own species of cliché, the type where science 'landed us on the moon' or 'was responsible for the extinction of smallpox' or 'increased the life expectancy of the third world'. These types of truths are necessary – you will not find me arguing against that – but they are also only one notion of what "truth" amounts to. And it is here perhaps where the circumference of yet another perceptual circle materialises from out of the mist. Progress and diachronic time are symbiotically united: the one being incapable of meaningful existence without the other. Our modern notion of "truth" denies all wisdom that cannot be plotted on a graph; that cannot be traced backwards through the recorded evidence or textual archive. Our modern conceptions are what Walter J. Ong calls the consequence of a 'grapholectic' culture – that is, one reliant on the technologies of writing and/or print. Science, as we understand it, could not have arisen without a system of memorisation and retrieval that extended beyond the limits of an oral culture. In turn, modern religious practices are as much a consequence of 'the written word' as they are 'the word of God'. The "truth" of science is similar in kind to the "truth" of modern religion. It is the "truth" of the page; of a diachronic, grapholectic culture – a difficult "truth" to swallow for those who maintain that 'dogma' is only a religious vice. Dialectic cultures – ones which are based in oral traditions – do not consider history and time in the same way as grapholectic cultures. To the dialectic, meaning is reliant on what one can personally or culturally remember, rather than on what the extended memory of the page can hold in storage. Thus the attribution of meaning emerges from the present, synchronic situation, rather than being reliant on the consequences of past observation:

"Some decades ago among the Tiv people of Nigeria the genealogies actually used orally in settling court disputes have been found to diverge considerably from the genealogies carefully recorded in writing by the British forty years earlier (because of the importance then, too, in court disputes). The later Tiv have maintained that they were using the same genealogies as forty years earlier and that the earlier written record was wrong. What had happened was that the later genealogies had been adjusted to the changed social relations among the Tiv: they were the same in that they functioned in the same way to regulate the real world. The integrity of the past was subordinate to the integrity of the present." Walter J. Ong, Orality and Literacy

In the oral culture "truth" must be rooted in systems that are not time-reliant. As Karen Armstrong has oft noted, "a myth was an event which in some sense had happened once, but which also happened all the time." Before the written tradition was used to brand religious inclinations onto the page, the flavour of myth, rather than its ingredients, was understood as its most valuable "truth". The transcendence of Buddha, of Brahmā or Jesus is a parable of existence, and not a true fact garnered from evidence and passed down in the pages of a book. Meaning is not to be found in final "truths", but in the questioning of contexts; in the deliberation of what constitutes the circle. If we forget this then we commit what A. N. Whitehead called the fallacy of misplaced concreteness:

"This… consists in mistaking the abstract for the concrete. More specifically it involves setting up distinctions which disregard the genuine interconnections of things…. [The] fallacy occurs when one assumes that in expressing the space and time relations of a bit of matter it is unnecessary to say more than that it is present in a specific position in space at a specific time. It is Whitehead's contention that it is absolutely essential to refer to other regions of space and other durations of time… [Another] general illustration of the fallacy of Misplaced Concreteness is… the notion that each real entity is absolutely separate and distinct from every other real entity, and that the qualities of each have no essential relation to the qualities of others." A. H. Johnson, Whitehead's Theory of Reality

Our error is to mistake grapholectic thought - thought maintained by writing and print - for the only kind of thought we are capable of. I predict that the next "great discontinuity" to be uncovered, the one that historians will look back upon as "the biggest shift in our understanding since Einstein", will emerge not from the traditional laboratory, or from notions computed through the hazy filters of written memory, but from our very notion of what it is for "events" to become "data" and for that data to become "knowledge". The circle we now sit at the centre of is one enclosed by the grapholectic perceptions we rely on to consider the circle in the first place. In order to shift it we will need a new method of transposing events that occur 'outside' the circle into types of knowledge that have value 'within' the circle. This may sound crazy, even impossible in scope, but we may have already begun devising new ways for this kind of knowledge to reach us. Continued in… Part Two: The Data Deluge

]]>
Mon, 04 May 2009 07:17:00 -0700 http://www.3quarksdaily.com/3quarksdaily/2009/03/the-next-great-discontinuity-part-one.html