MachineMachine /stream - search for analogy
https://machinemachine.net/stream/feed
en-us | http://blogs.law.harvard.edu/tech/rss | LifePress | therourke@gmail.com
<![CDATA[Digital Metaphors: Editor’s Introduction | Alluvium]]> http://www.alluvium-journal.org/2013/12/04/digital-metaphors-editors-introduction/

Metaphor wants to be…

‘[...] metaphors work to change people’s minds. Orators have known this since Demosthenes. [...] But there’s precious little evidence that they tell you what people think. [...] And in any case, words aren’t meanings. As any really good spy knows, a word is a code that stands for something else. If you take the code at face value then you’ve fallen for the trick.’ (Daniel Soar, “The Bourne Analogy”).

Tao Lin’s recent novel Taipei (2013) is a fictional document of life in our current digital culture. The protagonist, Paul — who is loosely based on the author — is numb from his always-on, digitally mediated life, and throughout the novel increases his recreational drug taking as a kind of compensation: the chemical highs and trips are the experiential counterpoint to the mundanity of what once seemed otherworldly — his online encounters. In the novel online interactions are not distinguished from real life ones, they are all real, and so Paul’s digital malaise is also his embodied depressive mindset. The apotheosis of both these highs and lows is experienced by Paul, and his then girlfriend Erin, on a trip to visit Paul’s parents in Taipei. There the hyper-digital displays of the city — ‘lighted signs [...] animated and repeating like GIF files, attached to every building’ (166) — launch some of the more explicit meditations on digital culture in the novel: ‘Paul asked [Erin] if she could think of a newer word for “computer” than “computer,” which seemed outdated and, in still being used, suspicious in some way, like maybe the word itself was intelligent and had manipulated culture in its favor, perpetuating its usage’ (167). Here Paul intimates a sense that language is elusive, that it is sentient, and that, in the words of Daniel Soar quoted above as an epigraph, it tricks us. It seems to matter that in this extract from Taipei the word ‘computer’ is conflated with a sense of the object ‘computer’. The word, in being ‘intelligent’, has somehow taken on the quality of the thing it denotes — a potentially malevolent agency. The history of computing is one of people and things: computers were first the women who calculated ballistics trajectories during the Second World War, whose actions became the template for modern automated programming. The computer, as an object, is also-always a metaphor of a human-machine relation.
The name for the machine asserts a likeness between the automated mechanisms of computing and the physical and mental labour of the first human ‘computers’. Thinking of computing as a substantiated metaphor for a human-machine interaction pervades the way we talk about digital culture, most particularly in the way we think of computers as sentient — however casually. We often speak of computers as acting independently of our commands, and frequently we think of them ‘wanting’ things, ‘manipulating’ culture, or ourselves.

Pre-electronic binary code: the history of computing offers us metaphors for human-machine interaction which pervade the way we talk about digital culture today [Image by Erik Wilde under a CC BY-SA license]

Julie E. Cohen, in her 2012 book Configuring the Networked Self, describes the way the misplaced metaphor of human-computer and machine-computer has permeated utopian views of digitally mediated life: ‘Advocates of information-as-freedom initially envisioned the Internet as a seamless and fundamentally democratic web of information [...]. That vision is encapsulated in Stewart Brand’s memorable aphorism “Information wants to be free.” [...] Information “wants” to be free in the same sense that objects with mass present in the earth’s gravitational field “want” to fall to the ground’ (8). Cohen’s sharp undercutting of Brand’s aphorism points us toward the way the metaphor of computing is also an anthropomorphisation. The metaphor implicates a human desire in machine action. This linguistic slipperiness filters through discussion of computing at all levels. In particular the field of software studies — concerned with theorising code and programming as praxis and thing — contains at its core a debate on the complexity of considering code in a language which will always metaphorise, or allegorise. Responding to Alexander R. Galloway’s article “Language Wants to Be Overlooked: On Software and Ideology”, Wendy Hui Kyong Chun argues that Galloway’s stance against a kind of ‘anthropomorphization’ of code studies (his assertion that as an executable language code is ‘against interpretation’) is impossible within a discourse of critical theory. Chun asks, ‘to what extent, however, can source code be understood outside of anthropomorphization? [...] (The inevitability of this anthropomorphization is arguably evident in the title of Galloway’s article: “Language Wants to Be Overlooked” [emphasis added].)’ (Chun 305). In her critique of Galloway’s approach Chun asserts that it is not possible to extract the metaphor from the material, that they are importantly and intrinsically linked.[1] For Julie E.
Cohen the relationship between metaphor and digital culture-as-it-is-lived is a problematic tie that potentially damages legal and constitutional understanding of user rights. Cohen convincingly argues that a term such as ‘cyberspace’, which remains inextricable from its fictional and virtual connotations, does not transition into legal language successfully; in part because the word itself is a metaphor, premised on an imagined reality rather than ‘the situated, embodied beings who inhabit it’ (Cohen 3). And yet Cohen’s writing itself demonstrates the tenacious substance of metaphoric language, using extended exposition of metaphors as a means to think more materially about the effects of legal and digital protocol and action. In the following extract from Configuring the Networked Self, Cohen is winding down a discussion of the difficulty of forming actual policy out of freedom versus control debates surrounding digital culture. Throughout the discussion Cohen has emphasised the way that both sides of the debate are unable to substantiate their rhetoric with embodied user practice; instead Cohen identifies a language that defers specific policy aims.[2] Cohen’s own use of metaphor in this section — ‘objections to control fuel calls [...]’, ‘darknets’ (the latter in inverted commas) — is made to mean something grounded, through a kind of allegorical framework. I am not suggesting that allegory materialises metaphor — allegory functioning in part as itself an extended metaphor — but it does contextualise metaphor.

How tenacious is metaphoric language? The persistence of computational metaphors in understanding digital culture could harm legal and constitutional understandings of user rights [Image by Christian under a CC BY-NC-ND license]

This is exemplified in Cohen’s description of the ways US policy discussions regarding code, rights and privacy of the subject are bound to a kind of imaginary, and demonstrate great difficulty in becoming concrete: ‘Policy debates have a circular, self-referential quality. Allegations of lawlessness bolster the perceived need for control, and objections to control fuel calls for increased openness. That is no accident; rigidity and license historically have maintained a curious symbiosis. In the 1920s, Prohibition fueled the rise of Al Capone; today, privately deputized copyright cops and draconian technical protection systems spur the emergence of uncontrolled “darknets.” In science fiction, technocratic, rule-bound civilizations spawn “edge cities” marked by their comparative heterogeneity and near imperviousness to externally imposed authority. These cities are patterned on the favelas and shantytowns that both sap and sustain the world’s emerging megacities. The pattern suggests an implicit acknowledgment that each half of the freedom/control binary contains and requires the other’ (9-10). I quote this passage at length in order to get at the way in which the ‘self-referential quality’ of policy discussion is here explained through a conceptual, and specifically literary, framing. Technology is always both imagined and built: this seems obvious, but it justifies reiteration because the material operations of technology are always metaphorically considered just as they are concretely manifest. The perilous circumstance this creates is played on in Cohen’s writing as she critiques constitutional policy that repeatedly cannot get at the embodied subject who uses digital technology, thwarted by the writing and rewriting of debate. In Cohen’s words this real situation is like the science fiction that is always-already seemingly like the real technology.
Whether William Gibson’s ‘cyberspace’, a programmer’s speculative coding, or a lawyer’s articulation of copyright, there is no easy way to break apart the relationship between the imaginary and the actual of technoculture. Perhaps then what is called for is an explosion of the metaphors that pervade contemporary digital culture. To, so to speak, push metaphors until they give way; to generate critical discourse that tests the limits of metaphors, in an effort to see what pretext they may yield for our daily digital interactions. The articles in this issue all engage with exactly this kind of discourse. In Sophie Jones’ “The Electronic Heart”, the history of computing as one of women’s labour is used to reconfigure the metaphor of a computer as an ‘electronic brain’; instead asking whether cultural anxieties about computer-simulated emotion are linked to the naturalization of women’s affective labour. In “An Ontology of Everything on the Face of the Earth”, Daniel Rourke also considers computers as a sentient metaphor: uncovering an uncanny symbiosis between what a computer wants and what a human can effect with computing, through a critical dissection of the biocybernetic leeching of John Carpenter’s 1982 film The Thing. Finally, in “The Metaphorics of Virtual Depth”, Rob Gallagher uses Marcel Proust’s treatment of novelistic spacetime to generate a critical discourse on spatial and perspectival metaphor in virtual game environments. All these articles put into play an academic approach to metaphors of computing that digs up and pulls out the stuff in between language and machine. In his introduction to Understanding Digital Humanities David M. Berry has argued for such an approach: [what is needed is a] ‘critical understanding of the literature of the digital, and through that [to] develop a shared culture through a form of Bildung’ (8).

A wheel in the sky: Neill Blomkamp's futuristic L.A. plays on the territorial paranoia of the U.S. over alien invasion and dystopian metaphors of digitally-mediated environments [Image used under fair dealings provisions]

I am writing this article a day after seeing Neill Blomkamp’s film Elysium (2013). Reading Cohen’s assertion regarding the cyclical nature of US digital rights policy debates on control and freedom, her allegory with science fiction seems entirely pertinent. Elysium is set in 2154; the earth is overpopulated, under-resourced, and a global elite have escaped to a man-made (and machine-made) world on a spaceship, ‘Elysium’. Manufacturing for Elysium continues on earth where the population, ravaged by illness, dreams of escaping to Elysium to be cured in “Med-Pods”. The movie focuses on the slums of near-future L.A. and — perhaps unsurprisingly given Blomkamp’s previous film District 9 (2009) — plays on the real territorial paranoia of the U.S. over alien invasion: that the favelas of Central and South America, and the political structures they embody, are always threatening ascension. In Elysium the “edge city” is the whole world, and the technocratic power base is a spaceship garden orbiting the earth. ‘Elysium’ is a green and white paradise; a techno-civic environment in which humans and nature are equally managed, and manicured. ‘Elysium’, visually, looks a lot like Disney’s Epcot theme park — which brings me back to where I started. In Tao Lin’s Taipei, Paul’s disillusionment with technology is in part with its failure to be as he imagined, and his imagination was informed by the Disney-fied future of Epcot. In Taipei: ‘Paul stared at the lighted signs, some of which were animated and repeated like GIF files, attached to almost every building to face oncoming traffic [...] and sleepily thought how technology was no longer the source of wonderment and possibility it had been when, for example, he learned as a child at Epcot Center [...] that families of three, with one or two robot dogs and one maid, would live in self-sustaining, underwater, glass spheres by something like 2004 or 2008’ (166).
Thinking through the metaphor of Elysium has me thinking toward the fiction of Epcot (via Tao Lin’s book). The metaphor-cum-allegories at work here are at a remove from my digitally mediated, embodied reality, but they seep through nonetheless. Rather than only look for the concrete reality that drives the metaphor, why not also engage with the messiness of the metaphor; its potential disjunction with technology as it is lived, and its persistent presence regardless.

CITATION: Zara Dinnen, "Digital Metaphors: Editor's Introduction," Alluvium, Vol. 2, No. 6 (2013): n. pag. Web. 4 December 2013, http://dx.doi.org/10.7766/alluvium.v2.6.04

]]>
Wed, 11 Dec 2013 15:42:41 -0800 http://www.alluvium-journal.org/2013/12/04/digital-metaphors-editors-introduction/
<![CDATA[Digital Archaeology]]> http://www.wired.com/wired/archive/1.05/1.5_archaeology.html

Anyone who can read German can read the first book ever printed. If you can read Sumerian cuneiform, you can read clay tablets that were probably the first things ever written. Hard copy sticks around, equally a delight to scholars and a burden to office managers. More significantly, the read-out system - the human eye, hand, and brain for which the scribes of Sumer scratched in clay - has not changed appreciably, which is why nearly every written thing that survives, from the dawn of writing to yesterday's newspaper, is still accessible and constitutes a fragment of civilization. But as we move forward we discover that not all modern means of storing data share the characteristic of eternal readability. This problem originally appeared in the pre-electronic age, with the invention of sound recording. Signals were embodied in an object that required a specific machine to render it back into a form that could be apprehended by the senses. In those days, there were dozens of incompatible recording formats. The 10-inch, 78-rpm shellac platter ultimately won out, but not before the losers had produced a substantial body of recorded material, some of it irreplaceable. Serious audiophiles constructed customized machines that could play anything from Edison cylinders to the various platter formats - including those that ran from the axis to the circumference - to the standard outside-in disk. When LPs were introduced, turntable manufacturers included variable-speed switches, so you could play your old 45s as well as the new "albums." That was a slower era, of course, when decades passed between one standard and another. But the advent of digital computing in the early '50s vastly accelerated the pace at which we replace formats designed to store information. With computers increasing an order of magnitude in speed every two or three years, at the same time decreasing in cost, the pressure to dump the old, less efficient standards was irresistible. 
Obviously, much of the data stored on the old systems - the material of immediate or archival value to the organization doing the replacement - is recorded in the new format and lives on. But a lot of it doesn't. Digital archaeology is a discipline that doesn't quite exist yet, but may develop to deal with this problem, which is pervasive in the world of data. NASA, for example, has huge quantities of information sent back from space missions in the 1960s stored on deteriorating magnetic tape, in formats designed for computers that were obsolete twenty years ago. NASA didn't have the funds to transfer the data before the machines became junk. The National Center for Atmospheric Research has "thousands of terabits" of data on aging media that will probably never be updated because it would take a century to do it. The archival tapes of Doug Engelbart's Augment project - an important part of the history of computing - are decaying in a St. Louis warehouse. "The 'aging of the archives' issue isn't trivial," says desktop publisher Ari Davidow. "We're thinking of CD-ROM as a semi-permanent medium, but it isn't. We already have PageMaker files that are useless." Also, recall that the PC era is an eye-blink compared to the mainframe generations that came and went under the care of the old Egyptian priesthood of computer geeks. (Would you believe a '60s vintage GE 225 machine that ran tapes that stored 256 bits per inch? Drop some developer on it and you can actually see the bits.) J. Paul Holbrook, technical services manager for CICNet (one such Egyptian priest), summarizes the problem this way: "The biggest challenge posed by systems like this is the sheer volume of information saved - there's too much stuff, it isn't indexed when it's saved, so there's lots of stuff you could never discover without loading it up again - that is, if you could load it up. The nature of the technology makes saving it all a daunting task.
It's certainly possible to keep information moving forward indefinitely, if you keep upgrading it as you go along. But given the volume of data and how fast it's growing, this could present an enormous challenge." Holbrook says twenty years is the maximum time you can expect to maintain a form of digital data without converting it to a newer format. He draws an analogy to print: "What if all your books had only a twenty-year life span before you had to make copies of them?" A 'museum of information,' suggested by WELL info-maven Hank Roberts, might help to stem the leakage. Roberts says, "[Museum] collections are spotty and odd sometimes, because whenever people went out to look for anything, they brought back 'everything else interesting.' And that's the only way to do it, because it always costs too much to get info on demand - a library makes everything available and throws out old stuff; a museum has lots of stuff tucked away as a gift to the future."

]]>
Wed, 11 Dec 2013 15:42:33 -0800 http://www.wired.com/wired/archive/1.05/1.5_archaeology.html
<![CDATA[Darwin Among the Machines — [To the Editor of the Press, Christchurch, New Zealand, 13 June, 1863.]]]> http://nzetc.victoria.ac.nz/tm/scholarly/tei-ButFir-t1-g1-t1-g1-t4-body.html

Sir—There are few things of which the present generation is more justly proud than of the wonderful improvements which are daily taking place in all sorts of mechanical appliances. And indeed it is matter for great congratulation on many grounds. It is unnecessary to mention these here, for they are sufficiently obvious; our present business lies with considerations which may somewhat tend to humble our pride and to make us think seriously of the future prospects of the human race. If we revert to the earliest primordial types of mechanical life, to the lever, the wedge, the inclined plane, the screw and the pulley, or (for analogy would lead us one step further) to that one primordial type from which all the mechanical kingdom has been developed, we mean to the lever itself, and if we then examine the machinery of the Great Eastern, we find ourselves almost awestruck at the vast development of the mechanical world, at the gigantic strides with which it has advanced in compari

]]>
Mon, 31 Dec 2012 06:54:00 -0800 http://nzetc.victoria.ac.nz/tm/scholarly/tei-ButFir-t1-g1-t1-g1-t4-body.html
<![CDATA[Creating Artificial Intelligence Based on the Real Thing]]> http://www.nytimes.com/2011/12/06/science/creating-artificial-intelligence-based-on-the-real-thing.html

For the most part, the biological metaphor has long been just that — a simplifying analogy rather than a blueprint for how to do computing. Engineering, not biology, guided the pursuit of artificial intelligence. As Frederick Jelinek, a pioneer in speech recognition, put it, “airplanes don’t flap their wings.”

Yet the principles of biology are gaining ground as a tool in computing. The shift in thinking results from advances in neuroscience and computer science, and from the prod of necessity.

The physical limits of conventional computer designs are within sight — not today or tomorrow, but soon enough. Nanoscale circuits cannot shrink much further. Today’s chips are power hogs, running hot, which curbs how much of a chip’s circuitry can be used. These limits loom as demand is accelerating for computing capacity to make sense of a surge of new digital data from sensors, online commerce, social networks, video streams and corporate and government databases.

]]>
Wed, 08 Aug 2012 02:44:00 -0700 http://www.nytimes.com/2011/12/06/science/creating-artificial-intelligence-based-on-the-real-thing.html
<![CDATA[Binary Nomination]]> http://machinemachine.net/text/ideas/binary-nomination

‘An important feature of a learning machine is that its teacher will often be very largely ignorant of quite what is going on inside, although he may still be able to some extent to predict his pupil’s behaviour.’ Alan Turing, Computing Machinery and Intelligence (1950)

Replenishing each worn-out piece of its glimmering hull, one by one, the day arrives when the entire ship of Argo has been displaced – each of its parts now distinct from those of the ‘original’ vessel. For Roland Barthes, this myth exposes two modest activities:

1. Substitution (one part replaces another, as in a paradigm)
2. Nomination (the name is in no way linked to the stability of the parts) [1]

The discrete breaches the continuous in the act of nomination. Take for instance the spectrum of colours, the extension of which ‘is verbally reduced to a series of discontinuous terms’ [2] such as red, green, lilac or puce. Each colour has no cause but its name. By being isolated in language the colour ‘blue’ is allowed to exist, but its existence is an act of linguistic and, some would argue, perceptual severance. The city of Hull, the phrase “I will”, the surface of an ice cube and an image compression algorithm are entities each sustained by the same nominative disclosure: a paradox of things that seem to flow into one another with liquid potential, but things, nonetheless, limited by their constant, necessary re-iteration in language. There is no thing more contradictory in this regard than the human subject, a figure Barthes tried, paradoxically, to side-step in his playful autobiography. Like the ship of Argo, human experience has exchangeable parts, but at its core, such was Barthes’ intention, ‘the subject, unreconciled, demands that language represent the continuity of desire.’ [3]

In an esoteric paper, published in 1930, Lewis Richardson teased out an analogy between flashes of human insight and the spark that leaps across a gap in an electrical circuit. The paper, entitled The Analogy Between Mental Images and Sparks, navigates around a provocative sketch, stencilled into its pages, of a simple indeterminate circuit whose future state is impossible to predict. Richardson’s playful label for the diagram hides a deep significance. For even at the simplest binary level, Richardson argued, computation need not necessarily be deterministic.

The discrete and the continuous are here again blurred by analogy. Electricity flowing and electricity not flowing: a binary imposition responsible for the entire history of information technology.

 

[1] Roland Barthes, Roland Barthes (University of California Press, 1994), 46.

[2] Roland Barthes, Elements of Semiology (Hill and Wang, 1977), 64.

[3] Paul John Eakin, Touching the World: Reference in Autobiography (Princeton University Press, 1992), 16.

]]>
Thu, 19 Jul 2012 09:32:00 -0700 http://machinemachine.net/text/ideas/binary-nomination
<![CDATA[The Manifest Destiny of Artificial Intelligence]]> http://www.americanscientist.org/issues/id.15837,y.2012,no.4,content.true,page.1,css.print/issue.aspx

Artificial intelligence began with an ambitious research agenda: To endow machines with some of the traits we value most highly in ourselves—the faculty of reason, skill in solving problems, creativity, the capacity to learn from experience. Early results were promising. Computers were programmed to play checkers and chess, to prove theorems in geometry, to solve analogy puzzles from IQ tests, to recognize letters of the alphabet. Marvin Minsky, one of the pioneers, declared in 1961: “We are on the threshold of an era that will be strongly influenced, and quite possibly dominated, by intelligent problem-solving machines.”

Fifty years later, problem-solving machines are a familiar presence in daily life. Computer programs suggest the best route through cross-town traffic, recommend movies you might like to see, recognize faces in photographs, transcribe your voicemail messages and translate documents from one language to another. As for checkers and chess, computers are not merely good

]]>
Tue, 10 Jul 2012 02:48:00 -0700 http://www.americanscientist.org/issues/id.15837,y.2012,no.4,content.true,page.1,css.print/issue.aspx
<![CDATA[Sloppy MicroChips: Can a fair comparison be made between biological and silicon entropy?]]> http://ask.metafilter.com/mefi/217051

Was reading about microchips that are designed to allow a few mistakes (known as 'Sloppy Chips'), and pondering equivalent kinds of 'coding' errors and entropy in biological systems. Can a fair comparison be made between the two? OK, to set up my question I probably need to run through my (basic) understanding of biological vs silicon entropy...

In the transistor, error is a bad thing (in getting the required job done as efficiently and cheaply as possible), checked by parity bits that come as standard in every packet of data transmitted. But, in biological systems error is not necessarily bad. Most copying errors are filtered out, but some propagate and some of those might become beneficial to the organism (in thermodynamics sometimes known as "autonomy producing equivocations").
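The even-parity scheme gestured at above can be sketched in a few lines of Python. This is a minimal illustration of the general principle (a single check bit that makes the count of 1s even), not the error-checking of any particular real protocol, and the function names are my own:

```python
def parity_bit(bits):
    """Even parity: the check bit that makes the total count of 1s even."""
    return sum(bits) % 2

def transmit(bits):
    """Append the parity bit to the payload, as a sender would."""
    return bits + [parity_bit(bits)]

def is_valid(frame):
    """Receiver's check: payload plus parity must contain an even number of 1s."""
    return sum(frame) % 2 == 0
```

Flipping any single bit of a transmitted frame makes `is_valid` return False, but flipping two bits slips through unnoticed; that limit is one reason real systems layer stronger checks (checksums, CRCs, ECC) on top.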

Relating to the article about 'sloppy chips', how do entropy and energy efficiency factor into this? For the silicon chip, exactness costs energy and produces heat (a problem); for the string of DNA, exactness leads to fewer mutations, and thus less change within populations, and thus, inevitably, less capacity for organisms to diversify and react to their environments - leading to no evolution, no change, no good. Slightly less exactness is good for biology, and, it seems, good for some kinds of calculations and computer processes.

What work has been done on these connections I draw between the biological and the silicon?

I'm worried that my analogy is limited, based as it is on a paradigm for living systems that too closely mirrors the digital systems we have built. Can DNA and binary parity-bit transistors be understood on their own terms, without resorting to using the other as a metaphor for understanding?

Where do the boundaries lie in comparing the two?

]]>
Tue, 05 Jun 2012 10:05:10 -0700 http://ask.metafilter.com/mefi/217051
<![CDATA[Sloppy MicroChips: Oh, that’s near enough]]> http://www.economist.com/node/21556087

Letting microchips make a few mistakes here and there could make them much faster and more energy-efficient.

Managing the probability of errors and limiting where they occur can ensure that the errors do not cause any problems. The result of a mathematical calculation, for example, need not always be calculated precisely—an accuracy of two or three decimal places is often enough. Dr Palem offers the analogy of a person about to cross a big room. Rather than wasting time and energy calculating the shortest path, it’s better just to start walking in roughly the right direction.
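Palem's trade-off, spending less effort for an answer that is merely near enough, can be sketched with a toy software analogue. This is only an assumed illustration of the precision-versus-effort idea, not the probabilistic hardware the article describes: Newton's method for a square root, where a looser tolerance buys an acceptable answer in fewer iterations.

```python
def newton_sqrt(x, tol):
    """Approximate sqrt(x) by Newton's method; a looser tol means fewer steps."""
    guess, steps = x / 2.0, 0
    while abs(guess * guess - x) > tol:
        guess = (guess + x / guess) / 2.0
        steps += 1
    return guess, steps

# A "sloppy" answer, good to a couple of decimal places, in very few steps:
sloppy, few_steps = newton_sqrt(2.0, 1e-4)
# A precise answer costs more iterations:
precise, many_steps = newton_sqrt(2.0, 1e-12)
```

The point mirrors the room-crossing analogy: if two or three decimal places are enough, the extra iterations, like the precisely computed path across the room, are wasted effort.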

]]>
Tue, 05 Jun 2012 09:18:58 -0700 http://www.economist.com/node/21556087
<![CDATA[The Politically Incorrect Guide to Ending Poverty]]> http://www.theatlantic.com/magazine/archive/2010/07/the-politically-incorrect-guide-to-ending-poverty/8134

In the 1990s, Paul Romer revolutionized economics. In the aughts, he became rich as a software entrepreneur. Now he’s trying to help the poorest countries grow rich—by convincing them to establish foreign-run “charter cities” within their borders. Romer’s idea is unconventional, even neo-colonial—the best analogy is Britain’s historic lease of Hong Kong. And against all odds, he just might make it happen.

Halfway through the 12th century, and a long time before economists began pondering how to turn poor places into rich ones, the Germanic prince Henry the Lion set out to create a merchant’s mecca on the lawless Baltic coast. It was an ambitious project, a bit like trying to build a new Chicago in modern Congo or Iraq. Northern Germany was plagued by what today’s development gurus might delicately call a “bad-governance equilibrium,” its townships frequently sacked by Slavic marauders such as the formidable pirate Niclot the Obotrite. But Henry was not a mouse.

]]>
Sat, 12 Jun 2010 09:21:00 -0700 http://www.theatlantic.com/magazine/archive/2010/07/the-politically-incorrect-guide-to-ending-poverty/8134
<![CDATA[A Provisional Theory of Non-Sites: Robert Smithson]]> http://www.robertsmithson.com/essays/provisional.htm

By drawing a diagram, a ground plan of a house, a street plan to the location of a site, or a topographic map, one draws a "logical two dimensional picture." A "logical picture" differs from a natural or realistic picture in that it rarely looks like the thing it stands for. It is a two dimensional analogy or metaphor - A is Z.

The Non-Site (an indoor earthwork)* is a three dimensional logical picture that is abstract, yet it represents an actual site in N.J. (The Pine Barrens Plains). It is by this dimensional metaphor that one site can represent another site which does not resemble it - this is The Non-Site. To understand this language of sites is to appreciate the metaphor between the syntactical construct and the complex of ideas, letting the former function as a three dimensional picture which doesn't look like a picture. "Expressive art" avoids the problem of logic; therefore it is not truly abstract. A logical intuition can develop in an entirely "new sense of metaphor" free of na

]]>
Sun, 07 Mar 2010 15:53:00 -0800 http://www.robertsmithson.com/essays/provisional.htm
<![CDATA[De-constructing 'code' (picking apart its assumptions)]]> http://ask.metafilter.com/mefi/144810

De-constructing 'code': I am looking for philosophical (from W. Benjamin through to post-structuralism and beyond) examinations of 'code'. That includes both the assumptions contained in the word 'code' and any actual objects or subjects that code is connected to - including, but not limited to: computer programming, cyphers, linguistics, genetics etc. I am looking to question the assumptions of 'code'. Perhaps a specific example of a theorist de-constructing the term.

I am currently knee deep in an examination of certain practices and assumptions that have arisen from digital media/medium and digital practice (art and making in the era of data packets and compression-artefacts for example). Through my analysis I wish to investigate the paradigms of text and writing practice (the making of textual arts).

A simple analogy to this process would be looking at dialectic cultures (speech based) from the perspective/hindsight of a grapholectic culture (writing/print based). In a similar way, I want to examine writing, film and their making with the hindsight of digital paradigms.

I am aware of the works of Deleuze, Derrida, Barthes, Genette, Ong, Serres, Agamben etc. but any of their works that deal specifically with 'code' would be very very useful.

I look forward to any pointers you can give me.

]]>
Tue, 02 Feb 2010 06:35:00 -0800 http://ask.metafilter.com/mefi/144810
<![CDATA[On Seeing (an Imitation)]]> http://www.3quarksdaily.com/3quarksdaily/2010/01/on-seeing-an-imitation.html

by Daniel Rourke

“Mimesis here is not the representation of one thing by another, the relation of resemblance or of identification between two beings, the reproduction of a product of nature by a product of art. It is not the relation of two products but of two productions. And of two freedoms... 'True' mimesis is between two producing subjects and not between two produced things.”

Jacques Derrida, Economimesis

Enlarged pupil (an eye with iritis)
As the day drew closer to its end so I strained my eyes to compensate. A milieu of symbols littered my computer screen, each connected to a staccato breach between breath and tongue. And in conjunction, fused one to another in a series, these symbols formed words and concepts, visions and ideas to which I felt an obligation.

I was designing a book, turning a text into a form through the processes of a computer design interface. The semblance of a page confronted each turn of my wrist or tap of finger, until the virtual book lay splayed open, its central fissure dilating as the words grew bigger or shrank to barely perceptible pricks of black. By manipulating the interface I could expand letters until they inked out the screen, or, in turn, spiral to infinite distance, turning definite symbols into the pixels of a cloud.

This process of making occurred at a virtual distance to me and yet, as the nights rolled onwards, this work was limiting my ability to see.

The doctor examined my right eye. I had iritis, a strain of the pupil with no particular cause, except perhaps for its over-use: for one's over-reliance on its mechanical operation. Since my right eye was the stronger of the two it had over-compensated at each dimming of the day, allowing my left eye to relax as the symbols of my book whirled on. The strain resulted in a blood-shot appearance accompanied by a searing, throbbing pain. It hurt to see, and even more so to look. It hurt because looking was its cause.

Standing at the base of the Southern tower I arced my neck back as far as I dared. As the horizon descended into my stomach I could just about perceive the towers' tallest corners, pinching at sky. How many coins did it take to build these things? And how many steps was I expected to ascend in order to get to the 'observation deck'?

In exchange for my tiny coin I fathomed a giant network called 'New York'. From up here everything was horizon: the imaginary boundary between earth and sky that moves with respect to one's position.

In 2001 the two towers tumbled. How profane their figures seem now. How could it be that these prisms, designed and built in the 1960s, opened and occupied in the 1970s, witnesses to boom in the 80s and bust in the 90s, would come to stand for all the tumult and turmoil, striving and hope of our newest century?

The precision of the prism – flat, grey surfaces observed in isometric space – will forever be bound to these charismatic towers built of steel, concrete and capital. That they now stand as symbols effaces their identity in time or in space. They will always be contemporary, so long as cities are built and planes soar through the skies above them. Looking back at them it is now I that stand on the horizon. Yet, howsoever I alter my vision, the towers stay solid and fixed to their position, being at one and the same time the landscape, the illumination and the roving eye.

'Office Block With Twin' by Koizumi Meiro, 2006

Idiopathic is an adjective used primarily in medicine meaning arising spontaneously or from an obscure or unknown cause. From Greek ἴδιος, idios (one's own) + πάθος, pathos (suffering), it means approximately "a disease of its own kind."

extract from Wikipedia

In 2006 Koizumi Meiro tore pages from pornographic magazines. Over images composed of two erotically entwined women he painted tones of grey. The resulting collages speak of capture, of closure and the banal. They are severely a-erotic, displaying none of the titillation that their originary magazines wished upon their audience. The women's heads have been disembodied, or more precisely, have been relocated onto the bodies of twin prisms. Does Meiro's objectification of these women mirror the objectification they suffer under the guise of the erotic gaze? Perhaps. What draws me into the images though, and what emerges most strikingly as I look upon them, is a haunting sense of recognition. This simplified, perfect horizon, these strutting prisms of grey mirror the defining twin icons of our era. Captured, closed off and made banal to my mind by the passing of time, by their over admittance into the symbolic syntax of the new century.

My recognition is itself an imitation, such that seeing and looking are intertwined.

A focal point rushes to meet me, like a pupil contracting as the first band of sun breaches an ever distant imaginary line.

Cargo Cult

In the 1940s the Southwest Pacific Ocean became of fundamental strategic importance for both the Japanese and American forces. After establishing bases on a range of Melanesian and Micronesian islands the US Military settled into the routines of war.

To the native peoples of these islands the military presence signified a complete over-turning of the natural order. Within a few months the beaches and grasslands were transformed into encampments and runways, and as the war effort ensued the skies above must have seemed filled with the buzz of alien craft. The native people came to know American society through the exchange of commodities and the gestures of an unknown tongue. As planes soared overhead and countless ships descended over the horizon the islands became saturated with cargo of all kinds, from cans of Coca-Cola to livestock the likeness of which the islanders had never seen.

Much has been written of the so called 'Cargo Cults' which later emerged on these islands. Strange rituals still carried out today seem to hark back to those formative years when Western civilisation first imposed itself on the native islanders. Islanders build imitation planes and runways from straw and dirt; act out military processions with bamboo guns slung over their shoulders. In order to bring back the abundance of cargo that used to land on their islands the native people appear to be imitating the conditions under which its arrival used to occur.

Ritual obtains a value at the meeting point between the thing imitated and the imitation. Ritual is action, but it is also object. It is natural because it is always a copy; repeated whilst never attaining perfect resemblance; repeated to bring into order the miasma of our visions.

With work there is always consequence, both intended and in excess. For the tribal communities of the cargo islands the dividing lines between nature and ritual, between alien technology and the routines of war must have seemed identical. A resemblance, a dividing line, that was worthy of imitation whether it brought cargo or not.

We cannot know what they saw. We can only imitate an idea of their seeing by analogy with the kind of seeing we consider in ourselves.

Upon the arrival of the American Military in the Southwestern Pacific there was a lot more to see than had been seen before.

“Why should we be at all interested in perceiving the obscurity that emanates from the epoch? Is darkness not precisely an anonymous experience that is by definition impenetrable; something that is not directed at us and thus cannot concern us? On the contrary, the contemporary is the person who perceives the darkness of his time as something that concerns him. Darkness is something that – more than any light – turns directly and singularly toward him. The contemporary is the one whose eyes are struck by the beam of darkness that comes from his own time.”

Giorgio Agamben, What is The Contemporary?

The eye-drops soothed the burning pain, but they also gave me chronic photophobia, such that stepping out into daylight was excruciating. I needed to let my eye rest, and this meant shutting off its ability to work. Whether the light was dim or bright, whether the object of my attention was near or far, the muscles around my pupil lay dormant. I considered the world through a pupil locked at its fullest expanse. The light gushed in.

In place of depth, of shade and colour, there now existed a miasma which my left eye alone could not navigate. The physical frames of everyday life were impossible to attenuate. It was as if upon being freed from the shallow glare of the computer screen I had stumbled into a space between signified and signifier. Everything was flattened to the status of an interface, but an interface that led nowhere and manipulated nothing.

My book had been printed and bound. I could hold it in my hands, flick through its pages. In real space I could consider it, scanning its lines and paragraphs with my working eye. Wearing a makeshift eye patch or a pair of sunglasses I was able to avoid headaches and spatial confusion. But upon holding the very object whose making had rendered my right eye useless I was overcome with a different kind of dislocation.

Was this the book I had designed on my computer? It bore a resemblance, there was even a sense that my fingers had observed it before, the memory of its movements surfacing as I turned it over in my hands. But this sense did not transfer to the content of the book, to the meaning that emerged when words were read in conjunction, and pages, phrases, paragraphs and footnotes came to meet each other in endless variation. I recognised the words themselves, but I did not recognise from where they had come. I saw the book's space, time and content, yet I could not see its work.

Between seeing and looking which paradigm was closest to this work: the roving eye or the mind engaged in making?


“To go beyond is to communicate with ideas, to understand. Does not the function of art lie in not understanding?... Art does not know a particular type of reality; it contrasts with knowledge. It is the very event of obscuring, a descent of the night, an invasion of shadow.”

Emmanuel Levinas, Reality and Its Shadow

]]>
Sun, 24 Jan 2010 21:04:00 -0800 http://www.3quarksdaily.com/3quarksdaily/2010/01/on-seeing-an-imitation.html
<![CDATA[Beyond the Topology of the Book]]> http://machinemachine.net/text/featured/beyond-the-topology-of-the-book

…the human perceptive apparatus [has] a potential to break with action and self-organisation: to see as such, without that point of view being folded around my organizing striving centre. It is precisely the image of bounded life that Deleuze sees as the illusion that has dominated philosophy and that is overcome in the radical connections of art. - Claire Colebrook, Deleuze: A Guide for the Perplexed

Books are passive, denying their rigid topologies only as their pages are turned to meet each other, face-to-face. Unlike writers and readers books do not converse, do not react to stimuli, do not alter over time. Unlike a group of readers there is always only one book, and although one book may be understood a thousand ways no single book can exhibit even one of those thousand to any new reader who happens by. The human apparatus is cajoled by the book-medium into an order which delimits the extent to which the human can interface with its content.

Our natural inclination is to perceive the act of writing as happening on fresh ground. The writer’s movement, of the pen or through the word-processor, gouges marks in the page that the reader re-traces. This analogy, though, forgets the temporal dimension of the writing act. If a writer diverges from their original pathway, or backsteps in order to begin a new one, the printed page conceals their indecisive movements. At the level of the interface - the printed and bound book - only the writer’s final path is available for the reader to follow.

New mediums, such as web browsers and ebook readers, have the potential to store these divergent pathways in branching archives of potential. And for the first time in history the reader’s habits may also be gouged into the digital medium, such that a thousand readers may meet with a thousand writers, each able to marvel at the movements of the other. Writing and reading have always happened against the illusion of permanent boundary provided by the scroll, the page, the book and the manuscript. If the medium had allowed it every pathway would have overlapped, in time, writing the acts of movement, of perception and incomprehension, into the surface of the bounded page.
Like the Desire Lines made as we navigate our physical environments, exchange between text and interface should create Desire Lines through repetition and reflexion - lines that do not dictate our desires, but allow them to break free from the topologies the medium insists we traverse.
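The "branching archives of potential" imagined above can be pictured as a small revision tree. What follows is a toy sketch only; the `Revision` class and its fields are invented for illustration and are not drawn from any actual ebook or browser format:

```python
# A minimal sketch of a branching revision archive: every edit becomes a
# node, and abandoned pathways survive alongside the final text.
class Revision:
    def __init__(self, text, parent=None):
        self.text = text          # the state of the passage at this point
        self.parent = parent      # the pathway this revision diverged from
        self.children = []        # pathways that later diverged from here
        if parent is not None:
            parent.children.append(self)

    def pathways(self):
        """Count every ending reachable from this revision --
        the printed book shows one; the archive keeps them all."""
        if not self.children:
            return 1
        return sum(child.pathways() for child in self.children)

draft = Revision("The towers stood against the sky.")
abandoned = Revision("The towers loomed over the sky.", parent=draft)
final = Revision("The towers stood, pinching at sky.", parent=draft)
print(draft.pathways())  # 2 -- the page would have shown only 1
```

Where the printed page keeps only one leaf of such a tree, the archive keeps every branch, the writer's indecisive movements included.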

]]>
Wed, 27 May 2009 09:01:00 -0700 http://machinemachine.net/text/featured/beyond-the-topology-of-the-book
<![CDATA[The Next Great Discontinuity: The Data Deluge]]> http://www.3quarksdaily.com/3quarksdaily/2009/04/the-next-great-discontinuity-part-two.html

Speed is the elegance of thought, which mocks stupidity, heavy and slow. Intelligence thinks and says the unexpected; it moves with the fly, with its flight. A fool is defined by predictability… But if life is brief, luckily, thought travels as fast as the speed of light. In earlier times philosophers used the metaphor of light to express the clarity of thought; I would like to use it to express not only brilliance and purity but also speed. In this sense we are inventing right now a new Age of Enlightenment… A lot of… incomprehension… comes simply from this speed. I am fairly glad to be living in the information age, since in it speed becomes once again a fundamental category of intelligence.

Michel Serres, Conversations on Science, Culture and Time

(Originally published at 3quarksdaily · Link to Part One)

Human beings are often described as the great imitators: We perceive the ant and the termite as part of nature. Their nests and mounds grow out of the Earth. Their actions are indicative of a hidden pattern being woven by natural forces from which we are separated. The termite mound is natural, and we, the eternal outsiders, sitting in our cottages, our apartments and our skyscrapers, are somehow not.

Through religion, poetry, or the swift skill of the craftsman smearing pigment onto canvas, humans aim to encapsulate that quality of existence that defies simple description. The best art, or so it is said, brings us closer to attaining a higher truth about the world that remains elusive to language, that perhaps the termite itself embodies as part of its nature. Termite mounds are beautiful, but were built without a concept of beauty. Termite mounds are mathematically precise, yet crawling through their intricate catacombs cannot be found one termite in comprehension of even the simplest mathematical constituent. In short, humans imitate and termites merely are.

This extraordinary idea is partly responsible for what I referred to in Part One of this article as The Fallacy of Misplaced Concreteness. It leads us to consider not only the human organism as distinct from its surroundings, but it also forces us to separate human nature from its material artefacts. We understand the termite mound as integral to termite nature, but are quick to distinguish the axe, the wheel, the book, the skyscraper and the computer network from the human nature that bore them. When we act, through art, religion or with the rational structures of science, to interface with the world our imitative (mimetic) capacity has both subjective and objective consequence. Our revelations, our ideas, stories and models have life only insofar as they have a material through which to become invested.
The religion of the dance, the stone circle and the summer solstice is mimetically different to the religion of the sermon and the scripture because the way it interfaces with the world is different. Likewise, it is only with the consistency of written and printed language that the technical arts could become science, and through which our ‘modern’ era could be built. Dances and stone circles relayed mythic thinking structures, singular, immanent and ethereal in their explanatory capacities. The truth revealed by the stone circle was present at the interface between participant, ceremony and summer solstice: a synchronic truth of absolute presence in the moment. Anyone reading this will find truth and meaning through grapholectic interface. Our thinking is linear, reductive and bound to the page. It is reliant on a diachronic temporality that the pen, the page and the book hold in stasis for us.

Imitation alters the material world, which in turn affects the texture of further imitation. If we remove the process from its material interface we lose our objectivity. In doing so we isolate the single termite from its mound and, after much careful study, announce that we have reduced termite nature to its simplest constituent.

The reason for the tantalizing involutions here is obviously that intelligence is relentlessly reflexive, so that even the external tools that it uses to implement its workings become ‘internalized’, that is, part of its own reflexive process… To say writing is artificial is not to condemn it but to praise it. Like other artificial creations and indeed more than any other, it is utterly invaluable and indeed essential for the realisation of fuller, interior, human potentials. Technologies are not mere exterior aids but also interior transformations of consciousness, and never more than when they affect the word.

Walter J. Ong, Orality and Literacy

Anyone reading this article cannot fail to be aware of the changing interface between eye and text that has taken place over the past two decades or so. New Media – everything from the internet database to the Blackberry – has fundamentally changed the way we connect with each other, but it has also altered the way we connect with information itself. The linear, diachronic substance of the page and the book has given way to a dynamic textuality blurring the divide between authorship and readership, expert testament and the simple accumulation of experience.

The main difference between traditional text-based systems and newer, data-driven ones is quite simple: it is the interface. Eyes and fingers manipulate the book, turning over pages in a linear sequence in order to access the information stored in its printed figures. For New Media, for the digital archive and the computer storage network, the same information is stored sequentially in databases which are themselves hidden to the eye. To access them one must submit a search or otherwise run an algorithm that mediates the stored data for us. The most important distinction should be made at the level of the interface, because, although the database as a form has changed little over the past 50 years of computing, the Human Control Interfaces (HCI) we access and manipulate that data through are always passing from one iteration to another.

Stone circles interfacing the seasons stayed the same, perhaps being used in similar rituals over the course of a thousand years of human cultural accumulation. Books, interfacing text, language and thought, stay the same in themselves from one print edition to the next, but as a format, books have changed very little in the few hundred years since the printing press. The computer HCI is most different from the book in that change is integral to its structure.
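The contrast between the two interfaces can be made concrete in a few lines of toy code; the page contents and the index here are invented purely for illustration:

```python
# The book: information reached by turning pages in a fixed, linear sequence.
pages = ["page on termites", "page on mimesis", "page on stone circles"]

def read_linearly(pages, topic):
    """Scan page by page, as eyes and fingers do, until the topic appears."""
    for number, page in enumerate(pages, start=1):
        if topic in page:
            return number
    return None

# The database: the same information, stored out of sight and reached only
# through a query that an algorithm mediates for us.
index = {"termites": 1, "mimesis": 2, "stone circles": 3}

def query(index, topic):
    """No page-turning: the interface hands back the result directly."""
    return index.get(topic)

print(read_linearly(pages, "mimesis"))  # 2 -- found by traversal
print(query(index, "mimesis"))          # 2 -- found by mediation
```

Both routes arrive at the same information; what differs, as the paragraph above argues, is the interface through which it is reached.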
To touch a database through a computer terminal, through a Blackberry or iPhone, is to play with data at incredible speed:

Sixty years ago, digital computers made information readable. Twenty years ago, the Internet made it reachable. Ten years ago, the first search engine crawlers made it a single database. Now Google and like-minded companies are sifting through the most measured age in history, treating this massive corpus as a laboratory of the human condition… Kilobytes were stored on floppy disks. Megabytes were stored on hard disks. Terabytes were stored in disk arrays. Petabytes are stored in the cloud. As we moved along that progression, we went from the folder analogy to the file cabinet analogy to the library analogy to — well, at petabytes we ran out of organizational analogies. At the petabyte scale, information is not a matter of simple three- and four-dimensional taxonomy and order but of dimensionally agnostic statistics… This is a world where massive amounts of data and applied mathematics replace every other tool that might be brought to bear. Out with every theory of human behavior, from linguistics to sociology. Forget taxonomy, ontology, and psychology. Who knows why people do what they do? The point is they do it, and we can track and measure it with unprecedented fidelity. With enough data, the numbers speak for themselves.

Wired Magazine, The End of Theory, June 2008

And as the amount of data has expanded exponentially, so have the interfaces we use to access that data and the models we build to understand that data. On the day that Senator John McCain announced his Vice Presidential Candidate the best place to go for an accurate profile of Sarah Palin was not the traditional media: it was Wikipedia. In an age of instant, global news, no newspaper could keep up with the knowledge of the cloud. The Wikipedia interface allowed knowledge about Sarah Palin from all levels of society to be filtered quickly and efficiently in real-time. Wikipedia acted as encyclopaedia, newspaper, discussion group and expert all at the same time, and it did so completely democratically and in the absence of a traditional management pyramid. The interface itself became the thinking mechanism of the day, as if the notes every reader scribbled in the margins had been instantly cross-checked and added to the content.

In only a handful of years the human has gone from merely dipping into the database to becoming an active component in a human-cloud of data. The interface has begun to reflect back upon us, turning each of us into a node in a vast database bigger than any previous material object. Gone are the days when clusters of galaxies had to be catalogued by an expert and entered into a linear taxonomy. Now, the same job is done by the crowd and the interface, allowing a million galaxies to be catalogued by amateurs in the same time it would have taken a team of experts to classify a tiny percentage of the same amount. This method of data mining is called ‘crowdsourcing’ and it represents one of the dominant ways in which raw data will be turned into information (and then knowledge) over the coming decades. Here the cloud serves as more than a metaphor for the group-driven interface, becoming a telling analogy for the trans-grapholectic culture we now find ourselves in.
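At its simplest, the crowdsourced cataloguing described above reduces to a majority vote over many amateur judgements. A minimal sketch, assuming only that each volunteer submits one label per galaxy (the labels and vote counts are invented for illustration):

```python
from collections import Counter

def classify_by_crowd(votes):
    """Return the crowd's consensus label and its share of the vote."""
    tally = Counter(votes)                 # count each label's supporters
    label, count = tally.most_common(1)[0] # the most popular label wins
    return label, count / len(votes)

# Six volunteers classify the same galaxy image; none need be an expert.
votes = ["spiral", "spiral", "elliptical", "spiral", "irregular", "spiral"]
label, share = classify_by_crowd(votes)
print(label)  # spiral -- the consensus, despite individual disagreement
```

No single classifier need be right for the consensus to be reliable; with enough votes the noise of individual error washes out, which is the statistical heart of the method.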
To grasp the topological shift in our thought patterns it pays to move beyond the interface and look at a few of the linear, grapholectic models that have undergone change as a consequence of the information age. One of these models is evolution, a biological theory the significance of which we are still in the process of discerning:

If anyone now thinks that biology is sorted, they are going to be proved wrong too. The more that genomics, bioinformatics and many other newer disciplines reveal about life, the more obvious it becomes that our present understanding is not up to the job. We now gaze on a biological world of mind-boggling complexity that exposes the shortcomings of familiar, tidy concepts such as species, gene and organism. A particularly pertinent example [was recently provided in New Scientist] - the uprooting of the tree of life which Darwin used as an organising principle and which has been a central tenet of biology ever since. Most biologists now accept that the tree is not a fact of nature - it is something we impose on nature in an attempt to make the task of understanding it more tractable. Other important bits of biology - notably development, ageing and sex - are similarly turning out to be much more involved than we ever imagined. As evolutionary biologist Michael Rose at the University of California, Irvine, told us: “The complexity of biology is comparable to quantum mechanics.”

New Scientist, Editorial, January 2009

As our technologies became capable of gathering more data than we were capable of comprehending, a new topology of thought, reminiscent of the computer network, began to emerge. For the mindset of the page and the book science could afford to be linear and diachronic. In the era of The Data Deluge science has become more cloud-like, as theories for everything from genetics to neuroscience, particle physics to cosmology have shed their linear constraints. Instead of seeing life as a branching tree, biologists are now speaking of webs of life, where lineages can intersect and interact, where entire species are ecological systems in themselves. As well as seeing the mind as an emergent property of the material brain, neuroscience and philosophy have started to consider the mind as manifest in our extended, material environment. Science has exploded, and picking up the pieces will do no good. Through the topology of the network we have begun to perceive what Michel Serres calls ‘The World Object’, an ecology of interconnections and interactions that transcends and subsumes the causal links propounded by grapholectic culture. At the limits of science a new methodology is emerging at the level of the interface, where masses of data are mined and modelled by systems and/or crowds which themselves require no individual understanding to function efficiently. Where once we studied events and ideas in isolation we now devise ever more complex, multi-dimensional ways for those events and ideas to interconnect; for data sources to swap inputs and outputs; for outsiders to become insiders. Our interfaces are in constant motion, on trajectories that curve around to meet themselves, diverge and cross-pollinate.
Thought has finally been freed from temporal constraint, allowing us to see the physical world, life, language and culture as multi-dimensional, fractal patterns, winding the great yarn of (human) reality:

The advantage that results from it is a new organisation of knowledge; the whole landscape is changed. In philosophy, in which elements are even more distanced from one another, this method at first appears strange, for it brings together the most disparate things. People quickly criticize me for this… But these critics and I no longer have the same landscape in view, the same overview of proximities and distances. With each profound transformation of knowledge come these upheavals in perception.

Michel Serres, Conversations on Science, Culture and Time

]]>
Tue, 05 May 2009 07:35:00 -0700 http://www.3quarksdaily.com/3quarksdaily/2009/04/the-next-great-discontinuity-part-two.html