MachineMachine /stream - search for exaptation https://machinemachine.net/stream/feed en-us http://blogs.law.harvard.edu/tech/rss LifePress therourke@gmail.com <![CDATA[Journal Contribution: Exaptation and the Digital Now]]> http://median.newmediacaucus.org/caa-edition/

Earlier this year I devised and delivered the New Media Caucus sponsored panel and journal editorial ‘Exaptation and the Digital Now’, with Zara Dinnen, Rob Gallagher and Alex Myers:

Exaptation and the Digital Now: INTRODUCTION
Case Study #1: Holoback – Zara Dinnen
Case Study #2: The Phantom Zone – Daniel Rourke
Case Study #3: Fire in the Hole – The Obviously Non-Short History of Art Games – Alex Myers
Case Study #4: Exaptation, Interpretation, PlayStation – Rob Gallagher

The panel took place at the College Art Association annual conference, Chicago, February 14th 2014. Our write-up was featured in the New Media Caucus journal’s CAA 2014 conference edition. Click through for each of our papers and the specially extended introduction:

Evolution is a dominant metaphor for thinking about and describing the processes of new technologies; we believe ‘exaptation’ offers a more productive, nuanced approach to questions of adaptation and co-option that surround digital media. [8] According to Svetlana Boym in her essay “The Off-Modern Mirror:”

Exaptation is described in biology as an example of ‘lateral adaptation,’ which consists in a co-option of a feature for its present role from some other origin… Exaptation is not the opposite of adaptation; neither is it merely an accident, a human error or lack of scientific data that would in the end support the concept of adaptation. Exaptation questions the very process of assigning meaning and function in hindsight, the process of assigning the prefix ‘post’ and thus containing a complex phenomenon within the grid of familiar interpretation. [9]

Media is replete with exaptations. Features specific to certain media are exapted – co-opted – as matters of blind chance, convenience, technical necessity, aesthetics, and even fashion. Narratives of progress cannot account for the ways technologies branch out or are reused, misused, and abused across communities and networks. Exaptation offers a way to think about digital culture not as ever-newer, ever-faster, ever-more-seamless, but rather as something that must always negotiate its own noisy history. Yesterday’s incipient hardware becomes the ordering mechanism of today’s cultural affects: a complex renewal that calls into question established notions of utility, value, and engendered experience. Exaptation accounts for features now considered integral to media without falling back into narratives that appear to anticipate what one could not anticipate.

This article is a collaborative work that brings together the four co-authors’ various responses to the provocation of exaptation. In what follows, exaptation is put into play as a model to help unsettle dominant narratives about the digital image in particular. Considering the digital image in its various guises – as animated GIFs, poor images, art games, hardware, and holograms – this article will trace the traits that jump between media and metaphor, complicating linear narratives of progression and reductive readings of remediation associated with new media. [10]

]]>
Sun, 31 Aug 2014 06:59:28 -0700 http://median.newmediacaucus.org/caa-edition/
<![CDATA[Exaptation & managed serendipity: II - Cognitive Edge Network Blog]]> http://cognitive-edge.com/blog/entry/5575/exaptation-managed-serendipity-ii/

In this second of three posts on exaptation I am going to continue reporting, and building on, the discussions and ideas that came out of the Durham conference. In the final post I'll pick up on what I presented (and what I wish I had thought of presenting at the time) on managed serendipity.

]]>
Wed, 26 Feb 2014 09:06:49 -0800 http://cognitive-edge.com/blog/entry/5575/exaptation-managed-serendipity-ii/
<![CDATA[Exaptation & managed serendipity: I - Cognitive Edge Network Blog]]> http://cognitive-edge.com/blog/entry/5573/exaptation-managed-serendipity-part-i/

The feather originally evolved for the regulation of temperature, but was later co-opted for flight. In 1942 a scientist at Raytheon was testing a magnetron, a key component of radar, and noticed that a candy bar had melted in his pocket.

]]>
Wed, 26 Feb 2014 09:06:44 -0800 http://cognitive-edge.com/blog/entry/5573/exaptation-managed-serendipity-part-i/
<![CDATA[Conference Panel: ‘Exaptation and the Digital Now’]]> http://www.scribd.com/doc/215721615/Panel-Delivered-‘Exaptation-and-the-Digital-Now

College Art Association annual conference, New Media Caucus affiliated panel ‘Exaptation and the Digital Now’, with Zara Dinnen, Rob Gallagher and Alex Myers, Chicago, February 14th 2014

]]>
Fri, 14 Feb 2014 07:44:05 -0800 http://www.scribd.com/doc/215721615/Panel-Delivered-‘Exaptation-and-the-Digital-Now
<![CDATA[Rigid Implementation vs Flexible Materiality]]> http://machinemachine.net/text/research/rigid-implementation-vs-flexible-materiality

Wow. It’s been a while since I updated my blog. I intend to get active again here soon, with regular updates on my research. For now, I thought it might be worth posting a text I’ve been mulling over for a while (!) Yesterday I came across this old TED presentation by Daniel Hillis, and it set off a bunch of bells tolling in my head. His book The Pattern on the Stone was one I leafed through a few months back whilst hunting for some analogies about (digital) materiality. The resulting brainstorm is what follows. (This blog post, from even longer ago, acts as a natural introduction: On (Text and) Exaptation)

In the 1960s and 70s Roland Barthes named “The Text” as a network of production and exchange. Whereas “the work” was concrete, final – analogous to a material – “the text” was more like a flow, a field or event – open ended. Perhaps even infinite. In From Work to Text, Barthes wrote: “The metaphor of the Text is that of the network…” (Barthes 1979) This semiotic approach to discourse, by initiating the move from print culture to “text” culture, also helped lay the ground for a contemporary politics of content-driven media. Skipping backwards through From Work to Text, we find this statement: “The text must not be understood as a computable object. It would be futile to attempt a material separation of works from texts.” I am struck here by Barthes’ use of the phrase “computable object”, as well as his attention to the “material”. Katherine Hayles, in her essay ‘Print Is Flat, Code Is Deep’ (Hayles 2004), teases out the statement for us:

‘computable’ here mean[s] to be limited, finite, bound, able to be reckoned. Written twenty years before the advent of the microcomputer, his essay stands in the ironic position of anticipating what it cannot anticipate. It calls for a movement away from works to texts, a movement so successful that the ubiquitous ‘text’ has all but driven out the media-specific term book.

Hayles notes that the “ubiquity” of Barthes’ term “Text” allowed – in its wake – an erasure of media-specific terms, such as “book”. In moving from The Work to The Text, we move not just between different politics of exchange and dissemination, we also move between different forms and materialities of mediation. (Manovich 2002) For Barthes the material work was computable, whereas the network of the text – its content – was not.

In 1936, the year that Alan Turing wrote his iconic paper ‘On Computable Numbers’, a German engineer by the name of Konrad Zuse built the first working digital computer. Like its industrial predecessors, Zuse’s computer was designed to function via a series of holes encoding its program. Born as much out of convenience as financial necessity, Zuse punched his programs directly into discarded reels of 35mm film-stock. Fused together by the technologies of weaving and cinema, Zuse’s computer announced the birth of an entirely new mode of textuality. The Z3, the world’s first working programmable, fully automatic computer, arrived in 1941. (Manovich 2002)

A few years earlier a young graduate by the name of Claude Shannon had published one of the most important master’s theses in history. In it he demonstrated that any logical expression of Boolean algebra could be programmed into a series of binary switches. Today computers still function with a logic indistinguishable from that of their mid-20th century ancestors. What has changed is the material environment within which Boolean expressions are implemented. Shannon’s work first found itself manifest in the fragile rows of vacuum tubes that drove much of the technical innovation of the 40s and 50s. In time, the very same Boolean expressions were firing, domino-like, through millions of transistors etched onto the surface of silicon chips. If we were to query the young Shannon today, he might well gawp in amazement at the material advances computer technology has gone through. But if Shannon were to examine either your digital wrist watch or the world’s most advanced supercomputer in detail, he would once again feel at home in the simple binary – on/off – switches lining those silicon highways. Here the difference between how computers are implemented and what computers are made of digs the first of many potholes along our journey.

We live in an era not only practically driven by the computer, but an era increasingly determined by the metaphors computers have injected into our language. Let us not make the mistake of presupposing that brains (or perhaps minds) are “like” computers. Tempting though it is to reduce the baffling complexities of the human being to the functions of the silicon chip, the parallel processor or the Wide Area Network, this reduction occurs most usefully at the level of metaphor and metonym. Again the mantra must be repeated that computers function through the application of Boolean logic and binary switches, something that cannot be said about the human brain with any confidence a posteriori. Later I will explore the consequences of the processing paradigm for our own understanding of ourselves, but for now, or at least the next few paragraphs, computers are to be considered in terms of their rigid implementation and flexible materiality alone.

At the beginning of his popular science book The Pattern on the Stone (Hillis 1999), W. Daniel Hillis narrates one of his many tales on the design and construction of a computer. Built from tinker-toys, the computer in question was/is functionally complex enough to “play” tic-tac-toe (noughts and crosses). The tinker-toy was chosen to indicate the apparent simplicity of computer design, but as Hillis argues himself, he may very well have used pipes and valves to create a hydraulic computer, driven by water pressure, or stripped the design back completely, using flowing sand, twigs and twine or any other recipe of switches and connectors.
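Shannon’s insight translates almost directly into code. The sketch below is my own illustration, not Hillis’s or Shannon’s: ordinary Python booleans stand in for physical on/off switches – relays, valves, tinker-toy strings or transistors – and are composed into a half-adder. All names are illustrative only.

```python
# A minimal sketch (my own, not from the text) of Shannon's insight:
# any Boolean expression can be realised as a network of on/off switches.
# The "switches" below are stand-ins for relays, tubes or transistors.

def AND(a: bool, b: bool) -> bool:
    # two switches in series: the circuit closes only if both are closed
    return a and b

def OR(a: bool, b: bool) -> bool:
    # two switches in parallel: the circuit closes if either is closed
    return a or b

def NOT(a: bool) -> bool:
    # an inverting switch: closed when its input is open
    return not a

def half_adder(a: bool, b: bool) -> tuple[bool, bool]:
    """Add two binary digits using only the gates above."""
    total = AND(OR(a, b), NOT(AND(a, b)))  # XOR composed from AND, OR, NOT
    carry = AND(a, b)
    return total, carry

# 1 + 1 = 0, carry 1 - whatever the switches happen to be made of
assert half_adder(True, True) == (False, True)
assert half_adder(True, False) == (True, False)
assert half_adder(False, False) == (False, False)
```

However the switches are realised, the logical pattern – the rigid implementation, in the terms set out below – stays the same.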
The important point is that the tinker-toy tic-tac-toe computer functions perfectly well for the task it is designed for – perfectly well, that is, until the tinker-toy material begins to fail. This failure is what Chapter 1 of this thesis is about: why it happens, why its happening is a material phenomenon and how the very idea of “failure” is suspect. Tinker-toys fail because the mechanical operation of the tic-tac-toe computer puts strain on the strings of the mechanism, eventually stretching them beyond practical use. In a perfect world, devoid of entropic behaviour, the tinker-toy computer may very well function forever, its users setting O or X conditions, and the computer responding according to its program in perfect, logical order. The design of the machine, at the level of the program, is completely closed; finished; perfect. Only materially does the computer fail (or flail), noise leaking into the system until inevitable chaos ensues and the tinker-toys crumble back into jumbles of featureless matter. This apparent closure is important to note at this stage because in a computer as simple as the tic-tac-toe machine, every variable can be accounted for and thus programmed for. Were we to build a chess-playing computer from tinker-toys (pretending we could get our hands on the, no doubt, millions of tinker-toy sets we’d need), the closed condition of the computer may be less simple to qualify.

Tinker-toys, hydraulic valves or whatever material you choose could be manipulated into any computer system you can imagine; even the most brain-numbingly complicated IBM supercomputer is technically possible to build from these fundamental materials. The reason we don’t do this, why we instead choose etched silicon as our material of choice for our supercomputers, exposes another aspect of computers we need to understand before their failure becomes a useful paradigm. A chess-playing computer is probably impossible to build from tinker-toys, not because its program would be too complicated, but because tinker-toys are too prone to entropy to create a valid material environment. The program of any chess-playing application could, theoretically, be translated into a tinker-toy equivalent, but after the 1,000th string had stretched, with millions more to go, no energy would be left in the system to trigger the next switch along the chain. Computer inputs and outputs are always at the mercy of this kind of entropy: whether in tinker-toys or miniature silicon highways. Noise and dissipation are inevitable at any material scale one cares to examine. The second law of thermodynamics ensures this.

Claude Shannon and his ilk knew this, even back when the most advanced computers they had at their command couldn’t yet play tic-tac-toe. They knew that they couldn’t rely on materiality to delimit noise, interference or distortion; that no matter how well constructed a computer was, no matter how incredible it was at materially stemming entropy (perhaps with stronger string connectors, or a built-in de-stretching mechanism), entropy nonetheless was inevitable. But what Shannon and other computer innovators such as Alan Turing also knew is that their saviour lay in how computers were implemented. Again, the split here is incredibly important to note:

Flexible materiality: how and of what a computer is constructed, e.g. tinker-toys, silicon.

Rigid implementation: Boolean logic enacted through binary on/off switches (usually with some kind of input → storage → feedback/program function → output). Effectively, how a computer works.
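To make the split concrete, here is a hedged sketch of my own (not drawn from the text above): the rigid implementation is a fixed Boolean program written entirely in terms of a NAND switch, while the flexible materiality is whatever happens to realise that switch. The two “materials” below are toy stand-ins, and every name is hypothetical.

```python
# Illustrative only: one rigid program, two interchangeable "materials".
from typing import Callable

Switch = Callable[[bool, bool], bool]  # a NAND gate realised in some material

def nand_silicon(a: bool, b: bool) -> bool:
    # e.g. a transistor gate on an etched chip
    return not (a and b)

def nand_tinkertoy(a: bool, b: bool) -> bool:
    # e.g. strings and spools, modelled here as a table of mechanical states
    table = {(False, False): True, (False, True): True,
             (True, False): True, (True, True): False}
    return table[(a, b)]

def program(nand: Switch, a: bool, b: bool) -> bool:
    """The rigid implementation: XOR expressed purely in NANDs."""
    m = nand(a, b)
    return nand(nand(a, m), nand(b, m))

# Identical outputs from either material, for every possible input
for a in (False, True):
    for b in (False, True):
        assert program(nand_silicon, a, b) == program(nand_tinkertoy, a, b)
```

Swap in any other “material” – pipes and valves, flowing sand, string – and the program neither knows nor cares; only its susceptibility to entropy changes.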

Boolean logic was not enough on its own. Computers, if they were to avoid entropy ruining their logical operations, needed to have built within them an error-management protocol. This protocol is still in existence in EVERY computer in the world. Effectively it takes the form of a collection of parity bits delivered alongside each packet of data that computers, networks and software deal with. The bulk of the data contains the binary bits encoding the intended quarry, but the receiving element in the system also checks the main bits against the parity bits to determine whether any noise has crept into the system. What is crucial to note here is that the error-checking of computers happens at the level of their rigid implementation. It is also worth noting that for every eight 0s and 1s delivered by a computer system, at least one of those bits is an error-checking function. (A toy sketch of such a parity check follows the bibliography below.)

W. Daniel Hillis brings the stretched strings of his tinker-toy mechanism into clear focus and, in doing so, re-introduces an umbrella term set to dominate this chapter:

I constructed a later version of the Tinker Toy computer which fixed the problem, but I never forgot the lesson of the first machine: the implementation technology must produce perfect outputs from imperfect inputs, nipping small errors in the bud. This is the essence of digital technology, which restores signals to near perfection at every stage. It is the only way we know – at least, so far – for keeping a complicated system under control. (Hillis 1999, 18)

Bibliography

Barthes, Roland. 1979. ‘From Work to Text.’ In Textual Strategies: Perspectives in Poststructuralist Criticism, ed. Josue V. Harari, 73–81. Ithaca, NY: Cornell University Press.

Hayles, N. Katherine. 2004. ‘Print Is Flat, Code Is Deep: The Importance of Media-Specific Analysis.’ Poetics Today 25 (1) (March): 67–90. doi:10.1215/03335372-25-1-67.

Hillis, W. Daniel. 1999. The Pattern on the Stone: The Simple Ideas That Make Computers Work. 1st paperback ed. New York: Basic Books.

Manovich, Lev. 2002. The Language of New Media. 1st MIT Press pbk. ed. Cambridge, Mass.: MIT Press.
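As promised above, a minimal parity sketch. This is my own toy illustration of the one-check-bit-per-frame idea described in the post, not a reproduction of any real protocol: the sender appends a single even-parity bit to seven data bits, and the receiver re-derives it to see whether noise has crept in.

```python
# Toy even-parity scheme: 7 data bits + 1 parity bit per frame (illustrative).

def add_parity(data_bits: list[int]) -> list[int]:
    """Append an even-parity bit so the total number of 1s is even."""
    parity = sum(data_bits) % 2
    return data_bits + [parity]

def check_parity(received: list[int]) -> bool:
    """True if the frame still has even parity, i.e. no single-bit error detected."""
    return sum(received) % 2 == 0

frame = add_parity([1, 0, 1, 1, 0, 0, 1])  # 7 data bits, 1 error-checking bit
assert check_parity(frame)                  # clean channel: the check passes

noisy = frame.copy()
noisy[3] ^= 1                               # entropy flips one bit in transit
assert not check_parity(noisy)              # the error is detected (though not located)
```

A single parity bit can detect a one-bit flip but cannot locate or correct it; real systems build richer codes – checksums, CRCs, Hamming and Reed–Solomon codes – on the same principle of redundancy at the level of implementation.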

]]>
Thu, 07 Jun 2012 06:08:07 -0700 http://machinemachine.net/text/research/rigid-implementation-vs-flexible-materiality
<![CDATA[On (Text and) Exaptation]]> http://machinemachine.net/text/ideas/on-text-and-exaptation

(This post was written as a kind of ‘prequel’ to a previous essay, Rancière’s Ignoramus)

‘Text’ originates from the Latin word texere, to weave. A material craft enabled by a human ingenuity for loops, knots and pattern. Whereas a single thread may collapse under its own weight, looped and intertwined threads derive their strength and texture as a network. The textile speaks of repetition and multiplicity, yet it is only once we back away from the tapestry that the larger picture comes into focus.

At an industrial scale, textile looms expanded beyond the frame of their human operators. Reducing a textile design to a system of coded instructions, the complex web of a decorative rug could be fixed into the gears and pulleys that drove the clattering apparatus. In later machines long reels of card, punched through with holes, told a machine how, or what, to weave. Not only could carpets and textiles themselves be repeated, with less chance of error, but the punch-cards that ordered them were now equally capable of being mass-produced for a homogenous market. From one industrial loom an infinite number of textile variations could be derived. All one needed to do was feed more punch-card into the greedy, demanding reels of the automated system.

The material origins of film may also have been inspired by weaving. Transparent reels of celluloid were pulled through mechanisms resembling the steam-driven contraptions of the industrial revolution. The holes running down a reel’s edges delimit its flow. Just as the circular motion of a mechanical loom is translated into a network of threads, so the material specificity of the film-stock and projector weaves the illusion of cinematic time. Some of the more archaic, out-moded types of film are known to shrink slightly as they decay, affording us – the viewer – a juddering, inconsistent vision of the world captured in the early 20th century.

In 1936, the year that Alan Turing wrote his iconic paper “On Computable Numbers”, a German engineer by the name of Konrad Zuse built the first working digital computer. Like its industrial predecessors, Zuse’s computer was designed to function via a series of holes encoding its program. Born as much out of convenience as financial necessity, Zuse punched his programs directly into discarded reels of 35mm film-stock. Fused together by the technologies of weaving and cinema, Zuse’s digital computer announced the birth of an entirely new mode of textuality. As Lev Manovich suggests:

“The pretence of modern media to create simulations of sensible reality is… cancelled; media are reduced to their original condition as information carrier, nothing less, nothing more… The iconic code of cinema is discarded in favour of the more efficient binary one. Cinema becomes a slave to the computer.”

Rather than Manovich’s ‘slave’ / ‘master’ relationship, I want to suggest a kind of lateral pollination of media traits. As technologies develop, specificities from one medium are co-opted by another. Reverting to biological metaphor, we see genetic traits jumping between media species. From a recent essay by Svetlana Boym, The Off-Modern Mirror:

“Exaptation is described in biology as an example of ‘lateral adaptation,’ which consists in a co-option of a feature for its present role from some other origin… Exaptation is not the opposite of adaptation; neither is it merely an accident, a human error or lack of scientific data that would in the end support the concept of adaptation.
Exaptation questions the very process of assigning meaning and function in hindsight, the process of assigning the prefix ‘post’ and thus containing a complex phenomenon within the grid of familiar interpretation.”

Media history is littered with exaptations. Features specific to certain media are exapted – co-opted – as matters of convenience, technical necessity or even aesthetics. Fashion has a role to play also: many of the early models of mobile phone, for instance, sported huge, extendible aerials which the manufacturers now admit had no impact whatsoever on the workings of the technology.

Lev Manovich’s suggestion is that as the computer has grown in its capacities, able to re-present all other forms of media on a single computer apparatus, the material traits that define a medium have been co-opted by the computer at the level of software and interface. A strip of celluloid has a definite weight, chemistry and shelf-life – a material history with origins in the mechanisms of the loom. Once we encode the movie into the binary workings of a digital computer, each media-specific – material – trait can be reduced to an informational equivalent. If I want to increase the frames per second of a celluloid film I have to physically wind the reel faster. For the computer-encoded, digital equivalent, a code that re-presents each frame can be introduced via my desktop video editing software (see the sketch at the end of this post). Computer code determines the content as king.

In the 1960s and 70s Roland Barthes named ‘The Text’ as a network of production and exchange. Whereas ‘the work’ was concrete, final – analogous to a material – ‘the text’ was more like a flow, a field or event – open ended. Perhaps even infinite. In From Work to Text, Barthes wrote: “The metaphor of the Text is that of the network…” This semiotic approach to discourse, by initiating the move from print culture to ‘text’ culture, also helped lay the ground for a contemporary politics of content-driven media. Skipping backwards through From Work to Text, we find this statement: “The text must not be understood as a computable object. It would be futile to attempt a material separation of works from texts.” I am struck here by Barthes’ use of the phrase ‘computable object’, as well as his attention to the ‘material’. Katherine Hayles, in her essay ‘Print Is Flat, Code Is Deep’, teases out the statement for us:

“computable” here mean[s] to be limited, finite, bound, able to be reckoned. Written twenty years before the advent of the microcomputer, his essay stands in the ironic position of anticipating what it cannot anticipate. It calls for a movement away from works to texts, a movement so successful that the ubiquitous “text” has all but driven out the media-specific term book.

Hayles notes that the ‘ubiquity’ of Barthes’ term ‘Text’ allowed – in its wake – an erasure of media-specific terms, such as ‘book’. In moving from The Work to The Text, we move not just between different politics of exchange and dissemination, we also move between different forms and materialities of mediation. To echo (and subvert) the words of Marshall McLuhan, not only is The Medium the Message, The Message is also the Medium.

…media are only a subspecies of communications which includes all forms of communication.
For example, at first people did not call the internet a medium, but now it has clearly become one… We can no longer understand any medium without language and interaction – without multimodal processing… We are now clearly moving towards an integration of all kinds of media and communications, which are deeply interconnected. Extract from a 2005 interview with Manuel Castells, Global Media and Communication Journal
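Returning to the earlier point about frame rates: once a film is encoded as data, ‘winding the reel faster’ becomes index arithmetic. The sketch below is my own, purely hypothetical illustration – the frame objects and function names are not drawn from any real editing software.

```python
# Illustrative retiming of a digitised frame sequence (hypothetical names).

def retime(frames: list, speed: float) -> list:
    """Resample a frame sequence: speed=2.0 plays twice as fast,
    speed=0.5 repeats frames to play at half speed."""
    length = int(len(frames) / speed)
    return [frames[min(int(i * speed), len(frames) - 1)] for i in range(length)]

original = [f"frame_{n:04d}" for n in range(24)]  # one second of film at 24 fps
faster = retime(original, 2.0)                    # the same second squeezed into 12 frames
slower = retime(original, 0.5)                    # stretched across 48 frames

assert len(faster) == 12 and len(slower) == 48
```

Where celluloid demands a physical intervention, the digital equivalent is a re-description of the same information – precisely the reduction to an ‘informational equivalent’ described above.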


]]>
Mon, 06 Dec 2010 13:41:24 -0800 http://machinemachine.net/text/ideas/on-text-and-exaptation