MachineMachine /stream - search for chaos
https://machinemachine.net/stream/feed en-us http://blogs.law.harvard.edu/tech/rss LifePress therourke@gmail.com

<![CDATA[Jordan Peterson & Fascist Mysticism | Pankaj Mishra | The New York Review of Books]]> https://www.nybooks.com/online/2018/03/19/jordan-peterson-and-fascist-mysticism/

“Men have to toughen up,” Jordan B. Peterson writes in 12 Rules For Life: An Antidote to Chaos, “Men demand it, and women want it.” So, the first rule is, “Stand up straight with your shoulders back” and don’t forget to “clean your room.

]]>
Fri, 24 Nov 2023 11:33:34 -0800 https://www.nybooks.com/online/2018/03/19/jordan-peterson-and-fascist-mysticism/
<![CDATA[Something that is CHAOGENOUS is born out of total chaos.]]> https://twitter.com/therourke/statuses/1397560557977415681

]]>
Wed, 26 May 2021 07:29:37 -0700 https://twitter.com/therourke/statuses/1397560557977415681
<![CDATA[Jordan Peterson & Fascist Mysticism | by Pankaj Mishra | NYR Daily | The New York Review of Books]]> http://www.nybooks.com/daily/2018/03/19/jordan-peterson-and-fascist-mysticism/

“Men have to toughen up,” Jordan B. Peterson writes in 12 Rules For Life: An Antidote to Chaos, “Men demand it, and women want it.” So, the first rule is, “Stand up straight with your shoulders back” and don’t forget to “clean your room.

]]>
Fri, 01 Jun 2018 08:52:18 -0700 http://www.nybooks.com/daily/2018/03/19/jordan-peterson-and-fascist-mysticism/
<![CDATA[John Lanchester reviews ‘The Attention Merchants’ by Tim Wu, ‘Chaos Monkeys’ by Antonio García Martínez and ‘Move Fast and Break Things’ by Jonathan Taplin · LRB 17 August 2017]]> https://www.lrb.co.uk/v39/n16/john-lanchester/you-are-the-product

At the end of June, Mark Zuckerberg announced that Facebook had hit a new level: two billion monthly active users. That number, the company’s preferred ‘metric’ when measuring its own size, means two billion different people used Facebook in the preceding month.

]]>
Wed, 06 Sep 2017 03:24:16 -0700 https://www.lrb.co.uk/v39/n16/john-lanchester/you-are-the-product
<![CDATA[Sonic Acts 2017: The Noise of Becoming: On Monsters, Men, and Every Thing in Between]]> https://machinemachine.net/portfolio/sonic-acts-2017-the-noise-of-becoming-on-monsters-men-and-every-thing-in-between/

UPDATE: My talk is also now available in The Noise of Being publication, published by Sonic Acts in September 2017.

A talk I delivered at Sonic Acts Festival 2017: The Noise of Being, in which I refigure the sci-fi horror monster The Thing from John Carpenter’s 1982 film of the same name:

The Thing is a creature of endless mimetic transformations, capable of becoming the grizzly faced men who fail to defeat it. The most enduring quality of The Thing is its ability to perform self-effacement and subsequent renewal at every moment, a quality we must embrace and mimic ourselves if we are to outmanoeuvre the monsters that harangue us.

This talk was part of a panel featuring Laurie Penny and Ytasha Womack, entitled Speculative Fiction: Radical Figuration For Social Change. You can see their wonderful talks here:

Laurie Penny: Feminism Against Fascism
Ytasha Womack: Afrofuturism: Imagination and Humanity

Full text follows (+ references & slides)

An Ontology of Every Thing on the Face of the Earth

John Carpenter’s 1982 film, The Thing, is a claustrophobic science fiction thriller exhibiting many hallmarks of the horror genre. The film depicts a sinister turn for matter where the chaos of the replicating, cancerous cell is expanded to the human scale and beyond. We watch as an alien force terrorises an isolated Antarctic outpost. The creature exhibits an awesome ability to imitate: devouring any form of life it comes across, whilst simultaneously giving birth to an exact copy in a burst of bile and protoplasm. The Thing copies cell by cell in a process so perfect that the resultant simulacrum speaks, acts, and even thinks like the original. The Thing is so relentless, and its copies so perfect, that the outpost’s doctor, Blair, is sent mad at the implications:

If a cell gets out it could imitate everything on the face of the Earth… and it’s not gonna stop! [1]

This text is also available in The Noise of Being publication (published September 2017)

Based on John W. Campbell’s 1938 novella, Who Goes There?, Carpenter’s film revisits a gothic trope that is numerous in its incarnations. In Campbell’s novella, The Thing is condensed as much from the minds of the men as from its own horrific, defrosting bulk. A slowly surfacing nightmare that transforms alien matter into earthly biology also has the effect of transferring the inner, mental lives of the men into the resultant condensation. John W. Campbell knew that The Thing could become viscous human flesh, but in order to truly imitate its prey the creature must infect inner life separately, pulling kicking and screaming ghosts out of their biological – Cartesian – machines. As a gothic figure, Campbell’s Thing disrupts the stable and integral vision of human being: self-same bodies housing ‘unitary and securely bounded’ [2] subjectivities, identical and extensive through time.
His characters confront their anguish at being embodied: their nightmares are literally made flesh. To emphasise the otherness of each human’s flesh, Campbell’s story is inhabited exclusively with male characters. The absence of women makes the conflict between each of the men feel more rudimentary, but it also centres the novella’s horror on the growing realisation that to be human is also to be alien to oneself. Differences between sexes within the single species homo sapiens are bypassed, allowing the alien entity to exhibit the features of human female ‘otherness’ alongside a gamut of horrific bodily permutations. Perhaps, as Barbara Creed, [3] Rosi Braidotti, [4] and others [5] have argued, The Thing signifies the intrinsic absence of the mother figure: the female body’s capacity to be differentiated from itself in the form of pregnancy; to open up and usher forth into the world a creature other to itself. This Thingly quality is given credence by Julia Kristeva in a passage that could equally refer to The Thing as to the development of a fetus during pregnancy:

Cells fuse, split, and proliferate; volumes grow, tissues stretch, and the body fluids change rhythm, speeding up or slowing down. With the body, growing as a graft, indomitable, there is another. And no one is present, within that simultaneously dual and alien space, to signify what is going on. [6]

The Thing does exhibit demeanours of copulation and fertility, but also of disease, fragmentation, dismemberment, and asexual fission. In the novella, during a drug-induced nightmare, Dr. Copper sits bolt upright and blurts out ‘Garry – listen. Selfish – from hell they came, and hellish shellfish – I mean self – Do I? What do I mean?’ McReady [7] turns to the other men in the cabin: ‘Selfish, and as Dr. Copper said – every part is a whole. Every piece is self-sufficient, and animal in itself.’ [8]

The Thing is aberrant at a level more fundamental than allusions to pregnancy can convey. Dr. Copper’s inability to articulate what The Thing is indicates a categorical nightmare he and the men are suffering. As in the work of Mary Douglas, [9] The Thing’s nightmarish transformation denies the very concept of physical and categorical purity. The Thing’s distributed biology calls to mind Hardt and Negri’s vision of the early Internet (ARPANET), designed, according to them:

…to withstand military attack. Since it has no center and almost any portion can operate as an autonomous whole, the network can continue to function even when part of it has been destroyed. The same design element that ensures survival, the decentralisation, is also what makes control of the network so difficult. [10]

The image of mankind’s outright destruction, via totalising narratives such as nuclear war, viral pandemic, or meteor strike, is undermined by the paradigm of a Thingly technological infrastructure designed to avoid ‘absolute’ assault. Decentralisation is a categorical horror in its capacity to highlight our self-same, constantly threatened and weak, embodied selves. But shift the lens away from the self-same human subject, and the image of a distributed, amorphous network of autonomous cells immediately becomes a very good description of how biological life has always been constituted. The metaphysical dualism of the sexes, as Kelly Hurley concludes, is an inadequate paradigm of such horrific embodiment; rather, any and all ‘ontological security’ [11] is challenged through a ‘collapsing of multiple and incompatible morphic possibilities into one amorphous embodiment.’ [12] The Thing is neither male nor female, two nor one, inside nor outside, living nor dead. If it does settle into a form that can be exclaimed, screamed or defined in mutually incompatible words, it does so only for a moment and only in the mind of its onlooker as they scrabble to deduce its next amorphous conflation.
The Thing is a figure performing ontogenesis (something coming to be) rather than ontology (something that already is). [13] ‘The very definition of the real,’ as Jean Baudrillard affirmed, has become ‘that of which it is possible to give an equivalent reproduction.’ [14] Does The Thing ‘produce’ something other than human life, or ‘reproduce’ human life in its entirety, and what, if anything, would be the difference? In a text on bio and necropolitics, Eugene Thacker undertakes an examination of the ‘difference between “Life” as an ontological foundation, and “the living,” or the various specific instantiations of Life.’ [15] Thacker highlights a passage in Poetics where Aristotle speaks of mimesis giving rise to the art of poetry in human beings:

We take delight in viewing the most accurate possible images of objects which in themselves cause distress when we see them (e.g. the shapes of the lowest species of animal, and corpses).

Recognition of mimetic forms can instill a certain degree of displeasure if that form depicts a carcass or something considered equally abhorrent. But this is often tinged with what Aristotle calls the ‘extremely pleasurable’ dual capacities of recognising an imitation as such, whilst at the same time recognising what it is the form is imitative of. The horror of The Thing is bound to this endless ontogenetic re-forming, its limitless capacity to imitate and become without necessarily settling into a final, stable and agreeable categorical – that is, ontological – form. The men of the Antarctic encampment grasp in their minds at the forms ushering from The Thing but can never keep up with its propensity toward the next shapeless-shape, bodiless-limb, or ontogenetic-extrudence.
The Thing is a phenomenon, to use Eugene Thacker’s words once more, that is ‘at once “above” and “below” the scale of the human being,’ [16] throwing, as Rosi Braidotti puts it, ‘a terminal challenge towards a human identity that is commonly predicated on the One.’ [17] The ‘other’ of The Thing never settles down, always falling outside the dialectical circle. As Hélène Cixous remarks in The Newly Born Woman, with the ‘truly “other” there is nothing to say; it cannot be theorized. The “other” escapes me.’ [18] The figure of The Thing bursts into popular culture at the meeting point between dream and flesh, and has been pursued ever since by men whose individuality is considered inseparable from their self-same embodiment. By modifying the rules through which dominant norms such as gender binaries operate, The Thing can be conceived as an incarnation of détournement: an intervention that hijacks and continually modifies the rules of engagement. ‘The radical implication [being] that [all] meaning is connected to a relationship with power.’ [19] Considered through Michel Foucault’s definition of bio-power, or the bio-political, The Thing is the process of sex and sexuality severed from the humans who are forced to proliferate ‘through’ it. Above all, the men set against this propagation – this mobilisation of images of ‘other’ – scramble to protect the normative image of the human they hold most dear: the mirage of ‘man’.

Becoming World

The filmic Thing is a fictional device enabled by animatronic augmentations coated with fleshy stand-ins, KY Jelly, and occasionally, real animal offal.
As John Carpenter described his rendition of the creature in a 2014 interview, ‘It’s just a bunch of rubber on the floor.’ [20] Bringing The Thing ‘to life’ is an activity that performs the collapse ‘between “Life” as an ontological foundation, and “the living,” or the various specific instantiations of Life.’ [21] The animatronic Thing exists in the space between stable forms; it is vibrant, expressive technology realised by dead matter; and human ingenuity made discernible by uncanny machinic novelty. Ontological uncertainty finds fluidity in language on a page, in the ability to poetically gesture towards interstitiality. But on-screen animatronics, rubber, and KY Jelly are less fluid, more mimetically rooted by the expectations of the audience revelling in, and reviled by, their recognition of The Thing’s many forms.

Upon its release, critical reactions to John Carpenter’s The Thing were at best muted and at worst downright vitriolic. The special effects used to depict the creature were the focus of an attack by Steve Jenkins:

Jenkins attacks the film essentially for its surrealist nature… he writes that ‘with regard to the effects, they completely fail to “clarify the weirdness” of the Thing’, and that ‘because one is never sure exactly how it [the alien] functions, its eruptions from the shells of its victims seem as arbitrary as they are spectacular.’ [22]

In short, the reviews lingered on two opposing readings of The Thing’s shock/gore evocations: that they go too far and thus tend towards sensational fetishism, or that they can’t go far enough, depicting kitsch sensibilities rather than alien otherness. Jenkins’ concern that the special effects do not ‘clarify’ The Thing’s ‘weirdness’ is contradictory, if not oxymoronic.
The implication is that Things could never be so weird as to defy logical function, and that all expressions should, and eventually do, lend themselves to being read through some parochial mechanism or other, however surreal they may at first seem. That The Thing’s nature could actually defy comprehensibility is not considered, nor how impossible the cinematic depiction of that defiance might be. Rather, the critical view seems to be that every grisly eruption, bifurcation, and horrific permutation on screen must necessarily express an inner order temporarily hidden from, but not inaccessible to, its human onlookers. This critical desire for a ‘norm’ defies the same critical desire for ‘true’ horror. Our will to master matter and technology through imitative forms is the same will that balks at the idea that imitative forms could have ontologies incommensurable with our own.

The Thing is ‘weird’: a term increasingly applied to those things defying categorisation. A conviction, so wrote the late Mark Fisher, ‘that this does not belong, is often a sign that we are in the presence of the new… that the concepts and frameworks which we have previously employed are now obsolete.’ [23] In reflecting on the origins of this slippery anti-category, Eugene Thacker reminds us that within horror, ‘The threat is not the monster, or that which threatens existing categories of knowledge. Rather, it is the “nameless thing,” or that which presents itself as a horizon for thought… the weird is the discovery of an unhuman limit to thought, that is nevertheless foundational for thought.’ [24]

In The Thing the world rises up to meet its male inhabitants in a weird form and, by becoming them, throws into question the categorical foundations of the born and the made, of subject and object, natural and synthetic, whole and part, human and world, original and imitation.
What remains is an ongoing process of animation rendered horrific by a bifurcation of ontologies: on one side the supposed human foundation of distinction, uniqueness and autonomy; on the other, a Thingly (alien and weird) propensity that dissolves differentiation, that coalesces and revels in an endless process of becoming. As in Mikhail Bakhtin’s study of the grotesque, the ‘human horizon’ in question is that of the ‘canon,’ [25] a norm to which all aberrations are to be compared:

The grotesque body… is a body in the act of becoming. It is never finished, never completed; it is continually built, created, and builds and creates another body. Moreover, the body swallows the world and is itself swallowed by the world. [26]

The Thingly is neither self-same nor enclosed unto itself. It is a plethora of openings, conjoinings and eruptions that declare ‘the world as eternally unfinished: a world dying and being born at the same time.’ [27] The bodily horror performed by The Thing is an allegory of this greater interstitial violation: the conceptual boundary between the world-for-us and the world-without-us is breached not as destruction, or even invasion, but ultimately through our inability to separate ourselves from a world that is already inherently alien and weird. [28] ‘A monstrosity’, to hijack the words of Claire Colebrook, ‘that we do not feel, live, or determine, but rather witness partially and ex post facto.’ [29] How these processes are comprehended, or more precisely, how the perception of these processes is interpreted, is more important than the so-called ‘difference’ between the world which existed before and the world which remains after. Eugene Thacker clarifies this point in his analysis of the etymology of the word ‘monster’:

A monster is never just a monster, never just a physical or biological anomaly. It is always accompanied by an interpretive framework within which the monster is able to be monstrum, literally “to show” or “to warn.” Monsters are always a matter of interpretation. [30]

Becoming Weird

In a 1982 New York Times movie section, critic Vincent Canby poured yet more scorn on John Carpenter’s ‘Thing’ remake:

The Thing is a foolish, depressing, overproduced movie that mixes horror with science fiction to make something that is fun as neither one thing or the other… There may be a metaphor in all this, but I doubt it… The Thing… is too phony looking to be disgusting. It qualifies only as instant junk. [31]

Chiming with his critic peers, Canby expresses his desire that the monster show its nature – be monstrum – only in respect of some ‘norm’; [32] some ‘interpretive framework’, [33] that the narrative will eventually uncover. By setting up ‘junk’ as a kitschy opposite to this supposedly palatable logic, Canby unwittingly generates a point from which to disrupt the very notion of the interpretive framework itself. The Thing is more than a metaphor. Canby’s appeal to ‘instant junk’ can be read as the monstrum, the revealing of that which constitutes the norm. The monster stands in for difference, for other, and in so doing normalises the subject position from which the difference is opposed: the canon. In the case of The Thing that canon is first and foremost the human male, standing astride the idea of a world-for-us. The ‘us’ is itself monopolised, as if all non-male ontogenetic permutations were cast out into the abject abyss of alien weirdness. In reclaiming ‘junk’ as a ‘register of the unrepresentable’ [34] a Thingly discourse may share many of the tenets of queer theory. As Rosi Braidotti makes clear, referring to the work of Camilla Griggers:

‘Queer’ is no longer the noun that marks an identity they taught us to despise, but it has become a verb that destabilizes any claim to identity, even and especially to a sex-specific identity. [35]

The queer, the weird, the kitsch are among the most powerful of orders because they are inherently un-representable and in flux. The rigid delineations of language and cultural heteronormativity are further joined in the figure of The Thing by a non-anthropic imaginary that exposes a whole range of human norms and sets into play a seemingly infinite variety of non-human modes of being and embodiment. Rosi Braidotti refers to the work of Georges Canguilhem in her further turn outwards towards the weird – ‘normality is, after all, the zero-degree of monstrosity’ [36] – signalling a post-human discourse as one which, by definition, must continually question – perhaps even threaten – the male, self-same, canonised, subject position:

We need to learn to think of the anomalous, the monstrously different not as a sign of pejoration but as the unfolding of virtual possibilities that point to positive alternatives for us all… the human is now displaced in the direction of a glittering range of post-human variables. [37]

In her book on The Death of The Posthuman (2014), Claire Colebrook looks to the otherwise, the un-representable, to destabilise the proposition of a world being for anyone. She begins by considering the proposed naming of the current geological era ‘The Anthropocene,’ [38] a term that designates a theoretical as well as scientific impasse for human beings and civilisation, in which human activity and technological development have begun to become indistinguishable from, and/or exceed, processes implicit within what is considered to be the ‘natural’ world. As if registering the inevitable extinction of humans isn’t enough, The Anthropocene, by being named in honour of humans, makes monsters of those times – past and present – which do not contain humans.
Its naming therefore becomes a mechanism allowing the imagination of ‘a viewing or reading in the absence of viewers or readers, and we do this through images in the present that extinguish the dominance of the present.’ [39] The world ‘without bodies’ that is imaged in this move, Colebrook argues, is written upon by the current state of impending extinction. Humans are then able to look upon the future world-without-us in a state of nostalgia coloured by their inevitable absence. Here the tenets of the horror genre indicated by Eugene Thacker are realised as a feature of a present condition. The world-in-itself has already been subsumed by The Thingly horror that is the human species. For even the coming world-without-us, a planet made barren and utterly replaced by The Thingly junk of human civilisation, will have written within its geological record a mark of human activity that goes back well before the human species had considered itself as a Thing ‘in’ any world at all.

In an analysis of the etymology of the Anthropocene, McKenzie Wark also turns to theory as a necessary condition of the age of extinction:

All of the interesting and useful movements in the humanities since the late twentieth century have critiqued and dissented from the theologies of the human. The Anthropocene, by contrast, calls for thinking something that is not even defeat. [40]

The Anthropocene, like ‘queer’ or ‘weird’, should be made into a verb, and relinquished as a noun. Once weirded in this way it becomes a productive proposition, Wark goes on, quoting Donna Haraway: ‘another figure, a thousand names of something else.’ [41] In the 2014 lecture quoted by Wark, Haraway called for other such worldings through the horrific figure of capitalism, through arachnids spinning their silk from the waste matter of the underworld, or from the terrible nightmares evoked in the fiction of the misogynist, racist mid-20th-century author H.P. Lovecraft:

The activation of the chthonic powers that is within our grasp to collect up the trash of the anthropocene, and the exterminism of the capitalocene, to something that might possibly have a chance of ongoing. [42]

That weird, ongoing epoch is the Chthulucene, a monstrum ‘defined by the frightening weirdness of being impossibly bound up with other organisms,’ [43] of what Haraway calls ‘multi-species muddles.’ [44] The horror of ‘the nameless thing’ is here finally brought to bear in Haraway’s Capitalocene and Chthulucene epochs. Haraway’s call for ‘a thousand names of something else’ is Thingly in its push towards endlessly bifurcated naming and theoretical subsuming. The anthro-normalisation casts out infinitely more possibilities than it brings into play. Although Donna Haraway makes it clear that her Chthulucene is not directly derivative of H.P. Lovecraft’s Cthulhu mythos, her intentional mis-naming and slippery non-identification exemplifies the kind of amorphous thinking and practice she is arguing for. Haraway’s Chthulucene counters Lovecraft’s Cthulhu with an array of chthonic, non-male, tentacular, rhizomatic, and web-spinning figures that attest to the monstrum still exposed by Lovecraft’s three-quarters-of-a-century-old work. The continued – renewed – fascination with Lovecraft’s weird ‘others’ thus has the capacity to expose a dread of these times. As writer Alan Moore has attested:

[I]t is possible to perceive Howard Lovecraft as an almost unbearably sensitive barometer of American dread.
Far from outlandish eccentricities, the fears that generated Lovecraft’s stories and opinions were precisely those of the white, middle-class, heterosexual, Protestant-descended males who were most threatened by the shifting power relationships and values of the modern world… Coded in an alphabet of monsters, Lovecraft’s writings offer a potential key to understanding our current dilemma, although crucial to this is that they are understood in the full context of the place and times from which they blossomed. [45]

The dominant humanistic imagination may no longer posit white cis-males as the figure that ‘must’ endure, but other uncontested figures remain in the space apparently excavated of Lovecraft’s affinities. To abandon what Claire Colebrook calls ‘the fantasy of one’s endurance’ may be to concede that the post-human is founded on ‘the contingent, fragile, insecure, and ephemeral.’ [46] But, as Drucilla Cornell and Stephen D. Seely suggest, it is dangerous to consider this a ‘new’ refined status for the beings that remain, since ‘this sounds not like the imagination of living beyond Man, but rather like a meticulous description of the lives of the majority of the world under the condition of advanced capitalism right now.’ [47] As Claire Colebrook warns, post-humanism often relinquishes its excluded others – women, the colonised, nonhuman animals, or ‘life itself’ [48] – by merely subtracting the previously dominant paradigm of white heteropatriarchy, whilst failing to confront the monster that particular figure was indicative of:

Humanism posits an elevated or exceptional ‘man’ to grant sense to existence, then when ‘man’ is negated or removed what is left is the human all too human tendency to see the world as one giant anthropomorphic self-organizing living body… When man is destroyed to yield a posthuman world it is the same world minus humans, a world of meaning, sociality and readability yet without any sense of the disjunction, gap or limits of the human. [49]

As in Haraway and Wark’s call for not just ‘naming, but of doing, of making new kinds of labor for a new kind of nature,’ [50] contemporary criticism and theory must be allowed to take on the form of the monsters it pursues, moulding and transforming critical inquiries into composite, hybrid figures that never settle in one form lest they become stable, rigid, and normalised. In fact, this metaphor itself is conditioned too readily by the notion of a mastery ‘Man’ can wield. Rather, our inquiries must be encouraged ‘to monster’ separately, to blur and mutate beyond the human capacity to comprehend them, like the infinite variety of organisms Haraway insists the future opens into. The very image of a post-humanism must avoid normalising the monster, rendering it through analysis an expression of the world-for-us. For Eugene Thacker this is the power of the sci-fi-horror genre: to take ‘aim at the presuppositions of philosophical inquiry – that the world is always the world-for-us – and [make] of those blind spots its central concern, expressing them not in abstract concepts but in a whole bestiary of impossible life forms – mists, ooze, blobs, slime, clouds, and muck.’ [51] Reflecting on the work of Noël Carroll, [52] Rosi Braidotti argues that if science fiction horror ‘is based on the disturbance of cultural norms, it is then ideally placed to represent states of crisis and change and to express the widespread anxiety of our times. As such this genre is as unstoppable as the transformations it mirrors.’ [53]

References

[1] John Carpenter, The Thing, Film, Sci-Fi Horror (Universal Pictures, 1982).
[2] Kelly Hurley, The Gothic Body: Sexuality, Materialism, and Degeneration at the Fin de Siècle (Cambridge University Press, 2004), 3.
[3] B. Creed, ‘Horror and the Monstrous-Feminine: An Imaginary Abjection,’ Screen 27, no. 1 (1 January 1986): 44–71.
[4] Rosi Braidotti, Metamorphoses: Towards a Materialist Theory of Becoming (Wiley, 2002), 192–94.
[5] Ian Conrich and David Woods, eds., The Cinema Of John Carpenter: The Technique Of Terror (Wallflower Press, 2004), 81.
[6] Julia Kristeva, quoted in Jackie Stacey, Teratologies: A Cultural Study of Cancer (Routledge, 2013), 89.
[7] The character McReady becomes MacReady in Carpenter’s 1982 retelling of the story.
[8] Campbell, Who Goes There?, 107.
[9] Noël Carroll, The Philosophy of Horror, Or, Paradoxes of the Heart (New York: Routledge, 1990).
[10] Michael Hardt and Antonio Negri, Empire, New Ed (Harvard University Press, 2001), 299.
[11] Braidotti, Metamorphoses, 195.
[12] Kelly Hurley, ‘Reading like an Alien: Posthuman Identity in Ridley Scott’s Aliens and David Cronenberg’s Rabid,’ in Posthuman Bodies, ed. Judith M. Halberstam and Ira Livingston (Bloomington: John Wiley & Sons, 1996), 219.
[13] This distinction was plucked, out of context, from Adrian MacKenzie, Transductions: Bodies and Machines at Speed (A&C Black, 2006), 17. MacKenzie is not talking about The Thing, but this distinction is, nonetheless, very useful in bridging the divide between stable being and endless becoming.
[14] Jean Baudrillard, Simulations, trans. Paul Foss, Paul Patton, and Philip Beitchman (Semiotext(e): New York, 1983), 146.
[15] Eugene Thacker, ‘Nekros; Or, The Poetics Of Biopolitics,’ Incognitum Hactenus 3, no. Living On: Zombies (2012): 35.
[16] Ibid., 29.
[17] Braidotti, Metamorphoses, 195.
[18] Hélène Cixous, The Newly Born Woman (University of Minnesota Press, 1986), 71.
[19] Nato Thompson et al., eds., The Interventionists: Users’ Manual for the Creative Disruption of Everyday Life (North Adams, Mass.: MASS MoCA; Distributed by the MIT Press, 2004), 151.
[20] John Carpenter, BBC Web exclusive: Bringing The Thing to life, Invasion, Tomorrow’s Worlds: The Unearthly History of Science Fiction, 14 November 2014.
[21] Thacker, ‘Nekros; Or, The Poetics Of Biopolitics,’ 35.
[22] Ian Conrich and David Woods, eds., The Cinema Of John Carpenter: The Technique Of Terror (Wallflower Press, 2004), 96.
[23] Mark Fisher, The Weird and the Eerie, 2016, 13.
[24] Eugene Thacker, After Life (University of Chicago Press, 2010), 23.
[25] Mikhail Mikhaĭlovich Bakhtin, Rabelais and His World (Indiana University Press, 1984), 321.
[26] Ibid., 317.
[27] Ibid., 166.
[28] This sentence is a paraphrased, altered version of a similar line from Eugene Thacker, ‘Nine Disputations on Theology and Horror,’ Collapse: Philosophical Research and Development IV: 38.
[29] Claire Colebrook, Sex After Life: Essays on Extinction, Vol. 2 (Open Humanities Press, 2014), 14.
[30] Eugene Thacker, ‘The Sight of a Mangled Corpse—An Interview with’, Scapegoat Journal, no. 05: Excess (2013): 380.
[31] Vincent Canby, ‘“The Thing” Is Phony and No Fun,’ The New York Times, 25 June 1982, sec. Movies.
[32] Derrida, ‘Passages: From Traumatism to Promise,’ 385–86.
[33] Thacker, ‘The Sight of a Mangled Corpse—An Interview with,’ 380.
[34] Braidotti, Metamorphoses, 180.
[35] Ibid.
[36] Ibid., 174.
[37] Rosi Braidotti, ‘Teratologies,’ in Deleuze and Feminist Theory, ed. Claire Colebrook and Ian Buchanan (Edinburgh: Edinburgh University Press, 2000), 172.
[38] A term coined in the 1980s by ecologist Eugene F. Stoermer and widely popularized in the 2000s by atmospheric chemist Paul Crutzen. The Anthropocene is, according to Jan Zalasiewicz et al., ‘a distinctive phase of Earth’s evolution that satisfies geologist’s criteria for its recognition as a distinctive stratigraphic unit.’ – Jan Zalasiewicz et al., ‘Are We Now Living in the Anthropocene,’ GSA Today 18, no. 2 (2008): 6.
[39] Claire Colebrook, Death of the PostHuman: Essays on Extinction, Vol. 1 (Open Humanities Press, 2014), 28.
[40] McKenzie Wark, ‘Anthropocene Futures,’ Versobooks.com, 23 February 2015.
[41] Ibid.
[42] Donna Haraway, ‘Capitalocene, Chthulucene: Staying with the Trouble’ (University of California at Santa Cruz, 5 September 2014).
[43] Leif Haven, ‘We’ve All Always Been Lichens: Donna Haraway, the Cthulhucene, and the Capitalocene,’ ENTROPY, 22 September 2014.
[44] Donna Haraway, ‘SF: Sympoiesis, String Figures, Multispecies Muddles’ (University of Alberta, Edmonton, Canada, 24 March 2014).
[45] H. P. Lovecraft, The New Annotated H.P. Lovecraft, ed. Leslie S. Klinger (Liveright, 2014), xiii.
[46] Claire Colebrook, Sex After Life: Essays on Extinction, Vol. 2 (Open Humanities Press, 2014), 22.
[47] Drucilla Cornell and Stephen D. Seely, The Spirit of Revolution: Beyond the Dead Ends of Man (Polity Press, 2016), 5.
[48] Ibid., 3–4.
[49] Claire Colebrook, Death of the PostHuman: Essays on Extinction, Vol. 1 (Open Humanities Press, 2014), 163–64.
[50] Wark, ‘Anthropocene Futures.’
[51] Thacker, In the Dust of This Planet, 9.
[52] Carroll, The Philosophy of Horror, Or, Paradoxes of the Heart.
[53] Braidotti, Metamorphoses, 185 (my emphasis).

]]>
Sun, 26 Feb 2017 04:43:01 -0800 https://machinemachine.net/portfolio/sonic-acts-2017-the-noise-of-becoming-on-monsters-men-and-every-thing-in-between/
<![CDATA[Transmediale 2017 (events)]]> http://machinemachine.net/text/ideas/transmediale-2017/

I just came back from two jam-packed weeks at Transmediale festival, 2017. Morehshin Allahyari and I were involved in a wealth of events, mostly in relation to our #Additivism project, including On the Far Side of the Marchlands: an exhibition at Schering Stiftung gallery, featuring work by Catherine Disney, Keeley Haftner, Brittany Ransom, Morehshin and myself.

Photos from the event are gathered here.

The 3D Additivist Cookbook European launch: held at Transmediale on Saturday 4th Feb.

Audio of the event is available here.

Singularities: a panel and discussion conceived and introduced by Morehshin and myself. Featuring Luiza Prado & Pedro Oliveira (A parede), Rasheedah Phillips, and Dorothy R. Santos.

Audio of the entire panel is available here. The introduction to the panel – written by Morehshin and myself – can be found below. Photos from the panel are here.

Alien Matter exhibition: curated by Inke Arns as part of Transmediale 2017. Featuring The 3D Additivist Cookbook and works by Joey Holder, Dov Ganchrow, and Kuang-Yi Ku.

Photos from the exhibition can be found here.

 

Singularities Panel delivered at Transmediale, Sunday 5th February 2017
Introduction by Morehshin Allahyari and Daniel Rourke

Morehshin: In 1979, the Iranian Islamic revolution resulted in the overthrow of the Pahlavi dynasty and led to the establishment of an Islamic republic. Many different organizations, parties and guerrilla groups were involved in the Iranian Revolution. Some groups were created after the fall of Pahlavi and still survive in Iran; others helped overthrow the Shah but no longer exist. Much of Iranian society was hopeful about the coming revolution. Secular and leftist politicians participated in the movement to gain power in the aftermath, believing that Khomeini would support their voice and allow multiple positions and parties to be active and involved in the shaping of post-revolution Iran. As my mother – a Marxist at the time – would always say: the Iranian revolution brought sudden change, death and violence in unforeseen ways. It was a point, a very fast point, of collapse and rise. The revolution spun out of control and the country was taken over by Islamists so fast that people weren’t able to react to it, to slow it, or even to understand it. The future was now in the hands of a single party with a single vision that would change the lives of generations of Iranians, including myself, in the years that followed. We were forced and expected to live in one singular reality: a mono-authoritarian singularity.

In physics, a singularity is a point in space and time of such incredible density that the very nature of reality is brought into question. Associated with elusive black holes and the alien particles that bubble out of the quantum foam at their event horizon, the term ‘singularity’ has also been co-opted by cultural theorists and techno-utopianists to describe moments of profound social, political, ontological or material transformation: the coming-into-being of new worlds that redefine their own origins.
For mathematicians and physicists, singularities are often considered ‘bad behaviour’ in the numbers and calculations. Infinite points may signal weird behaviours existing ‘in’ the physical world: things outside or beyond our ability to comprehend. Or perhaps, more interestingly, a singularity may expose the need for an entirely new physics. Some anomalies can only be made sense of by drafting a radically new model of the physical world to include them. For this panel we consider ‘bad behaviours’ in social, technological and ontological singularities: moments of profound change triggered by a combination of technological shifts, cultural mutations, or unforeseen political dramas and events. Like the physicists who comprehend singularities in the physical world, we do not know whether the singularities our panelists highlight today tell us something profound about the world itself, or force us to question the model we have of the world or worlds.

Daniel: As well as technological or socio-political singularities, this panel will question the ever narcissistic singularities of ‘I’, ‘here’ and ‘now’ – confounding the principles of human universality upon which these suppositions are based. We propose ‘singularities’ as eccentric and elusive figures in need of collective attention. It is no coincidence that ‘singularity’ is often used as a term to indicate human finitude: self-same subjects existing at particular points in time, embedded within particular contexts, told through a singular history or single potential future. The metaphor of the transformative Singularity signals not one reality ‘to come’, nor even two realities – one moved from and one towards – but many, all dependent on who the subject of the singularity is and how much autonomy they are ascribed.
The ‘Technological’ Singularity is a myth of the ‘transhumanists’, a group of mainly Western, commonly white, male enthusiasts, who subscribe to the collective belief that technology will help them to become ‘more than human’… ‘possessed of drastically augmented intellects, memories, and physical powers.’ As technological change accelerates, according to prominent Transhumanist Ray Kurzweil, so it pulls us upwards in its wake. Kurzweil argues that as the curve of change reaches an infinite gradient, reality itself will be brought into question: like a black hole in space-time, subjects travelling toward this spike will find it impossible to turn around, to escape its pull. A transformed post-human reality awaits us on the other side of the Technological Singularity – a reality Kurzweil and his ilk believe ‘we’ will inevitably pass into in the coming decades.

In a 2007 paper entitled ‘Droppin’ Science Fiction’, Darryl A. Smith explores the metaphor of the singularity through Afro-American and Afrofuturist science fiction. He notes that the metaphor of runaway change positions those subject to it in the place of Sisyphus, the figure of Greek myth condemned to push a stone up a hill forever. For Sisyphus to progress he has to fight gravity as it conspires with the stone to pull him back to the bottom of the slope. The singularity in much science fiction by Black and Afro-American authors focusses on this potential fall, rather than the ascent:

“Here, in the geometrics of spacetime, the Spike lies not at the highest point on an infinite curve but at the lowest… Far from being the shift into a posthumanity, the Negative Spike is understood… as an infinite collapsing and, thus, negation of reality. Escape from such a region thus requires an opposing infinite movement.”

The image of a collective ‘push’ of the stone of progress up the slope necessarily posits a universal human subject, resisting the pull of gravity back down the slope: a universal human subject who passes victorious to the other side of the event horizon. But as history has shown us, technological, social and political singularities – arriving with little warning – often split the world into those inside and those outside their event horizons. Singularities like the 1979 Iranian revolution left many more on the outside of the Negative Spike than the inside. The Industrial Revolution, likewise, is retrospectively told in the West as a tale of imperial and technological triumph, rather than as the story of those who were violently abducted from their homelands and made to toil and die in fields of cotton and sugarcane. The acceleration toward and away from that singularity brought about a Negative Spike so dense that many millions of people alive today still find their identities subject to its social and ontological mass.

In their recent definition of the Anthropocene, the International Commission on Stratigraphy named the Golden Spike after World War II as the official signal of the human-centric geological epoch: a series of converging events marked in the geological record around the same time – the detonation of the first nuclear warhead; the proliferation of synthetic plastic from crude oil constituents; and the introduction of large-scale, industrialised farming practices, noted by the appearance of trillions of discarded chicken bones in the geological record.

Will the early 21st century be remembered for the 9/11 terrorist event? The introduction of the iPhone and Twitter? Or for the presidency of Donald J. Trump? Or will each of these extraordinary events be considered part of a single, larger shift in global power and techno-mediated autonomy?
If ‘we’ are to rebuild ourselves through stronger unities and collective actions in the wake of recent political upheavals, will ‘we’ also forego the need to recognise the different subjectivities and distinct realities that bubble out of each singularity’s wake? As the iPhone event sent shockwaves through the socio-technical cultures of the West, so the rare earth minerals required to power those iPhones were pushed skywards in value, forcing more bodies into pits in the ground to mine them. As we gather at Transmediale to consider AI, infrastructural, data, robotic, or cyborgian revolutions, what truly remains ‘elusive’ is a definition of ‘the human’ that does justice to the complex array of subjectivities destined to be impacted – and even crafted anew – by each of these advances. In his recent text on the 2011 Fukushima Daiichi nuclear disaster, Jean-Luc Nancy proposes instilling “the condition of an ever-renewed present” into the urgent design and creation of new, mobile futures. In this proposition Nancy recognises that each singularity is equal to all others in its finitude; an equivalence he defines as “the essence of community.” To contend with the idea of singularities – plural – of ruptures as such, we must share together that which will forever remain unimaginable alone.

Morehshin: This appeal to a plurality of singularities is easily mistaken for the kinds of large-scale collective action we have seen in recent years around the world, from the Arab Spring and the Occupy movement through to the recent Women’s March, which took place not 24 hours after the inauguration of Donald Trump. These events in particular spoke of a universal drive, a collective of people united around a single cause. Much has been written about the ‘human microphone’ technique utilized by Occupy protesters to amplify the voice of a speaker when megaphones and loudspeakers were banned or unavailable.
We wonder whether, rather than speak as a single voice, we should seek to emphasise the different singularities enabled by different voices, different minds; distinct votes and protestations. We wonder whether black and brown protestors gathered in similar numbers, with similar appeals to their collective unity and identity, would have been portrayed very differently by the media; whether the white women who united so radically for the march would also show up to the next Black Lives Matter or Muslim-ban protest. These are not just academic questions but a personal concern… what is collectivism, and for whom does the collective function? When we talk about futures and worlds and singularities, whose realities are we talking about? Who is going to go to Mars with Elon Musk? And who will be left? As we put this panel together in the last weeks, our Manifesto’s apocalyptic vision of a world accelerated to breaking point by technological progress began to seem strangely comforting compared to the delirious political landscape we saw emerging before us. Whether you believe political malaise, media delirium, or the inevitable implosion of the neo-liberal project is to blame for the rise of figures like Farage, Trump or – in the Philippines – the outspoken President Rodrigo Duterte, the promises these figures make of an absolute shift in the conditions of power appear grand precisely because they choose to demonize the discrete differences of minority groups, or attempt to overturn truths that might fragment and disturb their all-encompassing narratives.

Daniel: The appeal to inclusivity – in virtue of a shared political identity – often instates those of ‘normal’ body, race, sex, or genome as exclusive harbingers of the-change-which-should – or so we are told, will – come. A process that theorist Rosi Braidotti refers to as a ‘dialectics of otherness’, which subtly disguises difference in celebration of a collective voice of will or governance.
Morehshin: Last week, on January 27, as part of a plan to keep “Islamic terrorists” out of the United States, Trump signed an order that suspended entry for citizens of seven countries for 90 days. This includes Iran, the country I am a citizen of. I have lived in the U.S. for 9 years and hold a green card, which was included in Trump’s ban and is now being reviewed case by case for each person who enters the U.S. When the news came out, I was already in Berlin for Transmediale and wasn’t sure whether I had a home to go back to. Although the chaos of Trump’s announcement has now settled, and my own status as a resident of America appears a bit more clear for now, the ripples of emotion and uncertainty from last week have coloured my experience at this festival. As I have sat through panels and talks in the last 3 days, and as I stand here introducing this panel about elusive events, potential futures and the in-betweenness of all profound technological singularities… the realities that feel most significant to me are yet to take place in the lives of so many Middle Easterners and Muslims affected by Trump’s ban. How does one imagine/re-imagine/figure/re-figure the future when there are still so many ‘presents’ existing in conflict? I grew up in Iran for 23 years, where science fiction didn’t really exist as a genre in popular culture. I always think we were discouraged to imagine the future other than how it was ‘imagined’ for us. Science fiction as a genre flourishes in the West… but I still struggle with the kinds of futures we seem most comfortable imagining.

We now want to hand over to our fantastic panelists, to highlight their voices, and build harmonies and dissonances with our own. We are extremely honoured to introduce them:

Dorothy Santos is a Filipina-American writer, editor, curator, and educator. She has written and spoken on a wide variety of subjects, including art, activism, artificial intelligence, and biotechnology.
She is managing editor of Hyphen Magazine, and a Yerba Buena Center for the Arts fellow, where she is researching the concept of citizenship. Her talk today is entitled Machines and Materiality: Speculations of Future Biology and the Human Body.

Luiza Prado and Pedro Oliveira are Brazilian design researchers, who very recently wrapped up their PhDs at the University of the Arts Berlin. Under the ‘A Parede’ alias, the duo researches new design methodologies, processes, and pedagogies for an onto-epistemological decolonization of the field. In their joint talk and performance, Luiza and Pedro will explore the tensions around hyperdense gravitational pulls and acts of resistance, with particular focus on the so-called “non-lethal” bombs – teargas and stun grenades – manufactured in Brazil, and exported and deployed all around the world.

Rasheedah Phillips is creative director of Afrofuturist Affair: a community formed to celebrate, strengthen, and promote Afrofuturistic and sci-fi concepts and culture. In her work with ‘Black Quantum Futurism’, Rasheedah derives facets, tenets, and qualities from quantum physics, futurist traditions, and Black/African cultural traditions to celebrate the ability of African-descended people to see “into,” choose, or create the impending future. In her talk today, Rasheedah will explore the history of linear time constructs, notions of the future, and alternative theories of temporal-spatial consciousness.

]]>
Thu, 09 Feb 2017 08:50:26 -0800 http://machinemachine.net/text/ideas/transmediale-2017/
<![CDATA[Obama chuckled. "You mean the Chaos Emeralds?"]]> https://twitter.com/therourke/statuses/800478666388111360 ]]> Sun, 20 Nov 2016 15:19:20 -0800 https://twitter.com/therourke/statuses/800478666388111360 <![CDATA[This is what happens when you divide by zero on a mechanical calculator]]> https://www.youtube.com/watch?v=OFJUYFlSYsM

From early on in math class, you’re taught that you cannot divide a number by zero. On paper, it doesn’t work out. Do it electronically, and you’ll get an error message. http://goo.gl/K1HGYC

Try to divide by zero with a mechanical calculator and, well, that’s where things get interesting.

YouTuber MultiGlizda recorded the chaos that happens within a Facit ESA-01 mechanical calculator when it’s asked to divide a number by zero. With the case off, viewers are able to see the fascinating inner workings of these old machines in operation; the video also demonstrates the dicey nature of the number zero and its division.

The YouTube channel Numberphile explains that division is based on subtraction; that is, to divide one number by a second, you just subtract the second number from the first over and over again. So, 20 divided by 5 would be 20 minus 5, which equals 15, minus 5, which equals 10, minus 5, which equals 5, minus 5, which equals 0. Since it took four subtractions to get to zero, the answer is 4.

It’s a bit of a convoluted way of explaining division, but it helps us understand the video below. You see, when you divide 20 by 0, you’ll end up subtracting 0 from 20 an infinite number of times. And in the case of the Facit ESA-01 mechanical calculator, what winds up happening is that the machine attempts to complete the infinite number of operations it believes are necessary to complete the division.
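Numberphile's subtraction picture of division is easy to sketch in code. A minimal Python illustration (the function name and the iteration cap are my own; the cap stands in for a safety escape the real mechanical calculator lacks):

```python
def divide_by_subtraction(dividend, divisor, max_steps=10_000):
    """Divide by repeated subtraction, the way the calculator's gears do.

    Counts how many subtractions bring the dividend down to (or past) zero.
    Returns the quotient (any remainder discarded), or None if the loop
    hits the safety cap -- which is exactly what happens with a zero
    divisor, since subtracting 0 never shrinks the remainder.
    """
    remainder, count = dividend, 0
    while remainder - divisor >= 0:
        if count >= max_steps:
            return None  # a real mechanical calculator would spin here forever
        remainder -= divisor
        count += 1
    return count

print(divide_by_subtraction(20, 5))  # → 4 (20 - 5 - 5 - 5 - 5 = 0)
print(divide_by_subtraction(20, 0))  # → None: the 'division' never terminates
```

With a zero divisor the loop condition is always true, so the only thing stopping the code is the cap; the Facit ESA-01 has no such cap, hence the spectacle in the video.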

]]>
Tue, 03 May 2016 12:41:47 -0700 https://www.youtube.com/watch?v=OFJUYFlSYsM
<![CDATA[6 Ways to Fight Climate Chaos | Novara Wire]]> http://wire.novaramedia.com/2015/05/6-ways-to-fight-climate-chaos/

Climate change is an issue so big it can be paralysing. It doesn’t help with the paralysis that proposed solutions tend to be either hopelessly inadequate (change your lightbulbs! buy local!), or hopelessly ambitious (just replace capitalism with global eco-communes!).

]]>
Sun, 31 May 2015 05:38:42 -0700 http://wire.novaramedia.com/2015/05/6-ways-to-fight-climate-chaos/
<![CDATA[The Radical Capacity of Glitch Art: Expression through an Aesthetic Rooted in Error - REDEFINE magazine]]> http://www.redefinemag.com/2014/glitch-art-expression-through-an-aesthetic-rooted-in-error/

In an experimental collision of chaos and purpose, glitch art exists as a low-key but important form of new media that broadly encompasses works of photography, video stills, moving pictures, and other image data that has been corrupted.

]]>
Wed, 29 Apr 2015 20:02:03 -0700 http://www.redefinemag.com/2014/glitch-art-expression-through-an-aesthetic-rooted-in-error/
<![CDATA[A Cyberattack Has Caused Confirmed Physical Damage for the Second Time Ever | WIRED]]> http://www.wired.com/2015/01/german-steel-mill-hack-destruction/

Amid all the noise the Sony hack generated over the holidays, a far more troubling cyber attack was largely lost in the chaos. Unless you follow security news closely, you likely missed it. I’m referring to the revelation, in a German report released just before Christmas…

]]>
Fri, 09 Jan 2015 02:46:07 -0800 http://www.wired.com/2015/01/german-steel-mill-hack-destruction/
<![CDATA[Meet the Father of Digital Life]]> http://nautil.us/issue/14/mutation/meet-the-father-of-digital-life

In 1953, at the dawn of modern computing, Nils Aall Barricelli played God. Clutching a deck of playing cards in one hand and a stack of punched cards in the other, Barricelli hovered over one of the world’s earliest and most influential computers, the IAS machine, at the Institute for Advanced Study in Princeton, New Jersey. During the day the computer was used to make weather forecasting calculations; at night it was commandeered by the Los Alamos group to calculate ballistics for nuclear weaponry. Barricelli, a maverick mathematician, part Italian and part Norwegian, had finagled time on the computer to model the origins and evolution of life.

Inside a simple red brick building at the northern corner of the Institute’s wooded wilds, Barricelli ran models of evolution on a digital computer. His artificial universes, which he fed with numbers drawn from shuffled playing cards, teemed with creatures of code—morphing, mutating, melting, maintaining. He created laws that determined, independent of any foreknowledge on his part, which assemblages of binary digits lived, which died, and which adapted. As he put it in a 1961 paper, in which he speculated on the prospects and conditions for life on other planets, “The author has developed numerical organisms, with properties startlingly similar to living organisms, in the memory of a high speed computer.” For these coded critters, Barricelli became a maker of worlds.

Until his death in 1993, Barricelli floated between biological and mathematical sciences, questioning doctrine, not quite fitting in. “He was a brilliant, eccentric genius,” says George Dyson, the historian of technology and author of Darwin Among The Machines and Turing’s Cathedral, which feature Barricelli’s work. “And the thing about geniuses is that they just see things clearly that other people don’t see.”

Barricelli programmed some of the earliest computer algorithms that resemble real-life processes: a subdivision of what we now call “artificial life,” which seeks to simulate living systems—evolution, adaptation, ecology—in computers. Barricelli challenged the standard Darwinian model of evolution by competition, arguing through his simulations that organisms evolved by symbiosis and cooperation.

Pixar cofounder Alvy Ray Smith says Barricelli influenced his earliest thinking about the possibilities for computer animation.

In fact, Barricelli’s projects anticipated many current avenues of research, including cellular automata, computer programs involving grids of numbers paired with local rules that can produce complicated, unpredictable behavior. His models bear striking resemblance to the one-dimensional cellular automata—life-like lattices of numerical patterns—championed by Stephen Wolfram, whose search tool Wolfram Alpha helps power the brain of Siri on the iPhone. Nonconformist biologist Craig Venter, in defending his creation of a cell with a synthetic genome—“the first self-replicating species we’ve had on the planet whose parent is a computer”—echoes Barricelli.
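The local-rule grids described here are simple enough to sketch. Below is a minimal Python version of a one-dimensional cellular automaton using Wolfram's standard rule numbering; the choice of Rule 30 is an illustrative assumption, picked because it is a textbook case of complicated, unpredictable behaviour arising from a trivially simple rule:

```python
def step(cells, rule=30):
    """Advance a one-dimensional cellular automaton by one generation.

    Each cell's next state depends only on itself and its two nearest
    neighbours (the row wraps around into a ring). The three-cell
    neighbourhood is read as a binary number 0-7, which indexes into
    the bits of the rule number -- Wolfram's numbering scheme.
    """
    n = len(cells)
    return [
        (rule >> (4 * cells[(i - 1) % n] + 2 * cells[i] + cells[(i + 1) % n])) & 1
        for i in range(n)
    ]

# A single live cell unfolds into Rule 30's famously chaotic triangle:
row = [0] * 31
row[15] = 1
for _ in range(12):
    print(''.join('#' if c else '.' for c in row))
    row = step(row)
```

Barricelli's experiments predate this formalism, but the family resemblance — a lattice of numbers, purely local update rules, global behaviour nobody programmed in — is what links his work to Wolfram's.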

Barricelli’s experiments had an aesthetic side, too. Uncommonly for the time, he converted the digital 1s and 0s of the computer’s stored memory into pictorial images. Those images, and the ideas behind them, would influence computer animators in generations to come. Pixar cofounder Alvy Ray Smith, for instance, says Barricelli stirred his earliest thinking about the possibilities for computer animation, and beyond that, his philosophical muse. “What we’re really talking about here is the notion that living things are computations,” he says. “Look at how the planet works and it sure does look like a computation.”

Despite Barricelli’s pioneering experiments, barely anyone remembers him. “I have not heard of him to tell you the truth,” says Mark Bedau, professor of humanities and philosophy at Reed College and editor of the journal Artificial Life. “I probably know more about the history than most in the field and I’m not aware of him.”

Barricelli was an anomaly, a mutation in the intellectual zeitgeist, an unsung hero who has mostly languished in obscurity for the past half century. “People weren’t ready for him,” Dyson says. That a progenitor has not received much acknowledgment is a failing not unique to science. Visionaries often arrive before their time. Barricelli charted a course for the digital revolution, and history has been catching up ever since.

EVOLUTION BY THE NUMBERS: Barricelli converted his computer tallies of 1s and 0s into images. In this 1953 Barricelli print, explains NYU associate professor Alexander Galloway, the chaotic center represents mutation and disorganization. The more symmetrical fields toward the margins depict Barricelli’s evolved numerical organisms. From the Shelby White and Leon Levy Archives Center, Institute for Advanced Study, Princeton.

Barricelli was born in Rome on Jan. 24, 1912. According to Richard Goodman, a retired microbiologist who met and befriended the mathematician in the 1960s, Barricelli claimed to have invented calculus before his tenth birthday. When the young boy showed the math to his father, he learned that Newton and Leibniz had preempted him by centuries. While a student at the University of Rome, Barricelli studied mathematics and physics under Enrico Fermi, a pioneer of quantum theory and nuclear physics. A couple of years after graduating in 1936, he immigrated to Norway with his recently divorced mother and younger sister.

As World War II raged, Barricelli studied. An uncompromising oddball who teetered between madcap and mastermind, Barricelli had a habit of exclaiming “Absolut!” when he agreed with someone, or “Scandaloos!” when he found something disagreeable. His accent was infused with Scandinavian and Romantic pronunciations, making it occasionally challenging for colleagues to understand him. Goodman recalls one of his colleagues at the University of California, Los Angeles who just happened to be reading Barricelli’s papers “when the mathematician himself barged in and, without ceremony, began rattling off a stream of technical information about his work on phage genetics,” a science that studies gene mutation, replication, and expression through model viruses. Goodman’s colleague understood only fragments of the speech, but realized it pertained to what he had been reading.

“Are you familiar with the work of Nils Barricelli?” he asked.

“Barricelli! That’s me!” the mathematician cried.

Notwithstanding having submitted a 500-page dissertation on the statistical analysis of climate variation in 1946, Barricelli never completed his Ph.D. In an episode recalling the scene in the movie Amadeus in which the Emperor of Austria commends Mozart’s performance, save for there being “too many notes,” Barricelli’s thesis committee directed him to slash the paper to a tenth of the size, or else it would not accept the work. Rather than capitulate, Barricelli forfeited the degree.

Barricelli began modeling biological phenomena on paper, but his calculations were slow and limited. He applied for a Fulbright fellowship to study in the United States, where he could work with the IAS machine. As he wrote on his original travel grant submission in 1951, he sought “to perform numerical experiments by means of great calculating machines,” in order to clarify, through mathematics, “the first stages of evolution of a species.” He also wished to mingle with great minds—“to communicate with American statisticians and evolution-theorists.” By then he had published papers on statistics and genetics, and had taught Einstein’s theory of relativity. In his application photo, he sports a pyramidal moustache, hair brushed to the back of his elliptic head, and hooded, downturned eyes. At the time of his application, he was a 39-year-old assistant professor at the University of Oslo.

Although the program initially rejected him due to a visa issue, in early 1953 Barricelli arrived at the Institute for Advanced Study as a visiting member. “I hope that you will be finding Mr. Baricelli [sic] an interesting person to talk with,” wrote Ragnar Frisch, a colleague of Barricelli’s who would later win the first Nobel Prize in Economics, in a letter to John von Neumann, a mathematician at IAS, who helped devise the institute’s groundbreaking computer. “He is not very systematic always in his exposition,” Frisch continued, “but he does have interesting ideas.”

PSYCHEDELIC BARRICELLI: In this recreation of a Barricelli experiment, NYU associate professor Alexander Galloway has added color to show the gene groups more clearly. Each swatch of color signals a different organism. Borders between the color fields represent turbulence as genes bounce off and meld with others, symbolizing Barricelli’s symbiogenesis. Courtesy Alexander Galloway.

Centered above Barricelli’s first computer logbook entry at the Institute for Advanced Study, in handwritten pencil script dated March 3, 1953, is the title “Symbiogenesis problem.” This was his theory of proto-genes, virus-like organisms that teamed up to become complex organisms: first chromosomes, then cellular organs, onward to cellular organisms and, ultimately, other species. Like parasites seeking a host, these proto-genes joined together, according to Barricelli, and through their mutual aid and dependency, originated life as we know it.

Standard neo-Darwinian doctrine maintained that natural selection was the main means by which species formed. Slight variations and mutations in genes combined with competition led to gradual evolutionary change. But Barricelli disagreed. He pictured nimbler genes acting as a collective, cooperative society working together toward becoming species. Darwin’s theory, he concluded, was inadequate. “This theory does not answer our question,” he wrote in 1954, “it does not say why living organisms exist.”

Barricelli coded his numerical organisms on the IAS machine in order to prove his case. “It is very easy to fabricate or simply define entities with the ability to reproduce themselves, e.g., within the realm of arithmetic,” he wrote.

The early computer looked sort of like a mix between a loom and an internal combustion engine. Lining the middle region were 40 Williams cathode ray tubes, which served as the machine’s memory. Within each tube, a beam of electrons (the cathode ray) bombarded one end, creating a 32-by-32 grid of points, each consisting of a slight variation in electrical charge. There were five kilobytes of memory total stored in the machine. Not much by today’s standards, but back then it was an arsenal.
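Those figures check out, as a quick sanity calculation shows (using today's eight-bit byte as the unit, which is admittedly an anachronism for a machine organized around 40-bit words):

```python
# Capacity of the IAS machine's Williams-tube memory, as described above.
tubes = 40
bits_per_tube = 32 * 32               # one bit per charge spot on the 32-by-32 grid
total_bits = tubes * bits_per_tube    # 40,960 bits
total_bytes = total_bits // 8         # 5,120 bytes -- the 'five kilobytes' quoted
print(total_bits, total_bytes)
```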


Inside the device, Barricelli programmed steadily mutable worlds, each with rows of 512 “genes,” represented by integers ranging from −18 to 18. As the computer cycled through hundreds and thousands of generations, persistent groupings of genes would emerge, which Barricelli deemed organisms. The trick was to tweak, just so, his manmade laws of nature—“norms,” as he called them—which governed the universe and its entities. He had to maintain these ecosystems on the brink between pandemonium and stasis. Too much chaos and his beasts would unravel into a disorganized shambles; too little and they would homogenize. The sweet spot in the middle, however, sustained life-like processes.

Barricelli’s balancing act was not always easy. His first trials were riddled with pests: primitive, often single numeric genes that invaded the space and gobbled their neighbors. Typically, he was only able to witness a couple of hereditary changes, or a handful at best, before the world unwound. To create lasting evolutionary processes, he needed to handicap these pests’ ability to reproduce rapidly. By the time he returned to the Institute in 1954 to begin a second round of experiments, Barricelli had made some critical changes. First, he capped the proliferation of the pests at once per generation. That constraint allowed his numerical organisms enough leeway to outpace the pests. Second, he began applying different norms to different sections of his universes. That forced his numerical organisms always to adapt.
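The scheme described above can be sketched in a few lines of modern code. This is a loose paraphrase, not Barricelli’s exact norms: the shift rule (each gene copies itself n cells away, where n is its own value) follows his published descriptions, but the collision rule here, `crossing`, is our stand-in assumption for how he resolved two genes claiming one cell.

```python
import random

SIZE = 512        # cells per universe row, as in Barricelli's worlds
GENE_RANGE = 18   # gene values run from -18 to 18; 0 marks an empty cell

def crossing(genes):
    """Collision norm (our assumption, not Barricelli's exact rule):
    derive a new gene from the colliders, clamped into range."""
    return sum(genes) % (2 * GENE_RANGE + 1) - GENE_RANGE

def step(universe):
    """Advance one generation. Each gene tries to copy itself n cells
    away, where n is its own value -- a loose paraphrase of Barricelli's
    shift norm. When two genes claim the same cell, the crossing norm
    decides what survives there."""
    claims = {}
    for i, gene in enumerate(universe):
        if gene == 0:
            continue
        target = (i + gene) % SIZE        # wrap around the row's edges
        claims.setdefault(target, []).append(gene)
    nxt = [0] * SIZE
    for cell, genes in claims.items():
        nxt[cell] = genes[0] if len(genes) == 1 else crossing(genes)
    return nxt

random.seed(3)
world = [random.randint(-GENE_RANGE, GENE_RANGE) for _ in range(SIZE)]
for _ in range(100):
    world = step(world)
survivors = sum(1 for g in world if g != 0)
```

Varying the collision norm across sections of the row, as Barricelli did in his 1954 runs, is what keeps such a universe between pandemonium and stasis.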

Even in the earlier universes, Barricelli realized that mutation and natural selection alone were insufficient to account for the genesis of species. In fact, most single mutations were harmful. “The majority of the new varieties which have shown the ability to expand are a result of crossing-phenomena and not of mutations, although mutations (especially injurious mutations) have been much more frequent than hereditary changes by crossing in the experiments performed,” he wrote.

When an organism became maximally fit for an environment, the slightest variation would only weaken it. In such cases, it took at least two modifications, effected by a cross-fertilization, to give the numerical organism any chance of improvement. This indicated to Barricelli that symbioses, gene crossing, and “a primitive form of sexual reproduction,” were essential to the emergence of life.

“Barricelli immediately figured out that random mutation wasn’t the important thing; in his first experiment he figured out that the important thing was recombination and sex,” Dyson says. “He figured out right away what took other people much longer to figure out.” Indeed, Barricelli’s theory of symbiogenesis can be seen as anticipating the work of independent-thinking biologist Lynn Margulis, who in the 1960s showed that it was not necessarily genetic mutations over generations, but symbiosis, notably of bacteria, that produced new cell lineages.

Barricelli saw his computer organisms as a blueprint of life—on this planet and any others. “The question whether one type of symbio-organism is developed in the memory of a digital computer while another type is developed in a chemical laboratory or by a natural process on some planet or satellite does not add anything fundamental to this difference,” he wrote. A month after Barricelli began his experiments on the IAS machine, Crick and Watson announced the shape of DNA as a double helix. But learning about the shape of biological life didn’t put a dent in Barricelli’s conviction that he had captured the mechanics of life on a computer. Let Watson and Crick call DNA a double helix. Barricelli called it “molecule-shaped numbers.”


What buried Barricelli in obscurity is something of a mystery. “Being uncompromising in his opinions and not a team player,” says Dyson, no doubt led to Barricelli’s “isolation from the academic mainstream.” Dyson also suspects Barricelli and the indomitable Hungarian mathematician von Neumann, an influential leader at the Institute for Advanced Study, didn’t hit it off. Von Neumann appears to have ignored Barricelli. “That was sort of fatal because everybody looked to von Neumann as the grandfather of self-replicating machines.”

Ever so slowly, though, Barricelli is gaining recognition. That stems in part from another of Barricelli’s remarkable developments; certainly one of his most beautiful. He didn’t rest with creating a universe of numerical organisms; he converted his organisms into images. His computer tallies of 1s and 0s would then self-organize into visual grids of exquisite variety and texture. According to Alexander Galloway, associate professor in the department of media, culture, and communication at New York University, a finished Barricelli “image yielded a snapshot of evolutionary time.”

When Barricelli printed sections of his digitized universes, they were dazzling. To modern eyes they might look like satellite imagery of an alien geography: chaotic oceans, stratigraphic outcrops, and the contours of a single stream running down the center fold, fanning into a delta at the patchwork’s bottom. “Somebody needs to do a museum show and show this stuff because they’re outrageous,” Galloway says.

Barricelli was an uncompromising oddball who teetered between madcap and mastermind.

Today, Galloway, a member of Barricelli’s small but growing cadre of boosters, has recreated the images. Following methods described by Barricelli in one of his papers, Galloway has coded an applet using the computer language Processing to revive Barricelli’s numerical organisms—with slight variation. While Barricelli encoded his numbers as eight-unit-long proto-pixels, Galloway condensed each to a single color-coded cell. By collapsing each number into a single pixel, Galloway has been able to fit eight times as many generations in the frame. These revitalized mosaics look like psychedelic cross-sections of the fossil record. Each swatch of color represents an organism, and when one color field bumps up against another one, that’s where cross-fertilization takes place.
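Galloway’s one-number-per-pixel scheme can be sketched as follows. The palette here is an assumption (his exact color mapping isn’t described in detail): each gene value gets its own hue, empty cells stay black, and successive generations stack as scanlines of a plain-text PPM image.

```python
import colorsys
import random

GENE_RANGE = 18   # gene values run from -18 to 18; 0 marks an empty cell

def gene_color(g):
    """Map a gene value to an RGB triple, one hue per value -- an
    assumed palette standing in for Galloway's, which isn't public."""
    if g == 0:
        return (0, 0, 0)                      # empty cells stay black
    hue = (g + GENE_RANGE) / (2 * GENE_RANGE + 1)
    r, gr, b = colorsys.hsv_to_rgb(hue, 1.0, 1.0)
    return (int(r * 255), int(gr * 255), int(b * 255))

def to_ppm(rows):
    """Render successive generations (lists of gene values) as a
    plain-text PPM image: one pixel per gene, one scanline per
    generation, so evolutionary time runs down the frame."""
    height, width = len(rows), len(rows[0])
    lines = [f"P3 {width} {height} 255"]
    for row in rows:
        lines.append(" ".join("%d %d %d" % gene_color(g) for g in row))
    return "\n".join(lines)

# Stand-in data: random generations in place of a real Barricelli run.
random.seed(1)
rows = [[random.randint(-GENE_RANGE, GENE_RANGE) for _ in range(64)]
        for _ in range(32)]
ppm = to_ppm(rows)
```

Fed with the output of an actual numerical-organism simulation rather than random rows, the color fields and their turbulent borders emerge exactly as Galloway describes.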

“You can see these kinds of points of turbulence where the one color meets another color,” Galloway says, showing off the images on a computer in his office. “That’s a point where a number would be—or a gene would be—sort of jumping from one organism to another.” Here, in other words, is artificial life—Barricelli’s symbiogenesis—frozen in amber. And cyan and lavender and teal and lime and fuchsia.

Galloway is not the only one to be struck by the beauty of Barricelli’s computer-generated digital images. As a doctoral student, Pixar cofounder Smith became familiar with Barricelli’s work while researching the history of cellular automata for his dissertation. When he came across Barricelli’s prints he was astonished. “It was remarkable to me that with such crude computing facilities in the early 50s, he was able to be making pictures,” Smith says. “I guess in a sense you can say that Barricelli got me thinking about computer animation before I thought about computer animation. I never thought about it that way, but that’s essentially what it was.”

Cyberspace now swells with Barricelli’s progeny. Self-replicating strings of arithmetic live out their days in the digital wilds, increasingly independent of our tampering. The fittest bits survive and propagate. Researchers continue to model reduced, pared-down versions of life artificially, while the real world bursts with Boolean beings. Scientists like Venter conjure synthetic organisms, assisted by computer design. Swarms of autonomous codes thrive, expire, evolve, and mutate underneath our fingertips daily. “All kinds of self-reproducing codes are out there doing things,” Dyson says. In our digital lives, we are immersed in Barricelli’s world.

]]>
Fri, 20 Jun 2014 06:08:03 -0700 http://nautil.us/issue/14/mutation/meet-the-father-of-digital-life
<![CDATA[The Gray Zone | The Nation]]> http://www.thenation.com/article/177444/gray-zone

If there is one issue in contemporary life that supposedly defines the progressive nature of liberal societies, it is gay rights. Over the past half-century, most of the world’s Western democracies have seen incredible strides toward fuller acceptance of gay people. In the United States, the pace is, if anything, increasing, as each step toward full equality—from the striking down of anti-sodomy laws, to the Supreme Court’s recent decision voiding the Defense of Marriage Act, to the increasing number of state legislatures legalizing gay marriage—builds on prior ones.

The sense of history moving forward is not limited to people who cheer on this expansion of rights. When Justice Antonin Scalia dissented from the majority opinion in Lawrence v. Texas (2003), the case that struck down the Lone Star State’s anti-sodomy law, he wrote, “If moral disapprobation of homosexual conduct is ‘no legitimate state interest’ for purposes of proscribing that conduct…what justification could there possibly be for denying the benefits of marriage to homosexual couples exercising ‘[t]he liberty protected by the Constitution?’” The more recent decision in United States v. Windsor—which did not legalize gay marriage in all fifty states—allowed Scalia to make another slippery-slope prediction: “By formally declaring anyone opposed to same-sex marriage an enemy of human decency, the majority arms well every challenger to a state law restricting marriage to its traditional definition.” Scalia’s views are odious, but it’s hard to look at the history of the issue and doubt that he is right: gay marriage is coming to all fifty states, and he can’t do a thing about it.

To John Gray, the British philosopher, political theorist and wide-ranging cultural critic, the optimistic narrative I have sketched is another example of fanciful, misguided optimism. According to Gray, human flourishing is cyclical, and does not inevitably increase over time. Advances are followed by setbacks, and eras of peace by horrific wars. Unprecedented developments in medicine, science and women’s rights in the first half of the twentieth century were succeeded by the worst conflict in human history. Jim Crow came after Reconstruction. And revolutions that initially seemed to offer the promise of more freedom—whether in France or Iran or Egypt today—have led to violence and depravity, if not chaos. One imagines Gray arguing that of course the Western world could see a further entrenchment of gay rights; at the same time, an unknown series of events might lead to the reverse scenario. All we know is that we don’t know.

What concerns Gray, as he has argued in numerous articles, books and lectures, is that those who believe in steady progress are foolishly engaging in teleological thinking. “Progressives”—in the most literal sense of the word—have replaced religion with a faithful humanism that allows for a nearly supernatural view of human functioning, behavior and flourishing. Rather than viewing humans as just another member of the animal kingdom, “humanists” believe that our species can fulfill a unique destiny and reach The End of History. This faith in progress, Gray believes, will end up leading to great crimes and disasters. Ideological fanaticism, whether rooted in a teleological view of human liberation, national destiny or divine providence, has led us down this road before.

Gray has become one of the most visible and prolific public intellectuals of the past decade, and he is almost always worth reading. His knowledge of philosophy and history is nicely integrated with his passion for literature and the arts. He would scorn the title of humanist, but his writing contains a wide-ranging curiosity about other people. In his recent work, however, he has chosen to simplify the arguments of writers he scorns and proclaim that anyone who disagrees with him is near messianic in his or her thinking. Gray’s incessant pessimism about humanity’s ability to spark durable change has produced its own form of teleology. As E.H. Carr wrote in What Is History?, “To denounce ideologies in general is to set up an ideology of one’s own.”


People who have moved through various stages of political orientation have a tendency to prove that the last stage of ideological drift is ideological certainty. David Horowitz went from honorary Black Panther to contented Reaganite before settling into the role of insufferable campus troublemaker. Arianna Huffington metamorphosed from anti-feminist Republican to establishment centrist and, at least for the time being, into a harsh critic of the financial system and committed liberal partisan.

Gray would at first appear to be an exception to this rule. Although he has inhabited the roles of moderate Thatcherite, admirer of Tony Blair’s New Labour experiment and strong opponent of the Iraq War, he currently scorns free market evangelism and interventionism. His general political outlook now appears to approximate that of a mainstream liberal, if only because he heaps scorn on anyone too far on either side of the current political spectrum. (Mainstream liberalism has made its compromises with imperialism and more rapacious forms of capitalism, and so it is to Gray’s credit that he has devoted so much energy to criticizing both.)

It is in the field of criticism—in both senses—that Gray has flourished. His close reading of Marx has frequently come in handy when evaluating such ideologically distinct figures as Thomas Friedman and Slavoj Zizek. In the former case, Gray explained the surprising similarities between Friedman’s thinking about globalization and Marx’s, both of which were prone to shunting aside cultural analysis to focus on technological advancement. In his dissection of Zizek, meanwhile, he lauded Marx’s empiricism, which stands in stark contrast to the blathering of his “Leninist” (in Zizek’s word) follower.

It was in his 1995 book on Isaiah Berlin, however, that Gray (who studied under Berlin at Oxford) was at his finest, largely because he managed to put forth a reading of Berlin’s political philosophy that added up to something significant. Berlin was often accused of failing to provide a grand theory for his many arguments about liberalism, largely because celebrating “negative liberty”—essentially being left alone, free from interference—does not necessarily yield a coherent political philosophy. But Gray showed that Berlin’s distrust of monism added up to a robust pluralism, or what Gray called an “agonistic liberalism.” “The master-thesis of pluralism supports liberalism,” he wrote, further defining it as a sort of liberalism that “grounds itself on the radical choices we must make among incommensurables.”

]]>
Wed, 11 Dec 2013 15:43:05 -0800 http://www.thenation.com/article/177444/gray-zone
<![CDATA[Umberto Eco and why we still dream of utopia]]> http://www.newstatesman.com/culture/2013/11/no-place-home

Places that have never existed except in the human imagination may find an incongruous afterlife in the everyday world. Umberto Eco tells of how an attempt to commemorate the brownstone New York home of Nero Wolfe, Rex Stout’s orchid-loving fictional detective, runs up against the resistance of fact. Wolfe’s house cannot be identified because Stout “always talked of a brownstone at a certain number on West 35th Street, but in the course of his novels he mentioned at least ten different street numbers – and what is more, there are no brownstones on 35th Street”. Using Eco’s typology, a fiction has been transmuted into a legend: “Legendary lands and places are of various kinds and have only one characteristic in common: whether they depend on ancient legends whose origins are lost in the mists of time or whether they are in effect a modern invention, they have created flows of belief.”

Because they involve the belief that they existed, exist or can be made to exist – whether in the past, the future or somewhere off the map – legendary places are illusions rather than fictions. The distinction may sometimes be blurry, as the example of Nero Wolfe’s house shows; but the difference is fundamental to this enriching and playfully erudite exploration of the fabulous lands that human beings have invented.

Fictions we know to be neither true nor false; paradoxically, this gives them a kind of absolute veracity that historical facts can never have: “The credulous believe that El Dorado and Lemuria exist or existed somewhere or other, but we all know that it is undeniably certain that Superman is Clark Kent and that Dr Watson was never Nero Wolfe’s right-hand man ... All the rest is open to debate.” Unfortunately, humans have an invincible need to believe in their fictions. So they turn them into legends, which they anxiously defend from doubt – even to the point of attacking and killing those who do not share them.

Eco thinks it is not too difficult to explain why humankind is so drawn to legendary places: “It seems that every culture – because the world of everyday reality is cruel and hard to live in – dreams of a happy land to which men once belonged, and may one day return.” Nowadays everyone believes that the ability to envision alternate worlds is one of humankind’s most precious gifts, a view Eco seems to endorse when, at the end of his journey through legendary lands, he describes these visions as “a truthful part of the reality of our imagination”. Yet Eco highlights a darker side of these visions when he describes how the Nazis drew inspiration from legends of ancient peoples, variously situated in ultima Thule (“a land of fire and ice where the sun never set”), Atlantis and the polar regions, who spoke languages that were “racially pure”. Himmler was obsessed with ancient Nordic runes, while in an interview after the war the commander of the SS in Rome claimed that when Hitler ordered him to kidnap Pope Pius XII so he could be interned in Germany, he also ordered the Pope to take from the Vatican library “certain runic manuscripts that evidently had esoteric value for him”.

The Nazi adoption of the swastika began with the Thule Society, a secret racist organisation founded in 1918. Legends of lost lands fed the ideology of Aryan supremacy. In 1907, Jörg Lanz founded the Order of the New Temple, preaching that “inferior races” should be subjected to castration, sterilisation, deportation to Madagascar and incineration – ideas, Eco notes, that “were later to be applied by the Nazis”. Legendary lands are idylls from which minorities, outsiders and other disturbing elements have been banished. When these fantasies of harmony enter politics, a process of exclusion is set in motion whose end point is mass murder and genocide.

A metamorphosis of fiction into legend occurred when some Nazis took seriously a picture of the world presented by the Victorian novelist Edward Bulwer-Lytton. In his novel The Coming Race (1871), Bulwer-Lytton tells of the “Vril-ya,” survivors of the destruction of Atlantis living in the hollow interior of the earth, who possessed amazing powers as a result of being imbued with Vril, a type of cosmic energy. He intended the book as an exercise in fantasy literature but the founder of the Thule Society, who also founded a Vril Society, seems to have taken it more literally. Occultists in several countries read Bulwer-Lytton’s novel as a fictional rendition of events that may actually have happened and the legend was mixed in the stew of mad and bad ideas we now call Nazism.

The process at work was something like that described in Jorge Luis Borges’s story “Tlön, Uqbar, Orbis Tertius”, in which an encyclopaedia of an imaginary world subverts and disrupts the world that has hitherto been real. The difference is that in Borges’s incomparable fable the secret society that devised the encyclopaedia knew it to be fiction, while 19th-century occultists and some 20th-century Nazis accepted Bulwer-Lytton’s fiction as a version of fact. Among the marks that Bulwer-Lytton’s Vril-ya left in the real world, the most lasting was reassuringly prosaic: the name given to Bovril, the meat extract invented in the 1870s.

Among the legendary places human beings have dreamed up, those that Eco calls “the islands of utopia” have exercised a particular fascination in recent times. As he reminds us, “Etymologically speaking, utopia means non-place” – ou-topos, or no place. Thomas More, who coined the term in his book Utopia (composed in Latin and only translated in 1551 after More had been executed for treason in 1535), plays on an ambiguity in which the word also means a good or excellent place. Using a non-existent country to present an ideal model of government, More established a new literary genre, which included Étienne Cabet’s A Journey to Icaria (1840), in which a proto-communist society is envisioned, Samuel Butler’s Erewhon (1872, an anagram of “nowhere”) and William Morris’s News from Nowhere (1890).

Visions of ideal societies have recurred throughout history but such societies were nearly always placed in an irretrievable past. The paradise of milk and honey of which human beings dreamed – a land of perpetual peace and abundance – belonged in religion and mythology rather than history or science. Yet by the end of the 19th century, the fiction of an ideal society had been turned into a realisable human condition. Already in the second half of the 18th century, Rousseau was writing of an egalitarian society as if something of the kind had once existed – a move repeated by Marx and Engels in their theory of primitive communism, which they believed could be recreated at a higher level. More’s non-existent land was given a veneer of science and situated in a non-existent future. Having been a literary genre, utopia became a political legend.

The Book of Legendary Lands covers a vast range of non-places, including a flat and a hollow earth, the Antipodes, the lands of Homer and the many versions of Cockaigne (where honey and bread fall from the sky and no one is rich or poor). A fascinating chapter deals with the far more recent invention of Rennes-le-Château, a French village near Carcassonne that has been hailed as a site of immense treasure and of a priory established by descendants of Jesus, who supposedly did not die on the cross but fled to France and began the Merovingian dynasty.

Presented by Eco in light and witty prose, these legendary places are made more vivid by many well-chosen illustrations and historic texts. Yet this is far from being another coffee table book, however beautiful. As in much of his work, Eco’s theme is the slippage from fiction to illusion in the human mind. Rightly he sees this as a perennial tendency but it is one that has gathered momentum in modern times. So-called primitive cultures understood that history runs in cycles, with civilisations rising and falling much as the seasons come and go – a view of things echoed in Aristotle and the Roman historians. The rise of monotheism changed the picture, so that history came to be seen as an unfolding drama – a story with a beginning, an end and a redemptive meaning. Either way, no one believed that history could be governed by human will. It was fate, God or mere chaos that ruled human events.

Legendary lands began to multiply when human beings started to believe they could shape the future. Non-places envisioned by writers in the past were turned into utopian projects. At the same time, literature became increasingly filled with visions of hellish lands. As Eco puts it, “Sometimes utopia has taken the form of dystopia, accounts of negative societies.”

What counts as a dystopia, however, is partly a matter of taste. Aldous Huxley may have meant Brave New World (1932) as a warning but I suspect many people would find the kind of world he describes – genetically engineered and drug-medicated but also without violence, poverty or acute unhappiness – quite an attractive prospect. If the nightmarish society Huxley imagines is fortunately impossible, it is because it is supposed to be capable of renewing itself endlessly – a feature of utopias and one of the clearest signs of their unreality.

Whether you think a vision of the future is utopian or not depends on how you view society at the present time. Given the ghastly record of utopian politics in the 20th century, bien-pensants of all stripes never tire of declaring that all they want is improvement. They assume that the advances of the past are now permanent and new ones can simply be added on. But if you think society today is like all others have been – deeply flawed and highly fragile – you will understand that improvement can’t be inherited in this way. Sooner or later, past advances are sure to be lost, as the societies that have inherited them decline and fail. As everyone understood until just a few hundred years ago, this is the normal course of history.

No bien-pensant will admit this to be so. Indeed, many find the very idea of such a reversal difficult to comprehend. How could the advances that have produced the current level of civilisation – including themselves – be only a passing moment in the history of the species? Without realising the fact, these believers in improvement inhabit a legendary land – a place where what has been achieved in the past can be handed on into an indefinite future. The human impulse to dream up imaginary places and then believe them to be real, which Eco explores in this enchanting book, is as strong as it has ever been.

John Gray is the lead book reviewer of the NS. His latest book, “The Silence of Animals: On Progress and Other Modern Myths”, is published by Allen Lane (£18.99)

]]>
Wed, 11 Dec 2013 15:42:42 -0800 http://www.newstatesman.com/culture/2013/11/no-place-home
<![CDATA[An Ontology of Everything on the Face of the Earth]]> http://www.alluvium-journal.org/2013/12/04/an-ontology-of-everything-on-the-face-of-the-earth/

This essay was originally published as part of a special issue of Alluvium Journal on Digital Metaphors, edited by Zara Dinnen and featuring contributions from Rob Gallagher and Sophie Jones.

John Carpenter’s 1982 film, The Thing, is a claustrophobic sci-fi thriller, exhibiting many hallmarks of the horror genre. The film depicts a sinister turn for matter, where the chaos of the replicating, cancerous cell is expanded to the human scale and beyond. In The Thing we watch as an alien force terrorises an isolated Antarctic outpost. The creature exhibits an awesome ability to imitate, devouring any creature it comes across before giving birth to an exact copy in a burst of blood and protoplasm. The Thing copies cell by cell and its process is so perfect – at every level of replication – that the resultant simulacrum speaks, acts and even thinks like the original. The Thing is so relentless, its copies so perfect, that the outpost’s Doctor, Blair, is sent mad at the implications:

Blair: If a cell gets out it could imitate everything on the face of the Earth… and it’s not gonna stop!!!

Based on John W. Campbell’s 1938 novella, Who Goes There?, Carpenter’s film revisits a gothic trope, as numerous in its incarnations as are the forms it is capable of taking. In Campbell’s original novella, the biologically impure is co-inhabited by a different type of infection: an infection of the Antarctic inhabitants’ inner lives. Plucked from an icy grave, The Thing sits, frozen solid, in a dark corner of the outpost, drip dripping towards re-animation. Before its cells begin their interstitial jump from alien to earthly biology, it is the dreams of the men that become infected:

‘So far the only thing you have said this thing gave off that was catching was dreams. I’ll go so far as to admit that.’ An impish, slightly malignant grin crossed the little man’s seamed face. ‘I had some, too. So. It’s dream-infectious. No doubt an exceedingly dangerous malady.’ (Campbell)

The Thing’s voracious drive to consume and imitate living beings calls to mind Freud’s uncanny: the dreadful creeping horror that dwells between homely and unhomely. According to Ernst Jentsch, whose work Freud references in his study, the uncanny is kindled, ‘when there is intellectual uncertainty whether an object is alive or not, and when an inanimate object becomes too much like an animate one’ (Grenville 233).

A body in the act of becoming: John W. Campbell’s novella depicts The Thing as a monstrous body that “swallows the world and is itself swallowed by the world”

In the original novella, The Thing is condensed as much from the minds of the men as from its own horrific, defrosting bulk. A slowly surfacing nightmare that acts to transform alien matter into earthly biology also has the effect of transferring the inner, mental lives of the men into the resultant condensation. John W. Campbell had no doubts that The Thing could become viscous, mortal human flesh, but in order to truly imitate its prey, the creature must infect and steal inner life too, pulling ghosts, kicking and screaming, out of their biological machines.

As a gothic figure, Campbell’s Thing disrupts the stable and integral vision of human being, of self-same bodies housing ‘unitary and securely bounded’ (Hurley 3) subjectivities, identical and extensive through time. John W. Campbell’s characters confront their anguish at being embodied: their nightmares are literally made flesh. As Kelly Hurley reminds us in her study on The Gothic Body, Mikhail Bakhtin noted:

The grotesque body… is a body in the act of becoming. It is never finished, never completed; it is continually built, created, and builds and creates another body. Moreover, the body swallows the world and is itself swallowed by the world (Hurley 28).

Each clone’s otherness is an uncanny exposure of the abject relationship we endure with ourselves as vicarious, fragmented, entropic forms. In the 44 years between the novella and John Carpenter’s 1982 film, there were many poor clones of The Thing depicted in cinema. Films such as Invasion of the Body Snatchers (1956) and It Came from Outer Space (1953) are replete with alien doppelgangers, abject human forms, cast away very much as in gothic tradition. Howard Hawks’s film, The Thing from Another World (1951), the first to explicitly translate Who Goes There?, completely disfigures Campbell’s story. The resultant monster is nothing more than what one character calls ‘an intellectual carrot’, grown from alien cells in a laboratory. The film is worth considering, though, for its Cold War undertones. Recast in an Arctic military base, Hawks’s Thing is an isolated monster set against a small, well organised army of cooperative men. Faced with disaster the men group together, fighting for a greater good than each of them alone represents.

Cinematic clones of The Thing: 1950s American Science Fiction films like It Came From Outer Space and Invasion of the Body Snatchers are replete with alien doppelgangers and abject human forms [Images used under fair dealings provisions]

The metaphor of discrete cells coordinating into autopoietic organisms does not extend to the inhabitants of the isolated Antarctic outpost, either in the original short story or in the 1982 version. Rather than unite against their foe, they begin to turn on each other, never knowing who might be The Thing. In a series of enactments of game theory, the characters do piece together a collective comprehension: that if The Thing is to eventually imitate ‘everything on the face of the Earth’ it must not show itself now, lest the remaining humans group together and destroy it. The Thing’s alien biology calls to mind the original design of the internet, intended, according to Michael Hardt and Antonio Negri:

…to withstand military attack. Since it has no center and almost any portion can operate as an autonomous whole, the network can continue to function even when part of it has been destroyed. The same design element that ensures survival, the decentralisation, is also what makes control of the network so difficult (Hardt and Negri 299).

The novella Who Goes There? and the film The Thing sit either side of a pivotal era in the advancement of information technology. How a life form or a biological computer works is immaterial to the behaviours it presents to an observer. John Carpenter’s The Thing explores the fulfilment of Alan Turing’s ‘Imitation Game.’ Moving away from Campbell’s original appeal to telepathy and a mind/body split, the materialist vision of Carpenter’s film confronts us with a more fundamental horror: that every part of us is reducible to every other.
In her book Refiguring Life, Evelyn Fox Keller argues that:

As a consequence of the technological and conceptual transformations we have witnessed in the last three decades, the body itself has been irrevocably transformed… The body of modern biology, like the DNA molecule – and also like the modern corporate or political body – has become just another part of an informational network, now machine, now message, always ready for exchange, each for the other (Keller 117–118).

Meanwhile, eschewing Martin Heidegger's definition of a thing (in which objects are brought out of the background of existence through human use), Bill Brown marks the emergence of things through the encounter:

As they circulate through our lives… we look through objects because there are codes by which our interpretive attention makes them meaningful, because there is a discourse of objectivity that allows us to use them as facts. A thing, in contrast, can hardly function as a window. We begin to confront the thingness of objects when they stop working for us… (Brown 4).

A thing or an object? Bill Brown argues that we look through objects but are confronted by things [Image by Marc PhOtOnQuAnTiQuE under a CC BY-NC-ND license]

In his infamous 1950 paper, Computing Machinery and Intelligence, Alan Turing introduced the notion that a computer is nothing more than a machine that functions by pretending to be other machines (Turing). Asking the question 'can machines think?', Turing replaced the ambiguity of 'thought' and 'intelligence' with imitation, proposing a test that avoided the need to know what was going on inside a machine in favour of merely experiencing its effects. In a lecture entitled 'Can Digital Computers Think?', Turing expounds his point:

It is not difficult to design machines whose behaviour appears quite random to anyone who does not know the details of their construction. Naturally enough the inclusion of this random element, whichever technique is used, does not solve our main problem, how to programme a machine to imitate a brain, or as we might say more briefly, if less accurately, to think. But it gives us some indication of what the process will be like. We must not always expect to know what the computer is going to do. We should be pleased when the machine surprises us, in rather the same way as one is pleased when a pupil does something which he had not been explicitly taught to do (Shieber 114–115).

The mutability of Earthly life, its ability to err, to stumble upon novel strategies through random, blind chance, represents its most innate capacity. Biological life changes by mutation, passing those mutations on to the next generation, ad infinitum. The Thing, in opposition to this, can only become its other absolutely. There is no room for error, for mutation, for change or evolution: instead, the Thingly cadaver of Norris must protect its otherness in the only way it knows how: by transforming itself into a defensive form previously programmed and stored in its protoplasm.
In terms of creativity it cannot escape its programming. Turing's lecture hints at a further unsettling conclusion: that even though novel behaviour may be consistent with error, from appearances alone it is impossible to distinguish something ontologically novel from a behaviour which has been programmed to appear as such. The Thing is a Universal Turing Machine, a post-digital plasma, encoded with the biological ticker-tape of a thousand alien worlds. Put more simply, in the words of protagonist MacReady:

MacReady: Somebody in this camp ain't what he appears to be. [my emphasis]
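Turing's notion of a machine that pretends to be other machines can be sketched in a few lines (my illustration, not Turing's own formalism): one fixed simulator runs any machine handed to it as a description, so the description, like The Thing's protoplasm, is just data waiting to be enacted.

```python
# A minimal sketch of universality: a single fixed mechanism that
# "pretends to be" other machines by reading their descriptions as data.

def simulate(machine, tape):
    """Run any finite-state machine given as a transition table."""
    state = machine["start"]
    for symbol in tape:
        state = machine["rules"][(state, symbol)]
    return state in machine["accept"]

# One particular machine, supplied as data: it accepts strings
# containing an even number of 1s. The simulator never changes.
even_ones = {
    "start": "even",
    "accept": {"even"},
    "rules": {
        ("even", "0"): "even", ("even", "1"): "odd",
        ("odd", "0"): "odd",   ("odd", "1"): "even",
    },
}

print(simulate(even_ones, "1101"))  # -> False (three 1s)
print(simulate(even_ones, "1100"))  # -> True
```

Swap in a different transition table and the same `simulate` function becomes a different machine: imitation all the way down.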

The “Gothicity” of matter? The digital metaphor of the Thing reveals that through imitation computers confer humanity upon us [Image by 

]]>
Mon, 09 Dec 2013 10:34:38 -0800 http://www.alluvium-journal.org/2013/12/04/an-ontology-of-everything-on-the-face-of-the-earth/
<![CDATA[An Instagram short film]]> http://vimeo.com/79207239

Instagram is an incredible resource for all kinds of images. I wanted to create structure out of this chaos. The result is a crowd-sourced short film that shows the endless possibilities of social media. The video consists of 852 different pictures, from 852 different Instagram users. If you are one of them, shout and I will add you to the credits.

Music: The Black Keys – Gold on the Ceiling

iamthomasjullien.com

Special thanks to Amadeus Henhapl, Pieter van den Heuvel and Silje Lian, who pushed me to finish this short.

Credits: @palmalotiene
Cast: Thomas Jullien
Tags: instagram, stop-motion, travel, around the globe and social media

]]>
Sat, 23 Nov 2013 12:32:00 -0800 http://vimeo.com/79207239
<![CDATA[Rigid Implementation vs Flexible Materiality]]> http://machinemachine.net/text/research/rigid-implementation-vs-flexible-materiality

Wow. It's been a while since I updated my blog. I intend to get active again here soon, with regular updates on my research. For now, I thought it might be worth posting a text I've been mulling over for a while (!) Yesterday I came across this old TED presentation by Daniel Hillis, and it set off a bunch of bells tolling in my head. His book The Pattern on the Stone was one I leafed through a few months back whilst hunting for some analogies about (digital) materiality. The resulting brainstorm is what follows. (This blog post, from even longer ago, acts as a natural introduction: On (Text and) Exaptation)

In the 1960s and 70s Roland Barthes named "The Text" as a network of production and exchange. Whereas "the work" was concrete, final – analogous to a material – "the text" was more like a flow, a field or event – open ended. Perhaps even infinite. In "From Work to Text", Barthes wrote:

The metaphor of the Text is that of the network… (Barthes 1979)

This semiotic approach to discourse, by initiating the move from print culture to "text" culture, also helped lay the ground for a contemporary politics of content-driven media. Skipping backwards through "From Work to Text", we find this statement:

The text must not be understood as a computable object. It would be futile to attempt a material separation of works from texts.

I am struck here by Barthes' use of the phrase "computable object", as well as his attention to the "material". Katherine Hayles, in her essay "Print Is Flat, Code Is Deep" (Hayles 2004), teases out the statement for us:

'computable' here mean[s] to be limited, finite, bound, able to be reckoned. Written twenty years before the advent of the microcomputer, his essay stands in the ironic position of anticipating what it cannot anticipate. It calls for a movement away from works to texts, a movement so successful that the ubiquitous 'text' has all but driven out the media-specific term book.
Hayles notes that the "ubiquity" of Barthes' term "Text" allowed – in its wake – an erasure of media-specific terms, such as "book". In moving from The Work to The Text, we move not just between different politics of exchange and dissemination; we also move between different forms and materialities of mediation (Manovich 2002). For Barthes the material work was computable, whereas the network of the text – its content – was not.

In 1936, the year that Alan Turing wrote his iconic paper 'On Computable Numbers', a German engineer by the name of Konrad Zuse began building the first working digital computer. Like its industrial predecessors, Zuse's computer was designed to function via a series of holes encoding its program. Born as much out of convenience as financial necessity, Zuse punched his programs directly into discarded reels of 35mm film stock. Fused together by the technologies of weaving and cinema, Zuse's computer announced the birth of an entirely new mode of textuality. The Z3, the world's first working programmable, fully automatic computer, arrived in 1941 (Manovich 2002). A few years earlier a young graduate by the name of Claude Shannon had published one of the most important Masters theses in history. In it he demonstrated that any logical expression of Boolean algebra could be programmed into a series of binary switches. Today computers still function with a logic impossible to distinguish from that of their mid-20th-century ancestors. What has changed is the material environment within which Boolean expressions are implemented. Shannon's work first found itself manifest in the fragile rows of vacuum tubes that drove much of the technical innovation of the 40s and 50s. In time, the very same Boolean expressions were firing, domino-like, through millions of transistors etched onto the surface of silicon chips. If we were to query the young Shannon today, he might well gawp in amazement at the material advances computer technology has gone through. But if Shannon were to examine either your digital wristwatch or the world's most advanced supercomputer in detail, he would once again feel at home among the simple binary – on/off – switches lining those silicon highways. Here the difference between how computers are implemented and what computers are made of digs the first of many potholes along our journey.
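Shannon's insight can be illustrated with a toy sketch (mine, not drawn from the thesis itself): Boolean switch functions composed into a half-adder. The point is that the logic is indifferent to its substrate; these `AND`/`OR`/`NOT` functions could equally be relays, vacuum tubes, transistors, or tinker-toys.

```python
# Boolean expressions realised as networks of two-state switches.
# The gate names and half-adder example are illustrative, not Shannon's.

def AND(a, b): return a & b
def OR(a, b):  return a | b
def NOT(a):    return 1 - a

def half_adder(a, b):
    """Add two bits using only switch logic; returns (sum, carry)."""
    s = OR(AND(a, NOT(b)), AND(NOT(a), b))  # XOR built from AND/OR/NOT
    carry = AND(a, b)
    return s, carry

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", half_adder(a, b))
# e.g. 1 + 1 gives sum 0, carry 1
```

Chain enough of these and you have arithmetic; chain enough arithmetic and you have a computer, whatever the switches are made of.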
We live in an era not only practically driven by the computer, but one increasingly determined by the metaphors computers have injected into our language. Let us not make the mistake of presupposing that brains (or perhaps minds) are "like" computers. Tempting though it is to reduce the baffling complexities of the human being to the functions of the silicon chip, the parallel processor or the Wide Area Network, this reduction occurs most usefully at the level of metaphor and metonym. Again the mantra must be repeated that computers function through the application of Boolean logic and binary switches, something that cannot be said of the human brain with any confidence a posteriori. Later I will explore the consequences for our understanding of ourselves enabled by the processing paradigm, but for now, or at least for the next few paragraphs, computers are to be considered in terms of their rigid implementation and flexible materiality alone.

At the beginning of his popular science book, The Pattern on the Stone (Hillis 1999), W. Daniel Hillis narrates one of his many tales about the design and construction of a computer. Built from tinker-toys, the computer in question was/is functionally complex enough to "play" tic-tac-toe (noughts and crosses). The tinker-toy was chosen to indicate the apparent simplicity of computer design, but as Hillis argues himself, he may very well have used pipes and valves to create a hydraulic computer, driven by water pressure, or stripped the design back completely, using flowing sand, twigs and twine, or any other recipe of switches and connectors. The important point is that the tinker-toy tic-tac-toe computer functions perfectly well for the task it is designed for – perfectly well, that is, until the tinker-toy material begins to fail. This failure is what Chapter 1 of this thesis is about: why it happens, why its happening is a material phenomenon, and how the very idea of "failure" is suspect.
Tinker-toys fail because the mechanical operation of the tic-tac-toe computer puts strain on the strings of the mechanism, eventually stretching them beyond practical use. In a perfect world, devoid of entropic behaviour, the tinker-toy computer might very well function forever, its users setting O or X conditions and the computer responding according to its program in perfect, logical order. The design of the machine, at the level of the program, is completely closed; finished; perfect. Only materially does the computer fail (or flail), noise leaking into the system until inevitable chaos ensues and the tinker-toys crumble back into jumbles of featureless matter. This apparent closure is important to note at this stage because in a computer as simple as the tic-tac-toe machine, every variable can be accounted for and thus programmed for. Were we to build a chess-playing computer from tinker-toys (pretending we could get our hands on the, no doubt, millions of tinker-toy sets we'd need), the closed condition of the computer may be less simple to qualify. Tinker-toys, hydraulic valves or whatever material you choose could be manipulated into any computer system you can imagine; even the most brain-numbingly complicated IBM supercomputer is technically possible to build from these fundamental materials. The reason we don't do this, why we instead choose etched silicon as the material for our supercomputers, exposes another aspect of computers we need to understand before their failure becomes a useful paradigm. A chess-playing computer is probably impossible to build from tinker-toys, not because its program would be too complicated, but because tinker-toys are too prone to entropy to create a valid material environment.
The program of any chess-playing application could, theoretically, be translated into a tinker-toy equivalent, but after the 1,000th string had stretched, with millions more to go, no energy would be left in the system to trigger the next switch along the chain. Computer inputs and outputs are always at the mercy of this kind of entropy, whether in tinker-toys or miniature silicon highways. Noise and dissipation are inevitable at any material scale one cares to examine. The second law of thermodynamics ensures this. Claude Shannon and his ilk knew this, even back when the most advanced computers at their command couldn't yet play tic-tac-toe. They knew that they couldn't rely on materiality to delimit noise, interference or distortion; that no matter how well constructed a computer is, no matter how incredible it was at materially stemming entropy (perhaps with stronger string connectors, or a built-in de-stretching mechanism), entropy was nonetheless inevitable. But what Shannon and other computer innovators such as Alan Turing also knew is that their saviour lay in how computers were implemented. Again, the split here is incredibly important to note:

Flexible materiality: how and of what a computer is constructed, e.g. tinker-toys, silicon.

Rigid implementation: Boolean logic enacted through binary on/off switches (usually with some kind of input → storage → feedback/program function → output). Effectively, how a computer works.

Boolean logic was not enough on its own. Computers, if they were to avoid entropy ruining their logical operations, needed to have built within them an error-management protocol. This protocol is still in existence in EVERY computer in the world. Effectively it takes the form of a collection of parity bits delivered alongside each packet of data that computers, networks and software deal with. The bulk of the data contains the binary bits encoding the intended quarry, but the receiving element in the system also checks the main bits against the parity bits to determine whether any noise has crept into the system. What is crucial to note here is that the error-checking of computers happens at the level of their rigid implementation. It is also worth noting that for every eight 0s and 1s delivered by a computer system, at least one of those bits is an error-checking function. W. Daniel Hillis puts the stretched strings of his tinker-toy mechanism into clear distinction and, in doing so, re-introduces an umbrella term set to dominate this chapter:

I constructed a later version of the Tinker Toy computer which fixed the problem, but I never forgot the lesson of the first machine: the implementation technology must produce perfect outputs from imperfect inputs, nipping small errors in the bud. This is the essence of digital technology, which restores signals to near perfection at every stage. It is the only way we know – at least, so far – for keeping a complicated system under control. (Hillis 1999, 18)

Bibliography

Barthes, Roland. 1979. 'From Work to Text.' In Textual Strategies: Perspectives in Poststructuralist Criticism, ed. Josue V. Harari, 73–81. Ithaca, NY: Cornell University Press.

Hayles, N. Katherine. 2004. 'Print Is Flat, Code Is Deep: The Importance of Media-Specific Analysis.' Poetics Today 25 (1) (March): 67–90. doi:10.1215/03335372-25-1-67.

Hillis, W. Daniel. 1999. The Pattern on the Stone: The Simple Ideas That Make Computers Work. 1st paperback ed. New York: Basic Books.

Manovich, Lev. 2002. The Language of New Media. 1st MIT Press pbk. ed. Cambridge, Mass.: MIT Press.
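The parity-bit protocol described in this post can be sketched in a few lines (a simplified illustration, not any specific system's implementation: real layouts vary, and a single parity bit detects only an odd number of flipped bits).

```python
# Even parity over seven data bits: the eighth bit exists solely
# as an error-checking function, as described above.

def add_parity(bits7):
    """Append a parity bit so the total number of 1s is even."""
    return bits7 + [sum(bits7) % 2]

def check(bits8):
    """Receiver's test: any single flipped bit breaks even parity."""
    return sum(bits8) % 2 == 0

packet = add_parity([1, 0, 1, 1, 0, 0, 1])
print(check(packet))   # -> True: no noise crept in

packet[3] ^= 1         # noise flips one bit in transit
print(check(packet))   # -> False: the error is detected
```

Detection is the rigid implementation's first line of defence against flexible materiality; richer codes (Hamming codes, checksums) extend the same principle to locate and correct errors.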

]]>
Thu, 07 Jun 2012 06:08:07 -0700 http://machinemachine.net/text/research/rigid-implementation-vs-flexible-materiality
<![CDATA[Noise; Mutation; Autonomy: A Mark on Crusoe’s Island]]> http://machinemachine.net/text/research/a-mark-on-crusoes-island

This mini-paper was given at the Escapologies symposium, at Goldsmiths University, on the 5th of December.

Daniel Defoe's 1719 novel Robinson Crusoe centres on the shipwreck and isolation of its protagonist. The life Crusoe knew beyond this shore was fashioned by ships sent to conquer New Worlds and political wills built on slavery and imperial demands. In writing about his experiences, Crusoe orders his journal not by the passing of time, but by the objects produced in his labour – a microcosm of the market hierarchies his seclusion removes him from: a tame herd of goats, a musket and gunpowder, sheaves of wheat he fashions into bread, and a shelter carved from rock with all the trappings of a King's castle. Crusoe structures the tedium of the island by gathering and designing these items, which exist solely for their use-value: "In a Word, The Nature and Experience of Things dictated to me upon just Reflection, That all the good Things of this World, are no farther good to us, than they are for our Use…" [1]

Although Crusoe's Kingdom mirrors the imperial British order, its mirroring is more structural than anything else. The objects and social contrivances Crusoe creates have no outside with which to be exchanged. Without an 'other' to share your labour there can be no mutual assurance, no exchanges leading to financial agreements, no business partners, no friendships. But most importantly to the mirroring of any Kingdom, without an 'other' there can be no disagreements, no coveting of a neighbour's ox, no domination, no war: in short, an Empire without an outside might be complete, total, final, but an Empire without an outside has also reached a state of complete inertia. Crusoe's Empire of one subject is what I understand as "a closed system"… The 2nd law of thermodynamics maintains that without an external source of energy, all closed systems will tend towards a condition of inactivity.
Eventually, the bacteria in the petri dish will multiply, eating up all the nutrients until a final state of equilibrium is reached, at which point the system will collapse in on itself: entropy cannot be avoided indefinitely. The term 'negative entropy' is often applied to living organisms because they seem to be able to 'beat' the process of entropy, but this is as much an illusion as the illusion of Crusoe's Kingdom: negative entropy occurs at small scales, over small periods of time. Entropy is highly probable: the order of living beings is not. Umberto Eco:

"Consider, for example, the chaotic effect… of a strong wind on the innumerable grains of sand that compose a beach: amid this confusion, the action of a human foot on the surface of the beach constitutes a complex interaction of events that leads to the statistically very improbable configuration of a footprint." [2]

The footprint in Eco's example is a negative entropy event: the system of shifting sands is lent a temporary order by the cohesive action of the human foot. In physical terms, the footprint stands as a memory of the foot's impression. The 2nd law of thermodynamics establishes a relationship between entropy and information: memory remains as long as its mark. Given time, the noisy wind and chaotic waves will cause even the strongest footprint to fade. A footprint is a highly improbable event. Before you read on, watch this scene from Luis Buñuel's Robinson Crusoe (1954):

The footprint, when it first appears on the island, terrifies Crusoe as a mark of the outsider, but soon, realising what this outsider might mean for the totality of his Kingdom, Robinson begins the process of pulling the mark inside his conceptions:

"Sometimes I fancied it must be the Devil; and reason joined in with me upon this supposition. For how should any other thing in human shape come into the place? Where was the vessel that brought them? What marks were there of any other footsteps? And how was it possible a man should come there?" [3]

In the novel, it is only on the third day that Crusoe re-visits the site to compare his own foot with the print. The footprint is still there on the beach after all this time, a footprint Crusoe now admits is definitely not his own. This chain of events affords us several allegorical tools: firstly, that of the Devil, which Crusoe believes to be the only rational explanation for the print. This land, which has been Crusoe's own for almost two decades, is solid, unchanging and eternal. Nothing comes in nor goes beyond its shores, yet its abundance of riches has served Crusoe perfectly well: seemingly infinite riches for a Kingdom's only inhabitant. Even the footprint, left for several days, remains upon Crusoe's return. Like the novel of which it is a part, the reader of the mark may revisit the site of this unlikely incident again and again, each time drawing more meanings from its appearance. Before Crusoe entertains the idea that the footprint might belong to "savages of the mainland", he eagerly believes it to be Satan's, placed there deliberately to fool him. Crusoe revisits the footprint, in person and then, as it fades, in his own memory. He 'reads' the island, attributing meanings to marks he discovers that go far beyond what is apparent.
As Susan Stewart has noted: "In allegory the vision of the reader is larger than the vision of the text; the reader dreams to an excess, to an overabundance." [4] Simon O'Sullivan, following Deleuze, takes this further, arguing that in his isolation, a world free from 'others', Crusoe has merged with, become, the island. The footprint is a mark that must be recuperated if Crusoe's identity, his "power of will", is to be maintained. An outsider must have caused the footprint, but Crusoe is only capable of reading in the mark something about himself. The evocation of a Demon, then, is Crusoe's way of re-totalising his Empire, of removing the 'other' from his self-subjective identification with the island.

So, how does this relate to thermodynamics? To answer that I will need to tell the tale of a second Demon, more playful even than Crusoe's. In his 1871 book, Theory of Heat, James Clerk Maxwell designed a thought experiment to test the 2nd law of thermodynamics. Maxwell imagines a microscopic being able to sort the atoms bouncing around a closed system into two categories: fast and slow. If such a creature did exist, it was argued, no work would be required to decrease the entropy of a closed system. By sorting unlikely footprints from the chaotic arrangement of sand particles, Maxwell's Demon, as it would later become known, appeared to contradict the law Maxwell himself had helped to develop. One method of solving the apparent paradox was devised by Charles H. Bennett, who recognised that the Demon would have to remember where he placed the fast and slow particles. Here, once again, the balance between the order and disorder of a system comes down to the balance between memory and information. As the Demon decreases the entropy of its environment, so it must increase the entropy of its memory. The information required by the Demon acts like a noise in the system.
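Bennett's resolution can be caricatured in a few lines of code (my own toy model, not Bennett's): a demon that sorts molecules by speed must write one bit to memory per decision, so the order it imposes on the gas is balanced by the growth of its own tape.

```python
import random

# A toy Maxwell's Demon: sort molecules into fast and slow chambers,
# recording one bit of memory per sorting decision.

random.seed(1)
speeds = [random.random() for _ in range(1000)]  # a "gas" of 1000 molecules

memory = []            # the demon's tape: one bit per decision
fast, slow = [], []
for v in speeds:
    bit = 1 if v > 0.5 else 0
    memory.append(bit)              # remembering is unavoidable
    (fast if bit else slow).append(v)

# Sorting on a binary attribute removes at most one bit of entropy per
# molecule -- exactly what the demon's memory has had to absorb.
print(len(memory))     # -> 1000 bits recorded for 1000 molecules
```

Erasing that tape to start again costs work (Landauer's principle), which is how the second law survives the demon.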
The laws of physics had stood up under scrutiny, resulting in a new branch of science we now know as 'Information Theory'. Maxwell's Demon comes from an old view of the universe, "fashioned by divine intervention, created for man and responsive to his will" [5]. Information Theory represents a threshold, a revelation that the "inhuman force of increasing entropy, [is] indifferent to man and uncontrollable by human will." [6] Maxwell's Demon shows that the law of entropy has only a statistical certainty, that nature orders only on small scales and that, despite any will to control, inertia will eventually be reached. Developed at the peak of the British Empire, thermodynamics was sometimes called "the science of imperialism", as Katherine Hayles has noted:

"…to thermodynamicists, entropy represented the tendency of the universe to run down, despite the best efforts of British rectitude to prevent it from doing so… The rhetoric of imperialism confronts the inevitability of failure. In this context, entropy represents an apparently inescapable limit on the human will to control." [7]

Like Maxwell, Crusoe posits a Demon, with faculties similar in kind to his own, to help him quash his "terror of mind". Crusoe's fear is not really about outsiders coming in; the terror he feels comes from the realisation that the outsiders may have been here all along, that in all the 20 years of his isolation those "savages of the mainland" may have visited his island time and again. It is not an outside 'other' that disturbs and reorganises Crusoe's Kingdom. A more perverse logic is at work here, and once again Crusoe will have to restructure his imperial order from the inside out. Before you read on, watch another scene from Luis Buñuel's Robinson Crusoe (1954):

Jacques Rancière prepares for us a parable. A student who is illiterate, after living a fulfilled life without text, one day decides to teach herself to read. Luckily she knows a single poem by heart and procures a copy of that poem, presumably from a trusted source, by which to work. By comparing her memory of the poem, sign by sign, word by word, with the text of the poem she can, Rancière believes, finally piece together a foundational understanding of her written language:

"From this ignoramus, spelling out signs, to the scientist who constructs hypotheses, the same intelligence is always at work – an intelligence that translates signs into other signs and proceeds by comparisons and illustrations in order to communicate its intellectual adventures and understand what another intelligence is endeavouring to communicate to it… This poetic labour of translation is at the heart of all learning." [8]

What interests me in Rancière's example is not so much the act of translation as the possibility of mis-translation. Taken in light of The Ignorant Schoolmaster, we can assume that Rancière is aware of the wide gap that exists between knowing something and knowing enough about something for it to be valuable. How does one calculate the value of a mistake? The ignoramus has an autonomy, but she is effectively blind to the quality and make-up of the information she parses. If she makes a mistake in her translation of the poem, this mistake can be one of two things: a blind error, or a mutation. In information theory, change within a closed system is understood as the product of 'noise'. The amount of change contributed by noise is called 'equivocation'. If noise contributes to the reorganisation of a system in a beneficial way, for instance if a genetic mutation in an organism results in the emergence of an adaptive trait, then the equivocation is said to be 'autonomy-producing'.
Too much noise is equivalent to too much information: a 'destructive' equivocation, leading to chaos. This balance is how evolution functions. An 'autonomy-producing' mutation will be blindly passed on to an organism's offspring, catalysing the self-organisation of the larger system (in this case, the species). All complex, 'autopoietic', systems inhabit this fine divide between noise and inertia. Given just the right balance of noise recuperated by the system and noise filtered out by the system, a state of productive change can be maintained and a state of inertia avoided, at least for a limited time. According to Umberto Eco, in The Open Work:

"To be sure, this word information in communication theory relates not so much to what you do say, as to what you could say… In the end… there is no real difference between noise and signal, except in intent." [9]

This rigid delineator of intent is the driving force of our contemporary communication paradigm. Information networks underpin our economic, political and social interactions: the failure to communicate is to be avoided at all costs. All noise is therefore seen as a problem. These processes define, according to W. Daniel Hillis, "the essence of digital technology, which restores signals to near perfection at every stage." [10] To go back to Umberto Eco, then, we appear to be living in a world of "do say" rather than "could say". Maintenance of the network and the routines of error management are our primary economic and political concerns: control the networks and the immaterial products will manage themselves. The modern network paradigm acts like a Maxwell Demon, categorising information as either pure signal or pure noise.
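Eco's distinction between what a channel does say and what it could say has a standard quantitative counterpart (my gloss, not Eco's): in information theory, equivocation is the uncertainty noise adds to a channel, and a binary symmetric channel makes the trade-off easy to compute.

```python
from math import log2

# Equivocation in a binary symmetric channel that flips each bit
# with probability `noise`. Standard textbook quantities.

def h(p):
    """Binary entropy in bits."""
    return 0.0 if p in (0, 1) else -p * log2(p) - (1 - p) * log2(1 - p)

for noise in (0.0, 0.11, 0.5):
    equivocation = h(noise)        # uncertainty the noise introduces
    capacity = 1 - equivocation    # what the channel *could* reliably say
    print(f"noise {noise}: equivocation {equivocation:.2f} bits, "
          f"capacity {capacity:.2f} bits per symbol")
# at noise 0.5 the equivocation is total: pure chaos, zero capacity
```

Between the extremes lies the productive zone the post describes: enough noise for mutation, not so much that the system dissolves.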
As Mark Nunes has noted, following the work of Deleuze and Guattari:

"This forced binary imposes a kind of violence, one that demands a rationalisation of all singularities of expressions within a totalising system… The violence of information is, then, the violence of silencing or making to speak that which cannot communicate." [11]

To understand the violence of this binary logic, we need go no further than Robinson Crusoe. Friday's questions are plain spoken, but do not adhere to the "do say" logic of Crusoe's conception. In the novel, Crusoe's approach to Friday becomes increasingly one-sided, until Friday utters little more than 'yes' and 'no' answers, "reducing his language to a pure function of immediate context and perpetuating a much larger imperialist tradition of levelling the vox populi." [12] Any chance of what Friday "could say" has been violently obliterated. The logics of Rancière's ignoramus and of Crusoe's levelling of Friday's speech are logics of imperialism: reducing the possibility of noise and information to an either/or, inside/outside relationship. Mark Nunes again:

"This balance between total flow and total control parallels Deleuze and Guattari's discussion of a regime of signs in which anything that resists systematic incorporation is cast out as an asignifying scapegoat 'condemned as that which exceeds the signifying regime's power of deterritorialisation.'" [13]

In the system of communication these "asignifying" events are not errors, in the common sense of the word. Mutation names a randomness that redraws the territory of complex systems. The footprint is the mark that reorganised the Empire. In Rancière's parable, rather than note her intent to decode the poem, we should hail the moment when the ignoramus fails as her autonomous moment.
In a world where actants "translate signs into other signs and proceed by comparison and illustration" [14], the figures of information and communication are made distinct not by the caprice of those who control the networks, nor the desires of those who send and receive the messages, but by mutation itself. Michel Foucault, remarking on the work of Georges Canguilhem, drew the conclusion that the very possibility of mutation, rather than existing in opposition to our will, was what human autonomy was predicated upon:

"In this sense, life – and this is its radical feature – is that which is capable of error… Further, it must be questioned in regard to that singular but hereditary error which explains the fact that, with man, life has led to a living being that is never completely in the right place, that is destined to 'err' and to be 'wrong'." [15]

In his writings on the history of heredity, The Logic of Life, François Jacob lingers on another Demon in the details, fashioned by René Descartes in his infamous meditation on human knowledge. Jacob positions Descartes' meditation in a period of explosive critical thought focussed on the very ontology of 'nature':

"For with the arrival of the 17th Century, the very nature of knowledge was transformed. Until then, knowledge had been grafted on God, the soul and the cosmos… What counted [now] was not so much the code used by God for creating nature as that sought by man for understanding it." [16]

The infinite power of God's will was no longer able to bend nature to any whim. If man were to decipher nature, to reveal its order, Descartes surmised, it was with the assurance that "the grid will not change in the course of the operation" [17]. For Descartes, the evil Demon is a metaphor for deception, espoused on the understanding that, underlying that deception, nature had a certainty. God may well have given the world its original impetus, have designed its original make-up, but that make-up could not be changed.
The network economy has today become the grid of operations onto which we map the world. Its binary restrictions predicate a logic of minimal error and maximum performance: a regime of control that drives our economic, political and social interdependencies. Trapped within his imperial logic, Robinson Crusoe’s levelling of inside and outside, his ruthless tidying of Friday’s noisy speech into a binary dialectic, disguises a higher order of reorganisation. As readers navigating the narrative we are keen to recognise the social changes Defoe’s novel embodies in its short-sighted central character. Perhaps, though, the most productive way to read this fiction is to allegorise it as an outside perspective on our own time. Gathering together the fruits of research, I am often struck by the serendipitous quality of so many discoveries. In writing this mini-paper I have found it useful to engage with these marks, which become like demonic footprints, mutations in my thinking. Comparing each side by side, I hope to find, in the words of Michel Foucault: “…a way from the visible mark to that which is being said by it and which, without that mark, would lie like unspoken speech, dormant within things.” [18]

References & Bibliography

[1] Daniel Defoe, Robinson Crusoe, Penguin Classics (London: Penguin Books, 2001).

[2] Umberto Eco, The open work (Cambridge: Harvard University Press, n.d.).

[3] Defoe, Robinson Crusoe.

[4] Susan Stewart, On longing: narratives of the miniature, the gigantic, the souvenir, the collection (Duke University Press, 1993).

[5] N. Katherine Hayles, “Maxwell’s Demon and Shannon’s Choice,” in Chaos bound: orderly disorder in contemporary literature and science (Cornell University Press, 1990).

[6] Ibid.

[7] Ibid.

[8] Jacques Rancière, The emancipated spectator (London: Verso, 2009).

[9] Umberto Eco, The open work (Cambridge: Harvard University Press, n.d.). (My emphasis)

[10] W. Daniel Hillis, The pattern on the stone: the simple ideas that make computers work, 1st ed. (New York: Basic Books, 1999).

[11] Mark Nunes, Error: glitch, noise, and jam in new media cultures (Continuum International Publishing Group, 2010).

[12] Susan Stewart, On longing: narratives of the miniature, the gigantic, the souvenir, the collection (Duke University Press, 1993).

[13] Nunes, Error.

[14] Rancière, The emancipated spectator.

[15] Michel Foucault, “Life: Experience and Science,” in Aesthetics, method, and epistemology (The New Press, 1999).

[16] François Jacob, The logic of life: a history of heredity; The possible and the actual (Penguin, 1989).

[17] Ibid.

[18] Michel Foucault, The order of things: an archaeology of the human sciences, 2003.

]]>
Wed, 07 Dec 2011 08:50:14 -0800 http://machinemachine.net/text/research/a-mark-on-crusoes-island
<![CDATA[Kipple and Things: How to Hoard and Why Not To Mean]]> http://machinemachine.net/portfolio/kipple-and-things

This paper (more of an essay, really) was originally delivered at the Birkbeck/London Consortium ‘Rubbish Symposium’, 30th July 2011. Living at the very limit of his means, Philip K. Dick, a two-bit, pulp sci-fi author, was having a hard time maintaining his livelihood. It was the 1950s and Dick was living with his second wife, Kleo, in a run-down apartment in Berkeley, California, surrounded by library books they, Dick later claimed, “could not afford to pay the fines on.” In 1956, Dick had a short story published in a brand new pulp magazine: Satellite Science Fiction. Entitled Pay for the Printer, the story contained a whole host of themes that would come to dominate his work. On an Earth gripped by nuclear winter, humankind has all but forgotten the skills of invention and craft. An alien, blob-like species known as the Biltong co-habit Earth with the humans. They have an innate ability to ‘print’ things, popping out copies of any object they are shown from their formless bellies. The humans are enslaved not simply because everything is replicated for them, but, in a twist Dick was to use again and again in his later works, as the Biltong grow old and tired, each copied object resembles the original less and less. Eventually everything emerges as an indistinct, black mush. The short story ends with the Biltong themselves decaying, leaving humankind on a planet full of collapsed houses, cars with no doors, and bottles of whiskey that taste like anti-freeze. In his 1968 novel Do Androids Dream of Electric Sheep? Dick gave a name to this crumbling, ceaseless disorder of objects: kipple. A vision of a pudding-like universe, in which obsolescent objects merge, featureless and identical, flooding every apartment complex from here to the pock-marked surface of Mars.
“No one can win against kipple,” Dick wrote: “It’s a universal principle operating throughout the universe; the entire universe is moving toward a final state of total, absolute kippleization.” In kipple, Dick captured the process of entropy, and put it to work to describe the contradictions of mass-production and utility. Saved from the wreckage of the nuclear apocalypse, a host of original items – lawn mowers, woollen sweaters, cups of coffee – are in short supply. Nothing ‘new’ has been made for centuries. The Biltong must produce copies from copies made of copies – each replica, seeded with errors, will eventually resemble kipple. Objects – things – are mortal; transient. The wrist-watch functions to mark the passing of time, until it finally runs down and becomes a memory of a wrist-watch: a skeleton, an icon, a piece of kipple. The butterfly emerges from its pupa in order to pass on its genes to another generation of caterpillars. Its demise – its kipple-isation – is programmed into its genetic code: an inevitable consequence of the cosmic lottery of biological inheritance. Both the wrist-watch and the butterfly have fulfilled their functions: I utilised the wrist-watch to mark time; the ‘genetic lottery’ utilised the butterfly to extend its lineage. Entropy is absolutely certain, and pure utility will always produce it. In his book Genesis, Michel Serres argues that objects are specific to the human lineage. Specific, not because of their utility, but because they indicate our drive to classify, categorise and order: “The object, for us, makes history slow.” Before things become kipple, they stand distinct from one another. Nature seems to us defined in a similar way: between a tiger and a zebra there appears a broad gap, indicated in the creatures’ inability to mate with one another; indicated by the claws of the tiger and the hooves of the zebra.
But this gap is an illusion, as Michel Foucault neatly points out in The Order of Things: “…all nature forms one great fabric in which beings resemble one another from one to the next…” The dividing lines indicating categories of difference are always unreal, removed as they are from the ‘great fabric’ of nature, and understood through human categories isolated in language. Humans themselves are constituted by this great fabric: our culture and language lie on the same fabric. Our apparent mastery over creation comes from one simple quirk of our being: the tendency we exhibit to categorise, to cleave through the fabric of creation. For Philip K. Dick, this act is what separates us from the alien Biltong. They can merely copy, a repeated play of resemblance that will always degrade to kipple. Humans, on the other hand, can do more than copy. They can take kipple and distinguish it from itself, endlessly, through categorisation and classification. Far from using things until they run down, humans build new relations, new meanings, carefully and slowly from the mush. New categories produce new things, produce newness. At least, that’s what Dick – a Platonic idealist – believed. At the end of Pay for the Printer, a disparate group camp in the kipple-ised, sagging pudding of a formless city. One of the settlers has with him a crude wooden cup he has apparently cleaved himself with an even cruder, hand-made knife: “You made this knife?” Fergesson asked, dazed. “I can’t believe it. Where do you start? You have to have tools to make this. It’s a paradox!” In his essay, The System of Collecting, Jean Baudrillard makes a case for the profound subjectivity produced in this apparent production of newness. 
Once things are divested of their function and placed into a collection, they: “…constitute themselves as a system, on the basis of which the subject seeks to piece together [their] world, [their] personal microcosm.” The use-value of objects gives way to the passion of systematization, of order, sequence and the projected perfection of the complete set. In the collection, function is replaced by exemplification. The limits of the collection dictate a paradigm of finality; of perfection. Each object – whether wrist-watch or butterfly – exists to define new orders. Once the blue butterfly is added to the collection it stands, alone, as an example of the class of blue butterflies to which the collection dictates it belongs. Placed alongside the yellow and green butterflies, the blue butterfly exists to constitute all three as a series. The entire series itself then becomes the example of all butterflies. A complete collection: a perfect catalogue. Perhaps, like Borges’ Library of Babel, or Plato’s ideal realm of forms, there exists a room somewhere with a catalogue of everything. An ocean of examples. Cosmic disorder re-constituted and classified as a finite catalogue, arranged for the grand cosmic collector’s singular pleasure. The problem with catalogues is that absolutely anything can be collected and arranged. The zebra and the tiger may sit side-by-side if the collector is particularly interested in collecting mammals, striped quadrupeds or – a particularly broad collection – things that smell funny. Too much classification, too many cleaves in the fabric of creation, and order once again dissolves into kipple. Disorder arises when too many conditions of order have been imposed. William H. 
Gass reminds us of the linguistic conjunction ‘AND’, an absolute necessity in the cleaving of kipple into things: “[W]e must think of chaos not as a helter-skelter of worn-out and broken or halfheartedly realised things, like a junkyard or potter’s midden, but as a fluid mishmash of thinglessness in every lack of direction as if a blender had run amok. ‘AND’ is that sunderer. It stands between. It divides light from darkness.” Collectors gather things about them in order to exert a mastery over the apparent disorder of creation. The collector attains true mastery over their microcosm. The narcissism of the individual extends to the precise limits of the catalogue he or she has arranged about them. Without AND, language would function as nothing but pudding, each clause, condition or acting verb leaking into its partner, in an endless series. But the problem with AND, with classes, categories and order is that they can be cleaved anywhere. Jorge Luis Borges exemplified this perfectly in a series of fictional lists he produced throughout his career. The most infamous list, which Michel Foucault claimed influenced him to write The Order of Things, refers to a “certain Chinese encyclopaedia” in which animals are divided into:

belonging to the Emperor, embalmed, tame, sucking pigs, sirens, fabulous, stray dogs, included in the present classification, frenzied, innumerable, drawn with a very fine camelhair brush, et cetera, having just broken the water pitcher, that from a long way off look like flies…

In writing about his short story The Aleph, Borges also remarked: “My chief problem in writing the story lay in… setting down of a limited catalog of endless things. The task, as is evident, is impossible, for such a chaotic enumeration can only be simulated, and every apparently haphazard element has to be linked to its neighbour either by secret association or by contrast.” No class of things, no collection, no cleaving of kipple into non-kipple can escape the functions of either “association OR contrast…” The lists Borges compiled are worthy of note because they remind us of the binary contradiction classification always comes back to:

Firstly, that all collections are arbitrary; and secondly, that a perfect collection of things is impossible, because, in the final instance, there is only pudding “…in every lack of direction…”

Human narcissism – our apparent mastery over kipple – is an illusion. Collect too many things together, and you re-produce the conditions of chaos you tried so hard to avoid. When the act of collecting comes to take precedence over the microcosm of the collection, when the differentiation of things begins to break down, collectors cease being collectors and become hoarders. The hoard exemplifies chaos: the very thing the collector builds their catalogues in opposition to. To tease apart what distinguishes the hoarder from the collector, I’d like to introduce two new characters into this arbitrary list I have arranged about myself. Some of you may have heard of them; indeed, they are the brothers after whom the syndrome of compulsive hoarding is named.

The brothers Homer and Langley Collyer lived in a mansion at 2078 Fifth Avenue, Manhattan. Sons of wealthy parents – their father was a respected gynaecologist, their mother a renowned opera singer – the brothers both attended Columbia University, where Homer studied law and Langley engineering. In 1933 Homer suffered a stroke which left him blind and unable to work at his law firm. As Langley began to devote his time entirely to looking after his helpless brother, both men became locked inside the mansion their family’s wealth and prestige had delivered. Over the following decade or so Langley would leave the house only at night. Wandering the streets of Manhattan, collecting water and provisions to sustain his needy brother, Langley developed obsessive routines that gave his life a meaning above and beyond the streets of Harlem, which were fast becoming run-down and decrepit. But the clutter only went one way: into the house, and, as the interest from the New York newspaper media shows, the Collyer brothers and their crumbling mansion became something of a legend in a fast-changing city. On March 21st 1947 the New York Police Department received an anonymous tip-off that there was a dead body in the Collyer mansion. Attempting to gain entry, police smashed down the front door, only to be confronted with a solid wall of newspapers (which, Langley had claimed to reporters years earlier, his brother “would read once his eyesight was restored”). Finally, after climbing in through an upstairs window, a patrolman found the body of Homer – now 65 years old – slumped dead in his kippleised armchair. In the weeks that followed, police removed one hundred and thirty tons of rubbish from the house. Langley’s body was eventually discovered crushed and decomposing under an enormous mound of junk, lying only a few feet from where Homer had starved to death.
Crawling through the detritus to reach his ailing brother, Langley had triggered one of his own booby traps, set in place to catch any robbers who attempted to steal the brothers’ clutter. The list of objects pulled from the brothers’ house reads like a Borges original. From Wikipedia: Items removed from the house included baby carriages, a doll carriage, rusted bicycles, old food, potato peelers, a collection of guns, glass chandeliers, bowling balls, camera equipment, the folding top of a horse-drawn carriage, a sawhorse, three dressmaking dummies, painted portraits, pinup girl photos, plaster busts, Mrs. Collyer’s hope chests, rusty bed springs, a kerosene stove, a child’s chair, more than 25,000 books (including thousands about medicine and engineering and more than 2,500 on law), human organs pickled in jars, eight live cats, the chassis of an old Model T Ford, tapestries, hundreds of yards of unused silks and fabric, clocks, 14 pianos (both grand and upright), a clavichord, two organs, banjos, violins, bugles, accordions, a gramophone and records, and countless bundles of newspapers and magazines. Finally: “There was also a great deal of rubbish.” A Time Magazine obituary from April 1947 said of the Collyer brothers: “They were shy men, and showed little inclination to brave the noisy world.” In a final ironic twist of kippleisation, the brothers themselves became mere examples within the system of clutter they had amassed. Langley especially had hoarded himself to death. His body, gnawed by rats, was hardly distinguishable from the kipple that fell on top of it. The noisy world had been replaced by the noise of the hoard: a collection so impossible to conceive, to cleave, to order, that it had dissolved once more to pure, featureless kipple. Many hoarders achieve a similar fate to the Collyer brothers: their clutter eventually wiping them out in one final collapse of systemic disorder. To finish, I want to return briefly to Philip K. Dick.
In the 1960s, fuelled by amphetamines and a debilitating paranoia, Dick wrote 24 novels and hundreds of short stories, the duds and the classics mashed together into an indistinguishable hoard. Ubik, published in 1969, tells of a world which is itself degrading. Objects regress to previous forms: 3D televisions turn into black-and-white tube-sets, then stuttering reel-to-reel projections; credit cards slowly change into handfuls of rusted coins, impressed with the faces of Presidents long since deceased. Turning his back for a few minutes, a character finds his hover vehicle has degraded into a bi-propeller airplane. The Three Stigmata of Palmer Eldritch, a stand-out novel from 1965, begins with this memo, “dictated by Leo Bulero immediately on his return from Mars”: “I mean, after all; you have to consider we’re only made out of dust. That’s admittedly not much to go on and we shouldn’t forget that. But even considering, I mean it’s a sort of bad beginning, we’re not doing too bad. So I personally have faith that even in this lousy situation we’re faced with we can make it. You get me?”

]]>
Sun, 31 Jul 2011 10:28:32 -0700 http://machinemachine.net/portfolio/kipple-and-things
<![CDATA[Biomathematics: The formula of life]]> http://www.newstatesman.com/ideas/2011/04/viruses-essay-pattern

Biology used to be about plants, animals and insects, but five great revolutions have changed the way that scientists think about life: the invention of the microscope, the systematic classification of the planet's living creatures, evolution, the discovery of the gene and the structure of DNA. Now, a sixth is on its way - mathematics.

Maths has played a leading role in the physical sciences for centuries, but in the life sciences it was little more than a bit player, a routine tool for analysing data. However, it is moving towards centre stage, providing new understanding of the complex processes of life.

The ideas involved are varied and novel; they range from pattern formation to chaos theory. They are helping us to understand not just what life is made from, but how it works, on every scale from molecules to the entire planet - and possibly beyond.

]]>
Wed, 11 May 2011 03:32:59 -0700 http://www.newstatesman.com/ideas/2011/04/viruses-essay-pattern