MachineMachine /stream - search for difference https://machinemachine.net/stream/feed en-us http://blogs.law.harvard.edu/tech/rss LifePress therourke@gmail.com <![CDATA[Humans evolved for punching, study confirms - Big Think]]> https://bigthink.com/the-present/men-evolution-punching/

According to biologist David Carrier of the University of Utah, “In mammals in general, the difference between males and females is often greatest in the structures that are used as weapons.”

]]>
Tue, 18 Apr 2023 00:52:34 -0700 https://bigthink.com/the-present/men-evolution-punching/
<![CDATA[Charles Murray Returns, Nodding to Caution but Still Courting Controversy - The New York Times]]> https://www.nytimes.com/2020/02/12/books/review-human-diversity-charles-murray.html

Yes, that Charles Murray, who in 1994 co-authored “The Bell Curve,” with Richard J. Herrnstein, arguing in two notorious chapters that I.Q. differences between the races were mostly innate and mostly intractable. (They allowed that environmental factors play a part in I.Q.

]]>
Sat, 15 Feb 2020 06:06:12 -0800 https://www.nytimes.com/2020/02/12/books/review-human-diversity-charles-murray.html
<![CDATA[PhD Thesis: The Practice of Posthumanism]]> http://research.gold.ac.uk/26601/

Post-humanism is best understood as several overlapping and interrelated fields coming out of the traditions of anti-humanism, post-colonialism, and feminist discourse. But the term remains contested, both by those who wish to overturn, or even destroy, the ‘humanism’ after that decisive hyphen (post-humanists), and those engaged in the project of maximising their chance of merging with technologies, and reaching a supposed point of transition, when the current ‘human’ has been augmented, upgraded, and surpassed (transhumanists). For both those who wish to move beyond ‘humanism’, and those who wish to transcend ‘the human’, there remains a significant, shared, problem: the supposed originary separations, between information and matter, culture and nature, mankind and machine, singular and plural, that post-humanism seeks to problematise, and transhumanism often problematically ignores, lead to the delineation of ‘the human’ as a single, universalised figure. This universalism erases the pattern of difference, which post-humanists see as both the solution to, and the problem of, the human paradigm. This thesis recognises this problem as an ongoing one, and one which – for those who seek to establish posthumanism as a critical field of enquiry – can never be claimed to be finally overcome, lest the same problem of universalism rear its head again.

To tackle this problem, this thesis also enters into the complex liminal space where the terms ‘human’ and ‘humanism’ confuse and interrupt one another, but rather than delineate the same boundaries (as transhumanists have done), or lay claim over certain territories of the discourse (as post-humanists have done), this thesis implicates itself, myself, and yourself in the relational becoming posthuman of which we, and it, are co-constituted. My claim is that critical posthumanism must be the action it infers onto the world, of which it is not only part, but in mutual co-constitution with.

The Practice of Posthumanism claims that critical posthumanism must be enacted in practice, and stages itself as an example of that process, through a hybrid theoretical and practice-based becoming. It argues that posthumanism is necessarily a vibrant, lively process being undergone, and as such, that it cannot be narrativized or referred to discursively without collapsing that process back into a static, universalised delineation once again. It must remain in practice, and as such, this thesis enacts the process of which it itself is a principal paradigm.

After establishing the critical field termed ‘posthumanism’ through analyses of associated discourses such as humanism and transhumanism, each of the four written chapters and the hybrid conclusion/portfolio of work is enacted through a ‘figure’ which speaks to certain monstrous dilemmas posed by thinkers of the posthuman. These five figures are: The Phantom Zone, Crusoe’s Island, The Thing, The Collapse of The Hoard, and The 3D Printer (#Additivism). Each figure – echoing Donna Haraway – ‘resets the stage for possible pasts and futures’ by calling into question the fictional/theoretical ground upon which it is predicated. Considered together, the dissertation and conclusion/portfolio of work position critical posthumanism as a hybrid ‘other’, my claim being that only through representing the human as and through an ongoing process (ontogenesis rather than ontology) can posthumanism re-conceptualise the ‘norms’ deeply embedded within the fields it confronts.

The practice of critical posthumanism this thesis undertakes is inherently a political project, displacing and disrupting the power dynamics which are co-opted in the hierarchical structuring of individuals within ‘society’, of categories within ‘nature’, of differences which are universalised in the name of the ‘human’, as well as the ways in which theory delineates itself into rigid fields of study. By confounding articulations of the human in fiction, theory, science, media, and art, this practice in practice enacts its own ongoing, ontogenetic becoming; the continual changing of itself, necessary to avoid a collapse into new absolutes and universals.

]]>
Thu, 08 Aug 2019 05:56:23 -0700 http://research.gold.ac.uk/26601/
<![CDATA[Naked]]> https://vimeo.com/12251595

A surgeon and his assistant wheel a bed with a panda on it into the operating room of a large hospital in Sweden. This is not a flesh-and-blood panda, however, but a mechanical toy. With great concentration, the surgeon starts to operate, painstakingly removing layer after layer of the furry covering, followed by the soft filling. While the surgeon searches for the plastic core in the operational innards, the panda doggedly resists its deconstruction. The noises it emits sound more animal than machine-made, reinforcing the absurd nature of this clash between a living being and a machine. The film's calm, serious, observational style invites us to consider a number of current cultural issues. What is the difference between man and the technology he designs? To what extent are our flesh-and-blood bodies becoming technological bodies? And if, at some point in the future, we all have a technological body, will the operations of the future be like this one, on this pitiably whimpering panda? These are questions artist Tove Kjellmark also addresses in her other work. (IDFA)

]]>
Sun, 19 Aug 2018 10:22:02 -0700 https://vimeo.com/12251595
<![CDATA[The Revolution Will Be Weird and Eerie - VICE]]> https://www.vice.com/en_uk/article/the-revolution-will-be-weird-and-eerie

The simplest way to distinguish between the "weird" and the "eerie", the late Mark Fisher writes, is to think about the difference between presence and absence. Weirdness is produced by the presence of "that which does not belong". It shouldn't, or couldn't, exist – and yet there it is.

]]>
Sat, 11 Mar 2017 17:21:28 -0800 https://www.vice.com/en_uk/article/the-revolution-will-be-weird-and-eerie
<![CDATA[Rosi Braidotti, “Memoirs of a Posthumanist“]]> https://www.youtube.com/watch?v=OjxelMWLGCo

Philosopher Rosi Braidotti of Utrecht University in the Netherlands delivered the 2017 Tanner Lectures on Human Values this spring at Yale’s Whitney Humanities Center. Her talks are jointly titled “Posthuman, All Too Human.” The first, “Memoirs of a Posthumanist,” took place on Wednesday, March 1; the second, “Aspirations of a Posthumanist,” on Thursday, March 2. Professor Braidotti was joined by Professors Joanna Radin (History of Medicine, History) and Rüdiger Campe (German, Comparative Literature) for further discussion on Friday, March 3.

Rosi Braidotti is Distinguished University Professor and founding director of the Centre for the Humanities at Utrecht University. Her published works include Patterns of Dissonance: An Essay on Women in Contemporary French Philosophy (1991); Nomadic Subjects: Embodiment and Sexual Difference in Contemporary Feminist Theory (1994; 2nd ed. 2011); Metamorphoses: Towards a Materialist Theory of Becoming (2002); Transpositions: On Nomadic Ethics (2006); La philosophie, là où on ne l’attend pas (2009); Nomadic Theory: The Portable Rosi Braidotti (2011); and The Posthuman (2013). In 2016 she coedited Conflicting Humanities with Paul Gilroy.

Professor Braidotti has been an elected board member of the Consortium of Humanities Centres and Institutes since 2009. She is also an honorary fellow of the Australian Academy of the Humanities and a member of the Academia Europaea. She has been awarded honorary degrees by the University of Helsinki and Linköping University. In 2005, she was knighted in the Order of the Netherlands Lion by Queen Beatrix.

]]>
Thu, 02 Mar 2017 11:29:48 -0800 https://www.youtube.com/watch?v=OjxelMWLGCo
<![CDATA[Sonic Acts 2017: The Noise of Becoming: On Monsters, Men, and Every Thing in Between]]> https://machinemachine.net/portfolio/sonic-acts-2017-the-noise-of-becoming-on-monsters-men-and-every-thing-in-between/

UPDATE: My talk is also now available in The Noise of Being publication, published by Sonic Acts in September 2017.

A talk I delivered at Sonic Acts Festival 2017: The Noise of Being, in which I refigure the sci-fi horror monster The Thing from John Carpenter’s 1982 film of the same name:

The Thing is a creature of endless mimetic transformations, capable of becoming the grizzly faced men who fail to defeat it. The most enduring quality of The Thing is its ability to perform self-effacement and subsequent renewal at every moment, a quality we must embrace and mimic ourselves if we are to outmanoeuvre the monsters that harangue us.

This talk was part of a panel featuring Laurie Penny and Ytasha Womack, entitled Speculative Fiction: Radical Figuration For Social Change. You can see their wonderful talks here:

Laurie Penny: Feminism Against Fascism
Ytasha Womack: Afrofuturism: Imagination and Humanity

Full text follows (+ references & slides)

An Ontology of Every Thing on the Face of the Earth

John Carpenter’s 1982 film, The Thing, is a claustrophobic science fiction thriller exhibiting many hallmarks of the horror genre. The film depicts a sinister turn for matter where the chaos of the replicating, cancerous cell is expanded to the human scale and beyond. We watch as an alien force terrorises an isolated Antarctic outpost. The creature exhibits an awesome ability to imitate; devouring any form of life it comes across, whilst simultaneously giving birth to an exact copy in a burst of bile and protoplasm. The Thing copies cell by cell in a process so perfect that the resultant simulacrum speaks, acts, and even thinks like the original. The Thing is so relentless, and its copies so perfect, that the outpost’s doctor, Blair, is sent mad at the implications:

If a cell gets out it could imitate everything on the face of the Earth… and it’s not gonna stop! [1]

This text is also available in The Noise of Being publication (published September 2017).

Based on John W. Campbell’s 1938 novella, Who Goes There?, Carpenter’s film revisits a gothic trope with numerous incarnations. In Campbell’s novella, The Thing is condensed as much from the minds of the men as from its own horrific, defrosting bulk. A slowly surfacing nightmare that transforms alien matter into earthly biology also has the effect of transferring the inner, mental lives of the men into the resultant condensation. John W. Campbell knew that The Thing could become viscous human flesh, but in order to truly imitate its prey the creature must infect inner life separately, pulling kicking and screaming ghosts out of their biological – Cartesian – machines.

As a gothic figure, Campbell’s Thing disrupts the stable and integral vision of human being: self-same bodies housing ‘unitary and securely bounded’ [2] subjectivities, identical and extensive through time. His characters confront their anguish at being embodied: their nightmares are literally made flesh. To emphasise the otherness of each human’s flesh, Campbell’s story is inhabited exclusively by male characters. The absence of women makes the conflict between each of the men feel more rudimentary, but it also centres the novella’s horror on the growing realisation that to be human is also to be alien to oneself. Differences between sexes within the single species Homo sapiens are bypassed, allowing the alien entity to exhibit the features of human female ‘otherness’ alongside a gamut of horrific bodily permutations. Perhaps, as Barbara Creed, [3] Rosi Braidotti, [4] and others [5] have argued, The Thing signifies the intrinsic absence of the mother figure: the female body’s capacity to be differentiated from itself in the form of pregnancy; to open up and usher forth into the world a creature other to itself. This Thingly quality is given credence by Julia Kristeva in a passage that could equally refer to The Thing as to the development of a fetus during pregnancy:

Cells fuse, split, and proliferate; volumes grow, tissues stretch, and the body fluids change rhythm, speeding up or slowing down. With the body, growing as a graft, indomitable, there is another. And no one is present, within that simultaneously dual and alien space, to signify what is going on. [6]

The Thing does exhibit demeanours of copulation and fertility, but also of disease, fragmentation, dismemberment, and asexual fission. In the novella, during a drug-induced nightmare Dr.
Copper sits bolt upright and blurts out ‘Garry – listen. Selfish – from hell they came, and hellish shellfish – I mean self – Do I? What do I mean?,’ McReady [7] turns to the other men in the cabin, ‘Selfish, and as Dr. Copper said – every part is a whole. Every piece is self-sufficient, and animal in itself.’ [8] The Thing is aberrant at a level more fundamental than allusions to pregnancy can convey. Dr. Copper’s inability to articulate what The Thing is, indicates a categorical nightmare he and the men are suffering. As in the work of Mary Douglas, [9] The Thing’s nightmarish transformation denies the very concept of physical and categorical purity. The Thing’s distributed biology calls to mind the Hardt and Negri’s vision of the early Internet (ARPANET), designed, according to them: …to withstand military attack. Since it has no center and almost any portion can operate as an autonomous whole, the network can continue to function even when part of it has been destroyed. The same design element that ensures survival, the decentralisation, is also what makes control of the network so difficult. [10] The image of mankind’s outright destruction, via totalising narratives such as nuclear war, viral pandemic, or meteor strike is undermined by the paradigm of a Thingly technological infrastructure designed to avoid ‘absolute’ assault. Decentralisation is a categorical horror in its capacity to highlight our self-same, constantly threatened and weak, embodied selves. But shift the lens away from the self-same human subject, and the image of a distributed, amorphous network of autonomous cells immediately becomes a very good description of how biological life has always been constituted. The metaphysical dualism of the sexes, as Kelly Hurley concludes, is an inadequate paradigm of such horrific embodiment, rather any and all ‘ontological security’ [11] is challenged through a ‘collapsing of multiple and incompatible morphic possibilities into one amorphous embodiment.’ [12] The Thing is neither male nor female, two nor one, inside nor outside, living nor dead. If it does settle into a form that can be exclaimed, screamed or defined in mutually incompatible words, it does so only for a moment and only in the mind of its onlooker as they scrabble to deduce its next amorphous conflation. The Thing is a figure performing ontogenesis (something coming to be) rather than ontology (something that already is). [13] ‘The very definition of the real,’ as Jean Baudrillard affirmed, has become ‘that of which it is possible to give an equivalent reproduction.’ [14] Does The Thing ‘produce’ something other than human life, or ‘reproduce’ human life in its entirety, and what, if anything, would be the difference? In a text on bio and necropolitics, Eugene Thacker undertakes an examination of the ‘difference between “Life” as an ontological foundation, and “the living,” or the various specific instantiations of Life.’ [15] Thacker highlights a passage in Poetics where Aristotle speaks of mimesis giving rise to the art of poetry in human beings: We take delight in viewing the most accurate possible images of objects which in themselves cause distress when we see them (e.g. the shapes of the lowest species of animal, and corpses). Recognition of mimetic forms can instill a certain degree of displeasure if that form depicts a carcass or something considered equally abhorrent. 
But this is often tinged with what Aristotle calls the ‘extremely pleasurable’ dual capacities of recognising an imitation as such, whilst at the same time recognising what it is the form is imitative of. The horror of The Thing is bound to this endless ontogenetic re-forming, its limitless capacity to imitate and become without necessarily settling into a final, stable and agreeable categorical – that is, ontological – form. The men of the Antarctic encampment grasp in their minds at the forms ushering from The Thing but can never keep up with its propensity toward the next shapeless-shape, bodiless-limb, or ontogenetic-extrudence. The Thing is a phenomenon, to use Eugene Thacker’s words once more, that is ‘at once “above” and “below” the scale of the human being,’ [16] throwing, as Rosi Braidotti puts it, ‘a terminal challenge towards a human identity that is commonly predicated on the One.’ [17] The ‘other’ of The Thing never settles down, always falling outside the dialectical circle. As Helene Cixous remarks in The Newly Born Woman, with the ‘truly “other” there is nothing to say; it cannot be theorized. The “other” escapes me.’ [18] The figure of The Thing bursts into popular culture at the meeting point between dream and flesh, and has been pursued ever since by men whose individuality is considered inseparable from their self-same embodiment. By modifying the rules through which dominant norms such as gender binaries operate, The Thing can be conceived as an incarnation of détournement: an intervention that hijacks and continually modifies the rules of engagement. ‘The radical implication [being] that [all] meaning is connected to a relationship with power.’ [19] Considered through Michel Foucault’s definition of bio-power, or the bio-political, The Thing is the process of sex and sexuality severed from the humans who are forced to proliferate ‘through’ it. Above all, the men set against this propagation – this mobilisation of images of ‘other’ – scramble to protect the normative image of the human they hold most dear: the mirage of ‘man’. Becoming World The filmic Thing is a fictional device enabled by animatronic augmentations coated with fleshy stand-ins, KY Jelly, and occasionally, real animal offal. As John Carpenter described his rendition of the creature in a 2014 interview, ‘It’s just a bunch of rubber on the floor.’ [20] Bringing The Thing ‘to life’ is an activity that performs the collapse ‘between “Life” as an ontological foundation, and “the living,” or the various specific instantiations of Life.’ [21] The animatronic Thing exists in the space between stable forms; it is vibrant, expressive technology realised by dead matter; and human ingenuity made discernible by uncanny machinic novelty. Ontological uncertainty finds fluidity in language on a page, in the ability to poetically gesture towards interstitiality. But on-screen animatronics, rubber, and KY Jelly are less fluid, more mimetically rooted by the expectations of the audience reveling in, and reviled by, their recognition of The Thing’s many forms. Upon its release critical reactions to John Carpenter’s The Thing were at best muted and at worst downright vitriolic. The special effects used to depict the creature were the focus of an attack by Steve Jenkins’. 
Jenkins attacks the film essentially for its surrealist nature… he writes that: “with regard to the effects, they completely fail to ‘clarify the weirdness’ of the Thing”, and that “because one is never sure exactly how it [the alien] functions, its eruptions from the shells of its victims seem as arbitrary as they are spectacular”. [22]

In short, the reviews lingered on two opposing readings of The Thing’s shock/gore evocations: that they go too far and thus tend towards sensational fetishism, or that they can’t go far enough, depicting kitsch sensibilities rather than alien otherness. Jenkins’ concern that the special effects do not ‘clarify’ The Thing’s ‘weirdness’ is contradictory, if not oxymoronic. The implication is that Things could never be so weird as to defy logical function, and that all expressions should, and eventually do, lend themselves to being read through some parochial mechanism or other, however surreal they may at first seem. That The Thing’s nature could actually defy comprehensibility is not considered, nor how impossible the cinematic depiction of that defiance might be. Rather, the critical view seems to be that every grisly eruption, bifurcation, and horrific permutation on screen must necessarily express an inner order temporarily hidden from, but not inaccessible to, its human onlookers. This critical desire for a ‘norm’ defies the same critical desire for ‘true’ horror. Our will to master matter and technology through imitative forms is the same will that balks at the idea that imitative forms could have ontologies incommensurable with our own.

The Thing is ‘weird’: a term increasingly applied to those things defying categorisation. A conviction, so wrote the late Mark Fisher, ‘that this does not belong, is often a sign that we are in the presence of the new… that the concepts and frameworks which we have previously employed are now obsolete.’ [23] In reflecting on the origins of this slippery anti-category, Eugene Thacker reminds us that within horror, ‘The threat is not the monster, or that which threatens existing categories of knowledge. Rather, it is the “nameless thing,” or that which presents itself as a horizon for thought… the weird is the discovery of an unhuman limit to thought, that is nevertheless foundational for thought.’ [24] In The Thing the world rises up to meet its male inhabitants in a weird form and, by becoming them, throws into question the categorical foundations of the born and the made, of subject and object, natural and synthetic, whole and part, human and world, original and imitation. What remains is an ongoing process of animation rendered horrific by a bifurcation of ontologies: on one side the supposed human foundation of distinction, uniqueness and autonomy; on the other, a Thingly (alien and weird) propensity that dissolves differentiation, that coalesces and revels in an endless process of becoming. As in Mikhail Bakhtin’s study of the grotesque, the ‘human horizon’ in question is that of the ‘canon,’ [25] a norm to which all aberrations are to be compared:

The grotesque body… is a body in the act of becoming. It is never finished, never completed; it is continually built, created, and builds and creates another body. Moreover, the body swallows the world and is itself swallowed by the world. [26]

The Thingly is neither self-same nor enclosed unto itself.
It is a plethora of openings, conjoinings and eruptions that declare ‘the world as eternally unfinished: a world dying and being born at the same time.’ [27] The bodily horror performed by The Thing is an allegory of this greater interstitial violation: the conceptual boundary between the world-for-us and the world-without-us is breached not as destruction, or even invasion, but ultimately through our inability to separate ourselves from a world that is already inherently alien and weird. [28] ‘A monstrosity’ to hijack the words of Claire Colebrook, ‘that we do not feel, live, or determine, but rather witness partially and ex post facto.’ [29] How these processes are comprehended, or more precisely, how the perception of these processes is interpreted, is more important than the so called ‘difference’ between the world which existed before and the world which remains after. Eugene Thacker clarifies this point in his analysis of the etymology of the word ‘monster’: A monster is never just a monster, never just a physical or biological anomaly. It is always accompanied by an interpretive framework within which the monster is able to be monstrum, literally “to show” or “to warn.” Monsters are always a mat­ter of interpretation. [30] Becoming Weird In a 1982 New York Times movie section, critic Vincent Canby poured yet more scorn on John Carpenter’s ‘Thing’ remake: The Thing is a foolish, depressing, overproduced movie that mixes horror with science fiction to make something that is fun as neither one thing or the other… There may be a metaphor in all this, but I doubt it… The Thing… is too phony looking to be disgusting. It qualifies only as instant junk. [31] Chiming with his critic peers, Canby expresses his desire that the monster show its nature – be monstrum – only in respect of some ‘norm’; [32] some ‘interpretive framework’, [33] that the narrative will eventually uncover. By setting up ‘junk’ as a kitschy opposite to this supposedly palatable logic, Canby unwittingly generates a point from which to disrupt the very notion of the interpretive framework itself. The Thing is more than a metaphor. Canby’s appeal to ‘instant junk’ can be read as the monstrum, the revealing of that which constitutes the norm. The monster stands in for difference, for other, and in so doing normalises the subject position from which the difference is opposed: the canon. In the case of The Thing that canon is first and foremost the human male, standing astride the idea of a world-for-us. The ‘us’ is itself monopolised, as if all non-male ontogenetic permutations were cast out into the abject abyss of alien weirdness. In reclaiming ‘junk’ as a ‘register of the unrepresentable’ [34] a Thingly discourse may share many of the tenets of queer theory. As Rosi Braidotti makes clear, referring to the work of Camilla Griggers: ‘Queer’ is no longer the noun that marks an identity they taught us to despise, but it has become a verb that destabilizes any claim to identity, even and especially to a sex-specific identity. [35] The queer, the weird, the kitsch, are among the most powerful of orders because they are inherently un-representable and in flux. The rigid delineations of language and cultural heteronormativity are further joined in the figure of The Thing by a non-anthropic imaginary that exposes a whole range of human norms and sets into play a seemingly infinite variety of non-human modes of being and embodiment. 
Rosi Braidotti refers to the work of Georges Canguilhem in her further turn outwards towards the weird, ‘normality is, after all, the zero-degree of monstrosity,’ [36] signalling a post-human discourse as one which, by definition, must continually question – perhaps even threaten – the male, self-same, canonised, subject position: We need to learn to think of the anomalous, the monstrously different not as a sign of pejoration but as the unfolding of virtual possibilities that point to positive alternatives for us all… the human is now displaced in the direction of a glittering range of post-human variables. [37] In her book on The Death of The Posthuman (2014), Claire Colebrook looks to the otherwise, the un-representable, to destabilise the proposition of a world being for anyone. She begins by considering the proposed naming of the current geological era ‘The Anthropocene,’ [38] a term that designates a theoretical as well as scientific impasse for human beings and civilisation, in which human activity and technological development have begun to become indistinguishable, and/or exceed processes implicit within what is considered to be the ‘natural’ world. As if registering the inevitable extinction of humans isn’t enough, The Anthropocene, by being named in honour of humans, makes monsters of those times – past and present – which do not contain humans. Its naming therefore becomes a mechanism allowing the imagination of ‘a viewing or reading in the absence of viewers or readers, and we do this through images in the present that extinguish the dominance of the present.’ [39] The world ‘without bodies’ that is imaged in this move, Colebrook argues, is written upon by the current state of impending extinction. Humans are then able to look upon the future world-without-us in a state of nostalgia coloured by their inevitable absence. Here the tenets of the horror genre indicated by Eugene Thacker are realised as a feature of a present condition. The world-in-itself has already been subsumed by The Thingly horror that is the human species. For even the coming world-without-us, a planet made barren and utterly replaced by The Thingly junk of human civilisation, will have written within its geological record a mark of human activity that goes back well before the human species had considered itself as a Thing ‘in’ any world at all. In an analysis of the etymology of the Anthropocene, McKenzie Wark also turns to theory as a necessary condition of the age of extinction: All of the interesting and useful movements in the humanities since the late twentieth century have critiqued and dissented from the theologies of the human. The Anthropocene, by contrast, calls for thinking something that is not even defeat. [40] The Anthropocene, like ‘queer’ or ‘weird’, should be made into a verb, and relinquished as a noun. Once weirded in this way it becomes a productive proposition, Wark goes on, quoting Donna Haraway, ‘another figure, a thousand names of something else.’ [41] In the 2014 lecture quoted by Wark, Haraway called for other such worldings through the horrific figure of capitalism, through arachnids spinning their silk from the waste matter of the underworld, or from the terrible nightmares evoked in the fiction of the misogynist, racist mid 20th century author H.P. Lovecraft: The activation of the chthonic powers that is within our grasp to collect up the trash of the anthropocene, and the exterminism of the capitalocene, to something that might possibly have a chance of ongoing. 
[42] That weird, ongoing epoch is the Chthulucene, a monstrum ‘defined by the frightening weirdness of being impossibly bound up with other organisms,’ [43] of what Haraway calls ‘multi-species muddles.’ [44] The horror of ‘the nameless thing’ is here finally brought to bear in Haraway’s Capitalocene and Chthulucene epochs. Haraway’s call for ‘a thousand names of something else’ is Thingly in its push towards endlessly bifurcated naming and theoretical subsuming. Anthro-normalisation casts out infinitely more possibilities than it brings into play. Although Donna Haraway makes it clear that her Chthulucene is not directly derivative of H.P. Lovecraft’s Cthulhu mythos, her intentional mis-naming and slippery non-identification exemplify the kind of amorphous thinking and practice she is arguing for. Haraway’s Chthulucene counters Lovecraft’s Cthulhu with an array of chthonic, non-male, tentacular, rhizomatic, and web-spinning figures that attest to the monstrum still exposed by Lovecraft’s work, now three quarters of a century old. The continued – renewed – fascination with Lovecraft’s weird ‘others’ thus has the capacity to expose a dread of these times. As writer Alan Moore has attested:

[I]t is possible to perceive Howard Lovecraft as an almost unbearably sensitive barometer of American dread. Far from outlandish eccentricities, the fears that generated Lovecraft’s stories and opinions were precisely those of the white, middle-class, heterosexual, Protestant-descended males who were most threatened by the shifting power relationships and values of the modern world… Coded in an alphabet of monsters, Lovecraft’s writings offer a potential key to understanding our current dilemma, although crucial to this is that they are understood in the full context of the place and times from which they blossomed. [45]

The dominant humanistic imagination may no longer posit white cis-males as the figure that ‘must’ endure, but other uncontested figures remain in the space apparently excavated of Lovecraft’s affinities. To abandon what Claire Colebrook calls ‘the fantasy of one’s endurance’ may be to concede that the post-human is founded on ‘the contingent, fragile, insecure, and ephemeral.’ [46] But, as Drucilla Cornell and Stephen D. Seely suggest, it is dangerous to consider this a ‘new’ refined status for the beings that remain, since ‘this sounds not like the imagination of living beyond Man, but rather like a meticulous description of the lives of the majority of the world under the condition of advanced capitalism right now.’ [47] As Claire Colebrook warns, post-humanism often relinquishes its excluded others – women, the colonised, nonhuman animals, or ‘life itself’ [48] – by merely subtracting the previously dominant paradigm of white heteropatriarchy, whilst failing to confront the monster that particular figure was indicative of:

Humanism posits an elevated or exceptional ‘man’ to grant sense to existence, then when ‘man’ is negated or removed what is left is the human all too human tendency to see the world as one giant anthropomorphic self-organizing living body… When man is destroyed to yield a posthuman world it is the same world minus humans, a world of meaning, sociality and readability yet without any sense of the disjunction, gap or limits of the human.
[49] As in Haraway and Wark’s call for not just ‘naming, but of doing, of making new kinds of labor for a new kind of nature,’ [50] contemporary criticism and theory must be allowed to take on the form of the monsters it pursues, moulding and transforming critical inquiries into composite, hybrid figures that never settle in one form lest they become stable, rigid, and normalised. In fact, this metaphor itself is conditioned too readily by the notion of a mastery ‘Man’ can wield. Rather, our inquiries must be encouraged ‘to monster’ separately, to blur and mutate beyond the human capacity to comprehend them, like the infinite variety of organisms Haraway insists the future opens into. The very image of a post-humanism must avoid normalising the monster, rendering it through analysis an expression of the world-for-us. For Eugene Thacker this is the power of the sci-fi-horror genre, to take ‘aim at the presuppositions of philosophical inquiry – that the world is always the world-for-us – and [make] of those blind spots its central concern, expressing them not in abstract concepts but in a whole bestiary of impossible life forms – mists, ooze, blobs, slime, clouds, and muck.’ [51] Reflecting on the work of Noël Carroll, [52] Rosi Braidotti argues that if science fiction horror ‘is based on the disturbance of cultural norms, it is then ideally placed to represent states of crisis and change and to express the widespread anxiety of our times. As such this genre is as unstoppable as the transformations it mirrors.’ [53]  

References [1] John Carpenter, The Thing, Film, Sci-Fi Horror (Universal Pictures, 1982). [2]  Kelly Hurley, The Gothic Body: Sexuality, Materialism, and Degeneration at the Fin de Siècle (Cambridge University Press, 2004), 3. [3]  B. Creed, ‘Horror and the Monstrous-Feminine: An Imaginary Abjection.’ Screen 27, no. 1 (1 January 1986): 44–71. [4]  Rosi Braidotti, Metamorphoses: Towards a Materialist Theory of Becoming (Wiley, 2002), 192–94. [5]  Ian Conrich and David Woods, eds., The Cinema Of John Carpenter: The Technique Of Terror (Wallflower Press, 2004), 81. [6]  Julia Kristeva, quoted in Jackie Stacey, Teratologies: A Cultural Study of Cancer (Routledge, 2013), 89. [7]  The character McReady becomes MacReady in Carpenter’s 1982 retelling of the story. [8]  Campbell, Who Goes There?, 107. [9]  Noël Carroll, The Philosophy of Horror, Or, Paradoxes of the Heart (New York: Routledge, 1990). [10] Michael Hardt and Antonio Negri, Empire, New Ed (Harvard University Press, 2001), 299. [11] Braidotti, Metamorphoses, 195. [12] Kelly Hurley, ‘Reading like an Alien: Posthuman Identity in Ridley Scott’s Aliens and David Cronenberg’s Rabid,’ in Posthuman Bodies, ed. Judith M. Halberstam and Ira Livingston (Bloomington: John Wiley & Sons, 1996), 219. [13] This distinction was plucked, out of context, from Adrian MacKenzie, Transductions: Bodies and Machines at Speed (A&C Black, 2006), 17. MacKenzie is not talking about The Thing, but this distinction is, nonetheless, very useful in bridging the divide between stable being and endless becoming. [14] Jean Baudrillard, Simulations, trans. Paul Foss, Paul Patton, and Philip Beitchman (Semiotext (e) New York, 1983), 146. [15] Eugene Thacker, ‘Nekros; Or, The Poetics Of Biopolitics,’ Incognitum Hactenus 3, no. Living On: Zombies (2012): 35. [16] Ibid., 29. [17] Braidotti, Metamorphoses, 195. [18] Hélène Cixous, The Newly Born Woman (University of Minnesota Press, 1986), 71. [19] Nato Thompson et al., eds., The Interventionists: Users’ Manual for the Creative Disruption of Everyday Life (North Adams, Mass. : Cambridge, Mass: MASS MoCA ; Distributed by the MIT Press, 2004), 151. [20] John Carpenter, BBC Web exclusive: Bringing The Thing to life, Invasion, Tomorrow’s Worlds: The Unearthly History of Science Fiction, 14 November 2014. [21] Thacker, ‘Nekros; Or, The Poetics Of Biopolitics,’ 35. [22] Ian Conrich and David Woods, eds., The Cinema Of John Carpenter: The Technique Of Terror (Wallflower Press, 2004), 96. [23] Mark Fisher, The Weird and the Eerie, 2016, 13. [24] Eugene Thacker, After Life (University of Chicago Press, 2010), 23. [25] Mikhail Mikhaĭlovich Bakhtin, Rabelais and His World (Indiana University Press, 1984), 321. [26] Ibid., 317. [27] Ibid., 166. [28] This sentence is a paraphrased, altered version of a similar line from Eugene Thacker, ‘Nine Disputations on Theology and Horror,’ Collapse: Philosophical Research and Development IV: 38. [29] Claire Colebrook, Sex After Life: Essays on Extinction, Vol. 2 (Open Humanities Press, 2014), 14. [30] Eugene Thacker, ‘The Sight of a Mangled Corpse—An Interview with’, Scapegoat Journal, no. 05: Excess (2013): 380. [31] Vincent Canby, ‘“The Thing” Is Phony and No Fun,’ The New York Times, 25 June 1982, sec. Movies. [32] Derrida, ‘Passages: From Traumatism to Promise,’ 385–86. [33] Thacker, ‘The Sight of a Mangled Corpse—An Interview with,’ 380. [34] Braidotti, Metamorphoses, 180. [35] Ibid. [36] Ibid., 174. [37] Rosi Braidotti, ‘Teratologies’, in Deleuze and Feminist Theory, ed. 
Claire Colebrook and Ian Buchanan (Edinburgh: Edinburgh University Press, 2000), 172. [38] A term coined in the 1980s by ecologist Eugene F. Stoermer and widely popularized in the 2000s by atmospheric chemist Paul Crutzen. The Anthropocene is, according to Jan Zalasiewicz et al., ‘a distinctive phase of Earth’s evolution that satisfies geologist’s criteria for its recognition as a distinctive statigraphic unit.’ – Jan Zalasiewicz et al., ‘Are We Now Living in the Anthropocene,’ GSA Today 18, no. 2 (2008): 6. [39] Claire Colebrook, Death of the PostHuman: Essays on Extinction, Vol. 1 (Open Humanities Press, 2014), 28. [40] McKenzie Wark, ‘Anthropocene Futures’ Versobooks.com, 23 February 2015. [41] Ibid. [42] Donna Haraway, ‘Capitalocene, Chthulucene: Staying with the Trouble’ (University of California at Santa Cruz, 5 September 2014). [43] Leif Haven, ‘We’ve All Always Been Lichens: Donna Haraway, the Cthulhucene, and the Capitalocene,’ ENTROPY, 22 September 2014. [44] Donna Haraway, ‘SF: Sympoiesis, String Figures, Multispecies Muddles’ (University of Alberta, Edmonton, Canada, 24 March 2014). [45] H. P Lovecraft, The New Annotated H.P. Lovecraft, ed. Leslie S Klinger (Liveright, 2014), xiii. [46] Claire Colebrook, Sex After Life: Essays on Extinction, Vol. 2 (Open Humanities Press, 2014), 22. [47] Drucilla Cornell and Stephen D Seely, The Spirit of Revolution: Beyond the Dead Ends of Man (Polity press, 2016), 5. [48] Ibid., 3–4. [49] Claire Colebrook, Death of the PostHuman: Essays on Extinction, Vol. 1 (Open Humanities Press, 2014), 163–64. [50] Wark, ‘Anthropocene Futures.’ [51] Thacker, In the Dust of This Planet, 9. [52]   Carroll, The Philosophy of Horror, Or, Paradoxes of the Heart. [53]   Braidotti, Metamorphoses, 185 (my emphasis).

]]>
Sun, 26 Feb 2017 04:43:01 -0800 https://machinemachine.net/portfolio/sonic-acts-2017-the-noise-of-becoming-on-monsters-men-and-every-thing-in-between/
<![CDATA[Transmediale 2017 (events)]]> http://machinemachine.net/text/ideas/transmediale-2017/

I just came back from two jam-packed weeks at Transmediale 2017. Morehshin Allahyari and I were involved in a wealth of events, mostly in relation to our #Additivism project, including:

On the Far Side of the Marchlands: an exhibition at Schering Stiftung gallery, featuring work by Catherine Disney, Keeley Haftner, Brittany Ransom, Morehshin and myself.

Photos from the event are gathered here.

The 3D Additivist Cookbook european launch: held at Transmediale on Saturday 4th Feb.

Audio of the event is available here.

Singularities: a panel and discussion conceived and introduced by Morehshin and myself. Featuring Luiza Prado & Pedro Oliveira (A parede), Rasheedah Phillips, and Dorothy R. Santos.

Audio of the entire panel is available here. The introduction to the panel – written by Morehshin and myself – can be found below. Photos from the panel are here.

Alien Matter exhibition: curated by Inke Arns as part of Transmediale 2017. Featuring The 3D Additivist Cookbook and works by Joey Holder, Dov Ganchrow, and Kuang-Yi Ku.

Photos from the exhibition can be found here.

 

Singularities
Panel delivered at Transmediale, Sunday 5th February 2017
Introduction by Morehshin Allahyari and Daniel Rourke

Morehshin: In 1979, the Iranian Islamic revolution resulted in the overthrowing of the Pahlavi dynasty and led to the establishment of an Islamic republic. Many different organizations, parties and guerrilla groups were involved in the Iranian Revolution. Some groups were created after the fall of Pahlavi and still survive in Iran; others helped overthrow the Shah but no longer exist. Much of Iranian society was hopeful about the coming revolution. Secular and leftist politicians participated in the movement to gain power in the aftermath, believing that Khomeini would support their voice and allow multiple positions and parties to be active and involved in the shaping of post-revolution Iran. As my mother – a Marxist at the time – would always say: the Iranian revolution brought sudden change, death, and violence in unforeseen ways. It was a point, a very fast point of collapse and rise. The revolution spun out of control and the country was taken over by Islamists so fast that people weren’t able to react to it; to slow it; or even to understand it. The future was now in the hands of a single party with a single vision that would change the lives of generations of Iranians, including myself, in the years that followed. We were forced and expected to live in one singular reality. A mono-authoritarian singularity.

In physics, a singularity is a point in space and time of such incredible density that the very nature of reality is brought into question. Associated with elusive black holes and the alien particles that bubble out of the quantum foam at their event horizon, the term ‘singularity’ has also been co-opted by cultural theorists and techno-utopianists to describe moments of profound social, political, ontological or material transformation. The coming-into-being of new worlds that redefine their own origins. For mathematicians and physicists, singularities are often considered ‘bad behaviour’ in the numbers and calculations. Infinite points may signal weird behaviours existing ‘in’ the physical world: things outside or beyond our ability to comprehend. Or perhaps, more interestingly, a singularity may expose the need for an entirely new physics. Some anomalies can only be made sense of by drafting a radically new model of the physical world to include them.

For this panel we consider ‘bad behaviours’ in social, technological and ontological singularities. Moments of profound change triggered by a combination of technological shifts, cultural mutations, or unforeseen political dramas and events. Like the physicists who comprehend singularities in the physical world, we do not know whether the singularities our panelists highlight today tell us something profound about the world itself, or force us to question the model we have of the world or worlds.

Daniel: As well as technological or socio-political singularities, this panel will question the ever-narcissistic singularities of ‘I’, ‘here’ and ‘now’ – confounding the principles of human universality upon which these suppositions are based. We propose ‘singularities’ as eccentric and elusive figures in need of collective attention. It is no coincidence that ‘Singularity’ is often used as a term to indicate human finitude. Self-same subjects existing at particular points in time, embedded within particular contexts, told through a singular history or single potential future.
The metaphor of the transformative Singularity signals not one reality ‘to come’, nor even two realities – one moved from and one towards – but of many, all dependant on who the subject of the singularity is and how much autonomy they are ascribed. The ‘Technological’ Singularity is a myth of the ‘transhumanists’, a group of mainly Western, commonly white, male enthusiasts, who ascribe to the collective belief that technology will help them to become ‘more than human’… ‘possessed of drastically augmented intellects, memories, and physical powers.’ As technological change accelerates, according to prominent Transhumanist Ray Kurzweil, so it pulls us upwards in its wake. Kurzweil argues that as the curve of change reaches an infinite gradient reality itself will be brought into question: like a Black Hole in space-time subjects travelling toward this spike will find it impossible to turn around, to escape its pull. A transformed post-human reality awaits us on the other side of the Technological Singularity. A reality Kurzweil and his ilk believe ‘we’ will inevitably pass into in the coming decades. In a 2007 paper entitled ‘Droppin’ Science Fiction’, Darryl A. Smith explores the metaphor of the singularity through Afro-American and Afrofuturist science fiction. He notes that the metaphor of runaway change positions those subject to it in the place of Sisyphus, the figure of Greek myth condemned to push a stone up a hill forever. For Sisyphus to progress he has to fight gravity as it conspires with the stone to pull him back to the bottom of the slope. The singularity in much science fiction from black and afro-american authors focusses on this potential fall, rather than the ascent:

“Here, in the geometrics of spacetime, the Spike lies not at the highest point on an infinite curve but at the lowest… Far from being the shift into a posthumanity, the Negative Spike is understood… as an infinite collapsing and, thus, negation of reality. Escape from such a region thus requires an opposing infinite movement.”

The image of a collective ‘push’ of the stone of progress up the slope necessarily posits a universal human subject, resisting the pull of gravity back down the slope. A universal human subject who passes victorious to the other side of the event horizon. But as history has shown us, technological, social and political singularities – arriving with little warning – often split the world into those inside and those outside their event horizons. Singularities like the 1979 Iranian revolution left many more on the outside of the Negative Spike, than the inside. Singularities such as the Industrial Revolution, which is retrospectively told in the West as a tale of imperial and technological triumph, rather than as a story of those who were violently abducted from their homelands, and made to toil and die in fields of cotton and sugarcane. The acceleration toward and away from that singularity brought about a Negative Spike so dense, that many millions of people alive today still find their identities subject to its social and ontological mass. In their recent definition of The Anthropocene, the International Commission on Stratigraphy named the Golden Spike after World War II as the official signal of the human-centric geological epoch. A series of converging events marked in the geological record around the same time: the detonation of the first nuclear warhead; the proliferation of synthetic plastic from crude oil constituents; and the introduction of large scale, industrialised farming practices, noted by the appearance of trillions of discarded chicken bones in the geological record. Will the early 21st century be remembered for the 9/11 terrorist event? The introduction of the iPhone, and Twitter? Or for the presidency of Donald J Trump? Or will each of these extraordinary events be considered as part of a single, larger shift in global power and techno-mediated autonomy? If ‘we’ are to rebuild ourselves through stronger unities, and collective actions in the wake of recent political upheavals, will ‘we’ also forego the need to recognise the different subjectivities and distinct realities that bubble out of each singularity’s wake? As the iPhone event sent shockwaves through the socio-technical cultures of the West, so the rare earth minerals required to power those iPhones were pushed skywards in value, forcing more bodies into pits in the ground to mine them. As we gather at Transmediale to consider ai, infrastructural, data, robotic, or cyborgian revolutions, what truly remains ‘elusive’ is a definition of ‘the human’ that does justice to the complex array of subjectivities destined to be impacted – and even crafted anew – by each of these advances. In his recent text on the 2011 Fukushima Daiichi nuclear disaster Jean-Luc Nancy proposes instilling “the condition of an ever-renewed present” into the urgent design and creation of new, mobile futures. In this proposition Nancy recognises that each singularity is equal to all others in its finitude; an equivalence he defines as “the essence of community.” To contend with the idea of singularities – plural – of ruptures as such, we must share together that which will forever remain unimaginable alone. Morehshin: This appeal to a plurality of singularities is easily mistaken for the kinds of large scale collective action we have seen in recent years around the world. From the Arab Springs, and Occupy Movement through to the recent Women’s March, which took place not 24 hours after the inauguration of Donald Trump. 
These events in particular spoke of a universal drive, a collective of people united against a single cause. Much has been written about the ‘human microphone’ technique utilized by Occupy protesters to amplify the voice of a speaker when megaphones and loudspeakers were banned or unavailable. We wonder whether, rather than speak as a single voice, we should seek to emphasise the different singularities enabled by different voices, different minds; distinct votes and protestations. We wonder whether black and brown protestors gathered in similar numbers, with similar appeals to their collective unity and identity, would have been portrayed very differently by the media. Whether the radical white women and the wider population that united for the march would also show up to the next Black Lives Matter or Muslim ban protests. These are not just some academic questions but an actual personal concern… what is collectivism, and for whom does the collective function? When we talk about futures and worlds and singularities, whose realities are we talking about? Who is going to go to Mars with Elon Musk? And who will be left? As we put this panel together, in the last weeks, our Manifesto’s apocalyptic vision of a world accelerated to breaking point by technological progress began to seem strangely comforting compared to the delirious political landscape we saw emerging before us. Whether you believe political malaise, media delirium, or the inevitable implosion of the neo-liberal project is to blame for the rise of figures like Farage, Trump or – in the Philippines – the outspoken President Rodrigo Duterte, the promises these figures make of an absolute shift in the conditions of power appear grand precisely because they choose to demonize the discrete differences of minority groups, or attempt to overturn truths that might fragment and disturb their all-encompassing narratives.

Daniel: The appeal to inclusivity – in virtue of a shared political identity – often instates those of ‘normal’ body, race, sex, or genome as exclusive harbingers of the-change-which-should – or so we are told, will – come. A process that theorist Rosi Braidotti refers to as a ‘dialectics of otherness’ which subtly disguises difference, in celebration of a collective voice of will or governance.

Morehshin: Last week, on January 27, as part of a plan to keep “Islamic terrorists” out of the United States, Trump signed an order that suspended entry for citizens of seven countries for 90 days. This includes Iran, the country I am a citizen of. I have lived in the U.S. for 9 years and hold a green card, which was included in Trump’s ban and is now being reviewed case by case for each person who enters the U.S. When the news came out, I was already in Berlin for Transmediale and wasn’t sure whether I had a home to go back to. Although the chaos of Trump’s announcement has now settled, and my own status as a resident of America appears a bit clearer for now, the ripples of emotion and uncertainty from last week have coloured my experience at this festival. As I have sat through panels and talks in the last 3 days, and as I stand here introducing this panel about elusive events, potential futures and the in-betweenness of all profound technological singularities… the realities that feel most significant to me are yet to take place in the lives of so many Middle-Easterners and Muslims affected by Trump’s ban. How does one imagine/re-imagine/figure/re-figure the future when there are still so many ‘presents’ existing in conflict?
I grew up in Iran for 23 years, where science fiction didn’t really exist as a genre in popular culture. I always think we were discouraged to imagine the future other than how it was ‘imagined’ for us. Science-fiction as a genre flourishes in the West… But I still struggle with the kinds of futures we seem most comfortable imagining. THANKS   We now want to hand over to our fantastic panelists, to highlight their voices, and build harmonies and dissonances with our own. We are extremely honoured to introduce them: Dorothy Santos is a Filipina-American writer, editor, curator, and educator. She has written and spoken on a wide variety of subjects, including art, activism, artificial intelligence, and biotechnology. She is managing editor of Hyphen Magazine, and a Yerba Buena Center for the Arts fellow, where she is researching the concept of citizenship. Her talk today is entitled Machines and Materiality: Speculations of Future Biology and the Human Body. Luiza Prado and Pedro Oliveira are Brazilian design researchers, who very recently wrapped up their PhDs at the University of the Arts Berlin. Under the ‘A Parede’ alias, the duo researches new design methodologies, processes, and pedagogies for an onto-epistemological decolonization of the field. In their joint talk and performance, Luiza and Pedro will explore the tensions around hyperdense gravitational pulls and acts of resistance. With particular focus on the so-called “non-lethal” bombs – teargas and stun grenades – manufactured in Brazil, and exported and deployed all around the world. Rasheedah Phillips is creative director of Afrofuturist Affair: a community formed to celebrate, strengthen, and promote Afrofuturistic and Sci-Fi concepts and culture. In her work with ‘Black Quantum Futurism’, Rasheedah derives facets, tenets, and qualities from quantum physics, futurist traditions, and Black/African cultural traditions to celebrate the ability of African-descended people to see “into,” choose, or create the impending future. In her talk today, Rasheedah will explore the history of linear time constructs, notions of the future, and alternative theories of temporal-spatial consciousness.      

]]>
Thu, 09 Feb 2017 08:50:26 -0800 http://machinemachine.net/text/ideas/transmediale-2017/
<![CDATA[A is for Afrofuturism — Cyberfeminism vs. Afrofuturistfeminism]]> http://aisforafrofuturism.tumblr.com/post/67955218020/cyberfeminism-vs-afrofuturistfeminism

There are differences, and some points of sameness, between cyberfeminism and afrofuturistfeminism; I’ll quickly list the basics.

]]>
Fri, 13 Jan 2017 05:50:55 -0800 http://aisforafrofuturism.tumblr.com/post/67955218020/cyberfeminism-vs-afrofuturistfeminism
<![CDATA[The Hunt for the Algorithms That Drive Life on Earth | WIRED]]> http://www.wired.com/2016/02/the-hunt-for-the-algorithms-that-drive-life-on-earth/

To the computer scientist Leslie Valiant, “machine learning” is redundant. In his opinion, a toddler fumbling with a rubber ball and a deep-learning network classifying cat photos are both learning; calling the latter system a “machine” is a distinction without a difference.

]]>
Sun, 06 Mar 2016 07:20:10 -0800 http://www.wired.com/2016/02/the-hunt-for-the-algorithms-that-drive-life-on-earth/
<![CDATA[Algorithmic Narratives and Synthetic Subjects (paper)]]> http://machinemachine.net/portfolio/paper-at-theorizing-the-web-synthetic-subjects/

This was the paper I delivered at The Theorizing the Web Conference, New York, 18th April 2015. This video of the paper begins part way in, and misses out some important stuff. I urge you to watch the other, superb, papers on my panel by Natalie Kane, Solon Barocas, and Nick Seaver. A better video is forthcoming. I posted this up partly in response to this post at Wired about the UK election, Facebook’s echo-chamber effect, and other implications well worth reading into.

Data-churning algorithms are integral to our social and economic networks. Rather than replace humans, these programs are built to work with us, allowing the distinct strengths of human and computational intelligences to coalesce. As we are submerged into the era of ‘big data’, these systems have become more and more common, concentrating every terabyte of raw data into meaningful arrangements more easily digestible by high-level human reasoning.

A company calling themselves ‘Narrative Science’, based in Chicago, have established a profitable business model based on this relationship. Their slogan, ‘Tell the Stories Hidden in Your Data’, [1] is aimed at companies drowning in spreadsheets of cold information: a promise that Narrative Science can ‘humanise’ their databases with very little human input. Kristian Hammond, Chief Technology Officer of the company, claims that within 15 years over 90% of all news stories will also be written by algorithms. [2] But rather than replacing the jobs that human journalists now undertake, Hammond claims the vast majority of their ‘robonews’ output will report on data currently not covered by traditional news outlets. One family-friendly example of this is the coverage of little-league baseball games. Very few news organisations have the resources, or desire, to hire a swathe of human journalists to write up every little-league game. Instead, Narrative Science offer leagues, parents and their children a miniature summary of each game gleaned from match statistics uploaded by diligent little-league attendees, and then written up by Narrative Science in a variety of journalistic styles.

In their book ‘Big Data’ from 2013, the Oxford University Professor of internet governance Viktor Mayer-Schönberger and The Economist’s ‘data editor’ Kenneth Cukier tell us excitedly about another data aggregation company, Prismatic, who: …rank content from the web on the basis of text analysis, user preferences, social network-popularity, and big-data analysis. [3] According to Mayer-Schönberger and Cukier this makes Prismatic able ‘to tell the world what it ought to pay attention to better than the editors of the New York Times’. [4] A situation, Steven Poole reminds us, we can little argue with so long as we agree that popularity underlies everything that is culturally valuable.

Data is now the lifeblood of technocapitalism: a vast, endless influx of information flowing in from the growing universe of networked and internet-connected devices. As many of the papers at Theorizing the Web attest, our environment is more and more founded on systems whose job it is to mediate our relationship with this data.

Technocapitalism still appears to respond to Jean-François Lyotard’s formulation of Postmodernity: that whether something is true has less relevance than whether it is useful. In 1979 Jean-François Lyotard described the Postmodern Condition as a change in “the status of knowledge” brought about by new forms of techno-scientific and techno-economic organisation. If a student could be taught effectively by a machine, rather than by another human, then the most important thing we could give the next generation was what he called “elementary training in informatics and telematics.” In other words, as long as our students are computer literate “pedagogy would not necessarily suffer”. [5] The next passage – where Lyotard marks the Postmodern turn from the true to the useful – became one of the book’s most widely quoted, and it is worth repeating here at some length:

It is only in the context of the grand narratives of legitimation – the life of the spirit and/or the emancipation of humanity – that the partial replacement of teachers by machines may seem inadequate or even intolerable. But it is probable that these narratives are already no longer the principal driving force behind interest in acquiring knowledge. [6]

Here, I want to pause to set in play at least three elements from Lyotard’s text that colour this paper. Firstly, the historical confluence between technocapitalism and the era now considered ‘postmodern’. Secondly, the association of ‘the grand-narrative’ with modern and pre-modern conditions of knowledge. And thirdly, the idea that the relationship between the human and the machine – or computer, or software – is generally one-sided: i.e. we may shy away from the idea of leaving the responsibility of our children’s education to a machine, but Lyotard’s position presumes that since the machine was created and programmed by humans, it will therefore necessarily be understandable, and thus controllable, by humans.

Today, Lyotard’s vision of an informatically literate populace has more or less come true. Of course we do not completely understand the intimate workings of all our devices or the software that runs them, but the majority of the world population has some form of regular relationship with systems simulated on silicon. And as Lyotard himself made clear, the uptake of technocapitalism, and therefore the devices and systems it propagates, is piecemeal and difficult to predict or trace. At the same time as Google’s fleet of self-driving motor vehicles is let loose on Californian state highways, in parts of sub-Saharan Africa models of mobile phone designed 10 or more years ago are allowing farming communities to aggregate their produce into quantities with greater potential to make a profit on a world market. As Brian Massumi remarks, network technology allows us the possibility of “bringing to full expression a prehistory of the human”, a “worlding of the human” that marks the “becoming-planetary” of the body itself. [7]

This “worlding of the human” represents what Edmund Berger argues is the death of the Postmodern condition itself: [T]he largest bankruptcy of Postmodernism is that the grand narrative of human mastery over the cosmos was never unmoored and knocked from its pulpit. Instead of making the locus of this mastery large aggregates of individuals and institutions – class formations, the state, religion, etc. – it simply has shifted the discourse towards the individual his or herself, promising them a modular dreamworld for their participation… [8]

Algorithmic narratives appear to continue this trend. They are piecemeal, tending to feed back users’ dreams, wants and desires, through carefully aggregated, designed, packaged narratives for individual ‘use’. A world not of increasing connectivity and understanding between entities, but a network worlded to each individual’s data-shadow. This situation is reminiscent of the problem pointed out by Eli Pariser of the ‘filter bubble’, or the ‘you loop’, a prevalent outcome of social media platforms tweaked and personalised by algorithms to echo back at the user exactly the kind of thing they want to hear. As algorithms develop in complexity, the stories they tell us about the vast sea of data will tend to become more and more enamoring, more and more palatable. 
Like some vast synthetic evolutionary experiment, those algorithms that devise narratives users dislike will tend to be killed off in the feedback loop, in favour of other algorithms whose turn of phrase, or ability to stoke our egos, is more pronounced. For instance, Narrative Science’s early algorithms for creating little-league narratives tended to focus on the victors of each game. What Narrative Science found is that parents were more interested in hearing about their own children, the tiny ups and downs that made the game significant to them. So the algorithms were tweaked in response. Again, to quote chief scientist Kris Hammond from Narrative Science: These are narratives generated by systems that understand data, that give us information to support the decisions we need to make about tomorrow. [9]

Whilst we can program software to translate the informational nuances of a baseball game, or internet social trends, into human-palatable narratives, larger social, economic and environmental events also tend to get pushed through an algorithmic meatgrinder to make them more palatable. The ‘tomorrow’ that Hammond claims his company can help us prepare for is one that, presumably, companies like Narrative Science and Prismatic will play an ever larger part in realising.

In her recently published essay on Crisis and the Temporality of Networks, Wendy Chun reminds us of the difference between the user and the agent in the machinic assemblage: Celebrations of an all powerful user/agent – ‘you’ as the network, ‘you’ as the producer – counteract concerns over code as law as police by positing ‘you’ as the sovereign subject, ‘you’ as the decider. An agent, however, is one who does the actual labor; hence an agent is one who acts on behalf of another. On networks, the agent would seem to be technology, rather than the users or programmers who authorize actions through their commands and clicks. [10]

In order to unpack Wendy Chun’s proposition here we need only look at two of the most powerful, and impactful, algorithms from the last ten years of the web. Firstly, Amazon’s recommendation system, which I assume you have all interacted with at some point. And secondly, Facebook’s news feed algorithm, which ranks and sorts posts on your personalised stream. Both these algorithms rely on a community of user interactions to establish a hierarchy of products, or posts, based on popularity. Both these algorithms also function in response to users’ past activity, and both, of course, have been tweaked and altered over time by the design and programming teams of the respective companies. As we are all no doubt aware, one of the most significant driving principles behind these extraordinarily successful pieces of code is capitalism itself: the drive for profit, and the bearing that has on distinguishing between a successful and a failing company, service or product. Wendy Chun’s reminder that those who carry out an action, who program and click, are not the agents here should give us solace. We are positioned as sovereign subjects over our data, because that idea is beneficial to the propagation of the ‘product’. Whether we are told how well our child has done at baseball, or what particular kinds of news stories we might like, personally, to read right now, it is to the benefit of technocapitalism that those narratives are positive, palatable and uncompromising. 
However the aggregation and dissemination of big data affects our lives over the coming years, the likelihood is that at the surface – on our screens and ubiquitous handheld devices – everything will seem rosy, comfortable, and suited to the ‘needs’ and ‘use’ of each sovereign subject.
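To make the feedback loop described above concrete, here is a minimal, purely hypothetical sketch in Python: a handful of competing ‘narrative algorithms’ turn the same little-league statistics into prose, and reader reactions gradually kill off the variant they dislike. None of the function names, rules or data here come from Narrative Science, Amazon or Facebook; they are invented solely to illustrate how palatability gets selected for.

```python
# A hypothetical sketch of the feedback loop discussed above: competing
# narrative styles are weighted by how readers respond to them, and the
# more palatable style gradually dominates. Illustration only.
import random

def victor_focused(stats):
    return f"{stats['winner']} crushed {stats['loser']} {stats['score']}."

def child_focused(stats):
    return (f"{stats['star']} had a great day at the plate as "
            f"{stats['winner']} edged {stats['loser']} {stats['score']}.")

algorithms = {"victor_focused": victor_focused, "child_focused": child_focused}
weights = {name: 1.0 for name in algorithms}        # survival weights

def tell_story(stats):
    """Pick a narrative algorithm in proportion to how well it has been received."""
    name = random.choices(list(weights), weights=list(weights.values()))[0]
    return name, algorithms[name](stats)

def feedback(name, liked):
    """Readers' reactions feed back into which algorithm survives."""
    weights[name] *= 1.5 if liked else 0.5

stats = {"winner": "Cubs", "loser": "Hawks", "score": "7-6", "star": "Sam"}
for _ in range(50):
    name, story = tell_story(stats)
    feedback(name, liked=(name == "child_focused"))  # parents prefer hearing about their kids
print(weights)   # the more palatable narrative style now dominates
```

After a few dozen rounds the child-focused style carries nearly all the weight: a toy version of the ‘you loop’, in which the system converges on whatever its users most like to hear.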

TtW15 #A7 @npseaver @nd_kane @s010n @smwat pic.twitter.com/BjJndzaLz1

— Daniel Rourke (@therourke) April 17, 2015

So to finish I just want to gesture towards a much, much bigger debate that I think we need to have about big data, technocapitalism and its algorithmic agents. To do this I just want to read a short paragraph which, as far as I know, was not written by an algorithm: Surface temperature is projected to rise over the 21st century under all assessed emission scenarios. It is very likely that heat waves will occur more often and last longer, and that extreme precipitation events will become more intense and frequent in many regions. The ocean will continue to warm and acidify, and global mean sea level to rise. [11]

This is from a document entitled ‘Synthesis Report for Policy Makers’ drafted by The Intergovernmental Panel on Climate Change – another organisation that relies on a transnational network of computers, sensors, and programs capable of modeling atmospheric, chemical and wider environmental processes to collate data on human environmental impact. Ironically then, perhaps the most significant tool we have to understand the world, at present, is big data. Never before has humankind had so much information to help us make decisions, and help us enact changes on our world, our society, and our selves. But the problem is that some of the stories big data has to tell us are too big to be narrated; they are just too big to be palatable. To quote Edmund Berger again: For these reasons we can say that the proper end of postmodernism comes in the gradual realization of the Anthropocene: it promises the death of the narrative of human mastery, while erecting an even grander narrative. If modernism was about victory of human history, and postmodernism was the end of history, the Anthropocene means that we are no longer in a “historical age but also a geological one. Or better: we are no longer to think history as exclusively human…” [12]

I would argue that the ‘grand narratives of legitimation’ Lyotard claimed we left behind in the move to Postmodernity will need to return in some way if we are to manage big data in a meaningful way. Crises such as catastrophic climate change will never be made palatable in the feedback between users, programmers and technocapitalism. Instead, we need to revisit Lyotard’s distinction between the true and the useful. Rather than ask how we can make big data useful for us, we need to ask what grand story we want that data to tell us.

References

[1] Source: www.narrativescience.com, accessed 15/10/14

[2] Steven Levy, “Can an Algorithm Write a Better News Story Than a Human Reporter?,” WIRED, April 24, 2012, http://www.wired.com/2012/04/can-an-algorithm-write-a-better-news-story-than-a-human-reporter/.

[3] “Steven Poole – On Algorithms,” Aeon Magazine, accessed May 8, 2015, http://aeon.co/magazine/technology/steven-poole-can-algorithms-ever-take-over-from-humans/.

[4] Ibid.

[5] Jean-François Lyotard, The Postmodern Condition: A Report on Knowledge, Theory and History of Literature 10 (Manchester: Manchester University Press, 1992), 50.

[6] Ibid., 51.

[7] Brian Massumi, Parables for the Virtual: Movement, Affect, Sensation (Duke University Press, 2002), 128.

[8] Edmund Berger, “The Anthropocene and the End of Postmodernism,” Synthetic Zero, n.d., http://syntheticzero.net/2015/04/01/the-anthropocene-and-the-end-of-postmodernism/.

[9] Source: www.narrativescience.com, accessed 15/10/14

[10] Wendy Chun, “Crisis and the Temporality of Networks,” in The Nonhuman Turn, ed. Richard Grusin (Minneapolis: University of Minnesota Press, 2015), 154.

[11] Rajendra K. Pachauri et al., “Climate Change 2014: Synthesis Report. Contribution of Working Groups I, II and III to the Fifth Assessment Report of the Intergovernmental Panel on Climate Change,” 2014, http://epic.awi.de/37530/.

[12] Berger, “The Anthropocene and the End of Postmodernism.”

]]>
Fri, 08 May 2015 04:02:51 -0700 http://machinemachine.net/portfolio/paper-at-theorizing-the-web-synthetic-subjects/
<![CDATA[“Please don’t call me uncanny”: Cécile B. Evans at Seventeen Gallery]]> https://rhizome.org/editorial/2014/dec/4/please-dont-call-me-uncanny-hyperlinks-seventeen-g/#new_tab

A review of Cécile B. Evans’ show Hyperlinks, at Seventeen Gallery, London 15th Oct – 6th Dec 2014. With lots of editing and writerly support from Anton Haugen and Michael Conner.

Cécile B. Evans, Hyperlinks or it didn’t happen (2014). Still frame from HD video. Courtesy of Seventeen.

Media saturation in the internet’s “cut & paste” ecology has become so naturalized that contemporary film’s collaged aspects are not readily considered. Who are the subjects in, for example, a Ryan Trecartin and Lizzie Fitch film? And for whom do they perform? When I show these films in my class, my students switch tabs in their browsers, Snapchat each other, like photos, fav tweets—often on multiple screens at once—then state that this “work is about strange fake-tanned kids’ search for a toilet.” What has made this answer stay in my mind pertains to the word “about.” When used for these works, the banal statement “this work is about…” registers as a crisis of categorical closure that the simultaneous existence of disparate, accumulated content on a single screen constantly thwarts.

Central to Cécile B. Evans’ show Hyperlinks at Seventeen Gallery in London is the video-essay, Hyperlinks or it didn’t happen, displayed on a high-resolution TV with headphone cords installed at a comfortable cartoon-watching height in a corner of the space. Entering at the opposite corner, I navigate the gallery space, attempting to link the objects together—a prosthetic leg atop an upturned Eames chair replica near a rubber plant that counterbalances a plexiglass structure supporting 3D-printed arms (One Foot In The Grave, 2014), another Eames replica sitting in one corner (just a chair), various prints on the floor and walls—before sitting down, cross-legged, on a thick-pile rug strewn with postcard-sized images.

The film begins with a super high-resolution render of actor Philip Seymour Hoffman’s head floating over the shimmering image of a jellyfish. “I’m not magic, and please don’t call me uncanny,” says a synthetically-augmented human voice. “I’m just a bad copy made too perfectly, too soon.” The video lingers on Hoffman’s face. His lips do not move — at least, not in sync with the voice claiming to be the bad copy. “Fuck. Fuck FUCKING FUCK! I am full of him.” An audience laughter track plays. The bad copy’s hair flutters as his head bobs. The follicles on his nose look like they’d be the perfect environment for a blackhead to take up residence. The subject floating on the screen does not symbolize Hoffman; rather, it is an improper metaphor for the actor’s “untimely death”; for anything that transcends description, yet is saturated with meaning nonetheless. Hyperlinks is so full of meaning that, as the voice suggests, it is set to burst.

Evans wants us to feel uncomfortable at the absence of an uncanny feeling, and by referring to this lack directly in the monologue of the simulated voice, she sets up a relation between the viewer and this highly stylized, digital avatar. Hoffman, the image-thing, is not really a metaphor, nor is he really a copy, a simulation, or even a simulacrum of a more-real body. Hoffman, the image-thing, is literal and actual, perhaps more so to the viewer than Phillip Seymour Hoffman, the flesh-and-blood human, or his “untimely death” was/will/could ever be. In her 2010 essay A Thing Like You and Me, Hito Steyerl defines the image as a thing whose “immortality… originates… from its ability to be xeroxed, recycled, and reincarnated.” [1] Like the postcards strewn throughout Hyperlinks, the floating, self-referential Hoffman points out a literal truth: Hoffman’s head is an “improper metaphor” [2] for the image that it actually is.

Catachresis, a term we can employ for such “improper metaphors,” is a forced extension of meaning employed “when no proper, or literal, term is available.” [3] According to Vivian Sobchack, “catachresis is differentiated from proper metaphor insofar as it forces us to confront” [4] the deficiency and failure of language. In linking across the gap between figural and literal meaning, catachresis marks the precise moment “where living expression states living existence.” [5] The image-things of Evans’ film are similarly analogically hyperlinked to the metaphors they supposedly express. In several sequences, an invisible, green-screened woman wanders a beach with a man who we are told is her partner: the nameless protagonist of Ralph Ellison’s 1952 novel, The Invisible Man. For a few seconds, we are confronted with Marlon Brando’s floating head, isolated from scenes deleted from Superman II (1980) to be digitally repurposed for the 2006 film Superman Returns, so the actor could reprise his role as Superman’s father two years after his death. The vocaloid pop-star Hatsune Miku serenades us with the song “Forever Young,” referencing her own immortality in the server banks and USB sticks that confer her identity. We then see Edward Snowden, rolling onto a stage in Canada, give a TED talk on taking back the web through a “Telepresence Robot” (an object that looks like a flat-panel screen attached to a Segway). As in a collage, the film splices and dices contiguous space and time, producing a unique configuration of catachretic associations, rather than a continuous narrative about something. Fictions are interwoven with facts, gestures with statements, figures with subjects. Moving about the gallery, the viewer hovers about the strewn postcard-sized images of a counterfeit Kermit the Frog, the render of Philip Seymour Hoffman, and the “hologram” of Michael Jackson. The image-things in Evans’ work seem to exist beyond subject/object distinctions, outside of sense, above their own measure of themselves — selves that they, nonetheless, frequently seem to be measuring and re-measuring.

The exhibition comes with its own printed glossary of terms listing references the video makes. The first term in the glossary is “Hyperlink”: A reference to external data that a reader can open either by clicking or by hovering over a point of origin. From Greek hyper (prep. and adv.) “over, beyond, overmuch, above measure.” Here again the figural and literal are called into question. In relation to what can one say the “external” or “beyond” of a hyperlink resides? 
Why is the etymology for “link” not also given? Though at first the glossary seems to map the associations, the links, of the disparate imagery presented in the show, it is suggestive of the total-work, presenting an almost anarchistic circulation of imagery as a coherent system. The glossary’s reification of associations also gestures towards the internet’s systemic interpellation of our networked subjecthood; as with the film title’s reference to the phrase “Pics, or it didn’t happen,” the show’s contrast between a body’s lifespan and a circulating digital image also seems to echo our status as “poor copies” of our digital semblances. The image-things in “Hyperlinks” serve – to hijack the words of Scott Bukatman – “as the partial and fragmented representations that they are.” [6] Through the works’ superfluity of associations and meanings, I found myself considering the impossibility of categorical closure. If totalization means incorporating all disparate things, an ultimate difference erupts: a moment that also signals the deficiency and failure of systemization itself. What makes Evans’ work successful is this endless calling up of the specter of the beyond, the outside, the everything else, from within the perceived totality of the internet. With the glossary, the totality of the show almost feels performative, gesturing towards the systemic totalizing we confer onto art objects in a gallery space before, after, and, especially, during their imaging. But image-things are considerably more liberated than either objects or subjects. They are more real, precisely because we recognize them as images.

 

[1] Hito Steyerl, “A Thing Like You and Me,” in The Wretched of the Screen, e-flux Journal (Sternberg Press, 2012), 46–59.

[2] Vivian Carol Sobchack, Carnal Thoughts Embodiment and Moving Image Culture (Berkeley: University of California Press, 2004), 81.

[3] Richard Shiff, “Cezanne’s Physicality: The Politics of Touch,” in The Language of Art History, ed. Salim Kemal and Ivan Gaskell (Cambridge University Press, 1991), 150.

[4] Sobchack, Carnal Thoughts Embodiment and Moving Image Culture, 81.

[5] Paul Ricoeur, The Rule of Metaphor: The Creation of Meaning in Language (Routledge, 2004), 72.

[6] Scott Bukatman, Terminal Identity: The Virtual Subject in Postmodern Science Fiction (Durham: Duke University Press, 1993), 40.

]]>
Thu, 04 Dec 2014 13:17:45 -0800 https://rhizome.org/editorial/2014/dec/4/please-dont-call-me-uncanny-hyperlinks-seventeen-g/#new_tab
<![CDATA["Please don't call me uncanny": Cécile B. Evans at Seventeen Gallery]]> http://rhizome.org/editorial/2014/dec/4/please-dont-call-me-uncanny-hyperlinks-seventeen-g

Cécile B. Evans, Hyperlinks or it didn't happen (2014). Still frame from HD video. Courtesy of Seventeen. Media saturation in the internet's "cut & paste" ecology has become so naturalized that contemporary film's collaged aspects are not readily considered. Who are the subjects in, for example, a Ryan Trecartin and Lizzie Fitch film? And for whom do they perform? When I show these films in my class, my students switch tabs in their browsers, Snapchat each other, like photos, fav tweets—often on multiple screens at once—then state that this "work is about strange fake-tanned kids' search for a toilet." What has made this answer stay in my mind pertains to the word "about." When used for these works, the banal statement "this work is about…" registers as a crisis of categorical closure that the simultaneous existence of disparate, accumulated content on a single screen constantly thwarts. Central to Cécile B. Evans' show Hyperlinks at Seventeen Gallery in London is the video-essay, Hyperlinks or it didn't happen, displayed on a high-resolution TV with headphone cords installed at a comfortable cartoon-watching height in a corner of the space. Entering at the opposite corner, I navigate the gallery space, attempting to link the objects together—a prosthetic leg atop an upturned Eames chair replica near a rubber plant that counterbalances a plexiglass structure supporting 3D-printed arms (One Foot In The Grave, 2014), another Eames replica sitting in one corner (just a chair), various prints on the floor and walls—before sitting down, cross-legged, on a thick-pile rug strewn with postcard-sized images.  

Cécile B. Evans, "Hyperlinks," Installation view. Courtesy of Seventeen. The film begins with a super high-resolution render of actor Philip Seymour Hoffman's head floating over the shimmering image of a jellyfish. "I'm not magic, and please don't call me uncanny," says a synthetically-augmented human voice. "I'm just a bad copy made too perfectly, too soon." The video lingers on Hoffman's face. His lips do not move — at least, not in sync with the voice claiming to be the bad copy. "Fuck. Fuck FUCKING FUCK! I am full of him." An audience laughter track plays. The bad copy's hair flutters as his head bobs. The follicles on his nose look like they'd be the perfect environment for a blackhead to take up residence. The subject floating on the screen does not symbolize Hoffman, rather, it is an improper metaphor for the actor's "untimely death'; for anything that transcends description, yet is saturated with meaning nonetheless. Hyperlinks is so full of meaning that, as the voice suggests, it is set to burst. Evans wants us to feel uncomfortable at the absence of an uncanny feeling, and by referring to this lack directly in the monologue of the simulated voice, she sets up a relation the viewer and this, a highly stylized, digital avatar. Hoffman, the image-thing, is not really a metaphor, nor is he really a copy, a simulation, or even a simulacrum of a more-real body. Hoffman, the image-thing, is literal and actual, perhaps more so to the viewer than Phillip Seymour Hoffman, the flesh-and-blood human or his "untimely death" was/will/could ever be. In her 2010 essay A Thing Like You and Me, Hito Steyerl defines the image as a thing whose "immortality… originates… from its ability to be xeroxed, recycled, and reincarnated." [1] Like the postcards strewn throughout Hyperlinks, the floating, self-referential Hoffman points out a literal truth: Hoffman's head is an "improper metaphor" [2] for the image that it actually is.  Catachresis, a term we can employ for such "improper metaphors," is a forced extension of meaning employed when "when no proper, or literal, term is available." [3] According to Vivian Sobchack, "catachresis is differentiated from proper metaphor insofar as it forces us to confront" [4] the deficiency and failure of language. In linking across the gap between figural and literal meaning, catachresis marks the precise moment "where living expression states living existence." [5] The image-things of Evans' film are similarly analogically hyperlinked to the metaphors they supposedly express. In several sequences, an invisible, green-screened woman wanders a beach with a man who we are told is her partner: the nameless protagonist of Ralph Ellison's 1952 novel, The Invisible Man. For a few seconds, we are confronted with Marlon Brando's floating head, isolated from scenes deleted from Superman II (1980) to be digitally repurposed for the 2006 film Superman Returns, so the actor could reprise his role as Superman's father two years after his death.

The vocaloid pop-star Hatsune Miku serenades us with the song "Forever Young," referencing her own immortality in the server banks and USB sticks that confer her identity. We then see, rolling onto a stage in Canada, Edward Snowden gives a TED talk on taking back the web, through a "Telepresence Robot" (an object that looks like a flat-panel screen attached to a Segway). As in a collage, the film splices and dices contiguous space and time, producing a unique configuration of catachretic associations, rather than a continuous narrative about something. Fictions are interwoven with facts, gestures with statements, figures with subjects. Moving about the gallery, the viewer hovers about the strewn postcard-sized images of a counterfeit Kermit the Frog, the render of Philip Seymour Hoffman, and the "hologram" of Michael Jackson. The image-things in Evans' work seem to exist beyond subject/object distinctions, outside of sense, above their own measure of themselves —selves that they, nonetheless, frequently seem to be measuring and re-measuring. The exhibition comes with its own printed glossary of terms listing references the video makes. The first term in the glossary is "Hyperlink":               A reference to external data that a reader can open either by clicking or by hovering over a point of origin. From Greek hyper (prep. And adv.) "over, beyond, overmuch, above measure." Here again the figural and literal are called into question. In relation to what can one say the "external" or "beyond" of a hyperlink resides? Why is the etymology for "link" not also given? Though at first, the glossary seems to map the associations, the links, of the disparate imagery presented in the show, it is suggestive of the total-work, presenting an almost anarchistic circulation of imagery as a coherent system. The glossary's reification of associations gestures towards also the internet's systemic interpellation of our networked subjecthood; as well as in the film title's reference to the phrase "Pics, or it didn't happen," the show's contrast between a body's lifespan and a circulating digital image seems to also echo of our status as "poor copies" of our digital semblances. The image-things in "Hyperlinks" serve – to hijack the words of Scott Bukatman - "as the partial and fragmented representations that they are." [6] . Through the works' superfluity of associations and meanings, I found myself considering the impossibility of categorical closure. If totalization means incorporating all disparate things, an ultimate difference erupts: a moment that also signals the deficiency and failure of systemization itself. What makes Evans work successful is this endless calling up of the specter of the beyond, the outside, the everything else, from within the perceived totality of the internet. With the glossary, the totality of the show almost feels performative, gesturing towards the systemic totalizing we confer onto art objects in a gallery space before, after, and, especially, during their imaging. But image-things are considerably more liberated than either objects or subjects. They are more real, precisely because we recognize them as images.

[1] Hito Steyerl, “A Thing Like You and Me,” in The Wretched of the Screen, e-flux Journal (Sternberg Press, 2012), 46–59.

[2] Vivian Carol Sobchack, Carnal Thoughts Embodiment and Moving Image Culture (Berkeley: University of California Press, 2004), 81.

[3] Richard Shiff, “Cezanne’s Physicality: The Politics of Touch,” in The Language of Art History, ed. Salim Kemal and Ivan Gaskell (Cambridge University Press, 1991), 150.

[4] Sobchack, Carnal Thoughts Embodiment and Moving Image Culture, 81.

[5] Paul Ricoeur, The Rule of Metaphor: The Creation of Meaning in Language (Routledge, 2004), 72.

[6] Scott Bukatman, Terminal Identity: The Virtual Subject in Postmodern Science Fiction (Durham: Duke University Press, 1993), 40.

]]>
Thu, 04 Dec 2014 12:17:45 -0800 http://rhizome.org/editorial/2014/dec/4/please-dont-call-me-uncanny-hyperlinks-seventeen-g
<![CDATA[Who Goes There?]]> http://thing.popapostle.com/html/episodes/Who-Goes-There.htm

Didja Know? This short story first appeared in Astounding Stories, August 1938, and is the inspiration for the 1951 film The Thing from Another World and 1982's The Thing. The 1982 version is much more faithful to the short story, though there are differences, as discussed below.

]]>
Mon, 08 Sep 2014 17:11:47 -0700 http://thing.popapostle.com/html/episodes/Who-Goes-There.htm
<![CDATA[Meet the Father of Digital Life]]> http://nautil.us/issue/14/mutation/meet-the-father-of-digital-life

In 1953, at the dawn of modern computing, Nils Aall Barricelli played God. Clutching a deck of playing cards in one hand and a stack of punched cards in the other, Barricelli hovered over one of the world’s earliest and most influential computers, the IAS machine, at the Institute for Advanced Study in Princeton, New Jersey. During the day the computer was used to make weather forecasting calculations; at night it was commandeered by the Los Alamos group to calculate ballistics for nuclear weaponry. Barricelli, a maverick mathematician, part Italian and part Norwegian, had finagled time on the computer to model the origins and evolution of life.

Inside a simple red brick building at the northern corner of the Institute’s wooded wilds, Barricelli ran models of evolution on a digital computer. His artificial universes, which he fed with numbers drawn from shuffled playing cards, teemed with creatures of code—morphing, mutating, melting, maintaining. He created laws that determined, independent of any foreknowledge on his part, which assemblages of binary digits lived, which died, and which adapted. As he put it in a 1961 paper, in which he speculated on the prospects and conditions for life on other planets, “The author has developed numerical organisms, with properties startlingly similar to living organisms, in the memory of a high speed computer.” For these coded critters, Barricelli became a maker of worlds.

Until his death in 1993, Barricelli floated between biological and mathematical sciences, questioning doctrine, not quite fitting in. “He was a brilliant, eccentric genius,” says George Dyson, the historian of technology and author of Darwin Among The Machines and Turing’s Cathedral, which feature Barricelli’s work. “And the thing about geniuses is that they just see things clearly that other people don’t see.”

Barricelli programmed some of the earliest computer algorithms that resemble real-life processes: a subdivision of what we now call “artificial life,” which seeks to simulate living systems—evolution, adaptation, ecology—in computers. Barricelli presented a bold challenge to the standard Darwinian model of evolution by competition by demonstrating that organisms evolved by symbiosis and cooperation.

Pixar cofounder Alvy Ray Smith says Barricelli influenced his earliest thinking about the possibilities for computer animation.

In fact, Barricelli’s projects anticipated many current avenues of research, including cellular automata, computer programs involving grids of numbers paired with local rules that can produce complicated, unpredictable behavior. His models bear striking resemblance to the one-dimensional cellular automata—life-like lattices of numerical patterns—championed by Stephen Wolfram, whose search tool Wolfram Alpha helps power the brain of Siri on the iPhone. Nonconformist biologist Craig Venter, in defending his creation of a cell with a synthetic genome—“the first self-replicating species we’ve had on the planet whose parent is a computer”—echoes Barricelli.
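As a rough illustration of what such a one-dimensional cellular automaton looks like in practice, here is a short Python sketch of Wolfram’s elementary Rule 30: a single row of cells, a three-cell local rule, and complicated behaviour emerging from an almost trivially simple starting state. This is a generic textbook example, not Barricelli’s or Wolfram’s own code.

```python
# A minimal one-dimensional cellular automaton: each cell's next state depends
# only on itself and its two neighbours, looked up in the bits of the rule number.
def step(row, rule=30):
    """Apply an elementary CA rule to one generation (wrapping at the edges)."""
    n = len(row)
    return [
        (rule >> (row[(i - 1) % n] * 4 + row[i] * 2 + row[(i + 1) % n])) & 1
        for i in range(n)
    ]

row = [0] * 31
row[15] = 1                      # a single live cell in the middle
for _ in range(16):
    print("".join("#" if c else "." for c in row))
    row = step(row)
```

Run as-is, the single seed cell unfolds into the familiar, irregular Rule 30 triangle: a small taste of the “complicated, unpredictable behavior” described above.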

Barricelli’s experiments had an aesthetic side, too. Uncommonly for the time, he converted the digital 1s and 0s of the computer’s stored memory into pictorial images. Those images, and the ideas behind them, would influence computer animators in generations to come. Pixar cofounder Alvy Ray Smith, for instance, says Barricelli stirred his earliest thinking about the possibilities for computer animation, and beyond that, his philosophical muse. “What we’re really talking about here is the notion that living things are computations,” he says. “Look at how the planet works and it sure does look like a computation.”

Despite Barricelli’s pioneering experiments, barely anyone remembers him. “I have not heard of him to tell you the truth,” says Mark Bedau, professor of humanities and philosophy at Reed College and editor of the journal Artificial Life. “I probably know more about the history than most in the field and I’m not aware of him.”

Barricelli was an anomaly, a mutation in the intellectual zeitgeist, an unsung hero who has mostly languished in obscurity for the past half century. “People weren’t ready for him,” Dyson says. That a progenitor has not received much acknowledgment is a failing not unique to science. Visionaries often arrive before their time. Barricelli charted a course for the digital revolution, and history has been catching up ever since.

EVOLUTION BY THE NUMBERS: Barricelli converted his computer tallies of 1s and 0s into images. In this 1953 Barricelli print, explains NYU associate professor Alexander Galloway, the chaotic center represents mutation and disorganization. The more symmetrical fields toward the margins depict Barricelli’s evolved numerical organisms. From the Shelby White and Leon Levy Archives Center, Institute for Advanced Study, Princeton.

Barricelli was born in Rome on Jan. 24, 1912. According to Richard Goodman, a retired microbiologist who met and befriended the mathematician in the 1960s, Barricelli claimed to have invented calculus before his tenth birthday. When the young boy showed the math to his father, he learned that Newton and Leibniz had preempted him by centuries. While a student at the University of Rome, Barricelli studied mathematics and physics under Enrico Fermi, a pioneer of quantum theory and nuclear physics. A couple of years after graduating in 1936, he immigrated to Norway with his recently divorced mother and younger sister.

As World War II raged, Barricelli studied. An uncompromising oddball who teetered between madcap and mastermind, Barricelli had a habit of exclaiming “Absolut!” when he agreed with someone, or “Scandaloos!” when he found something disagreeable. His accent was infused with Scandinavian and Romance pronunciations, making it occasionally challenging for colleagues to understand him. Goodman recalls one of his colleagues at the University of California, Los Angeles, who just happened to be reading Barricelli’s papers “when the mathematician himself barged in and, without ceremony, began rattling off a stream of technical information about his work on phage genetics,” a science that studies gene mutation, replication, and expression through model viruses. Goodman’s colleague understood only fragments of the speech, but realized it pertained to what he had been reading.

“Are you familiar with the work of Nils Barricelli?” he asked.

“Barricelli! That’s me!” the mathematician cried.

Notwithstanding having submitted a 500-page dissertation on the statistical analysis of climate variation in 1946, Barricelli never completed his Ph.D. Recalling the scene in the movie Amadeus in which the Emperor of Austria commends Mozart’s performance, save for there being “too many notes,” Barricelli’s thesis committee directed him to slash the paper to a tenth of the size, or else it would not accept the work. Rather than capitulate, Barricelli forfeited the degree.

Barricelli began modeling biological phenomena on paper, but his calculations were slow and limited. He applied to study in the United States as a Fulbright fellow, where he could work with the IAS machine. As he wrote on his original travel grant submission in 1951, he sought “to perform numerical experiments by means of great calculating machines,” in order to clarify, through mathematics, “the first stages of evolution of a species.” He also wished to mingle with great minds—“to communicate with American statisticians and evolution-theorists.” By then he had published papers on statistics and genetics, and had taught Einstein’s theory of relativity. In his application photo, he sports a pyramidal moustache, hair brushed to the back of his elliptic head, and hooded, downturned eyes. At the time of his application, he was a 39-year-old assistant professor at the University of Oslo.

Although the program initially rejected him due to a visa issue, in early 1953 Barricelli arrived at the Institute for Advanced Study as a visiting member. “I hope that you will be finding Mr. Baricelli [sic] an interesting person to talk with,” wrote Ragnar Frisch, a colleague of Barricelli’s who would later win the first Nobel Prize in Economics, in a letter to John von Neumann, a mathematician at IAS, who helped devise the institute’s groundbreaking computer. “He is not very systematic always in his exposition,” Frisch continued, “but he does have interesting ideas.”

PSYCHEDELIC BARRICELLI: In this recreation of a Barricelli experiment, NYU associate professor Alexander Galloway has added color to show the gene groups more clearly. Each swatch of color signals a different organism. Borders between the color fields represent turbulence as genes bounce off and meld with others, symbolizing Barricelli’s symbiogenesis. Courtesy Alexander Galloway.

Centered above Barricelli’s first computer logbook entry at the Institute for Advanced Study, in handwritten pencil script dated March 3, 1953, is the title “Symbiogenesis problem.” This was his theory of proto-genes, virus-like organisms that teamed up to become complex organisms: first chromosomes, then cellular organs, onward to cellular organisms and, ultimately, other species. Like parasites seeking a host, these proto-genes joined together, according to Barricelli, and through their mutual aid and dependency, originated life as we know it.

Standard neo-Darwinian doctrine maintained that natural selection was the main means by which species formed. Slight variations and mutations in genes combined with competition led to gradual evolutionary change. But Barricelli disagreed. He pictured nimbler genes acting as a collective, cooperative society working together toward becoming species. Darwin’s theory, he concluded, was inadequate. “This theory does not answer our question,” he wrote in 1954, “it does not say why living organisms exist.”

Barricelli coded his numerical organisms on the IAS machine in order to prove his case. “It is very easy to fabricate or simply define entities with the ability to reproduce themselves, e.g., within the realm of arithmetic,” he wrote.

The early computer looked sort of like a mix between a loom and an internal combustion engine. Lining the middle region were 40 Williams cathode ray tubes, which served as the machine’s memory. Within each tube, a beam of electrons (the cathode ray) bombarded one end, creating a 32-by-32 grid of points, each consisting of a slight variation in electrical charge. There were five kilobytes of memory total stored in the machine. Not much by today’s standards, but back then it was an arsenal.

Barricelli saw his computer organisms as a blueprint of life—on this planet and any others.

Inside the device, Barricelli programmed steadily mutable worlds each with rows of 512 “genes,” represented by integers ranging from negative to positive 18. As the computer cycled through hundreds and thousands of generations, persistent groupings of genes would emerge, which Barricelli deemed organisms. The trick was to tweak his manmade laws of nature—“norms,” as he called them—which governed the universe and its entities just so. He had to maintain these ecosystems on the brink of pandemonium and stasis. Too much chaos and his beasts would unravel into a disorganized shamble; too little and they would homogenize. The sweet spot in the middle, however, sustained life-like processes.

Barricelli’s balancing act was not always easygoing. His first trials were riddled with pests: primitive, often single numeric genes invaded the space and gobbled their neighbors. Typically, he was only able to witness a couple of hereditary changes, or a handful at best, before the world unwound. To create lasting evolutionary processes, he needed to handicap these pests’ ability to rapidly reproduce. By the time he returned to the Institute in 1954 to begin a second round of experiments, Barricelli made some critical changes. First, he capped the proliferation of the pests to once per generation. That constraint allowed his numerical organisms enough leeway to outpace the pests. Second, he began employing different norms to different sections of his universes. That forced his numerical organisms always to adapt.
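A rough sense of what these experiments involved can be had from a short simulation. The Python sketch below is a loose simplification based only on the description above: a row of 512 integer ‘genes’ between -18 and 18, a reproduction ‘norm’ that moves each gene by its own value, and a crude collision rule standing in for mutation. The specific rules are my own stand-ins for illustration, not Barricelli’s actual norms.

```python
# A much-simplified, Barricelli-style universe: not his exact rules, only a
# sketch of the kind of system described in the text.
import random

SIZE = 512          # genes per universe, as in the text
GENE_RANGE = 18     # integers from -18 to +18

def new_universe():
    return [random.randint(-GENE_RANGE, GENE_RANGE) for _ in range(SIZE)]

def step(genes):
    """One generation: every non-zero gene tries to copy itself |value| cells away."""
    nxt = [0] * SIZE
    for i, g in enumerate(genes):
        if g == 0:
            continue
        target = (i + g) % SIZE
        if nxt[target] == 0:
            nxt[target] = g                       # empty cell: straightforward copy
        elif nxt[target] != g:
            # collision between different genes: a crude stand-in for mutation
            nxt[target] = random.randint(-GENE_RANGE, GENE_RANGE)
    return nxt

universe = new_universe()
for generation in range(100):
    universe = step(universe)
print(sorted(set(universe))[:10])   # which gene values persist after 100 generations
```

Even in a toy version like this, the tension Barricelli wrestled with is visible: tighten the collision rule and the universe homogenizes; loosen it and the genes dissolve into noise.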

Even in the earlier universes, Barricelli realized that mutation and natural selection alone were insufficient to account for the genesis of species. In fact, most single mutations were harmful. “The majority of the new varieties which have shown the ability to expand are a result of crossing-phenomena and not of mutations, although mutations (especially injurious mutations) have been much more frequent than hereditary changes by crossing in the experiments performed,” he wrote.

When an organism became maximally fit for an environment, the slightest variation would only weaken it. In such cases, it took at least two modifications, effected by a cross-fertilization, to give the numerical organism any chance of improvement. This indicated to Barricelli that symbioses, gene crossing, and “a primitive form of sexual reproduction,” were essential to the emergence of life.

“Barricelli immediately figured out that random mutation wasn’t the important thing; in his first experiment he figured out that the important thing was recombination and sex,” Dyson says. “He figured out right away what took other people much longer to figure out.” Indeed, Barricelli’s theory of symbiogenesis can be seen as anticipating the work of independent-thinking biologist Lynn Margulis, who in the 1960s showed that it was not necessarily genetic mutations over generations, but symbiosis, notably of bacteria, that produced new cell lineages.

Barricelli saw his computer organisms as a blueprint of life—on this planet and any others. “The question whether one type of symbio-organism is developed in the memory of a digital computer while another type is developed in a chemical laboratory or by a natural process on some planet or satellite does not add anything fundamental to this difference,” he wrote. A month after Barricelli began his experiments on the IAS machine, Crick and Watson announced the shape of DNA as a double helix. But learning about the shape of biological life didn’t put a dent in Barricelli’s conviction that he had captured the mechanics of life on a computer. Let Watson and Crick call DNA a double helix. Barricelli called it “molecule-shaped numbers.”


What buried Barricelli in obscurity is something of a mystery. “Being uncompromising in his opinions and not a team player,” says Dyson, no doubt led to Barricelli’s “isolation from the academic mainstream.” Dyson also suspects Barricelli and the indomitable Hungarian mathematician von Neumann, an influential leader at the Institute of Advanced Study, didn’t hit it off. Von Neumann appears to have ignored Barricelli. “That was sort of fatal because everybody looked to von Neumann as the grandfather of self-replicating machines.”

Ever so slowly, though, Barricelli is gaining recognition. That stems in part from another of Barricelli’s remarkable developments; certainly one of his most beautiful. He didn’t rest with creating a universe of numerical organisms, he converted his organisms into images. His computer tallies of 1s and 0s would then self-organize into visual grids of exquisite variety and texture. According to Alexander Galloway, associate professor in the department of media, culture, and communication at New York University, a finished Barricelli “image yielded a snapshot of evolutionary time.”

When Barricelli printed sections of his digitized universes, they were dazzling. To modern eyes they might look like satellite imagery of an alien geography: chaotic oceans, stratigraphic outcrops, and the contours of a single stream running down the center fold, fanning into a delta at the patchwork’s bottom. “Somebody needs to do a museum show and show this stuff because they’re outrageous,” Galloway says.

Barricelli was an uncompromising oddball who teetered between madcap and mastermind.

Today, Galloway, a member of Barricelli’s small but growing cadre of boosters, has recreated the images. Following methods described by Barricelli in one of his papers, Galloway has coded an applet using the computer language Processing to revive Barricelli’s numerical organisms—with slight variation. While Barricelli encoded his numbers as eight-unit-long proto-pixels, Galloway condensed each to a single color-coded cell. By collapsing each number into a single pixel, Galloway has been able to fit eight times as many generations in the frame. These revitalized mosaics look like psychedelic cross-sections of the fossil record. Each swatch of color represents an organism, and when one color field bumps up against another one, that’s where cross-fertilization takes place.

“You can see these kinds of points of turbulence where the one color meets another color,” Galloway says, showing off the images on a computer in his office. “That’s a point where a number would be—or a gene would be—sort of jumping from one organism to another.” Here, in other words, is artificial life—Barricelli’s symbiogenesis—frozen in amber. And cyan and lavender and teal and lime and fuchsia.
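The rendering approach Galloway describes – one generation per row, one pixel per gene, a colour per value – is simple to reproduce in outline. The Python sketch below writes a plain PPM image from a grid of integer genes; the colour mapping and the random demo data are placeholders of my own, not Galloway’s Processing code or Barricelli’s palette.

```python
# Render a grid of integer genes as an image: rows are generations, columns are
# cells, and each gene value gets its own repeatable pseudo-colour.
import random

def value_to_rgb(g):
    """Map a gene value (e.g. -18..18) to a colour; empty cells (0) stay black."""
    if g == 0:
        return (0, 0, 0)
    return ((g * 67) % 256, (g * 131) % 256, (g * 199) % 256)

def render(generations, path="barricelli.ppm"):
    height, width = len(generations), len(generations[0])
    with open(path, "w") as f:
        f.write(f"P3\n{width} {height}\n255\n")
        for row in generations:
            f.write(" ".join(f"{r} {g} {b}" for r, g, b in map(value_to_rgb, row)) + "\n")

# Demo with random genes; in practice the rows would come from successive
# generations of a simulation like the one sketched earlier.
history = [[random.randint(-18, 18) for _ in range(512)] for _ in range(256)]
render(history)
```

Feeding it real generations rather than random noise produces exactly the kind of banded, turbulent colour fields Galloway describes, with borders where one gene group meets another.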

Galloway is not the only one to be struck by the beauty of Barricelli’s computer-generated digital images. As a doctoral student, Pixar cofounder Smith became familiar with Barricelli’s work while researching the history of cellular automata for his dissertation. When he came across Barricelli’s prints he was astonished. “It was remarkable to me that with such crude computing facilities in the early 50s, he was able to be making pictures,” Smith says. “I guess in a sense you can say that Barricelli got me thinking about computer animation before I thought about computer animation. I never thought about it that way, but that’s essentially what it was.”

Cyberspace now swells with Barricelli’s progeny. Self-replicating strings of arithmetic live out their days in the digital wilds, increasingly independent of our tampering. The fittest bits survive and propagate. Researchers continue to model reduced, pared-down versions of life artificially, while the real world bursts with Boolean beings. Scientists like Venter conjure synthetic organisms, assisted by computer design. Swarms of autonomous codes thrive, expire, evolve, and mutate underneath our fingertips daily. “All kinds of self-reproducing codes are out there doing things,” Dyson says. In our digital lives, we are immersed in Barricelli’s world.

]]>
Fri, 20 Jun 2014 06:08:03 -0700 http://nautil.us/issue/14/mutation/meet-the-father-of-digital-life
<![CDATA[The difference between a concept & a constraint, part 2: What is a constraint? | HTMLGIANT]]> http://htmlgiant.com/craft-notes/the-difference-between-a-concept-a-constraint-part-2-what-is-a-constraint/

OK, back to this.

]]>
Sat, 14 Jun 2014 15:48:47 -0700 http://htmlgiant.com/craft-notes/the-difference-between-a-concept-a-constraint-part-2-what-is-a-constraint/
<![CDATA[The difference between a concept & a constraint, part 1: What is a concept? | HTMLGIANT]]> http://htmlgiant.com/craft-notes/the-difference-between-a-concept-a-constraint-part-1-what-is-a-concept/

I wrote about this to some extent here, but I wanted to expound on the issue in what I hope is a more coherent form. Because I frequently see concepts confused with constraints, and the Oulipo lumped in with conceptual writing. For instance, this entry at Poets.

]]>
Sat, 14 Jun 2014 15:48:45 -0700 http://htmlgiant.com/craft-notes/the-difference-between-a-concept-a-constraint-part-1-what-is-a-concept/
<![CDATA[Umberto Eco and why we still dream of utopia]]> http://www.newstatesman.com/culture/2013/11/no-place-home

Places that have never existed except in the human imagination may find an incongruous afterlife in the everyday world. Umberto Eco tells of how an attempt to commemorate the brownstone New York home of Nero Wolfe, Rex Stout’s orchid-loving fictional detective, runs up against the resistance of fact. Wolfe’s house cannot be identified because Stout “always talked of a brownstone at a certain number on West 35th Street, but in the course of his novels he mentioned at least ten different street numbers – and what is more, there are no brownstones on 35th Street”. Using Eco’s typology, a fiction has been transmuted into a legend: “Legendary lands and places are of various kinds and have only one characteristic in common: whether they depend on ancient legends whose origins are lost in the mists of time or whether they are in effect a modern invention, they have created flows of belief.”

Because they involve the belief that they existed, exist or can be made to exist – whether in the past, the future or somewhere off the map – legendary places are illusions rather than fictions. The distinction may sometimes be blurry, as the example of Nero Wolfe’s house shows; but the difference is fundamental to this enriching and playfully erudite exploration of the fabulous lands that human beings have invented.

Fictions we know to be neither true nor false, and paradoxically this gives them a kind of absolute veracity that historical facts can never have: “The credulous believe that El Dorado and Lemuria exist or existed somewhere or other, but we all know that it is undeniably certain that Superman is Clark Kent and that Dr Watson was never Nero Wolfe’s right-hand man ... All the rest is open to debate.” Unfortunately, humans have an invincible need to believe in their fictions. So they turn them into legends, which they anxiously defend from doubt – even to the point of attacking and killing those who do not share them.

Eco thinks it is not too difficult to explain why humankind is so drawn to legendary places: “It seems that every culture – because the world of everyday reality is cruel and hard to live in – dreams of a happy land to which men once belonged, and may one day return.” Nowadays everyone believes that the ability to envision alternate worlds is one of humankind’s most precious gifts, a view Eco seems to endorse when, at the end of his journey through legendary lands, he describes these visions as “a truthful part of the reality of our imagination”. Yet Eco highlights a darker side of these visions when he describes how the Nazis drew inspiration from legends of ancient peoples, variously situated in ultima Thule (“a land of fire and ice where the sun never set”), Atlantis and the polar regions, who spoke languages that were “racially pure”. Himmler was obsessed with ancient Nordic runes, while in an interview after the war the commander of the SS in Rome claimed that when Hitler ordered him to kidnap Pope Pius XII so he could be interned in Germany, he also ordered the Pope to take from the Vatican library “certain runic manuscripts that evidently had esoteric value for him”.

The Nazi adoption of the swastika began with the Thule Society, a secret racist organisation founded in 1918. Legends of lost lands fed the ideology of Aryan supremacy. In 1907, Jörg Lanz founded the Order of the New Temple, preaching that “inferior races” should be subjected to castration, sterilisation, deportation to Madagascar and incineration – ideas, Eco notes, that “were later to be applied by the Nazis”. Legendary lands are idylls from which minorities, outsiders and other disturbing elements have been banished. When these fantasies of harmony enter politics, a process of exclusion is set in motion whose end point is mass murder and genocide.

A metamorphosis of fiction into legend occurred when some Nazis took seriously a picture of the world presented by the Victorian novelist Edward Bulwer-Lytton. In his novel The Coming Race (1871), Bulwer-Lytton tells of the “Vril-ya”, survivors from the destruction of Atlantis who live in the hollow interior of the earth and possess amazing powers as a result of being imbued with Vril, a type of cosmic energy. He intended the book as an exercise in fantasy literature, but the founder of the Thule Society, who also founded a Vril Society, seems to have taken it more literally. Occultists in several countries read Bulwer-Lytton’s novel as a fictional rendition of events that may actually have happened, and the legend was mixed into the stew of mad and bad ideas we now call Nazism.

The process at work was something like that described in Jorge Luis Borges’s story “Tlön, Uqbar, Orbis Tertius”, in which an encyclopaedia of an imaginary world subverts and disrupts the world that has hitherto been real. The difference is that in Borges’s incomparable fable the secret society that devised the encyclopaedia knew it to be fiction, while 19th-century occultists and some 20th-century Nazis accepted Bulwer-Lytton’s fiction as a version of fact. Among the marks that Bulwer-Lytton’s Vril-ya left in the real world, the most lasting was reassuringly prosaic: the name given to Bovril, the meat extract invented in the 1870s.

Among the legendary places human beings have dreamed up, those that Eco calls “the islands of utopia” have exercised a particular fascination in recent times. As he reminds us, “Etymologically speaking, utopia means non-place” – ou-topos, or no place. Thomas More, who coined the term in his book Utopia (composed in Latin and only translated in 1551 after More had been executed for treason in 1535), plays on an ambiguity in which the word also means a good or excellent place. Using a non-existent country to present an ideal model of government, More established a new literary genre, which included Étienne Cabet’s A Journey to Icaria (1840), in which a proto-communist society is envisioned, Samuel Butler’s Erewhon (1872, an anagram of “nowhere”) and William Morris’s News from Nowhere (1890).

Visions of ideal societies have recurred throughout history but such societies were nearly always placed in an irretrievable past. The paradise of milk and honey of which human beings dreamed – a land of perpetual peace and abundance – belonged in religion and mythology rather than history or science. Yet by the end of the 19th century, the fiction of an ideal society had been turned into a realisable human condition. Already in the second half of the 18th century, Rousseau was writing of an egalitarian society as if something of the kind had once existed – a move repeated by Marx and Engels in their theory of primitive communism, which they believed could be recreated at a higher level. More’s non-existent land was given a veneer of science and situated in a non-existent future. Having been a literary genre, utopia became a political legend.

The Book of Legendary Lands covers a vast range of non-places, including a flat and a hollow earth, the Antipodes, the lands of Homer and the many versions of Cockaigne (where honey and bread fall from the sky and no one is rich or poor). A fascinating chapter deals with the far more recent invention of Rennes-le-Château, a French village near Carcassonne that has been hailed as a site of immense treasure and of a priory established by descendants of Jesus, who supposedly did not die on the cross but fled to France and began the Merovingian dynasty.

Presented by Eco in light and witty prose, these legendary places are made more vivid by many well-chosen illustrations and historic texts. Yet this is far from being another coffee table book, however beautiful. As in much of his work, Eco’s theme is the slippage from fiction to illusion in the human mind. Rightly he sees this as a perennial tendency but it is one that has gathered momentum in modern times. So-called primitive cultures understood that history runs in cycles, with civilisations rising and falling much as the seasons come and go – a view of things echoed in Aristotle and the Roman historians. The rise of monotheism changed the picture, so that history came to be seen as an unfolding drama – a story with a beginning, an end and a redemptive meaning. Either way, no one believed that history could be governed by human will. It was fate, God or mere chaos that ruled human events.

Legendary lands began to multiply when human beings started to believe they could shape the future. Non-places envisioned by writers in the past were turned into utopian projects. At the same time, literature became increasingly filled with visions of hellish lands. As Eco puts it, “Sometimes utopia has taken the form of dystopia, accounts of negative societies.”

What counts as a dystopia, however, is partly a matter of taste. Aldous Huxley may have meant Brave New World (1932) as a warning but I suspect many people would find the kind of world he describes – genetically engineered and drug-medicated but also without violence, poverty or acute unhappiness – quite an attractive prospect. If the nightmarish society Huxley imagines is fortunately impossible, it is because it is supposed to be capable of renewing itself endlessly – a feature of utopias and one of the clearest signs of their unreality.

Whether you think a vision of the future is utopian or not depends on how you view society at the present time. Given the ghastly record of utopian politics in the 20th century, bien-pensants of all stripes never tire of declaring that all they want is improvement. They assume that the advances of the past are now permanent and new ones can simply be added on. But if you think society today is like all others have been – deeply flawed and highly fragile – you will understand that improvement can’t be inherited in this way. Sooner or later, past advances are sure to be lost, as the societies that have inherited them decline and fail. As everyone understood until just a few hundred years ago, this is the normal course of history.

No bien-pensant will admit this to be so. Indeed, many find the very idea of such a reversal difficult to comprehend. How could the advances that have produced the current level of civilisation – including themselves – be only a passing moment in the history of the species? Without realising the fact, these believers in improvement inhabit a legendary land – a place where what has been achieved in the past can be handed on into an indefinite future. The human impulse to dream up imaginary places and then believe them to be real, which Eco explores in this enchanting book, is as strong as it has ever been.

John Gray is the lead book reviewer of the NS. His latest book, “The Silence of Animals: On Progress and Other Modern Myths”, is published by Allen Lane (£18.99)

]]>
Wed, 11 Dec 2013 15:42:42 -0800 http://www.newstatesman.com/culture/2013/11/no-place-home
<![CDATA[Glitchometry]]> http://machinemachine.net/portfolio/glitchometry

I wrote an essay released in tandem with GLITCHOMETRY: Daniel Temkin’s solo exhibition, held at Transfer Gallery, New York – November 16 through December 14, 2013. The publication also features an interview with the artist by Curt Cloninger. Excerpt from my essay: Glitchometry turns away from the ‘new earth’; the milieu of cyphers that constitute our contemporary audio-visual cognizance. By foregoing the simulations relied on when Photoshopping an image, Temkin assumes an almost meditative patience with the will of the digital. As with Duchamp’s infra-thin – ‘the warmth of a seat which has just been left, reflection from a mirror or glass… velvet trousers, their whistling sound, is an infra-thin separation signalled’ – the one of the image and the other of the raw data is treated as a signal of meagre difference. Data is carefully bent in a sequence of sonifications that always risk falling back into the totalising violence of failure. Download as PDF More info: danieltemkin.com and TransferGallery.com
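
For readers who want a literal sense of what ‘bending’ image data can mean, here is a toy sketch (in Python, assuming a .bmp source file) of the general databending idea – treating an image’s raw bytes as if they were an audio signal and smearing an echo-like effect through them. This is an illustration only, not a reconstruction of Temkin’s audio-editor workflow, and the file names are hypothetical.

def bend(src: str, dst: str, delay: int = 997, mix: float = 0.5) -> None:
    data = bytearray(open(src, "rb").read())
    # BMP files store the offset of the pixel array at bytes 10-13; leaving the
    # header untouched keeps the bent file openable as an image.
    pixels_start = int.from_bytes(data[10:14], "little")
    for i in range(pixels_start + delay, len(data)):
        # mix each byte-"sample" with one from `delay` bytes earlier, clamped to 0-255
        data[i] = min(255, data[i] + int(mix * data[i - delay]))
    open(dst, "wb").write(bytes(data))

bend("input.bmp", "glitched.bmp")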

]]>
Wed, 20 Nov 2013 06:51:16 -0800 http://machinemachine.net/portfolio/glitchometry
<![CDATA[Codecs and Containers - the wonderful world of video files]]> http://www.youtube.com/watch?v=WpBjGUlBTHU&feature=youtube_gdata

Many people I meet in forums don't know that there is a difference between video codecs and containers. The confusion is so widespread that I decided to make an explanatory video on it. With a German accent. And my very first animation. Maybe it still helps...

Here are some useful software links on the topic.

Handbrake, a great tool to convert video codecs: http://handbrake.fr/

A little tool I programmed to change video containers without re-encoding: http://sourceforge.net/projects/containerswitch/
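
The same remuxing idea can be sketched from a script. A minimal example, assuming Python and that ffmpeg is installed and on the PATH (ffmpeg is used here as a generic stand-in, not the tool above, and the file names are placeholders): the audio and video streams are copied untouched, only the container around them changes.

import subprocess

def remux(src: str, dst: str) -> None:
    # "-c copy" copies the existing streams without re-encoding them,
    # so the codecs stay the same and only the container format changes.
    subprocess.run(["ffmpeg", "-i", src, "-c", "copy", dst], check=True)

remux("input.mkv", "output.mp4")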

The video was animated in Apple Motion 5. The graphics included are from openclipart.org, thanks a lot to them.

]]>
Thu, 14 Nov 2013 03:45:19 -0800 http://www.youtube.com/watch?v=WpBjGUlBTHU&feature=youtube_gdata