MachineMachine /stream - search for entropy
https://machinemachine.net/stream/feed
en-us
http://blogs.law.harvard.edu/tech/rss
LifePress
therourke@gmail.com

<![CDATA[The Marvelization of Cinema]]> https://www.youtube.com/watch?v=5tmxfVWDgMM

Start exploring the riches of cinema with an extended free trial at https://mubi.com/likestoriesofold

Help me make more videos! Support this channel: https://www.patreon.com/LikeStoriesofOld
Leave a One-Time Donation: https://www.paypal.me/TomvanderLinden

Facebook: https://www.facebook.com/LikeStoriesofOld
Instagram: https://www.instagram.com/tom.vd.linden
Twitter: https://twitter.com/Tom_LSOO

About this video essay: We've all felt it: the movies have changed. But how so exactly, and why? And what can be done about it? In this extensive critique, I try to capture the decline of modern cinema in one unifying theory.

00:00 The Marvelization of Cinema
01:52 What is Storytelling Entropy?
06:26 Hollow Franchises
11:15 Meta-References to Nowhere
17:40 Corporate Passion
25:23 Breaking the Cycle
33:08 Meaningful Engagement

Further Reading:
Like Stories of Old – The Complete Reading List: https://kit.co/likestoriesofold/reading-list
10 Books that changed my life: https://kit.co/likestoriesofold/10-books-that-changed-my-life
10 More books that inspired my thinking: https://kit.co/likestoriesofold/10-more-books-that-inspired-my-thinking

My Camera Gear: https://kit.co/likestoriesofold/my-travel-camera-gear

Media included: 12 Angry Men; 2001 A Space Odyssey; A Good Day to Die Hard; A Hidden Life; Aftersun; Ahsoka; Alien; Alien vs. Predator; Aliens; Ant-Man and the Wasp Quantumania; Asteroid City; Avengers Age of Ultron; Avengers Endgame; Avengers Infinity War; Babylon; Barbie; Batman Begins; Batman v Superman; Blade Runner 2049; Captain America Civil War; Captain America The First Avenger; Captain America The Winter Soldier; Citadel; Deadpool; Decision to Leave; Die Hard; Doctor Strange in the Multiverse of Madness; Dune; Dungeons and Dragons; Eternals; Foundation; Fast X; Free Guy; Game of Thrones; Ghostbusters Afterlife; Ghosted; Gladiator; Godzilla; Godzilla vs Kong; Goodfellas; Inception; Indiana Jones and the Dial of Destiny; Invasion of the Body Snatchers; Iron Man 3; Iron Man; John Wick Chapter 3 & 4; Jurassic Park; Jurassic World; Jurassic World Dominion; Lawrence of Arabia; Loki; Mission Impossible 1, 2, Fallout, Rogue Nation & Dead Reckoning; Moon Knight, Obi-Wan Kenobi; Ocean's Eleven; Once Upon a Time in Hollywood; Past Lives; Predator 1 & 2; Red Notice; Secret Invasion; Seven Samurai; Shang-Chi; She-Hulk; Spiderman 1 & 2; Spider-Man No Way Home; Star Wars A New Hope, The Empire Strikes Back, Return of the Jedi, The Phantom Menace, The Revenge of the Sith, The Force Awakens, The Last Jedi & the Rise of Skywalker; Synecdoche New York; Terminator 1, 2 & Dark Fate; The Dark Knight, The Amazing Spider-Man 2; The Avengers; The Fabelmans; The Falcon and the Winter Soldier; The Flash; The Godfather 1 & 2; The Gray Man; The Hobbit Trilogy; The Lord of the Rings Trilogy; The Rings of Power; The Matrix 1 & 4; The Mummy; The Prestige; The Thing; The Witcher; Thor Love and Thunder; Thor Ragnarok; Thor The Dark World; Top Gun; Top Gun Maverick; Transformers Rise of the Beasts; Uncharted

Business inquiries: lsoo@standard.tv
Say hi: likestoriesofold@gmail.com

Music:
Dexter Britain - Hoursglass
Jo Blankenburg - Dirt
Dexter Britain - Your Own World
Alaskan Tapes - Swimming and Dancing and Floating in Circles
Dexter Britain - Headspace
Cultus - Between Mountains
Dexter Britain - Air
Dexter Britain - Raising

Take your films to the next level with music from Musicbed. Sign up for a free account to listen for yourself: https://fm.pxf.io/c/3532571/1347628/16252

]]>
Tue, 07 Nov 2023 07:35:31 -0800 https://www.youtube.com/watch?v=5tmxfVWDgMM
<![CDATA[Personal Entropy - Journal #126 April 2022 - e-flux]]> https://www.e-flux.com/journal/126/460209/personal-entropy/

Recently I began realizing that my perception of the world is beginning to change in a peculiar way. There are no longer distinctions between more and less important themes or work; both conceptually and visually, everything is becoming equally (un)important.

]]>
Fri, 03 Jun 2022 05:53:07 -0700 https://www.e-flux.com/journal/126/460209/personal-entropy/
<![CDATA[towardsdatascience.com]]> https://towardsdatascience.com/but-what-is-entropy-ae9b2e7c2137

This write-up re-introduces the concept of entropy from different perspectives with a focus on its importance in machine learning, probabilistic programming, and information theory.
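As a small, concrete companion to the concept the linked write-up covers (this sketch is mine, not code from that article): Shannon entropy measures the average uncertainty of a discrete probability distribution, in bits, and fits in a few lines of Python.

```python
import math

def shannon_entropy(probs):
    """H(p) = -sum(p_i * log2(p_i)), in bits.

    Zero-probability terms are skipped, following the usual
    convention that 0 * log(0) = 0.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally uncertain for two outcomes.
print(shannon_entropy([0.5, 0.5]))  # 1.0 bit
# A heavily biased coin carries less surprise.
print(shannon_entropy([0.9, 0.1]))  # ~0.47 bits
# A certain outcome carries no information at all.
print(shannon_entropy([1.0]))
```

The same quantity underlies the cross-entropy losses mentioned in machine-learning contexts like the one the article discusses.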

]]>
Fri, 03 Jun 2022 05:52:21 -0700 https://towardsdatascience.com/but-what-is-entropy-ae9b2e7c2137
<![CDATA[Sonic Acts 2017: The Noise of Becoming: On Monsters, Men, and Every Thing in Between]]> https://machinemachine.net/portfolio/sonic-acts-2017-the-noise-of-becoming-on-monsters-men-and-every-thing-in-between/

UPDATE: My talk is now also available in The Noise of Being publication, published by Sonic Acts in September 2017.

A talk I delivered at Sonic Acts Festival 2017: The Noise of Being, in which I refigure the sci-fi horror monster The Thing from John Carpenter’s 1982 film of the same name:

The Thing is a creature of endless mimetic transformations, capable of becoming the grizzly-faced men who fail to defeat it. The most enduring quality of The Thing is its ability to perform self-effacement and subsequent renewal at every moment, a quality we must embrace and mimic ourselves if we are to outmanoeuvre the monsters that harangue us.

This talk was part of a panel featuring Laurie Penny and Ytasha Womack, entitled Speculative Fiction: Radical Figuration For Social Change. You can see their wonderful talks here:

Laurie Penny: Feminism Against Fascism
Ytasha Womack: Afrofuturism: Imagination and Humanity

Full text follows (+ references & slides)

An Ontology of Every Thing on the Face of the Earth

John Carpenter’s 1982 film, The Thing, is a claustrophobic science fiction thriller exhibiting many hallmarks of the horror genre. The film depicts a sinister turn for matter, in which the chaos of the replicating, cancerous cell is expanded to the human scale and beyond. We watch as an alien force terrorises an isolated Antarctic outpost. The creature exhibits an awesome ability to imitate, devouring any form of life it comes across whilst simultaneously giving birth to an exact copy in a burst of bile and protoplasm. The Thing copies cell by cell, in a process so perfect that the resultant simulacrum speaks, acts, and even thinks like the original. The Thing is so relentless, and its copies so perfect, that the outpost’s doctor, Blair, is sent mad at the implications:

‘If a cell gets out it could imitate everything on the face of the Earth… and it’s not gonna stop!’ [1]

Based on John W. Campbell’s 1938 novella, Who Goes There?, Carpenter’s film revisits a gothic trope numerous in its incarnations. In Campbell’s novella, The Thing is condensed as much from the minds of the men as from its own horrific, defrosting bulk. A slowly surfacing nightmare that transforms alien matter into earthly biology also has the effect of transferring the inner, mental lives of the men into the resultant condensation. John W. Campbell knew that The Thing could become viscous human flesh, but in order to truly imitate its prey the creature must infect inner life separately, pulling kicking and screaming ghosts out of their biological – Cartesian – machines. As a gothic figure, Campbell’s Thing disrupts the stable and integral vision of human being: self-same bodies housing ‘unitary and securely bounded’ [2] subjectivities, identical and extensive through time.
His characters confront their anguish at being embodied: their nightmares are literally made flesh. To emphasise the otherness of each human’s flesh, Campbell’s story is inhabited exclusively with male characters. The absence of women makes the conflict between each of the men feel more rudimentary, but it also centres the novel’s horror on the growing realisation that to be human is also to be alien to oneself. Differences between sexes within the single species homo sapiens are bypassed, allowing the alien entity to exhibit the features of human female ‘otherness’ alongside a gamut of horrific bodily permutations.

Perhaps, as Barbara Creed, [3] Rosi Braidotti, [4] and others [5] have argued, The Thing signifies the intrinsic absence of the mother figure: the female body’s capacity to be differentiated from itself in the form of pregnancy; to open up and usher forth into the world a creature other to itself. This Thingly quality is given credence by Julia Kristeva in a passage that could equally refer to The Thing as to the development of a fetus during pregnancy:

Cells fuse, split, and proliferate; volumes grow, tissues stretch, and the body fluids change rhythm, speeding up or slowing down. With the body, growing as a graft, indomitable, there is another. And no one is present, within that simultaneously dual and alien space, to signify what is going on. [6]

The Thing does exhibit demeanours of copulation and fertility, but also of disease, fragmentation, dismemberment, and asexual fission. In the novella, during a drug-induced nightmare, Dr. Copper sits bolt upright and blurts out ‘Garry – listen. Selfish – from hell they came, and hellish shellfish – I mean self – Do I? What do I mean?’ McReady [7] turns to the other men in the cabin: ‘Selfish, and as Dr. Copper said – every part is a whole. Every piece is self-sufficient, and animal in itself.’ [8]

The Thing is aberrant at a level more fundamental than allusions to pregnancy can convey. Dr.
Copper’s inability to articulate what The Thing is indicates a categorical nightmare he and the men are suffering. As in the work of Mary Douglas, [9] The Thing’s nightmarish transformation denies the very concept of physical and categorical purity. The Thing’s distributed biology calls to mind Hardt and Negri’s vision of the early Internet (ARPANET), designed, according to them:

…to withstand military attack. Since it has no center and almost any portion can operate as an autonomous whole, the network can continue to function even when part of it has been destroyed. The same design element that ensures survival, the decentralisation, is also what makes control of the network so difficult. [10]

The image of mankind’s outright destruction via totalising narratives such as nuclear war, viral pandemic, or meteor strike is undermined by the paradigm of a Thingly technological infrastructure designed to avoid ‘absolute’ assault. Decentralisation is a categorical horror in its capacity to highlight our self-same, constantly threatened and weak, embodied selves. But shift the lens away from the self-same human subject, and the image of a distributed, amorphous network of autonomous cells immediately becomes a very good description of how biological life has always been constituted. The metaphysical dualism of the sexes, as Kelly Hurley concludes, is an inadequate paradigm of such horrific embodiment; rather, any and all ‘ontological security’ [11] is challenged through a ‘collapsing of multiple and incompatible morphic possibilities into one amorphous embodiment.’ [12] The Thing is neither male nor female, two nor one, inside nor outside, living nor dead. If it does settle into a form that can be exclaimed, screamed or defined in mutually incompatible words, it does so only for a moment, and only in the mind of its onlooker as they scrabble to deduce its next amorphous conflation.
The Thing is a figure performing ontogenesis (something coming to be) rather than ontology (something that already is). [13] ‘The very definition of the real,’ as Jean Baudrillard affirmed, has become ‘that of which it is possible to give an equivalent reproduction.’ [14] Does The Thing ‘produce’ something other than human life, or ‘reproduce’ human life in its entirety, and what, if anything, would be the difference? In a text on bio- and necropolitics, Eugene Thacker undertakes an examination of the ‘difference between “Life” as an ontological foundation, and “the living,” or the various specific instantiations of Life.’ [15] Thacker highlights a passage in Poetics where Aristotle speaks of mimesis giving rise to the art of poetry in human beings:

We take delight in viewing the most accurate possible images of objects which in themselves cause distress when we see them (e.g. the shapes of the lowest species of animal, and corpses).

Recognition of mimetic forms can instill a certain degree of displeasure if that form depicts a carcass or something considered equally abhorrent. But this is often tinged with what Aristotle calls the ‘extremely pleasurable’ dual capacities of recognising an imitation as such, whilst at the same time recognising what it is the form is imitative of. The horror of The Thing is bound to this endless ontogenetic re-forming: its limitless capacity to imitate and become without necessarily settling into a final, stable and agreeable categorical – that is, ontological – form. The men of the Antarctic encampment grasp in their minds at the forms ushering from The Thing but can never keep up with its propensity toward the next shapeless-shape, bodiless-limb, or ontogenetic-extrudence.
The Thing is a phenomenon, to use Eugene Thacker’s words once more, that is ‘at once “above” and “below” the scale of the human being,’ [16] throwing, as Rosi Braidotti puts it, ‘a terminal challenge towards a human identity that is commonly predicated on the One.’ [17] The ‘other’ of The Thing never settles down, always falling outside the dialectical circle. As Hélène Cixous remarks in The Newly Born Woman, with the ‘truly “other” there is nothing to say; it cannot be theorized. The “other” escapes me.’ [18] The figure of The Thing bursts into popular culture at the meeting point between dream and flesh, and has been pursued ever since by men whose individuality is considered inseparable from their self-same embodiment. By modifying the rules through which dominant norms such as gender binaries operate, The Thing can be conceived as an incarnation of détournement: an intervention that hijacks and continually modifies the rules of engagement. ‘The radical implication [being] that [all] meaning is connected to a relationship with power.’ [19] Considered through Michel Foucault’s definition of bio-power, or the bio-political, The Thing is the process of sex and sexuality severed from the humans who are forced to proliferate ‘through’ it. Above all, the men set against this propagation – this mobilisation of images of ‘other’ – scramble to protect the normative image of the human they hold most dear: the mirage of ‘man’.

Becoming World

The filmic Thing is a fictional device enabled by animatronic augmentations coated with fleshy stand-ins, KY Jelly, and occasionally, real animal offal.
As John Carpenter described his rendition of the creature in a 2014 interview, ‘It’s just a bunch of rubber on the floor.’ [20] Bringing The Thing ‘to life’ is an activity that performs the collapse ‘between “Life” as an ontological foundation, and “the living,” or the various specific instantiations of Life.’ [21] The animatronic Thing exists in the space between stable forms; it is vibrant, expressive technology realised by dead matter, and human ingenuity made discernible by uncanny machinic novelty. Ontological uncertainty finds fluidity in language on a page, in the ability to poetically gesture towards interstitiality. But on-screen animatronics, rubber, and KY Jelly are less fluid, more mimetically rooted by the expectations of the audience reveling in, and reviled by, their recognition of The Thing’s many forms.

Upon its release, critical reactions to John Carpenter’s The Thing were at best muted and at worst downright vitriolic. The special effects used to depict the creature were the focus of an attack by Steve Jenkins, who criticised the film essentially for its surrealist nature, writing that ‘with regard to the effects, they completely fail to “clarify the weirdness” of the Thing’, and that ‘because one is never sure exactly how it [the alien] functions, its eruptions from the shells of its victims seem as arbitrary as they are spectacular’. [22] In short, the reviews lingered on two opposing readings of The Thing’s shock/gore evocations: that they go too far and thus tend towards sensational fetishism, or that they can’t go far enough, depicting kitsch sensibilities rather than alien otherness. Jenkins’ concern that the special effects do not ‘clarify’ The Thing’s ‘weirdness’ is contradictory, if not oxymoronic.
The implication is that Things could never be so weird as to defy logical function, and that all expressions should, and eventually do, lend themselves to being read through some parochial mechanism or other, however surreal they may at first seem. That The Thing’s nature could actually defy comprehensibility is not considered, nor how impossible the cinematic depiction of that defiance might be. Rather, the critical view seems to be that every grisly eruption, bifurcation, and horrific permutation on screen must necessarily express an inner order temporarily hidden from, but not inaccessible to, its human onlookers. This critical desire for a ‘norm’ defies the same critical desire for ‘true’ horror. Our will to master matter and technology through imitative forms is the same will that balks at the idea that imitative forms could have ontologies incommensurable with our own. The Thing is ‘weird’: a term increasingly applied to those things defying categorisation. A conviction, so wrote the late Mark Fisher, ‘that this does not belong, is often a sign that we are in the presence of the new… that the concepts and frameworks which we have previously employed are now obsolete.’ [23] In reflecting on the origins of this slippery anti-category, Eugene Thacker reminds us that within horror, ‘The threat is not the monster, or that which threatens existing categories of knowledge. Rather, it is the “nameless thing,” or that which presents itself as a horizon for thought… the weird is the discovery of an unhuman limit to thought, that is nevertheless foundational for thought.’ [24] In The Thing the world rises up to meet its male inhabitants in a weird form and, by becoming them, throws into question the categorical foundations of the born and the made, of subject and object, natural and synthetic, whole and part, human and world, original and imitation. 
What remains is an ongoing process of animation rendered horrific by a bifurcation of ontologies: on one side, the supposed human foundation of distinction, uniqueness and autonomy; on the other, a Thingly (alien and weird) propensity that dissolves differentiation, that coalesces and revels in an endless process of becoming. As in Mikhail Bakhtin’s study of the grotesque, the ‘human horizon’ in question is that of the ‘canon,’ [25] a norm to which all aberrations are to be compared:

The grotesque body… is a body in the act of becoming. It is never finished, never completed; it is continually built, created, and builds and creates another body. Moreover, the body swallows the world and is itself swallowed by the world. [26]

The Thingly is neither self-same nor enclosed unto itself. It is a plethora of openings, conjoinings and eruptions that declare ‘the world as eternally unfinished: a world dying and being born at the same time.’ [27] The bodily horror performed by The Thing is an allegory of this greater interstitial violation: the conceptual boundary between the world-for-us and the world-without-us is breached not as destruction, or even invasion, but ultimately through our inability to separate ourselves from a world that is already inherently alien and weird. [28] ‘A monstrosity’, to hijack the words of Claire Colebrook, ‘that we do not feel, live, or determine, but rather witness partially and ex post facto.’ [29] How these processes are comprehended, or more precisely, how the perception of these processes is interpreted, is more important than the so-called ‘difference’ between the world which existed before and the world which remains after. Eugene Thacker clarifies this point in his analysis of the etymology of the word ‘monster’: A monster is never just a monster, never just a physical or biological anomaly.
It is always accompanied by an interpretive framework within which the monster is able to be monstrum, literally ‘to show’ or ‘to warn.’ Monsters are always a matter of interpretation. [30]

Becoming Weird

In a 1982 New York Times movie section, critic Vincent Canby poured yet more scorn on John Carpenter’s ‘Thing’ remake:

The Thing is a foolish, depressing, overproduced movie that mixes horror with science fiction to make something that is fun as neither one thing or the other… There may be a metaphor in all this, but I doubt it… The Thing… is too phony looking to be disgusting. It qualifies only as instant junk. [31]

Chiming with his critic peers, Canby expresses his desire that the monster show its nature – be monstrum – only in respect of some ‘norm’; [32] some ‘interpretive framework’ [33] that the narrative will eventually uncover. By setting up ‘junk’ as a kitschy opposite to this supposedly palatable logic, Canby unwittingly generates a point from which to disrupt the very notion of the interpretive framework itself. The Thing is more than a metaphor. Canby’s appeal to ‘instant junk’ can be read as the monstrum: the revealing of that which constitutes the norm. The monster stands in for difference, for other, and in so doing normalises the subject position from which the difference is opposed: the canon. In the case of The Thing that canon is first and foremost the human male, standing astride the idea of a world-for-us. The ‘us’ is itself monopolised, as if all non-male ontogenetic permutations were cast out into the abject abyss of alien weirdness. In reclaiming ‘junk’ as a ‘register of the unrepresentable’, [34] a Thingly discourse may share many of the tenets of queer theory. As Rosi Braidotti makes clear, referring to the work of Camilla Griggers:

‘Queer’ is no longer the noun that marks an identity they taught us to despise, but it has become a verb that destabilizes any claim to identity, even and especially to a sex-specific identity.
[35]

The queer, the weird, the kitsch, are among the most powerful of orders because they are inherently un-representable and in flux. The rigid delineations of language and cultural heteronormativity are further joined in the figure of The Thing by a non-anthropic imaginary that exposes a whole range of human norms and sets into play a seemingly infinite variety of non-human modes of being and embodiment. Rosi Braidotti refers to the work of Georges Canguilhem in her further turn outwards towards the weird – ‘normality is, after all, the zero-degree of monstrosity’ [36] – signalling a post-human discourse as one which, by definition, must continually question, perhaps even threaten, the male, self-same, canonised, subject position:

We need to learn to think of the anomalous, the monstrously different not as a sign of pejoration but as the unfolding of virtual possibilities that point to positive alternatives for us all… the human is now displaced in the direction of a glittering range of post-human variables. [37]

In her book on The Death of The Posthuman (2014), Claire Colebrook looks to the otherwise, the un-representable, to destabilise the proposition of a world being for anyone. She begins by considering the proposed naming of the current geological era ‘The Anthropocene,’ [38] a term that designates a theoretical as well as scientific impasse for human beings and civilisation, in which human activity and technological development have begun to become indistinguishable from, and/or exceed, processes implicit within what is considered to be the ‘natural’ world. As if registering the inevitable extinction of humans isn’t enough, The Anthropocene, by being named in honour of humans, makes monsters of those times – past and present – which do not contain humans.
Its naming therefore becomes a mechanism allowing the imagination of ‘a viewing or reading in the absence of viewers or readers, and we do this through images in the present that extinguish the dominance of the present.’ [39] The world ‘without bodies’ that is imaged in this move, Colebrook argues, is written upon by the current state of impending extinction. Humans are then able to look upon the future world-without-us in a state of nostalgia coloured by their inevitable absence. Here the tenets of the horror genre indicated by Eugene Thacker are realised as a feature of a present condition. The world-in-itself has already been subsumed by The Thingly horror that is the human species. For even the coming world-without-us, a planet made barren and utterly replaced by The Thingly junk of human civilisation, will have written within its geological record a mark of human activity that goes back well before the human species had considered itself as a Thing ‘in’ any world at all.

In an analysis of the etymology of the Anthropocene, McKenzie Wark also turns to theory as a necessary condition of the age of extinction:

All of the interesting and useful movements in the humanities since the late twentieth century have critiqued and dissented from the theologies of the human. The Anthropocene, by contrast, calls for thinking something that is not even defeat. [40]

The Anthropocene, like ‘queer’ or ‘weird’, should be made into a verb, and relinquished as a noun. Once weirded in this way it becomes a productive proposition – as Wark goes on, quoting Donna Haraway, ‘another figure, a thousand names of something else.’ [41] In the 2014 lecture quoted by Wark, Haraway called for other such worldings through the horrific figure of capitalism, through arachnids spinning their silk from the waste matter of the underworld, or from the terrible nightmares evoked in the fiction of the misogynist, racist mid-20th-century author H.P.
Lovecraft:

The activation of the chthonic powers that is within our grasp to collect up the trash of the anthropocene, and the exterminism of the capitalocene, to something that might possibly have a chance of ongoing. [42]

That weird, ongoing epoch is the Chthulucene, a monstrum ‘defined by the frightening weirdness of being impossibly bound up with other organisms,’ [43] of what Haraway calls ‘multi-species muddles.’ [44] The horror of ‘the nameless thing’ is here finally brought to bear in Haraway’s Capitalocene and Chthulucene epochs. Haraway’s call for ‘a thousand names of something else’ is Thingly in its push towards endlessly bifurcated naming and theoretical subsuming; anthro-normalisation casts out infinitely more possibilities than it brings into play. Although Donna Haraway makes it clear that her Chthulucene is not directly derivative of H.P. Lovecraft’s Cthulhu mythos, her intentional mis-naming and slippery non-identification exemplifies the kind of amorphous thinking and practice she is arguing for. Haraway’s Chthulucene counters Lovecraft’s Cthulhu with an array of chthonic, non-male, tentacular, rhizomatic, and web-spinning figures that attest to the monstrum still exposed by Lovecraft’s three-quarters-of-a-century-old work. The continued – renewed – fascination with Lovecraft’s weird ‘others’ thus has the capacity to expose a dread of these times. As writer Alan Moore has attested:

[I]t is possible to perceive Howard Lovecraft as an almost unbearably sensitive barometer of American dread.
Far from outlandish eccentricities, the fears that generated Lovecraft’s stories and opinions were precisely those of the white, middle-class, heterosexual, Protestant-descended males who were most threatened by the shifting power relationships and values of the modern world… Coded in an alphabet of monsters, Lovecraft’s writings offer a potential key to understanding our current dilemma, although crucial to this is that they are understood in the full context of the place and times from which they blossomed. [45]

The dominant humanistic imagination may no longer posit white cis-males as the figure that ‘must’ endure, but other uncontested figures remain in the space apparently excavated of Lovecraft’s affinities. To abandon what Claire Colebrook calls ‘the fantasy of one’s endurance’ may be to concede that the post-human is founded on ‘the contingent, fragile, insecure, and ephemeral.’ [46] But, as Drucilla Cornell and Stephen D. Seely suggest, it is dangerous to consider this a ‘new’ refined status for the beings that remain, since ‘this sounds not like the imagination of living beyond Man, but rather like a meticulous description of the lives of the majority of the world under the condition of advanced capitalism right now.’ [47] As Claire Colebrook warns, post-humanism often relinquishes its excluded others – women, the colonised, nonhuman animals, or ‘life itself’ [48] – by merely subtracting the previously dominant paradigm of white heteropatriarchy, whilst failing to confront the monster that particular figure was indicative of:

Humanism posits an elevated or exceptional ‘man’ to grant sense to existence, then when ‘man’ is negated or removed what is left is the human all too human tendency to see the world as one giant anthropomorphic self-organizing living body… When man is destroyed to yield a posthuman world it is the same world minus humans, a world of meaning, sociality and readability yet without any sense of the disjunction, gap or limits of the
human. [49] As in Haraway and Wark’s call for not just ‘naming, but of doing, of making new kinds of labor for a new kind of nature,’ [50] contemporary criticism and theory must be allowed to take on the form of the monsters it pursues, moulding and transforming critical inquiries into composite, hybrid figures that never settle in one form lest they become stable, rigid, and normalised. In fact, this metaphor itself is conditioned too readily by the notion of a mastery ‘Man’ can wield. Rather, our inquiries must be encouraged ‘to monster’ separately, to blur and mutate beyond the human capacity to comprehend them, like the infinite variety of organisms Haraway insists the future opens into. The very image of a post-humanism must avoid normalising the monster, rendering it through analysis an expression of the world-for-us. For Eugene Thacker this is the power of the sci-fi-horror genre, to take ‘aim at the presuppositions of philosophical inquiry – that the world is always the world-for-us – and [make] of those blind spots its central concern, expressing them not in abstract concepts but in a whole bestiary of impossible life forms – mists, ooze, blobs, slime, clouds, and muck.’ [51] Reflecting on the work of Noël Carroll, [52] Rosi Braidotti argues that if science fiction horror ‘is based on the disturbance of cultural norms, it is then ideally placed to represent states of crisis and change and to express the widespread anxiety of our times. As such this genre is as unstoppable as the transformations it mirrors.’ [53]  

References [1] John Carpenter, The Thing, Film, Sci-Fi Horror (Universal Pictures, 1982). [2]  Kelly Hurley, The Gothic Body: Sexuality, Materialism, and Degeneration at the Fin de Siècle (Cambridge University Press, 2004), 3. [3]  B. Creed, ‘Horror and the Monstrous-Feminine: An Imaginary Abjection.’ Screen 27, no. 1 (1 January 1986): 44–71. [4]  Rosi Braidotti, Metamorphoses: Towards a Materialist Theory of Becoming (Wiley, 2002), 192–94. [5]  Ian Conrich and David Woods, eds., The Cinema Of John Carpenter: The Technique Of Terror (Wallflower Press, 2004), 81. [6]  Julia Kristeva, quoted in Jackie Stacey, Teratologies: A Cultural Study of Cancer (Routledge, 2013), 89. [7]  The character McReady becomes MacReady in Carpenter’s 1982 retelling of the story. [8]  Campbell, Who Goes There?, 107. [9]  Noël Carroll, The Philosophy of Horror, Or, Paradoxes of the Heart (New York: Routledge, 1990). [10] Michael Hardt and Antonio Negri, Empire, New Ed (Harvard University Press, 2001), 299. [11] Braidotti, Metamorphoses, 195. [12] Kelly Hurley, ‘Reading like an Alien: Posthuman Identity in Ridley Scott’s Aliens and David Cronenberg’s Rabid,’ in Posthuman Bodies, ed. Judith M. Halberstam and Ira Livingston (Bloomington: John Wiley & Sons, 1996), 219. [13] This distinction was plucked, out of context, from Adrian MacKenzie, Transductions: Bodies and Machines at Speed (A&C Black, 2006), 17. MacKenzie is not talking about The Thing, but this distinction is, nonetheless, very useful in bridging the divide between stable being and endless becoming. [14] Jean Baudrillard, Simulations, trans. Paul Foss, Paul Patton, and Philip Beitchman (Semiotext (e) New York, 1983), 146. [15] Eugene Thacker, ‘Nekros; Or, The Poetics Of Biopolitics,’ Incognitum Hactenus 3, no. Living On: Zombies (2012): 35. [16] Ibid., 29. [17] Braidotti, Metamorphoses, 195. [18] Hélène Cixous, The Newly Born Woman (University of Minnesota Press, 1986), 71. 
[19] Nato Thompson et al., eds., The Interventionists: Users’ Manual for the Creative Disruption of Everyday Life (North Adams, Mass. : Cambridge, Mass: MASS MoCA ; Distributed by the MIT Press, 2004), 151. [20] John Carpenter, BBC Web exclusive: Bringing The Thing to life, Invasion, Tomorrow’s Worlds: The Unearthly History of Science Fiction, 14 November 2014. [21] Thacker, ‘Nekros; Or, The Poetics Of Biopolitics,’ 35. [22] Ian Conrich and David Woods, eds., The Cinema Of John Carpenter: The Technique Of Terror (Wallflower Press, 2004), 96. [23] Mark Fisher, The Weird and the Eerie, 2016, 13. [24] Eugene Thacker, After Life (University of Chicago Press, 2010), 23. [25] Mikhail Mikhaĭlovich Bakhtin, Rabelais and His World (Indiana University Press, 1984), 321. [26] Ibid., 317. [27] Ibid., 166. [28] This sentence is a paraphrased, altered version of a similar line from Eugene Thacker, ‘Nine Disputations on Theology and Horror,’ Collapse: Philosophical Research and Development IV: 38. [29] Claire Colebrook, Sex After Life: Essays on Extinction, Vol. 2 (Open Humanities Press, 2014), 14. [30] Eugene Thacker, ‘The Sight of a Mangled Corpse—An Interview with’, Scapegoat Journal, no. 05: Excess (2013): 380. [31] Vincent Canby, ‘“The Thing” Is Phony and No Fun,’ The New York Times, 25 June 1982, sec. Movies. [32] Derrida, ‘Passages: From Traumatism to Promise,’ 385–86. [33] Thacker, ‘The Sight of a Mangled Corpse—An Interview with,’ 380. [34] Braidotti, Metamorphoses, 180. [35] Ibid. [36] Ibid., 174. [37] Rosi Braidotti, ‘Teratologies’, in Deleuze and Feminist Theory, ed. Claire Colebrook and Ian Buchanan (Edinburgh: Edinburgh University Press, 2000), 172. [38] A term coined in the 1980s by ecologist Eugene F. Stoermer and widely popularized in the 2000s by atmospheric chemist Paul Crutzen. 
The Anthropocene is, according to Jan Zalasiewicz et al., ‘a distinctive phase of Earth’s evolution that satisfies geologist’s criteria for its recognition as a distinctive stratigraphic unit.’ – Jan Zalasiewicz et al., ‘Are We Now Living in the Anthropocene?,’ GSA Today 18, no. 2 (2008): 6. [39] Claire Colebrook, Death of the PostHuman: Essays on Extinction, Vol. 1 (Open Humanities Press, 2014), 28. [40] McKenzie Wark, ‘Anthropocene Futures,’ Versobooks.com, 23 February 2015. [41] Ibid. [42] Donna Haraway, ‘Capitalocene, Chthulucene: Staying with the Trouble’ (University of California at Santa Cruz, 5 September 2014). [43] Leif Haven, ‘We’ve All Always Been Lichens: Donna Haraway, the Cthulhucene, and the Capitalocene,’ ENTROPY, 22 September 2014. [44] Donna Haraway, ‘SF: Sympoiesis, String Figures, Multispecies Muddles’ (University of Alberta, Edmonton, Canada, 24 March 2014). [45] H. P. Lovecraft, The New Annotated H.P. Lovecraft, ed. Leslie S. Klinger (Liveright, 2014), xiii. [46] Claire Colebrook, Sex After Life: Essays on Extinction, Vol. 2 (Open Humanities Press, 2014), 22. [47] Drucilla Cornell and Stephen D. Seely, The Spirit of Revolution: Beyond the Dead Ends of Man (Polity Press, 2016), 5. [48] Ibid., 3–4. [49] Claire Colebrook, Death of the PostHuman: Essays on Extinction, Vol. 1 (Open Humanities Press, 2014), 163–64. [50] Wark, ‘Anthropocene Futures.’ [51] Thacker, In the Dust of This Planet, 9. [52] Carroll, The Philosophy of Horror, Or, Paradoxes of the Heart. [53] Braidotti, Metamorphoses, 185 (my emphasis).

]]>
Sun, 26 Feb 2017 04:43:01 -0800 https://machinemachine.net/portfolio/sonic-acts-2017-the-noise-of-becoming-on-monsters-men-and-every-thing-in-between/
<![CDATA[Vaporwave: A Brief History]]> http://www.youtube.com/watch?v=PdpP0mXOlWM

A brief look at the history of Vaporwave. - Music (In order of appearance): Diskette Romances - Fentanyl Flowers Kenny G - Songbird Chuck Person - A1 James Ferraro - Palm Trees, Wi-Fi and Dream Sushi Macintosh Plus - リサフランク420 / 現代のコンピュー B L U E - \ Visual \ // Entropy // Internet Club - BY DESIGN ░▒▓【ALL CAPS AND αւτ kεÿ CΘᕸEᔕ™】░▒▓ - DAE le 90's Kid 骨架的 - life Blank Banshee - Ammonia Clouds Infinity Frequencies - Lotus Bloom Eco Virtual - Cumulus Fractus Hong Kong Express - Your drink, sir 鸡尾酒酒吧 식료품groceries - Aisle 3 (Summits, Clouds, and Greener Grass) GOLDEN LIVING ROOM - DREAMS Vaperror - Surf Saint Pepsi - Cherry Pepsi - マクロスMACROSS 82-99 - 水野 亜美AMY Yung Bae - Bae City Rollaz (w/ИΔΤVИ) Disconscious - Endless Escalation 猫 シ Corp. - B5 - Second Floor 死夢VANITY - nightlife §E▲ ▓F D▓G§ - ASCENDING THE DARKEST FUTURE - 無題 2814 - 恢复

]]>
Wed, 10 Jun 2015 21:00:00 -0700 http://www.youtube.com/watch?v=PdpP0mXOlWM
<![CDATA[We’ve All Always Been Lichens: Donna Haraway, the Cthulhucene, and the Capitalocene | ENTROPY]]> http://entropymag.org/weve-all-always-been-lichens-donna-haraway-the-cthulhucene-and-the-capitalocene/

Haraway gave a talk called “Anthropocene, Capitalocene, Cthulucene: Staying With the Trouble” earlier this year at the University of California at Santa Cruz, where she is a Distinguished Professor Emerita.

]]>
Tue, 24 Mar 2015 18:07:34 -0700 http://entropymag.org/weve-all-always-been-lichens-donna-haraway-the-cthulhucene-and-the-capitalocene/
<![CDATA[There's Not Much 'Glitch' In Glitch Art | Motherboard]]> http://motherboard.vice.com/blog/theres-not-much-glitch-in-glitch-art

Artist Daniel Temkin has been creating and discussing glitch art for over seven years. In that time, he's exhibited in solo and group shows, and had his work featured in Rhizome and Fast Company, amongst other publications. For Temkin, glitch art is about the disruption of algorithms, though algorithmic art is a bit of a misnomer. He prefers "algo-glitch demented" in describing the methods, aesthetics, and philosophy of glitch.

In January, Temkin published a fascinating glitch art essay on NOOART titled "Glitch && Human/Computer Interaction." There he laid down the philosophy and "mythology" of glitch, which had really started in a series of email conversations with Hugh Manon. Though there is no shortage of writings on glitch art, many aspects of these texts didn't address what Temkin loved most about how it is created.

"The glitch aesthetic may be rooted in the look of malfunction, but when it comes to actual practice, there’s often not much glitch in glitch art," wrote Temkin in the essay. "Yes, some glitch artists are actually exploiting bugs to get their results — but for most it would be more accurate to describe these methods as introducing noisy data to functional algorithms or applying these algorithms in unconventional ways." This, he said, doesn't make it traditional algorithmic art (algorithm-designed artworks), but a more demented form of it—algo-glitch demented.

Over a series of email conversations, Temkin elaborated on some of his conclusions in "Glitch && Human/Computer Interaction." Aside from highlighting some of the best algo-glitch demented art, Temkin also talked about bad data, image hacking, and why computers are no less "image makers" than humans even though they aren't sentient (yet).

MOTHERBOARD: Aside from being an artist working in glitch, would you say that you've also sort of become a philosopher of glitch or algorithmic art, if there is such a thing?

Temkin: There's tons of writing on glitch, much of it very good (Lab404.com, for instance), but some aspects of glitch theory didn't jibe with what really interested me about the style. Originally, Hugh Manon and I started a long email conversation about glitch, which evolved into our 2011 paper. It ranged across glitch aesthetics, methodology, and issues around authorship, while delving into glitch's ambivalence about error—the way the glitch is possible because of software's ability to "fail to fully fail" when coming across unexpected data.

We questioned why computer error is so emphasized in this form when nothing is really at stake in a digital file (a deleted but endlessly reproducible JPEG has none of the aura of an Erased DeKooning), and what it means to purposely simulate an error, something that ordinarily has power because it is unexpected and outside of our control.

Ted Davis, FFD8 project

These issues stuck with me, until I considered Clement Valla's familiar quote about his Postcards From Google Earth project: that "these images are not glitches... they are the absolute logical result of the system." In this instance it got me thinking about how most glitchwork can be described the same way—as products of perfectly functional systems.

I wrote my recent piece for NOOART, arguing that glitch's preoccupation with error doesn't always serve it well, that it limits the scope of what's produced and how we talk about it. Bypassing computer error opened new avenues of investigation about our relationship both with technology and with logic systems more generally, and got at what interested me more about the style we call glitch.

In the NOOART essay, you write: "Some glitch artists are actually exploiting bugs to get their results — but for most it would be more accurate to describe these methods as introducing noisy data to functional algorithms or applying these algorithms in unconventional ways." Can you elaborate on that point?

In the paper, I discuss JPEG corruption, one of the fundamental glitch techniques. Introduce bad data to a JPEG file, and you'll see broken-looking images emerge. I use this example because it's so familiar to glitch practice. JPEG is not just a file format but an algorithm that compresses/decompresses image data.

When we "corrupt" a JPEG, we're altering compressed data so that it (successfully) renders to an image that no longer appears photographic, taking on a chunky, pixelated, more abstract character we associate with broken software. To the machine, it is not an error—if the image were structurally damaged, we would not be able to open it. This underscores the machine as an apparatus indifferent to what makes visual sense to us, at a place where our expectations clash with algorithmic logic.

Daniel Temkin, Dither Studies #2, 2011

The excitement of altering JPEG data directly is the sense of image hacking—making changes at the digital level without being able to predict the outcome. This becomes more apparent in other glitch techniques, such as sonification, which add layers of complexity to the process. Giving up control to a system or process has a long history in art.

Gerhard Richter describes committing to a systematic approach, veiling the work from conscious decisions that may ruin or limit it. As he puts it, "if the execution works, this is only because I partly destroy it, or because it works in spite of everything—by not detracting and by not looking the way I planned" [p179, Gerhard Richter, Panorama]. In digital art, we often function in an all-too-WYSIWYG environment. Glitch frees us from this, bringing us to unexpected places.

Can you draw a distinction between generative art (which can feature algorithms) and your concept of algo-glitch demented?

I call it algo-glitch demented, as opposed to algorithmic art (which I understand as meaning generative art that uses algorithms). I'll have to paraphrase Philip Galanter and say that generative art is any practice where the artist sets a system "in motion with some degree of autonomy," resulting in a work.

"Glitch is a cyborg art, building on human/computer interaction. The patterns created by these unknown processes is what I call the wilderness within the machine." What makes algo-glitch demented is how we misuse existing algorithms, running them in contexts that had never been intended by their designers. Furthermore, there are moments of autonomy in algo-glitch, but this autonomy is not what defines it as algo-glitch; what's more important is the control we give up to the process.

You call glitch art a collaboration with the machine. That's an interesting point because the human is conscious of this, while the machine is not. Or, do you have another way of looking at that collaboration?

Machines are not sentient, but they are image-makers. Trevor Paglen, in a recent Frieze Magazine piece, says we are now or very soon to be at the point "where the majority of the world’s images are made by-machines-for-machines," and "seeing with the meat-eyes of our human bodies is increasingly the exception," referring to facial-recognition systems, QR-code readers, and a host of other automated systems.

One of the most compelling ideas to come from James Bridle's New Aesthetic is how we can treat the machine as having a vision—even as we know it's not sentient—and just how strange this vision is, that does not hold human beings as its audience.

Jeff Donaldson, panasonic wj-mx12 video feedback, 2012

Glitch artists have been doing this for a long time, treating it as an equal collaborator and seeing where it leads us as we cede control to broken processes and zombie algorithms. Curt Cloninger describes it as "painting with a very blunt brush that has a mind of its own"; in this way, glitch is a cyborg art, building on human/computer interaction. The patterns created by these unknown processes are what I call the wilderness within the machine.

Can you talk about glitch as mythology? I've never heard it described as such.

I'm probably being a bit obnoxious there, using mythology to describe the gap between how we talk about glitch and what we're actually doing. There are several strains of work within glitch or that overlap with glitch. There is Dirty New Media, which is related to noise-based work; materialist explorations; the algo-glitch I've emphasized in the JPEG example; and what we might call "minimal slippage glitch" (a term that arose in a Facebook discussion between me and Rosa Menkman).

Minimal Slippage fits a familiar contemporary art scenario of the single gesture that puts things in motion and reveals something new. It's great when things actually work this way, but when this language is used to describe work made by manipulating data repeatedly, there's a problem.

I also take issue with the term glitch art. I don't propose we replace it, only that we be more conscious of its influence. If we produce work with other visual styles using glitch processes, why limit ourselves to work that has an error-strewn appearance? This connection begins to seem artificial. I kept this in mind with my Glitchometry series. I use the sonification technique to process simple geometric shapes (b&w squares and triangles, etc.) into works that range from somewhat glitchy to abstractions that fall very far from a glitch aesthetic. They emphasize process, the back-and-forth with the machine, and an anxiety about giving up that control.

Clement Valla, from “Iconoclashes” 2013

With Glitchometry Stripes (an extension of the Glitchometry work), the results are even less glitchy in appearance; this time using only sound effects that cleanly transform the lines, ending up with Op Art-inspired, crisply graphic works that create optical buzzing when scrolled across the screen.

You mention Ted Davis's FFD8 project in your essay. What is it about the work that you like?

FFD8 is JPEG image hacking, with protection against messing up the header (which would make the image undisplayable). It's a gentle introduction to glitching, but it illustrates how it works, which encourages one to go deeper. I'm suspicious of glitch software that does all the work for you, essentially turning glitch styling into the equivalent of a Photoshop filter. With FFD8, enough of the process is exposed that folks starting out in the style might decide to take the next step and mess with raw files directly, or build their own software, or discover some new avenue to create work.

What's your opinion on something like the iPhone's panorama function, which, if you move the camera fast or in unexpected directions, creates glitches? It's movement-based as opposed to other types of glitch.

I think someone will come along with a brilliant idea of how to use it to do something fresh and interesting. One interesting work that uses photo-stitching (although not on the iPhone) is Clement Valla's Iconoclashes series. He loads images of gods from the Met's collection and lets Photoshop decide how to combine them, creating improbable composites, many of them physically impossible. It works because of how carefully the objects were photographed: each is lit the same way with the same background. Many of these religious relics come from cultures where it was believed that such objects were not created by human hands. Now an algorithm, also not human, decides how to combine them to construct new artifacts.

Daniel Temkin, Glitchometry Circles #6, 2013

Where do you feel you've been most successful in your own projects?

I never trust artists to tell me which of their works are more successful. [laughs] I'll tell you the theme I'm most interested in. Much of my work revolves around this clash between human thinking and computer logic, and the compulsiveness that comes from trying to think in a logical way. My own experience with this comes from programming, which is my background from before art. Glitch gives me a way to create chaotic works as a release from the overly structured thinking programming requires.

As a few examples of work that deals with this, my Dither Studies expose the seemingly irrational patterns that come from the very simple rules of dithering. They began as a collaboration with Photoshop, where I asked it to dither a solid color with two incompatible colors. From there, I constructed a web tool that walks through progressions of dithers.
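The "very simple rules" in question can be made concrete with ordered (Bayer) dithering, one standard scheme—a sketch of the general technique, not the actual algorithm behind Dither Studies:

```python
# Sketch of ordered (Bayer) dithering: a solid tone in [0, 1] is rendered
# with only two output values by thresholding against a repeating 2x2 map.
# Illustrative of the general technique, not Temkin's actual tool.

BAYER = [[0, 2],
         [3, 1]]   # threshold map, values in 0..3

def dither(value, width, height):
    """Render the solid tone `value` as a width x height grid of 0s and 1s."""
    grid = []
    for y in range(height):
        row = []
        for x in range(width):
            threshold = (BAYER[y % 2][x % 2] + 0.5) / 4   # 0.125 .. 0.875
            row.append(1 if value > threshold else 0)
        grid.append(row)
    return grid
```

A mid-grey (`value=0.5`) comes out as a checkerboard, and tones in between produce exactly the kind of rigid, seemingly irrational tilings that make the simple rule look so strange at scale.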

In Drunk Eliza, I re-coded the classic chat bot using my language Entropy, where all data is unstable. Since the original Eliza has such a small databank of phrases, yet so clearly has a personality, I wanted to know how she would seem with her mind slowly disintegrating, HAL-style. Drunk Eliza was the result. The drunken responses she gets online have been a great source of amusement for me.
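Entropy is Temkin's real language; the following is only a hypothetical Python imitation of its central idea—data that drifts a little every time it is touched:

```python
import random

# Hypothetical sketch of the idea behind the Entropy language (not its
# actual implementation): stored text degrades slightly on each access.

def decay(text, strength=0.1, seed=None):
    """Return `text` with each character nudged (within printable ASCII)
    with probability `strength`."""
    rng = random.Random(seed)
    out = []
    for ch in text:
        if rng.random() < strength:
            ch = chr(max(32, min(126, ord(ch) + rng.choice([-2, -1, 1, 2]))))
        out.append(ch)
    return "".join(out)
```

Feed each output back in—`decay(decay(phrase))`, and so on—and the phrase slurs a little more on each pass, a rough analogue of Eliza's databank disintegrating HAL-style.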

]]>
Tue, 18 Mar 2014 12:45:15 -0700 http://motherboard.vice.com/blog/theres-not-much-glitch-in-glitch-art
<![CDATA[Four Notes Towards Post-Digital Propaganda | post-digital-research]]> http://post-digital.projects.cavi.dk/?p=475

“Propaganda is called upon to solve problems created by technology, to play on maladjustments and to integrate the individual into a technological world” (Ellul xvii).

How might future research into digital culture approach a purported “post-digital” age? How might this be understood?

1.

A problem comes from the discourse of ‘the digital’ itself: a moniker which points towards units of Base-2 arbitrary configuration, impersonal architectures of code, massive extensions of modern communication and ruptures in post-modern identity. Terms are messy, and it has never been easy to establish a ‘post’ from something, when pre-discourse definitions continue to hang in the air. As Florian Cramer has articulated so well, ‘post-digital’ is something of a loose, ‘hedge your bets’ term, denoting a general tendency to criticise the digital revolution as a modern innovation (Cramer).

Perhaps it might be aligned with what some have dubbed “solutionism” (Morozov) or “computationalism” (Berry 129; Golumbia 8): the former critiquing a Silicon Valley-led ideology oriented towards solving liberalised problems through efficient computerised means. The latter establishing the notion (and critique thereof) that the mind is inherently computable, and everything associated with it. In both cases, digital technology is no longer just a business that privatises information, but the business of extending efficient, innovative logic to all corners of society and human knowledge, condemning everything else through a cultural logic of efficiency.

In fact, there is a good reason why ‘digital’ might as well be a synonym for ‘efficiency’. Before any consideration is assigned to digital media objects (i.e. platforms, operating systems, networks), consider the inception of ‘the digital’ as such: that is, information theory. Where information had been a loose, shabby, inefficient affair, vague and specific to its various mediums of communication, Claude Shannon compressed all forms of communication into a universal system with absolute mathematical precision (Shannon). Once information became digital, the conceptual leap of determined symbolic logic was set into motion, and with it the ‘digital’ became synonymous with an ideology of effectivity. No longer would communication be subject to human finitude, nor to matters of distance and time, but only to the limits of entropy and the matter of automating messages through the support of alternating ‘true’ or ‘false’ relay systems.
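Shannon's measure can be stated concretely. For a message whose symbols occur with probabilities p(x), the entropy H = -Σ p(x) log2 p(x) gives the average number of binary digits needed per symbol—a minimal sketch:

```python
from collections import Counter
from math import log2

# Minimal sketch of Shannon entropy: the average number of bits per symbol
# needed to encode a message, given the observed symbol frequencies.

def entropy(message):
    """Shannon entropy in bits per symbol: H = -sum p(x) * log2(p(x))."""
    counts = Counter(message)
    total = len(message)
    return -sum((n / total) * log2(n / total) for n in counts.values())
```

A message of one repeated symbol carries zero bits per symbol, while a uniform spread over four symbols needs two—the mathematically precise floor beneath any ‘efficient’ transmission.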

However, it would be quite difficult to envisage any ‘post-computational’ break from such discourses – and with good reason: Shannon’s breakthrough was only systematically effective through the logic of computation. So the old missed encounter goes: Shannon presupposed Alan Turing’s mathematical idea of computation to transmit digital information, and Turing presupposed Shannon’s information theory to understand what his Universal Turing Machines were actually transmitting. The basic theories of both have not changed, but the materials affording greater processing power, extensive server infrastructure and larger storage space have simply increased the means for these ideas to proliferate, irrespective of what Turing and Shannon actually thought of them (some historians even speculate that Turing may have made the link between information and entropy two years before Bell Labs did) (Good).

Thus a ‘post-digital’ reference point might encompass the historical acknowledgment of Shannon’s digital efficiency and Turing’s logic, but, by the same measure, open up a space for critical reflection on how such efficiencies have transformed not only work, life and culture but also artistic praxis and aesthetics. This is not to say that digital culture is reducibly predicated on efforts made in computer science, but instead fully acknowledges these structures and accounts for how ideologies propagate reactionary attitudes and beliefs within them, whilst restricting other alternatives which do not fit their ‘vision’. Hence, the post-digital ‘task’ set for us nowadays might consist in critiquing digital efficiency and how it has come to work against commonality, despite transforming the majority of Western infrastructure in its wake.

The purpose of these notes is to outline how computation has imparted an unwarranted effect of totalised efficiency, and to label this effect the type of description it deserves: propaganda. The fact that Shannon and Turing had multiple lunches together at Bell Labs in 1943, held conversations and exchanged ideas, but did not share detailed methods of cryptanalysis (Price & Shannon), provides a nice contextual allegory for how digital informatics strategies fail to be transparent.

But in saying this, I do not mean that companies only use digital networks for propagative means (although that happens), but that the very means of computing a real concrete function is constitutively propagative. In this sense, propaganda resembles a post-digital understanding of what it means to be integrated into an ecology of efficiency, and how technical artefacts are literally enacted as propagative decisions. Digital information often deceives us into accepting its transparency, and into holding it to that account: yet in reality it does the complete opposite, with no given range of judgements available to distinguish manipulation from education, or persuasion from smear. It is the procedural act of interacting with someone else’s automated conceptual principles, embedding pre-determined decisions which not only generate but pre-determine one’s ability to make choices about such decisions, like propaganda.

This might consist in moving from ideological definitions of false consciousness as an epistemological limit to knowing alternatives within thought, to engaging with real programmable systems which embed such limits concretely, withholding the means to transform them. In other words, propaganda incorporates how ‘decisional structures’ structure other decisions, either conceptually or systematically.

2.

Two years before Shannon’s famous Master’s thesis, Turing published what would become a theoretical basis for computation in his 1936 paper “On Computable Numbers, with an Application to the Entscheidungsproblem.” The focus of the paper was to establish the idea of computation within a formal system of logic, which when automated would solve particular mathematical problems put into function (Turing, An Application). What is not necessarily taken into account is the mathematical context to that idea: for the foundations of mathematics were already precarious, well before Turing outlined anything in 1936. Contra the efficiency of the digital, this is a precariousness built into computation from its very inception: the precariousness of solving all problems in mathematics.

The key word of that paper, its key focus, was on the Entscheidungsproblem, or decision problem. Originating from David Hilbert’s mathematical school of formalism, ‘decision’ means something more rigorous than the sorts of decisions in daily life. It really means a ‘proof theory’, or how analytic problems in number theory and geometry could be formalised, and thus efficiently solved (Hilbert 3). Solving a theorem is simply finding a provable ‘winning position’ in a game. Similar to Shannon, ‘decision’ is what happens when an automated system of function is constructed in such a sufficiently complex way, that an algorithm can always ‘decide’ a binary, yes or no answer to a mathematical problem, when given an arbitrary input, in a sufficient amount of time. It does not require ingenuity, intuition or heuristic gambles, just a combination of simple consistent formal rules and a careful avoidance of contradiction.

The two key words there are ‘always’ and ‘decide’. They capture the progressive end-game of twentieth-century mathematicians who, like Hilbert, sought a simple totalising conceptual system to decide every mathematical problem and work towards absolute knowledge. All Turing had to do was make explicit Hilbert’s implicit computational treatment of formal rules, manipulate symbol strings and automate them using an ‘effective’ or “systematic method” (Turing, Solvable and Unsolvable Problems 584) encoded into a machine. This is what Turing’s thesis meant (discovered independently of Alonzo Church’s equivalent thesis (Church)): any systematic algorithm solved by a mathematical theorem can be computed by a Turing machine (Turing, An Application), or in Robin Gandy’s words, “[e]very effectively calculable function is a computable function” (Gandy).

Thus effective procedures decide problems, and they resolve puzzles providing winning positions (like theorems) in the game of functional rules and formal symbols. In Turing’s words, “a systematic procedure is just a puzzle in which there is never more than one possible move in any of the positions which arise and in which some significance is attached to the final result” (Turing, Solvable and Unsolvable Problems 590). The significance, or the winning position, becomes the crux of the matter for the decision: what puzzles or problems are to be decided? This is what formalism attempted to do: encode everything through the regime of formalised efficiency, so that all of mathematically inefficient problems are, in principle, ready to be solved. Programs are simply proofs: if it could be demonstrated mathematically, it could be automated.
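A ‘systematic procedure’ in Turing’s sense can be made concrete with a minimal machine: at every step, the current state and symbol determine exactly one move. The rule table below is an illustrative toy, not one of Turing’s own examples—it inverts a binary string and halts on the first blank:

```python
# A minimal Turing machine sketch (illustrative rules, not Turing's own
# example): every (state, symbol) pair determines exactly one move.

def run(tape, rules, state="S", halt="H"):
    """Run a one-tape Turing machine until it reaches the halting state."""
    cells = dict(enumerate(tape))   # sparse tape; '_' stands for blank
    pos = 0
    while state != halt:
        symbol = cells.get(pos, "_")
        write, move, state = rules[(state, symbol)]   # exactly one move
        cells[pos] = write
        pos += 1 if move == "R" else -1
    return "".join(cells[i] for i in sorted(cells) if cells[i] != "_")

# A toy 'effective procedure': flip every bit, halting on the first blank.
INVERT = {("S", "0"): ("1", "R", "S"),
          ("S", "1"): ("0", "R", "S"),
          ("S", "_"): ("_", "R", "H")}
```

Here `run("0110", INVERT)` yields `"1001"`: at no position in the puzzle is there ever more than one possible move, and significance is attached only to the final result.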

In 1936, Turing had shown that some complex mathematical concepts of effective procedures could simulate the functional decisions of all the other effective procedures (such as the Universal Turing Machine). Ten years later, Turing and John von Neumann would independently show how physical general-purpose computers offered the same thing, and from that moment on, efficient digital decisions manifested themselves in the cultural application of physical materials. Before Shannon’s information theory offered the precision of transmitting information, Hilbert and Turing developed the structure of its transmission in the underlying regime of formal decision.

Yet, there was also a non-computational importance here, for Turing was also fascinated by what decisions couldn’t compute. His thesis was quite precise, so as to elucidate that if no mathematical problem could be proved, a computer was not of any use. In fact, the entire focus of his 1936 paper, often neglected by Silicon Valley cohorts, was to show that Hilbert’s particular decision problem could not be solved. Unlike Hilbert, Turing was not interested in using computation to solve every problem, but in it as a curious endeavour for surprising intuitive behaviour. Most important of all, Turing’s halting, or printing, problem was influential precisely because it was undecidable: a decision problem which couldn’t be decided.

We can all picture the halting problem, even obliquely. Picture the frustrated programmer or mathematician staring at their screen, waiting to know when an algorithm will either halt and spit out a result, or provide no answer. The computer itself has already determined the answer for us; the programmer just has to know when to give up. But this is a myth, inherited with a bias towards human knowledge, and a demented understanding of machines as infinite calculating engines, rather than concrete entities of decision. For reasons that escape word space, Turing didn’t understand the halting problem in this way: instead he understood it as a contradictory example of computational decisions failing to decide on each other, on the grounds that there could never be one totalising decision or effective procedure. There is no guaranteed effective procedure to decide on all the others, and any attempt to build one (or invest in a view which might help build one) either has too much investment in absolute formal reason, or ends up with ineffective procedures.
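Turing’s argument can be sketched as the familiar diagonal construction: assume a total halting-decider exists, then build a program that does the opposite of whatever the decider predicts about it. A minimal Python rendering—illustrative, not Turing’s original machine-based proof:

```python
# Sketch of the diagonal argument, not Turing's original machine-based proof.
# `halts` is any claimed decider: it takes a zero-argument callable and
# returns True ("it halts") or False ("it loops forever").

def paradox(halts):
    """Build a program that the given halting-decider must misjudge."""
    def g():
        if halts(g):
            while True:   # decider said "halts", so loop forever
                pass
        # decider said "loops forever", so halt immediately
    return g

# Refute a decider that answers False about everything:
g = paradox(lambda f: False)
g()   # halts -- the opposite of what the decider claimed
```

Whatever the decider answers about `g`, `g` does the reverse, so no one effective procedure can decide on all the others—exactly the contradiction described above.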

Undecidable computation might be looked at as a dystopian counterpart to the efficiency of Shannon’s ‘digital information’ theory. Shannon’s is a base-2, binary system in which information takes one of two possible states: a system can communicate with one digit only in virtue of the fact that there is one other digit alternative to it. Yet the perfect transmission of that information is only available to a system which can ‘decide’ on the digits in question and establish a proof to calculate a success rate. If there is no mathematical proof to decide a problem, then transmitting information becomes problematic as a means of establishing a solution.
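
Shannon’s measure makes this two-state system precise: a binary source carries at most one bit per digit, and exactly one bit only when both digits are equally likely. A minimal sketch of that calculation:

```python
from math import log2

def binary_entropy(p):
    """Shannon entropy, in bits per symbol, of a two-state source that
    emits one digit with probability p and the other with 1 - p."""
    if p in (0.0, 1.0):
        return 0.0  # a source with only one possible state carries no information
    return -p * log2(p) - (1 - p) * log2(1 - p)

binary_entropy(0.5)  # a fair binary digit carries exactly 1 bit
binary_entropy(1.0)  # a certain outcome carries 0 bits: nothing is 'decided'
```

The second case is the point at issue: when one digit has no genuine alternative, there is nothing to transmit.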

3.

What has become clear is that our world is no longer accountable to human decision alone. Decisions are no longer confined within human borders, and ‘culture’ is no longer simply guided by a collective whole of social human decisions. Nor is it reducible to one harmonious ‘natural’ collective decision which prompts and pre-empts everything else. Instead we seem to exist in an ecology of decisions: or better yet, decisional ecologies. Before there was ever the networked protocol (Galloway), there was the computational decision. Decision ecologies are already set up before we enter the world, implicitly coterminous with our lives: explicitly determining a quantified or bureaucratic landscape upon which an individual has limited manoeuvrability.

Decisions are not just digital; they can be continuous, as computers can be: yet decisions are at their most efficient when digitally transferred. Decisions are everywhere and in everything. Look around. We are constantly told by governments and states that they are making tough decisions in the face of austerity. CEOs and directors make tough decisions for the future of their companies, and ‘great’ leaders are revered for being ‘great decisive leaders’: not just making decisions quickly and effectively, but also settling issues and producing definite results.

Even the word ‘decide’ comes from the Latin ‘decidere’, which means to determine something, but also ‘to cut off’. Algorithms in financial trading know not of value, but of decision: whether something is marked by profit or loss. Drones know not of human ambiguity, but can only decide between kill and ignore, cutting off anything in-between. Constructing a system which decides between one of two digital values, even repeatedly, means cutting off and excluding all other possible variables, leaving a final result at the end of the encoded message. Making a decision, or building a system to decide upon a particular ideal or judgement, forces all other alternatives outside of it. Decisions are always-already embedded into the framework of digital action, always-already deciding what is to be done, how it can be done or what is threatening to be done. It would make little sense to suggest that these entities ‘make decisions’ or ‘have decisions’; it would be better to say that they are decisions, and that ecologies are constitutively constructed by them.

The importance of neo-liberal digital transmissions is not that they are innovative, or worthy of a zeitgeist break, but that they demonstrably decide problems whose predominant significance is beneficial to individual efficiency and the accumulation of capital. Digital efficiency is simply about expanding the automation of decisions, and about what sort of formalised significances must be propagated to solve social and economic problems: a process which creates new problems in a vicious circle.

The question can no longer simply be ‘who decides?’, but now, ‘what decides?’ Is it the cafe menu board, the dinner party etiquette, the NASDAQ share price, Google PageRank, railway network delays, unmanned combat drones, the newspaper crossword, the JavaScript regular expression or the differential calculus? It’s not quite right to say that algorithms rule the world, whether in algo-trading or in data capture; the uncomfortable realisation is rather that real entities are built to determine provable outcomes time and time again: most notably outcomes for accumulating profit and extracting revenue from multiple resources.

One pertinent example: George Dantzig’s simplex algorithm, an effective procedure (whose origins lie in multidimensional geometry) that can always decide solutions to the large-scale optimisation problems which continually affect multinational corporations. The simplex algorithm’s proliferation and effectiveness have been critical since its first commercial application in 1952, when Abraham Charnes and William Cooper used it to decide how best to optimally blend four different petroleum products at the Gulf Oil Company (Elwes 35; Gass & Assad 79). Since then the simplex algorithm has seen decades of successful commercial use, deciding almost everything from bus timetables and work shift patterns to trade shares and Amazon warehouse configurations. According to the optimisation specialist Jacek Gondzio, the simplex algorithm runs at “tens, probably hundreds of thousands of calls every minute” (35), always deciding the most efficient method of extracting optimisation.
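
The kind of problem the simplex algorithm decides can be shown in miniature. The blending numbers below are invented for illustration (not the Gulf Oil data), and brute-force vertex enumeration stands in for the simplex method’s pivoting: both exploit the fact that an optimum of a linear programme lies at a vertex of the feasible region.

```python
from itertools import combinations

# A toy two-product blending problem (illustrative numbers): maximise
# profit 3x + 2y subject to
#   x + y <= 10   (total crude available)
#   x     <= 6    (blending unit A capacity)
#       y <= 8    (blending unit B capacity)
#   x, y  >= 0
constraints = [       # each row (a, b, c) encodes a*x + b*y <= c
    (1, 1, 10),
    (1, 0, 6),
    (0, 1, 8),
    (-1, 0, 0),       # x >= 0
    (0, -1, 0),       # y >= 0
]

def vertices(cons):
    """Yield feasible vertices: intersections of pairs of constraint
    boundaries that also satisfy every other constraint."""
    for (a1, b1, c1), (a2, b2, c2) in combinations(cons, 2):
        det = a1 * b2 - a2 * b1
        if det == 0:
            continue                   # parallel boundaries never meet
        x = (c1 * b2 - c2 * b1) / det  # Cramer's rule for the 2x2 system
        y = (a1 * c2 - a2 * c1) / det
        if all(a * x + b * y <= c + 1e-9 for a, b, c in cons):
            yield (x, y)

# The optimum sits at a vertex; the simplex method walks between vertices
# rather than enumerating them all, which is why it scales to real problems.
best = max(vertices(constraints), key=lambda v: 3 * v[0] + 2 * v[1])
# best == (6.0, 4.0): blend 6 units of one product and 4 of the other
```

The ‘decision’ here is exhaustive in the Latin sense: every other candidate blend is cut off, leaving a single provable optimum.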

In contemporary times, nearly all decision ecologies work in this way, accompanying and facilitating neo-liberal methods of self-regulation and processing all resources through a standardised efficiency: from bureaucratic methods of formal standardisation and banal forms ready to be analysed by one central system, to big-data initiatives and simple procedural methods of measurement and calculation. The technique of decision is a propagative method of embedding knowledge, optimisation and standardisation techniques in order to solve problems, with an urge to solve even the most unsolvable ones, including us.

Google do not build into their services an option to pay for the privilege of protecting privacy: the entire point of providing a free service which purports to improve daily life is that it primarily benefits the interests of shareholders and extends commercial agendas. James Grimmelmann has given a detailed exposition of Google’s own ‘net neutrality’ algorithms and how biased they happen to be. In short, PageRank does not simply decide relevant results, it decides visitor numbers, and he concluded on this note:

With disturbing frequency, though, websites are not users’ friends. Sometimes they are, but often, the websites want visitors, and will be willing to do what it takes to grab them (Grimmelmann 458).
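
The decision PageRank makes over visitor numbers can be seen in a minimal power-iteration sketch of the published algorithm (not Google’s production system); the three-page graph below is hypothetical.

```python
def pagerank(links, damping=0.85, iterations=100):
    """Power-iteration PageRank over a dict of page -> list of outgoing links.
    Pages with no outgoing links distribute their rank evenly to all pages."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new = {p: (1 - damping) / n for p in pages}
        for p, outs in links.items():
            targets = outs if outs else pages
            share = rank[p] / len(targets)
            for q in targets:
                new[q] += damping * share
        rank = new
    return rank

# A hypothetical three-page web: 'a' links to 'b' and 'c', 'b' to 'c', 'c' to 'a'.
graph = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}
ranks = pagerank(graph)
# 'c', with the most incoming links, is decided to deserve the most visitors
```

The ranks always sum to one: the algorithm does not create attention, it decides how a fixed quantity of it is divided.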

If the post-digital stands for the self-criticality of digitalisation already underpinning contemporary regimes of digital consumption and production, then its saliency lies in understanding the logic of decision inherent to such regimes. The reality of the post-digital shows that machines remain curiously efficient whether we relish in cynicism or not. Such regimes of standardisation and determined results were already ‘mistakenly built in’ to the theories which developed digital methods and means, irrespective of what computers can or cannot compute.

4.

Why then should such post-digital actors be understood as instantiations of propaganda? The familiarity of propaganda is manifestly evident in religious and political acts of ideological persuasion: brainwashing, war activity, political spin, mind control techniques, subliminal messages, political campaigns, cartoons, belief indoctrination, media bias, advertising or news reports. A definition of propaganda might follow from all of these examples: namely, the systematic social indoctrination of biased information that persuades the masses to take action on something which is neither beneficial to them, nor in their best interests. As Peter Kenez writes, propaganda is “the attempt to transmit social and political values in the hope of affecting people’s thinking, emotions, and thereby behaviour” (Kenez 4). Following Stanley B. Cunningham’s watered-down definition, propaganda might also denote a helpful and pragmatic “shorthand statement about the quality of information transmitted and received in the twentieth century” (Cunningham 3).

But propaganda isn’t as clear as this general definition makes out: in fact, what makes propaganda studies such a provoking topic is that nearly every scholar agrees that no stable definition exists. Propaganda moves beyond simple ‘manipulation’ and ‘lies’, beyond the derogatory, jingoistic representation of an unsubtle mood. Propaganda is as much about the paradox of constructing truth and the irrational spread of emotional pleas as it is about endorsing rational reason. As the master propagandist William J. Daugherty wrote:

It is a complete delusion to think of the brilliant propagandist as being a professional liar. The brilliant propagandist […] tells the truth, or that selection of the truth which is requisite for his purpose, and tells it in such a way that the recipient does not think that he is receiving any propaganda…. (Daugherty 39).

Propaganda, like ideology, works by being inherently implicit and social. In the same way that post-ideology apologists ignore their symptom, propaganda is also ignored. It isn’t to be taken as a shadowy fringe activity, blown apart by the democratising fairy-dust of ‘the Internet’. As many others have noted, the purported ‘decentralising’ power of online networks offers new methods for propagative techniques, or ‘spinternet’ strategies, evident in China (Brady). Iran’s recent investment in video game technology makes sense only when you discover that 70% of Iran’s population are under 30 years of age, underscoring a suitably contemporary method of dissemination. Similarly, in 2011 the New York City video game developer Kuma Games was mired in controversy when it was discovered that an alleged CIA agent, Amir Mirza Hekmati, had been recruited to make an episodic video game series intending to “change the public opinion’s mindset in the Middle East” (Tehran Times). The game in question, Kuma\War (2006 – 2011), was a free-to-play first-person shooter series, delivered in episodic chunks, the format of which attempted to simulate biased re-enactments of real-life conflicts shortly after they reached public consciousness.

Despite his unremarkable leanings towards Christian realism, Jacques Ellul famously updated propaganda’s definition as the end product of what he had previously lamented as ‘technique’. Instead of viewing propaganda as a highly organised, systematic strategy for extending the ideologues of peaceful warfare, he understood it as a general social phenomenon in contemporary society.

Ellul outlined two types of propaganda, political and sociological. Political propaganda involves governmental and administrative techniques which intend directly to change the political beliefs of an intended audience. By contrast, sociological propaganda is the implicit unification of involuntary public behaviour, creating images, aesthetics, problems and stereotypes whose purpose is neither explicitly direct nor overtly militaristic. Ellul argues that sociological propaganda exists “in advertising, in the movies (commercial and non-political films), in technology in general, in education, in the Reader’s Digest; and in social service, case work, and settlement houses” (Ellul 64). It is linked to what Ellul called “pre” or “sub-propaganda”: that is, an imperceptible persuasion, silently operating within one’s “style of life” or permissible attitude (63). Anticipating Louis Althusser’s Ideological State Apparatuses (Althusser 182) by nearly ten years, Ellul defines it as “the penetration of an ideology by means of its sociological context” (63). Sociological propaganda is inadequate for decisive action, paving the way for political propaganda – its strengthened explicit cousin – once the former’s implicitness needs to be transformed into the latter’s explicitness.

In a post-digital world, such implicitness no longer gathers wartime spirits, but instead propagates a neo-liberal way of life that is individualistic, wealth-driven and opinionated. Ellul’s most powerful assertion is that ‘facts’ and ‘education’ are part and parcel of the sociological propagative effect: nearly everyone feels a compelling need to be opinionated, and we all judge for ourselves what decisions should be made, without first considering the implicit landscape from which these judgements take place. One need only think of the implicit digital landscape of Twitter: the archetype for self-promotion and snippets of opinion and argument – all taking place within Ellul’s sub-propaganda of data collection and concealment. Such methods, he warns, will have “solved the problem of man” (xviii).

But information is of relevance here, and propaganda is only effective within a social community when it offers the means to solve problems using the communicative purview of information:

Thus, information not only provides the basis for propaganda but gives propaganda the means to operate; for information actually generates the problems that propaganda exploits and for which it pretends to offer solutions. In fact, no propaganda can work until the moment when a set of facts has become a problem in the eyes of those who constitute public opinion (114).

]]>
Wed, 11 Dec 2013 15:42:45 -0800 http://post-digital.projects.cavi.dk/?p=475
<![CDATA[The Impulse of the Geocities Archive: One Terabyte Of Kilobyte Age]]> https://www.furtherfield.org/features/impulse-geocities-archive-one-terabyte-kilobyte-age#new_tab

I visited the Photographers’ Gallery in central London for Furtherfield, and reviewed their latest exhibit One Terabyte of Kilobyte Age by artists Olia Lialina and Dragan Espenschied, on THE WALL. Over an eight-week period (18 April – 17 June 2013) they feature a non-stop stream of video captures of what they term the lost city and its archival ruins. A documentation of a past visual culture of the web and the creativity of its users, with new pages changing every 5 minutes. The project provides a glimpse into web publishing when users were in charge of design and narration, in contrast to the automated templates of Facebook, YouTube and Flickr. Sifting through a dormant internet message board, or stumbling, awestruck, on a kippleised [1] html homepage, its GIF constellations still twinkling many years after the owner has abandoned them, is an encounter with the living, breathing World Wide Web. At such moments we are led, so argues Marisa Olson, ‘to consider the relationship between taxonomy à la the stuffed-pet metaphor and taxonomy à la the digital archive.’ [2] How such descript images, contrived jumbles of memory and experience, could once have felt so essential to the person who collated them, yet now seem so indecipherable, stagnant, even – dare we admit it – insane to anyone but the most hardened retro-web enthusiast. On show at London’s Photographers’ Gallery until June 17th is an extensive archival exhibit designed to manage, reveal and keep these experiences alive. One Terabyte Of Kilobyte Age (1tb) is the fifth work to be commissioned for the Photographers’ Gallery’s ‘The Wall’, curated by two artists long associated with the era of the web the exhibition reveres: Olia Lialina and Dragan Espenschied.
Perhaps best known for their book Digital Folklore (2009), the artists and retro-web evangelists have, with the 1tb project, strengthened their status as archivists, an impulse Hal Foster famously argued is ‘concerned less with absolute origins than with obscure traces’ [3]. In the same year that Dragan and Olia launched their guide to the folk web, Yahoo! announced it was to close one of that book’s greatest sources of inspiration: Geocities. A vast expanse of personal webpages, many of which had long since slid into html decrepitude, represented for Yahoo! little but financial embarrassment. So ancient and outmoded was Geocities that many contemporary browsers were incapable of capturing its essence, fragmenting images and link rolls randomly across modern laptop screens in an attempt to render their 800×600 pixel aura. Scraping and downloading the terabyte or so of data that made up the Geocities universe was thought important enough by some that a taskforce was put together, made up of technical wizards and wizardesses driven by the profound notion that all existent culture is worth saving. From Olia and Dragan’s webpage: In between the announcement and the official date of death a group of people calling themselves Archive Team managed to rescue almost a terabyte of Geocities pages. On the 26th of October 2010, the first anniversary of this Digital Holocaust, the Archive Team started to seed geocities.archiveteam.torrent.

Olia and Dragan’s gesture, to feed the wealth of culture contained in that torrent back to the masses in a palatable form, is a project whose fruition at the Photographers’ Gallery is but a minor part. After downloading, storing and sorting the 16,000 archived Geocities sites, the task of exactly how to display them is a problem. Since most browsers would mangle the look and feel of the Geocities pages, Olia and Dragan have turned to two main methods of re-representation. The first, let loose on an automated Tumblr blog that updates over 70 times a day, is an ever-growing series of front-page screen captures. In this form 1tb bends to the will of a contemporary web user who concerns themselves with likes, reposts and uplinks. Reflecting on the Tumblr-archive of the torrent-archive of the Geocities-archive, Olia and Dragan’s site contemporary-home-computing highlights particular screen captures that have garnered the most reposts and likes from their Tumblr followers. The results say much for the humour that still drives online culture, but perhaps little about the original contexts from whence those screen captures came. For instance, the screen captures that garner most attention are usually the ones that have failed a part of the retrieval/display/capture process. These ‘obscure traces’ may be GIF-heavy sites, half loaded to interesting aesthetic effect, or, perhaps the most telling, captures that show nothing but the empty shell of a Netscape Navigator browser, caught forever like a millennium bug in digital amber.

The second mode of capture and re-display takes place at the Photographers’ Gallery itself. Depicted on nine large intersecting HD video screens set into ‘The Wall’ of the entrance-cum-café, one’s first experience of the exhibit is ponderous. The display cycles through the vast array of Geocities homepages at five minute intervals, giving viewers a more than generous dose of 800×600 px nostalgia. Whether the websites that fade into view are a barrage of animated GIFs, insightful commentary on life in the late 1990s, or a series of barren ‘Under Construction’ assemblages, is up to chance. As a reviewer, sent to derive something from the gallery experience, the wall leered at me with gestures that sent my inner taxonomist into a frenzy. Confronted with such tiny slivers of the archive, in such massive doses, it quickly becomes obvious that the real potential of the project has not been quite realised. Rather than static screen captures, The Wall shows cleverly rendered QuickTime videos, allowing the GIF whiskers of a Hello Kitty mascot to quiver once more. If you are lucky, or have the patience to watch a long series of the sites fade into view, you’ll be greeted by flickering ‘Welcome’ banners, by cartoon workmen tirelessly drilling, by unicorns cantering and sitemeter bars flashing. But The Wall also feels wholly at odds with its content, caught up in a whirl of web nostalgia that minimises the lives, experiences and aesthetic choices of a defining generation to static flashes that you can’t click on, no matter how much you want to. Archives are living, breathing entities wont to be probed for new meanings and interpretations. Whether depicted as static or faux animated, One Terabyte Of Kilobyte Age is a project with an endless surface, with little way for its viewers to delve deeper.

Trawling through the 1tb Tumblr is a much more visceral experience than the one that greets you at the Photographers’ Gallery, but the sense of a journey waiting to be embarked on is lost somewhat in the move to the Tumblr kingdom. Every five minutes offers a new chance to spot similarities on The Wall, to ponder on the origins of a site or, more profoundly, wonder where the people that toiled to make them are now. Before the days of user-driven content, of Facebook timelines, and even before RSS feed aggregators, the whole web felt something like this. Today’s web is unarguably more dynamic, with a clean aesthetic that barely shifts behind the waves of content that wash over its surface. But the user has been relegated to shuffler of material. The Geocities homepage was designed, and kept updated, by an army of amateur enthusiasts, organising bandwidth-light GIFs in ever more meaningful arrays, in the unlikely event that another living soul would stumble upon them. There is much to love about One Terabyte Of Kilobyte Age, and much to be learned from it given the time. But part of me wishes that the Photographers’ Gallery had given over their trendy café to a row of beige Intel 486 computer stacks, their unwieldy tube monitors better capturing the spirit of the web à la 1996. The clash between the 90s amateur enthusiast and the avid content shuffler of the 2010s is inherent in the modes of display Olia and Dragan chose for their project. Beginning from a desire to save and reflect on our shared heritage, 1tb now represents itself as pure content. An impulse to probe the archive replaced by an impulse to scroll endlessly through Tumblr streams, clicking like buttons on screen captures we hope will distract/impress/outrage our friends until the next cat video refreshes into view. Go, go to the Photographers’ Gallery tomorrow, grab yourself a coffee and let the Geocities archive wash over you.
If you can do it without Instagramming a snap to your friends, without updating your Facebook page with tales of your nostalgic reverie, if you can let the flickering screen captures do their own talking, only then can you claim you truly re-entered the kilobyte age.

References [1] ‘Kipple’ is a word coined by science fiction author Philip K. Dick to describe the entropy of physical forms, Dick’s comment on the contradictions of mass-production, utility and planned obsolescence. [2] Marisa Olson, “Lost Not Found: The Circulation of Images in Digital Visual Culture,” Words Without Pictures (September 18, 2008): 281. [3] Hal Foster, “An Archival Impulse,” October – (October 1, 2004): 5, doi:10.1162/0162287042379847.

]]>
Fri, 17 May 2013 04:16:13 -0700 https://www.furtherfield.org/features/impulse-geocities-archive-one-terabyte-kilobyte-age#new_tab
<![CDATA[Kipple and Things II: The Subject of Digital Detritus]]> http://machinemachine.net/text/ideas/kipple-and-things-ii-the-subject-of-digital-detritus

This text is a work in progress; a segment ripped from my thesis. To better ingest some of the ideas I throw around here, you might want to read these texts first:

- Kipple and Things: How to Hoard and Why Not To Mean
- Digital Autonomy

Captured in celluloid under the title Blade Runner (Scott 1982), Philip K. Dick’s vision of kipple abounds in a world where mankind lives alongside shimmering, partly superior, artificial humans. The limited lifespan built into the Nexus 6 replicants [i] is echoed in the human character J.F. Sebastian, [ii] whose own degenerative disorder lends his body a kipple-like quality, even if the mind it enables sparkles so finely. This association with replication and its apparent failure chimes with both the commodity fetish and an appeal to digitisation. In Walter Benjamin’s The Work of Art in the Age of its Technological Reproducibility, mechanisation and mass production begin at the ‘original’, and work to distance the commodity from the form captured by each iteration. Not only does the aura of the original stay intact as copies of it are reproduced on the production line, that aura is actually heightened in the system of commoditisation. As Fredric Jameson has noted, Dick’s work ‘renders our present historical by turning it into the past of a fantasized future’ (Jameson 2005, 345). Kipple piles up at the periphery of our culture, as if Dick is teasing us to look upon our own time from a future anterior in which commodity reification will have been:

It hadn’t upset him that much, seeing the half-abandoned gardens and fully abandoned equipment, the great heaps of rotting supplies. He knew from the edu-tapes that the frontier was always like that, even on Earth. (Dick 2011, 143)

Kipple figures the era of the commodity as an Empire, its borders slowly expanding away from the subjects yearning for Biltong replicas, seeded with mistakes. Kipple is a death of subjects, haunted by objects, but kipple is also a renewal, a rebirth. The future anterior is a frontier, one from which it might just be possible to look back upon the human without nostalgia.
Qualify the human subject with the android built in its image; the object with the entropic degradation that it must endure if its form is to be perpetuated, and you necessarily approach an ontology of garbage, junk and detritus: a glimmer of hope for the remnants of decay to assert their own identity. Commodities operate through the binary logic of fetishisation and obsolescence, in which the subject’s desire to obtain the shiny new object promotes the propagation of its form through an endless cycle of kippleisation. Kipple is an entropy of forms, ideals long since removed from their Platonic realm by the march of mimesis, and kippleisation an endless, unstoppable encounter between subjectness and thingness. Eschewing Martin Heidegger’s definition of a thing, in which objects are brought out of the background of existence through human use (Bogost 2012, 24), Bill Brown marks the emergence of things through the encounter:

As they circulate through our lives… we look through objects because there are codes by which our interpretive attention makes them meaningful, because there is a discourse of objectivity that allows us to use them as facts. A thing, in contrast, can hardly function as a window. We begin to confront the thingness of objects when they stop working for us… (Brown 2001, 4)

This confrontation with the ‘being’ of the object occurs by chance when, as Brown describes, a patch of dirt on the surface of the window captures us for a moment, ‘when the drill breaks, when the car stalls… when their flow within the circuits of production and distribution, consumption and exhibition, has been arrested, however momentarily’. (Brown 2001, 4) We no longer see through the window-object (literally or metaphorically), but are brought into conflict with its own particular discrete being by the encounter with its filthy surface.
A being previously submersed in the continuous background of the world as experience need not necessarily be untangled by an act of human-centric use. The encounter carries the effect of a mirror, for as experience stutters at the being of a thing, so the entity invested in that experience is made aware of its own quality as a thing – if only for a fleeting moment. Brown’s fascination with ‘how inanimate objects constitute human subjects’ (Brown 2001, 7) appears to instate the subject as the centre of worldly relations. But Bill Brown has spun a realist [iii] web in which to ensnare us. The object is not phenomenal, because its being exists independent of any culpability we may wish to claim. Instead a capture of object and human, of thing qua thing, occurs in mutual encounter, bringing us closer to a flat ontology ‘where humans are no longer monarchs of being but are instead among beings, entangled in beings, and implicated in other beings.’ (Bryant 2011, 40)

Brown’s appraisal of things flirts with the splendour of kipple. Think of the landfill, an engorged river of kipple, or the salvage yard, a veritable shrine to thingness. Tattered edges and featureless forms leak into one another in unsavoury shades of tea-stain brown and cobweb grey, splashed from the horizon to your toes. Masses of broken, unremarkable remnants in plastic, glass and cardboard brim over the edge of every shiny suburban enclave. The most astonishing thing about the turmoil of these places is how any order can be perceived in them at all. But thing aphasia does diminish, and it does so almost immediately. As the essential human instinct for order kicks in, things come to resemble objects. Classes of use, representation and resemblance neatly arise to cut through the pudding; to make the continuous universe discrete once again. You note a tricycle wheel there, underneath what looks like the shattered circumference of an Edwardian lamp. You almost trip over a bin bag full of carrot tops and potato peel before becoming transfixed by a pile of soap-opera magazines. Things, in Brown’s definition, are unreachable by human caprice. Things cannot be grasped, because their thingness slips back into recognition as soon as it is encountered:

When such a being is named, then, it is also changed. It is assimilated into the terms of the human subject at the same time that it is opposed to it as object, an opposition that is indeed necessary for the subject’s separation and definition. (Schwenger 2004, 137)

The city of Hull, the phrase ‘I will’, the surface of an ice cube and an image compression algorithm are entities each sustained by the same nominative disclosure: a paradox of things that seem to flow into one another with liquid potential, but things nonetheless limited by their constant, necessary re-iteration in language.
There is no thing more contradictory in this regard than the human subject itself, a figure Roland Barthes tried to paradoxically side-step in his playful autobiography. Replenishing each worn-out piece of its glimmering hull, one by one, the day arrives when the entire ship of Argo has been displaced – each of its parts now distinct from those of the ‘original’ vessel. For Barthes, this myth exposes two modest activities:

- Substitution (one part replaces another, as in a paradigm)
- Nomination (the name is in no way linked to the stability of the parts) (Barthes 1994, 46)

Like the ship of Argo, human experience has exchangeable parts, but at its core, such was Barthes’ intention, ‘the subject, unreconciled, demands that language represent the continuity of desire.’ (Eakin 1992, 16) In order that the subject remain continuous, it is the messy world that we must isolate into classes and taxonomies. We collate, aggregate and collect not merely because we desire, but because without these nominative acts the pivot of desire – the illusory subject – could not be sustained. If the powerful stance produced in Dick’s future anterior is to be sustained, the distinction between subjects aggregating objects, and objects coagulating the subject, needs flattening. [iv] Bill Brown’s appeal to the ‘flow within the circuits of production and distribution, consumption and exhibition’ (Brown 2001, 4) partially echoes Dick’s concern with the purity of the thing. Although Dick’s Biltong were probably more of a comment on the Xerox machine than the computer, the problem of the distribution of form, as it relates to commodity fetishism, enables ‘printing’ as a neat paradigm of the contemporary network-based economy. Digital things, seeming to proliferate independent of the sinuous optical cables and super-cooled server banks that disseminate them, are absolutely reliant on the process of copying.
Copying is a fundamental component of the digital network where, unlike the material commodity, things are not passed along. The digital thing is always a copy, is always copied, and is always copying:

Copying the product (mechanical reproduction technologies of modernity) evolves into copying the instructions for manufacturing (computer programs as such recipes of production). In other words, not only copying copies, but more fundamentally copying copying itself. (Parikka 2008, 72)

Abstracted from its material context, copying is ‘a universal principle’ (Parikka 2008, 72) of digital things, less flowing ‘within the circuits’ (Brown 2001, 4) than being that circuitry flow in and of itself. The entire network is a ship of Argo, capable, perhaps for the first time, [v] of Substituting and Nominating its own parts, or, as the character J.F. Isidore exclaims upon showing an android around his kippleised apartment: ‘When nobody’s around, kipple reproduces itself.’ [my emphasis] (Dick 1968, 53)

Kipple is not garbage, nor litter, for both these forms are decided upon by humans. In a recent pamphlet distributed to businesses throughout the UK, the Keep Britain Tidy Campaign made a useful distinction:

Litter can be as small as a sweet wrapper, as large as a bag of rubbish, or it can mean lots of items scattered about. ENCAMS describes litter as “Waste in the wrong place caused by human agency”. In other words, it is only people that make litter. (Keep Britain Tidy Campaign, 3)

Garbage is a decisive, collaborative form: humans choose to destroy or discard. A notion of detritus that enhances the autonomy, the supposed mastery, of the subject in its network. Digital networks feature their own litter in the form of copied data packets that have served their purpose, or been deemed erroneous by algorithms designed to weed out errors. These processes, according to W. Daniel Hillis, define ‘the essence of digital technology, which restores signal to near perfection at every stage’.
(Hillis 1999, 18) Maintenance of the network and the routines of error management are of primary economic and ontological concern: control the networks and the immaterial products will manage themselves; control the tendency of errors to reproduce, and we maintain a vision of ourselves as masters over what Michel Serres has termed ‘the abundance of the Creation’. (Serres 2007, 47) Seeming to sever their dependency on the physical processes that underlie them, digital technologies ‘incorporate hyper-redundant error-checking routines that serve to sustain an illusion of immateriality by detecting error and correcting it’. (Kirschenbaum 2008, 12) The alleviation of error and noise is, then, an implicit feature of digital materiality. Expressed at the level of the digital image, it is the visual glitch, the coding artifact, [vi] that signifies the potential of the digital object to loosen its shackles; to assert its own being. In a parody of Arthur C. Clarke’s famous utopian appraisal of technology, another science fiction author, Bruce Sterling, delivers a neat sound bite for the digital civilisation, so that:

Any sufficiently advanced technology is indistinguishable from magic. (Clarke 1977, 36)

…becomes…

Any sufficiently advanced technology is indistinguishable from [its] garbage. (Sterling 2012)

Footnotes

[i] A label appropriated by Ridley Scott for the film Blade Runner; Philip K. Dick, in the original novel Do Androids Dream of Electric Sheep?, preferred the more archaic, general term, android. Throughout the novel characters refer to the artificial humans as ‘andys’, portraying a casual ease with which to shrug off these shimmering subjects as mere objects.

[ii] A translated version of the character J.F. Isidore from the original novel.

[iii] Recent attempts to disable appeals to the subject – attempts by writers such as Graham Harman, Levi R. Bryant, Bill Brown and Ian Bogost – have sought to devise, in line with Bruno Latour, an ontology in which ‘Nothing can be reduced to anything else, nothing can be deduced from anything else, everything may be allied to everything else;’ (Latour 1993, 163) one in which a discussion of the being of a chilli pepper or a wrist watch may rank alongside a similar debate about the being of a human or a dolphin. An object-oriented, flat ontology (Bryant 2011) premised on the niggling sentiment that ‘all things equally exist, yet they do not exist equally.’ (Bogost 2012, 19) Unlike Graham Harman, who uses the terms interchangeably, (Bogost 2012, 24) Bill Brown’s Thing Theory approaches the problem by strongly asserting a difference between objects and things.

[iv] I have carefully avoided using the term ‘posthuman’, but I hope its resonance remains.

[v] The resonance here with a biological imperative is intentional, although it is perhaps in this work alone that I wish to completely avoid such digital/biological metonyms. Boris Groys’ text From Image to Image File – And Back: Art in the Age of Digitisation functions neatly to bridge this work with previous ones when he states: ‘The biological metaphor says it all: not only life, which is notorious in this respect, but also technology, which supposedly opposes nature, has become the medium of non-identical reproduction.’

[vi] I have very consciously chosen to spell ‘artifact’ with an ‘i’, widely known as the American spelling of the term. This spelling of the word aligns it with computer/programming terminology (i.e. ‘compression artifact’), leaving the ‘e’ spelling free to echo its archaeological heritage. In any case, multiple meanings for the word can be read in each instance.

Bibliography

Barthes, Roland. 1994. Roland Barthes. University of California Press.
Bogost, Ian. 2012. Alien Phenomenology, or What It’s Like to Be a Thing. University of Minnesota Press.
Brown, Bill. 2001. “Thing Theory.” Critical Inquiry 28 (1) (October 1): 1–22.
Bryant, Levi R. 2011. The Democracy of Objects. http://hdl.handle.net/2027/spo.9750134.0001.001.
Clarke, Arthur C. 1977. “Hazards of Prophecy: The Failure of Imagination.” In Profiles of the Future: An Inquiry into the Limits of the Possible. New York: Popular Library.
Dick, Philip K. 1968. Do Androids Dream of Electric Sheep? Random House Publishing Group, 2008.
———. 2011. The Three Stigmata of Palmer Eldritch. Houghton Mifflin Harcourt.
Eakin, Paul John. 1992. Touching the World: Reference in Autobiography. Princeton University Press.
Hillis, W. Daniel. 1999. The Pattern on the Stone: The Simple Ideas That Make Computers Work. 1st paperback ed. New York: Basic Books.
Jameson, Fredric. 2005. Archaeologies of the Future: The Desire Called Utopia and Other Science Fictions. Verso.
Keep Britain Tidy Campaign, Environmental Campaigns (ENCAMS). Your Rubbish and the Law: A Guide for Businesses. http://kb.keepbritaintidy.org/fotg/publications/rlaw.pdf.
Kirschenbaum, Matthew G. 2008. Mechanisms: New Media and the Forensic Imagination. MIT Press.
Latour, Bruno. 1993. The Pasteurization of France. Harvard University Press.
Parikka, Jussi. 2008. “Copy.” In Software Studies: A Lexicon, ed. Matthew Fuller, 70–78. Cambridge, Mass.: MIT Press.
Schwenger, Peter. 2004. “Words and the Murder of the Thing.” In Things, 135–150. University of Chicago Press Journals.
Scott, Ridley. 1982. Blade Runner. Drama, Sci-Fi, Thriller.
Serres, Michel. 2007. The Parasite. 1st University of Minnesota Press ed. Minneapolis: University of Minnesota Press.
Sterling, Bruce. 2012. “Design Fiction: Sascha Pohflepp & Daisy Ginsberg, ‘Growth Assembly’.” Wired Magazine: Beyond the Beyond. http://www.wired.com/beyond_the_beyond/2012/01/design-fiction-sascha-pohflepp-daisy-ginsberg-growth-assembly/.

]]>
Sat, 25 Aug 2012 10:00:00 -0700 http://machinemachine.net/text/ideas/kipple-and-things-ii-the-subject-of-digital-detritus
<![CDATA[Evolution, Entropy, and Information]]> http://blogs.discovermagazine.com/cosmicvariance/2012/06/07/evolution-entropy-and-information/

There are really two points. The first is a bit of technical background you can ignore if you like, and skip to the next paragraph. It’s the idea of “relative entropy” and its equivalent “information” formulation. Information can be thought of as “minus the entropy,” or even better “the maximum entropy possible minus the actual entropy.” If you know that a system is in a low-entropy state, it’s in one of just a few possible microstates, so you know a lot about it. If it’s high-entropy, there are many states that look that way, so you don’t have much information about it. (Aside to experts: I’m kind of shamelessly mixing Boltzmann entropy and Gibbs entropy, but in this case it’s okay, and if you’re an expert you understand this anyway.) John explains that the information (and therefore also the entropy) of some probability distribution is always relative to some other probability distribution, even if we often hide that fact by taking the fiducial probability to be uniform (… in some va
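The “maximum entropy possible minus the actual entropy” framing of information can be made concrete in a few lines of Python. This is my own illustrative sketch of the idea described above, not anything from the post itself; the function names are mine:

```python
import math

def shannon_entropy(p):
    """Entropy in bits of a discrete distribution (terms with p=0 contribute 0)."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def information(p):
    """'Information' as the maximum possible entropy (a uniform distribution
    over the same number of outcomes) minus the actual entropy of p."""
    return math.log2(len(p)) - shannon_entropy(p)

low_entropy = [1.0, 0.0, 0.0, 0.0]       # system pinned to one microstate
high_entropy = [0.25, 0.25, 0.25, 0.25]  # uniform: maximal entropy

print(information(low_entropy))   # 2.0 bits: we know a lot about the system
print(information(high_entropy))  # 0.0 bits: we know nothing beyond the counts
```

A low-entropy state carries maximal information (we know which microstate it is in); the uniform, high-entropy state carries none.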

]]>
Fri, 15 Jun 2012 05:19:00 -0700 http://blogs.discovermagazine.com/cosmicvariance/2012/06/07/evolution-entropy-and-information/
<![CDATA[Rigid Implementation vs Flexible Materiality]]> http://machinemachine.net/text/research/rigid-implementation-vs-flexible-materiality

Wow. It’s been a while since I updated my blog. I intend to get active again here soon, with regular updates on my research. For now, I thought it might be worth posting a text I’ve been mulling over for a while (!) Yesterday I came across this old TED presentation by Daniel Hillis, and it set off a bunch of bells tolling in my head. His book The Pattern on the Stone was one I leafed through a few months back whilst hunting for some analogies about (digital) materiality. The resulting brainstorm is what follows. (This blog post, from even longer ago, acts as a natural introduction: On (Text and) Exaptation)

In the 1960s and 70s Roland Barthes named “The Text” as a network of production and exchange. Whereas “the work” was concrete, final – analogous to a material – “the text” was more like a flow, a field or event – open ended. Perhaps even infinite. In From Work to Text, Barthes wrote:

The metaphor of the Text is that of the network… (Barthes 1979)

This semiotic approach to discourse, by initiating the move from print culture to “text” culture, also helped lay the ground for a contemporary politics of content-driven media. Skipping backwards through From Work to Text, we find this statement:

The text must not be understood as a computable object. It would be futile to attempt a material separation of works from texts.

I am struck here by Barthes’ use of the phrase “computable object”, as well as his attention to the “material”. Katherine Hayles, in her essay Print is Flat, Code is Deep (Hayles 2004), teases out the statement for us:

‘computable’ here mean[s] to be limited, finite, bound, able to be reckoned. Written twenty years before the advent of the microcomputer, his essay stands in the ironic position of anticipating what it cannot anticipate. It calls for a movement away from works to texts, a movement so successful that the ubiquitous ‘text’ has all but driven out the media-specific term book.
Hayles notes that the “ubiquity” of Barthes’ term “Text” allowed – in its wake – an erasure of media-specific terms, such as “book”. In moving from The Work to The Text, we move not just between different politics of exchange and dissemination; we also move between different forms and materialities of mediation. (Manovich 2002) For Barthes the material work was computable, whereas the network of the text – its content – was not.

In 1936, the year that Alan Turing wrote his iconic paper ‘On Computable Numbers’, a German engineer by the name of Konrad Zuse began building his first digital computer. Like its industrial predecessors, Zuse’s computer was designed to function via a series of holes encoding its program. Born as much out of convenience as financial necessity, Zuse punched his programs directly into discarded reels of 35mm film-stock. Fused together by the technologies of weaving and cinema, Zuse’s computer announced the birth of an entirely new mode of textuality. The Z3, the world’s first working programmable, fully automatic computer, arrived in 1941. (Manovich 2002) A few years earlier a young graduate by the name of Claude Shannon had published one of the most important master’s theses in history. In it he demonstrated that any logical expression of Boolean algebra could be programmed into a series of binary switches. Today computers still function with a logic indistinguishable from that of their mid-20th-century ancestors. What has changed is the material environment within which Boolean expressions are implemented. Shannon’s work first found itself manifest in the fragile rows of vacuum tubes that drove much of the technical innovation of the 40s and 50s. In time, the very same Boolean expressions were firing, domino-like, through millions of transistors etched onto the surface of silicon chips. If we were to query the young Shannon today, he might well gawp in amazement at the material advances computer technology has gone through. But if Shannon were to examine either your digital wrist watch or the world’s most advanced supercomputer in detail, he would once again feel at home in the simple binary – on/off – switches lining those silicon highways. Here the difference between how computers are implemented and what computers are made of digs the first of many potholes along our journey.
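Shannon’s insight – that any Boolean expression can be realised as a network of on/off switches – can be sketched in a few lines of Python, with the physical switches stood in for by boolean values (a toy illustration of the principle, not Shannon’s own notation):

```python
# Primitive 'switch' operations: the same logic whether the switch is a
# relay, a vacuum tube, a transistor or a tinker-toy string.
AND = lambda a, b: a and b   # two switches in series
OR  = lambda a, b: a or b    # two switches in parallel
NOT = lambda a: not a        # a normally-closed switch

def xor(a, b):
    """Exclusive-or composed purely from the switch primitives above."""
    return OR(AND(a, NOT(b)), AND(NOT(a), b))

# The truth table is identical however the switches are physically built.
for a in (False, True):
    for b in (False, True):
        print(a, b, xor(a, b))
```

The point of the sketch is material-independence: the composition of the expression, not the substance of the switches, fixes the behaviour.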
We live in an era not only practically driven by the computer, but one increasingly determined by the metaphors computers have injected into our language. Let us not make the mistake of presupposing that brains (or perhaps minds) are “like” computers. Tempting though it is to reduce the baffling complexities of the human being to the functions of the silicon chip, the parallel processor or the Wide Area Network, this reduction occurs most usefully at the level of metaphor and metonym. Again the mantra must be repeated that computers function through the application of Boolean logic and binary switches, something that cannot be said about the human brain with any confidence a posteriori. Later I will explore the consequences for our own understanding of ourselves enabled by the processing paradigm, but for now, or at least for the next few paragraphs, computers are to be considered in terms of their rigid implementation and flexible materiality alone. At the beginning of his popular science book, The Pattern on the Stone (Hillis 1999), W. Daniel Hillis narrates one of his many tales on the design and construction of a computer. Built from tinker-toys, the computer in question was/is functionally complex enough to “play” tic-tac-toe (noughts and crosses). The tinker-toy was chosen to indicate the apparent simplicity of computer design, but as Hillis argues himself, he may very well have used pipes and valves to create a hydraulic computer, driven by water pressure, or stripped the design back completely, using flowing sand, twigs and twine, or any other recipe of switches and connectors. The important point is that the tinker-toy tic-tac-toe computer functions perfectly well for the task it is designed for – perfectly well, that is, until the tinker-toy material begins to fail. This failure is what Chapter 1 of this thesis is about: why it happens, why its happening is a material phenomenon, and how the very idea of “failure” is suspect.
Tinker-toys fail because the mechanical operation of the tic-tac-toe computer puts strain on the strings of the mechanism, eventually stretching them beyond practical use. In a perfect world, devoid of entropic behaviour, the tinker-toy computer may very well function forever, its users setting O or X conditions, and the computer responding according to its program in perfect, logical order. The design of the machine, at the level of the program, is completely closed; finished; perfect. Only materially does the computer fail (or flail), noise leaking into the system until inevitable chaos ensues and the tinker-toys crumble back into jumbles of featureless matter. This apparent closure is important to note at this stage because in a computer as simple as the tic-tac-toe machine, every variable can be accounted for and thus programmed for. Were we to build a chess-playing computer from tinker-toys (pretending we could get our hands on the, no doubt, millions of tinker-toy sets we’d need), the closed condition of the computer may be less simple to qualify. Tinker-toys, hydraulic valves or whatever material you choose could be manipulated into any computer system you can imagine; even the most brain-numbingly complicated IBM supercomputer is technically possible to build from these fundamental materials. The reason we don’t do this, why we instead choose etched silicon as our material of choice for our supercomputers, exposes another aspect of computers we need to understand before their failure becomes a useful paradigm. A chess-playing computer is probably impossible to build from tinker-toys, not because its program would be too complicated, but because tinker-toys are too prone to entropy to create a valid material environment.
The program of any chess-playing application could, theoretically, be translated into a tinker-toy equivalent, but after the 1,000th string had stretched, with millions more to go, no energy would be left in the system to trigger the next switch along the chain. Computer inputs and outputs are always at the mercy of this kind of entropy, whether in tinker-toys or miniature silicon highways. Noise and dissipation are inevitable at any material scale one cares to examine. The second law of thermodynamics ensures this. Claude Shannon and his ilk knew this, even back when the most advanced computers they had at their command couldn’t yet play tic-tac-toe. They knew that they couldn’t rely on materiality to delimit noise, interference or distortion; that no matter how well constructed a computer is, no matter how incredible it was at materially stemming entropy (perhaps with stronger string connectors, or a built-in de-stretching mechanism), entropy nonetheless was inevitable. But what Shannon and other computer innovators such as Alan Turing also knew is that their saviour lay in how computers were implemented. Again, the split here is incredibly important to note:

- Flexible materiality: how and of what a computer is constructed, e.g. tinker-toys, silicon
- Rigid implementation: Boolean logic enacted through binary on/off switches (usually with some kind of input → storage → feedback/program function → output). Effectively, how a computer works

Boolean logic was not enough on its own. Computers, if they were to avoid entropy ruining their logical operations, needed to have built within them an error management protocol. This protocol is still in existence in EVERY computer in the world. Effectively it takes the form of a collection of parity bits delivered alongside each packet of data that computers, networks and software deal with. The bulk of the data contains the binary bits encoding the intended quarry, but the receiving element in the system also checks the main bits alongside the parity bits to determine whether any noise has crept into the system. What is crucial to note here is that the error-checking of computers happens at the level of their rigid implementation. It is also worth noting that for every eight 0s and 1s delivered by a computer system, at least one of those bits serves an error-checking function. W. Daniel Hillis puts the stretched strings of his tinker-toy mechanism into clear distinction and, in doing so, re-introduces an umbrella term set to dominate this chapter:

I constructed a later version of the Tinker Toy computer which fixed the problem, but I never forgot the lesson of the first machine: the implementation technology must produce perfect outputs from imperfect inputs, nipping small errors in the bud. This is the essence of digital technology, which restores signals to near perfection at every stage. It is the only way we know – at least, so far – for keeping a complicated system under control. (Hillis 1999, 18)

Bibliography

Barthes, Roland. 1979. ‘From Work to Text.’ In Textual Strategies: Perspectives in Poststructuralist Criticism, ed. Josue V. Harari, 73–81. Ithaca, NY: Cornell University Press.
Hayles, N. Katherine. 2004. ‘Print Is Flat, Code Is Deep: The Importance of Media-Specific Analysis.’ Poetics Today 25 (1) (March): 67–90. doi:10.1215/03335372-25-1-67.
Hillis, W. Daniel. 1999. The Pattern on the Stone: The Simple Ideas That Make Computers Work. 1st paperback ed. New York: Basic Books.
Manovich, Lev. 2002. The Language of New Media. 1st MIT Press pbk. ed. Cambridge, Mass.: MIT Press.

]]>
Thu, 07 Jun 2012 06:08:07 -0700 http://machinemachine.net/text/research/rigid-implementation-vs-flexible-materiality
<![CDATA[Sloppy MicroChips: Can a fair comparison be made between biological and silicon entropy?]]> http://ask.metafilter.com/mefi/217051

Was reading about microchips that are designed to allow a few mistakes (known as ‘sloppy chips’), and pondering equivalent kinds of ‘coding’ errors and entropy in biological systems. Can a fair comparison be made between the two? OK, to set up my question I probably need to run through my (basic) understanding of biological vs silicon entropy...

In the transistor, error is a bad thing (for getting the required job done as efficiently and cheaply as possible), metered by parity bits that come as standard in every packet of data transmitted. But in biological systems error is not necessarily bad. Most copying errors are filtered out, but some propagate, and some of those might become beneficial to the organism (in thermodynamics these are sometimes known as “autonomy producing equivocations”).
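The parity-bit mechanism mentioned above can be sketched in a few lines of Python. This is an illustrative toy, not the chip-level implementation; real links use richer codes (CRCs, Hamming codes), but the principle is the same:

```python
def add_parity(bits):
    """Append an even-parity bit to a list of 0/1 values, so the total
    number of 1s in the packet is always even."""
    return bits + [sum(bits) % 2]

def check_parity(packet):
    """True if the packet (data + parity bit) still has even parity."""
    return sum(packet) % 2 == 0

packet = add_parity([1, 0, 1, 1, 0, 1, 0])  # 7 data bits + 1 parity bit
print(check_parity(packet))                 # True: packet arrived intact

packet[2] ^= 1                              # a single bit flipped by 'noise'
print(check_parity(packet))                 # False: the error is detected
```

A single flipped bit always breaks even parity and is caught; this is the “restores signal to near perfection at every stage” machinery in its smallest form.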

Relating to the article about ‘sloppy chips’, how do entropy and energy efficiency factor into this? For the silicon chip, efficiency leads to heat (a problem); for the string of DNA, efficiency leads to fewer mutations, and thus less change within populations, and thus, inevitably, less capacity for organisms to diversify and react to their environments – leading to no evolution, no change, no good. Slightly less efficiency is good for biology and, it seems, good for some kinds of calculations and computer processes.

What work has been done on these connections I draw between the biological and the silicon?

I'm worried that my analogy is limited, based as it is on a paradigm for living systems that too closely mirrors the digital systems we have built. Can DNA and binary parity-bit transistors be understood on their own terms, without resorting to using the other as a metaphor for understanding?

Where do the boundaries lie in comparing the two?

]]>
Tue, 05 Jun 2012 10:05:10 -0700 http://ask.metafilter.com/mefi/217051
<![CDATA[entropy deserves a rest]]> http://twitter.com/therourke/statuses/174181686546931712 ]]>
Mon, 27 Feb 2012 09:18:44 -0800 http://twitter.com/therourke/statuses/174181686546931712
<![CDATA[Ask MeFi: Classical vs statistical entropy]]> http://ask.metafilter.com/203138/Classical-vs-statistical-entropy

Can someone explain to me in layman's terms the relation between classical entropy (heat over temperature, dS = dQ/T) and statistical entropy (S = -k Σ p_i ln p_i)? More specifically, how do the microstate probabilities arise and how do they relate to heat and temperature?
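One concrete handle on the statistical formula: when all W microstates are equally probable it reduces to Boltzmann's S = k ln W, which can be checked numerically (a Python sketch):

```python
import math

k = 1.380649e-23  # Boltzmann constant, J/K

def gibbs_entropy(probs):
    """S = -k * sum(p_i * ln p_i), the statistical (Gibbs) entropy."""
    return -k * sum(p * math.log(p) for p in probs if p > 0)

# For W equally likely microstates the formula reduces to Boltzmann's S = k ln W
W = 8
uniform = [1 / W] * W
assert abs(gibbs_entropy(uniform) - k * math.log(W)) < 1e-30

# Any non-uniform distribution over the same microstates has lower entropy:
# concentrating probability on a few states means less 'missing information'
skewed = [0.5, 0.3, 0.1, 0.05, 0.02, 0.01, 0.01, 0.01]
assert gibbs_entropy(skewed) < gibbs_entropy(uniform)
```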

]]>
Mon, 12 Dec 2011 01:24:30 -0800 http://ask.metafilter.com/203138/Classical-vs-statistical-entropy
<![CDATA[Noise; Mutation; Autonomy: A Mark on Crusoe’s Island]]> http://machinemachine.net/text/research/a-mark-on-crusoes-island

This mini-paper was given at the Escapologies symposium, at Goldsmiths University, on the 5th of December. Daniel Defoe’s 1719 novel Robinson Crusoe centres on the shipwreck and isolation of its protagonist. The life Crusoe knew beyond this shore was fashioned by Ships sent to conquer New Worlds and political wills built on slavery and imperial demands. In writing about his experiences, Crusoe orders his journal, not by the passing of time, but by the objects produced in his labour. A microcosm of the market hierarchies his seclusion removes him from: a tame herd of goats, a musket and gunpowder, sheaves of wheat he fashions into bread, and a shelter carved from rock with all the trappings of a King’s castle. Crusoe structures the tedium of the island by gathering and designing these items that exist solely for their use-value: “In a Word, The Nature and Experience of Things dictated to me upon just Reflection, That all the good Things of this World, are no farther good to us, than they are for our Use…” [1] Although Crusoe’s Kingdom mirrors the imperial British order, its mirroring is more structural than anything else. The objects and social contrivances Crusoe creates have no outside with which to be exchanged. Without an ‘other’ to share your labour there can be no mutual assurance, no exchanges leading to financial agreements, no business partners, no friendships. But most importantly to the mirroring of any Kingdom, without an ‘other’ there can be no disagreements, no coveting of a neighbour’s ox, no domination, no war: in short, an Empire without an outside might be complete, total, final, but an Empire without an outside has also reached a state of complete inertia. Crusoe’s Empire of one subject is what I understand as “a closed system”… The 2nd law of thermodynamics maintains that without an external source of energy, all closed systems will tend towards a condition of inactivity.
Eventually, the bacteria in the petri dish will multiply, eating up all the nutrients until a final state of equilibrium is reached, at which point the system will collapse in on itself: entropy cannot be avoided indefinitely. The term ‘negative entropy’ is often applied to living organisms because they seem to be able to ‘beat’ the process of entropy, but this is as much an illusion as the illusion of Crusoe’s Kingdom: negative entropy occurs at small scales, over small periods of time. Entropy is highly probable: the order of living beings is not. Umberto Eco: “Consider, for example, the chaotic effect… of a strong wind on the innumerable grains of sand that compose a beach: amid this confusion, the action of a human foot on the surface of the beach constitutes a complex interaction of events that leads to the statistically very improbable configuration of a footprint.” [2] The footprint in Eco’s example is a negative entropy event: the system of shifting sands is lent a temporary order by the cohesive action of the human foot. In physical terms, the footprint stands as a memory of the foot’s impression. The 2nd law of thermodynamics establishes a relationship between entropy and information: memory remains as long as its mark. Given time, the noisy wind and chaotic waves will cause even the strongest footprint to fade. A footprint is a highly improbable event. Before you read on, watch this scene from Luis Buñuel’s Robinson Crusoe (1954):

The footprint, when it first appears on the island, terrifies Crusoe as a mark of the outsider, but soon, realising what this outsider might mean for the totality of his Kingdom, Robinson begins the process of pulling the mark inside his conceptions: “Sometimes I fancied it must be the Devil; and reason joined in with me upon this supposition. For how should any other thing in human shape come into the place? Where was the vessel that brought them? What marks were there of any other footsteps? And how was it possible a man should come there?” [3] In the novel, it is only on the third day that Crusoe re-visits the site to compare his own foot with the print. The footprint is still there on the beach after all this time, a footprint Crusoe now admits is definitely not his own. This chain of events affords us several allegorical tools: firstly, that of the Devil, which Crusoe believes to be the only rational explanation for the print. This land, which has been Crusoe’s own for almost two decades, is solid, unchanging and eternal. Nothing comes in nor goes beyond its shores, yet its abundance of riches has served Crusoe perfectly well: seemingly infinite riches for a Kingdom’s only inhabitant. Even the footprint, left for several days, remains upon Crusoe’s return. Like the novel of which it is a part, the reader of the mark may revisit the site of this unlikely incident again and again, each time drawing more meanings from its appearance. Before Crusoe entertains that the footprint might be that of “savages of the mainland” he eagerly believes it to be Satan’s, placed there deliberately to fool him. Crusoe revisits the footprint, in person and then, as it fades, in his own memory. He ‘reads’ the island, attributing meanings to marks he discovers that go far beyond what is apparent.
As Susan Stewart has noted: “In allegory the vision of the reader is larger than the vision of the text; the reader dreams to an excess, to an overabundance.” [4] Simon O’Sullivan, following from Deleuze, takes this further, arguing that in his isolation, a world free from ‘others’, Crusoe has merged with, become the island. The footprint is a mark that must be recuperated if Crusoe’s identity, his “power of will”, is to be maintained. An outsider must have caused the footprint, but Crusoe is only capable of reading in the mark something about himself. The evocation of a Demon, then, is Crusoe’s way of re-totalising his Empire, of removing the ‘other’ from his self-subjective identification with the island. So, how does this relate to thermodynamics? To answer that I will need to tell the tale of a second Demon, more playful even than Crusoe’s. In his 1871 book Theory of Heat, James Clerk Maxwell designed a thought experiment to test the 2nd law of thermodynamics. Maxwell imagines a microscopic being able to sort atoms bouncing around a closed system into two categories: fast and slow. If such a creature did exist, it was argued, no work would be required to decrease the entropy of a closed system. By sorting unlikely footprints from the chaotic arrangement of sand particles, Maxwell’s Demon, as it would later become known, appeared to contradict the law Maxwell himself had helped to develop. One method of solving the apparent paradox was devised by Charles H. Bennett, who recognised that the Demon would have to remember where he placed the fast and slow particles. Here, once again, the balance between the order and disorder of a system comes down to the balance between memory and information. As the demon decreases the entropy of its environment, so it must increase the entropy of its memory. The information required by the Demon acts like a noise in the system.
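Bennett's accounting can be made concrete with a toy simulation (a Python sketch, not a physical model): a demon that sorts particles by speed must record one bit per decision, and the Shannon entropy of that accumulated memory offsets the order the sorting creates.

```python
import math
import random

random.seed(0)

# A toy Maxwell's Demon: particles with random speeds bounce in a box.
# The demon sorts them into 'fast' and 'slow' chambers, but it cannot
# avoid storing one bit of measurement per particle it judges.
speeds = [random.random() for _ in range(1000)]
median = sorted(speeds)[len(speeds) // 2]

memory = []
fast, slow = [], []
for s in speeds:
    bit = 1 if s >= median else 0   # the demon's measurement...
    memory.append(bit)              # ...which it must remember
    (fast if bit else slow).append(s)

# Shannon entropy of the demon's memory, in bits per recorded decision
p1 = sum(memory) / len(memory)
H = -(p1 * math.log2(p1) + (1 - p1) * math.log2(1 - p1))

# Sorting at the median yields the maximum of 1 bit per particle: the
# order imposed on the gas reappears as disorder in the demon's memory.
assert H > 0.99
```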
The laws of physics had stood up under scrutiny, resulting in a new branch of science we now know as ‘Information Theory’. Maxwell’s Demon comes from an old view of the universe, “fashioned by divine intervention, created for man and responsive to his will” [5]. Information Theory represents a threshold, a revelation that the “inhuman force of increasing entropy, [is] indifferent to man and uncontrollable by human will.” [6] Maxwell’s Demon shows that the law of entropy has only a statistical certainty, that nature orders only on small scales and, that despite any will to control, inertia will eventually be reached. Developed at the peak of the British Empire, thermodynamics was sometimes called “the science of imperialism”, as Katherine Hayles has noted: “…to thermodynamicists, entropy represented the tendency of the universe to run down, despite the best efforts of British rectitude to prevent it from doing so… The rhetoric of imperialism confronts the inevitability of failure. In this context, entropy represents an apparently inescapable limit on the human will to control.” [7] Like Maxwell, Crusoe posits a Demon, with faculties similar in kind to his own, to help him quash his “terror of mind”. Crusoe’s fear is not really about outsiders coming in, the terror he feels comes from the realisation that the outsiders may have been here all along, that in all the 20 years of his isolation those “savages of the mainland” may have visited his island time and again. It is not an outside ‘other’ that disturbs and reorganises Crusoe’s Kingdom. A more perverse logic is at work here, and once again Crusoe will have to restructure his imperial order from the inside out. Before you read on, watch another scene from Luis Buñuel’s Robinson Crusoe (1954):

Jacques Rancière prepares for us a parable. A student who is illiterate, after living a fulfilled life without text, one day decides to teach herself to read. Luckily she knows a single poem by heart and procures a copy of that poem, presumably from a trusted source, by which to work. By comparing her memory of the poem, sign by sign, word by word, with the text of the poem she can, Rancière believes, finally piece together a foundational understanding of her written language: “From this ignoramus, spelling out signs, to the scientist who constructs hypotheses, the same intelligence is always at work – an intelligence that translates signs into other signs and proceeds by comparisons and illustrations in order to communicate its intellectual adventures and understand what another intelligence is endeavouring to communicate to it… This poetic labour of translation is at the heart of all learning.” [8] What interests me in Rancière’s example is not so much the act of translation as the possibility of mis-translation. Taken in light of The Ignorant Schoolmaster we can assume that Rancière is aware of the wide gap that exists between knowing something and knowing enough about something for it to be valuable. How does one calculate the value of a mistake? The ignoramus has an autonomy, but she is effectively blind to the quality and make-up of the information she parses. If she makes a mistake in her translation of the poem, this mistake can be one of two things: it can be a blind error, or it can be a mutation. In information theory, change within a closed system is understood to be the product of ‘noise’. The amount of change contributed by noise is called ‘equivocation’. If noise contributes to the reorganisation of a system in a beneficial way, for instance if a genetic mutation in an organism results in the emergence of an adaptive trait, then the equivocation is said to be ‘autonomy-producing’.
Too much noise is equivalent to too much information, a ‘destructive’ equivocation, leading to chaos. This balance is how evolution functions. An ‘autonomy-producing’ mutation will be blindly passed on to an organism’s offspring, catalysing the self-organisation of the larger system (in this case, the species). All complex, ‘autopoietic’ systems inhabit this fine divide between noise and inertia. Given just the right balance of noise recuperated by the system, and noise filtered out by the system, a state of productive change can be maintained, and a state of inertia can be avoided, at least for a limited time. According to Umberto Eco, in ‘The Open Work’: “To be sure, this word information in communication theory relates not so much to what you do say, as to what you could say… In the end… there is no real difference between noise and signal, except in intent.” [9] This rigid delineator of intent is the driving force of our contemporary communication paradigm. Information networks underpin our economic, political and social interactions: the failure to communicate is to be avoided at all costs. All noise is therefore seen as a problem. These processes, according to W. Daniel Hillis, define “the essence of digital technology, which restores signal to near perfection at every stage.” [10] To go back to Umberto Eco then, we appear to be living in a world of “do say” rather than “could say”. Maintenance of the network and the routines of error management are our primary economic and political concern: control the networks and the immaterial products will manage themselves. The modern network paradigm acts like Maxwell’s Demon, categorising information as either pure signal or pure noise.
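Shannon's 'equivocation' has a precise form. For the standard textbook case of a binary symmetric channel with uniform input (sketched here in Python; the essay itself names no particular channel), the equivocation H(X|Y) equals the binary entropy of the flip probability:

```python
import math

def h2(p):
    """Binary entropy in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

# For a binary symmetric channel with uniform input, the equivocation
# H(X|Y) -- Shannon's measure of what the noise destroys -- is h2(e),
# where e is the probability that a bit is flipped in transit.
def equivocation(e):
    return h2(e)

assert equivocation(0.0) == 0.0               # noiseless: nothing is lost
assert abs(equivocation(0.5) - 1.0) < 1e-12   # pure noise: the signal is gone
assert equivocation(0.1) < equivocation(0.3)  # more noise, more equivocation
```

Note that the formula is indifferent to whether a given flip is 'destructive' or 'autonomy-producing'; that distinction, as the essay argues, is one of intent, not of mathematics.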
As Mark Nunes has noted, following the work of Deleuze and Guattari: “This forced binary imposes a kind of violence, one that demands a rationalisation of all singularities of expressions within a totalising system… The violence of information is, then, the violence of silencing or making to speak that which cannot communicate.” [11] To understand the violence of this binary logic, we need go no further than Robinson Crusoe. Friday’s questions are plain-spoken, but do not adhere to the “do say” logic of Crusoe’s conception. In the novel, Crusoe’s approach to Friday becomes increasingly one-sided, until Friday utters little more than ‘yes’ and ‘no’ answers, “reducing his language to a pure function of immediate context and perpetuating a much larger imperialist tradition of levelling the vox populi.” [12] Any chance of what Friday “could say” has been violently obliterated. The logic of Rancière’s ignoramus, and of Crusoe’s levelling of Friday’s speech, is a logic of imperialism: reducing the possibility of noise and information to an either/or, inside/outside, relationship. Mark Nunes again: “This balance between total flow and total control parallels Deleuze and Guattari’s discussion of a regime of signs in which anything that resists systematic incorporation is cast out as an asignifying scapegoat “condemned as that which exceeds the signifying regime’s power of deterritorialisation.” [13] In the system of communication these “asignifying” events are not errors, in the common sense of the word. Mutation names a randomness that redraws the territory of complex systems. The footprint is the mark that reorganised the Empire. In Rancière’s parable, rather than note her intent to decode the poem, we should hail the moment when the ignoramus fails as her autonomous moment.
In a world where actants “translate signs into other signs and proceed by comparison and illustration” [14], the figures of information and communication are made distinct not by the caprice of those who control the networks, nor the desires of those who send and receive the messages, but by mutation itself. Michel Foucault, remarking on the work of Georges Canguilhem, drew the conclusion that the very possibility of mutation, rather than existing in opposition to our will, was what human autonomy was predicated upon: “In this sense, life – and this is its radical feature – is that which is capable of error… Further, it must be questioned in regard to that singular but hereditary error which explains the fact that, with man, life has led to a living being that is never completely in the right place, that is destined to ‘err’ and to be ‘wrong’.” [15] In his writings on the history of heredity, The Logic of Life, François Jacob lingers on another Demon in the details, fashioned by René Descartes in his infamous meditation on human knowledge. Jacob positions Descartes’ meditation in a period of explosive critical thought focussed on the very ontology of ‘nature’: “For with the arrival of the 17th Century, the very nature of knowledge was transformed. Until then, knowledge had been grafted on God, the soul and the cosmos… What counted [now] was not so much the code used by God for creating nature as that sought by man for understanding it.” [16] The infinite power of God’s will was no longer able to bend nature to any whim. If man were to decipher nature, to reveal its order, Descartes surmised, it was with the assurance that “the grid will not change in the course of the operation” [17]. For Descartes, the evil Demon is a metaphor for deception, espoused on the understanding that, underlying that deception, nature had a certainty. God may well have given the world its original impetus, have designed its original make-up, but that make-up could not be changed.
The network economy has today become the grid of operations onto which we map the world. Its binary restrictions predicate a logic of minimal error and maximum performance: a regime of control that drives our economic, political and social interdependencies. Trapped within his imperial logic, Robinson Crusoe’s levelling of inside and outside, his ruthless tidying of Friday’s noisy speech into a binary dialectic, disguises a higher order of reorganisation. As readers navigating the narrative we are keen to recognise the social changes Defoe’s novel embodies in its short-sighted central character. Perhaps, though, the most productive way to read this fiction is to allegorise it as an outside perspective on our own time. Gathering together the fruits of research, I am often struck by the serendipitous quality of so many discoveries. In writing this mini-paper I have found it useful to engage with these marks, which become like demonic footprints, mutations in my thinking. Comparing each side by side, I hope to find, in the words of Michel Foucault: “…a way from the visible mark to that which is being said by it and which, without that mark, would lie like unspoken speech, dormant within things.” [18]

References & Bibliography

[1] Daniel Defoe, Robinson Crusoe, Penguin Classics (London: Penguin Books, 2001).

[2] Umberto Eco, The Open Work (Cambridge: Harvard University Press, n.d.).

[3] Defoe, Robinson Crusoe.

[4] Susan Stewart, On Longing: Narratives of the Miniature, the Gigantic, the Souvenir, the Collection (Duke University Press, 1993).

[5] N. Katherine Hayles, “Maxwell’s Demon and Shannon’s Choice,” in Chaos Bound: Orderly Disorder in Contemporary Literature and Science (Cornell University Press, 1990).

[6] Ibid.

[7] Ibid.

[8] Jacques Rancière, The Emancipated Spectator (London: Verso, 2009).

[9] Eco, The Open Work. (My emphasis)

[10] W. Daniel Hillis, The Pattern on the Stone: The Simple Ideas That Make Computers Work, 1st ed. (New York: Basic Books, 1999).

[11] Mark Nunes, Error: Glitch, Noise, and Jam in New Media Cultures (Continuum International Publishing Group, 2010).

[12] Stewart, On Longing.

[13] Nunes, Error.

[14] Rancière, The Emancipated Spectator.

[15] Michel Foucault, “Life: Experience and Science,” in Aesthetics, Method, and Epistemology (The New Press, 1999).

[16] François Jacob, The Logic of Life: A History of Heredity; The Possible and the Actual (Penguin, 1989).

[17] Ibid.

[18] Michel Foucault, The Order of Things: An Archaeology of the Human Sciences (2003).

]]>
Wed, 07 Dec 2011 08:50:14 -0800 http://machinemachine.net/text/research/a-mark-on-crusoes-island
<![CDATA[Digital Decay (2001): by Bruce Sterling]]> http://variablemedia.net/pdf/Sterling.pdf

"Entropy requires no maintenance. Entropy has its own poetry."

]]>
Wed, 10 Aug 2011 09:59:32 -0700 http://variablemedia.net/pdf/Sterling.pdf