MachineMachine /stream - search for cyberspace
https://machinemachine.net/stream/feed
en-us | http://blogs.law.harvard.edu/tech/rss | LifePress | therourke@gmail.com

<![CDATA[A Declaration of the Interdependence of Cyberspace]]> https://www.interdependence.online/declaration/e-bw-AGkYsZFYqmAe2771A6hi9ZMIkWrkBNtHIF1hF4?s=09


]]>
Thu, 25 Nov 2021 02:51:21 -0800 https://www.interdependence.online/declaration/e-bw-AGkYsZFYqmAe2771A6hi9ZMIkWrkBNtHIF1hF4?s=09
<![CDATA[#654: Indigenous Futurism & Aboriginal Territories in Cyberspace with Jason Edward Lewis | Voices of VR Podcast]]> https://huffduffer.com/therourke/513594

I talk with Jason Edward Lewis about the Aboriginal Territories in Cyberspace initiative, as well as the two 2167 Indigenous Storytelling in VR pieces that he helped to produce. How do we reckon with the past, present, and future, and what types of possibilities open up when you start to tell stories from the perspective of seven generations from now, or about 150 years into the future? This conversation took place at the Symposium iX conference at the Society for Arts and Technology in Montreal, Canada.

LISTEN TO THIS EPISODE OF THE VOICES OF VR PODCAST
https://dts.podtrac.com/redirect.mp3/d1icj85yqthyoq.cloudfront.net/wp-content/uploads/2018/06/Voices-of-VR-654-Jason-Edward-Lewis.mp3

This is a listener-supported podcast through the Voices of VR Patreon.


Support Voices of VR

Subscribe on iTunes

Donate to the Voices of VR Podcast Patreon

Music: Fatality & Summer Trip

http://voicesofvr.com/654-indigenous-futurism-aboriginal-territories-in-cyberspace-with-jason-edward-lewis/

]]>
Wed, 28 Nov 2018 10:10:13 -0800 https://huffduffer.com/therourke/513594
<![CDATA[Seduced & Abandoned: The Body in the Virtual World - The Feminine Cyberspace]]> https://www.youtube.com/watch?v=doL9mRMEUGw

Seduced & Abandoned was one of a series of ICA conferences (spanning 12-13 March, 1994) held under the umbrella title Towards the Aesthetics of the Future, which explored the connections between culture, society and politics, and the impact of new digital processes and technologies upon them.

In this selection, Sadie Plant argues that cyberspace is a potentially radical space which uses modes of thinking and operating that have traditionally been seen as female. She also considers the relationship between cyberspace and immaterial space and speculates on what this could mean for the future. Christine Tamblyn and Pat Cadigan contribute to the discussion/Q&A but do not make individual presentations.

Digitisation supported by Virtual Futures http://virtualfutures.co.uk

Extra title music by Vapor Lanes https://vaporlanes.bandcamp.com/

Review of the proceedings http://www.independent.co.uk/extras/indybest/gadgets-tech/computers-more-theorists-than-you-could-shake-a-stick-at-rupert-goodwins-floats-in-organic-creme-de-1429850.html

]]>
Fri, 04 Aug 2017 04:48:09 -0700 https://www.youtube.com/watch?v=doL9mRMEUGw
<![CDATA[Virtual Futures 1995 - Replicunts: the Future of Cyberfeminism]]> https://vimeo.com/28025256

Replicunts: the Future of Cyberfeminism - Liana Borghi, Pat Cadigan, Gwyneth Jones, Francesca da Rimini, Josephine Starrs, Sadie Plant

Virtual Futures, University of Warwick, 26-28 May 1995. VirtualFutures.co.uk

The virtual revolution is also a sexual revolution. All New Gen plays with cyberspace amazons, and the Puppet Mistress weaves webs on the net. What are the virtual futures of gender and sexuality? What happens to masculinity and femininity as the Cyberflesh Girlmonsters come on line? Can patriarchy survive the emergence of cyberspace? Is anything straight in a non-linear world? Does the cyborg have a sex? Is a new sexual politics - or post-politics of some kind - gathering pace in the midst of the digital revolution?

]]>
Wed, 10 Jun 2015 06:13:59 -0700 https://vimeo.com/28025256
<![CDATA[Archetypes of Cyberspace]]> https://web.archive.org/web/20121031222627/http://www.theyrule.net/psybernet/archetypes/archetypes-cyberspace-031.html

To see the archetypal in an image is thus not a hermeneutic move. It is an imagistic move. We amplify an image by means of a myth in order not to find its archetypal meaning but in order to feed it with further images that increase its volume and depth and release its fecundity.

]]>
Sat, 22 Nov 2014 05:23:59 -0800 https://web.archive.org/web/20121031222627/http://www.theyrule.net/psybernet/archetypes/archetypes-cyberspace-031.html
<![CDATA[Meet the Father of Digital Life]]> http://nautil.us/issue/14/mutation/meet-the-father-of-digital-life

In 1953, at the dawn of modern computing, Nils Aall Barricelli played God. Clutching a deck of playing cards in one hand and a stack of punched cards in the other, Barricelli hovered over one of the world’s earliest and most influential computers, the IAS machine, at the Institute for Advanced Study in Princeton, New Jersey. During the day the computer was used to make weather forecasting calculations; at night it was commandeered by the Los Alamos group to calculate ballistics for nuclear weaponry. Barricelli, a maverick mathematician, part Italian and part Norwegian, had finagled time on the computer to model the origins and evolution of life.

Inside a simple red brick building at the northern corner of the Institute’s wooded wilds, Barricelli ran models of evolution on a digital computer. His artificial universes, which he fed with numbers drawn from shuffled playing cards, teemed with creatures of code—morphing, mutating, melting, maintaining. He created laws that determined, independent of any foreknowledge on his part, which assemblages of binary digits lived, which died, and which adapted. As he put it in a 1961 paper, in which he speculated on the prospects and conditions for life on other planets, “The author has developed numerical organisms, with properties startlingly similar to living organisms, in the memory of a high speed computer.” For these coded critters, Barricelli became a maker of worlds.

Until his death in 1993, Barricelli floated between biological and mathematical sciences, questioning doctrine, not quite fitting in. “He was a brilliant, eccentric genius,” says George Dyson, the historian of technology and author of Darwin Among The Machines and Turing’s Cathedral, which feature Barricelli’s work. “And the thing about geniuses is that they just see things clearly that other people don’t see.”

Barricelli programmed some of the earliest computer algorithms that resemble real-life processes: a subdivision of what we now call “artificial life,” which seeks to simulate living systems—evolution, adaptation, ecology—in computers. Barricelli presented a bold challenge to the standard Darwinian model of evolution by competition by demonstrating that organisms evolved by symbiosis and cooperation.


In fact, Barricelli’s projects anticipated many current avenues of research, including cellular automata, computer programs involving grids of numbers paired with local rules that can produce complicated, unpredictable behavior. His models bear striking resemblance to the one-dimensional cellular automata—life-like lattices of numerical patterns—championed by Stephen Wolfram, whose search tool Wolfram Alpha helps power the brain of Siri on the iPhone. Nonconformist biologist Craig Venter, in defending his creation of a cell with a synthetic genome—“the first self-replicating species we’ve had on the planet whose parent is a computer”—echoes Barricelli.

Barricelli’s experiments had an aesthetic side, too. Uncommonly for the time, he converted the digital 1s and 0s of the computer’s stored memory into pictorial images. Those images, and the ideas behind them, would influence computer animators in generations to come. Pixar cofounder Alvy Ray Smith, for instance, says Barricelli stirred his earliest thinking about the possibilities for computer animation, and beyond that, his philosophical muse. “What we’re really talking about here is the notion that living things are computations,” he says. “Look at how the planet works and it sure does look like a computation.”

Despite Barricelli’s pioneering experiments, barely anyone remembers him. “I have not heard of him to tell you the truth,” says Mark Bedau, professor of humanities and philosophy at Reed College and editor of the journal Artificial Life. “I probably know more about the history than most in the field and I’m not aware of him.”

Barricelli was an anomaly, a mutation in the intellectual zeitgeist, an unsung hero who has mostly languished in obscurity for the past half century. “People weren’t ready for him,” Dyson says. That a progenitor has not received much acknowledgment is a failing not unique to science. Visionaries often arrive before their time. Barricelli charted a course for the digital revolution, and history has been catching up ever since.

EVOLUTION BY THE NUMBERS: Barricelli converted his computer tallies of 1s and 0s into images. In this 1953 Barricelli print, explains NYU associate professor Alexander Galloway, the chaotic center represents mutation and disorganization. The more symmetrical fields toward the margins depict Barricelli’s evolved numerical organisms. From the Shelby White and Leon Levy Archives Center, Institute for Advanced Study, Princeton.

Barricelli was born in Rome on Jan. 24, 1912. According to Richard Goodman, a retired microbiologist who met and befriended the mathematician in the 1960s, Barricelli claimed to have invented calculus before his tenth birthday. When the young boy showed the math to his father, he learned that Newton and Leibniz had preempted him by centuries. While a student at the University of Rome, Barricelli studied mathematics and physics under Enrico Fermi, a pioneer of quantum theory and nuclear physics. A couple of years after graduating in 1936, he immigrated to Norway with his recently divorced mother and younger sister.

As World War II raged, Barricelli studied. An uncompromising oddball who teetered between madcap and mastermind, Barricelli had a habit of exclaiming “Absolut!” when he agreed with someone, or “Scandaloos!” when he found something disagreeable. His accent was infused with Scandinavian and Romantic pronunciations, making it occasionally challenging for colleagues to understand him. Goodman recalls one of his colleagues at the University of California, Los Angeles who just happened to be reading Barricelli’s papers “when the mathematician himself barged in and, without ceremony, began rattling off a stream of technical information about his work on phage genetics,” a science that studies gene mutation, replication, and expression through model viruses. Goodman’s colleague understood only fragments of the speech, but realized it pertained to what he had been reading.

“Are you familiar with the work of Nils Barricelli?” he asked.

“Barricelli! That’s me!” the mathematician cried.

Notwithstanding having submitted a 500-page dissertation on the statistical analysis of climate variation in 1946, Barricelli never completed his Ph.D. Recalling the scene in the movie Amadeus in which the Emperor of Austria commends Mozart’s performance, save for there being “too many notes,” Barricelli’s thesis committee directed him to slash the paper to a tenth of the size, or else it would not accept the work. Rather than capitulate, Barricelli forfeited the degree.

Barricelli began modeling biological phenomena on paper, but his calculations were slow and limited. He applied to study in the United States as a Fulbright fellow, where he could work with the IAS machine. As he wrote on his original travel grant submission in 1951, he sought “to perform numerical experiments by means of great calculating machines,” in order to clarify, through mathematics, “the first stages of evolution of a species.” He also wished to mingle with great minds—“to communicate with American statisticians and evolution-theorists.” By then he had published papers on statistics and genetics, and had taught Einstein’s theory of relativity. In his application photo, he sports a pyramidal moustache, hair brushed to the back of his elliptic head, and hooded, downturned eyes. At the time of his application, he was a 39-year-old assistant professor at the University of Oslo.

Although the program initially rejected him due to a visa issue, in early 1953 Barricelli arrived at the Institute for Advanced Study as a visiting member. “I hope that you will be finding Mr. Baricelli [sic] an interesting person to talk with,” wrote Ragnar Frisch, a colleague of Barricelli’s who would later win the first Nobel Prize in Economics, in a letter to John von Neumann, a mathematician at IAS, who helped devise the institute’s groundbreaking computer. “He is not very systematic always in his exposition,” Frisch continued, “but he does have interesting ideas.”

PSYCHEDELIC BARRICELLI: In this recreation of a Barricelli experiment, NYU associate professor Alexander Galloway has added color to show the gene groups more clearly. Each swatch of color signals a different organism. Borders between the color fields represent turbulence as genes bounce off and meld with others, symbolizing Barricelli’s symbiogenesis. Courtesy Alexander Galloway.

Centered above Barricelli’s first computer logbook entry at the Institute for Advanced Study, in handwritten pencil script dated March 3, 1953, is the title “Symbiogenesis problem.” This was his theory of proto-genes, virus-like organisms that teamed up to become complex organisms: first chromosomes, then cellular organs, onward to cellular organisms and, ultimately, other species. Like parasites seeking a host, these proto-genes joined together, according to Barricelli, and through their mutual aid and dependency, originated life as we know it.

Standard neo-Darwinian doctrine maintained that natural selection was the main means by which species formed. Slight variations and mutations in genes combined with competition led to gradual evolutionary change. But Barricelli disagreed. He pictured nimbler genes acting as a collective, cooperative society working together toward becoming species. Darwin’s theory, he concluded, was inadequate. “This theory does not answer our question,” he wrote in 1954, “it does not say why living organisms exist.”

Barricelli coded his numerical organisms on the IAS machine in order to prove his case. “It is very easy to fabricate or simply define entities with the ability to reproduce themselves, e.g., within the realm of arithmetic,” he wrote.

The early computer looked sort of like a mix between a loom and an internal combustion engine. Lining the middle region were 40 Williams cathode ray tubes, which served as the machine’s memory. Within each tube, a beam of electrons (the cathode ray) bombarded one end, creating a 32-by-32 grid of points, each consisting of a slight variation in electrical charge. There were five kilobytes of memory total stored in the machine. Not much by today’s standards, but back then it was an arsenal.
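
The memory figures quoted above are internally consistent; a quick arithmetic check:

```python
# 40 Williams cathode ray tubes, each storing a 32 x 32 grid of charge
# spots, one bit per spot.
tubes = 40
bits_per_tube = 32 * 32          # 1,024 bits per tube
total_bits = tubes * bits_per_tube
total_bytes = total_bits // 8    # 8 bits per byte
print(total_bits, total_bytes, total_bytes / 1024)  # 40960 5120 5.0
```

So the machine held 40,960 bits, or exactly five kilobytes, as the article says.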


Inside the device, Barricelli programmed steadily mutable worlds, each with rows of 512 “genes,” represented by integers ranging from negative to positive 18. As the computer cycled through hundreds and thousands of generations, persistent groupings of genes would emerge, which Barricelli deemed organisms. The trick was to tweak his manmade laws of nature—“norms,” as he called them—which governed the universe and its entities just so. He had to maintain these ecosystems on the brink of pandemonium and stasis. Too much chaos and his beasts would unravel into a disorganized shambles; too little and they would homogenize. The sweet spot in the middle, however, sustained life-like processes.

Barricelli’s balancing act was not always easygoing. His first trials were riddled with pests: primitive, often single numeric genes invaded the space and gobbled their neighbors. Typically, he was only able to witness a couple of hereditary changes, or a handful at best, before the world unwound. To create lasting evolutionary processes, he needed to handicap these pests’ ability to rapidly reproduce. By the time he returned to the Institute in 1954 to begin a second round of experiments, Barricelli made some critical changes. First, he capped the proliferation of the pests to once per generation. That constraint allowed his numerical organisms enough leeway to outpace the pests. Second, he began employing different norms to different sections of his universes. That forced his numerical organisms always to adapt.
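
The mechanics described above can be sketched in toy form. Barricelli's actual reproduction and collision norms were more elaborate and varied between experiments; the shift rule, collision rule, and generation count below are illustrative assumptions, with only the 512-cell rows and the gene range of negative to positive 18 taken from the article:

```python
import random

SIZE = 512            # cells per universe row, as in the article
GENE_RANGE = 18       # genes are integers in [-18, 18]; 0 marks an empty cell

def step(world):
    """One generation under an illustrative shift norm: each gene g at
    cell i tries to copy itself to cell (i + g) mod SIZE. A collision
    with a different gene 'crosses' the two occupants (a result of 0
    empties the cell). Not Barricelli's exact norm."""
    new = list(world)
    for i, g in enumerate(world):
        if g == 0:
            continue
        j = (i + g) % SIZE
        if new[j] in (0, g):
            new[j] = g  # clean reproduction into an empty or like cell
        else:
            # collision: combine the genes, wrapping back into [-18, 18]
            new[j] = (new[j] + g) % (2 * GENE_RANGE + 1) - GENE_RANGE
    return new

rng = random.Random(1953)  # seeded, like a shuffled deck of cards
world = [rng.randint(-GENE_RANGE, GENE_RANGE) for _ in range(SIZE)]
for _ in range(100):
    world = step(world)

# distinct surviving gene values -- a crude proxy for persistent 'organisms'
print(sorted({g for g in world if g != 0})[:5])
```

Even this crude version shows the balancing act the article describes: the wrap-around collision rule injects just enough disruption that some gene values die out while others persist across generations.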

Even in the earlier universes, Barricelli realized that mutation and natural selection alone were insufficient to account for the genesis of species. In fact, most single mutations were harmful. “The majority of the new varieties which have shown the ability to expand are a result of crossing-phenomena and not of mutations, although mutations (especially injurious mutations) have been much more frequent than hereditary changes by crossing in the experiments performed,” he wrote.

When an organism became maximally fit for an environment, the slightest variation would only weaken it. In such cases, it took at least two modifications, effected by a cross-fertilization, to give the numerical organism any chance of improvement. This indicated to Barricelli that symbioses, gene crossing, and “a primitive form of sexual reproduction,” were essential to the emergence of life.

“Barricelli immediately figured out that random mutation wasn’t the important thing; in his first experiment he figured out that the important thing was recombination and sex,” Dyson says. “He figured out right away what took other people much longer to figure out.” Indeed, Barricelli’s theory of symbiogenesis can be seen as anticipating the work of independent-thinking biologist Lynn Margulis, who in the 1960s showed that it was not necessarily genetic mutations over generations, but symbiosis, notably of bacteria, that produced new cell lineages.

Barricelli saw his computer organisms as a blueprint of life—on this planet and any others. “The question whether one type of symbio-organism is developed in the memory of a digital computer while another type is developed in a chemical laboratory or by a natural process on some planet or satellite does not add anything fundamental to this difference,” he wrote. A month after Barricelli began his experiments on the IAS machine, Crick and Watson announced the shape of DNA as a double helix. But learning about the shape of biological life didn’t put a dent in Barricelli’s conviction that he had captured the mechanics of life on a computer. Let Watson and Crick call DNA a double helix. Barricelli called it “molecule-shaped numbers.”


What buried Barricelli in obscurity is something of a mystery. “Being uncompromising in his opinions and not a team player,” says Dyson, no doubt led to Barricelli’s “isolation from the academic mainstream.” Dyson also suspects Barricelli and the indomitable Hungarian mathematician von Neumann, an influential leader at the Institute for Advanced Study, didn’t hit it off. Von Neumann appears to have ignored Barricelli. “That was sort of fatal because everybody looked to von Neumann as the grandfather of self-replicating machines.”

Ever so slowly, though, Barricelli is gaining recognition. That stems in part from another of Barricelli’s remarkable developments; certainly one of his most beautiful. He didn’t rest with creating a universe of numerical organisms; he converted his organisms into images. His computer tallies of 1s and 0s would then self-organize into visual grids of exquisite variety and texture. According to Alexander Galloway, associate professor in the department of media, culture, and communication at New York University, a finished Barricelli “image yielded a snapshot of evolutionary time.”

When Barricelli printed sections of his digitized universes, they were dazzling. To modern eyes they might look like satellite imagery of an alien geography: chaotic oceans, stratigraphic outcrops, and the contours of a single stream running down the center fold, fanning into a delta at the patchwork’s bottom. “Somebody needs to do a museum show and show this stuff because they’re outrageous,” Galloway says.


Today, Galloway, a member of Barricelli’s small but growing cadre of boosters, has recreated the images. Following methods described by Barricelli in one of his papers, Galloway has coded an applet using the computer language Processing to revive Barricelli’s numerical organisms—with slight variation. While Barricelli encoded his numbers as eight-unit-long proto-pixels, Galloway condensed each to a single color-coded cell. By collapsing each number into a single pixel, Galloway has been able to fit eight times as many generations in the frame. These revitalized mosaics look like psychedelic cross-sections of the fossil record. Each swatch of color represents an organism, and when one color field bumps up against another one, that’s where cross-fertilization takes place.
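
Galloway's condensation of each number to a single colored pixel, with one image row per generation, can be sketched as follows. The palette, the toy gene data, and the plain-text PPM output are my own illustrative choices, not Galloway's Processing code:

```python
def color(g):
    """Map a gene value in [-18, 18] to an RGB triple; 0 (empty) is black.
    Illustrative palette only."""
    if g == 0:
        return (0, 0, 0)
    hue = (g + 18) * 255 // 36   # spread the 37 gene values across 0-255
    return (hue, 255 - hue, 128)

def write_ppm(history, path):
    """Write a plain-text PPM image: one pixel per gene, one row per
    generation. history is a list of generations (lists of gene values)."""
    h, w = len(history), len(history[0])
    with open(path, "w") as f:
        f.write(f"P3\n{w} {h}\n255\n")
        for row in history:
            f.write(" ".join(f"{r} {g} {b}" for r, g, b in map(color, row)) + "\n")

# toy stand-in for a recorded run: 32 generations of 64 genes
history = [[(x * y) % 37 - 18 for x in range(64)] for y in range(32)]
write_ppm(history, "barricelli.ppm")
```

Stacking generations as rows is what makes the finished image "a snapshot of evolutionary time": vertical streaks are lineages persisting across generations, and ragged boundaries are the turbulence Galloway points to.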

“You can see these kinds of points of turbulence where the one color meets another color,” Galloway says, showing off the images on a computer in his office. “That’s a point where a number would be—or a gene would be—sort of jumping from one organism to another.” Here, in other words, is artificial life—Barricelli’s symbiogenesis—frozen in amber. And cyan and lavender and teal and lime and fuchsia.

Galloway is not the only one to be struck by the beauty of Barricelli’s computer-generated digital images. As a doctoral student, Pixar cofounder Smith became familiar with Barricelli’s work while researching the history of cellular automata for his dissertation. When he came across Barricelli’s prints he was astonished. “It was remarkable to me that with such crude computing facilities in the early 50s, he was able to be making pictures,” Smith says. “I guess in a sense you can say that Barricelli got me thinking about computer animation before I thought about computer animation. I never thought about it that way, but that’s essentially what it was.”

Cyberspace now swells with Barricelli’s progeny. Self-replicating strings of arithmetic live out their days in the digital wilds, increasingly independent of our tampering. The fittest bits survive and propagate. Researchers continue to model reduced, pared-down versions of life artificially, while the real world bursts with Boolean beings. Scientists like Venter conjure synthetic organisms, assisted by computer design. Swarms of autonomous codes thrive, expire, evolve, and mutate underneath our fingertips daily. “All kinds of self-reproducing codes are out there doing things,” Dyson says. In our digital lives, we are immersed in Barricelli’s world.

]]>
Fri, 20 Jun 2014 06:08:03 -0700 http://nautil.us/issue/14/mutation/meet-the-father-of-digital-life
<![CDATA[Digital Metaphors: Editor’s Introduction | Alluvium]]> http://www.alluvium-journal.org/2013/12/04/digital-metaphors-editors-introduction/

Metaphor wants to be…

‘[...] metaphors work to change people’s minds. Orators have known this since Demosthenes. [...] But there’s precious little evidence that they tell you what people think. [...] And in any case, words aren’t meanings. As any really good spy knows, a word is a code that stands for something else. If you take the code at face value then you’ve fallen for the trick.’ (Daniel Soar, “The Bourne Analogy”).

Tao Lin’s recent novel Taipei (2013) is a fictional document of life in our current digital culture. The protagonist, Paul — who is loosely based on the author — is numb from his always-on, digitally mediated life, and throughout the novel increases his recreational drug taking as a kind of compensation: the chemical highs and trips are the experiential counterpoint to the mundanity of what once seemed otherworldly — his online encounters. In the novel online interactions are not distinguished from real-life ones; they are all real, and so Paul’s digital malaise is also his embodied depressive mindset. The apotheosis of both these highs and lows is experienced by Paul, and his then girlfriend Erin, on a trip to visit Paul’s parents in Taipei. There the hyper-digital displays of the city — ‘lighted signs [...] animated and repeating like GIF files, attached to every building’ (166) — launch some of the more explicit meditations on digital culture in the novel:

Paul asked [Erin] if she could think of a newer word for “computer” than “computer,” which seemed outdated and, in still being used, suspicious in some way, like maybe the word itself was intelligent and had manipulated culture in its favor, perpetuating its usage (167).

Here Paul intimates a sense that language is elusive, that it is sentient, and that, in the words of Daniel Soar quoted above as an epigraph, it tricks us. It seems to matter that in this extract from Taipei the word ‘computer’ is conflated with a sense of the object ‘computer’. The word, in being ‘intelligent’, has somehow taken on the quality of the thing it denotes — a potentially malevolent agency. The history of computing is one of people and things: computers were first the women who calculated ballistics trajectories during the Second World War, whose actions became the template for modern automated programming. The computer, as an object, is also-always a metaphor of a human-machine relation. The name for the machine asserts a likeness between the automated mechanisms of computing and the physical and mental labour of the first human ‘computers’.

Thinking of computing as a substantiated metaphor for a human-machine interaction pervades the way we talk about digital culture, most particularly in the way we think of computers as sentient — however casually. We often speak of computers as acting independently from our commands, and frequently we think of them ‘wanting’ things, ‘manipulating’ culture, or ourselves.

Pre-electronic binary code: the history of computing offers us metaphors for human-machine interaction which pervade the way we talk about digital culture today [Image by Erik Wilde under a CC BY-SA license]

Julie E. Cohen, in her 2012 book Configuring the Networked Self, describes the way the misplaced metaphor of human-computer and machine-computer has permeated utopian views of digitally mediated life:

Advocates of information-as-freedom initially envisioned the Internet as a seamless and fundamentally democratic web of information [...]. That vision is encapsulated in Stewart Brand’s memorable aphorism “Information wants to be free.” [...] Information “wants” to be free in the same sense that objects with mass present in the earth’s gravitational field “want” to fall to the ground. (8)

Cohen’s sharp undercutting of Brand’s aphorism points us toward the way the metaphor of computing is also an anthropomorphisation. The metaphor implicates a human desire in machine action. This linguistic slipperiness filters through discussion of computing at all levels. In particular the field of software studies — concerned with theorising code and programming as praxis and thing — contains at its core a debate on the complexity of considering code in a language which will always metaphorise, or allegorise. Responding to an article of Alexander R. Galloway’s titled “Language Wants to Be Overlooked: On Software and Ideology”, Wendy Hui Kyong Chun argues that Galloway’s stance against a kind of ‘anthropomorphization’ of code studies (his assertion that as an executable language code is ‘against interpretation’) is impossible within a discourse of critical theory. Chun argues, ‘to what extent, however, can source code be understood outside of anthropomorphization? [...] (The inevitability of this anthropomorphization is arguably evident in the title of Galloway’s article: “Language Wants to Be Overlooked” [emphasis added].)’ (Chun 305). In her critique of Galloway’s approach Chun asserts that it is not possible to extract the metaphor from the material, that they are importantly and intrinsically linked.[1]

For Julie E. Cohen the relationship between metaphor and digital culture-as-it-is-lived is a problematic tie that potentially damages legal and constitutional understanding of user rights. Cohen convincingly argues that a term such as ‘cyberspace’, which remains inextricable from its fictional and virtual connotations, does not transition into legal language successfully; in part because the word itself is a metaphor, premised on an imagined reality rather than ‘the situated, embodied beings who inhabit it’ (Cohen 3). And yet Cohen’s writing itself demonstrates the tenacious substance of metaphoric language, using extended exposition of metaphors as a means to think more materially about the effects of legal and digital protocol and action. In the following extract from Configuring the Networked Self, Cohen is winding down a discussion of the difficulty of forming actual policy out of freedom-versus-control debates surrounding digital culture. Throughout the discussion Cohen has emphasised the way that both sides of the debate are unable to substantiate their rhetoric with embodied user practice; instead Cohen identifies a language that defers specific policy aims.[2] Cohen’s own use of metaphor in this section — ‘objections to control fuel calls [...]’, ‘darknets’ (the latter in inverted commas) — is made to mean something grounded, through a kind of allegorical framework. I am not suggesting that allegory materialises metaphor — allegory functioning in part as itself an extended metaphor — but it does contextualise metaphor.

How tenacious is metaphoric language? The persistence of computational metaphors in understanding digital culture could harm legal and constitutional understandings of user rights [Image by Christian under a CC BY-NC-ND license]

This is exemplified in Cohen’s description of the ways US policy discussions regarding code, rights and privacy of the subject are bound to a kind of imaginary, and demonstrate great difficulty in becoming concrete: Policy debates have a circular, self-referential quality. Allegations of lawlessness bolster the perceived need for control, and objections to control fuel calls for increased openness. That is no accident; rigidity and license historically have maintained a curious symbiosis. In the 1920s, Prohibition fueled the rise of Al Capone; today, privately deputized copyright cops and draconian technical protection systems spur the emergence of uncontrolled “darknets.” In science fiction, technocratic, rule-bound civilizations spawn “edge cities” marked by their comparative heterogeneity and near imperviousness to externally imposed authority. These cities are patterned on the favelas and shantytowns that both sap and sustain the world’s emerging megacities. The pattern suggests an implicit acknowledgment that each half of the freedom/control binary contains and requires the other (9-10). I quote this passage at length in order to get at the way in which the ‘self-referential nature’ of policy discussion is here explained through a conceptual, and specifically literary, framing. Technology is always both imagined and built: this seems obvious, but it justifies reiteration because the material operations of technology are always metaphorically considered just as they are concretely manifest. The perilous circumstance this creates is played on in Cohen’s writing as she critiques constitutional policy that repeatedly cannot get at the embodied subject that uses digital technology; thwarted by the writing and rewriting of debate. In Cohen’s words this real situation is like the science fiction that is always-already seemingly like the real technology. 
Whether William Gibson’s ‘cyberspace’, a programmer’s speculative coding, or a lawyer’s articulation of copyright, there is no easy way to break apart the relationship between the imaginary and the actual of technoculture. Perhaps then what is called for is an explosion of the metaphors that pervade contemporary digital culture. To, so to speak, push metaphors until they give way; to generate critical discourse that tests the limits of metaphors, in an effort to see what pretext they may yield for our daily digital interactions. The articles in this issue all engage with exactly this kind of discourse. In Sophie Jones’ “The Electronic Heart”, the history of computing as one of women’s labour is used to reconfigure the metaphor of a computer as an ‘electronic brain’; instead asking whether cultural anxieties about computer-simulated emotion are linked to the naturalization of women’s affective labour. In “An Ontology of Everything on the Face of the Earth”, Daniel Rourke also considers computers as a sentient metaphor: uncovering an uncanny symbiosis between what a computer wants and what a human can effect with computing, through a critical dissection of the biocybernetic leeching of John Carpenter’s 1982 film The Thing. Finally, in “The Metaphorics of Virtual Depth”, Rob Gallagher uses Marcel Proust’s treatment of novelistic spacetime to generate a critical discourse on spatial and perspectival metaphor in virtual game environments. All these articles put into play an academic approach to metaphors of computing that digs up and pulls out the stuff in between language and machine. In his introduction to Understanding Digital Humanities, David M. Berry has argued for such an approach: [what is needed is a] ‘critical understanding of the literature of the digital, and through that [to] develop a shared culture through a form of Bildung’ (8).

Elysium A wheel in the sky: Neill Blomkamp's futuristic L.A. plays on the territorial paranoia of the U.S. over alien invasion and dystopian metaphors of digitally-mediated environments [Image used under fair dealing provisions]

I am writing this article a day after seeing Neill Blomkamp’s film Elysium (2013). Reading Cohen’s assertion regarding the cyclical nature of US digital rights policy debates on control and freedom, her allegory with science fiction seems entirely pertinent. Elysium is set in 2154; the earth is overpopulated, under-resourced, and a global elite have escaped to a man-made (and machine-made) world on a spaceship, ‘Elysium’. Manufacturing for Elysium continues on earth where the population, ravaged by illness, dreams of escaping to Elysium to be cured in “Med-Pods”. The movie focuses on the slums of near future L.A. and — perhaps unsurprisingly given Blomkamp’s last film District 9 (2009) — plays on the real territorial paranoia of the U.S. over alien invasion: that the favelas of Central and South America, and the political structures they embody, are always threatening ascension. In Elysium the “edge city” is the whole world, and the technocratic power base is a spaceship garden, circling the earth’s orbit. ‘Elysium’ is a green and white paradise; a techno-civic environment in which humans and nature are equally managed, and manicured. ‘Elysium’, visually, looks a lot like Disney’s Epcot theme park — which brings me back to where I started. In Tao Lin’s Taipei Paul’s disillusionment with technology is in part with its failure to be as he imagined, and his imagination was informed by the Disney-fied future of Epcot. In Taipei:

Paul stared at the lighted signs, some of which were animated and repeated like GIF files, attached to almost every building to face oncoming traffic [...] and sleepily thought how technology was no longer the source of wonderment and possibility it had been when, for example, he learned as a child at Epcot Center [...] that families of three, with one or two robot dogs and one maid, would live in self-sustaining, underwater, glass spheres by something like 2004 or 2008 (166).
Thinking through the metaphor of Elysium has me thinking toward the fiction of Epcot (via Tao Lin’s book). The metaphor-cum-allegories at work here are at a remove from my digitally mediated, embodied reality, but they seep through nonetheless. Rather than only look for the concrete reality that drives the metaphor, why not also engage with the messiness of the metaphor; its potential disjunction with technology as it is lived, and its persistent presence regardless.

CITATION: Zara Dinnen, "Digital Metaphors: Editor's Introduction," Alluvium, Vol. 2, No. 6 (2013): n. pag. Web. 4 December 2013, http://dx.doi.org/10.7766/alluvium.v2.6.04

]]>
Wed, 11 Dec 2013 15:42:41 -0800 http://www.alluvium-journal.org/2013/12/04/digital-metaphors-editors-introduction/
<![CDATA[Today, cyber means war.]]> http://io9.com/today-cyber-means-war-but-back-in-the-1990s-it-mean-1325671487/1474902195/

Today, cyber means war. But back in the 1990s, it meant sex — at least, the kind of sex you can have in a chat room. Why did the word change, and where did it originally come from?

It all started with "cybernetics," an obscure term popularized by a mathematician named Norbert Weiner in the 1940s. For his groundbreaking book Cybernetics, Weiner borrowed the ancient Greek word "cyber," which is related to the idea of government or governing. Indeed, the only time the word cybernetics had appeared before was in a few works of political theory about the science of governance.

In his writing, Wiener described what was at the time a pretty futuristic idea — that one day there would be a computer system that ran on feedback. Essentially, it would be a self-governing system. And for a long time, cybernetics remained the purview of information theorists like Wiener and early computer programmers.

Science fiction author Pat Cadigan, whose novel Mindplayers is a cyberpunk classic, recalled that her first encounter with "cyber" was of a decidedly Wienerish variety. She told io9 that she first heard the term in high school in 1967, when somebody mentioned cybernetics. "I asked what cybernetics was. 'It has to do with computers,' was the answer. My eyes glazed over. For years, that was the only word I knew with the prefix 'cyber' in it."


But all that changed a little over a decade later. Cadigan recalled:

One morning in 1979, I was getting ready for work and Gary Numan's "Cars" came on the radio. Afterwards, the DJ said, "There's some cyberpunk for you." He was making a joke; in 1979, the punk movement was in full flower but the chaotic noise of punk music was starting to evolve into electronic noise.

Still, that joke quickly became a reality. In the early 1980s, the cyberpunk movement took over science fiction, spurred by the popularity of the film Blade Runner and William Gibson's novel Neuromancer. Authors like Cadigan, Bruce Sterling and Rudy Rucker were writing mind-blowing stories about the merging of humans and computers. Cyber became a catch-all prefix that could be added to any word to make it sound cutting-edge. Cadigan noted that cyber "sort of supplanted the term 'digital' in some ways as an indicator of something that was high tech."

The 1990s: Decade of Cyber

Cyberpunk was a mostly-underground artistic style in the 1980s, but suddenly in the 1990s everything was cyber. As more and more people got internet access, the alien world of cyberspace from William Gibson's work became a household consumer item.

Richard Holden, a lexicographer with the Oxford English Dictionary, recently researched the history of cyber for the dictionary. He told io9 that the 1990s were a time when use of the word underwent rapid diversification:

The Oxford English Dictionary entry for the prefix cyber- has evidence of its use going back to 1961 (in Cybertron, as it happens), but . . . it seems to have become particularly popular in the 1990s — we don’t have all that much evidence for its use before then. This seems likely to be a result of the invention of the World Wide Web, and the earliest evidence we’ve found for words like cyber-bully, cybercommunity, cybergeek, cyberlaw, cyberstalker, and, indeed, cybersex and cyberwar all comes from the early 90s. At that time you . . . seem to get a mix of positive and negative terms involving the prefix, which possibly reflects the mixed feelings people often have about the opportunities and threats a new technology can bring.

Ben Zimmer, who writes about linguistics for the Wall Street Journal, agreed with Holden, noting that the seemingly incongruous ideas of cybersex and cyberwar "grew up side by side." The earliest recorded use of the term "cybersecurity" came in 1989, the exact same year the word "cyberporn" was coined. But neither term was dominant. In the heady days of the 1990s "information superhighway," before people got used to the idea that shopping, dating, and work could exist online, adding the prefix cyber to something made it seem like it was taking place in the gleaming, pixelated world inhabited by futuristic youth.

Had the iPhone come along in the 1990s, it's likely that we'd be calling our devices something very different. Cadigan said, "Terminology-wise, I find it interesting that we never had cyber-phones. The mobile/cellular phone became the cell and then evolved into the smart phone, not the cyber-phone." Just as today everything from buildings to phones can be "smart," in the 1990s anything could be cyber.

Including sex.

The Cybersex Moment

Back in the days of AOL chat rooms, IRC channels, and text-only multi-user games, lots of people started having cybersex. Most of this furtive online activity involved no more than people talking dirty via text.

But cyber-pundits suggested that teledildonics and virtual reality sex were just around the corner. Soon, we would be having sex with chrome-plated dragon beasts in landscapes made of diamond flowers. And we would be stimulating our lovers 3,000 miles away with sex toys that plugged into both partners, sending the orgasmic shivers of one to the other via the internet.

Zimmer pointed out that Douglas Adams may have invented the idea of cybersex back in 1982, when he remarked in Life, the Universe and Everything that "Zaphod had spent most of his early history lessons plotting how he was going to have sex with the girl in the cybercubicle next to him." As more college-age people began piling onto the internet in the mid-1990s, cybersex became trendy slang for what you did with your long-distance boyfriend using the university dial-up connection. And, like most slang, it quickly got shortened to cyber.

]]>
Wed, 11 Dec 2013 15:42:39 -0800 http://io9.com/today-cyber-means-war-but-back-in-the-1990s-it-mean-1325671487/1474902195/
<![CDATA[Google’s Earth]]> http://www.nytimes.com/2010/09/01/opinion/01gibson.html?_r=1&

“I ACTUALLY think most people don’t want Google to answer their questions,” said the search giant’s chief executive, Eric Schmidt, in a recent and controversial interview. “They want Google to tell them what they should be doing next.” Do we really desire Google to tell us what we should be doing next? I believe that we do, though with some rather complicated qualifiers.

Science fiction never imagined Google, but it certainly imagined computers that would advise us what to do. HAL 9000, in “2001: A Space Odyssey,” will forever come to mind, his advice, we assume, eminently reliable — before his malfunction. But HAL was a discrete entity, a genie in a bottle, something we imagined owning or being assigned. Google is a distributed entity, a two-way membrane, a game-changing tool on the order of the equally handy flint hand ax, with which we chop our way through the very densest thickets of information. Google is all of those things, and a very large and powerful corporation to boot.

We have yet to take Google’s measure. We’ve seen nothing like it before, and we already perceive much of our world through it. We would all very much like to be sagely and reliably advised by our own private genie; we would like the genie to make the world more transparent, more easily navigable. Google does that for us: it makes everything in the world accessible to everyone, and everyone accessible to the world. But we see everyone looking in, and blame Google.

Google is not ours. Which feels confusing, because we are its unpaid content-providers, in one way or another. We generate product for Google, our every search a minuscule contribution. Google is made of us, a sort of coral reef of human minds and their products. And still we balk at Mr. Schmidt’s claim that we want Google to tell us what to do next. Is he saying that when we search for dinner recommendations, Google might recommend a movie instead? If our genie recommended the movie, I imagine we’d go, intrigued. If Google did that, I imagine, we’d bridle, then begin our next search.

We never imagined that artificial intelligence would be like this. We imagined discrete entities. Genies. We also seldom imagined (in spite of ample evidence) that emergent technologies would leave legislation in the dust, yet they do. In a world characterized by technologically driven change, we necessarily legislate after the fact, perpetually scrambling to catch up, while the core architectures of the future, increasingly, are erected by entities like Google.

Cyberspace, not so long ago, was a specific elsewhere, one we visited periodically, peering into it from the familiar physical world. Now cyberspace has everted. Turned itself inside out. Colonized the physical. Making Google a central and evolving structural unit not only of the architecture of cyberspace, but of the world. This is the sort of thing that empires and nation-states did, before. But empires and nation-states weren’t organs of global human perception. They had their many eyes, certainly, but they didn’t constitute a single multiplex eye for the entire human species.

Jeremy Bentham’s Panopticon prison design is a perennial metaphor in discussions of digital surveillance and data mining, but it doesn’t really suit an entity like Google. Bentham’s all-seeing eye looks down from a central viewpoint, the gaze of a Victorian warder. In Google, we are at once the surveilled and the individual retinal cells of the surveillant, however many millions of us, constantly if unconsciously participatory. We are part of a post-geographical, post-national super-state, one that handily says no to China. Or yes, depending on profit considerations and strategy. But we do not participate in Google on that level. We’re citizens, but without rights.

Much of the discussion of Mr. Schmidt’s interview centered on another comment: his suggestion that young people who catastrophically expose their private lives via social networking sites might need to be granted a name change and a fresh identity as adults. This, interestingly, is a matter of Google letting societal chips fall where they may, to be tidied by lawmakers and legislation as best they can, while the erection of new world architecture continues apace.

If Google were sufficiently concerned about this, perhaps the company should issue children with free “training wheels” identities at birth, terminating at the age of majority. One could then either opt to connect one’s adult identity to one’s childhood identity, or not. Childhoodlessness, being obviously suspect on a résumé, would give birth to an industry providing faux adolescences, expensively retro-inserted, the creation of which would gainfully employ a great many writers of fiction. So there would be a silver lining of sorts.

To be sure, I don’t find this a very realistic idea, however much the prospect of millions of people living out their lives in individual witness protection programs, prisoners of their own youthful folly, appeals to my novelistic Kafka glands. Nor do I take much comfort in the thought that Google itself would have to be trusted never to link one’s sober adulthood to one’s wild youth, which surely the search engine, wielding as yet unimagined tools of transparency, eventually could and would do.

I imagine that those who are indiscreet on the Web will continue to have to make the best of it, while sharper cookies, pocketing nyms and proxy cascades (as sharper cookies already do), slouch toward an ever more Googleable future, one in which Google, to some even greater extent than it does now, helps us decide what we’ll do next.

William Gibson is the author of the forthcoming novel “Zero History.”

]]>
Sat, 09 Nov 2013 04:02:33 -0800 http://www.nytimes.com/2010/09/01/opinion/01gibson.html?_r=1&
<![CDATA[The Myth of Cyberspace – The New Inquiry]]> http://thenewinquiry.com/essays/the-myth-of-cyberspace/

In the early 1980s, when personal computing first became a reality, the faces of glowing terminals had an almost magical aura, transubstantiating arcane passages of 1s and 0s into sensory experience.

]]>
Thu, 26 Sep 2013 00:49:56 -0700 http://thenewinquiry.com/essays/the-myth-of-cyberspace/
<![CDATA[New Sculpt]]> http://machinemachine.net/portfolio/new-sculpt

I wrote an essay released in tandem with New Sculpt: LaTurbo Avedon‘s first solo exhibition, held at Transfer Gallery, New York – July 20 through August 10, 2013.

Excerpt: In the past three decades the term “cyberspace” has come to define a social, rather than a geometric, common environment. The term still conjures up images of posthuman projections and vast parallax branes, their cross-hatched surfaces woven at right angles by virtual motorcycles. We hear “cyberspace” and we think of terminals, of cables and an ocean of information, yet the most important means by which cyberspace is produced — namely human social and economic relations — barely registers a flicker. The objects of New Sculpt play between these contradictions.

]]>
Wed, 24 Jul 2013 05:32:56 -0700 http://machinemachine.net/portfolio/new-sculpt
<![CDATA[How Our Visions of Virtual Reality Have Changed in the Past 40 Years]]> http://io9.com/how-our-visions-of-virtual-reality-have-changed-in-the-582906269

The other day, Second Life celebrated its 10-year anniversary. But long before that venerable virtual world came into existence, we were dreaming up images of virtual reality and cyberspace. Top image: Tron vs. Tron Legacy

]]>
Sun, 14 Jul 2013 16:40:02 -0700 http://io9.com/how-our-visions-of-virtual-reality-have-changed-in-the-582906269
<![CDATA["The Infinite Plasticity of the Digital"]]> http://reconstruction.eserver.org/043/leaver.htm

Ever since William Gibson coined the term "cyberspace" — first in the 1982 story "Burning Chrome," then popularized in his debut novel Neuromancer — his work has been seen by many as a yardstick for postmodern and, more recently, posthuman possibilities.

]]>
Wed, 19 Jun 2013 09:39:53 -0700 http://reconstruction.eserver.org/043/leaver.htm
<![CDATA[The Cyberspace Real (Between Perversion and Trauma)]]> http://www.egs.edu/faculty/slavoj-zizek/articles/the-cyberspace-real

Are the pessimistic cultural criticists (from Jean Baudrillard to Paul Virilio) justified in their claim that cyberspace ultimately generates a kind of proto-psychotic immersion into an imaginary universe of hallucinations, unconstrained by any symbolic Law or by any impossibility of some Real? If not, how are we to detect in cyberspace the contours of the other two dimensions of the Lacanian triad ISR, the Symbolic and the Real?

As to the symbolic dimension, the solution seems easy — it suffices to focus on the notion of authorship that fits the emerging domain of cyberspace narratives, that of "procedural authorship": the author (say, of the interactive immersive environment in which we actively participate by role-playing) no longer writes a detailed story-line; s/he merely provides the basic set of rules (the coordinates of the fictional universe in which we immerse ourselves, the limited set of actions we are allowed to accomplish within this virtual space, etc.).

]]>
Fri, 30 Sep 2011 07:33:41 -0700 http://www.egs.edu/faculty/slavoj-zizek/articles/the-cyberspace-real
<![CDATA[Virtual Reality and the Exploration of Cyberspace]]> http://tumblr.machinemachine.net/post/2840425473

Virtual Reality and the Exploration of Cyberspace

]]>
Thu, 20 Jan 2011 02:25:48 -0800 http://tumblr.machinemachine.net/post/2840425473
<![CDATA[John Gray on humanity's quest for immortality]]> http://www.guardian.co.uk/books/2011/jan/08/john-gray-immortality

How do we deal with a purposeless universe and the finality of death? From Victorian séances to the embalming of Lenin's corpse to schemes for uploading our minds into cyberspace, there have been numerous attempts to deny man's mortality. Why can't we accept the limits of science?

Darwinism is impossible to reconcile with the notion that humans have any special exemption from mortality. In Darwin's scheme of things species are not fixed or everlasting; there is no impassable barrier between human minds and those of other animals. How then could only humans go on to a life beyond the grave? If all life were extinguished on Earth, possibly as a result of climate change caused by humans, would they look down from the after-world, alone, on the wasteland they had left beneath? Surely, in terms of the prospect of immortality, all sentient beings stand or fall together.

]]>
Sun, 09 Jan 2011 15:38:31 -0800 http://www.guardian.co.uk/books/2011/jan/08/john-gray-immortality
<![CDATA[Code is Law]]> http://harvardmagazine.com/2000/01/code-is-law.html

Every age has its potential regulator, its threat to liberty. Our founders feared a newly empowered federal government; the Constitution is written against that fear. John Stuart Mill worried about the regulation by social norms in nineteenth-century England; his book On Liberty is written against that regulation. Many of the progressives in the twentieth century worried about the injustices of the market. The reforms of the market, and the safety nets that surround it, were erected in response.

This regulator is code—the software and hardware that make cyberspace as it is. This code, or architecture, sets the terms on which life in cyberspace is experienced. It determines how easy it is to protect privacy, or how easy it is to censor speech. It determines whether access to information is general or whether information is zoned.

]]>
Sun, 07 Feb 2010 09:20:00 -0800 http://harvardmagazine.com/2000/01/code-is-law.html
<![CDATA[█░LAST░█░M░I░D░I░█ █░BACKGROUND░█]]> http://www.kingcosmonaut.de/lmb/#73

LMB is an internet radio, a cyberspace shuttle, and a kind of archive. It takes you on a journey through an almost forgotten web that is loud, colorful, often "personal", and doesn't care about standards. Though it might be forgotten by many, some parts of it are still there, waiting to be explored. And maybe we can learn something along the way. What?

LMB plays a continuous stream of MIDI music. However, these aren't just random tunes; instead, the songs are taken from websites where they are played as background music.

While playing a song the LMB cyberspace shuttle flies through a stream of images that have been taken from the website you're (kind of) listening to.

]]>
Tue, 08 Dec 2009 03:04:00 -0800 http://www.kingcosmonaut.de/lmb/#73
<![CDATA[Immaterial Labour in the Digital Economy | Eleni Ikoniadou]]> http://subsol.c3.hu/subsol_2/contributors3/ikoniadoutext.html

The Internet, arguably the most influential digital medium, has sparked an explosion of debate in recent years regarding its economic, political and social status; a space whose structure is constantly criticized and whose potential nurtures new and contradictory ideas. British scholar Richard Barbrook (University of Westminster) is the instigator of one such idea, which finds its groundwork in Marxist critical analysis of capital. According to Barbrook, the new economy of the Internet era is called "the digital economy"; its workers are "the digital artisans," and their "tools" the new technologies, that is, computer networks. [2] Barbrook believes that this is a mixed economy that fosters a successful symbiosis of the public, the market and what he calls the "gift-economy," which he understands as a representation of anarcho-communism in cyberspace.

]]>
Fri, 13 Mar 2009 05:45:00 -0700 http://subsol.c3.hu/subsol_2/contributors3/ikoniadoutext.html