<![CDATA[Algorithmic Narratives and Synthetic Subjects (paper)]]> http://machinemachine.net/portfolio/paper-at-theorizing-the-web-synthetic-subjects/

This was the paper I delivered at The Theorizing the Web Conference, New York, 18th April 2015. This video of the paper begins part way in, and misses out some important stuff. I urge you to watch the other, superb, papers on my panel by Natalie Kane, Solon Barocas, and Nick Seaver. A better video is forthcoming. I posted this up partly in response to this post at Wired about the UK election, Facebook’s echo-chamber effect, and other implications well worth reading into.

Data-churning algorithms are integral to our social and economic networks. Rather than replace humans, these programs are built to work with us, allowing the distinct strengths of human and computational intelligences to coalesce. As we are submerged into the era of ‘big data’, these systems have become more and more common, concentrating every terabyte of raw data into meaningful arrangements more easily digestible by high-level human reasoning.

A company calling themselves ‘Narrative Science’, based in Chicago, have established a profitable business model on this relationship. Their slogan, ‘Tell the Stories Hidden in Your Data’, [1] is aimed at companies drowning in spreadsheets of cold information: a promise that Narrative Science can ‘humanise’ their databases with very little human input. Kristian Hammond, Chief Technology Officer of the company, claims that within 15 years over 90% of all news stories will be written by algorithms. [2] But rather than replacing the jobs that human journalists now undertake, Hammond claims the vast majority of this ‘robonews’ output will report on data not currently covered by traditional news outlets. One family-friendly example is the coverage of little-league baseball games. Very few news organisations have the resources, or the desire, to hire a swathe of human journalists to write up every little-league game. Instead, Narrative Science offer leagues, parents and their children a miniature summary of each game, gleaned from match statistics uploaded by diligent little-league attendees and written up in a variety of journalistic styles.

In their 2013 book ‘Big Data’, Viktor Mayer-Schönberger, Oxford University Professor of internet governance, and Kenneth Cukier, ‘data editor’ of The Economist, tell us excitedly about another data aggregation company, Prismatic, who:

…rank content from the web on the basis of text analysis, user preferences, social network-popularity, and big-data analysis. [3]

According to Mayer-Schönberger and Cukier this makes Prismatic able ‘to tell the world what it ought to pay attention to better than the editors of the New York Times’. [4] A situation, Steven Poole reminds us, we can little argue with, so long as we agree that popularity underlies everything that is culturally valuable.

Data is now the lifeblood of technocapitalism: a vast, endless influx of information flowing in from the growing universe of networked and internet-connected devices. As many of the papers at Theorizing the Web attest, our environment is more and more founded upon systems whose job it is to mediate our relationship with this data. Technocapitalism still appears to respond to Jean-François Lyotard’s formulation of Postmodernity: that whether something is true has less relevance than whether it is useful.

In 1979 Lyotard described the Postmodern Condition as a change in “the status of knowledge” brought about by new forms of techno-scientific and techno-economic organisation. If a student could be taught effectively by a machine, rather than by another human, then the most important thing we could give the next generation was what he called “elementary training in informatics and telematics”. In other words, as long as our students are computer literate, “pedagogy would not necessarily suffer”. [5] The next passage – where Lyotard marks the Postmodern turn from the true to the useful – became one of the book’s most widely quoted, and it is worth repeating here at some length:

It is only in the context of the grand narratives of legitimation – the life of the spirit and/or the emancipation of humanity – that the partial replacement of teachers by machines may seem inadequate or even intolerable. But it is probable that these narratives are already no longer the principal driving force behind interest in acquiring knowledge. [6]

Here I want to pause, to set in play at least three elements from Lyotard’s text that colour this paper. Firstly, the historical confluence between technocapitalism and the era now considered ‘postmodern’. Secondly, the association of the ‘grand narrative’ with modern, and pre-modern, conditions of knowledge. And thirdly, the idea that the relationship between the human and the machine – or computer, or software – is generally one-sided: i.e. we may shy away from the idea of leaving the responsibility of our children’s education to a machine, but Lyotard’s position presumes that since the machine was created and programmed by humans, it will necessarily be understandable, and thus controllable, by humans.

Today, Lyotard’s vision of an informatically literate populace has more or less come true. Of course we do not completely understand the intimate workings of all our devices or the software that runs them, but the majority of the world population has some form of regular relationship with systems simulated on silicon. And as Lyotard himself made clear, the uptake of technocapitalism, and therefore of the devices and systems it propagates, is piecemeal and difficult to predict or trace. At the same time as Google’s fleet of self-driving motor vehicles is let loose on Californian state highways, in parts of sub-Saharan Africa mobile phones designed 10 or more years ago are allowing farming communities to aggregate their produce into quantities with greater potential to turn a profit on the world market. As Brian Massumi remarks, network technology allows us the possibility of “bringing to full expression a prehistory of the human”, a “worlding of the human” that marks the “becoming-planetary” of the body itself. [7] This “worlding of the human” represents what Edmund Berger argues is the death of the Postmodern condition itself:

[T]he largest bankruptcy of Postmodernism is that the grand narrative of human mastery over the cosmos was never unmoored and knocked from its pulpit. Instead of making the locus of this mastery large aggregates of individuals and institutions – class formations, the state, religion, etc. – it simply has shifted the discourse towards the individual his or herself, promising them a modular dreamworld for their participation… [8]

Algorithmic narratives appear to continue this trend. They are piecemeal, tending to feed back users’ dreams, wants and desires through carefully aggregated, designed and packaged narratives for individual ‘use’. Theirs is a world not of increasing connectivity and understanding between entities, but of a network worlded to each individual’s data-shadow. This situation is reminiscent of the problem Eli Pariser calls the ‘filter bubble’, or the ‘you loop’: a prevalent outcome of social media platforms tweaked and personalised by algorithms to echo back at the user exactly the kind of thing they want to hear. As algorithms develop in complexity, the stories they tell us about the vast sea of data will tend to become more and more enamouring, more and more palatable (a toy version of this loop is sketched below).
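To make that loop concrete, here is a minimal sketch in Python of the kind of ‘you loop’ Pariser describes. It is an illustration built on invented assumptions – the field names (“topics”, “shares”), the weights and the scoring rule are all mine, not the code of any actual platform – though it echoes the Prismatic-style blend of popularity and personal preference quoted earlier:

from collections import Counter

def rank_feed(stories, click_history, w_popularity=0.3, w_echo=0.7):
    """Toy 'you loop': order stories by overall popularity plus their
    overlap with topics this user has already clicked on."""
    # Tally the topics of everything the user has engaged with so far.
    seen = Counter(t for story in click_history for t in story["topics"])
    def score(story):
        echo = sum(seen[t] for t in story["topics"])  # how familiar it feels
        return w_popularity * story["shares"] + w_echo * echo
    return sorted(stories, key=score, reverse=True)

Because each click is appended to click_history, the echo term compounds on every pass: whatever the user has already read is weighted more heavily next time, and the feed tightens around their data-shadow.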
Like some vast synthetic evolutionary experiment, those algorithms that devise narratives users dislike will tend to be killed off in the feedback loop, in favour of other algorithms whose turn of phrase, or ability to stoke our egos, is more pronounced. For instance, Narrative Science’s early algorithms for creating little-league narratives tended to focus on the victors of each game. What Narrative Science found is that parents were more interested in hearing about their own children, the tiny ups and downs that made the game significant to them. So the algorithms were tweaked in response (a toy caricature of this selection dynamic is sketched below). Again, to quote Kristian Hammond:

These are narratives generated by systems that understand data, that give us information to support the decisions we need to make about tomorrow. [9]

Whilst we can program software to translate the informational nuances of a baseball game, or internet social trends, into human-palatable narratives, larger social, economic and environmental events also tend to get pushed through an algorithmic meat grinder to make them more palatable. The ‘tomorrow’ that Hammond claims his company can help us prepare for is one that, presumably, companies like Narrative Science and Prismatic will play an ever larger part in realising. In her recently published essay ‘Crisis and the Temporality of Networks’, Wendy Chun reminds us of the difference between the user and the agent in the machinic assemblage:

Celebrations of an all powerful user/agent – ‘you’ as the network, ‘you’ as the producer – counteract concerns over code as law as police by positing ‘you’ as the sovereign subject, ‘you’ as the decider. An agent however, is one who does the actual labor, hence agent is one who acts on behalf of another. On networks, the agent would seem to be technology, rather than the users or programmers who authorize actions through their commands and clicks. [10]

In order to unpack Chun’s proposition here we need only look at two of the most powerful and impactful algorithms of the last ten years of the web: firstly, Amazon’s recommendation system, which I assume you have all interacted with at some point; and secondly, Facebook’s news feed algorithm, which ranks and sorts the posts on your personalised stream. Both of these algorithms rely on a community of user interactions to establish a hierarchy of products, or posts, based on popularity. Both also function in response to users’ past activity, and both, of course, have been tweaked and altered over time by the design and programming teams of the respective companies. As we are all no doubt aware, one of the most significant driving principles behind these extraordinarily successful pieces of code is capitalism itself: the drive for profit, and the bearing that drive has on whether a company, service or product succeeds or fails. Wendy Chun’s reminder that those who carry out an action, who program and click, are not the agents here should give us solace. We are positioned as sovereign subjects over our data because that idea is beneficial to the propagation of the ‘product’. Whether we are told how well our child has done at baseball, or what particular kinds of news stories we might like, personally, to read right now, it is to the benefit of technocapitalism that those narratives are positive, palatable and uncompromising.
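The ‘synthetic evolutionary experiment’ described above can be caricatured in a few lines of Python. To be clear, this is a hedged sketch under invented assumptions – the template names, weights and decay rate are hypothetical, not Narrative Science’s actual method: narrative templates are selected in proportion to the engagement they have earned, so variants that readers ignore gradually stop being chosen at all.

import random

def pick_template(engagement):
    """Choose a narrative template in proportion to its past engagement."""
    templates = list(engagement)
    return random.choices(templates, weights=[engagement[t] for t in templates])[0]

def feedback(engagement, used, liked, reward=1.0, decay=0.9):
    """Selection step: every template's weight decays a little, and only
    templates that readers responded to are topped back up."""
    for t in engagement:
        engagement[t] *= decay
    if liked:
        engagement[used] += reward

# Hypothetical little-league styles: parents' clicks slowly kill off the
# victor-centred template in favour of one centred on their own child.
engagement = {"victor_centred": 1.0, "own_child_centred": 1.0}

Run over thousands of write-ups, no engineer has to decide which story parents want to hear: the clicks decide, and the palatable variant survives.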
However the aggregation and dissemination of big data affects our lives over the coming years, the likelihood is that at the surface – on our screens and ubiquitous handheld devices – everything will seem rosy, comfortable, and suited to the ‘needs’ and ‘use’ of each sovereign subject.

TtW15 #A7 @npseaver @nd_kane @s010n @smwat pic.twitter.com/BjJndzaLz1

— Daniel Rourke (@therourke) April 17, 2015

So to finish I just want to gesture towards a much, much bigger debate that I think we need to have about big data, technocapitalism and its algorithmic agents. To do this I just want to read a short paragraph which, as far as I know, was not written by an algorithm:

Surface temperature is projected to rise over the 21st century under all assessed emission scenarios. It is very likely that heat waves will occur more often and last longer, and that extreme precipitation events will become more intense and frequent in many regions. The ocean will continue to warm and acidify, and global mean sea level to rise. [11]

This is from a document entitled ‘Synthesis Report for Policy Makers’, drafted by the Intergovernmental Panel on Climate Change – another organisation that relies on a transnational network of computers, sensors and programs capable of modelling atmospheric, chemical and wider environmental processes to collate data on human environmental impact. Ironically then, perhaps the most significant tool we have to understand the world, at present, is big data. Never before has humankind had so much information to help us make decisions, and help us enact changes on our world, our society and our selves. But the problem is that some of the stories big data has to tell us are too big to be narrated; they are just too big to be palatable. To quote Edmund Berger again:

For these reasons we can say that the proper end of postmodernism comes in the gradual realization of the Anthropocene: it promises the death of the narrative of human mastery, while erecting an even grander narrative. If modernism was about victory of human history, and postmodernism was the end of history, the Anthropocene means that we are no longer in a “historical age but also a geological one. Or better: we are no longer to think history as exclusively human…” [12]

I would argue that the ‘grand narratives of legitimation’ Lyotard claimed we left behind in the move to Postmodernity will need to return in some way if we are to manage big data in a meaningful way. Crises such as catastrophic climate change will never be made palatable in the feedback between users, programmers and technocapitalism. Instead, we need to revisit Lyotard’s distinction between the true and the useful. Rather than ask how we can make big data useful for us, we need to ask what grand story we want that data to tell us.

References

[1] Source: www.narrativescience.com, accessed 15/10/14.
[2] Steven Levy, “Can an Algorithm Write a Better News Story Than a Human Reporter?,” WIRED, April 24, 2012, http://www.wired.com/2012/04/can-an-algorithm-write-a-better-news-story-than-a-human-reporter/.
[3] “Steven Poole – On Algorithms,” Aeon Magazine, accessed May 8, 2015, http://aeon.co/magazine/technology/steven-poole-can-algorithms-ever-take-over-from-humans/.
[4] Ibid.
[5] Jean-François Lyotard, The Postmodern Condition: A Report on Knowledge, repr., Theory and History of Literature 10 (Manchester: Manchester University Press, 1992), 50.
[6] Ibid., 51.
[7] Brian Massumi, Parables for the Virtual: Movement, Affect, Sensation (Durham: Duke University Press, 2002), 128.
[8] Edmund Berger, “The Anthropocene and the End of Postmodernism,” Synthetic Zero, n.d., http://syntheticzero.net/2015/04/01/the-anthropocene-and-the-end-of-postmodernism/.
[9] Source: www.narrativescience.com, accessed 15/10/14.
[10] Wendy Chun, “Crisis and the Temporality of Networks,” in The Nonhuman Turn, ed. Richard Grusin (Minneapolis: University of Minnesota Press, 2015), 154.
[11] Rajendra K. Pachauri et al., “Climate Change 2014: Synthesis Report. Contribution of Working Groups I, II and III to the Fifth Assessment Report of the Intergovernmental Panel on Climate Change,” 2014, http://epic.awi.de/37530/.
[12] Berger, “The Anthropocene and the End of Postmodernism.”

]]>
Fri, 08 May 2015 04:02:51 -0700 http://machinemachine.net/portfolio/paper-at-theorizing-the-web-synthetic-subjects/
<![CDATA[Death Is Not the End (Long Live theory!)]]> http://nplusonemag.com/death-not-end

Was theory a gigantic hoax? On the contrary. It was the only salvation, for a twenty-year period, from two colossal abdications by American thinkers and writers. From about 1975 to 1995, through a historical accident, a lot of American thinking and mental living got done by people who were French, and by young Americans who followed the French.

The two grand abdications: one occurred in academic philosophy departments, the other in American fiction. In philosophy, from the 1930s on, a revolutionary group had been fighting inside universities to overcome the “tradition.” This insurgency, at first called “logical positivism” or “logical empiricism,” then simply “analytic philosophy,” was the best thing going. The original idea was that logical analysis of language would show which philosophical problems might be solved, and which eradicated because they were not phraseable in clear, logical language.

]]>
Thu, 18 Aug 2011 01:48:43 -0700 http://nplusonemag.com/death-not-end
<![CDATA[Thoughts on art practice PhDs]]> http://www.fuel.rca.ac.uk/articles/thoughts-on-art-practice-phds

“Knowledge is and will be produced in order to be sold, it is and will be consumed in order to be valorised in a new production: in both cases, the goal is exchange.” – Jean-François Lyotard, The Postmodern Condition

What are artists to gain from taking a PhD? How does the mantle of ‘artistic research’ enable art objects and those invested in them? And where does art’s autonomy reside when its criticality comes from within an academic institution?

Over the last 20 years art has eased its way into academia. Past the door of the artist’s studio and up the back stairs it tiptoed until, in a very bold move, it seated itself in the commissioner’s chair. Where once art reacted against academies from the outside, art, and the artists who make it, now work from within the institution. Artists interested in pursuing a doctoral degree will have heard time and again about ‘the critical function of art’. Indeed, many theorists would insist on art being defined by this state of opposition (the ‘avant-garde’). But to understand the potential of art today it becomes impossible to separate it from the academic institutions that use its name to label their distinctive, often daring, new departments – Goldsmiths’ Art Writing MFA and the RCA’s Critical Writing in Art & Design being two of the freshest, some might say hippest, examples.

To begin producing ‘new knowledge’, PhD researchers often need to pursue contradictory goals. A strong research question poses not the trajectory to a definitive answer, but a principle by which the researcher may begin to generate knowledge. This becomes especially slippery when that ‘knowledge’ is woven into an artistic practice, or when the art objects created by that practice are assumed to qualify the research. How does one invest research in projects that have yet to be realised?

Practice-based PhDs hide another stumbling block, usually one based on the expectations of the artist: the belief that time invested in a research degree should improve the quality of practice, as well as strengthen one’s grasp of theory. The distinction between the practical and theoretical components of a PhD can vary wildly, and although on paper they each account for 50% of the final mark, severing them into definite halves can be an unwieldy, often impossible, task. In the RCA’s Department of Communication Art & Design, for instance, projects regularly emerge that blur the line between the written and ‘practical’ components of research. A recent edition of the critical journal Texte zur Kunst, focused on artistic research, declared: “Philosophy and art share the conviction that cognition requires a material form.” A practice-based PhD may have a smaller word count than its non-practical equivalent, but as final exhibitions are documented and literature reviews are spell-checked, the boundary between art object and critical reflection will, hopefully, have dissolved into a single, successful conglomeration.

As a practice-based researcher myself, with two years of the academy under my belt, I’ve found that the primary method of answering these concerns is to reflect them back at the institution. Research does not take place in isolation. As with any treasured job, it is the people that make a PhD worth undertaking. If you are lucky – and let’s admit it, few things are harder to predict than luck – the artists and academics that make up your department will be driven by desires similar to your own. Of course, I could spend the rest of this short article on the restrictions of labelling yourself a graduate of the RCA, Goldsmiths or the Slade. But reflecting the conditions of research back at the institutions that produce them comes closer to addressing what really makes academies function: exchange. To paraphrase John F. Kennedy: ‘Ask not what academia can do for you – ask what you can do for your academia.’ Productive exchange begins by giving your all, whilst always expecting those around you to do the same.

In terms of the market, artistic practice often inhabits an obscure space, cut off from the concerns of art galleries, of buyers, sellers and the aesthetically motivated public. The main benefit of taking up a practice-based research position is exposing one’s practice to the eyes of others. But this exposure always works both ways. Jean-François Lyotard writes, in The Postmodern Condition, “Knowledge is and will be produced in order to be sold, it is and will be consumed in order to be valorised in a new production: in both cases, the goal is exchange.” Criticality – the enactment of research – begins in the process of exchange, a goal which, if Lyotard is to be believed, should be held in higher esteem than the art market.
Taking a practice-based PhD means investing time and knowledge with other practitioners, often other artists who, having undertaken their research years before, now enact their modes of exchange as tutors, professors and PhD supervisors. Research degrees are not always the best way to fortify the foundations of an artist’s practice. Indeed, many would argue that the very principle of artistic practice within the academy is to rock those foundations, even to raze certain principles of practice to the ground. But when PhD researchers are supported to develop and sustain their thought from within their art, it can often be the supervisor or established academic artist who has to rethink their assumptions, rather than the other way around. Personal exploration, issuing from practice, becomes valid as PhD research when its significance is a significance shared. A significance exchanged is a significance enhanced.

]]>
Thu, 21 Jul 2011 02:58:00 -0700 http://www.fuel.rca.ac.uk/articles/thoughts-on-art-practice-phds