MachineMachine /stream - tagged with big-data https://machinemachine.net/stream/feed

<![CDATA[The New Observatory at FACT]]> http://www.furtherfield.org/features/reviews/new-observatory-fact

The New Observatory opened at FACT, Liverpool on Thursday 22nd of June and runs until October 1st. The exhibition, curated by Hannah Redler Hawes and Sam Skinner in collaboration with The Open Data Institute, transforms the FACT galleries into a playground of micro-observatories, fusing art with data science in an attempt to expand the reach of both. Reflecting on the democratisation of tools which allow new ways of sensing and analysing, The New Observatory asks visitors to reconsider raw, taciturn ‘data’ through a variety of vibrant, surprising, and often ingenious artistic affects and interactions. What does it mean for us to become observers of ourselves? What role does the imagination have to play in the construction of a reality accessed via data infrastructures, algorithms, numbers, and mobile sensors? And how can the model of the observatory help us better understand how the non-human world already measures and aggregates information about itself?

In its simplest form an observatory is merely an enduring location from which to view terrestrial or celestial phenomena. Stone circles, such as Stonehenge in the UK, were simple but powerful measuring tools, aligned to mark the arc of the sun, the moon or certain star systems as they careered across ancient skies. Today we observe the world with less monumental, but far more powerful, sensing tools. And the site of the observatory, once rooted to specific locations on an ever-spinning Earth, has become as mobile and malleable as the clouds which once impeded our ancestors’ view of the summer solstice. The New Observatory considers how ubiquitous, and increasingly invisible, technologies of observation have impacted the scale at which we sense, measure, and predict.

Citizen Sense, Dustbox (2016 – 2017). The New Observatory at FACT, 2017. Photo by Gareth Jones.

The Citizen Sense research group, led by Jennifer Gabrys, presents Dustbox as part of the show: a project started in 2016 to give residents of Deptford, South London, the chance to measure air pollution in their neighbourhoods. Residents borrowed Dustboxes – beautiful black ceramic sensor boxes, shaped like air-pollutant particles blown up to macro scale – from their local library. By visiting citizensense.net participants could watch their personal data aggregated and streamed with others’ to create a real-time data map of local air particulates. The collapse of the micro and the macro lends the project a surrealist quality. As thousands of data points coalesce to produce a shared vision of the invisible pollutants all around us, the pleasing dimples, spikes and impressions of each ceramic Dustbox give that infinitesimal world a cartoonish charisma. With each Dustbox encased in a glass display cabinet as part of the show, my desire to stroke and caress them was strong. Like the protagonist in Richard Matheson’s 1956 novel The Shrinking Man, once the scale of the microscopic world was given a form my human body could empathise with, I wanted nothing more than to descend into that space, becoming a pollutant myself caught on Deptford winds.

Moving from the microscopic to the scale of living systems, Julie Freeman’s 2015/2016 project A Selfless Society transforms the patterns of a naked mole-rat colony into an abstract minimalist animation projected in the gallery. Naked mole-rats are one of only two species of ‘eusocial’ mammal, living in shared underground burrows that distantly echo the patterns of other ‘superorganism’ colonies such as ants or bees.
To be eusocial is to live and work for a single Queen, whose sole responsibility it is to breed and give birth on behalf of the colony. For A Selfless Society, Freeman attached Radio Frequency ID (RFID) chips to each non-breeding mole-rat, allowing their interactions to be logged as the colony went about its slippery subterranean business. The result is a meditation on the ‘missing’ data point: the Queen, whose entire existence is bolstered and maintained by the altruistic behaviours of her wrinkly, buck-toothed family. The work is accompanied by a series of naked mole-rat profile shots, in which the eyes of each creature have been redacted with a thick black line. Freeman’s playful anonymising gesture gives each mole-rat its due, reminding us that behind every model we impose on our data there exist countless untold subjects, bound to the bodies that bring the larger story to life.
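Under its skin, a work like A Selfless Society rests on a plain data structure: each RFID read is a (timestamp, tag, antenna) record, and an ‘interaction’ is simply two tags appearing at the same antenna within a short window. A minimal sketch of that aggregation step in Python – the field names, five-second window and sample reads are my assumptions, not details of Freeman’s actual system:

from collections import Counter
from itertools import combinations

# Hypothetical read log: (seconds, mole-rat tag, antenna location).
# The breeding Queen carries no chip, so she can only ever appear
# here as an absence -- the 'missing' data point the work circles.
reads = [
    (0.0, "rat_03", "tunnel_A"),
    (2.1, "rat_07", "tunnel_A"),
    (9.5, "rat_03", "nest"),
    (11.0, "rat_12", "nest"),
    (40.2, "rat_07", "food_store"),
]

WINDOW = 5.0  # seconds: reads this close at one antenna count as an interaction

interactions = Counter()
for (t1, a, ant1), (t2, b, ant2) in combinations(sorted(reads), 2):
    if ant1 == ant2 and a != b and t2 - t1 <= WINDOW:
        interactions[frozenset((a, b))] += 1

for pair, n in interactions.items():
    print(sorted(pair), n)

From co-occurrence counts like these, an animation such as Freeman’s can be driven directly, each pair’s tally modulating the movement of an abstract form.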

James Coupe, A Machine for Living (2017). The New Observatory at FACT, 2017. Photo by Gareth Jones.

Natasha Caruana’s works in the exhibition centre on the human phenomenon of love, as understood through social datasets related to marriage and divorce. For her work Divorce Index, Caruana translated data on a series of societal ‘pressures’ correlated with failed marriages – access to healthcare, gambling, unemployment – into a choreographed dance routine. To watch a video of the dance, enacted by Caruana and her husband, viewers must walk or stare through another work, Curtain of Broken Dreams, an interlinked collection of 1,560 pawned or discarded wedding rings. Both works come out of a larger project the artist undertook in the lead-up to the first anniversary of her own marriage. Having discovered that divorce rates were highest in the coastal towns of the UK, Caruana toured the country, staying in a series of Airbnb house shares with men who had recently gone through a divorce. Her journey was plotted from dry statistical data related to one of the most significant and personal of human experiences, a neat juxtaposition that lends the work a surreal humour, without sentimentalising the experiences of either Caruana or the divorced men she came into contact with.

Jeronimo Voss, Inverted Night Sky (2016). The New Observatory at FACT, 2017. Photo by Gareth Jones.

The New Observatory features many screens, across which data visualisations bloom, or cameras look upwards, outwards or inwards. Working with the Libre Space Foundation, artist Kei Kreutler installed an open networked satellite ground station on the roof of FACT, allowing visitors to the gallery a live view of the thousands of satellites that career across the heavens. For his Inverted Night Sky project, artist Jeronimo Voss presents a concave domed projection space, within which the workings of the Anton Pannekoek Institute for Astronomy teeter and glide. But perhaps the most striking, and prominent, use of screens is James Coupe’s work A Machine for Living: a four-storey wooden watchtower, dotted on all sides with widescreen displays wired into the topmost tower section, where a bank of computer servers computes the goings-on displayed to visitors. The installation is a monument to members of the public who work for Mechanical Turk, a crowdsourcing system run by corporate giant Amazon that connects an invisible workforce of online, human minions to individuals and businesses who can employ them to carry out their bidding. A Machine for Living is the result of Coupe’s playful subversion of the system, in which he asked mTurk workers to observe and reflect on elements of their own daily lives. On the screens winding up the structure we watch mTurk workers narrating their dance moves as they jiggle on the sofa; we see workers stretching and labelling their yoga positions, or running through the meticulous steps that make up the algorithm of their dinner routine. The screens switch between users so regularly, and the tasks they carry out are so diverse and often surreal, that the installation acts as a miniature exhibition within an exhibition: a series of digital peepholes into the lives of a previously invisible workforce, their labour drafted into the manufacture of an observatory of observations, an artwork homage to the voyeurism that perpetuates so much of 21st-century ‘online’ culture.
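For readers who have never seen the other side of Mechanical Turk: tasks (‘HITs’) are posted programmatically and picked up by workers for micro-payments. A hedged sketch of how a Coupe-style self-observation task might be posted with Amazon’s boto3 MTurk client – the question text, reward and sandbox endpoint are illustrative assumptions, not details of Coupe’s actual process:

import boto3

# Sandbox endpoint, so experimenting costs nothing (assumes AWS credentials).
mturk = boto3.client(
    "mturk",
    region_name="us-east-1",
    endpoint_url="https://mturk-requester-sandbox.us-east-1.amazonaws.com",
)

# QuestionForm XML; schema URL quoted from memory -- verify against current docs.
question_xml = """<QuestionForm xmlns="http://mechanicalturk.amazonaws.com/AWSMechanicalTurkDataSchemas/2005-10-01/QuestionForm.xsd">
  <Question>
    <QuestionIdentifier>self_observation</QuestionIdentifier>
    <QuestionContent><Text>Film yourself performing a daily routine, narrating each step aloud.</Text></QuestionContent>
    <AnswerSpecification><FreeTextAnswer/></AnswerSpecification>
  </Question>
</QuestionForm>"""

hit = mturk.create_hit(
    Title="Observe and narrate one of your own daily routines",
    Description="A reflective micro-task: treat your own behaviour as data.",
    Reward="0.75",                     # US dollars, passed as a string
    MaxAssignments=10,                 # ten workers, ten windows in the tower
    AssignmentDurationInSeconds=1800,
    LifetimeInSeconds=86400,
    Question=question_xml,
)
print(hit["HIT"]["HITId"])

The asymmetry of the interface is the point: from the requester’s side a worker is just a parameter, which is precisely the invisibility Coupe’s tower inverts.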

The New Observatory at FACT, 2017. Learning Space. Photo by Gareth Jones.

The New Observatory is a rich and varied exhibition that calls on its visitors to reflect on, and interact more creatively with, the data that increasingly underpins and permeates our lives. The exhibition opened at FACT, Liverpool on Thursday 22nd of June and runs until October 1st.

]]>
Thu, 13 Jul 2017 07:28:55 -0700 http://www.furtherfield.org/features/reviews/new-observatory-fact
<![CDATA[Tate Series: Digital Thresholds: from Information to Agency (public event)]]> http://www.tate.org.uk/whats-on/tate-modern/courses-and-workshops/digital-thresholds-information-agency

I will deliver this 4-week public series at The Tate Modern throughout July 2016. Sign up! Thanks to Viktoria Ivanova for working with me to achieve this.

Data is the lifeblood of today’s economic and social systems. Drones, satellites and CCTV cameras capture digital images covertly, while the smartphones we carry feed data packets into the cloud, fought over by corporations and governments. How are we to make sense of all this information? Who is to police and distribute it? And what kind of new uses can art put it to? This four-week series led by writer/artist Daniel Rourke will explore the politics and potential of big data through the lens of contemporary art and the social sciences. Participants will assess the impact the digital revolution has had on notions of value attached to the invisible, the territorial and the tangible. We will look at artists and art activists who tackle the conditions of resolution, algorithmic governance, digital colonialism and world-making in their work, with a focus on key news events yet to unfold in 2016.

Session 1 – Hito Steyerl: Poor Image Politics
In this first session we will examine the politics of image and data resolution, with special attention to the work of artist Hito Steyerl represented in the Tate Collection. How do poor images influence the significance and value of the events they depict? What can online cultures that fetishise poor quality teach us about the economics and autonomy of information? Is being a low-resolution event in a field of high resolutions an empowering proposition?

Session 2 – Morehshin Allahyari: Decolonising the Digital Archive
3D scanning and printing technologies are becoming common tools for archaeologists, archivists and historians. We will examine the work of art activists who question these technologies, connecting the dots from terroristic networks, through the price of crude oil, to artefacts being digitally colonised by Western institutions. Artist Morehshin Allahyari will join us via Skype to talk about Material Speculation: ISIS – a series of artefacts destroyed by ISIS in 2015, which Allahyari then ‘recreated’ using digital tools and techniques.

Session 3 – Mishka Henner: Big Data and World Making
In this session we will explore the work of artists who channel surveillance and big data into the poetic re-making of worlds. We will compare and contrast nefarious ‘deep web’ marketplaces with ‘real world’ auction houses selling artworks to a global elite. Artist Mishka Henner will join us via Skype to talk about artistic appropriation, subversion and the importance of provocation.

Session 4 – Forensic Architecture: Blurring the Borders between Forensics, Law and Art
The Forensic Architecture project uses analytical methods to reconstruct scenes of war and violence inscribed within spatial artefacts and environments. In this session we will look at their work to read and mobilise ‘ambient’ information gathered from satellites, mobile phones and CCTV/news footage. How are technical thresholds implicated in acts of war, terrorism and atrocity, and how can they be mobilised to resist and deter systemic violence?

]]>
Tue, 17 May 2016 07:23:50 -0700 http://www.tate.org.uk/whats-on/tate-modern/courses-and-workshops/digital-thresholds-information-agency
<![CDATA[Algorithmic Narratives and Synthetic Subjects (paper)]]> http://machinemachine.net/portfolio/paper-at-theorizing-the-web-synthetic-subjects/

This was the paper I delivered at The Theorizing the Web Conference, New York, 18th April 2015. This video of the paper begins part way in, and misses out some important stuff. I urge you to watch the other, superb, papers on my panel by Natalie Kane, Solon Barocas, and Nick Seaver. A better video is forthcoming. I posted this up partly in response to this post at Wired about the UK election, Facebook’s echo-chamber effect, and other implications well worth reading into.

Data-churning algorithms are integral to our social and economic networks. Rather than replace humans, these programs are built to work with us, allowing the distinct strengths of human and computational intelligence to coalesce. As we are submerged in the era of ‘big data’, these systems have become more and more common, concentrating every terabyte of raw data into meaningful arrangements more easily digestible by high-level human reasoning.

A company calling themselves ‘Narrative Science’, based in Chicago, have established a profitable business model on this relationship. Their slogan, ‘Tell the Stories Hidden in Your Data’, [1] is aimed at companies drowning in spreadsheets of cold information: a promise that Narrative Science can ‘humanise’ their databases with very little human input. Kristian Hammond, Chief Technology Officer of the company, claims that within 15 years over 90% of all news stories will be written by algorithms. [2] But rather than replacing the jobs that human journalists now undertake, Hammond claims the vast majority of this ‘robonews’ output will report on data currently not covered by traditional news outlets. One family-friendly example is the coverage of little-league baseball games. Very few news organisations have the resources, or the desire, to hire a swathe of human journalists to write up every little-league game. Instead, Narrative Science offer leagues, parents and their children a miniature summary of each game, gleaned from match statistics uploaded by diligent little-league attendees and written up in a variety of journalistic styles.

In their 2013 book Big Data, Viktor Mayer-Schönberger, Oxford University Professor of internet governance, and Kenneth Cukier, ‘data editor’ of The Economist, tell us excitedly about another data-aggregation company, Prismatic, who:

…rank content from the web on the basis of text analysis, user preferences, social network-popularity, and big-data analysis. [3]

According to Mayer-Schönberger and Cukier this makes Prismatic able ‘to tell the world what it ought to pay attention to better than the editors of the New York Times’. [4] A situation, Steven Poole reminds us, we can little argue with, so long as we agree that popularity underlies everything that is culturally valuable.

Data is now the lifeblood of technocapitalism: a vast, endless influx of information flowing in from a growing universe of networked, internet-connected devices. As many of the papers at Theorizing the Web attest, our environment is increasingly founded on systems whose job it is to mediate our relationship with this data. Technocapitalism still appears to respond to Jean-François Lyotard’s formulation of Postmodernity: whether something is true has less relevance than whether it is useful. In 1979 Lyotard described the Postmodern Condition as a change in “the status of knowledge” brought about by new forms of techno-scientific and techno-economic organisation. If a student could be taught effectively by a machine, rather than by another human, then the most important thing we could give the next generation was what he called “elementary training in informatics and telematics.” In other words, as long as our students are computer literate, “pedagogy would not necessarily suffer”. [5] The next passage – where Lyotard marks the Postmodern turn from the true to the useful – became one of the book’s most widely quoted, and it is worth repeating here at some length:

It is only in the context of the grand narratives of legitimation – the life of the spirit and/or the emancipation of humanity – that the partial replacement of teachers by machines may seem inadequate or even intolerable. But it is probable that these narratives are already no longer the principal driving force behind interest in acquiring knowledge. [6]

Here I want to pause to set in play at least three elements from Lyotard’s text that colour this paper. Firstly, the historical confluence between technocapitalism and the era now considered ‘postmodern’. Secondly, the association of the grand narrative with modern, and pre-modern, conditions of knowledge. And thirdly, the idea that the relationship between the human and the machine – or computer, or software – is generally one-sided: we may shy away from the idea of leaving the responsibility of our children’s education to a machine, but Lyotard’s position presumes that, since the machine was created and programmed by humans, it will necessarily be understandable, and thus controllable, by humans.

Today, Lyotard’s vision of an informatically literate populace has more or less come true. Of course we do not completely understand the intimate workings of all our devices or the software that runs them, but the majority of the world’s population has some form of regular relationship with systems simulated on silicon. And as Lyotard himself made clear, the uptake of technocapitalism, and therefore of the devices and systems it propagates, is piecemeal and difficult to predict or trace. At the same time as Google’s fleet of self-driving vehicles is let loose on Californian state highways, in parts of sub-Saharan Africa mobile phones designed ten or more years ago are allowing farming communities to aggregate their produce into quantities with greater potential to make a profit on the world market. As Brian Massumi remarks, network technology allows us the possibility of “bringing to full expression a prehistory of the human”, a “worlding of the human” that marks the “becoming-planetary” of the body itself. [7] This “worlding of the human” represents what Edmund Berger argues is the death of the Postmodern condition itself:

[T]he largest bankruptcy of Postmodernism is that the grand narrative of human mastery over the cosmos was never unmoored and knocked from its pulpit. Instead of making the locus of this mastery large aggregates of individuals and institutions – class formations, the state, religion, etc. – it simply has shifted the discourse towards the individual his or herself, promising them a modular dreamworld for their participation… [8]

Algorithmic narratives appear to continue this trend. They are piecemeal, tending to feed back users’ dreams, wants and desires through carefully aggregated, designed and packaged narratives for individual ‘use’: a world not of increasing connectivity and understanding between entities, but a network worlded to each individual’s data-shadow. The situation is reminiscent of what Eli Pariser calls the ‘filter bubble’, or the ‘you loop’: social media platforms tweaked and personalised by algorithms to echo back at the user exactly the kind of thing they want to hear. As algorithms develop in complexity, the stories they tell us about the vast sea of data will tend to become more and more enamoring, more and more palatable.
Like some vast synthetic evolutionary experiment, algorithms that devise narratives users dislike will tend to be killed off in the feedback loop, in favour of algorithms whose turn of phrase, or ability to stoke our egos, is more pronounced. For instance, Narrative Science’s early algorithms for little-league narratives tended to focus on the victors of each game. What Narrative Science found is that parents were more interested in hearing about their own children, the tiny ups and downs that made the game significant to them. So the algorithms were tweaked in response. Again, to quote Narrative Science’s chief scientist Kris Hammond:

These are narratives generated by systems that understand data, that give us information to support the decisions we need to make about tomorrow. [9]

Whilst we can program software to translate the informational nuances of a baseball game, or of internet social trends, into human-palatable narratives, larger social, economic and environmental events also tend to get pushed through this algorithmic meatgrinder to make them more palatable. The ‘tomorrow’ that Hammond claims his company can help us prepare for is one that, presumably, companies like Narrative Science and Prismatic will play an ever larger part in realising.

In her recently published essay on crisis and the temporality of networks, Wendy Chun reminds us of the difference between the user and the agent in the machinic assemblage:

Celebrations of an all powerful user/agent – ‘you’ as the network, ‘you’ as the producer – counteract concerns over code as law as police by positing ‘you’ as the sovereign subject, ‘you’ as the decider. An agent, however, is one who does the actual labor; hence an agent is one who acts on behalf of another. On networks, the agent would seem to be technology, rather than the users or programmers who authorize actions through their commands and clicks. [10]

To unpack Chun’s proposition we need only look at two of the most powerful and impactful algorithms of the last ten years of the web: Amazon’s recommendation system, which I assume you have all interacted with at some point, and Facebook’s news feed algorithm, which ranks and sorts the posts on your personalised stream. Both algorithms rely on a community of user interactions to establish a hierarchy of products, or posts, based on popularity. Both function in response to users’ past activity, and both, of course, have been tweaked and altered over time by the design and programming teams of the respective companies. As we are all no doubt aware, one of the most significant driving principles behind these extraordinarily successful pieces of code is capitalism itself: the drive for profit, and the bearing that has on distinguishing a successful company, service or product from a failing one. Wendy Chun’s reminder that those who carry out an action – who program and click – are not the agents here should give us solace. We are positioned as sovereign subjects over our data because that idea is beneficial to the propagation of the ‘product’. Whether we are told how well our child has done at baseball, or what particular kinds of news stories we might like, personally, to read right now, it is to the benefit of technocapitalism that those narratives are positive, palatable and uncompromising.
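A toy version of that evolutionary tweak is easy to write down. In the Python sketch below, two hypothetical story ‘angles’ for a little-league recap compete for readers, and a multiplicative feedback rule does the selecting; the templates, weights and update factors are all invented for illustration, not Narrative Science’s proprietary method:

import random

# Two competing story 'angles'; reader feedback decides which survives.
ANGLES = {
    "victor":    "The {winner} crushed the {loser} {ws}-{ls} in a dominant team display.",
    "own_child": "{child} had a day to remember, with {hits} hits as the {winner} edged the {loser} {ws}-{ls}.",
}
weights = {"victor": 1.0, "own_child": 1.0}

game = {"winner": "Rockets", "loser": "Comets", "ws": 7, "ls": 4,
        "child": "Sam", "hits": 2}

def tell_story(game):
    # Roulette-wheel selection: well-liked angles get told more often.
    angle = random.choices(list(ANGLES), weights=list(weights.values()))[0]
    return angle, ANGLES[angle].format(**game)

for _ in range(200):
    angle, story = tell_story(game)
    liked = angle == "own_child"      # simulated parent: wants to hear about Sam
    # Multiplicative feedback: liked angles grow, disliked angles decay.
    weights[angle] *= 1.05 if liked else 0.95

print({k: round(v, 2) for k, v in weights.items()})
print(tell_story(game)[1])

After two hundred rounds the ‘victor’ angle has all but died out: the loop converges on whatever stokes the reader, which is the ‘you loop’ in miniature.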
However the aggregation and dissemination of big data affects our lives over the coming years, the likelihood is that at the surface – on our screens and ubiquitous handheld devices – everything will seem rosy, comfortable, and suited to the ‘needs’ and ‘use’ of each sovereign subject.

TtW15 #A7 @npseaver @nd_kane @s010n @smwat pic.twitter.com/BjJndzaLz1 — Daniel Rourke (@therourke) April 17, 2015

So to finish I just want to gesture towards a much, much bigger debate that I think we need to have about big data, technocapitalism and its algorithmic agents. To do this I want to read a short paragraph which, as far as I know, was not written by an algorithm:

Surface temperature is projected to rise over the 21st century under all assessed emission scenarios. It is very likely that heat waves will occur more often and last longer, and that extreme precipitation events will become more intense and frequent in many regions. The ocean will continue to warm and acidify, and global mean sea level to rise. [11]

This is from a document entitled ‘Synthesis Report for Policy Makers’, drafted by The Intergovernmental Panel on Climate Change – another organisation that relies on a transnational network of computers, sensors and programs capable of modelling atmospheric, chemical and wider environmental processes, in order to collate data on human environmental impact. Ironically, then, perhaps the most significant tool we have for understanding the world at present is big data. Never before has humankind had so much information to help us make decisions, and help us enact changes on our world, our society, and our selves. But the problem is that some of the stories big data has to tell us are too big to be narrated; they are just too big to be palatable. To quote Edmund Berger again:

For these reasons we can say that the proper end of postmodernism comes in the gradual realization of the Anthropocene: it promises the death of the narrative of human mastery, while erecting an even grander narrative. If modernism was about victory of human history, and postmodernism was the end of history, the Anthropocene means that we are no longer in a “historical age but also a geological one. Or better: we are no longer to think history as exclusively human…” [12]

I would argue that the ‘grand narratives of legitimation’ Lyotard claimed we left behind in the move to Postmodernity will need to return in some way if we are to manage big data meaningfully. Crises such as catastrophic climate change will never be made palatable in the feedback between users, programmers and technocapitalism. Instead, we need to revisit Lyotard’s distinction between the true and the useful. Rather than ask how we can make big data useful for us, we need to ask what grand story we want that data to tell us.

References

[1] Source: www.narrativescience.com, accessed 15/10/14.
[2] Steven Levy, “Can an Algorithm Write a Better News Story Than a Human Reporter?”, WIRED, April 24, 2012, http://www.wired.com/2012/04/can-an-algorithm-write-a-better-news-story-than-a-human-reporter/.
[3] “Steven Poole – On Algorithms,” Aeon Magazine, accessed May 8, 2015, http://aeon.co/magazine/technology/steven-poole-can-algorithms-ever-take-over-from-humans/.
[4] Ibid.
[5] Jean-François Lyotard, The Postmodern Condition: A Report on Knowledge, repr., Theory and History of Literature 10 (Manchester: Manchester University Press, 1992), 50.
[6] Ibid., 51.
[7] Brian Massumi, Parables for the Virtual: Movement, Affect, Sensation (Duke University Press, 2002), 128.
[8] Edmund Berger, “The Anthropocene and the End of Postmodernism,” Synthetic Zero, n.d., http://syntheticzero.net/2015/04/01/the-anthropocene-and-the-end-of-postmodernism/.
[9] Source: www.narrativescience.com, accessed 15/10/14.
[10] Wendy Chun, “Crisis and the Temporality of Networks,” in The Nonhuman Turn, ed. Richard Grusin (Minneapolis: University of Minnesota Press, 2015), 154.
[11] Rajendra K. Pachauri et al., “Climate Change 2014: Synthesis Report. Contribution of Working Groups I, II and III to the Fifth Assessment Report of the Intergovernmental Panel on Climate Change,” 2014, http://epic.awi.de/37530/.
[12] Berger, “The Anthropocene and the End of Postmodernism.”

]]>
Fri, 08 May 2015 04:02:51 -0700 http://machinemachine.net/portfolio/paper-at-theorizing-the-web-synthetic-subjects/
<![CDATA[The Data Sublime]]> http://thenewinquiry.com/essays/the-data-sublime/

The sublime unknowability of Big Data lets us fall in love with our own domination. I have a memory from childhood, a happy memory — one of complete trust and comfort. It’s dark, and I’m kneeling in the tiny floor area of the back seat of a car, resting my head on the seat.

]]>
Mon, 02 Feb 2015 11:13:53 -0800 http://thenewinquiry.com/essays/the-data-sublime/
<![CDATA[Data as Culture]]> https://furtherfield.org/features/reviews/data-culture#new_tab

For my latest Furtherfield review I wallowed in curator Shiri Shalmy’s ongoing project Data as Culture, examining works by Paolo Cirio and James Bridle that deal explicitly with the concatenation of data. What happens when society is governed by a regime of data about data, increasingly divorced from the symbolic?

In a work commissioned by curator Shiri Shalmy for her ongoing project Data as Culture, artist Paolo Cirio confronts the prerequisites of art in the era of the user. Your Fingerprints on the Artwork are the Artwork Itself [YFOTAATAI] hijacks loopholes, glitches and security flaws in the infrastructure of the world wide web in order to render every passive website user as pure material. In an essay published on a backdrop of recombined raw tracking data, Cirio states:

Data is the raw material of a new industrial, cultural and artistic revolution. It is a powerful substance, yet when displayed as a raw stream of digital material, represented and organised for computational interpretation only, it is mostly inaccessible and incomprehensible. In fact, there isn’t any meaning or value in data per se. It is human activity that gives sense to it. It can be useful, aesthetic or informative, yet it will always be subject to our perception, interpretation and use. It is the duty of the contemporary artist to explore what it really looks like and how it can be altered beyond the common conception.

Even the nondescript use patterns of the Data as Culture website can be figured as an artwork, Cirio seems to be saying, but the art of the work requires an engagement that contradicts the passivity of a mere ‘user’. YFOTAATAI is a perfect accompaniment to Shalmy’s curatorial project, generating questions around security, value and production before any link has been clicked or artwork entertained.

Feeling particularly receptive, I click on James Bridle’s artwork/website A Quiet Disposition and ponder the first hyperlink that surfaces. The link reads “Keanu Reeves”:

“Keanu Reeves” is the name of a person known to the system. Keanu Reeves has been encountered once by the system and is closely associated with Toronto, Enter The Dragon, The Matrix, Surfer and Spacey Dentist.

In 1999 viewers were offered a visual metaphor of ‘The Matrix’: a stream of flickering green signifiers ebbing, like some half-living fungus of binary digits, beneath our apparently solid, Technicolor world. James Bridle’s expansive work A Quiet Disposition [AQD] could be considered an antidote to this millennial cliché, founded on the principle that we are in fact ruled by a third, much more slippery realm of information, superior to both the Technicolor and the digital fungus. Our socio-political, geo-economic, rubber-bullet, blood-and-guts world, as Bridle envisages it, relies on data about data.

Read the rest of this review at Furtherfield.org

]]>
Wed, 01 Oct 2014 07:37:48 -0700 https://furtherfield.org/features/reviews/data-culture#new_tab
<![CDATA[The Philosophy of Data - NYTimes.com]]> http://www.nytimes.com/2013/02/05/opinion/brooks-the-philosophy-of-data.html?_r=1&

]]>
Sat, 23 Feb 2013 03:24:39 -0800 http://www.nytimes.com/2013/02/05/opinion/brooks-the-philosophy-of-data.html?_r=1&
<![CDATA[Sloppy MicroChips: Oh, that’s near enough]]> http://www.economist.com/node/21556087

Letting microchips make a few mistakes here and there could make them much faster and more energy-efficient.

Managing the probability of errors and limiting where they occur can ensure that the errors do not cause any problems. The result of a mathematical calculation, for example, need not always be calculated precisely—an accuracy of two or three decimal places is often enough. Dr Palem offers the analogy of a person about to cross a big room. Rather than wasting time and energy calculating the shortest path, it’s better just to start walking in roughly the right direction.
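The same bargain is easy to feel in software, even though Palem’s point is about hardware. A Python sketch with illustrative tolerances: computing a square root by Newton’s method and stopping once the answer is good to three decimal places, rather than iterating on to full machine precision:

def newton_sqrt(x, tol):
    # Newton's method for sqrt(x); stop when successive guesses
    # agree to within tol.
    guess = x
    steps = 0
    while True:
        better = 0.5 * (guess + x / guess)
        steps += 1
        if abs(better - guess) < tol:
            return better, steps
        guess = better

approx, cheap = newton_sqrt(2.0, 1e-3)    # 'near enough': three decimal places
exact, costly = newton_sqrt(2.0, 1e-15)   # full double precision

print(f"approx = {approx:.6f} after {cheap} iterations")
print(f"exact  = {exact:.15f} after {costly} iterations")
# Every iteration skipped is arithmetic unspent -- the trade a 'sloppy'
# chip makes in silicon, where the errors are probabilistic.

Newton’s method converges quickly, so the saving here is modest; on a chip performing billions of such operations a second, shaving even a fraction of the work compounds into real gains in speed and energy.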

]]>
Tue, 05 Jun 2012 09:18:58 -0700 http://www.economist.com/node/21556087