MachineMachine /stream - search for linguistics https://machinemachine.net/stream/feed

<![CDATA[Today, cyber means war.]]> http://io9.com/today-cyber-means-war-but-back-in-the-1990s-it-mean-1325671487/1474902195/

Today, cyber means war. But back in the 1990s, it meant sex — at least, the kind of sex you can have in a chat room. Why did the word change, and where did it originally come from?

It all started with "cybernetics," an obscure term popularized by a mathematician named Norbert Wiener in the 1940s. For his groundbreaking book Cybernetics, Wiener borrowed the ancient Greek kybernetes, meaning "steersman," a root related to the idea of government or governing. Indeed, the only time the word cybernetics had appeared before was in a few works of political theory about the science of governance.

In his writing, Wiener described what was at the time a pretty futuristic idea — that one day there would be a computer system that ran on feedback. Essentially, it would be a self-governing system. And for a long time, cybernetics remained the purview of information theorists like Wiener, and early computer programmers.

Science fiction author Pat Cadigan, whose novel Mindplayers is a cyberpunk classic, recalled that her first encounter with "cyber" was of a decidedly Wienerish variety. She told io9 that the first time she heard the term was when she was in high school in 1967, and somebody mentioned cybernetics. "I asked what cybernetics was. 'It has to do with computers,' was the answer. My eyes glazed over. For years, that was the only word I knew with the prefix 'cyber' in it."


But all that changed a little over a decade later. Cadigan recalled:

One morning in 1979, I was getting ready for work and Gary Numan's "Cars" came on the radio. Afterwards, the DJ said, "There's some cyberpunk for you." He was making a joke; in 1979, the punk movement was in full flower but the chaotic noise of punk music was starting to evolve into electronic noise.

Still, that joke quickly became a reality. In the early 1980s, the cyberpunk movement took over science fiction, spurred by the popularity of the film Blade Runner and William Gibson's novel Neuromancer. Authors like Cadigan, Bruce Sterling and Rudy Rucker were writing mind-blowing stories about the merging of humans and computers. Cyber became a catch-all prefix that could be added to any word to make it sound cutting-edge. Cadigan noted that cyber "sort of supplanted the term 'digital' in some ways as an indicator of something that was high tech."

The 1990s: Decade of Cyber

Cyberpunk was a mostly-underground artistic style in the 1980s, but suddenly in the 1990s everything was cyber. As more and more people got internet access, the alien world of cyberspace from William Gibson's work became a household consumer item.

Richard Holden, a lexicographer with the Oxford English Dictionary, recently researched the history of cyber for the dictionary. He told io9 that the 1990s were a time when use of the word underwent rapid diversification:

The Oxford English Dictionary entry for the prefix cyber- has evidence of its use going back to 1961 (in Cybertron, as it happens), but . . . it seems to have become particularly popular in the 1990s — we don’t have all that much evidence for its use before then. This seems likely to be a result of the invention of the World Wide Web, and the earliest evidence we’ve found for words like cyber-bully, cybercommunity, cybergeek, cyberlaw, cyberstalker, and, indeed, cybersex and cyberwar all comes from the early 90s. At that time you . . . seem to get a mix of positive and negative terms involving the prefix, which possibly reflects the mixed feelings people often have about the opportunities and threats a new technology can bring.

Ben Zimmer, who writes about linguistics for the Wall Street Journal, agreed with Holden, noting that the seemingly incongruous ideas of cybersex and cyberwar "grew up side by side." The earliest recorded use of the term "cybersecurity" came in 1989, the exact same year when the word "cyberporn" was coined. But neither term was dominant. In the heady days of the 1990s "information superhighway," before people got used to the idea that shopping, dating, and work could exist online, adding the prefix cyber to something made it seem like it was taking place in the gleaming, pixelated world inhabited by futuristic youth.

Had the iPhone come along in the 1990s, it's likely that we'd be calling our devices something very different. Cadigan said, "Terminology-wise, I find it interesting that we never had cyber-phones. The mobile/cellular phone became the cell and then evolved into the smart phone, not the cyber-phone." Just as today everything from buildings to phones can be "smart," in the 1990s anything could be cyber.

Including sex.

The Cybersex Moment

Back in the days of AOL chat rooms, IRC channels, and text-only multi-user games, lots of people started having cybersex. Most of this furtive online activity involved no more than people talking dirty via text.

But cyber-pundits suggested that teledildonics and virtual reality sex were just around the corner. Soon, we would be having sex with chrome-plated dragon beasts in landscapes made of diamond flowers. And we would be stimulating our lovers 3,000 miles away with sex toys that plugged into both partners, sending the orgasmic shivers of one to the other via the internet.

Zimmer pointed out that Douglas Adams may have invented the idea of cybersex back in 1982, when he remarked in Life, the Universe and Everything that "Zaphod had spent most of his early history lessons plotting how he was going to have sex with the girl in the cybercubicle next to him." As more college-age people began piling onto the internet in the mid-1990s, cybersex became trendy slang for what you did with your long-distance boyfriend using the university dial-up connection. And, like most slang, it quickly got shortened to cyber.

]]>
Wed, 11 Dec 2013 15:42:39 -0800 http://io9.com/today-cyber-means-war-but-back-in-the-1990s-it-mean-1325671487/1474902195/
<![CDATA[The Doctrine of the Similar (GIF GIF GIF)]]> http://machinemachine.net/portfolio/the-doctrine-of-the-similar-gif-gif-gif

In two short essays – written in 1933 – Walter Benjamin argues that primitive language emerged in magical correspondence with the world. The faculty we all exhibit in childhood play, to impersonate and imitate people and things, loses its determining power as language gradually takes over from our “non-sensuous” connection with reality. In a break from Saussurean linguistics, Benjamin decries the loss of this “mimetic faculty”, as it becomes further replaced by the “archive of non-sensuous correspondences” we know as writing. To put it in simpler terms: where once we read the world, the stars or the entrails of a sacrificed animal, now we read the signs enabled and captured by written language. From Benjamin’s The Doctrine of the Similar:

“So speed, the swiftness in reading or writing which can scarcely be separated from this process, would then become… the effort or gift of letting the mind participate in that measure of time in which similarities flash up fleetingly out of the stream of things only in order to become immediately engulfed again.”

The GIF – standing for Graphics Interchange Format – has been around since 1987. Its early popularity was based, in part, on its ability to load in time with a web page. In the days of poor bandwidth and dial-up connections this meant that at least part of a GIF image would appear before the user’s connection broke, or – more significantly – the user could see enough of the image for it to make sense. In the mid-90s avid web hackers managed to crack the code of GIFs and use this ‘partial loading’ mechanism to encode animations within a single GIF file. Thus the era of personal web pages saturated with looping animations of spinning hamsters was born.

Brought on – ironically – by its obsolescence, the GIF has become the medium of choice for web artists, propagating their particular net-aesthetic through this free, open and kitschy medium. GIFs inhabit the space between convenience and abundance, where an apparent breakdown in communication can stimulate new modes of expressing non-sensuous similarities in the internet world. Sites like dump.fm, 4chan and ytmnd revel in the GIF’s ability to quickly correspond to the world. GIFs can be broken into their constituent frames, compressed and corrupted on purpose and made to act as archives for viral events travelling the web. A playground of correspondences that at first reflected language and the wider world has, in time, looked increasingly inward. As language and writing find themselves pulled through and energised by the semiotic sludge of the broken, corrupted and iconic animated GIF, Benjamin’s sensitivity to similitude continues to echo its magical significance.

GIFs take a variety of forms, some of which I will try to classify for you:

GIF Type I: Classic

Small in size and made up of few frames, this is where animated GIFs began. These GIFs correspond to single words or concepts such as ‘smile’, ‘alien’ or ‘flying pink unicorn’.

GIF Type II: Frame Capture

Frame grab or video capture GIFs pay homage to well-known scenes in pop culture. But as the ‘art’ of animated GIFs grew, the frame capture began to stand for something isolated from context. This leap is, for me, the first point at which GIFs begin to co-ordinate their own realm of correspondence. An ocean of viral videos turned into a self-serving visual language, looping back on itself ad infinitum.

GIF Type III: Art

Leaking then directly into the third category, we have the Art GIF. Much larger in resolution and aware of their heritage in cinema, these GIFs are acutely refined in their choice of framing.

GIF Type IV: Glitch

A badly encoded or compressed GIF can result in odd, strangely beautiful phenomena, and with a little skill and coding ability these glitches can be enhanced to enormous proportions (a small sketch of this kind of deliberate corruption follows the list of types below). Glitch GIFs break the boundaries of another non-sensuous realm: that of computer code, a magical order Benjamin could scarcely have predicted.

GIF Type V: Mash-Up

Lastly, and perhaps most prolific, is the mash-up GIF. These GIFs combine all the previous forms. The mash-up is the most inward-looking species of GIF. It is possible to track the cultural development of some of these, but often the source of any original correspondence becomes completely lost in the play of images. Here again, I think Benjamin’s essay can help us:

“Language is the highest application of the mimetic faculty: a medium into which the earlier perceptive capabilities for recognising the similar had entered without residue, so that it is now language which represents the medium in which objects meet and enter into relationship with each other…”

In other words, what these images MEAN I can’t tell you in words. But perhaps by showing you other GIFs I might go some way to helping you understand them.
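As promised above, here is a minimal sketch of the two operations artists lean on most often: breaking a GIF into its constituent frames and deliberately corrupting its bytes to provoke glitches. It assumes Python with the Pillow imaging library installed, and the file names (input.gif and so on) are hypothetical; results vary wildly, and many random corruptions simply make the file unreadable.

    import random
    from PIL import Image, ImageSequence

    # 1. Break an animated GIF into its constituent frames, then rebuild it.
    source = Image.open("input.gif")  # hypothetical file name
    frames = [frame.copy() for frame in ImageSequence.Iterator(source)]
    frames[0].save(
        "rebuilt.gif",
        save_all=True,
        append_images=frames[1:],
        loop=0,
        duration=source.info.get("duration", 100),
    )

    # 2. Naive 'databending': flip a handful of bytes well past the start of
    #    the file (a rough guess at sparing the header and palette) and hope
    #    the decoder still renders something strange.
    data = bytearray(open("input.gif", "rb").read())
    for _ in range(20):
        position = random.randrange(1024, len(data))
        data[position] = random.randrange(256)
    open("glitched.gif", "wb").write(bytes(data))

Nothing here is refined glitch practice; it is only the crudest starting point for the kind of purposeful corruption described above.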

]]>
Wed, 25 May 2011 05:21:34 -0700 http://machinemachine.net/portfolio/the-doctrine-of-the-similar-gif-gif-gif
<![CDATA[My bright idea: Guy Deutscher]]> http://www.guardian.co.uk/technology/2010/jun/13/my-bright-idea-guy-deutscher

Guy Deutscher is that rare beast, an academic who talks good sense about linguistics, his chosen field. In his new book, Through the Language Glass (Heinemann), he fearlessly contradicts the fashionable consensus, espoused by the likes of Steven Pinker, that language is wholly a product of nature, that it does not take colour and value from culture and society. Deutscher argues, in a playful and provocative way, that our mother tongue does indeed affect how we think and, just as important, how we perceive the world.

An honorary research fellow at the University of Manchester, the 40-year-old linguist draws on a range of sources in the book to show language reflecting the society in which it is spoken. In the process, he explains why Russian water (a "she") becomes a "he" once you have dipped a teabag into her, and why, in German, a young lady has no sex, though a turnip has.

]]>
Sun, 13 Jun 2010 05:33:00 -0700 http://www.guardian.co.uk/technology/2010/jun/13/my-bright-idea-guy-deutscher
<![CDATA[A Thousand Years of Nonlinear History]]> http://readernaut.com/machinemachine/books/0942299329/a-thousand-years-of-nonlinear-history/

A Thousand Years of Nonlinear History by Manuel De Landa


Recently added as "reading".

Description: "Forcefully challenges habituated understandings of 'history', 'urban' and 'economics'." -- Christopher Hight, AA Files

Following in the wake of his groundbreaking War in the Age of Intelligent Machines, Manuel De Landa presents a radical synthesis of historical development over the last one thousand years. More than a simple expository history, A Thousand Years of Nonlinear History sketches the outlines of a renewed materialist philosophy of history in the tradition of Fernand Braudel, Gilles Deleuze, and Félix Guattari, while also engaging the critical new understanding of material processes derived from the sciences of dynamics. Working against prevailing attitudes that see history as an arena of texts, discourses, ideologies, and metaphors, De Landa traces the concrete movements and interplays of matter and energy through human populations in the last millennium.

De Landa attacks three domains that have given shape to human societies: economics, biology, and linguistics. In every case, what one sees are the self-directed processes of matter and energy interacting with the whim and will of human history itself to form a panoramic vision of the West free of rigid teleology and naive notions of progress, and even more important, free of any deterministic source of its urban, institutional, and technological forms. Rather, the source of all concrete forms in the West's history is shown to derive from internal morphogenetic capabilities that lie within the flow of matter-energy itself.

  • Reader: Daniel Rourke
]]>
Wed, 24 Mar 2010 10:07:00 -0700 http://readernaut.com/machinemachine/books/0942299329/a-thousand-years-of-nonlinear-history/
<![CDATA[De-constructing 'code' (picking apart its assumptions)]]> http://ask.metafilter.com/mefi/144810

De-constructing 'code': I am looking for philosophical (from W. Benjamin through to post-structuralism and beyond) examinations of 'code'. That includes both the assumptions contained in the word 'code' and any actual objects or subjects that code is connected to - including, but not limited to: computer programming, cyphers, linguistics, genetics etc. I am looking to question the assumptions of 'code'; a specific example of a theorist de-constructing the term would be especially useful.

I am currently knee deep in an examination of certain practices and assumptions that have arisen from digital media/medium and digital practice (art and making in the era of data packets and compression-artefacts for example). Through my analysis I wish to investigate the paradigms of text and writing practice (the making of textual arts).

A simple analogy to this process would be looking at dialectic cultures (speech-based) from the perspective/hindsight of a grapholectic culture (writing/print-based). In a similar way, I want to examine writing, film and their making with the hindsight of digital paradigms.

I am aware of the works of Deleuze, Derrida, Barthes, Genette, Ong, Serres, Agamben etc. but any of their works that deal specifically with 'code' would be very very useful.

I look forward to any pointers you can give me.

]]>
Tue, 02 Feb 2010 06:35:00 -0800 http://ask.metafilter.com/mefi/144810
<![CDATA[Is the Internet melting our brains?]]> http://www.salon.com/books/int/2009/09/19/better_pencil/index.html

By now the arguments are familiar: Facebook is ruining our social relationships; Google is making us dumber; texting is destroying the English language as we know it. We're facing a crisis, one that could very well corrode the way humans have communicated since we first evolved from apes. What we need, so say these proud Luddites, is to turn our backs on technology and embrace not the keyboard, but the pencil.

Such sentiments, in the opinion of Dennis Baron, are nostalgic, uninformed hogwash. A professor of English and linguistics at the University of Illinois at Urbana-Champaign, Baron seeks to provide the historical context that is often missing from debates about the way technology is transforming our lives in his new book, "A Better Pencil." His thesis is clear: Every communication advancement throughout human history, from the pencil to the typewriter to writing itself, has been met with fear, skepticism and a longing for the medium that's been displaced. Far from heralding in a

]]>
Mon, 28 Sep 2009 09:01:00 -0700 http://www.salon.com/books/int/2009/09/19/better_pencil/index.html
<![CDATA[Thinking literally]]> http://www.boston.com/bostonglobe/ideas/articles/2009/09/27/thinking_literally/?page=full

Drawing on philosophy and linguistics, cognitive scientists have begun to see the basic metaphors that we use all the time not just as turns of phrase, but as keys to the structure of thought. By taking these everyday metaphors as literally as possible, psychologists are upending traditional ideas of how we learn, reason, and make sense of the world around us. The result has been a torrent of research testing the links between metaphors and their physical roots, with many of the papers reading as if they were commissioned by Amelia Bedelia, the implacably literal-minded children's book hero. Researchers have sought to determine whether the temperature of an object in someone's hands determines how "warm" or "cold" he considers a person he meets, whether the heft of a held object affects how "weighty" people consider topics they are presented with, or whether people think of the powerful as physically more elevated than the less powerful.

]]>
Mon, 28 Sep 2009 08:59:00 -0700 http://www.boston.com/bostonglobe/ideas/articles/2009/09/27/thinking_literally/?page=full
<![CDATA[The Next Great Discontinuity: The Data Deluge]]> http://www.3quarksdaily.com/3quarksdaily/2009/04/the-next-great-discontinuity-part-two.html

Speed is the elegance of thought, which mocks stupidity, heavy and slow. Intelligence thinks and says the unexpected; it moves with the fly, with its flight. A fool is defined by predictability… But if life is brief, luckily, thought travels as fast as the speed of light. In earlier times philosophers used the metaphor of light to express the clarity of thought; I would like to use it to express not only brilliance and purity but also speed. In this sense we are inventing right now a new Age of Enlightenment… A lot of… incomprehension… comes simply from this speed. I am fairly glad to be living in the information age, since in it speed becomes once again a fundamental category of intelligence. Michel Serres, Conversations on Science, Culture and Time

(Originally published at 3quarksdaily · Link to Part One)

Human beings are often described as the great imitators: We perceive the ant and the termite as part of nature. Their nests and mounds grow out of the Earth. Their actions are indicative of a hidden pattern being woven by natural forces from which we are separated. The termite mound is natural, and we, the eternal outsiders, sitting in our cottages, our apartments and our skyscrapers, are somehow not. Through religion, poetry, or the swift skill of the craftsman smearing pigment onto canvas, humans aim to encapsulate that quality of existence that defies simple description. The best art, or so it is said, brings us closer to attaining a higher truth about the world that remains elusive from language, that perhaps the termite itself embodies as part of its nature. Termite mounds are beautiful, but were built without a concept of beauty. Termite mounds are mathematically precise, yet crawling through their intricate catacombs cannot be found one termite in comprehension of even the simplest mathematical constituent. In short, humans imitate and termites merely are.

This extraordinary idea is partly responsible for what I referred to in Part One of this article as The Fallacy of Misplaced Concreteness. It leads us to consider not only the human organism as distinct from its surroundings, but it also forces us to separate human nature from its material artefacts. We understand the termite mound as integral to termite nature, but are quick to distinguish the axe, the wheel, the book, the skyscraper and the computer network from the human nature that bore them.

When we act, through art, religion or with the rational structures of science, to interface with the world, our imitative (mimetic) capacity has both subjective and objective consequence. Our revelations, our ideas, stories and models have life only insofar as they have a material to become invested through. The religion of the dance, the stone circle and the summer solstice is mimetically different to the religion of the sermon and the scripture because the way it interfaces with the world is different. Likewise, it is only with the consistency of written and printed language that the technical arts could become science, and through which our ‘modern’ era could be built. Dances and stone circles relayed mythic thinking structures, singular, immanent and ethereal in their explanatory capacities. The truth revealed by the stone circle was present at the interface between participant, ceremony and summer solstice: a synchronic truth of absolute presence in the moment. Anyone reading this will find truth and meaning through grapholectic interface. Our thinking is linear, reductive and bound to the page. It is reliant on a diachronic temporality that the pen, the page and the book hold in stasis for us.

Imitation alters the material world, which in turn affects the texture of further imitation. If we remove the process from its material interface we lose our objectivity. In doing so we isolate the single termite from its mound and, after much careful study, announce that we have reduced termite nature to its simplest constituent.

The reason for the tantalizing involutions here is obviously that intelligence is relentlessly reflexive, so that even the external tools that it uses to implement its workings become ‘internalized’, that is, part of its own reflexive process… To say writing is artificial is not to condemn it but to praise it. Like other artificial creations and indeed more than any other, it is utterly invaluable and indeed essential for the realisation of fuller, interior, human potentials. Technologies are not mere exterior aids but also interior transformations of consciousness, and never more than when they affect the word. Walter J. Ong, Orality and Literacy

Anyone reading this article cannot fail to be aware of the changing interface between eye and text that has taken place over the past two decades or so. New Media – everything from the internet database to the Blackberry – has fundamentally changed the way we connect with each other, but it has also altered the way we connect with information itself. The linear, diachronic substance of the page and the book has given way to a dynamic textuality blurring the divide between authorship and readership, expert testament and the simple accumulation of experience.

The main difference between traditional text-based systems and newer, data-driven ones is quite simple: it is the interface. Eyes and fingers manipulate the book, turning over pages in a linear sequence in order to access the information stored in its printed figures. For New Media, for the digital archive and the computer storage network, the same information is stored sequentially in databases which are themselves hidden to the eye. To access them one must run a search or otherwise run an algorithm that mediates the stored data for us (a toy sketch of this query-mediated access follows the Wired excerpt below). The most important distinction should be made at the level of the interface, because, although the database as a form has changed little over the past 50 years of computing, the human-computer interfaces (HCI) we access and manipulate that data through are always passing from one iteration to another.

Stone circles interfacing the seasons stayed the same, perhaps being used in similar rituals over the course of a thousand years of human cultural accumulation. Books, interfacing text, language and thought, stay the same in themselves from one print edition to the next, but as a format, books have changed very little in the few hundred years since the printing press. The computer HCI is most different from the book in that change is integral to its structure. To touch a database through a computer terminal, through a Blackberry or iPhone, is to play with data at incredible speed:

Sixty years ago, digital computers made information readable. Twenty years ago, the Internet made it reachable. Ten years ago, the first search engine crawlers made it a single database. Now Google and like-minded companies are sifting through the most measured age in history, treating this massive corpus as a laboratory of the human condition… Kilobytes were stored on floppy disks. Megabytes were stored on hard disks. Terabytes were stored in disk arrays. Petabytes are stored in the cloud. As we moved along that progression, we went from the folder analogy to the file cabinet analogy to the library analogy to — well, at petabytes we ran out of organizational analogies. At the petabyte scale, information is not a matter of simple three- and four-dimensional taxonomy and order but of dimensionally agnostic statistics… This is a world where massive amounts of data and applied mathematics replace every other tool that might be brought to bear. Out with every theory of human behavior, from linguistics to sociology. Forget taxonomy, ontology, and psychology. Who knows why people do what they do? The point is they do it, and we can track and measure it with unprecedented fidelity. With enough data, the numbers speak for themselves. Wired Magazine, The End of Theory, June 2008
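To make the earlier distinction concrete, here is a toy sketch using Python's built-in sqlite3 module. The table and its rows are invented for illustration; the point is simply that the store is never leafed through page by page, and the reader only ever sees whatever the query, the interface, hands back.

    import sqlite3

    # The data sits in a store we never read directly, page by page.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE items (title TEXT, year INTEGER)")
    conn.executemany(
        "INSERT INTO items VALUES (?, ?)",
        [("Neuromancer", 1984), ("Mindplayers", 1987), ("Cybernetics", 1948)],
    )

    # The only view we ever get is the one the query mediates for us.
    for (title,) in conn.execute(
        "SELECT title FROM items WHERE year < 1985 ORDER BY year"
    ):
        print(title)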

And as the amount of data has expanded exponentially, so have the interfaces we use to access that data and the models we build to understand that data. On the day that Senator John McCain announced his vice-presidential candidate, the best place to go for an accurate profile of Sarah Palin was not the traditional media: it was Wikipedia. In an age of instant, global news, no newspaper could keep up with the knowledge of the cloud. The Wikipedia interface allowed knowledge about Sarah Palin from all levels of society to be filtered quickly and efficiently in real time. Wikipedia acted as encyclopaedia, newspaper, discussion group and expert all at the same time, and it did so completely democratically, in the absence of a traditional management pyramid. The interface itself became the thinking mechanism of the day, as if the notes every reader scribbled in the margins had been instantly cross-checked and added to the content.

In only a handful of years the human has gone from merely dipping into the database to becoming an active component in a human-cloud of data. The interface has begun to reflect back upon us, turning each of us into a node in a vast database bigger than any previous material object. Gone are the days when clusters of galaxies had to be catalogued by an expert and entered into a linear taxonomy. Now, the same job is done by the crowd and the interface, allowing a million galaxies to be catalogued by amateurs in the same time it would have taken a team of experts to classify a tiny percentage of the same amount. This method of data mining is called ‘crowdsourcing’, and it represents one of the dominant ways in which raw data will be turned into information (and then knowledge) over the coming decades. Here the cloud serves as more than a metaphor for the group-driven interface, becoming a telling analogy for the trans-grapholectic culture we now find ourselves in.

To grasp the topological shift in our thought patterns it pays to move beyond the interface and look at a few of the linear, grapholectic models that have undergone change as a consequence of the information age. One of these models is evolution, a biological theory the significance of which we are still in the process of discerning:

If anyone now thinks that biology is sorted, they are going to be proved wrong too. The more that genomics, bioinformatics and many other newer disciplines reveal about life, the more obvious it becomes that our present understanding is not up to the job. We now gaze on a biological world of mind-boggling complexity that exposes the shortcomings of familiar, tidy concepts such as species, gene and organism. A particularly pertinent example [was recently provided in New Scientist] - the uprooting of the tree of life which Darwin used as an organising principle and which has been a central tenet of biology ever since. Most biologists now accept that the tree is not a fact of nature - it is something we impose on nature in an attempt to make the task of understanding it more tractable. Other important bits of biology - notably development, ageing and sex - are similarly turning out to be much more involved than we ever imagined. As evolutionary biologist Michael Rose at the University of California, Irvine, told us: “The complexity of biology is comparable to quantum mechanics.” New Scientist, Editorial, January 2009

As our technologies became capable of gathering more data than we were capable of comprehending, a new topology of thought, reminiscent of the computer network, began to emerge. For the mindset of the page and the book, science could afford to be linear and diachronic. In the era of The Data Deluge science has become more cloud-like, as theories for everything from genetics to neuroscience, particle physics to cosmology have shed their linear constraints. Instead of seeing life as a branching tree, biologists are now speaking of webs of life, where lineages can intersect and interact, where entire species are ecological systems in themselves. As well as seeing the mind as an emergent property of the material brain, neuroscience and philosophy have started to consider the mind as manifest in our extended, material environment.

Science has exploded, and picking up the pieces will do no good. Through the topology of the network we have begun to perceive what Michel Serres calls ‘The World Object’, an ecology of interconnections and interactions that transcends and subsumes the causal links propounded by grapholectic culture. At the limits of science a new methodology is emerging at the level of the interface, where masses of data are mined and modelled by systems and/or crowds which themselves require no individual understanding to function efficiently. Where once we studied events and ideas in isolation, we now devise ever more complex, multi-dimensional ways for those events and ideas to interconnect; for data sources to swap inputs and outputs; for outsiders to become insiders. Our interfaces are in constant motion, on trajectories that curve around to meet themselves, diverge and cross-pollinate. Thought has finally been freed from temporal constraint, allowing us to see the physical world, life, language and culture as multi-dimensional, fractal patterns, winding the great yarn of (human) reality:

The advantage that results from it is a new organisation of knowledge; the whole landscape is changed. In philosophy, in which elements are even more distanced from one another, this method at first appears strange, for it brings together the most disparate things. People quickly criticize me for this… But these critics and I no longer have the same landscape in view, the same overview of proximities and distances. With each profound transformation of knowledge come these upheavals in perception. Michel Serres, Conversations on Science, Culture and Time

]]>
Tue, 05 May 2009 07:35:00 -0700 http://www.3quarksdaily.com/3quarksdaily/2009/04/the-next-great-discontinuity-part-two.html
<![CDATA[Collocation - Wikipedia]]> http://en.wikipedia.org/wiki/Collocation

Within the area of corpus linguistics, collocation is defined as a sequence of words or terms which co-occur more often than would be expected by chance.

Collocation refers to the restrictions on how words can be used together, for example which prepos
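As a rough illustration of "co-occur more often than would be expected by chance", here is a toy pointwise mutual information calculation in Python. The miniature corpus is invented purely for the example; real corpus-linguistic work uses far larger samples and smoothed counts.

    import math
    from collections import Counter

    # A toy corpus, invented purely for illustration.
    corpus = "strong tea strong coffee powerful computer strong tea".split()

    unigrams = Counter(corpus)
    bigrams = Counter(zip(corpus, corpus[1:]))
    n = len(corpus)

    def pmi(w1, w2):
        # Compare the observed bigram frequency with what independent
        # occurrence of the two words would predict.
        p_joint = bigrams[(w1, w2)] / (n - 1)
        p_independent = (unigrams[w1] / n) * (unigrams[w2] / n)
        return math.log2(p_joint / p_independent)

    print(pmi("strong", "tea"))          # well above zero: a candidate collocation
    print(pmi("powerful", "computer"))   # also positive in this tiny sample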

]]>
Tue, 25 Mar 2008 16:08:12 -0700 http://en.wikipedia.org/wiki/Collocation