MachineMachine /stream - tagged with paradigm

<![CDATA[A vested interest in palimpsest]]>

The English language contains certain meaning-rich words that command attention and stir controversy. “Paradigm,” for instance: When Thomas Kuhn used it in 1962 to describe accepted scientific theories, and gave us the phrase “paradigm shift,” he launched a thousand articles, several hundred books and quite a few careers, some just distantly related to science.

That kind of word raises curiosity and pries open the imagination, encouraging us to think about what we might otherwise ignore. My favourite is “palimpsest.” When I first noticed it in print, four decades ago, it struck me as odd, beautiful and full of promise. It’s a term that engages many writers and continues to attract new meanings, but to some readers it still seems slightly far-fetched, maybe outrageous.

Thu, 20 Dec 2012 03:45:00 -0800
<![CDATA[Freeman Dyson on Tool-Creation, Technology, and What Makes a Scientific Revolution]]>

Dyson rejects the idea that scientific revolutions are concept-driven, a stance pioneered by Thomas Kuhn in his controversial 1962 book The Structure of Scientific Revolutions, and later endorsed by other theory-driven scientists. Instead, Dyson argues, scientific revolutions are tool-driven: it is the art of tool-creation, not new concepts, that opens the way for science.

The human heritage that gave us toolmaking hands and inquisitive brains did not die. In every human culture, the hand and the brain work together to create the style that makes a civilization….

Science will continue to generate unpredictable new ideas and opportunities. And human beings will continue to respond to new ideas and opportunities with new skills and inventions. We remain toolmaking animals, and science will continue to exercise the creativity programmed into our genes.

Tue, 10 Jul 2012 02:42:00 -0700
<![CDATA[Paradigms Regained]]>

The following is an excerpt from Ian Hacking's introduction to the new edition of Thomas S. Kuhn's The Structure of Scientific Revolutions, which commemorates the book's 50th anniversary and is to be published by the University of Chicago Press at the end of this month. Kuhn's book is one of the most-cited books of all time.

ONE THING IS NOT SAID often enough: Thomas S. Kuhn's The Structure of Scientific Revolutions, like all great books, is a work of passion, and a passionate desire to get things right. This is plain even from its modest first sentence: "History, if viewed as a repository for more than anecdote or chronology, could produce a decisive transformation in the image of science by which we are now possessed." Thomas Kuhn was out to change our understanding of the sciences — that is, of the activities that have enabled our species, for better or worse, to dominate the planet. He succeeded.

Wed, 27 Jun 2012 15:22:00 -0700
<![CDATA[Prometheus Unbound: What The Movie Was Actually About]]>

Prometheus contains such a huge amount of mythic resonance that it effectively obscures a more conventional plot. I'd like to draw your attention to the use of motifs and callbacks in the film that not only enrich it, but offer possible hints as to what was going on in otherwise confusing scenes.

Let's begin with the eponymous titan himself, Prometheus. He was a wise and benevolent entity who created mankind in the first place, forming the first humans from clay. The Gods were more or less okay with that, until Prometheus gave them fire. This was a big no-no, as fire was supposed to be the exclusive property of the Gods. As punishment, Prometheus was chained to a rock and condemned to have his liver ripped out and eaten every day by an eagle. (His liver magically grew back, in case you were wondering.)

Fix that image in your mind, please: the giver of life, with his abdomen torn open. We'll be coming back to it many times in the course of this article.

The ethos of the titan Prometheus…

Fri, 15 Jun 2012 05:29:00 -0700
<![CDATA[Sloppy MicroChips: Can a fair comparison be made between biological and silicon entropy?]]>

Was reading about microchips that are designed to allow a few mistakes (known as 'Sloppy Chips'), and pondering equivalent kinds of 'coding' errors and entropy in biological systems. Can a fair comparison be made between the two? OK, to set up my question I probably need to run through my (basic) understanding of biological vs silicon entropy...

In the transistor, error is a bad thing (it gets in the way of doing the required job as efficiently and cheaply as possible), checked by parity bits that come as standard in every packet of data transmitted. But in biological systems error is not necessarily bad. Most copying errors are filtered out, but some propagate, and some of those might become beneficial to the organism (in thermodynamics sometimes known as "autonomy producing equivocations").
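The parity check mentioned above can be sketched in a few lines of Python. This is an illustrative even-parity scheme over a single data word, chosen for demonstration; it is not the specific error-checking used by the 'sloppy chips' in the article:

```python
def parity_bit(bits):
    """Even-parity bit: XOR of all data bits (1 if the data holds an odd number of 1s)."""
    p = 0
    for b in bits:
        p ^= b
    return p

def looks_clean(packet):
    """A packet (data bits + trailing parity bit) XORs to 0 if no single bit flipped."""
    return parity_bit(packet) == 0

data = [1, 0, 1, 1, 0, 1, 0]
packet = data + [parity_bit(data)]
print(looks_clean(packet))   # prints True: a clean packet passes

packet[3] ^= 1               # flip one bit "in transit"
print(looks_clean(packet))   # prints False: the single-bit error is detected
```

A single parity bit catches any odd number of flipped bits but misses two flips that cancel out, which is why real links layer stronger checks (CRCs, error-correcting codes) on top of it.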

Relating to the article about 'sloppy chips', how do entropy and energy efficiency factor into this? For the silicon chip, efficiency leads to heat (a problem); for the string of DNA, efficiency leads to fewer mutations, and thus less change within populations, and thus, inevitably, less capacity for organisms to diversify and react to their environments - leading to no evolution, no change, no good. Slightly less efficiency is good for biology and, it seems, good for some kinds of calculations and computer processes.

What work has been done on these connections I draw between the biological and the silicon?

I'm worried that my analogy is limited, based as it is on a paradigm for living systems that too closely mirrors the digital systems we have built. Can DNA and binary parity bit transistors be understood on their own terms, without resorting to using the other as a metaphor to understanding?

Where do the boundaries lie in comparing the two?

Tue, 05 Jun 2012 10:05:10 -0700
<![CDATA[What Thomas Kuhn Really Thought about Scientific “Truth”]]>

“Look,” Thomas Kuhn said. The word was weighted with weariness, as if Kuhn was resigned to the fact that I would misinterpret him, but he was still going to try—no doubt in vain—to make his point. Kuhn uttered the word often. “Look,” he said again. He leaned his gangly frame and long face forward, and his big lower lip, which ordinarily curled up amiably at the corners, sagged. “For Christ’s sake, if I had my choice of having written the book or not having written it, I would choose to have written it. But there have certainly been aspects involving considerable upset about the response to it.”

Wed, 30 May 2012 02:01:53 -0700
<![CDATA[Neanderthals Getting a Colourful Upgrade]]>

A chorus of smart, modern minds is rising over the hills of anthropology, insisting that the ancient Neanderthals of Europe weren't anywhere near as dumb, insufferable and unrecognizable as everyone thought all these years. At long last, these creatures who roamed the Continent for hundreds of thousands of years, only to become extinct 30,000 years ago under the onslaught of modern humans from Africa, are getting a major upgrade from the scientific community.

No more can we say that old Neanderthal -- prototype of shaggy man with absolutely zero smarts -- didn't know what he was doing. And no more can we deny it: They were not a little bit like us but a lot. As Professor David Frayer, Neanderthal expert at the University of Kansas, puts it, with not a little hint of told-you-so scientific glee, "Seemingly with every new journal issue, the gap between Neanderthal and modern human behavior closes."

Wed, 23 May 2012 09:44:15 -0700
<![CDATA[The Trouble with Scientism]]>

The conflict between the Naturwissenschaften and the Geisteswissenschaften goes back at least two centuries, and became intensified as ambitious, sometimes impatient researchers proposed to introduce natural scientific concepts and methods into the study of human psychology and human social behavior. Their efforts, and the attitudes of unconcealed disdain that often inspired them, prompted a reaction, from Vico to Dilthey and into our own time: the insistence that some questions are beyond the scope of natural scientific inquiry, too large, too complex, too imprecise, and too important to be addressed by blundering over-simplifications. From the nineteenth-century ventures in mechanistic psychology to contemporary attempts to introduce evolutionary concepts into the social sciences, “scientism” has been criticized for its “mutilation” (Verstümmelung, in Dilthey’s memorable term) of the phenomena to be explained.

Thu, 17 May 2012 03:42:13 -0700
<![CDATA[A challenge to God-guided mutations]]>

The renowned philosopher of science Elliott Sober has, in recent weeks, given a talk and written a paper that both make the same points: Evolution is totally silent on the idea and actions of God and, further, that evolutionists have neglected the logical possibility that God could have been involved in creating some of the mutations involved in evolution. (These mutations are presumably adaptive—God wouldn’t make all those nasty mutations that cause muscular dystrophy and cancer!)

I see this exercise—of demonstrating the logical compatibility of a rarely-acting God with evolution, and, by extension, with all of science—as a trivial exercise and a waste of time. No evolutionary biologist argues that evolution logically entails the non-existence of a God who can tweak the process. Or, if there are a few misguided individuals who do, they’re not important enough to contest in this way.

Thu, 17 May 2012 03:35:18 -0700
<![CDATA[What is the biological equivalent of discovering the Higgs Boson?]]>

We put the question to experts in various fields. Biology is no stranger to large, international collaborations with lofty goals, they pointed out — the race to sequence the human genome around the turn of the century had scientists riveted. But most biological quests lack the mathematical precision, focus and binary satisfaction of a yes-or-no answer that characterize the pursuit of the Higgs. “Most of what is important is messy, and not given to a moment when you plant a flag and crack the champagne,” says Steven Hyman, a neuroscientist at the Broad Institute in Cambridge, Massachusetts.

Nevertheless, our informal survey shows that the field has no shortage of fundamental questions that could fill an anticipatory auditorium. These questions concern where and how life started — and why it ends.

Thu, 29 Mar 2012 08:44:00 -0700
<![CDATA[The Enlightenment, Naturalism, And The Secularization Of Values]]>

The most influential contribution of the Enlightenment to modern thought, after its transformation of religious toleration from a negative to a positive value, was the secularization of ethical debate. Historically, however, it would be one-dimensional—indeed wrong—to understand this phenomenon as the product of a virgin birth of ideas in the Enlightenment. Both deistic and atheistic Enlightenment authors were part of the same world of thought. Similarly, both eighteenth-century Christian and Enlightenment thinkers were heirs to the same conceptual revolution of seventeenth-century natural philosophy (which included what we now term science), and both moved on the same deeper tidal currents of early-modern intellectual change.

Tue, 20 Mar 2012 11:20:24 -0700
<![CDATA[Error Undoes Faster-Than-Light Neutrino Results]]>

It appears that the faster-than-light neutrino results, announced last September by the OPERA collaboration in Italy, were due to a mistake after all. A bad connection between a GPS unit and a computer may be to blame.

Physicists had detected neutrinos travelling from the CERN laboratory in Geneva to the Gran Sasso laboratory near L'Aquila that appeared to make the trip about 60 nanoseconds faster than light would. Many other physicists suspected that the result was due to some kind of error, given that it seemed at odds with Einstein's special theory of relativity, which says nothing can travel faster than the speed of light. That theory has been vindicated by many experiments over the decades.
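The scale of the claimed anomaly is easy to reproduce with back-of-the-envelope arithmetic. The sketch below assumes the roughly 730 km CERN-to-Gran Sasso baseline that was widely quoted at the time; the figures are approximations, not OPERA's published values:

```python
C = 299_792_458.0            # speed of light in vacuum, m/s
baseline_m = 730_000.0       # CERN -> Gran Sasso, roughly 730 km (assumed)

light_time_s = baseline_m / C        # light needs about 2.4 ms for the trip
early_s = 60e-9                      # neutrinos seemed to arrive ~60 ns early

fractional_excess = early_s / light_time_s
print(f"light travel time: {light_time_s * 1e3:.2f} ms")
print(f"(v - c)/c implied: {fractional_excess:.1e}")  # on the order of 2e-5
```

Even a fractional excess that small conflicts with special relativity, which is why an instrumental explanation such as the bad GPS connection was the suspected culprit.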

Wed, 22 Feb 2012 15:20:26 -0800
<![CDATA[“The Shannon and Weaver Model”]]>

The genius of Shannon’s original paper from 1948 and its subsequent popularization by Weaver lies in many things, among them, their having formulated a model of communication located on the threshold of these two understandings of theory. As a scientist Shannon surely felt accountable to the empirical world, and his work reflects that. Yet, it also seems clear that Shannon and Weaver’s work has, over the last 60 years or so, taken on a life of its own, feeding back into the reality they first set about describing. Shannon and Weaver didn’t merely model the world; they ended up enlarging it, changing it, and making it over in the image of their research.

Mon, 20 Feb 2012 06:51:16 -0800
<![CDATA[It's time for science to move on from materialism]]>

Today we live in the 21st century, and it seems that we are still stuck with this narrow and rigid view of things. As Rupert Sheldrake puts it in his new book, The Science Delusion, published this week: "The belief system that governs conventional scientific thinking is an act of faith, grounded in a 19th-century ideology."

That's provocative rhetoric. Science an act of faith? Science a belief system? But then how else to explain the grip of the mechanistic, physicalist, purposeless cosmology? As Heisenberg explained, physicists among themselves have long stopped thinking of atoms as things. They exist as potentialities or possibilities, not objects or facts. And yet, materialism persists.

Sat, 28 Jan 2012 10:35:54 -0800
<![CDATA[Rereading Darwin]]>

The Dangers of Extrapolation (“Much light will be thrown on the origin of man.”)

Two centuries before Charles Darwin wrote On the Origin of Species, Bishop James Ussher calculated the age of the Earth. To do so, the Primate of All Ireland (time has given his title a certain irony) carefully mined the Old and New Testaments for genealogical information that might lead him back to the date of Creation. In so doing, he concluded that the Earth was only about 5,600 years old. It’s easy to ridicule the bishop and his date, but to do so misses a larger point. Ussher’s approach was rigorous—even elegant—and he seemed to understand that the Earth’s age could only be inferred through careful observation of evidence. What doomed his result—and, indeed, many a conclusion in science—were faulty assumptions. The bishop, who took as an axiom that the biblical account of Creation was literal, painstakingly did the math attendant on the notorious biblical begats. He then dutifully added to his tally the five days that separated the creation of the Earth from the creation of human beings, and arrived at the exact date of Creation: October 23, 4004 B.C. Of course, he was wrong.

Tue, 17 Jan 2012 17:06:23 -0800
<![CDATA[Giorgio Agamben. What is a Paradigm. 2002 1/10]]>

Fri, 12 Aug 2011 07:50:19 -0700
<![CDATA[The truth is in there]]>

“I had all these ideas,” he said, speaking slowly and searchingly, like someone looking back on life and trying to figure out where it all went wrong. “I don’t know what happened to me.”

What happened, strictly speaking, was that Morris fought with the head of his program, Thomas Kuhn, a decorated philosopher specializing in the history of science at Princeton. Kuhn believed it was fundamentally impossible for someone in the present to understand the past — that what was considered “true” in one era might be thought false in another, and therefore “objective reality” as such could not be said to exist.

Young Errol Morris was horrified by this view, and was not particularly shy about making Kuhn aware of it. Things came to a breaking point in 1972 when, during a particularly heated conversation, Kuhn threw a heavy glass ashtray at Morris’s head. He missed, but drove his point home by having Morris ejected from the program.

Thu, 16 Jun 2011 16:11:12 -0700
<![CDATA[Is Twitter writing, or is it speech? Why we need a new paradigm for our social media platforms]]>

Which raises the question: What is Twitter, actually? (No, seriously!) And what type of communication is it, finally? If we’re wondering why heated debates about Twitter’s effect on information/politics/us tend to be at once so ubiquitous and so generally unsatisfying…the answer may be that, collectively, we have yet to come to consensus on a much more basic question: Is Twitter writing, or is it speech?

Twitter versus “Twitter”

The broader answer, sure, is that it shouldn’t matter. Twitter is…Twitter. It is what it is, and that should be enough. As a culture, though, we tend to insist on categorizing our communication, drawing thick lines between words that are spoken and words that are written. So libel is, legally, a different offense than slander; the written word, we assume, carries the heft of both deliberation and proliferation and therefore a moral weight that the spoken word does not. Text, we figure, is…

Thu, 02 Jun 2011 13:19:30 -0700
<![CDATA[The Limits of Science]]>

Good sense is the most fairly distributed commodity in the world, Descartes once quipped, because nobody thinks he needs any more of it than he already has. A neat illustration of the fact that gullibility seems to be a disease of other people was provided by Martin Gardner, a great American debunker of pseudoscience, who died this year. In the second edition of his “Fads and Fallacies in the Name of Science” (1957), Gardner reported that most of the irate letters he received in response to the first edition criticised only one of its 26 chapters and found the rest to be fine. Needless to say, readers disagreed about which chapter was the faulty one. Homeopaths objected to the treatment meted out to themselves, but thought that the exposé of chiropractors was spot on, and vice versa.

No group of believers has more reason to be sure of its own good sense than today’s professional scientists. There is, or should be, no mystery about why it is always more rational to believe in science than…

Thu, 16 Sep 2010 07:12:00 -0700
<![CDATA[Adam Curtis on 'Mad Men']]>

The widespread fascination with the Mad Men series is far more than just simple nostalgia. It is about how we feel about ourselves and our society today. In Mad Men we watch a group of people who live in a prosperous society that offers happiness and order like never before in history and yet are full of anxiety and unease. They feel there is something more, something beyond. And they feel stuck. I think we are fascinated because we have a lurking feeling that we are living in a very similar time. A time that, despite all the great forces of history whirling around in the world outside, somehow feels stuck. And above all has no real vision of the future. And as we watch the group of characters from 50 years ago, we get reassurance because we know that they are on the edge of a vast change that will transform their world and lead them out of their stifling technocratic order and back into the giant onrush of history.

Tue, 31 Aug 2010 15:55:00 -0700