Science/Tech


In his Aeon think-piece about environmentalism, Liam Heneghan suggests that in order to save nature we need to free ourselves of some accepted notions of preservation in favor of a more integrative approach:

“The environmental historian Donald Worster writes about the fall of the ‘balance of nature’ as an idea, and points out that this disruptive world-view makes nature seem awfully like the human sphere. ‘All history,’ he notes, ‘has become a record of disturbance, and that disturbance comes from both cultural and natural agents.’ Thus he places droughts and pests alongside corporate takeovers and the invasion of the academy by French literary theory. If the idea of a balance resurrects Plato and Aristotle, the non-equilibrium, disturbance-inclined view may have its own Greek hero: Heraclitus, pagan saint of flux. ‘Thunderbolt,’ Heraclitus wrote in Fragment 64, ‘steers all things.’

In its brief history, the science of ecology appears to have smuggled in enough ancient metaphysics to make any Greek philosopher nod with approval. However, the question remains. If the handsaw and hurricane are equivalents in their ability to lay a forest low, it is hard to see how we can scientifically criticise the human destruction of ecosystems. Why should we, for instance, concern ourselves with the fate of the Western Ghats if alien introductions are just another disturbance, no different from the more natural-seeming migration of species? The point of conservation in the popular imagination and in many policy directives is that it resists human depredations to preserve important species in ancient, intact, fully functional natural ecosystems. If we have no ‘balance of nature’, this is much harder to defend.

If we lose the ideal of balance, then, we lose a powerful motive for environmental conservation. However, there might be some unintended benefits. A dynamic, ‘disturbance’ approach has fostered some of the most promising new approaches to environmental problems such as urban ecology and restoration ecology. That’s because it is much less concerned with keeping humans and nature separate from one another.”


From Bibliokept, a prescient if utopian passage about the intersection of sex and technology from a 1972 Penthouse interview with William S. Burroughs:

Penthouse:

How could electrodes improve sex?

William S. Burroughs:

“Well, socially, first of all. Here’s one person over here and another person over here, and they want something sexually but they can’t get together and society will see that they don’t. That’s why the law persecuted magazines carrying advertisements for sex partners. But advertisements are a crude method; the whole process could be done on a computer. Perhaps people could be brought together on terms of having reciprocal brainwaves. Everyone could be provided through the computer with someone else who was completely sexually compatible. But it’s more important than this. If the human species is going to mutate in any way, then the mutations must come through sex–how else could they? And sooner or later the species must mutate or it dies out.”


Sometimes when we know something different, even scary, is soon to begin, we nervously misread its arrival. In a simple genetic mutation, for instance, we can see the future of genetic engineering, a science that can make us great but right now just makes us uneasy. From Peter Murray at Singularity Hub, a story about a Chinese boy born with blue eyes who’s viewed as a real-life X-Man in his homeland:

“Although the notion is revolting to many, at some point in the future we’ll have the know-how and the tools to genetically modify our bodies to make us stronger, better looking, more intelligent. In Dahua, in south China, the strange properties of one boy’s eyes have made him an Internet sensation. Headlines abound labeling him a one-of-a-kind, real-life X-Man, miraculously given the gift of cat-like vision through genetic mutation.

In all likelihood, however, his miracle extends only as far as being able to see at night a little better than average, and even this has not been properly documented. More probably, this is a case of wishful thinking, overactive imagination, and the desire for attention.

Nong Yousui’s blue eyes are an anomalous, but not entirely unseen, occurrence among Chinese children. They are rare enough, however, to worry Yousui’s parents. Doctors promptly allayed their fears, saying that the boy’s vision was fine.

Years later, Yousui finds himself the center of an online frenzy.”


The full 40-minute 1989 Brian Eno documentary, “Imaginary Landscapes.”


“De humani corporis fabrica libri septem,” Andreas Vesalius, 1543.

From a Discovery list of obscure facts about postmortems, a passage about the autopsy as live performance:

“Paduan judge Marcantonio Contarini, obsessed with the anatomical drawings of Andreas Vesalius, endorsed autopsies on executed criminals; they soon became all the rage in the region. Starting in 1539, hangings were scheduled around planned autopsies, which were performed to packed houses in special theaters.”


Devin Coldewey has a smart essay at Techcrunch pointing out that the tech items we use constantly and depend on, the ones that look so beautiful, provide little emotional connection for us because of their uniform and disposable nature. I agree with his assessment of digital culture, where disposal is built into the agreement. Analog technologies, like those scratchy LP records that were purchased not only to be played but also to be collected, can hold us in their sway and attach themselves to our hearts and minds. Not so with an iPod. An excerpt:

“It’s a puzzling and complicated relationship we have with technology, as it is personified (for lack of a better term) in our iPhones, laptops, and other gadgets. We hold them and touch them every day, look at them for hours on end, sleep next to them. But how little we care for them!

I know that much of this is because what interests us in our devices is not the device itself, but that to which it is a conduit. Our friends, a map of the world, the whole of human knowledge (if not wisdom) at our fingertips. I don’t value my laptop the way I value my jacket because if I lose the laptop, my friends and Google and Wikipedia will still be there, waiting for me to find another way to get at them. It’s not so surprising, then, that we don’t value this middle-man object much.

And although we share so much of our lives with these devices, they don’t last very long. We’re like serial monogamists, committed until something better comes along, usually after a year or two. Can you really be fond of something you know you plan to replace?

Yet however reasonable it appears, still it disturbs me. It strikes me as wrong that our most powerful and expensive and familiar objects should be the ones we love the least.”


I’m with Neil Armstrong: You should never take risks just for the sake of taking risks. But I would assume Felix Baumgartner’s derring-do in his spacesuit will aid science in some manner. High winds got the best of Baumgartner over the past 48 hours, so his historic fall from the heavens has been delayed. From Paul Harris in the Guardian:

“Austrian daredevil Felix Baumgartner’s attempt to parachute to earth from the edge of space has been postponed for the second day running after gusty winds in New Mexico hampered the launch of the balloon that would take him skywards.

Baumgartner, a 43-year-old former soldier, was aiming to jump from 23 miles above the Earth in a specially pressurised suit, plummeting to the ground at speeds that would break the sound barrier before he triggered his parachute.

The stunt, if successful, would break five world records. Baumgartner would become the first human ever to break the sound barrier in free-fall; make the highest free-fall altitude jump; ride the highest manned balloon flight; and complete the longest free-fall. His jump platform is believed to be the largest manned balloon in history.”


From “Patient, Heal Thyself,” Randy Rieland’s new post at Smithsonian, a passage about hopes of duplicating in humans the regenerative powers exhibited by a particular type of mouse:

“Mammals scar after they tear their skin. But not the spiny mouse. It can lose more than 50 percent of its skin and then grow a near perfect replacement, including new hair. Its ears are even more magical. When scientists drilled holes in them, the mice were able to not only grow more skin, but also new glands, hair follicles and cartilage.

And that’s what really excites researchers in human regenerative medicine, a fast-emerging field built around finding ways to boost the body’s ability to heal itself. As amazingly sophisticated as medicine has become, treatment of most diseases still focuses largely on managing symptoms–insulin shots to keep diabetes in check, medications to ease the strain on a damaged heart.

But regenerative medicine could dramatically change health care by shifting the emphasis to helping damaged tissue or organs repair themselves.”


From the BBC, a report about thousands of smart cars in Ann Arbor that communicate with one another even if the drivers don’t:

“If you want to find the smartest drivers in the world, you need to head for the home of the US car industry. Just outside Detroit lies the town of Ann Arbor, Michigan. The drivers there are not any more intelligent than those in other parts of the world, despite it being a famed college town. However, their cars are.

That’s because the roads of Ann Arbor are now home to a fleet of several thousand cars that constantly ‘talk’ to one another. The scheme, known as the Safety Pilot Model Deployment project, offers a potential blueprint for the future of road transport. Like many projects it aims to cut congestion and make the road network more efficient. But this vision of the future is missing one thing: crashes.”

Declassified documents reveal that the Air Force worked stealthily on a flying saucer craft in the 1950s. From Sebastian Anthony at ExtremeTech:

“The aircraft, which had the code name Project 1794, was developed by the USAF and Avro Canada in the 1950s. One declassified memo, which seems to be the conclusion of initial research and prototyping, says that Project 1794 is a flying saucer capable of ‘between Mach 3 and Mach 4’ (2,300–3,000 mph), a service ceiling of over 100,000 feet (30,500 m), and a range of around 1,000 nautical miles (1,150 mi; 1,850 km).

As far as we can tell, the supersonic flying saucer would propel itself by rotating an outer disk at very high speed, taking advantage of the Coandă effect. Maneuvering would be accomplished by using small shutters on the edge of the disc (similar to ailerons on a winged aircraft). Power would be provided by jet turbines. According to the cutaway diagrams, the entire thing would even be capable of vertical takeoff and landing (VTOL).”


Following up on yesterday’s post about biogerontologist Aubrey de Grey, an excerpt from “The Invincible Man,” a 2007 Washington Post profile by Joel Garreau of the scientist who believes we can conquer aging:

“At midday in George Washington University’s Kogan Plaza off H Street NW, you are surrounded by firm, young flesh. Muscular young men saunter by in sandals, T-shirts and cargo shorts. Young blond women sport clingy, sleeveless tops, oversize sunglasses and the astounding array of subtle variations available in flip-flops and painted toenails.

Is this the future? you ask de Grey.

‘Yes, it is precisely the future,’ he says. ‘Except without people who look as old as you and me.’

‘Of course the world will be completely different in all manner of ways,’ de Grey says of the next few decades. His speech is thick, fast and mellifluous, with a quality British accent.

‘If we want to hit the high points, number one is, there will not be any frail elderly people. Which means we won’t be spending all this unbelievable amount of money keeping all those frail elderly people alive for like one extra year the way we do at the moment. That money will be available to spend on important things like, well, obviously, providing the health care to keep us that way, but that won’t be anything like so expensive. Secondly, just doing the things we can’t afford now, giving people proper education and not just when they’re kids, but also proper adult education and retraining and so on.

‘Another thing that’s going to have to change completely is retirement. For the moment, when you retire, you retire forever. We’re sorry for old people because they’re going downhill. There will be no real moral or sociological requirement to do that. Sure, there is going to be a need for Social Security as a safety net just as there is now. But retirement will be a periodic thing. You’ll be a journalist for 40 years or whatever and then you’ll be sick of it and you’ll retire on your savings or on a state pension, depending on what the system is. So after 20 years, golf will have lost its novelty value, and you’ll want to do something else with your life. You’ll get more retraining and education, and go and be a rock star for 40 years, and then retire again and so on.’

The mind reels. Will we want to be married to the same person for a thousand years? Will we need religion anymore? Will the planet fill to overflowing?”


My earliest childhood memory is of lying on the living room floor of my family’s home and trying to pick up Crayolas with my toes. I doubt there was a day in my life until recent years when I didn’t spend several hours holding a pen or pencil or marker (with my fingers, not toes). That’s what writers did. And there were unintended benefits: There seems to be a strong connection between penmanship and memory. Write a fact on a piece of paper and it’s much more likely you’ll recall that fact.

I can’t tell you the last time I held any writing utensil in my hand. Whether it’s doing a crossword puzzle or paying a bill or jotting down a note, a screen and keypad do the job. But I don’t fret over the change. Yes, memories and individuality are diminished in some ways in a paperless world, but I’ll accept the trade-off any day. Having crayons as a child was wonderful, but you know what else would have been great? Having access to the mountain of information that is accessible 24/7 to us all now. It’s a net win.

Not everyone agrees, however. In a Guardian essay, Philip Hensher urges the reclamation of penmanship. An excerpt:

“We have surrendered our handwriting for something more mechanical, less distinctively human, less telling about ourselves and less present in our moments of the highest happiness and the deepest emotion. Ink runs in our veins, and shows the world what we are like. The shaping of thought and written language by a pen, moved by a hand to register marks of ink on paper, has for centuries, millennia, been regarded as key to our existence as human beings. In the past, handwriting has been regarded as almost the most powerful sign of our individuality. In 1847, in an American case, a witness testified without hesitation that a signature was genuine, though he had not seen an example of the handwriting for 63 years: the court accepted his testimony.

Handwriting is what registers our individuality, and the mark which our culture has made on us. It has been seen as the unknowing key to our souls and our innermost nature. It has been regarded as a sign of our health as a society, of our intelligence, and as an object of simplicity, grace, fantasy and beauty in its own right. Yet at some point, the ordinary pleasures and dignity of handwriting are going to be replaced permanently.” (Thanks Browser.)


From Marek Kohn’s excellent Aeon essay about long-range planning, a passage about how the present might fall into ruin if we weren’t convinced of a future:

“How can you care about something you can’t imagine? For all but the most rigorous moral philosophers, caring requires more than a logical reckoning of duty. People need visions of things they feel attached to, or find beautiful, or moving. They have to be able to imagine a future the failure of which to materialise would feel like a loss. Points on the horizon that help people to see something in the far future may help them feel connected to it. They may also encourage people to believe that there actually will be a future.

After you have systematically cleared the horizon of time and it has faded to white, imagine what is likely to happen if you let someone else get their hands on your vacant landscape. Like as not, they will strew apocalypse all over it: ruins, mutants, scattered bands armed against each other. People seem irresistibly drawn to the end of the world — but if they catch glimpses of a future in which spiritual edifices or ancient documents endure, they might be more inclined to help secure it, and less inclined towards nihilistic fantasy.

They don’t have to have a view of the far horizon in order to factor the distant future’s interests into their actions. The interests of their children and grandchildren will be more alive in their minds: serving them may well serve those of more distant generations, too. But at this possibly critical moment, when our imaginative sympathies need all the help they can get, it’s worth trying to focus a 1,000-year stare.”


In a 1966 issue of Ramparts magazine, writer Howard Gossage tried to explain the teachings of Marshall McLuhan, whose book from two years earlier, Understanding Media: The Extensions of Man, had announced him as a media star with a message. An excerpt:

“McLuhan’s theory is that this is the first generation of the electronic age. He says they are different because the medium that controls their environment is not print — one thing at a time, one thing after another — as it has been for 500 years. It is television, which is everything happening at once, instantaneously, and enveloping.

A child who gets his environmental training on television — and very few nowadays do not — learns the same way any member of a pre-literate society learns: from the direct experience of his eyes and ears, without Gutenberg for a middle man. Of course they do learn how to read too, but it is a secondary discipline, not primary as it is with their elders. When it comes to shaping sensory perceptions, I’m afraid that Master Gutenberg just isn’t in the same class with General Sarnoff or Doctor Stanton.

Despite the uproar over inferior or inept television fare, McLuhan does not think that the program content of television has anything to do with the real changes TV has produced; no more than whether a book is trashy or a classic has anything to do with the process of reading it. The basic message of television is television itself, the process, just as the basic message of a book is print. As McLuhan says, ‘The medium is the message.’

This new view of our environment is much more realistic in the light of what has happened since the advent of McLuhan’s ‘Electric Age.’ The Gutenberg Age, which preceded it, was one thing after another in orderly sequence from cause to effect. It reached its finest flower with the development of mechanical linkages: A acts on B which acts on C which acts on D on down to the end of the line and the finished product. The whole process was thus fragmented into a series of functions, and for each function there was a specialist. This methodology was not confined to making things; it pervaded our entire economic and social system. It still does, though we are in an age when cause and effect are becoming so nearly simultaneous as to make obsolete all our accustomed notions of chronological sequence and mechanical linkage. With the dawn of the Electric Age, time and speed themselves have become of negligible importance; just flip the switch. Instant speed.

However, our methodology and thought patterns are still, for the most part, based on the old fragmentation and specialism, which may account for some of our society’s confusion, or perhaps a great deal of it.”


I’ve posted before about British gerontologist Aubrey de Grey, who brings a technologist’s approach to Ponce de León’s quest. (Thanks Next Big Future.)

SEALAB 1 was the U.S. Navy vessel created to conduct underwater exploration. In addition to running deep-sea experiments, it investigated the psychological strain of isolation on humans. Shaped like a ginormous dildo, the craft and its four crewmen were lowered into the waters off the coast of the Bahamas in 1964. Embedded is a short Navy doc about a voyage that would have delighted Jules Verne. Jackie Cooper hosts.


“We are entranced with our emotions, which are so easily observed in others and ourselves.” (Image by Kantele.)

Times of great ignorance are petri dishes for all manner of ridiculous myths, but, as we’ve learned, so are times of great information. The more things can be explained, the more we want things beyond explanation. And maybe for some people, it’s a need rather than a want. The opening of “Music, Mind and Meaning,” Marvin Minsky’s 1981 Computer Music Journal essay:

“Why do we like music? Our culture immerses us in it for hours each day, and everyone knows how it touches our emotions, but few think of how music touches other kinds of thought. It is astonishing how little curiosity we have about so pervasive an ‘environmental’ influence. What might we discover if we were to study musical thinking?

Have we the tools for such work? Years ago, when science still feared meaning, the new field of research called ‘Artificial Intelligence’ started to supply new ideas about ‘representation of knowledge’ that I’ll use here. Are such ideas too alien for anything so subjective and irrational, aesthetic, and emotional as music? Not at all. I think the problems are the same and those distinctions wrongly drawn: only the surface of reason is rational. I don’t mean that understanding emotion is easy, only that understanding reason is probably harder. Our culture has a universal myth in which we see emotion as more complex and obscure than intellect. Indeed, emotion might be ‘deeper’ in some sense of prior evolution, but this need not make it harder to understand; in fact, I think today we actually know much more about emotion than about reason.

Certainly we know a bit about the obvious processes of reason–the ways we organize and represent ideas we get. But whence come those ideas that so conveniently fill these envelopes of order? A poverty of language shows how little this concerns us: we ‘get’ ideas; they ‘come’ to us; we are ‘re-minded of’ them. I think this shows that ideas come from processes obscured from us and with which our surface thoughts are almost uninvolved. Instead, we are entranced with our emotions, which are so easily observed in others and ourselves. Perhaps the myth persists because emotions, by their nature, draw attention, while the processes of reason (much more intricate and delicate) must be private and work best alone.

The old distinctions among emotion, reason, and aesthetics are like the earth, air, and fire of an ancient alchemy. We will need much better concepts than these for a working psychic chemistry.

Much of what we now know of the mind emerged in this century from other subjects once considered just as personal and inaccessible but which were explored, for example, by Freud in his work on adults’ dreams and jokes, and by Piaget in his work on children’s thought and play. Why did such work have to wait for modern times? Before that, children seemed too childish and humor much too humorous for science to take them seriously.

Why do we like music? We all are reluctant, with regard to music and art, to examine our sources of pleasure or strength. In part we fear success itself– we fear that understanding might spoil enjoyment. Rightly so: art often loses power when its psychological roots are exposed. No matter; when this happens we will go on, as always, to seek more robust illusions!”


Technologists knew for the longest time that the world was going to be much more connected, that we would become a global hive. But what form would it take? Before Apple perfected ideas hatched at Xerox and brought them to the marketplace, a lot of people believed that television would be the medium that would unite us (with help from a phone connection, of course). TVs were already in every home, so even though it never came to pass, it made some sense.

From Bell Labs, a 1979 look at TVs and PCs connecting us:

The opening of David Barash’s New York Times piece about how parasites can manipulate the behavior of their hosts, whether that host is a bee or a human:

“ZOMBIE bees?

That’s right: zombie bees. First reported in California in 2008, these stranger-than-fiction creatures have spread to North Dakota and, just recently, to my home in Washington State.

Of course, they’re not really zombies, although they act disquietingly like them, showing abnormal behavior like flying at night (almost unheard-of in healthy bees), moving erratically and then dying. These ‘zombees’ are victims of a parasitic fly, Apocephalus borealis. The fly lays eggs within honeybees, inducing their hosts to make a nocturnal ‘flight of the living dead,’ after which the larval flies emerge, having consumed the bee from the inside out.

These events, although bizarre, aren’t all that unusual in the animal world. Many fly and wasp species lay their eggs inside hosts. What is especially interesting, and a bit more unusual, is the way an internal parasite not only feeds on its host, but also frequently alters its behavior, in a way that favors the continued survival and reproduction of the parasite.

Not all internal parasites kill their hosts, of course: pretty much every multicellular animal is home to numerous fellow travelers, each of which has its own agenda, which in some cases involves influencing, or taking control of, part or all of the body in which they temporarily reside.”


“We are going to be living in a world with tablets or flat screen computers on the walls in our bedrooms and kitchens.” (Image by Vergel Bradford.)

Let’s hope there’s an OFF switch when we are surrounded by screens and sensors that want to assist us without any prompting. It will be wonderful and it will be terrible. From Ben Popper at the Verge:

“It’s rare to meet a startup that is focused on building a business for a world which does not yet exist. But Expect Labs, which today announced a $2.4 million round of funding from Google Ventures and Greylock Partners, is doing just that. The company is creating a system that listens to and understands human conversation, then suggests relevant information without being prompted. ‘As the price of hardware falls, we are going to be living in a world with tablets or flat screen computers on the walls in our bedrooms and kitchens,’ says Expect Labs founder Timothy Tuttle. ‘These machines are going to listen to everything you say and be able to assist you with the right song, map or recipe, without you even having to ask.’”


Fun short about computer operations at NASA during the 1960s.

I’m all for rethinking education and making learning more engaging, but I have a problem with the underlying notion in Salman Khan’s new Time think-piece, which suggests hour-long lectures should be a thing of the past because students tune them out at the quarter-hour point.

I don’t think a post-literate world means one in which reading books will disappear; it’s just that there will be additional types of literacy. There’s room for more than one. Likewise, I don’t think a post-analog society means that we no longer have to press our brains beyond short-form critical thought, parceling out class time with the rigid efficiency of a computer. Sometimes the length of the lesson is as important as the lesson itself.

Life doesn’t come to us in the small bits we may desire. Some challenges require that we’ve learned patience, that we’ve forced ourselves to focus beyond what’s comfortable or convenient. If we place ourselves inside a construct that provides us with only what we want, we won’t get everything we need. From Khan’s essay:

“In 1996, in a journal called the National Teaching & Learning Forum, two professors from Indiana University — Joan Middendorf and Alan Kalish — described how research on human attention and retention speaks against the value of long lectures. They cited a 1976 study that detailed the ebbs and flows of students’ focus during a typical class period. Breaking the session down minute-by-minute, the study’s authors determined that students needed a three- to five-minute period of settling down, which would be followed by 10 to 18 minutes of optimal focus. Then — no matter how good the teacher or how compelling the subject matter — there would come a lapse. In the vernacular, the students would ‘lose it.’ Attention would eventually return, but in ever briefer packets, falling ‘to three- or four-minute [spurts] towards the end of a standard lecture,’ according to the report. This study focused on college students, and of course it was done before the age of texting and tweeting; presumably, the attention spans of younger people today have become even shorter, or certainly more challenged by distractions.” (Thanks Browser.)


I think the Singularity won’t be the moment machines surpass human knowledge but when carbon and silicon are integrated to achieve a reality greater than would be possible by either alone. In an excellent Aeon essay, David Deutsch considers the sources of the continuing inability of technologists to create truly conscious machines, which he sees as a crisis of philosophical thought as much as anything. The opening:

“It is uncontroversial that the human brain has capabilities that are, in some respects, far superior to those of all other known objects in the cosmos. It is the only kind of object capable of understanding that the cosmos is even there, or why there are infinitely many prime numbers, or that apples fall because of the curvature of space-time, or that obeying its own inborn instincts can be morally wrong, or that it itself exists. Nor are its unique abilities confined to such cerebral matters. The cold, physical fact is that it is the only kind of object that can propel itself into space and back without harm, or predict and prevent a meteor strike on itself, or cool objects to a billionth of a degree above absolute zero, or detect others of its kind across galactic distances.

But no brain on Earth is yet close to knowing what brains do in order to achieve any of that functionality. The enterprise of achieving it artificially — the field of ‘artificial general intelligence’ or AGI — has made no progress whatever during the entire six decades of its existence.” (Thanks Kurzweil.)


Technology wrecks some business models and builds some others. One category that has experienced a surprising boom: chess. From Kenneth Rogoff at Project Syndicate:

“A peculiar but perhaps instructive example comes from the world of professional chess. Back in the 1970’s and 1980’s, many feared that players would become obsolete if and when computers could play chess better than humans. Finally, in 1997, the IBM computer Deep Blue defeated world chess champion Garry Kasparov in a short match. Soon, potential chess sponsors began to balk at paying millions of dollars to host championship matches between humans. Isn’t the computer the world champion, they asked?

Today, the top few players still earn a very good living, but less than at the peak. Meanwhile, in real (inflation-adjusted) terms, second-tier players earn much less money from tournaments and exhibitions than they did in the 1970’s.

Nevertheless, a curious thing has happened: far more people make a living as professional chess players today than ever before. Thanks partly to the availability of computer programs and online matches, there has been a mini-boom in chess interest among young people in many countries.

Many parents see chess as an attractive alternative to mindless video games. A few countries, such as Armenia and Moldova, have actually legislated the teaching of chess in schools. As a result, thousands of players nowadays earn surprisingly good incomes teaching chess to children, whereas in the days before Deep Blue, only a few hundred players could truly make a living as professionals.”


The opening of a Venkat Rao post, in which he updates the country mouse-city mouse divide for our Digital Age:

“The fable of the town mouse and the country mouse is probably the oldest exploration of the tensions involved in urbanization, but it seems curiously dated today. The tensions explored in the fable — the simple, rustic pleasures and securities of country life versus the varied, refined pleasures and fears of town life — seem irrelevant today. In America at least, the ‘country,’ such as it is, has turned into a geography occupied by industrial forces. The countryside is a sparsely populated, mechanized food-and-resource cloud. A system of national parks, and a scattering of ‘charming’ small towns and villages pickled in nostalgia, are all that liven up a landscape otherwise swallowed up by automated modernity.

In America, larger provincial towns and cities that are just a little too large and unwieldy to be nostalgically pickled, but not large enough to be grown into metropolitan regions, appear to be mostly degenerating into meth-lab economies or ossifying into enclaves of a retreating rich.

So the entire canvas of the town mouse/country mouse fable is being gradually emptied out. If there is a divide today, it is between two new species of mice: metro mice and cloud mice.”

••••••••••

Jerry Mouse leaves the sticks to visit Manhattan, 1945:

