Excerpts


Curious that in the Information Age there’s still so much misinformation about potential pandemics. Epidemiology is vastly improved, but the public is often off-base in understanding medicine in our more quantified world, fearing life-saving vaccines while indulging in unhealthy behaviors. Schlocky journalism, a failure to develop critical thinking and our deep fear of horrible deaths conspire to make it so. From David Quammen in the New York Times:

“Humans die in large numbers every day, every hour, from heart failure and automobile crashes and the dreary effects of poverty; but strange new infectious diseases, even when the death tolls are low, call up a more urgent sort of attention. Why?

There’s a tangle of reasons, no doubt, but one is obvious: whenever an outbreak occurs, we all ask ourselves whether it might herald the Next Big One.

What I mean by the Next Big One is a pandemic of some newly emerging or re-emerging infectious disease, a global health catastrophe in which millions die. The influenza epidemic of 1918-19 was a big one, killing about 50 million people worldwide. The Hong Kong flu of 1968-69 was biggish, causing at least a million deaths. AIDS has killed some 30 million and counting. Scientists who study this subject — virologists, molecular geneticists, epidemiologists, disease ecologists — stress its complexity but tend to agree on a few points.

Yes, there probably will be a Next Big One, they say. It will most likely be caused by a virus, not by a bacterium or some other kind of bug.”


From “The Gray Tsunami,” Jeff Wheelwright’s new Discover article about the challenges attending the increasing longevity of the world’s population, a section about the Sun City retirement community in Arizona, an example of how some white Americans used to retire:

“Del Webb was no demographer, but in the late 1950s he saw an opportunity in America’s budding crop of elderly. Promoting the then-novel idea of ‘active retirement,’ Webb was a very active 60-year-old himself. Tall and lean, a vigorous golfer and baseball fan, he was a millionaire contractor with a common touch. The people who flocked to see his Sun City demonstration homes—100,000 showed up over New Year’s weekend in 1960—had had their fill of hard times. These were people who had lived through an economic depression and a world war. The advertisements for Sun City depicted a golden way of life in a place where they could retire and relax, where they would not be frail or sick.

Some of those ads now hang in the Sun City Historical Museum, which occupies one of the first homes to be built here, next to the first golf course. Two vintage golf carts, labeled Him and Her, stand side by side in the carport. Inside, the modest fixtures and furniture of a typical 1960s retired couple are on display. The original cinder-block structure consisted of five rooms totaling just 858 square feet; an addition was put on the back later. The small eat-in kitchen features a boxy electric range and fridge. The sink in the pink-tiled bathroom is very low and the toilet is minuscule, hardly suitable for today’s amplified Americans. The three academics smile as they look into the bathroom. ‘There are no handrails, nothing to grab onto,’ Glick says.

Sun City’s radical idea—to restrict home ownership to people 55 and older—effectively excluded families and children from the development. But recently the policy was updated. Now only one owner has to be over 55, this to accommodate residents with younger spouses. Getting back in the van and touring the quiet, curving streets, with their neat plantings and pink-tinted gravel, the ASU group sees no pregnant women or kids, no young people whatsoever. Sun City has a fertility rate of zero.

The fertility rate is the number of children an average female will produce in her lifetime. The panelists note that the rate is currently plunging in almost all countries around the world. True, it has not occurred in sub-Saharan Africa, not yet. But for those who specialize in the long view, fertility collapse and accelerated aging have supplanted overpopulation as the most salient demographic trend.” (Thanks Browser.)

•••••••••••

Sun City promotional film from the 1960s:


Supercharging microbes to devour particular types of waste–even the non-organic kind–makes too much sense not to happen. Of course, that’s easy for me to say, since I don’t have to come up with the science to enable the process. Until we perfect the method, we must employ workarounds. The city of Dallas, for instance, is trying to put a zero-waste recycling plan into effect by 2040. From Nick Swartsell in the New York Times:

“If J. R. Ewing can quit smoking and promote solar energy, anything is possible in Dallas, environmental advocates say, even an ambitious plan to have the city recycling nearly all of its garbage by 2040.

‘If Dallas can have a zero-waste plan, any city can,’ said Zac Trahan, the Dallas program manager at Texas Campaign for the Environment, a group challenging the city’s reputation for big oil, big cars and big sprawl. ‘It can really be a huge opportunity to move toward a more sustainable Texas.’

Before the last of the plastic bags, crumpled papers and other urban tumbleweeds head to the recycling plant, the city will have to determine when to put into place the various steps of its plan, which the Dallas City Council formally adopted on Aug. 22. It will also have to address the lingering concerns of advocacy groups and business interests, like unintended environmental consequences and unfinanced mandates.”


“Bada-bing.”

From “Cyber-Neologoliferation,” James Gleick’s fun 2006 New York Times Magazine article about his visit to the offices of the Oxford English Dictionary, an explanation of how the word “bada-bing” came to be listed in the OED:

“Still, a new word as of September is bada-bing: American slang ‘suggesting something happening suddenly, emphatically, or easily and predictably.’ The Sopranos gets no credit. The historical citations begin with a 1965 audio recording of a comedy routine by Pat Cooper and continue with newspaper clippings, a television news transcript and a line of dialogue from the first Godfather movie: ‘You’ve gotta get up close like this and bada-bing! you blow their brains all over your nice Ivy League suit.’ The lexicographers also provide an etymology, a characteristically exquisite piece of guesswork: ‘Origin uncertain. Perh. imitative of the sound of a drum roll and cymbal clash…. Perh. cf. Italian bada bene mark well.’ But is bada-bing really an official part of the English language? What makes it a word? I can’t help wondering, when it comes down to it, isn’t bada-bing (also badda-bing, badda badda bing, badabing, badaboom) just a noise? ‘I dare say the thought occurs to editors from time to time,’ Simpson says. ‘But from a lexicographical point of view, we’re interested in the conventionalized representation of strings that carry meaning. Why, for example, do we say Wow! rather than some other string of letters? Or Zap! Researching these takes us into interesting areas of comic-magazine and radio-TV-film history and other related historical fields. And it often turns out that they became institutionalized far earlier than people nowadays may think.'”


And to think that Bloomberg took away our barrels of soda for (perhaps) no reason. A growing number of studies show that overweight, and even obese, people who become ill often fare better than their thinner counterparts with the same diseases. From Harriet Brown in the New York Times:

“A few years ago, Mercedes Carnethon, a diabetes researcher at the Feinberg School of Medicine at Northwestern University, found herself pondering a conundrum. Obesity is the primary risk factor for Type 2 diabetes, yet sizable numbers of normal-weight people also develop the disease. Why?

In research conducted to answer that question, Dr. Carnethon discovered something even more puzzling: Diabetes patients of normal weight are twice as likely to die as those who are overweight or obese. That finding makes diabetes the latest example of a medical phenomenon that mystifies scientists. They call it the obesity paradox.

In study after study, overweight and moderately obese patients with certain chronic diseases often live longer and fare better than normal-weight patients with the same ailments. The accumulation of evidence is inspiring some experts to re-examine long-held assumptions about the association between body fat and disease.”


When I first became conscious of sports as a child, I was obsessed with boxing. But I was still a kid when Muhammad Ali lost his amazing speaking ability, and I could never watch the sport again. Ali was very important to me, not only as an athlete but for his politics. It doesn’t give him enough credit in his own right to say that he was, for me, a gateway drug to Malcolm X, but there’s a lot of truth to that statement. In fact, studying boxing matches that took place long before my birth taught me so much about history and race and politics and sociology. The sport had the same effect on millions of others. Boxing was king until it wasn’t. The shadiness of the promoters had something to do with its decline, but mostly it was watching these beloved figures grow shaky in their hands and voices.

Rich Cohen has an article in the New Republic about football’s future being threatened by the growing awareness of the sport’s unavoidable head injuries. It seems inconceivable that football could severely decline, given what a cash cow the NFL is, but, then again, no one is building insta-stadiums to handle overflowing boxing crowds anymore. An excerpt:

“The worry is not just that people will stop watching the game—it’s that parents will stop letting their kids play, starving the league of talent. Speaking on The Tonight Show, Terry Bradshaw, the great Steelers quarterback, predicted the demise of football, saying if he had a son, he would not let him sign up. ‘The fear of them getting these head injuries,’ he explained, ‘it’s just too great for me.’ Something similar happened to boxing, which was once the biggest sport in the United States. But the country evolved away from the ring, until boxing became a mirror of its own saddest character, the nobody, the palooka, the bum.”


I don’t know that our history is disappearing more quickly because so much of it is now reported and recorded online, but maybe we had unrealistic expectations about new technologies defeating the wasting away of information. I would assume that, on average, we collect and retain more info now than ever before. But the fraying of facts can only be kept at bay for so long–in our minds and in our machines. No matter how advanced the system, the system will eventually fail. From a post at MIT’s Technology Review about the Arab Spring vanishing into the Twitterplex:

“On 25 January 2011, a popular uprising began in Egypt that  led to the overthrow of the country’s brutal president and to the first truly free elections. One of the defining features of this uprising and of others in the Arab Spring was the way people used social media to organise protests and to spread news.

Several websites have since begun the task of curating this content, which is an important record of events and how they unfolded. That led Hany SalahEldeen and Michael Nelson at Old Dominion University in Norfolk, Virginia, to take a deeper look at the material to see how many of the shared links were still live.

What they found has serious implications. SalahEldeen and Nelson say a significant proportion of the websites that this social media points to has disappeared. And the same pattern occurs for other culturally significant events, such as the H1N1 virus outbreak, Michael Jackson’s death and the Syrian uprising.

In other words, our history, as recorded by social media, is slowly leaking away.”

I don’t think any of us will live to see a real understanding of human consciousness. The brain is too confusing, too confounding. We’ll get there eventually, but it’s going to be a long slog. Paul Allen is currently trying to reverse engineer the brain, fully aware of the mammoth challenge. From Matthew Herper in Forbes:

“Understanding the brain, Allen argues, is much like being a medieval blacksmith trying to reverse engineer a jet plane. It’s not just that you don’t understand how the wing attaches to the fuselage or what makes the engine go. You don’t even know the basic theory of how air going over a wing creates lift. ‘Moore’s Law-based technology is so much easier than neuroscience,’ Allen says. ‘The brain works in such a different way from the way a computer does. The computer is a very regular structure. It’s very uniform. It’s got a bunch of memory, and it’s got a little element that computes bits of memory and combines them with each other and puts them back somewhere. It’s a very simple thing.

‘So for someone to learn how to program a computer, in most cases, a human being can do it. You can start programming. I did it in high school. Me and Bill Gates and our friends did that. Probably in a few months we were programming and probably understood what there was to understand about computing within a few years of diving into it.’

In the human brain, designed by evolution, every tiny part is very different from every other tiny part. ‘It’s hideously complex,’ Allen says. And it’s going to take ‘decades and decades’ of more research to understand.”


From Kevin Kelly’s 1994 book, Out of Control: The New Biology of Machines, Social Systems, and the Economic World, which examined, among other things, how hive behavior in insects might be replicated in humans connected by technology:

“Ants, too, have hive mind. A colony of ants on the move from one nest site to another exhibits the Kafkaesque underside of emergent control. As hordes of ants break camp and head west, hauling eggs, larva, pupae — the crown jewels — in their beaks, other ants of the same colony, patriotic workers, are hauling the trove east again just as fast, while still other workers, perhaps acknowledging conflicting messages, are running one direction and back again completely empty-handed. A typical day at the office. Yet, the ant colony moves. Without any visible decision making at a higher level, it chooses a new nest site, signals workers to begin building, and governs itself.

The marvel of ‘hive mind’ is that no one is in control, and yet an invisible hand governs, a hand that emerges from very dumb members. The marvel is that more is different. To generate a colony organism from a bug organism requires only that the bugs be multiplied so that there are many, many more of them, and that they communicate with each other. At some stage the level of complexity reaches a point where new categories like ‘colony’ can emerge from simple categories of ‘bug.’ Colony is inherent in bugness, implies this marvel. Thus, there is nothing to be found in a beehive that is not submerged in a bee. And yet you can search a bee forever with cyclotron and fluoroscope, and you will never find the hive.”


The early promise of PCs in the 1970s, in the heyday of the Homebrew Computer Club, was that the individual would be master of the technology, not that we would queue up for “improved” iPhones handed down to us by a gigantic corporation every six months. Chris Anderson thinks the spirit of the Homebrew is regaining prominence and will be the future of American manufacturing. From Farhad Manjoo in Slate:

“As Anderson describes it, the new movement is built on three technological and social advances. First, there’s ‘rapid prototyping.’ Today you can design your world-changing widget on a computer, instantly make it real on a 3-D printer, and then go back to the drawing board to refine it. Second, because your designs are all standard CAD files, you can share them with others and borrow other people’s designs, allowing for everyone to improve their widgets through remixing. Finally, when you’ve perfected your widget, you can take advantage of firms like Kickstarter to raise money, then send your designs to commercial manufacturers that will produce your widget in bulk—even if bulk, for you, means you’re making only a few thousand of them.

When I chatted with Anderson recently, I asked him about the timeline of his vision. He thinks the maker movement is around where the PC industry was in the mid-1980s—somewhere between the release of the Apple II and the Mac, between a computer that was popular with hobbyists and one that was meant for everyone. Soon, we’ll have 3-D printers that cost about the same as paper printers, we’ll have 3-D design software that’s as easy to use as iMovie, and making physical things will take on the kind of cultural significance that making digital things did in the first dot-com boom. At that point, we’ll notice the products around us begin to change, Anderson says. A lot of what you’ll buy will still come from large companies that make mass-manufactured goods, but an increasing number of your products will be produced by ‘industrial artisans.’ These artisans will produce goods aimed for niche audiences—perhaps you’re a gardener who needs a specific kind of sprinkler head, or maybe you want computer speakers shaped like Mount Rushmore. Because they’ll be able to sell anywhere, and because their goods will command higher prices than mass-manufactured stuff, artisans will be able to build thriving small businesses from their inventions.”

•••••••••••

Homebrew at the Byte Shop in 1978:


In a recent Guernica interview conducted by Emily Brennan, Katherine Boo, that excellent New Yorker writer, addressed the moral complexity of reporting about poverty for a magazine aimed at those with considerable disposable income. An excerpt:

“Guernica: 

At a lecture at American Academy, you recounted that during your reporting on that evacuation shelter for The New Yorker a woman told you, ‘Wait, so you take our stories and put them in a magazine that rich people read, and you get paid and we don’t? That’s some backward-ass bluffiness, if you ask me.’ She seemed to sum up the moral dilemma that reporting on poverty raises. Can you speak to some of these ethical questions?

Katherine Boo: 

She said it better than I did. We take stories and purvey them to people with money. And in the conventions of my profession, which I try to adhere to, we can’t pay people for stories. Anyone with a conscience who does this work grapples with that reality, and if they don’t, I’d worry. I lie awake at night, and I think, ‘Am I exploiting them? Am I a vulture?’ All of the terrible names anyone could call me, I’ve called myself worse.

But if writing about people who are not yourself is illegitimate, then the only legitimate work is autobiography; and as a reader and a citizen, I don’t want to live in that world. Because if you take a kid like Sunil, who’s been denied the possibility of an education that allows him to write his own story, and all of the people who lack the means and access to do so, they go down the memory hole. They’re lost. What it comes down to is, the only thing worse than being a poverty reporter is if no one ever wrote about it at all. My work, I hope, helps people understand how much gets lost between the intellection of how to get people out of poverty and how it’s actually experienced.

One of the reasons I pore over official documents and reportage is because I’m fascinated by the chasm between the lives that people have and the way they’re officially recorded. In Annawadi, when people were killed, they were categorized as sickness deaths because the officials were corrupt, were extorting money from other people, didn’t care to investigate the deaths of no-account people, and so on. The tragedy is that the other children in Annawadi knew that these people were murdered, that their lives had no meaning, that they’d be classified and filed away. The corrosive effect of that knowledge is staggering. When you know that anything can happen to you, that there is no possibility of redress because of who you are, because you’re an embarrassment in this prosperous city, that’s tragic. Sunil knows people who’ve been killed and filed away, and he can’t bring that to life. But he can tell me and I can get the documents and do the work and bring it to life. And that’s a trade-off to make.”


The opening of Noah Smith’s hopeful new Atlantic article about solar erasing our carbon footprint:

“You may not believe me, but I have news about global warming: Good news, and better news.

Here is the good news. US carbon emissions are decreasing rapidly. We’re down over 10% from our emissions peak in 2007. Furthermore, the drop isn’t just a function of the Great Recession. Since 2010 our economy has been growing, but emissions have kept on falling. The reason? Natural gas. With the advent of ‘fracking’ technology, the price of gas has plummeted far below that of coal, and as a result, essentially no new coal plants are being built. Although gas does release carbon, it only releases about half as much as coal for the same amount of electricity. This is why — despite our failure to join the Kyoto Protocol or impose legal restrictions on CO2 — the United States is now outpacing the rest of the developed world in reducing our contribution to global warming.

Now for the better news. A technology is in the pipeline that has the potential to eliminate CO2 emissions entirely. Solar power, long believed to be unworkably expensive, has actually been falling in cost at a steady exponential rate of 7 percent per year for the last three decades straight. Because of this ‘Moore’s Law for solar,’ electricity from solar panels now costs less than twice as much as electricity from coal, and only about three times as much as electricity from gas. Furthermore, technologies now in the pipeline seem to ensure that the cost drop will continue. 

Within the decade, solar could be cheaper than coal. Within two decades, cheaper than gas. When that happens, assuming we also have electric cars, it is game over for carbon emissions.”
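For the arithmetically inclined, here is a rough back-of-the-envelope check of that timeline — a sketch only, assuming the excerpt’s figures hold: a steady 7 percent annual decline in solar’s cost, with solar starting at roughly twice the price of coal power and three times the price of gas power (illustrative ratios, not measured data):

```python
import math

# Back-of-the-envelope check of the article's claim, assuming (per the excerpt)
# solar electricity costs fall a steady 7% per year and currently run roughly
# 2x coal and 3x gas. These starting ratios are illustrative assumptions.
ANNUAL_DECLINE = 0.07

def years_to_parity(cost_ratio_vs_rival):
    # Solve cost_ratio * (1 - decline)^t = 1 for t.
    return math.log(cost_ratio_vs_rival) / -math.log(1.0 - ANNUAL_DECLINE)

print(f"Solar undercuts coal in ~{years_to_parity(2.0):.0f} years")  # ~10 years
print(f"Solar undercuts gas in ~{years_to_parity(3.0):.0f} years")   # ~15 years
```

Under those assumptions, the crossover points land roughly where the article puts them: within the decade for coal, within two for gas.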


I never had time to read this article before. It’s a 2003 Outside piece of participatory journalism about performance-enhancing drugs written by Stuart Stevens, Mitt Romney’s very embattled senior strategist. It’s actually quite good. An excerpt:

“He handed me a bottle of pills. It was Stanozolol, an anabolic steroid that lifters use to add muscle mass. This is one of the drugs that sprinter Ben Johnson was caught using at the 1988 Summer Olympics in Seoul, where he was subsequently stripped of his 100-meter gold medal.

‘Where do you get this?’ I said.

‘A vet I know,’ he answered casually. It took me a second to realize he meant veterinarian, not military veteran. ‘Vets and Mexican farmacias, that’s where you get the best stuff.’ I looked at the label on the bottle—these were literally animal pills. They’re used to bulk up livestock, and they’re banned from greyhound racing, where they’re given to dogs to make them stronger.

‘Start with this,’ he went on, spilling out several doses. ‘Good base, can’t go wrong.’ I must have looked shocked, because he gave me a friendly punch in the arm and said, ‘You want to get big, don’t you?’

That night at home I sat staring at the pills. Veterinarians? Mexican pharmacies? I shuddered and threw them out. I knew the only way I could play this game was under a doctor’s supervision.

THAT’S WHAT LED ME, a few weeks later, to Dr. Jones. He was an internist by training and a specialist in the hot new field of anti-aging medicine, which involves helping people—who are always affluent, since these treatments are expensive—try to stave off the effects of growing old with a combination of nutrition and drugs, including HGH, steroids, and testosterone. A doctor I knew had tipped me off, with a wink, that Dr. Jones also used these drugs to ‘work with a lot of athletes.’

Inside his waiting room, I’d squeezed in next to the World’s Largest Man and a woman who I thought might be an actress—though I couldn’t be certain, since she was wearing a hat and sunglasses indoors. The jumbo guy was somebody I was pretty sure spent Sunday afternoons chasing quarterbacks on television. Such people were, I would come to realize, the core of Dr. Jones’s business: athletes and attractive women of all ages. Plus rich guys over 50. And the odd Playmate or two. Oh, and me.”


“Merchants of doom emphasize fears of molecular Frankenbots instead of benefit.” (Image by Monsterteeth.)

From Delthia Ricks’ Discover argument in favor of unloosing synthetic biology experimentation, which has been held back by legitimate concerns but also by some outlandish sci-fi scenarios:

“Off-the-shelf molecular parts could allow synthetic biologists to create new medications and biofuels or to make microbes with the capacity to destroy pollutants and other nui­sances. Researchers have built a potential malaria medication, and students have developed a prototype of a new vaccine to stop ulcers.

Shamefully, accolades that resounded a generation ago for biotechnology advances—for instance, recombining DNA to develop human-derived insulin, which is much safer than the animal-derived products that came before—have been drowned out by a misinformed coalition of 114 organizations, including ETC Group and Friends of the Earth. They argue the research must stop until enforceable regulations specific to synthetic biology are in place, and they insist that all alternatives to synthetic biology be considered before an experiment can advance. These demands could halt projects like those of J. Craig Venter, the biotechnologist who built the first self-replicating synthetic bacterium. He is now working on microbes that eat pollution, excrete biofuels, and more. If the coalition has its way, the world will never find out whether these organisms can help us generate energy or clean the air.

There is no documented danger from synthetic biology, yet merchants of doom emphasize fears of molecular Frankenbots instead of benefits like new drugs and energy sources. Worries about monster species are particularly absurd. It is extraordinarily difficult to construct novel organisms, and countless attempts to do so have failed.”


At his blog, futurist Ray Kurzweil asks questions about “the hypothesis that chemical brain preservation may inexpensively preserve the organism’s memories and mental states after death.” An excerpt:

“Would you choose chemical brain preservation at death if it was widely available, validated, and inexpensive? If not, why not? Would you do it to donate your brain to science? Your memories to your children or others who might want them? Would you be willing to come back in person, if that turns out to be possible? If it is sufficiently inexpensive, would it be best to preserve your brain at death, and let future society decide if either your memories or your identity are ‘worth’ reanimating?”


Rust never sleeps and organic matter is apt to eventually decay. But sometimes that’s not a bad thing. From Jeff Gordinier’s New York Times piece about the growing popularity of fermented foods in fine dining:

“SAY this about Sandor Ellix Katz: the man knows how to get you revved up to eat bacteria.

‘Oh, this is nice kimchi,’ he said on a summer afternoon at Momofuku Noodle Bar, using chopsticks to pull crimson-coated knuckles of Napa cabbage from a jar. ‘I like the texture of the sauce. It’s kind of thick.’

Kimchi, like sauerkraut, is one of the world’s great fermented foods, and Mr. Katz, a resident of Tennessee, was curious to see what David Chang’s team of cooks in the East Village would do with it. Lately Mr. Katz has become for fermentation what Timothy Leary was for psychedelic drugs: a charismatic, consciousness-raising thinker and advocate who wants people to see the world in a new way.

A fermented food is one whose taste and texture have been transformed by the introduction of beneficial bacteria or fungi.”


I read this passage from Susan Daitch’s “Dispatches” section at Guernica and my head nearly exploded. How did I not know about this? I’ve read that Ota Benga was displayed briefly at the Bronx Zoo in 1906 until an outcry thankfully shut that exhibit down, but I had never heard of Carl Hagenbeck, his insane childhood or his human-centric dioramas. Nor did I know about the preponderance of private zoos in Europe, which were often poorly maintained. An excerpt:

“Hagenbeck’s father, a fishmonger with a side business in exotic animals, gave Carl, when still a child, a seal and a polar bear cub as presents. Hagenbeck displayed them in a tub and charged a few pfennigs to spectators interested in watching arctic mammals splash around. Eventually his collection grew so extensive he needed larger buildings to house them. These early entrepreneurial endeavors led to a career capturing, buying, and selling animals from all over the world, destined for European and even distant American zoos. Hagenbeck, known as ‘the father of the modern zoo,’ was a pioneer in the concept that animals should be displayed in some approximation of their natural habitat. Acknowledging little difference between humans (at least some humans) and animals in terms of questions of captivity and display, he also exhibited human beings: Eskimos, Lapps, Samoans, Africans, Arabs, Native Americans, all stationed in zoos across Europe in reproductions of their native environments. Creating panoramic fictional spaces for his creatures, Hagenbeck is often credited with being the originator of the amusement park. How these captive people felt about the peculiar dress, language and eating habits of the spectators who came to see them has not been recorded, as far as I know. European emissaries, whether propelled by diplomatic missions or for purposes of trade, went into the world and brought back artifacts, and instigated the concept of collecting for those who could afford it. German museums would come to display the Gate of Ishtar brought brick by brick from Baghdad, vast Chinese temples, Assyrian fortresses, and other treasures. Hagenbeck, a hybrid figure, ethnographer, zoologist, showman, anthropologist, capitalist, but also the son of a fishmonger, was not of this class of adventurer. A populist, okay, but also the question hangs in the margins: When did the Berlin Zoo stop displaying humans? 1931, I think, but I’m not sure.”


I recently posted a classic article about telepresence by MIT’s Marvin Minsky. Here’s the opening of a 1982 AI Magazine piece by the cognitive scientist, which considers the possibility of computers being able to think:

“Most people think computers will never be able to think. That is, really think. Not now or ever. To be sure, most people also agree that computers can do many things that a person would have to be thinking to do. Then how could a machine seem to think but not actually think? Well, setting  aside the question of what thinking actually is, I think that most of us would answer that by saying that in these cases, what the computer is doing is merely a superficial imitation of human intelligence. It has been designed to obey certain simple commands, and then it has been provided with programs composed of those commands. Because of this, the computer has to obey those commands, but without any idea of what’s happening.

Indeed, when computers first appeared, most of their designers intended them to do nothing but huge, mindless computations. That’s why the things were called ‘computers.’ Yet even then, a few pioneers — especially Alan Turing — envisioned what’s now called ‘Artificial Intelligence,’ or ‘AI.’ They saw that computers might possibly go beyond arithmetic, and maybe imitate the processes that go on inside human brains.

Today, with robots everywhere in industry and movie films, most people think AI has gone much further than it has. Yet still, ‘computer experts’ say machines will never really think. If so, how could they be so smart, and yet so dumb?”



Prototype of gloves that can say aloud what deaf people are signing. From Singularity Hub: “With the motto ‘We’re giving a voice to movements,’ Team QuadSquad came in first place for their glove prototype in the Software Design Competition of the 2012 Microsoft Imagine Cup, winning $25,000 and garnering interest across the world, including developers anxious to bring their expertise to the project. Now the Ukrainians have launched Enable Talk, a website that openly shares their ambitious vision, design documentation, and a business plan for how to bring the device to market. Furthermore, the team is looking into the possibility of enabling the same technology to allow cell phone conversations using the system. That could mean a new way for about 70 million people with hearing and speech impairment to verbally communicate and connect to people around them.”

“Your mother’s a filthy whore”:

A proposed workaround for global warming from the Philosopher’s Beard, which stresses pragmatism over moralizing:

“The science of climate change does set the parameters of the problem, even though it doesn’t dictate the correct solution. The greenhouse gas build-up cannot be wished away by the kind of pragmatic, social choice guided exercise I have been recommending. It must be dealt with in the medium term, but through the structural transformation of our carbon economy rather than global austerity. That will include both developing scalable technologies for removing CO2 from the atmosphere (such as genetically modified algae and trees) and reducing the carbon intensity of our high energy life-styles (for which we already have some existing technologies, such as nuclear power). But note that such innovations require no prior global agreement to set in train, but can be developed and pioneered by a handful of big industrial economies acting on the moral concerns of their own citizens.

A high price on carbon in a few large rich countries (preferably via a non-regressive carbon tax) supplemented with regulations where market forces have less bite (e.g. to force the construction industry to develop more energy efficient methods and materials) and research subsidies would provide the necessary incentives. Nor would these innovations require global agreement for take-up since they will be attractive on their own merits (clean, efficient, cheap). Developing countries burn dirty coal because it is cheap and their people need electricity. They don’t need a UN treaty to tell them to use cleaner technology if it is cheaper; but neither would they sign up to such a treaty if it were more expensive.

The pragmatic approach does not depend on reaching an impossible global agreement on a perfect solution requiring moral or political coercion. Instead it offers feasible paths through the moral storm while respecting the existing interests and values of the human beings concerned.” (Thanks Browser.)

The opening of Don Troop’s new Chronicle article about the ethics of roboticized warfare:

“The dawn of the 21st century has been called the decade of the drone. Unmanned aerial vehicles, remotely operated by pilots in the United States, rain Hellfire missiles on suspected insurgents in South Asia and the Middle East.

Now a small group of scholars is grappling with what some believe could be the next generation of weaponry: lethal autonomous robots. At the center of the debate is Ronald C. Arkin, a Georgia Tech professor who has hypothesized lethal weapons systems that are ethically superior to human soldiers on the battlefield. A professor of robotics and ethics, he has devised algorithms for an ‘ethical governor’ that he says could one day guide an aerial drone or ground robot to either shoot or hold its fire in accordance with internationally agreed-upon rules of war.”


“They could create a laser from silk.” (Image by Gerd A.T. Müller.)

From a report about progress in creating biodegradable (even edible) electronics from silk, by Philip Ball at the BBC:

“Electronic waste from obsolete phones, cameras, computers and other mobile devices is one of the scourges of this information age. The circuitry and packaging is not only non-biodegradable but is laced with toxic substances such as heavy metals. Imagine, then, a computer that can be disposed of by simply letting soil bacteria eat it – or even, should the fancy take you, by eating it yourself.

Biodegradable information technology is now closer to appearing on the menu following the announcement by Fiorenzo Omenetto of Tufts University in Massachusetts, United States, and co-workers that they could create a laser from silk.”


The opening of “Telepresence,” Marvin Minsky’s 1980 Omni think-piece which suggested we should bet our future on a remote-controlled economy:

“You don a comfortable jacket lined with sensors and muscle-like motors. Each motion of your arm, hand, and fingers is reproduced at another place by mobile, mechanical hands. Light, dexterous, and strong, these hands have their own sensors through which you see and feel what is happening. Using this instrument, you can ‘work’ in another room, in another city, in another country, or on another planet. Your remote presence possesses the strength of a giant or the delicacy of a surgeon. Heat or pain is translated into informative but tolerable sensation. Your dangerous job becomes safe and pleasant.

The crude robotic machines of today can do little of this. By building new kinds of versatile, remote-controlled mechanical hands, however, we might solve critical problems of energy, health, productivity, and environmental quality, and we would create new industries. It might take 10 to 20 years and might cost $1 billion—less than the cost of a single urban tunnel or nuclear power reactor or the development of a new model of automobile.”


If you read this blog with any regularity you can understand that a story about the most famous pedestrian of the 1870s might have a special place in my heart. Still, this Grantland article by Brian Phillips about a walking wonder named Edward Payson Weston is wonderful on its own merits. The opening:

“In the summer of 1856, Edward Payson Weston was struck by lightning and fired from his job at the circus. He was 17 years old and had been traveling with the big top for no more than a few weeks — ‘under an assumed name,’ as he reassured the readers of his 1862 memoir, The Pedestrian. One day, as the troupe’s wagons passed near Tyngsborough, Massachusetts, he was ‘affected by a stroke of lightning’ and nearly killed. Nineteenth-century circus managers were about as tenderhearted as you would expect when it came to physical infirmity. When Weston was too sick to perform in Boston a few days later, he was unceremoniously sacked.

For most of us, being hit by lightning and kicked out of the circus would be an extraordinary turn of events. For Weston, it was a pretty typical week. Weston, whose story is recounted in the spectacularly entertaining book A Man in a Hurry, by the British trio of Nick Harris, Helen Harris, and Paul Marshall, lived one of those fevered American lives that seem to hurtle from one beautiful strangeness to the next. By his mid-teens, he had already: worked on a steamship; sold newspapers on the Boston, Providence, and Stonington Railroad; spent a year crisscrossing the country with the most famous traveling musicians in America, the Hutchinson Family Singers, selling candy and songbooks at their concerts; and gone into business for himself as a journalist and publisher. In his 20s and 30s, he somehow became one of the most celebrated athletes in the English-speaking world despite the fact that he was physically unprepossessing — 5-foot-7, 130 pounds, with a body resembling ‘a baked potato stuck with two toothpicks,’ as one journalist wrote — and that his one athletic talent was walking. Just straight-up walking made Weston, for a while, probably the biggest sports star on earth.”


From Ashlee Vance’s well-rounded Businessweek portrait of Elon Musk, a brilliant and difficult man who is currently the most ambitious industrialist in the world:

“On the assumption that people will be living on earth for some time, Musk is cooking up plans for something he calls the Hyperloop. He won’t share specifics but says it’s some sort of tube capable of taking someone from downtown San Francisco to Los Angeles in 30 minutes. He calls it a ‘fifth mode of transportation’—the previous four being train, plane, automobile, and boat. ‘What you want is something that never crashes, that’s at least twice as fast as a plane, that’s solar powered and that leaves right when you arrive, so there is no waiting for a specific departure time,’ Musk says. His friends claim he’s had a Hyperloop technological breakthrough over the summer. ‘I’d like to talk to the governor and president about it,’ Musk continues. ‘Because the $60 billion bullet train they’re proposing in California would be the slowest bullet train in the world at the highest cost per mile. They’re going for records in all the wrong ways.’ The cost of the SF-LA Hyperloop would be in the $6 billion range, he says.

Musk is also planning to develop a new kind of airplane: ‘Boeing just took $20 billion and 10 years to improve the efficiency of their planes by 10 percent. That’s pretty lame. I have a design in mind for a vertical liftoff supersonic jet that would be a really big improvement.’

After a few hours with Musk, hypersonic tubes and jets that take off like rockets start to seem imminent. But interplanetary travel? Really? Musk says he’s on target to get a spacecraft to the red planet in 10 to 15 years, perhaps with him on board. ‘I would like to die on Mars,’ he says. ‘Just not on impact.’”

