Excerpts


Jonah Lehrer has an intriguing, counterintuitive argument in the Wall Street Journal, which posits that creativity isn’t just something for the chosen few, but a quality everyone possesses. Embedded in the article is the origin story of the Post-It Note. An excerpt:

“Consider the case of Arthur Fry, an engineer at 3M in the paper products division. In the winter of 1974, Mr. Fry attended a presentation by Spencer Silver, an engineer working on adhesives. Mr. Silver had developed an extremely weak glue, a paste so feeble it could barely hold two pieces of paper together. Like everyone else in the room, Mr. Fry patiently listened to the presentation and then failed to come up with any practical applications for the compound. What good, after all, is a glue that doesn’t stick?

On a frigid Sunday morning, however, the paste would re-enter Mr. Fry’s thoughts, albeit in a rather unlikely context. He sang in the church choir and liked to put little pieces of paper in the hymnal to mark the songs he was supposed to sing. Unfortunately, the little pieces of paper often fell out, forcing Mr. Fry to spend the service frantically thumbing through the book, looking for the right page. It seemed like an unfixable problem, one of those ordinary hassles that we’re forced to live with.

But then, during a particularly tedious sermon, Mr. Fry had an epiphany. He suddenly realized how he might make use of that weak glue: It could be applied to paper to create a reusable bookmark! Because the adhesive was barely sticky, it would adhere to the page but wouldn’t tear it when removed. That revelation in the church would eventually result in one of the most widely used office products in the world: the Post-it Note.”


Printed knowledge made it possible for us to be intellectuals, to shift attention from land to library, even encouraged us to do so. But what does having a smartphone, with its icons and abbreviations, as our chief means of information gathering portend for us? And what does it mean for the next few generations born to its efficacy?

If we’re mainly typing with our thumbs, mostly communicating in short bursts of letters and symbols, relying largely on algorithms, who are we then? Will an ever-increasing reliance on technology dumb us down? Or will it ultimately free us of ideologies, those abstractions that often conflict violently with reality? And if so, is that a good thing?

In the excellent 2011 Wired article, “The World of the Intellectual vs. the World of the Engineer,” Timothy Ferris makes an argument for rigorous science over intellectual theory. I don’t completely agree with him. Democracy was a pretty potent and wonderful abstraction before it became a reality. And saying that Freud “discovered nothing and cured nobody” isn’t exactly fair, since the recognition that unconscious forces help drive our behavior has been an invaluable tool. But Ferris still makes a fascinating argument. An excerpt:

“Being an intellectual had more to do with fashioning fresh ideas than with finding fresh facts. Facts used to be scarce on the ground anyway, so it was easy to skirt or ignore them while constructing an argument. The wildly popular 18th-century thinker Jean-Jacques Rousseau, whose disciples range from Robespierre and Hitler to the anti-vaccination crusaders currently bringing San Francisco to the brink of a public health crisis, built an entire philosophy (nature good, civilization bad) on almost no facts at all. Karl Marx studiously ignored the improving living standards of working-class Londoners — he visited no factories and interviewed not a single worker — while writing Das Kapital, which declared it an ‘iron law’ that the lot of the proletariat must be getting worse. The 20th-century philosopher of science Paul Feyerabend boasted of having lectured on cosmology ‘without mentioning a single fact.’

Eventually it became fashionable in intellectual circles to assert that there was no such thing as a fact, or at least not an objective fact. Instead, many intellectuals maintained, facts depend on the perspective from which they are adduced. Millions were taught as much in schools; many still believe it today.

Reform-minded intellectuals found the low-on-facts, high-on-ideas diet well suited to formulating the socially prescriptive systems that came to be called ideologies. The beauty of being an ideologue was (and is) that the real world with all its imperfections could be criticized by comparing it, not to what had actually happened or is happening, but to one’s utopian visions of future perfection. As perfection exists neither in human society nor anywhere else in the material universe, the ideologues were obliged to settle into postures of sustained indignation. ‘Blind resentment of things as they were was thereby given principle, reason, and eschatological force, and directed to definite political goals,’ as the sociologist Daniel Bell observed.

While the intellectuals were busy with all that, the world’s scientists and engineers took a very different path. They judged ideas (‘hypotheses’) not by their brilliance but by whether they survived experimental tests. Hypotheses that failed such tests were eventually discarded, no matter how wonderful they might have seemed to be. In this, the careers of scientists and engineers resemble those of batters in major-league baseball: Everybody fails most of the time; the great ones fail a little less often.”

••••••••••

Intellectualism allows for uncertainty:


Jane Jacobs saw patterns in the disorder, benevolence in so-called blight, sublimity in street life, angels in the anarchic, and she was (thankfully) a huge pain in the ass until others could see the same. The opening of her landmark 1958 Fortune story, “Downtown Is for People”:

“This year is going to be a critical one for the future of the city. All over the country civic leaders and planners are preparing a series of redevelopment projects that will set the character of the center of our cities for generations to come. Great tracts, many blocks wide, are being razed; only a few cities have their new downtown projects already under construction; but almost every big city is getting ready to build, and the plans will soon be set.

What will the projects look like? They will be spacious, parklike, and uncrowded. They will feature long green vistas. They will be stable and symmetrical and orderly. They will be clean, impressive, and monumental. They will have all the attributes of a well-kept, dignified cemetery. And each project will look very much like the next one: the Golden Gateway office and apartment center planned for San Francisco; the Civic Center for New Orleans; the Lower Hill auditorium and apartment project for Pittsburgh; the Convention Center for Cleveland; the Quality Hill offices and apartments for Kansas City; the downtown scheme for Little Rock; the Capitol Hill project for Nashville. From city to city the architects’ sketches conjure up the same dreary scene; here is no hint of individuality or whim or surprise, no hint that here is a city with a tradition and flavor all its own.

These projects will not revitalize downtown; they will deaden it.”

•••••••••••

“For the answer, we went to the lady over there,” 1969:


There was a theory during Web 1.0, when possibilities seemed infinite, that in the near term the majority of workers would either telecommute or plop down in offices that had communal desks. Why would anyone need to be anchored when the Internet had connected us virtually and set us free? More people certainly telecommute today and many offices have far more democratic architecture, though anonymous desks still aren’t the norm.

But what if the thing that liberated us made us more anchored in other ways? What if being so connected means that our impetus to shift our material lives for new stimuli and opportunities has diminished? In an otherwise interesting Opinion article by Victoria and Todd G. Buchholz in the New York Times about the rising number of young Americans remaining moored in their home states–a trend that’s been increasing since before the recession–too little attention is paid to the role new media has played in reducing relocation in the U.S. I would assume a significant share of that demographic change has to be assigned to technology: More money is spent on tablets, less on motorcycles. At any rate, here’s an excerpt from the essay:

“The likelihood of 20-somethings moving to another state has dropped well over 40 percent since the 1980s, according to calculations based on Census Bureau data. The stuck-at-home mentality hits college-educated Americans as well as those without high school degrees. According to the Pew Research Center, the proportion of young adults living at home nearly doubled between 1980 and 2008, before the Great Recession hit. Even bicycle sales are lower now than they were in 2000. Today’s generation is literally going nowhere. This is the Occupy movement we should really be worried about.”


The opening of “Who Would God Vote For?,” a blog post by the BBC’s always provocative Adam Curtis about the intersection of religion and politics in America and Iran, which began in earnest in the 1970s:

“When you bring God into politics very strange things happen. You can see this now in both America and Iran –  in their elections and also in the growing confrontation between them. But it wasn’t always like this – in fact for most of the 20th century fundamentalist religion in both America and Iran had turned its back on the world of politics and power.

But in the 1970s everything changed. For that was the moment when religion was deliberately brought into politics in both countries with the aim of using it as a revolutionary force. And those who did this – Khomeini in Iran, and right-wing activists in America – were inspired by the revolutionary theories and organisations of the left and their ambition to transform society in a radical way.

I want to tell the forgotten story of how this happened – and how in the 1980s both the Americans and the Iranian idealists came together in a very odd way – with disastrous consequences.”


From “The Last Famine,” Paul Salopek’s sweeping Foreign Policy piece on hunger, a contemporary portrait of legendary paleoanthropologist Richard Leakey:

“The Turkana Basin is a freakishly beautiful place. A gargantuan wilderness of hot wind and thorn stubble, it covers all of northwestern Kenya and spills into neighboring Uganda, Ethiopia, and South Sudan. Black volcanoes knuckle up from its pale-ocher horizons. Lake Turkana — the largest alkaline lake in the world, 150 miles long — pools improbably in its arid heart. The lake is sometimes referred to, romantically, as the Jade Sea; from the air, its brackish waters appear a bad shade of green, like tarnished brass. Turkana, Pokot, Gabra, Daasanach, and other cattle nomads eke out a marginal existence around its shores. The basin’s dry sediments, which form part of the Great Rift Valley, hold a dazzling array of hominid remains. Because of this, the Turkana badlands are considered one of the cradles of our species.

Richard and Meave Leakey, the scions of the eminent Kenyan fossil-hunting family, have been probing deep history here for 45 years. Their oldest discovery, a pre-human skull, about 4 million years old, was found on a 1994 expedition led by Meave. An earlier dig headed by Richard uncovered a fabulous, nearly intact skeleton of Homo ergaster, dating back 1.6 million years, dubbed the Turkana Boy. Both Leakeys told me the modern landscape had changed nearly beyond recognition since their excavations began in the 1960s. The influx of food aid and better medical services had more than tripled the human population and stripped the region of most of its wild meat, wiping out the local buffalo, giraffe, and zebra. Domestic livestock — exploding and then crashing with successive droughts — had scalped the savannas’ fragile grasses. While driving one day near his headquarters, the Turkana Basin Institute, Richard pointed at a dusty cargo truck, its bed piled high with illegally cut wood. ‘Charcoal for the Somali refugee camps,’ he said with a puckish smile. ‘The U.N. pays for it.’

Leakey is not only a celebrity thinker. He is also an incorrigible provocateur and a man of big and restless ambitions. Bored with the squabbles of academic research (‘I could never go back to measuring one tooth against another’), he abandoned the summit of paleoanthropology in the late 1980s to assume the directorship of Kenya’s enfeebled wildlife service, where he became a hero to conservationists by ordering elephant poachers shot on sight. A few years later, he helped organize Kenya’s first serious opposition party, and those activities invited years of police harassment. (A 1993 plane crash, which Leakey blames on sabotage, resulted in the loss of both his legs below the knee; he now gets around — driving Land Rovers, piloting planes — on artificial feet.) At one point, I asked him about heavy bandages on his head and hands at a recent lecture at New York’s Museum of Natural History. He had been suffering from skin cancers, he explained, that metastasized from old police-baton injuries. Leakey tends to view humankind through a very long lens, and pessimistically.”

••••••••••

Two Richards, Leakey and Dawkins:



From a list of urban innovations at the Next Web, a section about using pedestrian energy to illuminate streetlights:

“Think about all the energy expended by pedestrians walking down the street. What if it could be harnessed to power street lighting? The Viha concept involves an electro-active slab with a lower portion that is embedded in the floor, and a mobile upper part that can produce energy.

While it won’t be able to replace power stations, this solution is designed to produce enough energy for use in the immediate vicinity, thus taking strain off the power grid and reducing the power bill of the local authority. In particularly crowded areas, this would work well.”

••••••••••

The Viha concept in practice:

I was thinking of Doris Lessing’s short novel about a ghastly offspring, The Fifth Child, for no real reason yesterday, so “Was Frankenstein Really About Childbirth?,” Ruth Franklin’s excellent New Republic article, seems particularly resonant to me today. The opening:

“I have no doubt of seeing the animal today,” Mary Wollstonecraft wrote hastily to her husband, William Godwin, on August 30, 1797, as she waited for the midwife who would help her deliver the couple’s first child. The “animal” was Mary Wollstonecraft Godwin, who would grow up to be Mary Shelley, wife of the Romantic poet Percy Bysshe Shelley and author of Frankenstein, one of the most enduring and influential novels of the nineteenth century. But Wollstonecraft would not live to see her daughter’s fame: She died of an infection days after giving birth.

The last notes that Wollstonecraft wrote to Godwin are included in the exhibition “Shelley’s Ghost: The Afterlife of a Poet,” which began last year at the Bodleian Library in Oxford and has now come to the New York Public Library. On display are numerous artifacts both personal and literary from the lives of the Shelleys, including manuscript pages from the notebook in which Mary wrote Frankenstein (with editing in the margins by her husband), which have never before been shown publicly in the United States. But it was Wollstonecraft’s scribbled note, in which she referred to her baby as “the animal”— the same word that the scientist in Frankenstein would use to describe his own notorious creation—that gave me pause. Could the novel—commonly understood as a fable of masculine reproduction, in which a man creates life asexually—also be a story about pregnancy?•


On International Women’s Day, here’s a 1973 John Chancellor report about the “Battle of the Sexes” tennis match between Billie Jean King and Bobby Riggs, which was hokey as a sporting event but used pitch-perfect hoopla on an Ali scale to become a huge national attraction. King triumphed in the Astrodome in straight sets: 6-4, 6-3, 6-3.


In his new Atlantic piece, “We’re Underestimating the Risk of Human Extinction,” Ross Andersen conducts a smart interview with Oxford’s Professor Nick Bostrom about the possibility of a global Easter Island, in which all of humanity vanishes from Earth. The conversation focuses on the threats we face not from our stars but from ourselves. I think Bostrom’s attitude is too dire, but he only has to be right once, of course. An excerpt:

What technology, or potential technology, worries you the most?

Bostrom: Well, I can mention a few. In the nearer term I think various developments in biotechnology and synthetic biology are quite disconcerting. We are gaining the ability to create designer pathogens and there are these blueprints of various disease organisms that are in the public domain—you can download the gene sequence for smallpox or the 1918 flu virus from the Internet. So far the ordinary person will only have a digital representation of it on their computer screen, but we’re also developing better and better DNA synthesis machines, which are machines that can take one of these digital blueprints as an input, and then print out the actual RNA string or DNA string. Soon they will become powerful enough that they can actually print out these kinds of viruses. So already there you have a kind of predictable risk, and then once you can start modifying these organisms in certain kinds of ways, there is a whole additional frontier of danger that you can foresee.

In the longer run, I think artificial intelligence—once it gains human and then superhuman capabilities—will present us with a major risk area. There are also different kinds of population control that worry me, things like surveillance and psychological manipulation pharmaceuticals.”



In his two years coaching the erstwhile laughingstock New York Knicks in the late ’80s, the forward-thinking Rick Pitino had his young and athletic players operate a full-court press on every play. It didn’t make them an NBA champion, but it maximized the talent at hand and earned the team its first division title in two decades. Years later, Malcolm Gladwell published “How David Beats Goliath” in the New Yorker, making a convincing case that this strategy was sound and should be employed more. But conventional wisdom is hard to shake, so no one has ever tried it again, and current Knicks coach Mike D’Antoni has taken a lot of flak for his similarly aggressive “Seven Seconds or Less” offensive scheme.

I wonder if a young and athletic NFL team should use similar aggression. You couldn’t blitz on every play because offenses are too proficient and rules are stacked in favor of scoring. But how about this for a team that went 2-14 last year: Never punt from anywhere outside your own 35-yard line. Never kick field goals inside the other team’s 35 except during the last two minutes of a game, when it would give you the lead.

It’s unlikely we’d ever see such an experiment in a league where most coaches get flustered by basic time management, but a gradual shift toward making much better use of four downs is possible. From Brian Burke’s smart Slate piece on the topic:

“There are many doubters when it comes to four-down football. If you’re in that camp, indulge me in a quick thought experiment. Let’s imagine a football world where the punt and field goal had never been invented. (Sorry, Ray Guy and Jan Stenerud.) In this universe, there would be no second-guessing: Teams would go for it on every fourth down.

Then one day, some smart guy invents the punt and approaches a head coach with his new idea. ‘Hey coach,’ he’d say, ‘instead of trying for a first down every time, let’s voluntarily give the ball to the other team.’ Our coach would be incredulous at this suggestion. ‘You want me to give up 25 percent of our precious downs for just 35 yards of field position? Do you have any idea how difficult it would be for us to score?’ And the coach would be right.

Since that’s not how the game evolved, our thinking about fourth-down strategy is a lot different. But as the sport has changed, with offense securing a firm upper hand over defense, coaches need to rethink their fourth-down orthodoxy. Accounting for interceptions, teams netted around 3.5 yards per pass attempt in 1977, back when many of today’s head coaches were playing and learning the sport. In the modern game, teams are almost twice as productive when they throw the ball, netting close to six yards per passing attempt. As it gets easier for offenses to move down the field, possession (and maintaining possession by going for it on fourth down) becomes more important and field position becomes less important.”
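Burke’s case is, at bottom, an expected-value calculation: weigh the chance of converting against the value of the field position a punt buys. Purely as a back-of-the-envelope illustration, here is a minimal Python sketch of that comparison. The conversion probability and expected-points figures are assumptions invented for a hypothetical fourth-and-2 near midfield; they are not numbers from Burke’s piece.

```python
# Toy expected-value comparison for a fourth-down decision.
# All inputs are assumed, illustrative values (in expected points), not Burke's model.

def expected_points_go_for_it(p_convert, ep_if_convert, ep_if_fail):
    """EV of going for it: weighted average of converting vs. turning it over on downs."""
    return p_convert * ep_if_convert + (1 - p_convert) * ep_if_fail

def expected_points_punt(ep_after_punt):
    """EV of punting: whatever the field-position swap is worth."""
    return ep_after_punt

if __name__ == "__main__":
    # Hypothetical fourth-and-2 near midfield:
    ev_go = expected_points_go_for_it(p_convert=0.55, ep_if_convert=2.0, ep_if_fail=-1.5)
    ev_punt = expected_points_punt(ep_after_punt=-0.3)
    print(f"go for it: {ev_go:+.2f} expected points")
    print(f"punt:      {ev_punt:+.2f} expected points")
```

With these made-up inputs, going for it comes out ahead, which is the general direction of Burke’s argument; the real decision, of course, turns on the actual probabilities for a given down, distance, and field position.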



In the Atlantic, celebrity scientist Neil deGrasse Tyson discusses his new book about America’s foundering space program. An excerpt:

You write that space exploration is a ‘necessity.’ Why do you think others don’t agree?

I don’t think they’ve thought it through. Most people who don’t agree say, ‘We have problems here on Earth. Let’s focus on them.’ Well, we are focusing on them. The budget for social programs in the federal tax base is 50 times greater than it is for NASA. We’re already focused on them in ways that many NASA naysayers would prefer. NASA is getting half a penny on a dollar — I’m saying let’s double it. A penny on a dollar would be enough to have a real Mars mission in the near future.

Can the United States catch up in the 21st-century space race?

When everyone agrees to a single solution and a single plan, there’s nothing more efficient in the world than an efficient democracy. But unfortunately the opposite is also true: there’s nothing less efficient in the world than an inefficient democracy. That’s when dictatorships and other sorts of autocratic societies can pass you by while you’re bickering over one thing or another.

But, I can tell you that when everything aligns, this is a nation where people are inventing the future every day. And that future is brought to you by scientists, engineers, and technologists. That’s how I’ve always viewed it. Once people understand that, I don’t see why they wouldn’t say, ‘Sure, let’s double NASA’s budget to an entire penny on a dollar! And by the way, here’s my other 25 pennies for social programs.’ I think it’s possible and I think it can happen, but people need to stop thinking that NASA is some kind of luxury project that can be done on disposable income that we happen to have left over. That’s like letting your seed corn rot in the storage basin.”


Fischer, Yugoslavia, 1961.

Long before anyone knew that the earth beneath his feet would shake ever more violently with time, chess champion Bobby Fischer seemed like an eccentric winner–though definitely a winner. The paranoid accusations and erratic mood swings, however, weren’t merely gamesmanship or arrogance but harbingers of a serious mental illness that would eventually manifest itself in antisemitism and derangement. The opening of Brad Darrach’s 1971 Life profile, “Bobby Is a Ferocious Winner,” at a time when he was still considered combustible by nature rather than condemned by it:

“Angry voices rattled the door to Bobby Fischer’s hotel room as I raised my hand to knock. ‘Goddamnit, I’m sick of it!’ I heard Bobby shouting. ‘I’m sick of seeing people! I got to work, I got to rest! Why didn’t you ask me before you set up all those appointments? To hell with them!’ Then I heard the mild and dignified executive director of the U.S. Chess Federation addressing the man who may well be the greatest chess player in world history in a tone just slightly lower than a yell: ‘Bobby, ever since we came to Buenos Aires I’ve done nothing but take care of you, day and night. You ungrateful—-!’

It was 3 p.m., a bit early for Fischer to be up. Ten minutes later, finding the hall silent, I risked a knock and Fischer cracked the door. ‘Oh yeah, the guy from Life. Come on in.’ His smile was broad and boyish but his eyes were wary. Tall, wide and flat, with a head too small for his big body, he put me in mind of a pale transhuman sculpture by Henry Moore. I had seen him twice before but never so tired.

Just inside the door I stopped short. The room looked like a terminal moraine of bachelorhood. Bedclothes in tortured piles on the floor. Socks, underwear, bags, newspapers, magazines jumbled on the spare bed. Boxes stacked all over the couch, and on the floor between the beds a single graceful banana peel. The only clean place in the room was a small table by the window, where a set of handsome wooden chessmen had been set up for play. Serenely beautiful, an altar in the debris of battle.”

 


The opening of the new Economist article, “Computing with Soup,” which examines the medical benefits of using DNA-laced liquid instead of silicon chips:

“EVER since the advent of the integrated circuit in the 1960s, computing has been synonymous with chips of solid silicon. But some researchers have been taking an alternative approach: building liquid computers using DNA and its cousin RNA, the naturally occurring nucleic-acid molecules that encode genetic information inside cells. Rather than encoding ones and zeroes into high and low voltages that switch transistors on and off, the idea is to use high and low concentrations of these molecules to propagate signals through a kind of computational soup. 

Computing with nucleic acids is much slower than using transistors. Unlike silicon chips, however, DNA-based computers could be made small enough to operate inside cells and control their activity. ‘If you can programme events at a molecular level in cells, you can cure or kill cells which are sick or in trouble and leave the other ones intact. You cannot do this with electronics,’ says Luca Cardelli of Microsoft’s research centre in Cambridge, England, where the software giant is developing tools for designing molecular circuits.” (Thanks Browser.)
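A minimal sketch of the signal-as-concentration idea the article describes, in Python rather than in chemistry: logical values are read off molecular concentrations with a threshold, and a gate produces a high output concentration only when its inputs are high. The threshold and concentration values are arbitrary assumptions, and this is a cartoon of the behaviour, not a model of real nucleic-acid kinetics or of Microsoft’s circuit-design tools.

```python
# Toy model: signals are molecular concentrations (arbitrary units), not voltages.
# A species counts as logical 1 when its concentration exceeds THRESHOLD.

THRESHOLD = 0.5  # assumed cutoff between "low" and "high" concentration

def is_high(concentration: float) -> bool:
    return concentration > THRESHOLD

def and_gate(conc_a: float, conc_b: float) -> float:
    """Output concentration is high only when both input concentrations are high."""
    return 0.9 if is_high(conc_a) and is_high(conc_b) else 0.05

if __name__ == "__main__":
    for a, b in [(0.8, 0.9), (0.8, 0.1), (0.1, 0.1)]:
        out = and_gate(a, b)
        print(f"[A]={a:.1f} [B]={b:.1f} -> [OUT]={out:.2f} (reads as {'1' if is_high(out) else '0'})")
```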

Wilbur Wright, flight around Statue of Liberty, 1909.

Look up in the sky and you will see neither bird nor plane but the future–it is here now. From Farhad Manjoo’s new Slate piece, “I Love You, Killer Robots”:

“It was way back in May 2010 that I first spotted the flying drones that will take over the world. They were in a video that Daniel Mellinger, one of the robots’ apparently too-trusting creators, proudly posted on YouTube. The clip, titled ‘Aggressive Maneuvers for Autonomous Quadrotor Flight,’ depicts a scene at a robotics lab at the University of Pennsylvania, though a better term for this den might be “drone training camp.”

In the video, an insectlike, laptop-sized ‘quadrotor’ performs a series of increasingly difficult tricks. First, it flies up and does a single flip in the air. Then a double flip. Then a triple flip. In a voice-over so dry it suggests he has no idea the power he’s dealing with, Mellinger says, ‘We developed a method for flying to any position in space with any reasonable velocity or pitch angle.’ What does this mean? It means the drone can fly through or around pretty much any obstacle. We see it dance through an open window with fewer than 3 inches of clearance on either side. Next, it flies and perches on an inverted surface—lying in wait.”


From “Can You Build a Human Body?” an interactive BBC feature that explains how close we are to artificially creating a complete array of functioning organs. From the section on skin:

“One of the greatest challenges in bionics is to replicate skin – its ability to feel pressure, temperature and pain is incredibly difficult to reproduce.

Prof Ali Javey, from the University of California, Berkeley, is trying to develop an ‘e-skin,’ a material that ‘mechanically has the same properties as skin.’ He has already woven a web of complex electronics and pressure sensors into a plastic which can bend and stretch.

Getting those sensors to send data to a computer could give a sense of touch to robots.”

••••••••••

Printing electronic skin:


Harpo Marx, who was plugging his new book, appeared on I’ve Got a Secret, 1961. Johnny Carson on the panel.

From the 1983 New York Times obituary of Mildred Dilling, who taught Harpo how to play his musical instrument and was profiled in the New Yorker in 1940 (subscription required):

“Mildred Dilling, a concert harpist who performed for five Presidents, taught Harpo Marx and owned the world’s largest private collection of harps, died in her Manhattan home last Thursday. She was 88 years old.

Miss Dilling performed throughout North and South America, the Orient and Europe. At the peak of her career, she gave 85 concerts and traveled 30,000 miles a year. In her early 80’s, Miss Dilling was still performing 10 concerts a year. She also conducted harp workshops at colleges and universities, giving master classes at the University of California, Los Angeles.”


"These realms of knowledge contain concepts such as data mining, non-linear dynamics and chaos theory." (Image by Steve Jurvetson.)

From “School for Quants,” Sam Knight’s interesting Financial Times look at the training of the next generation of financial-sector math geniuses who will likely be building and destroying and building and destroying the world’s economy in the near future:

“As of this winter, the centre had about 60 PhD students, of whom 80 per cent were men. Virtually all hailed from such forbiddingly numerate subjects as electrical engineering, computational statistics, pure mathematics and artificial intelligence. These realms of knowledge contain concepts such as data mining, non-linear dynamics and chaos theory that make many of us nervous just to see written down. Philip Treleaven, the centre’s director, is delighted by this. ‘Bright buggers,’ he calls his students. ‘They want to do great things.’

In one sense, the centre is the logical culmination of a relationship between the financial industry and the natural sciences that has been deepening for the past 40 years. The first postgraduate scientists began to crop up on trading floors in the early 1970s, when rising interest rates transformed the previously staid calculations of bond trading into a field of complex mathematics. The most successful financial equation of all time – the Black-Scholes model of options pricing – was published in 1973 (the authors were awarded a Nobel prize in 1997).

By the mid-1980s, the figure of the ‘quantitative analyst’ or ‘quant’ or ‘rocket scientist’ (most contemporary quants disdain this nickname, pointing out that rocket science is not all that complicated any more) was a rare but not unheard-of species in most investment houses. Twenty years later, the twin explosions of cheap credit and cheap computing power made quants into the banking equivalent of super-charged particles. Given freedom to roam, the best were able – it seemed – to summon ever more refined, risk-free and sophisticated financial products from the edges of the known universe.

Of course it all looks rather different now. Derivatives so fancy you need a degree in calculus to understand them are hardly flavour of the month these days. Proprietary trading desks in banks, the traditional home of quants, have been decimated by losses and attempts at regulation since the start of the financial crisis. There is nothing like the number of jobs there used to be.” (Thanks Browser.)
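The Black-Scholes model Knight mentions is the one piece of quant machinery most readers will have heard of: it prices a European call option from the spot price, strike, time to expiry, risk-free rate and volatility. Here is a minimal Python version of the standard formula; the inputs in the example are arbitrary, chosen only to show the function running.

```python
from math import log, sqrt, exp, erf

def norm_cdf(x: float) -> float:
    """Standard normal cumulative distribution function, via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def black_scholes_call(spot: float, strike: float, t: float, rate: float, vol: float) -> float:
    """Black-Scholes price of a European call on a non-dividend-paying asset.
    spot: current price, strike: exercise price, t: years to expiry,
    rate: risk-free rate, vol: annualised volatility."""
    d1 = (log(spot / strike) + (rate + 0.5 * vol ** 2) * t) / (vol * sqrt(t))
    d2 = d1 - vol * sqrt(t)
    return spot * norm_cdf(d1) - strike * exp(-rate * t) * norm_cdf(d2)

if __name__ == "__main__":
    # Arbitrary illustrative inputs: spot 100, strike 105, six months, 2% rate, 25% volatility.
    print(f"call price: {black_scholes_call(100.0, 105.0, 0.5, 0.02, 0.25):.2f}")
```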


We can’t unlearn what is learned, although we do have the capacity to change how we use knowledge, to improve or deteriorate ethically. From a story about civilian drones in the Economist:

“Safety is not the only concern associated with the greater use of civilian drones, however. There is also the question of privacy. In America, at least, neither the constitution nor common law prohibits the police, the media or anyone else from operating surveillance drones. As the law stands, citizens do not have a reasonable expectation of privacy in a public place. That includes parts of their own backyards that are visible from a public vantage point, including the sky. The Supreme Court has been very clear on the matter. The American Civil Liberties Union, a campaign group, says drones raise ‘very serious privacy issues’ and are pushing America ‘willy-nilly toward an era of aerial surveillance without any steps to protect the traditional privacy that Americans have always enjoyed and expected.'”

"The oceans will boil away and the atmosphere will dry out as water vapor leaks into space." (Image by Pierre Cardin.)

As the sun ages, its light becomes stronger, not weaker. What to do when the future grows too bright, if we even make it that far? From Andrew Grant’s excellent new Discover article, “How to Survive the End of the Universe”:

“For [theoretical physicist Glenn] Starkman and other futurists, the fun begins a billion years from now, a span 5,000 times as long as the era in which Homo sapiens has roamed Earth. Making the generous assumption that humans can survive multiple ice ages and deflect an inevitable asteroid or comet strike (NASA predicts that between now and then, no fewer than 10 asteroids the size of the rock that wiped out the dinosaurs will hit), the researchers forecast we will then encounter a much bigger problem: an aging sun.

Stable stars like the sun shine by fusing hydrogen atoms together to produce helium and energy. But as a star grows older, the accumulating helium at the core pushes those energetic hydrogen reactions outward. As a result, the star expands and throws more and more heat into the universe. Today’s sun is already 40 percent brighter than it was when it was born 4.6 billion years ago. According to a 2008 model by astronomers K.P. Schröder and Robert Connon Smith of the University of Sussex, England, in a billion years the sun will unleash 10 percent more energy than it does now, inducing an irrefutable case of global warming here on Earth. The oceans will boil away and the atmosphere will dry out as water vapor leaks into space, and temperatures will soar past 700 degrees Fahrenheit, all of which will transform our planet into a Venusian hell-scape choked with thick clouds of sulfur and carbon dioxide. Bacteria might temporarily persist in tiny pockets of liquid water deep beneath the surface, but humanity’s run in these parts would be over.

[Astronomer Greg] Laughlin was intrigued by the idea of using simulations to traverse enormous gulfs of time: ‘It opened my eyes to the fact that things will still be there in timescales that dwarf the current age of the universe.’

Such a cataclysmic outcome might not matter, though, if proactive Earthlings figure out a way to colonize Mars first. The Red Planet offers a lot of advantages as a safety spot: It is relatively close and appears to contain many of life’s required ingredients. A series of robotic missions, from Viking in the 1970s to the Spirit rover still roaming Mars today, have observed ancient riverbeds and polar ice caps storing enough water to submerge the entire planet in an ocean 40 feet deep. This past August the Mars Reconnaissance Orbiter beamed back time-lapse photos suggesting that salty liquid water still flows on the surface.

The main deterrent to human habitation on Mars is that it is too cold. A brightening sun could solve that—or humans could get the job started without having to wait a billion years. ‘From what we know, Mars did have life and oceans and a thick atmosphere,’ says NASA planetary scientist Christopher McKay. ‘And we could bring that back.'”


In the wake of a devastating earthquake last year, Japanese researchers are experimenting with home levitation. From Popsci: “Instead of building super-strong yet flexible structures to withstand earthquakes, what if you built your house to levitate on a cushion of air? This is already being employed in Japan, a little less than a year after the massive earthquake and tsunami that devastated the country.

The levitation system is the brainchild of a company called Air Danshin Systems Inc., which the Japanese-culture-and-art site Spoon & Tamago says roughly translates to ‘anti-seismic.’ It was founded in 2005 but caught on after the March 11, 2011, Tohoku earthquake.”

I’ve recently linked to a couple of excellent pieces of Charles Duhigg’s reportage for the New York Times (here and here). He has another impressive article, this one for Slate about the hidden corners of consumerism, called “The Power of Habit.” The opening:

“One day in the early 1900s, a prominent American businessman named Claude C. Hopkins was approached by an old friend with an amazing new creation: a minty, frothy toothpaste named ‘Pepsodent’ that, he promised, was going to be huge.

Hopkins, at the time, was one of the nation’s most famous advertising executives. He was the ad man who had convinced Americans to buy Schlitz beer by boasting that the company cleaned their bottles ‘with live steam’ (while neglecting to mention that every other company used the same method). He had seduced millions of women into purchasing Palmolive soap by proclaiming that Cleopatra had washed with it, despite the sputtering protests of outraged historians.

But Hopkins’ greatest contribution would be helping to create a national toothbrushing habit. Before Pepsodent, almost no Americans brushed their teeth. A decade after Hopkins’ advertising campaigns, pollsters found that toothbrushing had become a daily ritual for more than half the population. Everyone from Shirley Temple to Clark Gable eventually bragged about a ‘Pepsodent smile.’”

••••••••

Steve and Eydie don’t have filthy, scummy teeth, 1978:


Researchers in Japan have developed a hand-held gun that can literally stop people from talking in mid-sentence. It has scary Big Brother applications, obviously, though speech today relies more on opposable thumbs than vocal cords. From Extreme Tech:

“Japanese researchers have created a hand-held gun that can jam the words of speakers who are more than 30 meters (100ft) away. The gun has two purposes, according to the researchers: At its most basic, this gun could be used in libraries and other quiet spaces to stop people from speaking — but its second application is a lot more chilling.

The researchers were looking for a way to stop ‘louder, stronger’ voices from saying more than their fair share in conversation. The paper reads: ‘We have to establish and obey rules for proper turn-taking when speaking. However, some people tend to lengthen their turns or deliberately interrupt other people when it is their turn in order to establish their presence rather than achieve more fruitful discussions. Furthermore, some people tend to jeer at speakers to invalidate their speech.’ In other words, this speech-jamming gun was built to enforce ‘proper’ conversations.”

Janelle Nanos has a timely article, “So Appy Together,” in Boston Magazine, which asks whether smartphones are fundamentally changing what it means to be human. The short answer is “yes.” Until we figure out how to safely use biotech to make our memories more elastic than they naturally are, we need strategies and processes and tools to manage data overload. If you read this blog very much, you know that I think the net gain from these tools far outweighs any negative, but with utility comes dependency–perhaps even a sort of control. An excerpt from Nanos’ piece:

“I LOVE MY SMARTPHONE. It’s become a second brain in my pocket that’s changed how I process information. It’s with me every waking moment — and the sleeping moments, too — tracking my daily habits. And through my constant e-mail and Facebook activity, and the personal documentation of my life via Twitter and Instagram photos, it’s become the lens through which I see the world. All day long, I find myself instinctively reaching for my phone, using it as a tool to validate my existence.

But lately, my smartphone and I have taken our relationship to the next level. I provide it with ever-more-intimate details about my life. Last year, for example, I set a few goals for myself. I wanted to lose some weight, save money, and run a half marathon. With only a few app downloads, my phone became a trainer, life coach, and confidant. It now knows what I eat, how I sleep, how much I spend, how much I weigh, and how many calories I burn (or don’t) at the gym each day. It’s gotten to the point where my phone now somehow knows more about me than anyone else in the world, including my own darling husband. My gadget has become a tiny black mirror, reflecting back how I see myself. Which means things are getting more complicated between us.” (Thanks Browser.)


"I’m too excited to drive." (Image by Niki Sublime.)

From “An Uneasy Spy Inside 1970s Suburbia,” one of a series of retrospective 2010 articles in the Los Angeles Times about Philip K. Dick, who speeded to his death with the help of amphetamines, but not before decoding our future:

“During his last few years, when he became financially stable for one of the rare times in his life, his daughters visited him at the Santa Ana apartment he moved to after the implosion of his marriage. Dick’s oldest child, daughter Laura, born in 1960, recalls his place full of Bibles, encyclopedias – Dick was a ferocious autodidact – and recordings of Wagner operas.

Phil’s second daughter Isolde, now 42, visited enough during this period to get to know her father for the first time. She recalls him as working hard to be a good father and struggling to overcome his limitations, both with and without success.

During one visit, he got Isa excited about a trip to Disneyland, then open past midnight. ‘He said, ‘We’re gonna go and stay ‘til it closes!’ But in my mind we were there for only 20 or 30 minutes before he said, ‘Honey, my back’s really hurting.’ I think he was just overwhelmed by all the crowds. I knew him, and knew he was uncomfortable moving outside his comfort zone.’

He spent more of his time walking from the apartment to a nearby Trader Joe’s to get sandwiches, a park where he and Isa tried awkwardly to play kickball, and an Episcopalian church where he had running theological discussions with the clergy.

He’d bought himself a Fiat sports car, but almost never drove, telling Isa, ‘Honey, I’m just so excited to see you, I’m too excited to drive.’ She learned quickly to read her father’s code, which seemed designed to protect her from ugly realities.

Sometimes he’d stay up all night, leaving his visitors laughing for hours as he spun idea after idea, or wrote, in a blaze, until dawn. ‘He could go from that really engaging personality to being withdrawn and closed off,’ Isa remembered, explaining that he would sometimes cancel visits at the last minute. ‘I could tell when we spoke on the phone his voice would go really low and flat. When he had that tone he was depressed. He’d say something like he had the flu. ‘The flu’ was usually his code.'”

