
Few academics sweep as widely across the past or rankle as much in the present as Jared Diamond, the UCLA professor most famous (and infamous) for Guns, Germs and Steel, a book that elides superiority, and often volition, from the history of some humans conquering others. It’s a tricky premise to prove if you apply it to the present: more-developed countries have better weapons than some other states, but it still requires will to use them. Of course, Diamond’s views are more complex than that black-and-white picture. Two excerpts follow from Oliver Burkeman’s very good new Guardian article about the scholar.


In person, Diamond is a fastidiously courteous 77-year-old with a Quaker-style beard sans moustache, and archaic New England vowels: “often” becomes “orphan,” “area” becomes “eerier.” There’s no computer: despite his children’s best efforts, he admits he’s never learned to use one.

Diamond’s first big hit, The Third Chimpanzee (1992), which won a Royal Society prize, has just been reissued in an adaptation for younger readers. Like the others, it starts with a mystery. By some measures, humans share more than 97% of our DNA with chimpanzees – by any commonsense classification, we are another kind of chimpanzee – and for millions of years our achievements hardly distinguished us from chimps, either. “If some creature had come from outer space 150,000 years ago, humans probably wouldn’t figure on their list of the five most interesting species on Earth,” he says. Then, within the last 1% of our evolutionary history, we became exceptional, developing tools and artwork and literature, dominating the planet, and now perhaps on course to destroy it. What changed, Diamond argues, was a seemingly minor set of mutations in our larynxes, permitting control over spoken sounds, and thus spoken language; spoken language permitted much of the rest.


Geography sometimes plays a huge role; sometimes none at all. Diamond’s most vivid illustration of the latter is the former practice, in two New Guinean tribes, of strangling the widows of deceased men, usually with the widows’ consent. Other nearby tribes that have developed in the same landscape don’t do it, so a geographical argument can’t work. On the other hand: “If you ask why the Inuit, living above the Arctic Circle, wear clothes, while New Guineans often don’t, I would say culture makes a negligible contribution. I would encourage anyone who disagrees to try standing around in Greenland in January without clothes.” And human choices really matter: once the Spanish encountered the Incas, Diamond argues, the Spanish were always going to win the fight, but that doesn’t mean brutal genocide was inevitable. “Colonising peoples had massive superiority, but they had choices in how they were going to treat the people over whom they had massive superiority.”

It is clear that behind these disputes is a more general distrust among academics of the “big-picture” approach Diamond has made his own.•


In a recent conversation at MIT, Elon Musk pointed out that humans already possess the building blocks for a Mars settlement, but we also have everything necessary for world peace, alternative energy and the end of hunger. The shift in priorities we would have to make to call Mars home may be greater than he believes. It’s not just an external one. From Nidhi Subbaraman at BetaBoston:

“‘The basic ingredients are there,’ Musk told a sold-out crowd in MIT’s Kresge auditorium.

Of course, humans are going to need to figure out a few things if we’re going to make it to our neighbor planet. Robust, reusable landing gear (which SpaceX is tussling with already) is at the top of the list. Also energy: Power generation on Mars is going to be an ‘interesting problem,’ Musk said.

In Musk’s view, an investment in becoming a ‘multi-planet’ species is essential for our longevity.

It could come fairly cheap. ‘One percent of our resources, we could be buying life insurance collectively for life,’ Musk said. And it just requires a small reshuffling of our priorities. ‘Lipstick or Mars colonies?’ he asked.

He envisions an Olympics-style competitive future in which countries compete to build the necessary technology.”


Much of American space exploration is being handed over to private enterprise, which I have some qualms about, but even some of the more prosaic elements of our lives have been offloaded from the public sector to technological “innovators.” Certainly that’s not one-hundred percent the case in the U.S.: healthcare, a huge concern, is headed in the opposite direction, and the federal budget, though it grew more slowly under Obama than under Dubya or Reagan, remains formidable. In a new Guardian piece, Evgeny Morozov, that self-designated mourner, looks at the dark side of capitalism and technocracy’s impact on democracy. The opening:

“For seven years, we’ve been held hostage to two kinds of disruption. One courtesy of Wall Street; the other from Silicon Valley. They make for an excellent good cop/bad cop routine: the former preaches scarcity and austerity while the other celebrates abundance and innovation. They might appear distinct, but each feeds off the other.

On the one hand, the global financial crisis – and the ensuing push to bail out the banks – desiccated whatever was left of the welfare state. This has mutilated – occasionally to the point of liquidation – the public sector, the only remaining buffer against the encroachment of the neoliberal ideology, with its unrelenting efforts to create markets out of everything.

The few public services to survive the cuts have either become prohibitively expensive or have been forced to experiment with new and occasionally populist survival mechanisms. The ascent of crowdfunding whereby, instead of relying on lavish and unconditional government funding, cultural institutions were forced to raise money directly from citizens is a case in point: in the absence of other alternatives, the choice has been between market populism – the crowd knows best! – or extinction.

By contrast, the second kind of disruption has been hailed as a mostly positive development. Everything is simply getting digitised and connected – a most natural phenomenon, if venture capitalists are to be believed – and institutions could either innovate or die. Having wired up the world, Silicon Valley assured us that the magic of technology would naturally pervade every corner of our lives. On this logic, to oppose technological innovation is tantamount to defaulting on the ideals of the Enlightenment: Larry Page and Mark Zuckerberg are simply the new Diderot and Voltaire – reborn as nerdy entrepreneurs.

And then, a rather strange thing happened: somehow we have come to believe that the second kind of disruption had nothing to do with the first.”


Kevin Kelly asked his readers to concisely predict the technological future in “The Coming Hundred Years in One Hundred Words.” One of the most implausible scenarios is one of my favorites:

“You will sleep in a sort of bathtub for taking care of your skin. The bathtub will be enclosed in an atmosphere enriched with substances to take care of your organs. You will never have to take a bath again. Your clothes will be made from a special polymer and you choose from more than 1,000 looks, and the fabrics will be molded to the look you choose. You will eat all the food you like. You will have special lanes for those who prefer to drive, but 80% choose self-driven cars. People will work 4 hours/week. No Police and no Politics.”


Appropriately trippy 1979 ABC News report about the U.S. government’s attempts across three decades to not just know its citizens’ thoughts but to actually control them. There was a Truth Drug Committee, CIA experimentation with LSD and mushrooms on unwitting Americans and Manchurian Candidate-esque goals. Ultimately it aided the establishment of the 1960s counterculture.


Two passages, one from five years ago and one from today, about how the anarchy of the Internet has released the devil inside us all.


In 2009, Jim Windolf of Vanity Fair wrote a good article about Internet trolls before the term was in wide use, but even though biting blogs have been supplanted by social media, his prediction about the decline of anonymous online hating has not come true, at least not yet; in fact, it has taken on new and even more hurtful forms. An excerpt from his piece:

“Online rudeness probably won’t last forever. I think it’s just a fashion. Things change. Stuff that seems cool gets stale. It feels like it won’t, but it does. So it seems reasonable to guess that online nastiness will fade—not through any enforcement, but just because it will go out of style. There will always be flame wars. There will always be online lunkheads and goons. But in a few years maybe you won’t really want to be the one calling someone else a douche-tard in a comments section.”


From Alex Hern’s Guardian article, Tim Berners-Lee on what he hath wrought, not all of it intended:

“Tim Berners-Lee has expressed sadness that the web has mirrored the dark side of humanity, as well as enabling its ‘wonderful side’ to flourish.

The developer, who created the web in 1990 while working for the particle collider project Cern in Switzerland, said that the web is a reflection of human nature elsewhere, but that he had hoped ‘that the web would provide tools and fora and new ways of communicating that would break down national barriers and allow us to just get to a better global understanding.’

‘Well, maybe it’ll happen in the future … Maybe we will be able to build web-based tools that help us keep people on the path of collaborating rather than fighting.’

Speaking to BBC News, Berners-Lee said that it was ‘staggering’ that people ‘who clearly must have been brought up like anybody else will suddenly become very polarised in their opinions, will suddenly become very hateful rather than very loving.'”


An Economist article responds to Lee Gomes’ Slate piece about the difficulty of making driverless vehicles truly autonomous, suggesting that the real impediment to such machines might be the large stock of cheap labor created by the disruptive qualities of other technologies. An excerpt:

“Writing at Slate, for instance, Lee Gomes frets that driverless vehicles struggle in unfamiliar territory when they lack good maps, can make errors when sun blinds their cameras, and are occasionally caught out by the unexpected appearance of new traffic signals. Human drivers, of course, share these weaknesses, and others, like difficulty operating in adverse weather conditions. The big difference between driverless vehicles and humans, in these cases, is that the computer can be programmed to behave cautiously when stumped, while humans often plough ahead heedlessly. When critiquing driverless cars it is often useful to recall that human drivers kill and maim millions of people each year.

Ironically, the biggest obstacle to widespread use of driverless vehicles, over the next decade or two at any rate, may be the effects of rapid technological progress in other parts of the economy. As a recent special report explains, technological change over the last generation has wiped out many middle-skill jobs, pushing millions of workers into competition for low-wage work. That glut has contributed to stagnant wages for most workers, and low pay has in turn reduced the incentive to firms to deploy labour-saving technology. Why automate, when there is an enormous stock of cheap labour available? At the same time firms like Uber are making the use of hired cars cheaper and more convenient, reducing the attraction to many households of owning and driving their own personal vehicles.

The combination of Uber and cheap labour could pose a formidable threat to the driverless car. The cost of the sensors and processors needed to pilot an autonomous vehicle is falling and is likely to fall much more as production ramps up. Yet the technology is still pricey, especially compared with a human, which, after all, is a rather efficient package of sensory and information-processing equipment. At low wages, a smartphone-enabled human driver is formidable competition for a driverless vehicle.

It would be a remarkable irony if the driverless car—in many ways the symbol of the technological revolution that is now reshaping modern economies—fails to materialise as an economic reality thanks to the disemploying power of other technologies.”


The Ebola “crisis” in America is an example of more than one bias at play. It’s Availability Bias, with so much media focus on an illness that has killed exactly zero American citizens on U.S. soil, when the flu season will likely claim hundreds as it did last year. It’s also Confirmation Bias, with those opposed to President Obama angling to position this domestic “plague” as a lack of leadership on his part. The more important news of the success of the Affordable Care Act, which raises the threshold for plague in this country, is lost in the hollering.

Ebola and ISIS beheadings and other modern challenges deserve attention, to be sure, but there is a more-hopeful parallel narrative we often ignore. From a New Statesman article by Matthew Barzun, the U.S. Ambassador to Great Britain:

“We live in challenging, complex, even confusing times. Our world is in constant flux. Charles Dickens’s description of the French Revolution seems just as appropriate today: it is the worst of times. Indeed, it may be even more true now, as the changes are global, rather than confined to one or two countries. Newspaper headlines suggest as much. They are littered with demoralising words such as ‘beheadings,’ ‘aggression,’ ‘hatred’ and ‘fever.’ Of course, ISIL is engaged in barbarity in the Middle East that is reminiscent of some of the most grotesque of the 20th century, while the ebola virus poses a global public health threat on a scale as large as anything we’ve seen in recent decades.

At the same time, the number of refugees and internally displaced people presents a great humanitarian challenge. And human rights violations abound in many parts of the world. But here is an equally valid and, I concede, sweeping narrative that suggests this is also the best of times.

It is a time of levelling. The world has reduced extreme poverty by half since 1990. Global primary education for boys and girls is now equal.

It is a time of enduring. The number of deaths among children under five has been cut in half since 1990, meaning about 17,000 fewer children die each day. And mothers are surviving at a nearly equal rate.

It is a time of flourishing. Deaths from malaria dropped by 42 per cent between 2000 and 2012. HIV infections are declining in most regions.

It is a time of strengthening. Africa is above the poverty line for the first time. Tens of millions have been lifted out of poverty in China. The debt burden on developing coun­tries has dropped 75 per cent since 2000.

It is a time of healing. The ozone layer is showing signs of recovery thanks to global action. And all the while, the technological and communications revolution is making more people better informed than at any time in history.

So why are we intent on fixing our lens on the chaotic?”


Famed attorney Clarence Darrow was an extremist when it came to criminality, believing in circumstance but not culpability. He saw criminals the way the writer of a naturalist novel views characters, as prisoners of nature and nurture, incapable of circumventing either. Based on the remarks he made as reported in an article in the April 4, 1931 Brooklyn Daily Eagle, Darrow would have treated all misdeeds as maladies, the perpetrators receiving treatment in hospitals rather than serving stretches in prison.


In his TED Talk, “New Thoughts on Capital in the Twenty-First Century,” Thomas Piketty has good and bad news. The good: Wealth inequality, although severe now, is not as deep as a century ago. The bad: The shrunken wealth gap post-World War II was an outlier, not a norm that will reestablish itself for any long period under the present system.


Via Google driverless-car consultant Brad Templeton, a report about Singapore’s limited test run of autonomous public transport:

“In late August, I visited Singapore to give an address at a special conference announcing a government sponsored collaboration involving their Ministry of Transport, the Land Transport Authority and A-STAR, the government funded national R&D centre. I got a chance to meet the minister and sit down with officials and talk about their plans, and 6 months earlier I got the chance to visit A-Star and also the car project at the National University of Singapore. At the conference, there were demos of vehicles, including one from Singapore Technologies, which primarily does military contracting.

Things are moving fast there, and this week, the NUS team announced they will be doing a live public demo of their autonomous golf carts and they have made much progress. They will be running the carts over a course with 10 stops in the Singapore Chinese and Japanese Gardens. The public will be able to book rides online, and then come and summon and direct the vehicles with their phones. The vehicles will have a touch tablet where the steering wheel will go. Rides will be free. Earlier, they demonstrated not just detecting pedestrians but driving around them (if they stay still) but I don’t know if this project includes that.

This is not the first such public demo – the CityMobil2 demonstration in Sardinia ran in August, on a stretch of beachfront road blocked to cars but open to bicycles, service vehicles and pedestrians. That project limited itself to unacceptably low speeds and offered a linear route.

The Singapore project will also mix with pedestrians, but the area is closed to cars and bicycles. There will be two safety officers on bicycles riding behind the golf carts, able to shut them down if any problem presents, and speed will also be limited.”


Uber could use all the good publicity it can get right now: its business plan threatens licensed drivers, its surge pricing is unpopular and its operators generate screaming headlines for any misdeed. Even Libertarian Peter Thiel thinks the company may be flouting regulations too much, careering onto a self-destructive Napster path. Perhaps on-demand flu shots, which Uber tried for a day in several U.S. cities, can help. From Dan Diamond at Forbes:

“Uber’s latest one-day promotion kicked off on Thursday: UberHEALTH, the company’s first concerted effort to move into health care delivery.

The company announced that Uber users in Boston, New York City, or Washington, D.C., could order a free flu shot between the hours of 10 a.m. and 3 p.m.

And Flüber’s terms sounded appealing.

‘We’re leveraging the reliability and efficiency of the Uber platform and launching a one-day pilot program — UberHEALTH — in select cities today,’ an Uber blog post reads. ‘Together with our partner Vaccine Finder we will bring flu prevention packs and shots directly to you – at the single touch of a button.’

To be clear, Uber drivers weren’t administering the shots; they’d transport registered nurses to a user’s location, and the nurses could give up to 10 flu shots.”


My definition of a genius is someone who can creatively make connections among disparate things in a way that others can’t, piecing together a new reality, a new “language,” in art or physics or anything. They see it.

In a Nautilus blog post, Claire Cameron asked for a description of such a person from five members of Mensa, an organization that measures IQ, which is a different thing. The responses:


“Can you define ‘genius’ for me, or describe what a genius is?

Richard Hunter (retired finance director): An exceptional ability perhaps? That would satisfy if you were a member of Mensa—you know you have an exceptional ability in IQ if you get in to it. It is one type of genius, but genius takes many forms. An example would be Dave Johnson. He was a famous decathlete in the 80s and 90s. He was clearly a genius athlete: He ran, he could throw javelin, he could do all these things, and he won the Olympic gold decathlon. That must be genius in the sporting field. I am nothing like Dave Johnson—it is far more complicated than one thing or another.

Bikram Rana (director at a business consulting firm): It’s something that you see and you know it when you see it. I think a modern-day genius would be someone like Steve Jobs. It’s someone who has captured the imagination, done something groundbreaking.

Jack Williams (journalist): Oh god. I have no idea. I actually couldn’t. It comes in different forms. I don’t think being a Mensan makes you a genius, as I prove on a weekly basis on a Saturday night. I think there is a creative, innovative element there as well. Genius pushes the boundaries.

John Sheehan (clinical hypnotist): I don’t think you can say there is a ‘typical’ genius. There isn’t a typicalness to it, bar that one exception: Great intellectual ability. Genius has gone from ‘having a [kind of] genius’ to ‘being’ one. I think the word genius now comes from the popular press, it’s easy to say, it’s got a cachet to it. It’s easy, but among the people whose careers are invested in giftedness, high intelligence, then the word ‘genius’ is not often used. It is something I was born with, and that I have had all my life. I don’t think about it until someone asks me, because it is all I know. I think about this a lot though.

LaRae Bakerink (business consultant): It is what you do with your life that defines whether you are a genius. A genius is someone who can create something new.”


A multi-planet humanity is a hedge against an Earth catastrophe eradicating our species, sure, but there are also financial considerations to interplanetary development. From Tim Fernholz’s new Atlantic article about Elon Musk’s SpaceX:

“With 33 commercial launches on its manifest in the next four years, a plan to launch manned missions by 2017, and subsidies from Texas to build its own spaceport there after several years of leasing government facilities, SpaceX is now a serious competitor in the launch industry. That’s a validation for NASA’s public-private partnership, which was focused on developing a business, not a product.

But the question for Musk and his investors now is whether he can be more than just a better rocket builder. They want to unlock something far more challenging: A space economy where humans can vastly increase their productivity in the vacuum around our tiny world and beyond, even if nobody is quite sure how yet. Nolan of Founders Fund compares this hopeful uncertainty to the founding of the internet. ‘It wasn’t clear exactly what kind of business can come out of exchanging information really rapidly,’ he says.

For example, if it weren’t so pricey, investors could imagine putting up hundreds of new satellites in lower orbits than existing ones, making their communications and imaging far more powerful. Because of the high launch costs, current satellites aren’t upgraded frequently and are stationed relatively far from earth so that they can last longer—the closer a satellite flies to earth, the faster its orbit decays, leading to its eventual demise. As a result, the electronics in them are relatively old technology.

Cheap enough launches could also enable terrestrial flights that hop up over the atmosphere, turning a day-long flight around the world into a matter of hours. Space tourism is often cited as a possible source of revenue, as is commercial research, even asteroid mining, but making any of those sustainable will mean—you guessed it—far lower costs, as NASA has found in its failure to drum up much commercial research at the ISS.

Can the $6 million launch—or even cheaper—replace the $60 million launch?”


Understanding today or tomorrow is an almost impossible task, the present being anything but a sitting duck and the future a black swan. Even those who have a better-than-average idea of where things stand and where they’re heading can misread the details, all but neutralizing their knowledge. From “Nothing You Think Matters Today Will Matter the Same Way Tomorrow,” Frank Rich’s New York magazine look at the last 50 years of American history and what it tells us about prognostication:

“It was a time when many in my boomer generation fell in love with the idea that change was something you could believe in—a particularly liberal notion that has taken hold in other generations, too, whether in the age of Roosevelt or Obama. Even as we recognize that the calendar makes for a crude and arbitrary marker, we like to think that history visibly marches on, on a schedule we can codify.

The more I dove back into the weeds of 1964, the more I realized that this is both wishful thinking and an optical illusion. I came away with a new appreciation of how selective our collective memory is, and of just how glacially history moves, despite the can-do optimism of a modern America besotted with the pursuit of instant gratification. Asked at the time of the 1964 World’s Fair to anticipate 2014, Isaac Asimov got some things right (miniaturized computers, online education, flat-screen television, and what we now know as Skype), but many of his utopian predictions were delusional. His wrong calls included not just his interplanetary fantasies but his vision of underground suburbs that would protect mankind from war, rampaging weather, and the tyranny of the automobile. Asimov also thought birth control would find international acceptance. It was no doubt beyond even his imagination that a half-century hence American lawmakers would introduce ‘personhood’ amendments attempting to all but outlaw contraception.

The screenwriter William Goldman famously summed up Hollywood in three words: ‘Nobody knows anything.’ Would that this aphorism were applicable, as he intended, solely to the make-believe of show business. It often seems that nobody knew anything about anything in 1964. Most everyone was certain that the big political developments of the time, epitomized by LBJ’s victories for civil rights and against Goldwater, would be transformational. Many of the same seers saw the year’s cultural upheavals, starting with the Beatles, as ephemera. More often than not, the reverse has turned out to be true. Are we so much smarter in 2014?”


In a Slate piece, Lee Gomes wonders whether the Google driverless car will ever be a reality, one impediment being the need for real-time maps able to reflect constantly shifting infrastructure on a national level. His comparison of the search giant’s autonomous-vehicle plans to the Apple Newton seems a self-defeating argument, however, since all the elements of that ill-fated invention were realized soon thereafter in other tools. The opening:

“A good technology demonstration so wows you with what the product can do that you might forget to ask about what it can’t. Case in point: Google’s self-driving car. There is a surprisingly long list of the things the car can’t do, like avoid potholes or operate in heavy rain or snow. Yet a consensus has emerged among many technologists, policymakers, and journalists that Google has essentially solved—or is on the verge of solving—all of the major issues involved with robotic driving. The Economist believes that ‘the technology seems likely to be ready before all the questions of regulation and liability have been sorted out.’ The New York Times declared that ‘autonomous vehicles like the one Google is building will be able to pack roads more efficiently’—up to eight times so. Google co-founder Sergey Brin forecast in 2012 that self-driving cars would be ready in five years, and in May, said he still hoped that his original prediction would come true.

But what Google is working on may instead result in the automotive equivalent of the Apple Newton, what one Web commenter called a ‘timid, skittish robot car whose inferior level of intelligence becomes a daily annoyance.’ To be able to handle the everyday stresses and strains of the real driving world, the Google car will require a computer with a level of intelligence that machines won’t have for many years, if ever.”


By the time he filed his final patent in 1928, an “Apparatus for aerial transportation,” the 72-year-old inventor Nikola Tesla was a punchline at best and a forgotten man at worst, and he would remain so for the final 15 years of his life, until he died alone and without money in the New Yorker Hotel. His swan song was a small plane which purportedly could rise from an open window like a helicopter and transport two people cheaply and efficiently to their destination. It would revolutionize travel. Alas, unlike a swan, it wouldn’t have been able to fly even if it had been built, which it wasn’t. An article in the February 23, 1928 Brooklyn Daily Eagle reported on the dubious machine.


An algorithmic miscue worthy of 1999, this book suggestion was on my Amazon home page yesterday. Fucking Bezos.


Featured Recommendation:

Prepper’s Pantry: The Survival Guide To Emergency Water & Food Storage
by Ron Johnson (October 6, 2014)
Auto-delivered wirelessly
Kindle Price: $2.99

In the event of an emergency having an adequate supply of food could mean the difference between life and death!

Are you prepared for any disaster that is about to happen? Do you already have emergency supplies? Is it enough to sustain you and your family’s life for an extended period, when help from others would be close to impossible? Have you discussed and implemented the emergency plans with your family?

Why recommended?

Because you purchased… 

Roughing It [Kindle Edition]
Mark Twain (Author)

The Wild West as Mark Twain lived it

In 1861, Mark Twain joined his older brother Orion, the newly appointed secretary of the Nevada Territory, on a stagecoach journey from Missouri to Carson City, Nevada. Planning to be gone for three months, Twain spent the next “six or seven years” exploring the great American frontier, from the monumental vistas of the Rocky Mountains to the lush landscapes of Hawaii. Along the way, he made and lost a theoretical fortune, danced like a kangaroo in the finest hotels of San Francisco, and came to terms with freezing to death in a snow bank—only to discover, in the light of morning, that he was fifteen steps from a comfortable inn.

As a record of the “variegated vagabondizing” that characterized his early years—before he became a national treasure—Roughing It is an indispensable chapter in the biography of Mark Twain. It is also, a century and a half after it was first published, both a fascinating history of the American West and a laugh-out-loud good time.

Technology Review has published “On Creativity,” a 1959 essay by Isaac Asimov that has never previously run anywhere. The opening: 

“How do people get new ideas?

Presumably, the process of creativity, whatever it is, is essentially the same in all its branches and varieties, so that the evolution of a new art form, a new gadget, a new scientific principle, all involve common factors. We are most interested in the ‘creation’ of a new scientific principle or a new application of an old one, but we can be general here.

One way of investigating the problem is to consider the great ideas of the past and see just how they were generated. Unfortunately, the method of generation is never clear even to the ‘generators’ themselves.

But what if the same earth-shaking idea occurred to two men, simultaneously and independently? Perhaps, the common factors involved would be illuminating. Consider the theory of evolution by natural selection, independently created by Charles Darwin and Alfred Wallace.

There is a great deal in common there. Both traveled to far places, observing strange species of plants and animals and the manner in which they varied from place to place. Both were keenly interested in finding an explanation for this, and both failed until each happened to read Malthus’s ‘Essay on Population.’

Both then saw how the notion of overpopulation and weeding out (which Malthus had applied to human beings) would fit into the doctrine of evolution by natural selection (if applied to species generally).

Obviously, then, what is needed is not only people with a good background in a particular field, but also people capable of making a connection between item 1 and item 2 which might not ordinarily seem connected.

Undoubtedly in the first half of the 19th century, a great many naturalists had studied the manner in which species were differentiated among themselves. A great many people had read Malthus. Perhaps some both studied species and read Malthus. But what you needed was someone who studied species, read Malthus, and had the ability to make a cross-connection.

That is the crucial point that is the rare characteristic that must be found.”


Kate Greene, who wrote an Aeon essay about living on “Mars” in a Hawaiian simulation, has a brief piece in Wired about the domed habitat designed to keep the participants sane during the next 50th state “space mission.” The opening:

“I’d always wanted to visit Mars. Instead I got Hawaii. There, about 8,200 feet above sea level on Mauna Loa, sits a geodesically domed habitat for testing crew psychology and technologies for boldly going. I did a four-month tour at the NASA-funded HI-SEAS—that’s Hawaii Space Exploration Analog and Simulation—in 2013, and a new 8-month mission is scheduled to start in October. It’s a long time to be cooped up, ‘so the psychological impacts are extremely important,’ habitat designer Vincent Paul Ponthieux says. The key to keeping everybody sane? A sense of airiness. Yep—even on Mars, you’re going to need more space.”


While it’s not as serious a suck on California’s water supply as swimming pools, illegal marijuana farming is part of the problem. From Pilita Clark in the Financial Times:

“Jerry Brown, California’s governor, declared a state of emergency in January after the driest year on record in 2013, but as the annual wet season beckons, the prospect of a complete drought recovery this winter is highly unlikely, government officials say.

‘Marijuana cultivation is the biggest drought-related crime we’re facing right now,’ says Lt Nores as he pokes at a heap of plastic piping the growers used to divert water from a dried-up creek near the plantation.

But California’s drought is exposing a series of problems in the US’s most populous state that are a reminder of an adage popularised by Michael Kinsley, the columnist: the scandal is often not what is illegal but what is legal.

Growing competition

The theft of 80m gallons of water a day by heavily armed marijuana cartels is undoubtedly a serious concern, not least when the entire state is affected by drought and 58 per cent is categorised as being in ‘exceptional drought,’ as defined by the government-funded US Drought Monitor.

However, this is a tiny fraction of the water used legally every day in a state that, like so many other parts of the world, has a swelling population driving rising competition for more heavily regulated supplies that have long been taken for granted and may face added risks as the climate changes.”


While the number of U.S. citizens who've died on American soil from Ebola remains constant, Liberia is truly in the grip of a deadly epidemic. It's a horror of human suffering and loss, and must be surreal for a nation that was just starting to rise economically. Marcus DiPaola, a freelance journalist who recently returned from reporting in Liberia, did an Ask Me Anything at Reddit about this modern plague. A few exchanges follow.



Are people optimistic that Ebola will be defeated?

Marcus DiPaola:

Optimistic is… not the right word. They’re expecting it to be defeated. You gotta realize how much public knowledge there is about the outbreak here. NO ONE is shaking hands. NO ONE is touching each other. All the radio stations play Ebola songs.



Is it as bad as the media is making it out to be?

Marcus DiPaola:

I am a member of the media, and I think it’s pretty damn bad.

Locals treat every dead body with suspicion now. A crowd gathered in Central Monrovia after an alleged thief jumped into the river and died. Bystanders say he was not a suspected Ebola case, but many people were unwilling to take the risk associated with pulling out the body. As a result, the Liberian Red Cross Ebola burial team was called, arrived in hazmat suits, and collected the body.

Cellcom Liberia, one of the country's largest cell phone providers, has a worker checking temperatures before shoppers are even allowed to enter the parking lot. The worker writes the temperature on a name tag, which security guards check once shoppers are inside the parking lot, again as they line up to enter the store, and a final time at the store's door.

Many restaurants, hotels, banks, and stores have hand-washing stations installed and require you to wash your hands before entering. Some businesses' attempts at requiring hand-washing fall short of the goal, though, as many of the buckets contain plain water, with no soap available or chlorine mixed in.



Do you personally think things will continue to escalate? Or does it seem like things are more under control than the American media is making them out to be?

Marcus DiPaola:

No, I think things are escalating right now; the burial teams are working flat out, and there are only about 10 of them. I think the numbers are being under-reported, especially by the Liberian government.



Should America be worried about Ebola?

Marcus DiPaola:

Yes. But not in the way you might think… it’s going to keep coming here, I have no doubt of it, but I’m willing to bet we’ll never see a full-blown outbreak. The experts I was with made it clear to me that you really really really have to work at it to get it as a non-caretaker. You have to touch bodily fluids. Fomites can theoretically do it, but everyone there kinda didn’t really worry about that, and they’ve been doing this since March.•


Chicago plans to make many of its streetlights "smart," capable of recording data almost continuously and of helping to moderate the flow of auto and pedestrian traffic and other currently fixed processes, something that will likely be standard on American streets before long. From Liz Stinson at Wired:

“There you are, standing on a street corner surrounded by a mob of people waiting for the walk signal. In front of you, a single car gets the green light. Again. For all the talk of smart cities, they can be infuriatingly dumb at times. But imagine if your city could monitor the flow of pedestrians and optimize its traffic signals for walkers, not drivers? That’s exactly what Chicago is looking to do.

Later this fall, the Windy City will install a network of 40 sensor nodes on light poles at the University of Chicago, the School of the Art Institute of Chicago and Argonne National Laboratory. The goal is to eventually expand the system to 1,000 sensors (enough to cover the Chicago Loop) over the next few years. Spearheaded by the University of Chicago's Urban Center for Computation and Data, it's called the Array of Things initiative, and the goal is to gather an unprecedented set of ambient data to help government officials and residents understand how their city ticks so they can make it a happier, healthier, and smarter place to live.

Every 15 seconds these sensors will gather information like temperature, humidity, carbon monoxide, vibrations, light, and sound—pretty standard stuff.”
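The collection cycle Stinson describes, a node sampling a handful of ambient quantities every 15 seconds, can be sketched minimally in Python. Everything below (the field names, units, and the `collect_readings` helper) is a hypothetical illustration of that kind of polling loop, not the actual Array of Things schema or code:

```python
from dataclasses import dataclass
import time

# Hypothetical ambient-reading record; fields mirror the quantities the
# article lists (temperature, humidity, carbon monoxide, vibration,
# light, sound), but names and units are assumptions.
@dataclass
class AmbientReading:
    node_id: str
    timestamp: float
    temperature_c: float
    humidity_pct: float
    co_ppm: float
    vibration: float
    light_lux: float
    sound_db: float

def collect_readings(sample_fn, node_id, count, interval_s=15):
    """Poll a sensor-sampling callable every `interval_s` seconds,
    returning `count` timestamped readings. Pass interval_s=0 to
    collect without sleeping (useful for testing)."""
    readings = []
    for _ in range(count):
        values = sample_fn()  # dict holding the six measured quantities
        readings.append(
            AmbientReading(node_id=node_id, timestamp=time.time(), **values)
        )
        if interval_s:
            time.sleep(interval_s)
    return readings

# Usage with a stubbed sensor in place of real hardware:
fake_sensor = lambda: dict(
    temperature_c=21.5, humidity_pct=40.0, co_ppm=0.4,
    vibration=0.01, light_lux=300.0, sound_db=55.0,
)
batch = collect_readings(fake_sensor, node_id="loop-001", count=3, interval_s=0)
```

In a real deployment each batch would presumably be shipped off the pole to a central store; here the loop simply accumulates records in memory.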


Not having a television makes it easier to live outside the idiot broadcast and cable news culture populated by loudmouths like Sean Hannity, who looks like the equipment manager of a lacrosse team that’s been suspended for hazing, and Chuck Fucking Todd, who announced that the President’s administration was “over” when there were still three full years to go. The important work continues apace despite what that sputtering Van Dyke says. The opening of “Obama’s Moonshot Probes the Space Inside Our Skulls,” Anjana Ahuja’s Financial Times piece:

“Every president needs a blockbuster science project to crown his time in the White House. While John Kennedy reached for the stars, Barack Obama decided to reach for the synapses. A year on from his pledge to unravel the secrets of the human brain, researchers are beginning to spell out how they will tackle this challenge.

The breathtaking implications of the research are also becoming clear; within a generation science may acquire the power to predict a person’s future capacities and possibly determine their life chances. The central idea of Mr Obama’s Brain Research through Advancing Innovative Neurotechnologies Initiative is to understand how the human brain choreographs all the astonishing feats it pulls off. It is the seat of learning and the organ of memory. It generates our character and identity, and influences our behaviour. It is our brains, more than anything else, that make us who we are today – and shape who we will become.

But that neural versatility acts as a veil. How does the squidgy, three-pound lump between your ears – filled with nerve cells (neurons) firing tiny impulses of electricity across junctions called synapses – accomplish such a wide array of functions? Research grants announced this month shed light on how Mr Obama’s vaunted cerebral exploration will proceed. One avenue will be to catalogue the differences between healthy neurons and diseased ones. This will spur research into neurodegenerative diseases, such as Alzheimer’s and Parkinson’s, which cast a shadow over ageing societies.”


Walter Isaacson is thought of, with some validity, as a proponent of the Great Man Theory, which is why Steve Jobs, with no shortage of hubris, asked him to be his biographer. Albert Einstein and Ben Franklin and me, he thought. Jobs, who deserves massive recognition for the cleverness of his creations, was also known as a guy who sometimes took credit for the work of others, and he sold his author some lines. Bright guy that he is, though, Isaacson knows the new technologies and their applications have many parents, and he's cast his net wider in The Innovators. An excerpt from his just-published book, via The Daily Beast, in which he describes the evolution of Wikipedia's hive mind:

“One month after Wikipedia’s launch, it had a thousand articles, approximately seventy times the number that Nupedia had after a full year. By September 2001, after eight months in existence, it had ten thousand articles. That month, when the September 11 attacks occurred, Wikipedia showed its nimbleness and usefulness; contribu­tors scrambled to create new pieces on such topics as the World Trade Center and its architect. A year after that, the article total reached forty thousand, more than were in the World Book that [Jimmy] Wales’s mother had bought. By March 2003 the number of articles in the English-language edition had reached 100,000, with close to five hundred ac­tive editors working almost every day. At that point, Wales decided to shut Nupedia down.

By then [Larry] Sanger had been gone for a year. Wales had let him go. They had increasingly clashed on fundamental issues, such as Sanger’s desire to give more deference to experts and scholars. In Wales’s view, ‘people who expect deference because they have a Ph.D. and don’t want to deal with ordinary people tend to be annoying.’ Sanger felt, to the contrary, that it was the nonacademic masses who tended to be annoying. ‘As a community, Wikipedia lacks the habit or tra­dition of respect for expertise,’ he wrote in a New Year’s Eve 2004 manifesto that was one of many attacks he leveled after he left. ‘A policy that I attempted to institute in Wikipedia’s first year, but for which I did not muster adequate support, was the policy of respect­ing and deferring politely to experts.’ Sanger’s elitism was rejected not only by Wales but by the Wikipedia community. ‘Consequently, nearly everyone with much expertise but little patience will avoid ed­iting Wikipedia,’ Sanger lamented.

Sanger turned out to be wrong. The uncredentialed crowd did not run off the experts. Instead the crowd itself became the expert, and the experts became part of the crowd. Early in Wikipedia’s devel­opment, I was researching a book about Albert Einstein and I noticed that the Wikipedia entry on him claimed that he had traveled to Al­bania in 1935 so that King Zog could help him escape the Nazis by getting him a visa to the United States. This was completely untrue, even though the passage included citations to obscure Albanian websites where this was proudly proclaimed, usually based on some third-hand series of recollections about what someone’s uncle once said a friend had told him. Using both my real name and a Wikipedia han­dle, I deleted the assertion from the article, only to watch it reappear. On the discussion page, I provided sources for where Einstein actu­ally was during the time in question (Princeton) and what passport he was using (Swiss). But tenacious Albanian partisans kept reinserting the claim. The Einstein-in-Albania tug-of-war lasted weeks. I became worried that the obstinacy of a few passionate advocates could under­mine Wikipedia’s reliance on the wisdom of crowds. But after a while, the edit wars ended, and the article no longer had Einstein going to Albania. At first I didn’t credit that success to the wisdom of crowds, since the push for a fix had come from me and not from the crowd. Then I realized that I, like thousands of others, was in fact a part of the crowd, occasionally adding a tiny bit to its wisdom.”

Tags: , ,
