Two videos about Francis Ford Coppola’s 1974 masterwork The Conversation, a movie about the consequences, intended and unintended, of the clever devices we create and how the tools of security can make us insecure.

The first clip is an interview with the director conducted at the time of the film, in which he recognizes his influences. In the second, Coppola wordlessly receives the Palme d’Or at the Cannes Film Festival, to some applause and a few catcalls. Tony Curtis walks him off stage.


Haircut

Looking for a haircut by a professional nude hairdresser. Pls provide picture and your rate. Your location. Thanks. 

Ricky Jay is to playing cards as Nikola Tesla was to electrical currents–brilliant, thrilling, dangerous, shocking–and having the masterly and stylish critic Tom Carson write of him for Grantland is happiness. Jay, who has holes in his memory but none in his logic, somehow knows things we don’t, even in this age when everything is seemingly known. It’s like magic. An excerpt:

Jay even survived the perils of being in fashion, which happened when Miley Cyrus was a toddler. One of the true oddities of the ’90s was that magic — nobody’s idea of chic entertainment in decades, or maybe ever — got trendy out of the blue. Penn & Teller became hipster heroes, David Copperfield graduated from cultural acne to showbiz Death Star, and you couldn’t piss out of a skyscraper without hitting David Blaine. Since “Who are you going to believe: me or your own lying eyes?” was basically Bill Clinton’s motto, PhD dissertations have probably been written about the culture’s unconscious groping for analogues to the hat-trick expert in the White House.

When schlockmeisters and the culturati end up on the same page, something interesting is usually afoot. Fox got count-’em four ratings bonanzas out of Breaking the Magician’s Code: Magic’s Greatest Secrets Finally Revealed. (They were awesomely cretinous, and I don’t think I missed one.) Literature got in on the act — a bit late, as usual — with Glen David Gold’s Carter Beats the Devil. Then came 9/11, and whaddya know? The whole vogue turned quaint damn near overnight. That’s why 59-year-old Penn and 66-year-old Teller, whose six-nights-a-week Las Vegas residency is now in its 14th year — they settled in at the Rio in 2001, almost like they’d figured out the cool-kids jig was up — are still the youngest and, ahem, “edgiest” professional magicians whose names anyone is likely to recognize.

Jay got cast as the caviar edition. A long and awestruck New Yorker profile by Mark Singer is still the closest thing to an intimate portrait he’s ever sat for, and was followed in 1994 by the first of his one-man Broadway shows, Ricky Jay and His 52 Assistants — directed by David Mamet, who went on to oversee two more. Because Jay’s card wizardry works only in jewel-box-size theaters, scoring tickets conferred instant membership in the hipoisie, and I should know: I saved a discarded eight of clubs from his act for years.

Adding to the nimbus of classiness, he was and is a formidably erudite and genial historian of his whole branch of the popular arts from the 15th century to now, with half a dozen books packed with esoteric wonders to his credit. He’s lectured on magic versus spiritualism at Princeton and on confidence games at police conventions. Then there’s his movie work, not only as an actor — for Anderson and Mamet, most memorably — but also as a consultant on big-screen illusions.

What he hasn’t done, at least in any obvious way, is cash in.•


No one is more moral for eating pigs and cows rather than dogs and cats, just more socially acceptable.

Dining on horses, meanwhile, has traditionally fallen into a gray area in the U.S. Americans have historically had a complicated relationship with equine flesh, often publicly saying nay to munching on the mammal, though the meat has had its moments–lots of them, actually. From a Priceonomics post by Zachary Crockett, a passage about the reasons the animal became a menu staple in the fin de siècle U.S.: 

Suddenly, at the turn of the century, horse meat gained an underground cult following in the United States. Once only eaten in times of economic struggle, its taboo nature now gave it an aura of mystery; wealthy, educated “sirs” indulged in it with reckless abandon.

At Kansas City’s Veterinary College’s swanky graduation ceremony in 1898, “not a morsel of meat other than the flesh of horse” was served. “From soup to roast, it was all horse,” reported the Times. “The students and faculty of the college…made merry, and insisted that the repast was appetizing.”

Not to be left out, Chicagoans began to indulge in horse meat to the tune of 200,000 pounds per month — or about 500 horses. “A great many shops in the city are selling large quantities of horse meat every week,” then-Food Commissioner R.W. Patterson noted, “and the people who are buying it keep coming back for more, showing that they like it.”

In 1905, Harvard University’s Faculty Club integrated “horse steaks” into their menu. “Its very oddity — even repulsiveness to the outside world — reinforced their sense of being members of a unique and special tribe,” wrote the Times. (Indeed, the dish was so revered by the staff, that it continued to be served well into the 1970s, despite social stigmas.)

The mindset toward horse consumption began to shift — partly thanks to a changing culinary landscape. Between 1900 and 1910, the number of food and dairy cattle in the US decreased by nearly 10%; in the same time period, the US population increased by 27%, creating a shortage of meat. Whereas animal rights groups once opposed horse slaughter, they now began to endorse it as more humane than forcing aging, crippled animals to work.

With the introduction of the 1908 Model-T and the widespread use of the automobile, horses also began to lose their luster a bit as man’s faithful companions; this eased apprehension about putting them on the table with a side of potatoes (“It is becoming much too expensive a luxury to feed a horse,” argued one critic).

At the same time, the war in Europe was draining the U.S. of food supplies at an alarming rate. By 1915, New York City’s Board of Health, which had once rejected horse meat as “unsanitary,” now touted it as a sustainable wartime alternative for meatless U.S. citizens. “No longer will the worn out horse find his way to the bone-yard,” proclaimed the board’s Commissioner. “Instead, he will be fattened up in order to give the thrifty another source of food supply.”

Prominent voices began to sprout up championing the merits of the meat.•


I’m not a geneticist, but I doubt successful, educated parents are necessarily more likely to have preternaturally clever children than their poorer counterparts, as is argued in a new Economist article about the role of education in America’s spiraling wealth inequality. Of course, monetary resources can help provide a child every chance to realize his or her abilities, ensuring opportunities often denied to those from families of lesser material means. That, rather than genes, is the main threat to meritocracy. An excerpt:

Intellectual capital drives the knowledge economy, so those who have lots of it get a fat slice of the pie. And it is increasingly heritable. Far more than in previous generations, clever, successful men marry clever, successful women. Such “assortative mating” increases inequality by 25%, by one estimate, since two-degree households typically enjoy two large incomes. Power couples conceive bright children and bring them up in stable homes—only 9% of college-educated mothers who give birth each year are unmarried, compared with 61% of high-school dropouts. They stimulate them relentlessly: children of professionals hear 32m more words by the age of four than those of parents on welfare. They move to pricey neighbourhoods with good schools, spend a packet on flute lessons and pull strings to get junior into a top-notch college.

The universities that mould the American elite seek out talented recruits from all backgrounds, and clever poor children who make it to the Ivy League may have their fees waived entirely. But middle-class students have to rack up huge debts to attend college, especially if they want a post-graduate degree, which many desirable jobs now require. The link between parental income and a child’s academic success has grown stronger, as clever people become richer and splash out on their daughter’s Mandarin tutor, and education matters more than it used to, because the demand for brainpower has soared. A young college graduate earns 63% more than a high-school graduate if both work full-time—and the high-school graduate is much less likely to work at all. For those at the top of the pile, moving straight from the best universities into the best jobs, the potential rewards are greater than they have ever been.

None of this is peculiar to America, but the trend is most visible there. This is partly because the gap between rich and poor is bigger than anywhere else in the rich world—a problem Barack Obama alluded to repeatedly in his state-of-the-union address on January 20th (see article). It is also because its education system favours the well-off more than anywhere else in the rich world.•

From the February 23, 1919 Brooklyn Daily Eagle:

One little girl in Burford Bridge, England, has made a record of having killed 1,415 butterflies in a butterfly killing contest held in the schools of that district. We wonder what there may be about so beautiful and harmless an insect as a butterfly to warrant engaging school children in such a murderous employment, and we hope there were more than a few children who made a very poor showing in the competition, believing that for the most of them the contest offered little inspiration.•

For those who read Lolita after the Sexual Revolution of ’60s and ’70s had ended, how can the book appear like anything but an amazing piece of writing about a horrifying “romance”? But I suppose for some of the young who came of age during the carnal tumult of that earlier time, the novel seemed like a different thing–or at least the culture told them it was. In the opening question of an interview conducted by Erik Morse of the Los Angeles Review of Books, Emily Prager, the novelist and journalist who briefly appeared on the original iteration of Saturday Night Live, astutely explains the generational differences in interpretations of the controversial work:

Erik Morse:

Do you remember when you first read Lolita? What were your initial impressions, both of Nabokov’s story and the character of Lo?

Emily Prager:

I don’t remember when I read Lolita but the idea of Lolita was a large part of the ’60s when I matured. Recently I saw the now 50ish-year-old woman whom Roman Polanski allegedly raped. She kept stammering that it was a different time, that you can’t judge Polanski by today’s standards. That’s because the Lolita idea was everywhere — there was a book with almost softcore photos of baby ballerinas that was on every coffee table, tons of very young women with much older men and it was okay. Men ruled after all. Many took Humbert Humbert as their role model. They liked him best of all. A few years ago, I went to dinner with some women who had grown up in the ’60s. It was when the new attitude toward sexual harassment in the workplace was surfacing. We had a great laugh because every single one of us had been importuned in the workplace constantly. When I was 17 and a prop girl off-Broadway, we had to kiss the house manager when we arrived at work. We rolled our eyes and did it. We thought it was ridiculous and those who asked it of us ludicrous. Lolita, the movie, came out in 1962, and it was with Peter Sellers and Stanley Kubrick directing and it was cool. We all wanted the heart-shaped sunglasses. You know, the myth of the ’60s is that it was all about sex. The truth is we knew nothing about sex except what society told us, which was it was bad. We just didn’t want anyone anymore saying anything to us about how to think about sex. So sexual liberation had to include Lolita. It was every girl for herself. You can’t believe how innocent we were. I doubt most of us registered that she might be being taken advantage of. The other thing was that very young boys were going to fight and die in Vietnam, not 12 but 18, which then was about 13. Young girls having sex didn’t seem that wrong. Of course you read Lolita now — I teach it in my fiction-writing course and modern girls are disgusted by it, horrified.•


In a Backchannel interview largely about strategies for combating global poverty, Steven Levy asks Bill Gates about the existential threat of superintelligent AI. The Microsoft founder sides more with Musk than Page. The exchange:

Steven Levy:

Let me ask an unrelated question about the raging debate over whether artificial intelligence poses a threat to society, or even the survival of humanity. Where do you stand?

Bill Gates:

I think it’s definitely important to worry about. There are two AI threats that are worth distinguishing. One is that AI does enough labor substitution fast enough to change work policies, or [affect] the creation of new jobs that humans are uniquely adapted to — the jobs that give you a sense of purpose and worth. We haven’t run into that yet. I don’t think it’s a dramatic problem in the next ten years but if you take the next 20 to 30 it could be. Then there’s the longer-term problem of so-called strong AI, where it controls resources, so its goals are somehow conflicting with the goals of human systems. Both of those things are very worthy of study and time. I am certainly not in the camp that believes we ought to stop things or slow things down because of that. But you can definitely put me more in the Elon Musk, Bill Joy camp than, let’s say, the Google camp on that one.•


Al Goldstein, currently masturbating in a casket, interviewing Gilbert Gottfried on Midnight Blue in the early ’90s. Definitely NSFW, unless you work in a blowjob store.


In a Potemkin Review interview conducted by Antoine Dolcerocca and Gokhan Terzioglu, Thomas Piketty discusses what he believes is the source of long-term, even permanent, productivity growth, which, of course, can occur without reasonably equitable distribution, the main focus of his book Capital in the 21st Century. An excerpt:

Question:

What do you see as a source of perpetual productivity growth?

Thomas Piketty:

Simply accumulation of new knowledge. People go to school more, knowledge in science increases and that is the primary reason for productivity growth. We know more right now in semiconductors and biology than we did twenty years ago, and that will continue.

Question:

You argue in the book that while Marx made his predictions in the 19th century, we now know that sustained productivity growth is possible with knowledge (Solow residual, etc.). But do you think this can be a sustained process in the long run?

Thomas Piketty:

Yes, I do think that we can make inventions forever. The only thing that can make it non-sustainable is if we destroy the planet in the meantime, but I do not think that is what Marx had in mind. It can be a serious issue because we need to find new ways of producing energy in order to make it sustainable, or else this will come to a halt. However if we are able to use all forms of renewable energy, immaterial growth of knowledge can continue forever, or at least for a couple of centuries. There is no reason why technological progress should stop and population growth could also continue for a little more.•


Along with the progress being made with driverless cars and 3D bio-printers, the thing that has amazed me the most–alarmed me also–since I’ve been doing this blog has been the efforts of Boston Dynamics, the robotics company now owned by Google. The creations are so stunning that I hope the creators will remember that the applications of their machines are at least as important as the accomplishment of realizing the designs. At any rate, the Atlas robot is now untethered, liberated from its safety cord, operating freely via batteries.

At 3 Quarks Daily, Thomas Rodham Wells, that bearded philosopher, delivers a spanking to adults who accord small children greater value than they would others. At first blush, it seems a pedestrian argument. Little ones, still dependent, need us more, so they are prioritized. Pretty sensible. But as Wells makes clear, the cult of children informs moral decisions and perspectives in ways that may be out of proportion.

I think at the heart of the issue is that we hold out hope that babies will turn out better than the rest of us did, and we’d like to enable that opportunity. Once they’ve grown and fallen into the middle of the pack like most do, that hope is extinguished. It certainly can’t just be that we like to make ourselves feel good by protecting those who are more defenseless, because a lot of adults, impoverished or ill, also fit that category. An excerpt:

Children are special in one particular, their extreme neediness. They have quite specific, often urgent needs that only suitably motivated adults can meet, and the younger they are, the greater their neediness. That makes children’s care and protection a moral priority in any civilised society – there are lots of things that aren’t as important and should rightly give way to meeting children’s needs. As a result, children create multiple obligations upon their care-givers, as well as second-order obligations on society in general, to ensure those needs are met.

Yet the fact that you should give way to an ambulance attending an emergency doesn’t mean that the person in the ambulance is more important than you; only that her needs right now are more important than you getting to work on time. Likewise, the immanence of children’s neediness should often determine how we rank the priorities of actions we want to do, such as interrupting a movie to attend to a baby’s cries. But such an action ranking is not a guide to the relative worth of children and adults, or of babies and teenagers. There will surely be times when something even more urgent occurs – such as someone having a heart-attack in front of you – that requires a baby’s cries be neglected for the moment. 

II

The confusion of neediness with worth is only one source of the confusion though. The other major source is rather more blameworthy: the valorisation of psychological immaturity. For a peculiarity of the moral priority we grant to the neediness of children is that we do not apply it to equally needy adults, most obviously those whose mental and physical faculties decline in old age in a somewhat symmetrical way to the development of those faculties in children. If we only cared about neediness we would care more about, and take on more personal responsibility for, meeting the needs of the disabled in general without regard for their age. 

Of course we don’t do that. We seem to place a special value on children because of their blankness, the fact that they have not thought or done anything interesting or important yet and that their identity – their relationship to themselves and to others – is still unformed. (Some abortion activists make a great deal of the innocence of foetuses, the ultimate non-achievers.) As children grow up and become more like people, with a life of their own – friends and favourite things and secrets and dreams and ideas of their own – they seem to become less valuable. 

I can’t explain this bizarre phenomenon.•


In a Business Insider piece, tech entrepreneur Hank Williams neatly dissects the problem of the intervening period between the present and that future moment when material plenty arrives, which is hopefully where technology is taking us. How hungry will people get before the banquet is served? I don’t know that I agree with his prediction that more jobs will move to China; outsourcing will likely come to mean out of species more than out of country. An excerpt:

When you read in the press the oft-quoted concept that “those jobs aren’t coming back” this “reduction of need” is what underlies all of it. Technology has reduced the need for labor. And the labor that *is* needed can’t be done in more developed nations because there are people elsewhere who will happily provide that labor less expensively.

In the long term, technology is almost certainly the solution to the problem. When we create devices that individuals will be able to own that will be able to produce everything that we need, the solution will be at hand. This is *not* science fiction. We are starting to see that happen with energy with things like rooftop solar panels and less expensive wind turbines. We are nowhere near where we need to be, but it is obvious that eventually everyone will be able to produce his or her own energy.

The same will be true for clothing, where personal devices will be able to make our clothing in our homes on demand. Food will be commoditized in a similar way, making it possible to have the basic necessities of life with a few low cost source materials.

The problem is that we are in this awful in-between phase of our planet’s productivity curve. Technology has vastly reduced the number of workers and resources that are required to make what the planet needs. This means that a small number of people, the people in control of the creation of goods, get the benefit of the increased productivity. When we get to the end of this curve and everyone can, in essence, be their own manufacturer, things will be good again. But until we can ride this curve to its natural stopping point, there will be much suffering, as the jobs that technology kills are not replaced.

The political implications of this are staggering.•


I’m sure the advent of commercial aviation was met with prejudices about the new-fangled machines, but it took quite a while to perfect automated co-pilots and the navigation of wind shears, so horrifying death was probably also a deterrent. In the article below from the September 22, 1929 Brooklyn Daily Eagle (which is sadly chopped off a bit in the beginning), the unnamed author looks at a selected history of technophobia. 

 

I’m not worried about conscious, superintelligent machines doing away with humans anytime soon. As far as I can see into the future, I’m more concerned about the economic and ethical ramifications of Weak AI and the proliferation of automation. That will be enough of a challenge. If there is to be a people-killing “plague,” it will likely come from environmental devastation of our own making. That’s the “machine” we’ve unloosed.

On the topic of the Singularity, the excellent Edge.org asked a raft of thinkers in various disciplines to ponder this question: “What do you think about machines that think?” Excerpts follow from responses by philosopher Daniel C. Dennett, journalist William Poundstone and founding Wired editor Kevin Kelly.

___________________________

From Dennett:

The Singularity—an Urban Legend?

The Singularity—the fateful moment when AI surpasses its creators in intelligence and takes over the world—is a meme worth pondering. It has the earmarks of an urban legend: a certain scientific plausibility (“Well, in principle I guess it’s possible!”) coupled with a deliciously shudder-inducing punch line (“We’d be ruled by robots!”). Did you know that if you sneeze, belch, and fart all at the same time, you die? Wow. Following in the wake of decades of AI hype, you might think the Singularity would be regarded as a parody, a joke, but it has proven to be a remarkably persuasive escalation. Add a few illustrious converts—Elon Musk, Stephen Hawking, and David Chalmers, among others—and how can we not take it seriously? Whether this stupendous event takes place ten or a hundred or a thousand years in the future, isn’t it prudent to start planning now, setting up the necessary barricades and keeping our eyes peeled for harbingers of catastrophe?

I think, on the contrary, that these alarm calls distract us from a more pressing problem, an impending disaster that won’t need any help from Moore’s Law or further breakthroughs in theory to reach its much closer tipping point: after centuries of hard-won understanding of nature that now permits us, for the first time in history, to control many aspects of our destinies, we are on the verge of abdicating this control to artificial agents that can’t think, prematurely putting civilization on auto-pilot. The process is insidious because each step of it makes good local sense, is an offer you can’t refuse. You’d be a fool today to do large arithmetical calculations with pencil and paper when a hand calculator is much faster and almost perfectly reliable (don’t forget about round-off error), and why memorize train timetables when they are instantly available on your smart phone? Leave the map-reading and navigation to your GPS system; it isn’t conscious; it can’t think in any meaningful sense, but it’s much better than you are at keeping track of where you are and where you want to go.•

___________________________

From Poundstone:

Can Submarines Swim?

My favorite Edsger Dijkstra aphorism is this one: “The question of whether machines can think is about as relevant as the question of whether submarines can swim.” Yet we keep playing the imitation game: asking how closely machine intelligence can duplicate our own intelligence, as if that is the real point. Of course, once you imagine machines with human-like feelings and free will, it’s possible to conceive of misbehaving machine intelligence—the AI as Frankenstein idea. This notion is in the midst of a revival, and I started out thinking it was overblown. Lately I have concluded it’s not.

Here’s the case for overblown. Machine intelligence can go in so many directions. It is a failure of imagination to focus on human-like directions. Most of the early futurist conceptions of machine intelligence were wildly off base because computers have been most successful at doing what humans can’t do well. Machines are incredibly good at sorting lists. Maybe that sounds boring, but think of how much efficient sorting has changed the world.

In answer to some of the questions brought up here, it is far from clear that there will ever be a practical reason for future machines to have emotions and inner dialog; to pass for human under extended interrogation; to desire, and be able to make use of, legal and civil rights. They’re machines, and they can be anything we design them to be.

But that’s the point. Some people will want anthropomorphic machine intelligence.•

___________________________

From Kelly:

Call Them Artificial Aliens

The most important thing about making machines that can think is that they will think different.

Because of a quirk in our evolutionary history, we are cruising as the only sentient species on our planet, leaving us with the incorrect idea that human intelligence is singular. It is not. Our intelligence is a society of intelligences, and this suite occupies only a small corner of the many types of intelligences and consciousnesses that are possible in the universe. We like to call our human intelligence “general purpose” because compared to other kinds of minds we have met it can solve more kinds of problems, but as we build more and more synthetic minds we’ll come to realize that human thinking is not general at all. It is only one species of thinking.

The kind of thinking done by the emerging AIs in 2014 is not like human thinking. While they can accomplish tasks—such as playing chess, driving a car, describing the contents of a photograph—that we once believed only humans can do, they don’t do it in a human-like fashion. Facebook has the ability to ramp up an AI that can start with a photo of any person on earth and correctly identify them out of some 3 billion people online. Human brains cannot scale to this degree, which makes this ability very un-human. We are notoriously bad at statistical thinking, so we are making intelligences with very good statistical skills, in order that they don’t think like us. One of the advantages of having AIs drive our cars is that they won’t drive like humans, with our easily distracted minds.

In a pervasively connected world, thinking different is the source of innovation and wealth. Just being smart is not enough. Commercial incentives will make industrial strength AI ubiquitous, embedding cheap smartness into all that we make. But a bigger payoff will come when we start inventing new kinds of intelligences, and entirely new ways of thinking. We don’t know what the full taxonomy of intelligence is right now.•


“No Adult Wet Nursing.”

Surrogate with lots of milk! – $2

*****NO WET NURSING!!*****

I am a 24 year old gestational surrogate who delivered on Dec 3rd 2014. I pumped and shipped milk for my surro son for 6 weeks and as of 1/12/2015 I am no longer pumping for him.

I am STD and drug free, I have rigorously been tested for all STD’s, drugs and diseases through a fertility clinic in California as well as my OB in Florida. If you or anyone is looking for fresh or frozen breast milk weather it’s for a baby, cancer patient, adult, or body builder then send me a message

I use a Medela Symphony hospital grade breast pump and produce 40-50oz per day.

I am looking to be compensated for my time, wear and tear on pump, and pumping supplies at a rate of $2 per ounce. I am also willing to ship.

Please contact me for any questions or inquires.

Please don’t waste my time, No Adult Wet Nursing, No Pictures, No Videos, No Checks accepted, and No Scams.

First the bad news: We’re dying people on a dying planet in a dying universe. The good news: We’re hastening the destruction of the delicate balance of factors which enable our transient-but-amazing existence. Oh wait, that’s also bad.

In a New York Times piece, astrophysicist Adam Frank looks out at all the dead space in our solar system to analyze our own precariousness. An excerpt:

The defining feature of a technological civilization is the capacity to intensively “harvest” energy. But the basic physics of energy, heat and work known as thermodynamics tell us that waste, or what we physicists call entropy, must be generated and dumped back into the environment in the process. Human civilization currently harvests around 100 billion megawatt hours of energy each year and dumps 36 billion tons of carbon dioxide into the planetary system, which is why the atmosphere is holding more heat and the oceans are acidifying. As hard as it is for some to believe, we humans are now steering the planet, however poorly.

Can we generalize this kind of planetary hijacking to other worlds? The long history of Earth provides a clue. The oxygen you are breathing right now was not part of our original atmosphere. It was the so-called Great Oxidation Event, two billion years after the formation of the planet, that drove Earth’s atmospheric content of oxygen up by a factor of 10,000. What cosmic force could so drastically change an entire planet’s atmosphere? Nothing more than the respiratory excretions of anaerobic bacteria then dominating our world. The one gas we most need to survive originated as deadly pollution to our planet’s then-leading species: a simple bacterium.

The Great Oxidation Event alone shows that when life (intelligent or otherwise) becomes highly successful, it can dramatically change its host planet. And what is true here is likely to be true on other planets as well.

But can we predict how an alien industrial civilization might alter its world? From a half-century of exploring our own solar system we’ve learned a lot about planets and how they work. We know that Mars was once a habitable world with water rushing across its surface. And Venus, a planet that might have been much like Earth, was instead transformed by a runaway greenhouse effect into a hellish world of 800-degree days.

By studying these nearby planets, we’ve discovered general rules for both climate and climate change.•


There’s never been any material evidence linking Mohamedou Ould Slahi to the 9/11 terrorists, but he’s spent the past dozen years a prisoner in Guantanamo Bay. Slahi’s just-published diary makes claims of sexual abuse, among other forms of torture, punishment which he says forced him to lie and implicate others he did not know for crimes he knew nothing about. An excerpt via Britta Sandberg at Spiegel:

As soon as I stood up, the two _______ took off their blouses, and started to talk all kind of dirty stuff you can imagine, which I minded less. What hurt me most was them forcing me to take part in a sexual threesome in the most degrading manner. What many _______ don’t realize is that men get hurt the same as women if they’re forced to have sex, maybe more due to the traditional position of the man. Both _______ stuck on me, literally one on the front and the other older _______ stuck on my back rubbing ____ whole body on mine.

At the same time they were talking dirty to me, and playing with my sexual parts. I am saving you here from quoting the disgusting and degrading talk I had to listen to from noon or before until 10 p.m. when they turned me over to _______, the new character you’ll soon meet.

To be fair and honest, the _______ didn’t deprive me from my clothes at any time; everything happened with my uniform on. The senior _______________ was watching everything _____________________________________________________. I kept praying all the time.

“Stop the fuck praying! You’re having sex with American _______ and you’re praying? What a hypocrite you are!” said ______________ angrily, entering the room.

I refused to stop speaking my prayers, and after that, I was forbidden to perform my ritual prayers for about one year to come. I also was forbidden to fast during the sacred month of Ramadan October 2003, and fed by force. During this session I also refused to eat or to drink, although they offered me water every once in a while. “We must give you food and water; if you don’t eat it’s fine.”

I was just wishing to pass out so I didn’t have to suffer, and that was really the main reason for my hunger strike; I knew people like these don’t get impressed by hunger strikes. Of course they didn’t want me to die, but they understand there are many steps before one dies. “You’re not gonna die, we’re gonna feed you up your ass,” said ____________.

I have never felt as violated in myself as I had since the DoD team started to torture me to get me admit to things I haven’t done. (…)•


There’s a line near the end of 1973’s Westworld, after things have gone haywire, that speaks to concerns about Deep Learning. A technician, who’s asked why the AI has run amok and how order can be restored, answers: “They’ve been designed by other computers…we don’t know exactly how they work.”

At Google, search has never been the point. It’s been an AI company from the start, Roomba-ing information to implement in a myriad of automated ways. Deep Learning is clearly a large part of that ultimate search. On that topic, Steven Levy conducted a Backchannel interview with Demis Hassabis, the company’s Vice President of Engineering for AI projects, who is a brilliant computer-game designer. For now, it’s all just games. An excerpt:

Steven Levy:

I imagine that the more we learn about the brain, the better we can create a machine approach to intelligence.

Demis Hassabis:

Yes. The exciting thing about these learning algorithms is they are kind of meta level. We’re imbuing it with the ability to learn for itself from experience, just like a human would do, and therefore it can do other stuff that maybe we don’t know how to program. It’s exciting to see that when it comes up with a new strategy in an Atari game that the programmers didn’t know about. Of course you need amazing programmers and researchers, like the ones we have here, to actually build the brain-like architecture that can do the learning.

Steven Levy:

In other words, we need massive human intelligence to build these systems but then we’ll —

Demis Hassabis:

… build the systems to master the more pedestrian or narrow tasks like playing chess. We won’t program a Go program. We’ll have a program that can play chess and Go and Crosses and Drafts and any of these board games, rather than reprogramming every time. That’s going to save an incredible amount of time. Also, we’re interested in algorithms that can use their learning from one domain and apply that knowledge to a new domain. As humans, if I show you some new board game or some new task or new card game, you don’t start from zero. If you know to play bridge and whist and whatever, I could invent a new card game for you, and you wouldn’t be starting from scratch—you would be bringing to bear this idea of suits and the knowledge that a higher card beats a lower card. This is all transferable information no matter what the card game is.•


Marriage may be the ultimate social safety net, but for many it seems a trap. The heart wants what it wants–and doesn’t want what it doesn’t want–and emotions more than economics drive personal lives. Sometimes politicos believe, however, that divorce and single parenting are driven mostly by public policy.

It’s a bipartisan folly, with even the very social conservatives who most want the government’s finger out of the pie doing a turnabout when a wedding cake is involved. Marriage shouldn’t be de-incentivized by public policy, obviously, but if the institution isn’t taken as seriously in America as it once was, that may be because it was enabled by certain inequalities and prejudices we’re better off without. The people have voted. From the Economist:

When marriage is hitched to politics the result is usually muddled thinking. Social conservatives think that lax attitudes to sex, a decline in manliness, short skirts and a hundred other things have chipped away at a sacred institution. The Heritage Foundation, a think-tank with a “Marshall Plan for Marriage”, recently puffed a study suggesting that online pornography was the cause of the rot. People who reckon culture is to blame often propose economic solutions, from getting rid of marriage penalties to using public policy to promote wedlock. Thus some conservatives, who tend to assume that the government mucks up everything it tries, are nonetheless arguing that it can revive the traditional family. Leftish Democrats, meanwhile, think that marriage has been undermined by rising inequality, and especially the low wages of unskilled men, which make them less attractive as mates. They tend to argue that marriage, unlike practically every other social problem, cannot be fixed by government.

Both these views are confused. There are indeed marriage penalties in the tax code and in the welfare system: a single mother who marries a man with a job can lose all kinds of means-tested benefits. But there are also some marriage bonuses, and the tax code is so complicated that few Americans know whether tying the knot will mean they owe the taxman more or less. The federal government has made $114 billion-worth of pro-marriage fiddles to tax laws in the past decade with nothing much to show for it. And there is no evidence from decades of marriage-promotion programmes that the government can persuade people to get or stay hitched, a finding that will not surprise anyone who has ever actually been married.

As for the notion that inequality is to blame, that is muddled too. Most of the increase in income inequality has been at the very top of the scale: it is hard to see how the vast pay packet of a hedge-funder in New York changes the intentions of someone waiting tables in Utah. Though the wealthy are much more likely to wed than the poor, the relationship between money and vows is not clear-cut. Lots of people who decided to marry a few generations ago were poorer than those who choose not to today. Nor did marriage rates decline in the 1920s, when the surge in stock prices gilded the incomes of rich Americans. This tangle over inequality blinds Democrats to the possibility that causation may run in the opposite direction: that unwed parents raise poorer children. Isabel Sawhill of the Brookings Institution, a think-tank, calculates that returning marriage rates to their 1970 level would lower the child-poverty rate by a fifth. This omission may be deliberate: Democrats are reluctant to offend unmarried women, 60% of whom voted for the party’s candidates in 2014.

A debate about marriage should begin by acknowledging that the high rates of the 1950s and 1960s were a peak rather than the norm. The marriage rate in America has only recently dipped below where it was at the end of the 19th century, according to Andrew Cherlin of Johns Hopkins University. Reviving marriage rates of the 1950s, an era looked on fondly both by conservatives (who remember an America as wholesome as its cereal adverts) and by liberals (who recall an age when well-paid jobs were available for people with few qualifications) would require reviving some of that decade’s less jolly features too.•

 

Companies really want robots to take your job, and pretty much any task that can be performed by either humans or machines will be ceded to our silicon sisters. Your career security may depend on how far engineers can develop these simulacra. Case in point: Toshiba’s Chihira Aico, a female android who can already read the news in only slightly more wooden fashion than your local anchor. What more will she learn? From Susan Kalla at Forbes:

At the CES, the crowd was mesmerized by Toshiba’s talking android in a pink party dress. She stood quietly, looking like a mannequin until she sprang to life exclaiming, “I can be a news reader, consultant or cheerleader!” Throwing her arms up in the air, she squealed, “I can get excited!”

Chihira is a concept model and her creators are exploring applications while working on ways to make her seem more human. They are refining her movements and language skills. She has a limited range of motion, and the abrupt thrusts of her arms can remind you of Frankenstein. She can do a great presentation, but developers are not satisfied; they want her to interact with people.

She’s a complicated machine. Over 40 motors in her joints coordinate her moves, driven by  software developed by Toshiba under the direction of Hitoshi Tokuda. The 15 tiny air pumps in her face control the blinking of her eyes and move her jaws and mouth as she speaks. Osaka University managed the muscle research for Chihira, building on previous work on prosthetic limbs.

Chihira may seem creepy, but businesses are serious about developing androids to cut costs. Hospitals are running trials with the robot, and she’s being retrofitted for assisted living. Of course, life-like robots may eventually take your job. The field of robotics is advancing quickly and many universities are racing to stake a claim.•


From the June 20, 1942 Brooklyn Daily Eagle:

Nottingham, England — A strange man called on Mrs. Mabel Foulkes yesterday, said he had come at the request of a friend of hers to examine her teeth, then pushed her into a chair, extracted one of her teeth and ran out of the house, exclaiming, “What a beauty!”

Police said the man produced a forceps from his pocket and shoved it into her mouth before she could protest. Mrs. Foulkes fainted.•


Some people tell a certain story at a particular time and everyone wants to believe it, even though it couldn’t possibly be true. Usually, these tall tales have something to do with unattainable wealth of one kind or another and our deep desire to possess it. Charlie Smith was just such a storyteller and his fortune was longevity. No one will argue that he didn’t have a very good run, but Smith didn’t make it as close to 137 as he wanted people to believe.

Smith became something of a minor celebrity in the 1960s-70s with his “memories” of life on plantations and on the frontier, claiming to have been born in 1842 (though documents uncovered later put the lie to these assertions). His renown grew to the point that he was invited to watch the moon launch at the Kennedy Space Center. He doubted aloud (without irony) that the space mission was anything but a hoax.

Life magazine took Smith very seriously in 1972, seven years before he died, providing an interesting story if not a factual one. An excerpt from the article:

A researcher from the Martin Luther King Center in Boston traveled to Barstow, Florida, late last month to stick a microphone into the deeply furrowed face of Charlie Smith. The purpose was to add Smith’s recollections to the center’s black oral history bank.

What could this retired candy store owner from backwoods Florida have to offer? Among other things, memories of slavery, the Civil War and Jesse James.

Charlie Smith has become the object of historical research because he has attained the incredible age of 130. He is the oldest living American.

For three hours Smith talked into the tape recorder, and even sang a couple of frontier ballads. He described being lured onto a slave ship in Liberia by tales of ‘fritter trees’ in far-off America, then being put on the auction block in New Orleans. He wound up on a Texas plantation owned by a Charlie Smith, whose name he adopted. Freed during the Civil War, Smith told of years as a cowpuncher, gambler, bootlegger and outlaw.

“Ain’t nobody ever shot Jesse James,” Smith insists, contrary to historical legend. “He’s dead now, but nobody ever killed him.”

The fine line between fact and fiction sometimes seems to blur in Charlie Smith’s ancient mind. The Social Security Administration verified one thing, however: his age. Its confidence is based on an 1855 bill of sale of a 12-year-old Negro in the New Orleans slave market.•

It’s not just Americans who are in favor of doing rash and violent things against their own interests merely because those things feel patriotic. Look at contemporary Russia. Putin’s aggression, which has contributed mightily to the state’s wrecked economy, was met with approval by the majority of Russians. Briefly elated with a bacchanal of pride, they’re now sick with a long hangover, paying for letting their emotions dictate strategy, just as we did with Iraq.

I’m unconvinced that the U.S. has learned any lesson from that invasion, which occurred under false pretenses, cost us the lives of 4,500 soldiers and more than a trillion dollars, a good amount of it bilked by war contractors. You hear rumblings now from Americans, and in our culture, that we need not just to hear how great the country is but to have it demonstrated in some loud, scary way. That’s, of course, what our enemies want.

We defeat Putin and terrorists and any external threat by not behaving like them or letting them drive us into the rashness they deploy. But can we control ourselves? Terrorism is not nearly our biggest challenge. We are. From Edward Luce at the Financial Times:

For the first time since 2009, US voters cite terrorism as America’s top priority (76 per cent), according to a Pew poll last week. Given the tenacity of the gains of Isis in Syria and Iraq — it has held its ground in spite of US air strikes — and al-Qaeda’s strong advances in Yemen and beyond, it is hard to imagine this will change in 2015. Though it never vanished, terrorism is centre stage again.

The immediate US impact is among Republican White House hopefuls. Only one contender, Rand Paul — the artist formerly known as isolationist, also a senator from Kentucky — diverges from his party’s muscular line on national security. The more crowded the Republican field, the more Mr Paul stands out.

In one sense, this is a real selling point. There are plenty of millennial libertarians out there. But it also makes Mr Paul an increasingly juicy target. He is adjusting rapidly. In the past year he has gone from being an isolationist to a “non-interventionist” and is now a foreign policy “realist”. At this rate he will be a neoconservative before Independence Day.•

 


If global wealth inequality were merely about envy and not concern over an astounding disproportion unrelated to meritocracy, the issue would have gone away after an election cycle. Even Mitt Romney is now pretending to worry about this systemic failure. From Mona Chalabi at FiveThirtyEight:

Eighty people hold the same amount of wealth as the world’s 3.6 billion poorest people, according to an analysis just released from Oxfam. The report from the global anti-poverty organization finds that since 2009, the wealth of those 80 richest has doubled in nominal terms — while the wealth of the poorest 50 percent of the world’s population has fallen. …

Thirty-five of the 80 richest people in the world are U.S. citizens, with combined wealth of $941 billion in 2014. Together in second place are Germany and Russia, with seven mega-rich individuals apiece. The entire list is dominated by one gender, though — 70 of the 80 richest people are men. And 68 of the people on the list are 50 or older.

If those 80 individuals were to bump into each other on Svenborgia, what might they talk about? Retail could be a good conversation starter — 14 of the 80 got their wealth that way. Or they could discuss “extractives” (industries like oil, gas and mining, to which 11 of them owe their fortunes), finance (also 11 of them) or tech (10 of them).•

