Excerpts


Yes, the good stuff you can do with drones is endless, though you could say the same about the bad stuff as well. From a new Economist report on domestic drones, a prognostication on what will be the initial roles of these robots:

“There could be 10,000 drones buzzing around America’s skies by 2017, reckons the FAA. ‘The good stuff you can do is endless,’ says Lucien Miller of Innov8tive Designs, a UAS firm in San Diego county. Estate agents and architects can use them for aerial photography. Energy firms will be able to monitor pieces of vital infrastructure, such as pipelines. Amazon recently caused a stir by saying it was looking into delivery-by-drone, releasing a video of a test run. However, the prospect of automated aircraft being allowed to carry heavy parcels along crowded city streets still seems distant.

At first drones’ main civilian uses, AUVSI predicts, will be in agriculture, followed distantly by public safety. Farmers will be able to monitor their land in detail, pinpointing outbreaks of disease and infestation, for example, or checking soil humidity. They will also be able to apply nutrients and pesticides more precisely. Besides Mr Loh’s drones for fire-and-rescue workers, other potential public-safety uses include police tracking of suspects. Ben Kimbro of Tactical Electronics, a technology firm, says they will find uses in various other ‘dull, dirty and dangerous’ public-service jobs.”

A few excerpts from computer scientist and TV presenter Dr. Christopher Evans’ 1979 interview of J.G. Ballard in the UK version of Penthouse, which was much classier than its US counterpart because all the beaver shots wore bowler hats and had the quaintest accents. 

__________________________

On the transition from the Space Age to the Personal Computer Age:

J.G. Ballard:

In the summer of ’74 I remember standing out in my garden on a bright, clear night and watching a moving dot of light in the sky which I realised was Skylab. I remember thinking how fantastic it was that there were men up there, and I felt really quite moved as I watched it. Through my mind there even flashed a line from every Hollywood aviation movie of the 40s, ‘it takes guts to fly those machines.’ But I meant it. Then my neighbour came out into his garden to get something and I said, ‘Look, there’s Skylab,’ and he looked up and said, ‘Sky-what?’ And I realised that he didn’t know about it, and he wasn’t interested. No, from that moment there was no doubt in my mind that the space age was over.

Dr. Christopher Evans:

What is the explanation for this? Why are people so indifferent?

J.G. Ballard:

I think it’s because we’re at the climactic end of one huge age of technology which began with the Industrial Revolution and which lasted for about 200 years. We’re also at the beginning of a second, possibly even greater revolution, brought about by advances in computers and by the development of information-processing devices of incredible sophistication. It will be the era of artificial brains as opposed to artificial muscles, and right now we stand at the midpoint between these two huge epochs. Now it’s my belief that people, unconsciously perhaps, recognise this and also recognise that the space programme and the conflict between NASA and the Soviet space effort belonged to the first of these systems of technological exploration, and was therefore tied to the past instead of the future. Don’t misunderstand me – it was a magnificent achievement to put a man on the moon, but it was essentially nuts and bolts technology and therefore not qualitatively different from the kind of engineering that built the Queen Mary or wrapped railroads round the world in the 19th century. It was a technology that changed people’s lives in all kinds of ways, and to a most dramatic extent, but the space programme represented its fast guttering flicker.

__________________________

On the PC bringing the world into the home, from social to pornography:

Dr. Christopher Evans:

How do you see the future developing?

J.G. Ballard:

I see the future developing in just one way – towards the home. In fact I would say that if one had to categorise the future in one word, it would be that word ‘home.’ Just as the 20th century has been the age of mobility, largely through the motor car, so the next era will be one in which instead of having to seek out one’s adventures through travel, one creates them, in whatever form one chooses, in one’s home. The average individual won’t just have a tape recorder, a stereo HiFi, or a TV set. He’ll have all the resources of a modern TV studio at his fingertips, coupled with data processing devices of incredible sophistication and power. No longer will he have to accept the relatively small number of permutations of fantasy that the movie and TV companies serve up to him, but he will be able to generate whatever he pleases to suit his whim. In this way people will soon realise that they can maximise the future of their lives with new realms of social, sexual and personal relationships, all waiting to be experienced in terms of these electronic systems, and all this exploration will take place in their living rooms.

But there’s more to it than that. For the first time it will become truly possible to explore extensively and in depth the psychopathology of one’s own life without any fear of moral condemnation. Although we’ve seen a collapse of many taboos within the last decade or so, there are still aspects of existence which are not counted as being legitimate to explore or experience mainly because of their deleterious or irritating effects on other people. Now I’m not talking about criminally psychopathic acts, but what I would consider as the more traditional psychopathic deviancies. Many, perhaps most of these, need to be expressed in concrete forms, and their expression at present gets people into trouble. One can think of a million examples, but if your deviant impulses push you in the direction of molesting old ladies, or cutting girls’ pigtails off in bus queues, then, quite rightly, you find yourself in the local magistrates court if you succumb to them. And the reason for this is that you’re intruding on other people’s life space. But with the new multi-media potential of your own computerised TV studio, where limitless simulations can be played out in totally convincing style, one will be able to explore, in a wholly benign and harmless way, every type of impulse – impulses so deviant that they might have seemed, say to our parents, to be completely corrupt and degenerate.

__________________________

On media decentralization, the camera-saturated society, Reality TV, Slow TV:

Dr. Christopher Evans:

Will people really respond to these creative possibilities themselves? Won’t the creation of these scenarios always be handed over to the expert or professional?

J.G. Ballard:

I doubt it. The experts or professionals only handle these tools when they are too expensive or too complex for the average person to manage them. As soon as the technology becomes cheap and simple, ordinary people get to work with it. One’s only got to think of people’s human responses to a new device like the camera. If you go back 30 or 40 years the Baby Brownie gave our parents a completely new window on the world. They could actually go into the garden and take a photograph of you tottering around on the lawn, take it down to the chemists, and then actually see their small child falling into the garden pool whenever and as often as they wanted to. I well remember my own parents’ excitement and satisfaction when looking at these blurry pictures, which represented only the simplest replay of the most totally commonplace. And indeed there’s an interesting point here. Far from being applied to mammoth productions in the form of personal space adventures, or one’s own participation in a death-defying race at Brands Hatch, it’s my view that the incredibly sophisticated hook-ups of TV cameras and computers which we will all have at our fingertips tomorrow will most frequently be applied to the supremely ordinary, the absolutely commonplace. I can visualise for example a world ten years from now where every activity of one’s life will be constantly recorded by multiple computer-controlled TV cameras throughout the day so that when the evening comes instead of having to watch the news as transmitted by BBC or ITV – that irrelevant mixture of information about a largely fictional external world – one will be able to sit down, relax and watch the real news. And the real news of course will be a computer-selected and computer-edited version of the day’s rushes. ‘My God, there’s Jenny having her first ice cream!’ or ‘There’s Candy coming home from school with her new friend.’ Now all that may seem madly mundane, but, as I said, it will be the real news of the day, as and how it affects every individual. Anyone in doubt about the compulsion of this kind of thing just has to think for a moment of how much is conveyed in a simple family snapshot, and of how rivetingly interesting – to oneself and family only of course – are even the simplest of holiday home movies today. Now extend your mind to the fantastic visual experience which tomorrow’s camera and editing facilities will allow. And I am not just thinking about sex, although once the colour 3-D cameras move into the bedroom the possibilities are limitless and open to anyone’s imagination. But let’s take another level, as yet more or less totally unexplored by cameras, still or movie, such as a parent’s love for one’s very young children. That wonderful intimacy that comes on every conceivable level – the warmth and rapport you have with a two-year-old infant, the close physical contact, his pleasure in fiddling with your tie, your curious satisfaction when he dribbles all over you, all these things which make up the indefinable joys of parenthood. Now imagine these being viewed and recorded by a very discriminating TV camera, programmed at the end of the day, or at the end of the year, or at the end of the decade, to make the optimum selection of images designed to give you a sense of the absolute and enduring reality of your own experience. With such technology interfaced with immensely intelligent computers I think we may genuinely be able to transcend time.
One will be able to indulge oneself in a kind of continuing imagery which, for the first time will allow us to dominate the awful finiteness of life. Great portions of our waking state will be spent in a constant mood of self-awareness and excitement, endlessly replaying the simplest basic life experiences.•


From “Endless Fun,” neuroscientist Michael Graziano’s excellent Aeon article about the implications, many worrisome, of immortality through computer uploading, which sidesteps cryogenics and its frozen heads and gets to the essence beneath–the brain’s data:

Imagine a future in which your mind never dies. When your body begins to fail, a machine scans your brain in enough detail to capture its unique wiring. A computer system uses that data to simulate your brain. It won’t need to replicate every last detail. Like the phonograph, it will strip away the irrelevant physical structures, leaving only the essence of the patterns. And then there is a second you, with your memories, your emotions, your way of thinking and making decisions, translated onto computer hardware as easily as we copy a text file these days.

That second version of you could live in a simulated world and hardly know the difference. You could walk around a simulated city street, feel a cool breeze, eat at a café, talk to other simulated people, play games, watch movies, enjoy yourself. Pain and disease would be programmed out of existence. If you’re still interested in the world outside your simulated playground, you could Skype yourself into board meetings or family Christmas dinners.

This vision of a virtual-reality afterlife, sometimes called ‘uploading’, entered the popular imagination via the short story ‘The Tunnel Under the World’ (1955) by the American science-fiction writer Frederik Pohl, though it also got a big boost from the movie Tron (1982). Then The Matrix (1999) introduced the mainstream public to the idea of a simulated reality, albeit one into which real brains were jacked. More recently, these ideas have caught on outside fiction. The Russian multimillionaire Dmitry Itskov made the news by proposing to transfer his mind into a robot, thereby achieving immortality. Only a few months ago, the British physicist Stephen Hawking speculated that a computer-simulated afterlife might become technologically feasible.”


In a post at Practical Ethics, Dominic Wilkinson asks a thorny question that seems like a simple one at first blush: Should some people, who are considered exceptional, receive health care that others don’t? Of course not, we all would say. Human lives are equal in importance, and our loved ones are just as valuable as the most famous or successful among us. But Wilkinson quickly points out that Nelson Mandela, probably the most beloved among us during his life, received expensive and specialized care that would have been denied almost anyone else in South Africa. But how could we deny Mandela anything, after he sacrificed everything and ultimately led a nation 180 degrees from a civil war that could have cost countless lives? You can’t, really, though I would wager that Peter Singer disagrees with me. The opening of Wilkinson’s post:

There are approximately 150,000 human deaths each day around the world. Most of those deaths pass without much notice, yet in the last ten days one death has received enormous, perhaps unprecedented, attention. The death and funeral of Nelson Mandela have been accompanied by countless pages of newsprint and hours of radio and television coverage. Much has been made of what was, by any account, an extraordinary life. There has been less attention, though, on Mandela’s last months and days. One uncomfortable question has not been asked. Was it ethical for this exceptional individual to receive treatment that would be denied to almost everyone else?

At the age of almost 95, and physically frail, Mandela was admitted to a South African hospital intensive care unit with pneumonia. He remained there for three months before being transferred for ongoing intensive care in a converted room in his own home. Although there are limited details available from media coverage it appears that Mandela received in his last six months a very large amount of highly expensive and invasive medical treatment. It was reported that he was receiving ventilation (breathing machine support) and renal dialysis (kidney machine). This level of treatment would be unthinkable for the vast majority of South Africans, and, indeed, the overwhelming majority of the people with similar illnesses even in developed countries. Frail elderly patients with pneumonia are not usually admitted to intensive care units. They do not have the option of prolonged support with breathing machines and dialysis at home.”


Ima Hogg, 1900.

Scientific studies (which I mostly don’t believe) have long shown that those with more common names fare better in life than those with unique ones. Barack Obama is a small sample size, but he’s done fairly well personally and professionally. And then there’s Ima Hogg, who was the celebrated belle of Texas as well as un-porcine. Well, she did have family connections, so I could be talking about another exception. I suppose the one area in which a name can have an impact is when it allows an employer with biased hiring practices to know the race or ethnicity of the applicant. That does have a bearing on happiness.

The opening of an interesting New Yorker blog post on the topic by Maria Konnikova:

“In 1948, two professors at Harvard University published a study of thirty-three hundred men who had recently graduated, looking at whether their names had any bearing on their academic performance. The men with unusual names, the study found, were more likely to have flunked out or to have exhibited symptoms of psychological neurosis than those with more common names. The Mikes were doing just fine, but the Berriens were having trouble. A rare name, the professors surmised, had a negative psychological effect on its bearer.

Since then, researchers have continued to study the effects of names, and, in the decades after the 1948 study, these findings have been widely reproduced. Some recent research suggests that names can influence choice of profession, where we live, whom we marry, the grades we earn, the stocks we invest in, whether we’re accepted to a school or are hired for a particular job, and the quality of our work in a group setting. Our names can even determine whether we give money to disaster victims: if we share an initial with the name of a hurricane, according to one study, we are far more likely to donate to relief funds after it hits.

Much of the apparent influence of names on behavior has been attributed to what’s known as the implicit-egotism effect: we are generally drawn to the things and people that most resemble us. Because we value and identify with our own names, and initials, the logic goes, we prefer things that have something in common with them. For instance, if I’m choosing between two brands of cars, all things being equal, I’d prefer a Mazda or a Kia.

That view, however, may not withstand closer scrutiny.”


In promoting his new book about Twitter, Nick Bilton sat for an excellent interview with Shaun Randol of the Los Angeles Review of Books. A passage in which the Times reporter describes the changes in journalistic portraits in the Information Age:

Shaun Randol:

Can you speak about writing a narrative using at least four competing memories?

Nick Bilton:

There were over 100 competing memories. Everyone has a different viewpoint of what happened. I interviewed not only the founders and the board members; I interviewed also the people who worked there in the early days, their spouses, their ex-spouses, ex-girlfriends, and ex-boyfriends. I found people who worked at nearby coffee shops. I spoke with anyone who I could have a conversation with.

What I found the most fascinating was that I could go back to social media and use that in my reporting. For example: There’s a moment in the book when Twitter launches at the Love Parade, a rave in San Francisco. Everyone I spoke to believed it happened in June or July, or the beginning of summer. I looked up the Love Parade online and discovered it was in September. So then I went through and searched Jack’s tweets and those of other people from that time, and I ended up finding references of them at the Love Parade in September. Their memories believed it was the beginning of the summer, but they had actually documented it was the end of the summer.

That was the moment I realized that I could use these tweets and social media as a reporting tool for this book. There was a treasure trove of stuff that existed online, whether it was tweets, Flickr photos, videos on YouTube, Facebook updates, or Foursquare updates. These things existed everywhere and allowed me to pinpoint almost with exact accuracy where people were at certain points in time. I was able to untangle all of the somewhat different memories.

Shaun Randol:

There are significant implications of leaving that digital breadcrumb.

Nick Bilton:

Yes. If you want to write a book about me and I won’t let you interview me, you could potentially say what I was doing at certain points in time just by looking at my social media feeds: Foursquare, Facebook, Twitter. For this book, I had access to thousands of emails and other documents, but there were certain events that I could find via social media. The places people had gone. Videos of boat trips they took. Writing and reporting this story was a real eye-opening experience.

Shaun Randol:

I’m reminded of Gay Talese’s famous portrait of Frank Sinatra, ‘Frank Sinatra Has a Cold,’ in which a vivid portrait of the singer was drawn without ever speaking to him.

Nick Bilton:

That’s the piece that everyone attains to when they write a story like this. Imagine how Talese’s piece would have looked if Frank Sinatra was on Twitter and there were photographs on Instagram of him. As you see in my book, there are incredible details about what people were wearing, the temperature that day, and even the gusty wind. I used the internet to find these things. I could look at almanacs to find what the weather was that day and the photos on Flickr of what people were wearing that day.”


From Robert Walker’s well-considered Science 2.0 article explaining why terraforming Mars is a far more fraught operation than merely building working Biospheres, which themselves aren’t easy assignments:

“Our only attempt at making a closed Earth-like ecosystem so far on Earth, in Biosphere 2, failed. There, it was because of an interaction of a chemical reaction with the concrete in the building, which indirectly removed oxygen from the habitat. Nobody predicted this and it was only detected after the experiment was over. The idea itself doesn’t seem to be fundamentally flawed, it was just a mistake of detail.

In the future perhaps we will try a Biosphere 3 or 4, and eventually get it right. When we build self-enclosed settlements in space such as the Stanford Torus, they will surely go wrong too from time to time in the early stages. But again, you can purge poisonous gases from the atmosphere, and replenish its oxygen. In the worst case, you can evacuate the colonists from the space settlement, vent all the atmosphere, sterilize the soil, and start again.

It is a similar situation with Mars, there are many interactions that could go wrong, and we are sure to make a few mistakes to start with. The difference is, if you make a mistake when you terraform a planet, it is likely that you can’t ‘turn back the clock’ and undo your mistakes.

With Mars, we can’t try again with a Mars 2, Mars 3, Mars 4 etc. until we get it right.”


From a Wired piece by Liz Stinson about a printable paper speaker by a French product designer: 

“If you’re the tinkering type, you’ve probably deconstructed a fair number of electronics. It doesn’t take a genius to tear apart a radio, but once you get past the bulk of plastic packaging and down to the guts, you begin to realize that reading the mess of circuits, chips and components is like trying to navigate your way through a foreign country with a map from the 18th century.

But it doesn’t have to be so complicated, says Coralie Gourguechon. ‘Nowadays, we own devices that are too complicated considering the way we really use them,’ she says. Gourguechon, maker of the Craft Camera, believes that in order to understand our electronics, they need to be vastly simpler and more transparent than they currently are.”


Libertarian billionaire Peter Thiel is an interesting guy, though I don’t agree with most of what he says. I’d love, for instance, to see him apply some of his know-how to coming up with solutions for poverty. Like a lot of people in Silicon Valley, he seems to exist on an island where such messy problems don’t register.

From a new Financial Times profile of Thiel by Richard Waters, in which the subject rails against government regulation, some of which might have come in handy on Wall Street during the aughts:

“He sounds equally uncomfortable discussing himself. The ‘ums’ multiply as he tries to explain why he threw in law and banking and came to Silicon Valley to pursue something far more world-changing. ‘There was this decision to move back to California and try something new and different,’ he says as though it were something that happened to someone else.

He is similarly vague when talking about the origins of his personal philosophy. ‘I’ve always been very interested in ideas and trying to figure things out.’ His undergraduate degree, from Stanford University, was in philosophy but his stance against the dominant political philosophy on many issues seems more visceral than intellectual. ‘I think that one of the most contrarian things one can do in our society is try to think for oneself,’ he says.

He only really regains his stride when talking about how technological ambition has gone from the world, leaving what he calls an ‘age of diminished expectations that has slowly seeped into the culture.’ Predictably, given his libertarian bent, much of this is traced back to regulation.

This is his explanation for why the computer industry (which inhabits ‘the world of bits’) has thrived while so many others (‘the world of atoms’) have not. ‘The world of bits has not been regulated and that’s where we’ve seen a lot of progress in the past 40 years, and the world of atoms has been regulated, and that’s why it’s been hard to get progress in areas like biotechnology and aviation and all sorts of material science areas.'”


Moral philosopher Peter Singer is a bothersome man, a stickler, a provider of inconvenient truths. He’s the humanist who’s tough to take. But if you were sick or impoverished, you’d want him on your side. 

Two excerpts: The opening of his new Washington Post editorial about the Make-a-Wish Batkid celebration in San Francisco and the beginning of “The Dangerous Philosopher,” Michael Specter’s 1999 New Yorker profile of the ethicist.

From WaPo:

You’d have to be a real spoilsport not to feel good about Batkid. If the sight of 20,000 people joining in last month to help the Make-A-Wish Foundation and the city of San Francisco fulfill the superhero fantasies of a 5-year-old — and not just any 5-year-old, but one who has been battling a life-threatening disease — doesn’t warm your heart, you must be numb to basic human emotions.

Yet we can still ask if these basic emotions are the best guide to what we ought to do. According to Make-A-Wish, the average cost of realizing the wish of a child with a life-threatening illness is $7,500. That sum, if donated to the Against Malaria Foundation and used to provide bed nets to families in malaria-prone regions, could save the lives of at least two or three children (and that’s a conservative estimate). If donated to the Fistula Foundation, it could pay for surgeries for approximately 17 young mothers who, without that assistance, will be unable to prevent their bodily wastes from leaking through their vaginas and hence are likely to be outcasts for the rest of their lives. If donated to the Seva Foundation to treat trachoma and other common causes of blindness in developing countries, it could protect 100 children from losing their sight as they grow older.

It’s obvious, isn’t it, that saving a child’s life is better than fulfilling a child’s wish to be Batkid? If Miles’s parents had been offered that choice — Batkid for a day or a cure for their son’s leukemia — they surely would have chosen the cure.”•

From Specter’s profile:

Peter Singer may be the most controversial philosopher alive; he is certainly among the most influential. And this month, as he begins a new job as Princeton University’s first professor of bioethics, his unorthodox views will be debated in America more passionately than ever before. For nearly thirty years, Singer has written with great severity on subjects ranging from what people should put on their dinner plates each night to how they should spend their money or assess the value of human life. He is always relevant, but what he has to say often seems outrageous: Singer believes, for example, that a human’s life is not necessarily more sacred than a dog’s, and that it might be more compassionate to carry out medical experiments on hopelessly disabled, unconscious orphans than on perfectly healthy rats. Yet his books are far more popular than those of any other modern philosopher. Animal Liberation, which was first published in 1975, has sold half a million copies and is widely regarded as the touchstone of the animal-rights movement. In 1979, he brought out Practical Ethics, which has sold more than a hundred and twenty thousand copies, making it the most successful philosophy text ever published by Cambridge University Press.

Singer laid out his brutally frank approach to ethics in his first major paper, ‘Famine, Affluence, and Morality,’ which has become required reading for thousands of university students. ‘As I write this in November 1971, people are dying in East Bengal from lack of food, shelter, and medical care,’ Singer’s essay began. ‘The suffering and death that are occurring there now are not inevitable, not unavoidable.’ The problem, he explained, is a result of the moral blindness of rich human beings who are far too selfish to come to the aid of the poor.

Following in the tradition of the eighteenth-century moral philosopher William Godwin–who asked, famously, ‘What magic is there in the pronoun “my” to overturn the decisions of everlasting truth?’–Singer argues that proximity means nothing when it comes to moral decisions, and that personal relationships don’t mean much, either. Saving your daughter’s life is a fine thing to do, for example, but it can never measure up to saving the lives of ten strangers. If you were faced with the choice, Singer’s ethics would require you to save the strangers. ‘It makes no moral difference whether the person I can help is a neighbor’s child ten yards from me or a Bengali whose name I shall never know, ten thousand miles away,’ he wrote in his essay. Singer believes we are obliged to give money away until our sacrifice is of ‘comparable moral importance’ to the agony of people starving to death. ‘This would mean, of course,’ he continued, approvingly, ‘that one would reduce oneself to very near the material circumstances of a Bengali refugee.’

Singer’s views on animal rights are even bolder: he calls man’s dominion over other animals a ‘speciesist’ outrage that can properly be compared only to the pain and suffering ‘which resulted from the centuries of tyranny by white humans over black humans.’ For Singer, that ‘tyranny’ is one of the central social issues of our age. Yet what has brought him infamy is his radical position on an even more compelling set of moral questions: how to cope with the borders between birth, life, and death in an era when we are becoming technologically capable of controlling them all.”•


Karaoke, the smallpox of singing, the pained sounds people make when success is no longer an option, was not a naturally occurring, viral phenomenon, but an invention, originally called a Juke 8, that was marketed more than 40 years ago. Via Alexis C. Madrigal’s Atlantic article, Daisuke Inoue, the enabler of drunken-salarymen warbling, recalls his devilry:

“Inoue recounted his adventures in 2005 to Topic Magazine, which allowed the Atlantic-favorite history site The Appendix to reprint his first person account of creating a modern sensation.

One day, the president of a small company came to the club where I was playing to ask a favor. He was meeting business clients in another town and knew they would all end up at a drinking establishment and that he would be called on to sing. ‘Daisuke, your keyboard playing is the only music that I can sing to! You know how my voice is and what it needs to sound good.’

So at his request I taped a number of his favorite songs onto an open-reel tape recorder in the keys that would best suit his voice. A few days later he came back full of smiles and asked if I could record some more songs. At that moment the idea for the Juke 8 dawned on me: You would put money into a machine with a microphone, speaker and amplifier, and it would play the music people wanted to sing.

As I had attended a Denko (or Electric Industry) High School, you’d think I could have built the machine myself. But I was always scared of electricity and so graduated without much of an ability to put things together. A member of my band introduced me to a friend of his who had an electronics shop. I took my idea to him, and he understood exactly what I’d envisioned. With my instruction, he built eleven Juke 8s. Each machine consisted of an amplifier, a microphone, a coin box and an eight-track car stereo. Putting the machines together took about two months and cost around $425 per unit.

That was in 1969, but the machines did not actually hit the market until 1971. At first, people weren’t all that interested, but once they figured out how they worked, they started to take off with an Atari-like speed.”


Google isn’t striving for driverless cars just so its employees can get to work more easily; it wants to sell the software to every automaker. The same likely goes for the robots it plans to build. From Illah Nourbakhsh’s New Yorker blog post about the company’s recent robotics-buying binge:

“While some analysts initially suggested that Google’s goal was to more thoroughly automate factories—highly controlled environments that are well-suited for a fleet of semi-independent robots—it’s now clear that the company’s team of engineers and scientists has a vision of truly dexterous, autonomous robots that can walk on sidewalks, carry packages, and push strollers. (A disclosure: Carnegie Mellon’s Community Robotics, Education, and Technology Empowerment Lab, which I direct, has partnered with Google on a number of mapping projects that utilize our GigaPan technology.) Before its acquisition of Boston Dynamics, Google ingested seven start-ups in just six months—companies that have created some of the best-engineered arms, hands, motion systems, and vision processors in the robotics industry. The companies Meka and Schaft, two of Google’s recent acquisitions, designed robot torsos that can interact with humans at work and at home. Another, Redwood Robotics, created a lightweight, strong arm that uses a nimble-control system. Boston Dynamics, Google’s eighth acquisition, is one of the most accomplished robotics companies on the planet, having pioneered machines that can run, jump, and lift, often better than humans. Its latest projects have focussed on developing full-body robot androids that can wear hazmat clothing and manipulate any tool designed for humans, like front loaders or jackhammers. The potential impact of Google’s robot arsenal, already hinted at by its self-driving car effort, is stunning: Google could deploy human-scale robots throughout society. And, while Amazon is busy optimizing delivery logistics, Google bots could roboticize every Amazon competitor, from Target to Safeway.

If robots pervade society, how will our daily experiences change?”


One more thing from the recent email exchange between Bill Simmons and Malcolm Gladwell. (I posted here about my agreement with Gladwell’s remarks about PEDs.) The host and guest discuss celebrity early in the conversation and make some good points. There is one thing, however, I disagree with. In discussing an anecdote from Johnny Carson, the new book by Henry “Bombastic” Bushkin, the late talk show host’s longtime lawyer, Gladwell asserts that celebrity behavior must have been far worse 50 years ago because there wasn’t so much press attention. This is conventional, but I think incorrect, wisdom. 

Celebrity behavior was horrible decades ago, and it was covered up. That’s true. Every now and then something would explode into public view, like the cases with Errol Flynn and Fatty Arbuckle. But I believe the same thing happens now, even with the tabloid culture. Paparazzi aren’t muckrakers duty-bound to serve the public good but entrepreneurs who sell salacious details, even uncovered information about criminality, to the highest bidder. Plenty gets covered up. It’s a marketplace in which silence is bought with money or favors. Let’s assume that tween performers don’t grow up to be so dysfunctional without cause and that action stars don’t always behave well while filming abroad. Every now and then something will explode into the public view, like the cases of O.J. Simpson and Michael Jackson. But despite the surfeit of information everywhere, most misbehavior is still kept quiet.

From Simmons and Gladwell:

Bill Simmons:

It’s weird to think of Johnny Carson involved in a conspiracy, though.

Malcolm Gladwell:

And it’s just not plausible today, is it? There are 4 million Americans with top secret security clearances. How can you make a legitimate cultural argument for the presence of some shadowy secret government when 4 million people are in on the shadowy secret government? But in 1970, the Mafia throws the biggest star on television down the stairs and then puts a contract on him, causing him to lock himself in his apartment for three days and for millions of Americans to be forced to watch live coverage of the Italian American unity rally, and none of that became public. This is tabloid malpractice.

Bill Simmons:

Wait, it seems like you were inordinately mesmerized by this Carson book. Was it because you didn’t realize that he was such a flawed human being? Or were you blown away by how different celebrity culture was in the 1960s and 1970s compared to now?

Malcolm Gladwell:

Well, it made me think that the average level of celebrity behavior must have been much worse 50 years ago than today.”


Al Goldstein was a horrible man, so it’s a shame he was right. The Screw publisher, who just passed away at 77, had a dream–a wet one–and lived long enough to see it become reality. Goldstein envisioned a world in which porn was ubiquitous and acceptable, and now it’s available on every screen in our homes and shirt pockets. He was the McLuhan of smut, not envisioning a Global Village so much as a universal circle jerk. He won.

One of my favorite video clips of all time: Smartmouth Stanley Siegel interviews Goldstein and comedian Jerry Lewis in 1976. When not busy composing the world’s finest beaver shots, Goldstein apparently had a newsletter about tech tools. He shows off a $3900 calculator watch and a $2200 portable phone. Lewis, easily the biggest tool on the stage, flaunts his wealth the way only a truly insecure man can.

The opening of a post at the Lefsetz Letter which offers a perfectly reasonable takedown of those who see Beyoncé’s record sales last week as anything but an extreme outlier, just a brief flash when an old paradigm still worked, a singular moment of calm before the sharks again turn the water red:

It’s a stunt. No different from Radiohead’s In Rainbows. Unrepeatable by mere mortals, never mind wannabes and also-rans.

That’s how desperate Apple is. It lets Beyonce circumvent its rules and release a ‘video album,’ so the record industry can have its bundle and the Cupertino company can delude itself into believing that it’s got a solution to Spotify, when the Swedish streaming company is chasing YouTube, not iTunes.

And the media is so impressed by numbers that it trumpets the story, believing its role is to amplify rather than analyze.

Yes, it was a story. The same way a bomb or SpaceX or anything new gets people’s attention. Only in this case, there was something to buy. Whoo-hoo! We got lemmings and fans to lay down their credit cards to spend money for the work of a superstar, as if this is a new paradigm.

And we’ve got Rob Stringer and the rest of the inane music business slapping its back, declaring victory.

What a bunch of hogwash.

The story of 2013 is cacophony.”


Walter Isaacson, who’s writing a book about Silicon Valley creators, knows firsthand that sometimes such people take credit that may not be coming to them. So he’s done a wise thing and put a draft of part of his book online, so that crowdsourcing can do its magic. As he puts it: “I am sketching a draft of my next book on the innovators of the digital age. Here’s a rough draft of a section that sets the scene in Silicon Valley in the 1970s. I would appreciate notes, comments, corrections.” The opening paragraphs of his draft at Medium:

“The idea of a personal computer, one that ordinary individuals could own and operate and keep in their homes, was envisioned in 1945 by Vannevar Bush. After building his Differential Analyzer at MIT and helping to create the military-industrial-academic triangle, he wrote an essay for the July 1945 issue of the Atlantic titled ‘As We May Think.’ In it he conjured up the possibility of a personal machine, which he dubbed a memex, that would not only do mathematical tasks but also store and retrieve a person’s words, pictures and other information. ‘Consider a future device for individual use, which is a sort of mechanized private file and library,’ he wrote. ‘A memex is a device in which an individual stores all his books, records, and communications, and which is mechanized so that it may be consulted with exceeding speed and flexibility. It is an enlarged intimate supplement to his memory.’

Bush imagined that the device would have a ‘direct entry’ mechanism so you could put information and all your records into its memory. He even predicted hypertext links, file sharing, and collaborative knowledge accumulation. ‘Wholly new forms of encyclopedias will appear, ready made with a mesh of associative trails running through them, ready to be dropped into the memex and there amplified,’ he wrote, anticipating Wikipedia by a half century.

As it turned out, computers did not evolve the way that Bush envisioned, at least not initially. Instead of becoming personal tools and memory banks for individuals to use, they became hulking industrial and military colossi that researchers could time share but the average person could not touch. In the early 1970s, companies such as DEC began to make minicomputers, the size of a small refrigerator, but they dismissed the idea that there would be a market for even smaller ones that could be owned and operated by ordinary folks. ‘I can’t see any reason that anyone would want a computer of his own,’ DEC president Ken Olsen declared at a May 1974 meeting where his operations committee was debating whether to create a smaller version of its PDP-8 for personal consumers. As a result, the personal computer revolution, when it erupted in the mid-1970s, was led by scruffy entrepreneurs who started companies in strip malls and garages with names like Altair and Apple.

Once again, innovation was spurred by the right combination of technological advances, new ideas, and social desires. The development of the microprocessor, which made it technologically possible to invent a personal computer, occurred at a time of rich cultural ferment in Silicon Valley in the late 1960s, one that created a cauldron suitable for homebrewed machines. There was the engineering culture that arose during World War II with the growth of defense contractors, such as Westinghouse and Lockheed, followed by electronics companies such as Fairchild and its fairchildren. There was the startup culture, exemplified by Intel and Atari, where creativity was encouraged and stultifying bureaucracies disdained. Stanford and its industrial park had lured west a great silicon rush of pioneers, many of them hackers and hobbyists who, with their hands-on imperative, had a craving for computers that they could touch and play with. In addition there was a subculture populated by wireheads, phreakers, and cyberpunks, who got their kicks hacking into the Bell System’s phone lines or the timeshared computers of big corporations.

Added to this mix were two countercultural strands: the hippies, born out of the Bay Area’s beat generation, and the antiwar activists, born out of the Free Speech Movement at Berkeley. The antiauthoritarian and power-to-the-people mindset of the late 1960s youth culture, along with its celebration of rebellion and free expression, helped lay the ground for the next wave of computing. As John Markoff wrote in What the Dormouse Said, ‘Personal computers that were designed for and belonged to single individuals would emerge initially in concert with a counterculture that rejected authority and believed the human spirit would triumph over corporate technology.'”


Ronnie Biggs of Great Train Robbery infamy–which morphed in time into pure fame–was good at robbing trains, escaping from prison and eluding authorities, but he was a genius at cultivating celebrity before such things were common knowledge. From Margalit Fox’s New York Times obituary of Biggs:

“Mr. Biggs’s enduring reputation stemmed not so much from the heist itself as from what happened afterward. Tried and convicted, he escaped from prison and became the subject of an international manhunt; spent the next 36 years as a fugitive, much of that time living openly in Rio de Janeiro in defiance of the British authorities; and enjoyed almost preternatural luck in thwarting repeated attempts to bring him to justice, including being kidnapped and spirited out of Brazil by yacht.

The fact that the robbery happened to take place on Mr. Biggs’s birthday also did not hurt.

During his years at large, Mr. Biggs, aided by the British tabloid press, cultivated his image as a working-class Cockney hero. He sold memorabilia to tourists, endorsed products on television and recorded a song (‘No One Is Innocent’) with the Sex Pistols, the British punk band.

As much as anything, Mr. Biggs’s story is about the construction of celebrity, and the ways in which celebrity can be sustained as a kind of cottage industry long after the world might reasonably be expected to have lost interest.”

________________

“No One Is Innocent”:


Apollo astronauts knew they’d always have a job in government or aviation or academia or corporate America if they made it back to Earth alive from their missions, but the actual job didn’t pay very well, even by the standards of the 1960s. From Norman Mailer’s Of a Fire on the Moon: “Of course, most of the astronauts worked for only thirteen thousand dollars a year in base pay. Not much for an honored profession. There are, of course, increments and insurance policies and collective benefits from the Life Magazine contract, but few earn more than twenty thousand dollars a year.”


I frequently post videos from Boston Dynamics, the best and scariest robotics company on the planet. I’ve been surprised that Google or Amazon, with such deep pockets, didn’t acquire it, instantly becoming a leader in a sector that could help it with order processing and things far beyond that. But recently Google took the plunge and is now the company’s owner. What does it want from its newest division? From Samuel Gibbs at the Guardian:

“Boston Dynamics is not the only robotics company Google has bought in recent years. Put under the leadership of Andy Rubin, previously Google’s head of Android, the search company has quietly acquired seven different technology companies to foster a self-described ‘moonshot’ robotics vision.

The acquired companies included Schaft, a small Japanese humanoid robotics company; Meka and Redwood Robotics, San Francisco-based creators of humanoid robots and robot arms; Bot & Dolly who created the robotic camera systems recently used in the movie Gravity; Autofuss an advertising and design company; Holomni, high-tech wheel designer, and Industrial Perception, a startup developing computer vision systems for manufacturing and delivery processes.

Sources told the New York Times that Google’s initial plans are not consumer-focused, instead aimed at manufacturing and industry automation, although products are expected within the next three to five years.”

___________________________________

From Boston Dynamics.

Petman:

Petman’s best friend:


The opening of a Quartz article by Christopher Mims detailing what needs to be established before the Internet of Things can take off:

“As Quartz has already reported, the Internet of Things is already here, and in the not too distant future it will replace the web. Many enabling technologies have arrived which will make the internet of things ubiquitous, and thanks to smartphones, the public is finally ready to accept that it will become impossible to escape from the internet’s all-seeing eye.

But a critical piece of the internet of things puzzle remains to be solved. What engineers lack is a universal glue to bind all of the ‘things’ in the internet of things to each other and to the cloud.

To understand how important these standards will be, it helps to know a bit about the history of the web. When the internet was born, it was a mishmash of now mostly-forgotten protocols designed to accomplish different tasks—gopher for retrieving documents, FTP for sending and receiving files, and no standard for social networking other than email. Then the web came along and unified those protocols, and made them accessible to non-geeks. All of this magic was possible because the internet is built on open standards: transparent, agreed-upon ways that devices should communicate with one another and share data.”
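Mims’ point about open standards is easier to feel with a toy example. Below is a minimal sketch (my own, not anything from the article) of a “thing” publishing a sensor reading over MQTT, one widely used open publish/subscribe protocol for connected devices. The broker address, topic name and reading are hypothetical placeholders, and the code assumes the paho-mqtt 1.x Python client; the point is only that any subscriber, from any vendor, can pick the message up because everyone has agreed on the same standard.

```python
# A hypothetical "thing" publishing one temperature reading over MQTT,
# an open publish/subscribe protocol often used as Internet-of-Things glue.
# Assumes the paho-mqtt 1.x client library (pip install "paho-mqtt<2").
import json
import time

import paho.mqtt.client as mqtt

BROKER_HOST = "broker.example.com"       # hypothetical broker address
TOPIC = "home/livingroom/temperature"    # hypothetical topic hierarchy

client = mqtt.Client(client_id="sensor-42")
client.connect(BROKER_HOST, port=1883)   # 1883 is the conventional MQTT port
client.loop_start()                      # network I/O runs in a background thread

# Any subscriber to this topic receives the message, regardless of who built
# the device -- which is the whole point of agreeing on an open standard.
reading = {"celsius": 21.5, "timestamp": time.time()}
client.publish(TOPIC, json.dumps(reading), qos=1)

client.loop_stop()
client.disconnect()
```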


Google certainly aspires to be the Bell Labs of our age, but is it doing that level of work? Two contrasting opinions: David Litwak (who is pro) and Zak Kukoff (who is con).

From Litwak:

“Bell Labs was the research division of AT&T and Western Electric Research Laboratories, originally formed in 1925 to work on telephone exchange switches. However, over the next 50 years or so, their research won 7 Nobel Prizes, for things very loosely connected to telephone switches, if at all. Among their inventions are the transistor, the laser, UNIX, radio astronomy and the C and C++ programming languages.

Under various ownership structures and names, Bell Labs spit out truly groundbreaking inventions for 50+ years. They still enjoy a measure of success, but by most opinions their best days are behind them, and many of their ~20 locations have been shuttered.

Google is the only tech company who has devoted significant resources to not just figuring out what the next big thing is, but figuring out what the big thing will be 15 years from now, much like Bell Labs used to.”

From Kukoff:

“I won’t argue with much of the article, because I think David makes some compelling points. Google is doing some compelling and interesting work, especially at Google X. But one big point missed by David (and many who agree with him) is that Bell Labs operated in no small part for the public good, producing IP like UNIX and C that entered the public domain. In fact, despite being a part of a state sanctioned monopoly, Bell Labs produced a staggering amount of freely-available knowledge that moved entire industries forward.”


The opening of “A Model World,” Jon Turney’s Aeon article about computer models, which he reminds are not all created equal:

“Here’s a simple recipe for doing science. Find a plausible theory for how some bits of the world behave, make predictions, test them experimentally. If the results fit the predictions, then the theory might describe what’s really going on. If not, you need to think again. Scientific work is vastly diverse and full of fascinating complexities. Still, the recipe captures crucial features of how most of it has been done for the past few hundred years.

Now, however, there is a new ingredient. Computer simulation, only a few decades old, is transforming scientific projects as mind-bending as plotting the evolution of the cosmos, and as mundane as predicting traffic snarl-ups. What should we make of this scientific nouvelle cuisine? While it is related to experiment, all the action is in silico — not in the world, or even the lab. It might involve theory, transformed into equations, then computer code. Or it might just incorporate some rough approximations, which are good enough to get by with. Made digestible, the results affect us all.

As computer modelling has become essential to more and more areas of science, it has also become at least a partial guide to headline-grabbing policy issues, from flood control and the conserving of fish stocks, to climate change and — heaven help us — the economy. But do politicians and officials understand the limits of what these models can do? Are they all as good, or as bad, as each other? If not, how can we tell which is which?”
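Turney’s “theory, transformed into equations, then computer code” pipeline can be shown in miniature. Here’s a toy sketch, in Python, of logistic population growth integrated with a crude fixed-step method; every number in it is an illustrative assumption of mine, and the crude integrator stands in for the “rough approximations” whose limits he’s asking us to keep in mind.

```python
# Toy simulation: logistic growth, dP/dt = r * P * (1 - P / K),
# integrated with a simple fixed-step (Euler) scheme.
# All parameter values are illustrative assumptions, not data.

def simulate_logistic(p0: float, r: float, k: float, dt: float, steps: int) -> list[float]:
    """Return the trajectory [P(0), P(dt), ..., P(steps * dt)]."""
    trajectory = [p0]
    p = p0
    for _ in range(steps):
        p += dt * r * p * (1.0 - p / k)   # Euler step: a rough approximation
        trajectory.append(p)
    return trajectory

if __name__ == "__main__":
    coarse = simulate_logistic(p0=10.0, r=0.8, k=1000.0, dt=1.0, steps=20)
    fine = simulate_logistic(p0=10.0, r=0.8, k=1000.0, dt=0.1, steps=200)
    # Same equation, same parameters, different step size: at t = 5 the two
    # runs disagree substantially, long before asking whether the equation
    # itself describes the world well.
    print(f"P(t=5): coarse={coarse[5]:.0f}, fine={fine[50]:.0f}")
```

The gap between those two printed numbers is the kind of thing a politician reading a model’s headline figure never sees.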


I don’t agree with Malcolm Gladwell’s logic in diminishing the importance of satire, but I’m on board with him in this Grantland exchange with Bill Simmons about the hypocrisies in the discussion of performance-enhancing drugs:

Malcolm Gladwell:

As you know, I’ve had mixed feelings for years about doping. It’s not that I’m in favor of it. It’s just that I’ve never found the standard arguments against doping to be particularly compelling. So professional cyclists take EPO because they can rebuild their red blood cell count, in order to step up their training. I’m against ‘cheating’ when it permits people to take shortcuts. But remind me why I would be against something someone takes because they want to train harder?

Bill Simmons:

Or why blood doping is any different from ‘loading your body with tons of Toradol’ or ‘getting an especially strong cortisone shot’? I don’t know.

Malcolm Gladwell:

Exactly! Or take the so-called ‘treatment/enhancement’ distinction. The idea here is that there is a big difference between the drug that ‘treats’ some kind of illness or medical disorder and one, on the other hand, that ‘enhances’ some preexisting trait. There is a huge amount of literature on treatment/enhancement among scholars, and with good reason. Your health insurance company relies on this distinction, for example, when it decides what to cover. Open heart surgery is treatment. A nose job, which you pay for yourself, is enhancement. This principle is also at the heart of most anti-doping policies. Treatment is OK. Enhancement is illegal. That’s why Tommy John surgery is supposed to be OK. It’s treatment: You blow out your ulnar collateral ligament so you get it fixed.

But wait a minute! The tendons we import into a pitcher’s elbow through Tommy John surgery are way stronger than the ligaments that were there originally. There’s no way Tommy John pitches so well into his early forties without his bionic elbow. Isn’t that enhancement?”


From Mark Pack’s well-rounded take on autonomous vehicles, which are being developed at an ever-accelerating pace and must now be a consideration during the planning of all long-range transportation projects:

“Think just how quickly driverless cars have developed in the last five years alone – and then think how long it takes to get planning permission, let alone build or fit out, a big public transport project. Public transport plans now should already be factoring in the high likelihood of a near future in which cars no longer need humans to drive them.

Some of the benefits likely to accrue from this are brilliant – but do not require policy changes. A further improvement in road safety is likely for, as we have seen in other areas where automated machinery replaces humans in repetitive tasks, computers are more reliable, less sleepy and never drunk. Brilliant news for humanity (road deaths killed more people than genocides during the twentieth century after all), a useful saving for the NHS but not something with many knock-on policy impacts.

Other changes are likely to be more troubling.”


No one outside of NYC literary circles may care about this, but over the last couple of weeks there’s been a debate in that world about the value of satire and its pesky little sibling, snark. It started with Tom Scocca’s Gawker essay “On Smarm,” which argues that those opposed to impolite humor are really just trying to protect an unfair status quo that profits them. A few days later, Malcolm Gladwell’s New Yorker blog post “Being Nice Isn’t Really So Awful” retorted that satire actually aids the powerful even if it’s aimed at them. Two quick thoughts, starting in reverse order with Gladwell’s piece.

1) There’s a gigantic pothole in Gladwell’s reasoning that satire is ineffectual and that more serious criticism is preferable. He quotes a famous Peter Cook line (via a Jonathan Coe essay) about “those wonderful Berlin cabarets which did so much to stop the rise of Hitler and prevent the outbreak of the Second World War.” Um, no, stage satire didn’t stop the Nazis, but you know what else didn’t prevent that horror? Serious criticism, op-eds and solemn political speeches. German resistance groups were likewise unable to stop Hitler’s ascent. Does that mean that serious criticism and protest are meaningless? Of course not. They just sadly didn’t work in this case. But they are good and useful things that have helped open eyes, hearts and minds in many other moments and so has humor.

Satire isn’t the main action but a call to action. It’s the weather report that tells us it’s pouring outside before most of us have yet taken notice of a cloud in the sky. (Though, no, it won’t unfold your umbrella for you.) It’s the first salvo, not the coup de grâce. It’s written about the present with an eye toward the future. And it doesn’t need to deflate dissent unless it’s written that way, and the best of it is not. There’s no measurement to quantify how much satire has helped accomplish, but it seems a trusty tool in the long march toward progress.

Ultimately, I think Gladwell is trying to knock down what he feels is a false narrative with a false one of his own.

2) That being said, I take an argument that there’s a dangerous effort to upend satire with the same seriousness as I take the so-called “War on Christmas.” Yes, there are some hypersensitive souls who confuse a punchline for an actual punch, but there has never been more satire or snark in the country than now, nor more channels, stages and outlets to practice this “dark” art. It’s under no threat and an argument that worries about it excessively seems hysterical. There is certainly no consensus against biting criticism. It, not smarm, is actually the hallmark of our times. I think that’s a reassuring thing.•

The opening of Scocca’s piece:

Last month, Isaac Fitzgerald, the newly hired editor of BuzzFeed’s newly created books section, made a remarkable but not entirely surprising announcement: He was not interested in publishing negative book reviews. In place of ‘the scathing takedown rip,’ Fitzgerald said, he desired to promote a positive community experience.

A community, even one dedicated to positivity, needs an enemy to define itself against. BuzzFeed’s motto, the attitude that drives its success, is an explicit ‘No haters.’ The site is one of the leading voices of the moment, thriving in the online sharing economy, in which agreeability is popularity, and popularity is value. (Upworthy, the next iteration, has gone ahead and made its name out of the premise.)

There is more at work here than mere good feelings. ‘No haters’ is a sentiment older and more wide-reaching than BuzzFeed. There is a consensus, or something that has assumed the tone of a consensus, that we are living, to our disadvantage, in an age of snark—that the problem of our times is a thing called ‘snark.'”

From Gladwell:

Earlier this year, in the London Review of Books, the English novelist Jonathan Coe published an essay titled ‘Sinking Giggling into the Sea.’ It is a review of a book about the mayor of London, Boris Johnson. And in the course of evaluating Johnson’s career, Coe observes that the tradition of satire in English cultural life has turned out to be profoundly conservative. What began in an anti-establishment spirit, he writes, ends up dissipating into a ‘culture of facetious cynicism.’ Coe quotes the comedian Peter Cook—’Britain is in danger of sinking giggling into the sea’—and continues:

The key word here is ‘giggling’ (or in some versions of the quotation, ‘sniggering’). Of the four Beyond the Fringe members, it’s always Peter Cook who is described as the comic genius, and like any genius he fully (if not always consciously) understood the limitations of his own medium. He understood laughter, in other words – and certainly understood that it is anything but a force for change. Famously, when opening his club, The Establishment, in Soho in 1961, Cook remarked that he was modelling it on ‘those wonderful Berlin cabarets which did so much to stop the rise of Hitler and prevent the outbreak of the Second World War’.

‘Laughter,’ Coe concludes, ‘is not just ineffectual as a form of protest … it actually replaces protest.’

Coe and Scocca are both interested in the same phenomenon: how modern cultural forms turn out to have unanticipated—and paradoxical—consequences. But they reach widely divergent conclusions. Scocca thinks that the conventions of civility and seriousness serve the interests of the privileged. Coe says the opposite. Privilege is supported by those who claim to subvert civility and seriousness. It’s not the respectful voice that props up the status quo; it is the mocking one.”
