Science/Tech


Garry Kasparov’s defeat at the hands–well, not exactly hands–of Deep Blue was supposed to have delivered a message to humans that we needed to dedicate ourselves to other things–but the coup de grace was ignored. In fact, computers have only enhanced our chess acumen, making it clear that thus far a hybrid is better than either carbon or silicon alone. In the wake of Computer Age child Magnus Carlsen becoming the greatest human player on Earth, Christopher Chabris and David Goodman of the Wall Street Journal look at the surprising resilience of chess in these digital times. The opening:

“In the world chess championship match that ended Friday in India, Norway’s Magnus Carlsen, the cool, charismatic 22-year-old challenger and the highest-rated player in chess history, defeated local hero Viswanathan Anand, the 43-year-old champion. Mr. Carlsen’s winning score of three wins and seven draws will cement his place among the game’s all-time greats. But his success also illustrates a paradoxical development: Chess-playing computers, far from revealing the limits of human ability, have actually pushed it to new heights.

The last chess match to get as much publicity as Mr. Carlsen’s triumph was the 1997 contest between then-champion Garry Kasparov and International Business Machines Corp.’s Deep Blue computer in New York City. Some observers saw that battle as a historic test for human intelligence. The outcome could be seen as an ‘early indication of how well our species might maintain its identity, let alone its superiority, in the years and centuries to come,’ wrote Steven Levy in a Newsweek cover story titled ‘The Brain’s Last Stand.’ 

But after Mr. Kasparov lost to Deep Blue in dramatic fashion, a funny thing happened: nothing.”•

_________________________________________

“In Norway, you’ve got two big sports–chess and sadness”:


Edwin Heathcote of the Financial Times isn’t very high on the new book about Apple designer Jony Ive, noting that seamlessness makes for beautiful products but ineffable biographical subjects. An excerpt from his new review:

“As Kubrick’s filmic anticipation of the iPad makes clear, Ive’s devices have been imagined before. Think of Ettore Sottsass, the Italian who made Olivetti the Apple of its time, designing typewriters and early computers with flair. Or Dieter Rams, the German designer whose products for Braun defined the company and are among the most beautiful products of the 20th century (and whose designs profoundly influenced Ive, even down to the rounded corners). Ive is far from unique as a designer who is synonymous with his company. What is new is the ubiquity of the products and the way they have insinuated themselves into every aspect of our lives.

Apple’s products are so beautifully and mysteriously constructed (where are the joints and bolts?) that they somehow mirror the obsessiveness of this secretive corporation. All of which makes them difficult to write about. Arguably what is most interesting is why they have become such a success, the social, political, aesthetic and cultural context which they have slotted into – or remade. And why have other companies not managed to emulate Apple’s design-led model?”


I don’t think there was a conspiracy to kill President John F. Kennedy, and I can’t take anyone seriously who refers to Oliver Stone’s ridiculous JFK movie to argue the contrary. It’s not that a lot of people didn’t want him dead, but I don’t think Lee Harvey Oswald was the trigger man for any group. Oswald probably acted alone. He almost definitely wasn’t in cahoots with Cuba or Russia or any other foreign power. It’s possible he was acting in concert with American mob figures, but it’s doubtful, and there’s no good proof of any such cabal. Jack Ruby likewise probably acted alone in murdering Oswald, envisioning himself as a national hero for his deed.

There is one interesting theory that can’t be completely dismissed: Perhaps the final bullet that struck and killed the President was an accidental discharge from a Secret Service agent. This idea has survived for three reasons: 1) the last bullet impacted differently than the first, causing an explosion of flesh; 2) some doubt Oswald was capable of such pinpoint accuracy at such a distance with such a cheap weapon; and 3) quite a few witnesses on the ground reported smelling gunpowder.

I don’t believe this theory, either. Ammo can react differently in different situations, and a direct hit to the back from one angle will not necessarily create the same result as one to the head from another. Oswald was a highly trained marksman, and I think it’s very possible he could hit a target in a slow-moving vehicle. Bullets striking more than one person and causing someone’s brain to explode might produce a smell similar to gunpowder. There were also likely tires turning sharply in every direction, which can cause a burning smell. And let’s remember that the witness closest to Oswald in the book depository distinctly heard three reports.

During the first 35 minutes of a recent Grantland podcast, Bill Simmons and Chris Connelly interview Bill James, who subscribes to the Secret Service theory. In addition to being one of baseball’s sabermetrics pioneers, James has written about the assassination in his book on true crime. I was disappointed by James’ stance in the wake of the Penn State pedophilia scandal, but he’s very sober-minded in this discussion. The only comment James makes in the podcast that I take issue with is his assertion that Oswald striking Kennedy more than once in a matter of seconds is tantamount to James himself being able to hit a home run off Roger Clemens. It’s a poor analogy: Oswald had a professional level of marksmanship, and James does not have that level of athletic ability, especially in middle age. And James didn’t seem to be employing hyperbole. But it’s an interesting conversation overall. Listen here.•


I watch Frontline the way most Americans watch slasher films and zombie TV dramas: to frighten the fuck out of myself. The recent episode, “Hunting the Nightmare Bacteria,” pointed out a yawning hole in the free market: Big Pharma companies have very few antibiotics in R&D because they’re expensive to develop and they’re supposed to be used as little as possible. It’s much more feasible to produce a diabetes or heart drug–something for long-term care. 

Of course, we actually haven’t been careful about restricting antibiotics: we’ve overprescribed them to humans in the past and now practically pour them into livestock. And the more we use these drugs, the less effective they become, so the ones we have are losing potency while no replacements sit in the pipeline. From Maryn McKenna’s Medium essay, “Imagining the Post-Antibiotics Future”:

“Predictions that we might sacrifice the antibiotic miracle have been around almost as long as the drugs themselves. Penicillin was first discovered in 1928 and battlefield casualties got the first non-experimental doses in 1943, quickly saving soldiers who had been close to death. But just two years later, the drug’s discoverer Sir Alexander Fleming warned that its benefit might not last. Accepting the 1945 Nobel Prize in Medicine, he said:

‘It is not difficult to make microbes resistant to penicillin in the laboratory by exposing them to concentrations not sufficient to kill them… There is the danger that the ignorant man may easily underdose himself and by exposing his microbes to non-lethal quantities of the drug make them resistant.’

As a biologist, Fleming knew that evolution was inevitable: sooner or later, bacteria would develop defenses against the compounds the nascent pharmaceutical industry was aiming at them. But what worried him was the possibility that misuse would speed the process up. Every inappropriate prescription and insufficient dose given in medicine would kill weak bacteria but let the strong survive. (As would the micro-dose ‘growth promoters’ given in agriculture, which were invented a few years after Fleming spoke.) Bacteria can produce another generation in as little as twenty minutes; with tens of thousands of generations a year working out survival strategies, the organisms would soon overwhelm the potent new drugs.”
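
McKenna’s “tens of thousands of generations a year” is easy to sanity-check. A minimal sketch of the arithmetic, assuming the twenty-minute doubling time the excerpt cites:

```python
# Back-of-envelope check of the generation count quoted above,
# assuming the twenty-minute doubling time the excerpt cites.
MINUTES_PER_YEAR = 365 * 24 * 60  # 525,600

fast_generation = 20  # minutes per generation, best case
slow_generation = 60  # a more conservative assumption

print(f"{MINUTES_PER_YEAR // fast_generation:,} generations a year")  # 26,280
print(f"{MINUTES_PER_YEAR // slow_generation:,} even at an hour each")  # 8,760
```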


Bruce Schneier, a security expert (online and offline), just did an Ask Me Anything at Reddit. The following is an exchange about post-9/11 airport security:

“Question:

I am of the opinion that our airport security is poorly designed and for the hassle passengers go through, we get minimal benefit. I feel like we react to specific circumstances to create an illusion of security and that perception is more important to the TSA than creating a constructive plan to deal with threats. I know you are a proponent of the fail well philosophy which accepts failure and tries to compartmentalize and minimize the damage. Based on this theory what should be the security steps that airports should be taking?

Bruce Schneier:

I think airport security should be rolled back to pre-9/11 levels, and all the money saved should be spent on things that work: intelligence, investigation, and emergency response.

Only two things have improved airplane security since 9/11: reinforcing the cockpit doors, and teaching passengers that they have to fight back. Everything else has been security theater.”

 


You already know that I’m flummoxed that children can’t go into bars or buy cigarettes while they’re allowed to eat at fast-food restaurants. All three will equally set them up for unhealthy lives. Consenting adults should do what they want, but I don’t think McDonald’s and Wendy’s should be open to children.

On a completely different topic about fast-food places: There’s a wiseass article in Vice by Alison Stevenson about a test McDonald’s in California that’s supposed to be “futuristic,” allowing customers to order on iPads. The author has fun with the redundancy of the iPad and the employee currently doing the same tasks, but before long, the human element will likely be reduced. It’s another step in the automation of informal restaurants and cafes. An excerpt:

“This McDonald’s is the McDonald’s of the future. I’m not saying that just because it’s really clean and people are happy. I’m saying that because this McDonald’s has iPads! What do these iPads do? They are the tool with which you customize your burger order. With this magic iPad, you’re able to order such exotic menu items as an ‘artisan roll,’ and ‘guacamole.’ Yeah you heard me, a McDonald’s that serves guacamole. Welcome to the 21st century, fuckers. Obviously, little things like ‘clean dining areas,’ ‘friendly service,’ and ‘freedom of choice’ are not features that can be rolled out to every McDonald’s all at once. No, those things have to be ‘tested,’ and Laguna Niguel is the only place where you can enjoy the aforementioned amenities.”


I mentioned Cliodynamics in a post yesterday, and the field’s founder, Peter Turchin, has a dark forecast about America’s future at Bloomberg. He sees economic inequality and other factors possibly rending us apart, even violently. As Jim McKay said when the horror of the 1972 Munich Olympics was complete, “Our greatest hopes and our worst fears are seldom realized.” But seldom doesn’t mean never. The opening:

“Complex human societies, including our own, are fragile. They are held together by an invisible web of mutual trust and social cooperation. This web can fray easily, resulting in a wave of political instability, internal conflict and, sometimes, outright social collapse.

Analysis of past societies shows that these destabilizing historical trends develop slowly, last many decades, and are slow to subside. The Roman Empire, Imperial China and medieval and early-modern England and France suffered such cycles, to cite a few examples. In the U.S., the last long period of instability began in the 1850s and lasted through the Gilded Age and the ‘violent 1910s.’

We now see the same forces in the contemporary U.S. Of about 30 detailed indicators I developed for tracing these historical cycles (reflecting popular well-being, inequality, social cooperation and its inverse, polarization and conflict), almost all have been moving in the wrong direction in the last three decades.

The roots of the current American predicament go back to the 1970s, when wages of workers stopped keeping pace with their productivity. The two curves diverged: Productivity continued to rise, as wages stagnated. The ‘great divergence’ between the fortunes of the top 1 percent and the other 99 percent is much discussed, yet its implications for long-term political disorder are underappreciated. Battles such as the recent government shutdown are only one manifestation of what is likely to be a decade-long period.”


In a new Economist essay, Adrian Wooldridge predicts that the tech sector will experience a backlash similar to what has been experienced by Wall Street and Big Oil, but I’m not convinced. A certain degree of blowback has already occurred, and it’s a deserved and healthy thing. Excesses of all kinds should be challenged, in Silicon Valley or anywhere else. Pre-philanthropy Bill Gates was a cutthroat jerk and Sean Parker is a narcissistic dunderhead. But the difference between bankers and techies is that the latter usually actually create something of value. And unlike fossil-fuel corporations, which damage the environment, the bigger tech companies (and many small start-ups) are aimed at making us greener. Those two factors will probably neutralize criticisms somewhat. From Wooldridge:

“Geeks have turned out to be some of the most ruthless capitalists around. A few years ago the new economy was a wide-open frontier. Today it is dominated by a handful of tightly held oligopolies. Google and Apple provide over 90% of the operating systems for smartphones. Facebook counts more than half of North Americans and Europeans as its customers. The lords of cyberspace have done everything possible to reduce their earthly costs. They employ remarkably few people: with a market cap of $290 billion Google is about six times bigger than GM but employs only around a fifth as many workers. At the same time the tech tycoons have displayed a banker-like enthusiasm for hoovering up public subsidies and then avoiding taxes. The American government laid the foundations of the tech revolution by investing heavily in the creation of everything from the internet to digital personal assistants. But tech giants have structured their businesses so that they give as little back as possible.”
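
Wooldridge’s Google/GM comparison implies a striking gap in market value per worker. A quick sketch using only the figures in the excerpt (GM’s market cap here is back-derived from the “six times bigger” claim, not independently sourced):

```python
# All inputs come from the excerpt; GM's market cap is implied by
# the "about six times bigger" claim, not independently sourced.
google_cap = 290e9       # dollars, from the excerpt
cap_ratio = 6            # Google ~6x GM by market cap
workforce_ratio = 1 / 5  # Google employs ~1/5 as many workers

implied_gm_cap = google_cap / cap_ratio
per_employee_gap = cap_ratio / workforce_ratio  # 6x the value on 1/5 the staff

print(f"Implied GM market cap: ${implied_gm_cap / 1e9:.0f}B")  # ~$48B
print(f"Market cap per employee, Google vs. GM: {per_employee_gap:.0f}x")  # ~30x
```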


The above quote, obviously not a fact but an educated guess, was made by Princeton economist Angus Deaton during this week’s excellent EconTalk podcast. Host Russ Roberts and his guest talk about the topics covered in Deaton’s recent book, The Great Escape: Health, Wealth, and the Origins of Inequality: longevity, income disparity and the argument over whether investment in developing nations has made a real difference.

There are great little facts about the hidden reasons why we live longer. Example: In the early part of the 20th century, hotels didn’t change sheets between guests, which helped bacteria thrive. There’s also discussion of how lifespans continue to lengthen with a Moore’s Law steadiness despite predictions to the contrary.

What’s left unsaid is that damage to the environment or some calamity of disease or meteorite could halt progress in the quantity and quality of life. What are the odds of that? Are we prepared to prevent such doom?


Vint Cerf, co-creator of the Internet, is saying what seems inarguable at this point–that technology has outpaced our capacity to control it, that privacy as we knew it isn’t coming back regardless of legislation, that we’re at the outset, for better or worse, of the new normal. From a BGR post by Brad Reed:

“While having a right to privacy sounds nice, the Internet’s co-creator thinks that it’s also unrealistic to expect your behavior to stay private if you engage in social networking and post through social media. Adweek’s Katy Bachman reports that during a panel at a Federal Trade Commission workshop on privacy in the age of wearable computers, tech industry legend Vint Cerf said that new technology means that ‘it will be increasingly difficult for us to achieve privacy’ and that ‘privacy may be an anomaly.'”


Howard Stern is a funny guy with a rare facility for psychology, but he lives within a bubble of wealth and self-absorption so he’s not always particularly attuned to current events. He did, however, recently point out astutely, when discussing the disappointing sales of Lady Gaga’s new record, that the music business clings to the past the same way the film industry did during the 1980s when it unsuccessfully sued VCR makers. You can’t ignore or legislate progress away. Some foresaw the end of record stores more than 30 years ago, so it’s not like the future sneaked up on record execs.

A newer company like Netflix realizes that a portable computing culture demands that you bring entertainment to people wherever they are, while more traditional companies like Blockbuster, weighed down by bricks and mortar, go out of business. When you’re an institution, it’s difficult to avoid oncoming doom even if you can see it racing towards you. 

From Stern:

“Nowadays isn’t every album a flop? The record business…you know I read this guy Bob Lefsetz and he made a point, he was saying that when computers came about, and we all use our computer for typing, the typewriter companies didn’t get out there and go, ‘Fuck the computer business, I’m gonna sue them.’ They either created a computer or a keyboard or something. They didn’t say we’re gonna fight progress and stop the computer from stealing our business. Now the music business unfortunately reacted in a way that was like that. They were like, ‘Hey, we love the way we’re doing business. We like record stores, and the Internet is coming about and we’re going to sue it and fight it’ instead of adapting to the business and finding a new way of selling music. And what it’s going to take is some brilliant executive in the music business who figures out how to market things in today’s environment, who figures out how to use the Internet almost like it’s a radio station. You know, record companies thrived even though AM and FM were playing the music for free. But they figured out how to make that a marketing thing. I’m not in the music business, but there are ways to sell music. Maybe the whole concept of an album is completely outdated. Yes, artists have a hard-on for putting out an album like the Beatles would put out Sgt. Pepper’s and have a whole statement to make. Maybe in terms of selling music that has all passed us by, and that opportunity doesn’t exist. Maybe an artist has to figure out that they’ll put out a song once a month as opposed to putting out an entire album and let people put together their collection. I don’t know what the new paradigm should be. You used to put out an album and go out and promote it for a year and there was a way of doing business. Now that’s out the window. They’ve got to figure out a new way of doing business. The album might be a thing of the past. Life is progress. Things are going to change and you must adapt.”

______________________

“Always searching for records?”


I haven’t owned a TV for a while, so I’m not up on the latest commercials, but this ad for Goldie Blox toys, which encourage girls to develop engineering skills, is amazing. It exhorts girls toward tech the way the 1990s Nike “If You Let Me Play” campaign invited them to get in the game. And it joins Rube Goldberg to the Beastie Boys!

Nathaniel Rich has a post on the New Yorker blog about the field of disaster forecasting, which can be approached from many disciplines. (Even Cliodynamics, which focuses on mapping historical dynamics, can help us divine the future.) Of course, knowing doom is approaching isn’t the same thing as preventing it. From the post:

“The Philippines could have been better prepared, but the best preparation is no match for two-hundred-mile-per-hour winds.

Nevertheless, our knowledge of how disasters occur, and how they will occur in the future, has never been more sophisticated. We are now able to prophesy impending cataclysms with a specificity that would have been inconceivable just several years ago. Several factors have contributed to this progress: a growing public anxiety about disasters; advances in disciplines as disparate as computer science, fluid mechanics, and neuroscience; and an infusion of funding from governments, universities, and especially corporations, which have figured out that disaster planning saves money in the long run. But the field remains in its infancy. Disaster prediction—like disaster science, disaster economics, disaster-response technology, disaster art, disaster cinema, disaster lit—is a growth industry. All indications suggest a growth curve that will continue to steepen well into the next century. Disaster is big business, and its prophets will profit.

Milestones in the past year include the March publication, by a team of U.C.L.A. scientists, of a new computer model that predicts where the next global pandemic will originate.”


In 2010, the last year of Benoit Mandelbrot’s life, Errol Morris pointed his Interrotron at the mathematician who recognized patterns in nature that nobody else did and gave us fractals. Morris himself often deals in fractals, chipping away pieces of his subjects’ minds that perfectly represent the greater self. (Thanks Browser.)


It’s not quite quantum computing, but computing that intuits your needs without your intentional prompts and adjusts accordingly is, or will likely soon become, serious business. From “The Coming Age of Magical Computing,” by Om Malik at Fast Company:

“This idea of anticipatory computing is going to be the next big change in our relationship with computers. And it’s coming more quickly than you realize.

Look around the App Store and there are powerful illustrations emerging. The iPad app MindMeld, made by the startup Expect Labs, listens in on your conference call and starts to display relevant information based on what you’re talking about. When I’m speaking, you might see basic facts about me from my Wikipedia page. When the conversation turns to the latest Audi S4, MindMeld displays car photos and even a map showing the location of the closest dealer. By following along and adding context where it can, MindMeld can make a call more fruitful.

Cover–a brand-new app cofounded by Todd Jackson, who worked on such early experiments in anticipatory computing as Gmail’s Priority Inbox and Facebook’s News Feed–is a simple-looking replacement for your Android smartphone lock screen. Its secret is that it adapts based on your location. If you are in the office (which it learns from your Wi-Fi network’s address and location), it shows work-related apps such as Salesforce. If you are at home, ESPN and Netflix populate the launcher. ‘I am a firm believer that we will no longer have to worry about things we currently spend time trying to make work for us,’ Jackson says.

With a trend this big, Google and Apple are also spending millions racing to this future.”
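
As described, Cover’s core trick is a simple mapping: infer a context from an observable signal (the Wi-Fi network) and surface the apps associated with that context. Here is a minimal sketch of that idea; the SSIDs, app names, and fallback logic are invented for illustration and are not Cover’s actual implementation:

```python
# Toy version of a context-aware launcher like the one Cover is
# described as having: infer a context from the current Wi-Fi
# network, then show apps associated with that context.
# All SSIDs and app names below are invented for illustration.
CONTEXT_BY_SSID = {
    "AcmeCorp-5G": "work",
    "HomeNet": "home",
}

APPS_BY_CONTEXT = {
    "work": ["Salesforce", "Calendar", "Email"],
    "home": ["ESPN", "Netflix", "Music"],
}

def lock_screen_apps(current_ssid: str) -> list[str]:
    """Return the apps to surface, falling back to a default set."""
    context = CONTEXT_BY_SSID.get(current_ssid, "default")
    return APPS_BY_CONTEXT.get(context, ["Phone", "Camera", "Maps"])

print(lock_screen_apps("AcmeCorp-5G"))    # ['Salesforce', 'Calendar', 'Email']
print(lock_screen_apps("CoffeeShopWiFi"))  # fallback: ['Phone', 'Camera', 'Maps']
```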


People are wary of the new, as they should be, but sometimes we can be so circumspect about what’s arriving that we forget the shortcomings of what’s already here, already familiar. Elon Musk, who has more of a vested interest in electric cars than practically anyone, argues against the idea that the new technology, even with several recent Tesla fires, is inordinately dangerous when compared to fossil-fuel counterparts. An excerpt:

“In order to get to that end goal, big leaps in technology are required, which naturally invites a high level of scrutiny. That is fair, as new technology should be held to a higher standard than what has come before. However, there should also be some reasonable limit to how high such a standard should be, and we believe that this has been vastly exceeded in recent media coverage.

Since the Model S went into production last year, there have been more than a quarter million gasoline car fires in the United States alone, resulting in over 400 deaths and approximately 1,200 serious injuries (extrapolating 2012 NFPA data). However, the three Model S fires, which only occurred after very high-speed collisions and caused no serious injuries or deaths, received more national headlines than all 250,000+ gasoline fires combined. The media coverage of Model S fires vs. gasoline car fires is disproportionate by several orders of magnitude, despite the latter actually being far more deadly.”
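
The “orders of magnitude” claim can be checked from the excerpt’s raw counts alone. A quick sketch (raw counts only; a fair rate comparison would also need fleet sizes and miles driven, which the excerpt doesn’t provide):

```python
import math

# Raw counts from the excerpt, covering the period since the
# Model S went into production.
gasoline_fires = 250_000
model_s_fires = 3

ratio = gasoline_fires / model_s_fires
print(f"{ratio:,.0f}x more gasoline-car fires")         # 83,333x
print(f"~{math.log10(ratio):.1f} orders of magnitude")  # ~4.9

# Caveat: raw counts flatter Tesla, since there are far more gasoline
# cars on the road; a per-vehicle or per-mile rate would need fleet data.
```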


Mars One plans on sending astronauts to our neighboring planet in 2023, sans return ticket. Even if the mission doesn’t crash, the astronauts might–psychologically. From “Voyage of No Return,” by Peter Guest at the Ascender:

“After all that technical effort, the flaws in the project could not be in the technology or the financing. The biggest risk to Martian colonization could well be the astronauts themselves. If successful, they would face unprecedented levels of isolation and disconnection from the world, which in turn could lead to depression and severe psychological stress. While it might feel to would-be astronauts like [Timothy] Gatenby that seeing the Earth from above might keep them going for the long journey ahead—in fact the scientific literature, such that there is, suggests that gaining a perspective of Earth is one of the principal positive psychological benefits experienced by astronauts—the downsides are dangerous.

Researchers Michel Nicolas, Gro Mjeldheim Sandal, Karine Weiss and Anna Yusupova, from universities in France, Norway and Russia, studied the Mars500, a 105-day Mars simulation, and noted that over the course of the ‘journey’, participants demonstrated ‘significant’ deterioration in their emotional wellbeing.

A 2010 paper by Nick Kanas, professor of psychiatry of the University of California, noted that astronauts in long orbital missions show some signs of psychological distress, including depression, which could stem from a sense of dislocation and isolation. Candidates are intensely screened for psychiatric conditions and for their emotional resilience, as Mars One candidates would be, but even so, problems are manifest. In his study, Kanas notes, for example, the emergence of psychosomatic reactions—physical symptoms that are thought to have psychological roots.

‘For example, an on-orbit cosmonaut wrote in his diary that he experienced tooth pain following some anxious dreams he had of a tooth infection and his concern that nothing could be done about such an infection should it occur in space,’ Kanas writes.”


At Practical Ethics, Luke J. Davies presents some ideas about science historian James Burke’s recent predictions for the year 2100 (which I posted here). An excerpt:

“The future Burke describes is a far cry from where we are now. It is at once highly technologized—his predictions are informed by a belief in, and enthusiasm for, the promise of nano-technology—and strangely bucolic. He imagines that nano-fabricators (machines that, with very little input, will be able to make anything we want) will lead to a self-sufficiency that will make governments unnecessary. Rather than crowding into big cities, people will spread out more evenly and in small communities. Burke’s future is one in which there is no poverty, or illness, or want. It is a place where people take up gardening because it would be ‘essential for the comfort of their soul. [He imagines the] planet as a giant untouched wilderness dotted with gardens.’

It is interesting that Burke seems to envision a world in which a technological solution has been given to a seemingly human problem—that of greed. His future is one in which we haven’t changed. Technology has changed so that our desire for material goods can be sated in a way that is both sustainable and egalitarian. (We might ask, in Hobbesian fashion, whether this abundance of material objects would just make us emphasize necessarily positional goods even more than we do now). Of course, Burke might be wrong about the promise of nano-technology. But, it would surprise me if he were wrong about the role of technology itself.”


As Google invests heavily in solar and other clean energies, Apple has quietly been the driving force in North Carolina’s solar growth. It makes sense: Why not control your energy and control your costs if you’re a company on that level? States can force better environmental standards and so can tech behemoths. From Katie Fehrenbacher at Gigaom:

“But absent from a lot of the public dialogue has been the one company that arguably has had a greater effect on bringing clean power to the state of North Carolina than any other: Apple. While the state’s utility has just now become more willing to supply clean energy to corporate customers, several years ago Apple took the stance that if clean power wasn’t going to be available from the local utility for its huge data center in Maiden, North Carolina, it would, quite simply, build its own.

“In an unprecedented move — and one that hasn’t yet been repeated by other companies — Apple spent millions of dollars building two massive solar panel farms and a large fuel cell farm near its data center. These projects are now fully operational, and similar facilities (owned by utilities) have cost in the range of $150 million to $200 million to build. Apple’s are the largest privately-owned clean energy facilities in the U.S. and, more importantly, they represent an entirely new way for an internet company to source and think about power.”


You like to believe that India sending rockets into space or South Africa building soccer stadiums for international competitions will bring something meaningful to poor people in those countries: infrastructure, information, medicine, money. That’s questionable, but even if that occurs, it’s painful to crane your neck past the horrors to get a good view of the action. But most of the Western reporting about such events focuses on the safety and comfort of the tourists, not the at-risk locals.

The opening of “A Yellow Card,” a Grantland essay by Brian Phillips, an uncommonly graceful writer, about the grandeur of the World Cup being visited upon the poverty of Brazil:

“Three points make a trend, but in a World Cup year, two points are good enough. So here’s one: Early on the morning of October 29, 31-year-old Geisa Silva, a social worker with the Brazilian military police, found her husband’s backpack on their front porch in Rio de Janeiro. Joao Rodrigo Silva Santos was a retired professional soccer player, a journeyman who’d spent most of his career knocking around the Brazilian lower leagues; post-retirement, he ran a food shop in the city’s Realengo neighborhood. He hadn’t come home the night before, and Silva had been worried, jumping up at the sound of every car. Before dawn, she got ready to leave for her job with a police unit responsible for conducting an anti-gang crackdown. When she opened the front door, she saw the backpack. It contained her husband’s severed head.

And here’s point two: Four months earlier, on the afternoon of June 30, during a pickup soccer game in the northeastern Brazilian municipality of Pio XII, a 19-year-old amateur referee named Otavio Jordao da Silva Cantanhede showed a yellow card to his friend, a player named Josemir Santos Abreu. Abreu protested. A fight broke out. Cantanhede pulled a knife and stabbed Abreu twice. Abreu died on the way to the hospital. In retaliation, a group of Abreu’s friends attacked Cantanhede. Cantanhede was — I’m quoting the New York Times — ‘tied up, smashed in the face with a bottle of cheap sugarcane liquor, pummeled with a wooden stake, run over by a motorcycle and stabbed in the throat.’ Then his legs were sawed off. Then his head was cut off and mounted on a wooden post near the field.

And here’s a quick question, just an aside. How do you feel, hearing these stories? I don’t mean how do you think you’re supposed to feel; I mean how do you feel, in fact? Are you intrigued? Disturbed? Sad? Curious? Titillated, in the way that horrifying real-life stories can sometimes leave you titillated? You don’t have to answer. Just think about it.”


World fairs regularly introduced us to greatness–the telephone, the Ferris wheel, the elevator. But they grew impractical as people became more connected, as the shock of the new came directly to you and me wherever we were. In an Aeon essay, Venkatesh Rao makes a convincing case that Silicon Valley is the new world’s fair, one that never closes. An excerpt about the technological significance of the fairs:

“The history of technology is the story of transitions that worked, like the Industrial Revolution. It entered adolescence and began breaking free of the pre-modern social order at the 1851 Great Exhibition in London. Even as the worldwide mercantilist social order led by Britain began to unravel, the modern industrial social order began to take shape in America.

By the time of the 1967 Montreal Expo, the scenes were safely sequestered again within Cold War institutions, after the world had been violently transformed through great wars, thousands of inventions, and a massive reordering of society along urban lines.

These fairs were equal parts technological debutante balls, theaters of wild futurist speculation, and pure circus entertainment. Cities vied to host them to signal their arrival into industrial modernity. Nations used them as public throwdowns. Corporations used them to spar over emerging markets. Artists, urban planners and architects used them to hawk entire imagined lifestyles.

It was through world fairs that a rapidly developing US announced its arrival on the global stage. From London in 1851, when it stole Britain’s thunder, to Chicago in 1893, when it formally claimed Great Power status, the young nation had taught the world about everything from bloody mechanised killing and newspaper circulation wars to electric lighting and manufacturing with interchangeable parts.

But beneath the pageantry and posturing, these fairs were more than technological Olympics. They spawned both the enduring mainstream folkways of modernity, such as suburban living and business-class air travel, as well as its dead-end subcultures, such as the world of flying-car loyalists. The fairs could do this because, fundamentally, they were large-scale exercises in what futurists call design fiction: indirect explorations of possible futures mediated by speculative, but tangible artifacts.”


While Apollo 11 traveled to the moon and back in 1969, the astronauts were treated each day to a six-minute newscast from Mission Control about the happenings on Earth. Here’s one, transcribed in Norman Mailer’s Of a Fire on the Moon, that made space travel seem quaint by comparison:

“Washington UPI: Vice President Spiro T. Agnew has called for putting a man on Mars by the year 2000, but Democratic leaders replied that priority must go to needs on earth…Immigration officials in Nuevo Laredo announced Wednesday that hippies will be refused tourist cards to enter Mexico unless they take a bath and get haircuts…’The greatest adventure in the history of humanity has started,’ declared the French newspaper Le Figaro, which devoted four pages to reports from Cape Kennedy and diagrams of the mission…Hempstead, New York: Joe Namath officially reported to the New York Jets training camp at Hofstra University Wednesday following a closed-door meeting with his teammates over his differences with Pro Football Commissioner Pete Rozelle…London UPI: The House of Lords was assured Wednesday that a major American submarine would not ‘damage or assault’ the Loch Ness monster.”


The transportation revolution could have meant smart roads or smart cars, and it ended up being the latter. Smart roads were easy to devise but were prohibitively expensive to build on a grand scale. Smart cars (driverless ones, that is) were difficult to devise but if perfected could be retrofitted to any driving surface. From “Auto Learning,” Burkhard Bilger’s New Yorker article about the history of driverless cars and the role of machine learning in the sector’s development, a passage about the original car-road question:

“Almost from the beginning, the field divided into two rival camps: smart roads and smart cars. General Motors pioneered the first approach in the late nineteen-fifties. Its Firebird III concept car—shaped like a jet fighter, with titanium tail fins and a glass-bubble cockpit—was designed to run on a test track embedded with an electrical cable, like the slot on a toy speedway. As the car passed over the cable, a receiver in its front end picked up a radio signal and followed it around the curve. Engineers at Berkeley later went a step further: they spiked the track with magnets, alternating their polarity in binary patterns to send messages to the car—’Slow down, sharp curve ahead.’ Systems like these were fairly simple and reliable, but they had a chicken-and-egg problem. To be useful, they had to be built on a large scale; to be built on a large scale, they had to be useful. ‘We don’t have the money to fix potholes,’ Levandowski says. ‘Why would we invest in putting wires in the road?’

Smart cars were more flexible but also more complex. They needed sensors to guide them, computers to steer them, digital maps to follow. In the nineteen-eighties, a German engineer named Ernst Dickmanns, at the Bundeswehr University in Munich, equipped a Mercedes van with video cameras and processors, then programmed it to follow lane lines. Soon it was steering itself around a track. By 1995, Dickmanns’s car was able to drive on the Autobahn from Munich to Odense, Denmark, going up to a hundred miles at a stretch without assistance. Surely the driverless age was at hand! Not yet. Smart cars were just clever enough to get drivers into trouble. The highways and test tracks they navigated were strictly controlled environments. The instant more variables were added—a pedestrian, say, or a traffic cop—their programming faltered. Ninety-eight per cent of driving is just following the dotted line. It’s the other two per cent that matters.”
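
The Berkeley magnet scheme Bilger describes amounts to a one-bit-per-magnet channel: the car reads each magnet’s polarity and looks the resulting bit string up in a codebook. A toy decoder along those lines; the patterns and messages are invented, since the article doesn’t specify the real encoding:

```python
# Toy decoder for the road-magnet idea described above: the car reads
# a run of magnet polarities (N = 1, S = 0) and looks the bit string
# up in a codebook. The patterns and messages here are invented.
CODEBOOK = {
    "1010": "slow down, sharp curve ahead",
    "1100": "lane merge ahead",
    "0011": "stop line approaching",
}

def decode(polarities: list[str]) -> str:
    """Map a sequence of magnet polarities ('N'/'S') to a message."""
    bits = "".join("1" if p == "N" else "0" for p in polarities)
    return CODEBOOK.get(bits, "unknown pattern")

print(decode(["N", "S", "N", "S"]))  # slow down, sharp curve ahead
```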

_____________________________________

“Cars without steering wheels,” from the 1950s:


Google Glass, at least in its current form, is unlikely to gain traction–too geeky, too creepy–but small and powerful cameras on drones and autonomous machines will only grow more ubiquitous. That seems inevitable. The opening of “Every Step You Take,” from the Economist:

“‘THIS season there is something at the seaside worse than sharks,’ declared a newspaper in 1890. ‘It is the amateur photographer.’ The invention of the handheld camera appalled 19th-century society, as did the ‘Kodak fiends’ who patrolled beaches snapping sunbathers.

More than a century later, amateur photography is once more a troubling issue. Citizens of rich countries have got used to being watched by closed-circuit cameras that guard roads and cities. But as cameras shrink and the cost of storing data plummets, it is individuals who are taking the pictures.

Through a Glass, darkly

Some 10,000 people are already testing a prototype of Google Glass, a miniature computer worn like spectacles. It aims to replicate all the functions of a smartphone in a device perched on a person’s nose. Its flexible frame holds both a camera and a tiny screen, and makes it easy for users to take photos, send messages and search for things online.

Glass may fail, but a wider revolution is under way. In Russia, where insurance fraud is rife, at least 1m cars already have cameras on their dashboards that film the road ahead. Police forces in America are starting to issue officers with video cameras, pinned to their uniforms, which record their interactions with the public. Collar-cams help anxious cat-lovers keep tabs on their wandering pets. Paparazzi have started to use drones to photograph celebrities in their gardens or on yachts. Hobbyists are even devising clever ways to get cameras into space.”

The decentralization of media has given us the Kardashians, holy fuck, but it’s also opened up an infinite number of channels for new voices, many of them comic. In the Guardian, Anna Holmes points out that the Internet has provided women a platform for feminist-tinged wit, which is great because comedy stems from dissatisfaction and the most put-upon people are often the funniest. An excerpt:

“This outpouring, which can be found in print, pop culture and all over social media, has been fuelled by any number of things. Among them are the democratising nature of the internet, the inclusion of new and previously marginalised voices and the fact that many women are not only very tired of being treated like second-class citizens but are very funny about it.

This may come as a surprise to some, because feminism and discussions of gender politics have rarely, if ever, been celebrated for their embrace of the farcical or the witty. In fact, an accusation of humourlessness has remained one of the most pervasive accusations levelled against those involved in agitating against sexism and misogyny.

You might recall Christopher Hitchens’s infamous essay ‘Why women aren’t funny,’ published in Vanity Fair. The late polemicist ended up undermining his own argument for male superiority by explaining that ‘humour, if we are to be serious about it, arises from the ineluctable fact that we are all born into a losing struggle.’ And last year, in a disappointing interview with The Daily Show host Jon Stewart, the normally perceptive comedian Louis CK alleged that comedians and feminists are ‘natural enemies.'”

