Urban Studies


Those who still believe privacy can be preserved by legislation either haven’t thought through the realities or are deceiving themselves. Get ready for your close-up, because it’s not the pictures that are getting smaller but the cameras. Tinier and tinier. Soon you won’t even notice them. And they can fly.

I have no doubt the makers of the Nixie, the wristwatch-drone camera, have nothing but good intentions, but not everyone using it will share them. From Joseph Flaherty at Wired:

“Being able to wear the drone is a cute gimmick, but its powerful software packed into a tiny shell could set Nixie apart from bargain Brookstone quadcopters. Expertise in motion-prediction algorithms and sensor fusion will give the wrist-worn whirlybirds an impressive range of functionality. A ‘Boomerang mode’ allows Nixie to travel a fixed distance from its owner, take a photo, then return. ‘Panorama mode’ takes aerial photos in a 360° arc. ‘Follow me’ mode makes Nixie trail its owner and would capture amateur athletes in a perspective typically reserved for Madden all-stars. ‘Hover mode’ gives any filmmaker easy access to impromptu jib shots. Other drones promise similar functionality, but none promise the same level of portability or user friendliness.

‘We’re not trying to build a quadcopter, we’re trying to build a personal photographer,’ says Jovanovic.

A Changing Perspective on Photography

[Jelena] Jovanovic and her partner Christoph Kohstall, a Stanford postdoc who holds a Ph.D. in quantum physics and a first-author credit in the journal Nature, believe photography is at a tipping point.

Early cameras were bulky, expensive, and difficult to operate. The last hundred years have produced consistently smaller, cheaper, and easier-to-use cameras, but future developments are forking. Google Glass provides the ultimate in portability, but leaves wearers with a fixed perspective. Surveillance drones offer unique vantage points, but are difficult to operate. Nixie attempts to offer the best of both worlds.”•
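Those four modes amount to a tiny flight-mode state machine wrapped around a waypoint planner. For the technically curious, here’s a minimal sketch of how they might be expressed in code; every name in it is hypothetical (Nixie’s actual software isn’t public), and a real drone would feed these waypoints through a flight controller running the motion-prediction and sensor-fusion algorithms the article mentions.

```python
# Hypothetical sketch of Nixie-style flight modes as waypoint generators.
# All names are invented for illustration; this is not Nixie's actual code.
import math
from dataclasses import dataclass

@dataclass
class Position:
    x: float  # meters east of the owner
    y: float  # meters north of the owner
    z: float  # meters above ground

def boomerang_waypoints(owner: Position, distance: float, altitude: float):
    """'Boomerang mode': fly out a fixed distance, take a photo, return."""
    photo_spot = Position(owner.x + distance, owner.y, altitude)
    return [photo_spot, owner]  # second waypoint brings the drone home

def panorama_waypoints(center: Position, radius: float, shots: int = 8):
    """'Panorama mode': evenly spaced camera positions on a 360-degree arc."""
    for i in range(shots):
        theta = 2 * math.pi * i / shots
        yield Position(center.x + radius * math.cos(theta),
                       center.y + radius * math.sin(theta),
                       center.z)

def follow_waypoint(owner: Position, trail: float = 3.0, rise: float = 2.0):
    """'Follow me' mode: trail the owner from behind and slightly above."""
    return Position(owner.x - trail, owner.y, owner.z + rise)
```

The hard part, of course, isn’t generating the waypoints but holding the drone to them as the owner moves, which is where the motion prediction and sensor fusion come in.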


I previously posted some stuff about driverless-car testing in a mock cityscape in Ann Arbor, Michigan, which might seem unnecessary given Google’s regular runs on actual streets and highways. But here’s an update on the progress from “Town Built for Driverless Cars,” by Will Knight at Technology Review:

“A mocked-up set of busy streets in Ann Arbor, Michigan, will provide the sternest test yet for self-driving cars. Complex intersections, confusing lane markings, and busy construction crews will be used to gauge the aptitude of the latest automotive sensors and driving algorithms; mechanical pedestrians will even leap into the road from between parked cars so researchers can see if they trip up onboard safety systems.

The urban setting will be used to create situations that automated driving systems have struggled with, such as subtle driver-pedestrian interactions, unusual road surfaces, tunnels, and tree canopies, which can confuse sensors and obscure GPS signals.

‘If you go out on the public streets you come up against rare events that are very challenging for sensors,’ says Peter Sweatman, director of the University of Michigan’s Mobility Transformation Center, which is overseeing the project. ‘Having identified challenging scenarios, we need to re-create them in a highly repeatable way. We don’t want to be just driving around the public roads.'”
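Note what “highly repeatable” implies in engineering terms: each hazard becomes a scripted, timestamped scenario that can be replayed identically against every new sensor package. A rough sketch of what such a scenario script might look like, with all names invented for illustration:

```python
# Illustrative sketch of a repeatable test scenario, along the lines of the
# darting mechanical pedestrian described above. Names are hypothetical.
from dataclasses import dataclass, field

@dataclass
class ScenarioEvent:
    t: float                 # seconds after scenario start
    actor: str               # e.g. a mechanical pedestrian rig
    action: str
    params: dict = field(default_factory=dict)

@dataclass
class Scenario:
    name: str
    events: list

    def replay(self, dispatch):
        """Fire every event in time order; `dispatch` would drive the hardware."""
        for event in sorted(self.events, key=lambda e: e.t):
            dispatch(event)

darting_pedestrian = Scenario(
    name="pedestrian_between_parked_cars",
    events=[
        ScenarioEvent(t=0.0, actor="test_vehicle", action="approach",
                      params={"speed_mph": 25}),
        ScenarioEvent(t=3.2, actor="mechanical_pedestrian_1",
                      action="enter_roadway", params={"gap_m": 12}),
    ],
)

darting_pedestrian.replay(print)  # stand-in dispatcher: just log the events
```

Run the same script a thousand times and you can measure exactly how often the onboard safety systems trip up, something public roads, with their unrepeatable rare events, can’t offer.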


The Mudd Club was a cabaret institution in New York for a few years in the late ’70s and early ’80s, the edgier little cousin to Studio 54, which wasn’t exactly Disneyland. An excerpt from a 1979 People article, which includes a holy shit! quote from Andy Warhol:

“Ever on the prowl for outrageous novelty, New York’s fly-by-night crowd of punks, posers and the ultra hip has discovered new turf on which to flaunt its manic chic. It is the Mudd Club, a dingy disco lost among the warehouses of lower Manhattan. By day the winos skid by without a second glance. But come midnight (the opening time), the decked-out decadents amass 13 deep. For sheer kinkiness, there has been nothing like it since the cabaret scene in 1920s Berlin.

In just six months the Mudd has made its uptown precursor, Studio 54, seem almost passé and has had to post a sentry on the sidewalk. The difference is that the Mudd doesn’t have a velvet rope but a steel chain. Such recognizable fun-lovers as David Bowie, Mariel Hemingway, Diane von Furstenberg and Dan Aykroyd are automatically waved inside. For the rest, the club picks its own like some sort of perverse trash compactor. The kind of simple solution employed by U.S. gas stations is out of the question: At the Mudd, every night is odd. Proprietor Steve Mass, 35, admits that ‘making a fashion statement’ is the criterion. That means a depraved version of the audience of Let’s Make a Deal. One man gained entrance simply by flashing the stump of his amputated arm.

The action inside varies from irreverent to raunch. Andy Warhol is happy to have found a place, he says, ‘where people will go to bed with anyone—man, woman or child.’ Some patrons couldn’t wait for bedtime, and the management has tried to curtail sex in the bathrooms.”


Rust never sleeps, and Walt Disney, even with all his great success and grand imagination, wasn’t immune to the quiet terrors of life any more than the rest of us. Almost two decades before he built his first safe and secure family theme park in California, the Hollywood house he’d purchased for his parents was invaded by a silent killer. Two articles follow from the Brooklyn Daily Eagle.

_____________________________

From the November 27, 1938 edition:

 _____________________________

From the May 16, 1954 edition:

 


For productivity to increase, labor costs must shrink. That’s fine provided new industries emerge to accommodate workers, but that really isn’t what we’ve seen so far in the Technological Revolution, the next great bend in the curve, as production and wages haven’t boomed. It’s been the trade of a taxi medallion for a large pink mustache. More convenient, if not cheaper (yet), for the consumer, but bad for the drivers.

Perhaps that’s because we’re at the outset of a slow-starting boom, the Second Machine Age, or perhaps what we’re going through refuses to follow the form of the Industrial Revolution. Maybe it’s the New Abnormal. The opening of the Economist feature, “Technology Isn’t Working”:

“IF THERE IS a technological revolution in progress, rich economies could be forgiven for wishing it would go away. Workers in America, Europe and Japan have been through a difficult few decades. In the 1970s the blistering growth after the second world war vanished in both Europe and America. In the early 1990s Japan joined the slump, entering a prolonged period of economic stagnation. Brief spells of faster growth in intervening years quickly petered out. The rich world is still trying to shake off the effects of the 2008 financial crisis. And now the digital economy, far from pushing up wages across the board in response to higher productivity, is keeping them flat for the mass of workers while extravagantly rewarding the most talented ones.

Between 1991 and 2012 the average annual increase in real wages in Britain was 1.5% and in America 1%, according to the Organisation for Economic Co-operation and Development, a club of mostly rich countries. That was less than the rate of economic growth over the period and far less than in earlier decades. Other countries fared even worse. Real wage growth in Germany from 1992 to 2012 was just 0.6%; Italy and Japan saw hardly any increase at all. And, critically, those averages conceal plenty of variation. Real pay for most workers remained flat or even fell, whereas for the highest earners it soared.

It seems difficult to square this unhappy experience with the extraordinary technological progress during that period, but the same thing has happened before. Most economic historians reckon there was very little improvement in living standards in Britain in the century after the first Industrial Revolution. And in the early 20th century, as Victorian inventions such as electric lighting came into their own, productivity growth was every bit as slow as it has been in recent decades.

In July 1987 Robert Solow, an economist who went on to win the Nobel prize for economics just a few months later, wrote a book review for the New York Times. The book in question, The Myth of the Post-Industrial Economy by Stephen Cohen and John Zysman, lamented the shift of the American workforce into the service sector and explored the reasons why American manufacturing seemed to be losing out to competition from abroad. One problem, the authors reckoned, was that America was failing to take full advantage of the magnificent new technologies of the computing age, such as increasingly sophisticated automation and much-improved robots. Mr Solow commented that the authors, ‘like everyone else, are somewhat embarrassed by the fact that what everyone feels to have been a technological revolution…has been accompanied everywhere…by a slowdown in productivity growth.'”
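Those growth rates look less abstract if you compound them over the two decades the article covers. A quick back-of-the-envelope check, using only the OECD figures quoted above:

```python
# Compounding the average annual real-wage growth cited above over the
# period each figure covers (Britain and America 1991-2012, Germany 1992-2012).
for country, rate, years in [("Britain", 0.015, 21),
                             ("America", 0.010, 21),
                             ("Germany", 0.006, 20)]:
    total = (1 + rate) ** years - 1
    print(f"{country}: {total:.0%} cumulative real-wage growth")
# Britain: 37%   America: 23%   Germany: 13%
```

Two decades of extraordinary technological progress, and the typical American worker’s real pay rose by less than a quarter in total.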

Anne Frank famously wrote, “Despite everything, I believe that people are really good at heart,” and if she could be hopeful, how can we feel grounded by despondency? There’s a sneakily cheerful side to the work of the growing cadre of scientists and philosophers focused on existential risks that could drive human extinction: if we make it through the next century, we may have unbridled abundance. From Aaron Labaree at Salon:

“‘When we think about general intelligence,’ says Luke Muehlhauser, Executive Director at MIRI, ‘that’s a meta-technology that gives you everything else that you want — including really radical things that are even weird to talk about, like having our consciousness survive for thousands of years. Physics doesn’t outlaw those things, it’s just that we don’t have enough intelligence and haven’t put enough work into the problem … If we can get artificial intelligence right, I think it would be the best thing that ever happened in the universe, basically.’

A surprising number of conversations with experts in human extinction end like this: with great hope. You’d think that contemplating robot extermination would make you gloomy, but it’s just the opposite. As [Martin] Rees explains, ‘What science does is makes one aware of the marvelous potential of life ahead. And being aware of that, one is more concerned that it should not be foreclosed by screwing up during this century.’ Concern over humanity’s extermination at the hands of nanobots or computers, it turns out, often conceals optimism of the kind you just don’t find in liberal arts majors. It implies a belief in a common human destiny and the transformative power of technology.

‘The stakes are very large,’ [Nick] Bostrom told me. ‘There is this long-term future that could be so enormous. If our descendants colonized the universe, we could have these intergalactic civilizations with planetary-sized minds thinking and feeling things that are beyond our ken, living for billions of years. There’s an enormous amount of value that’s on the line.’

It’s all pretty hypothetical for now.•


I don’t know if it will happen within ten years–though that’s not an outrageous time frame–but 3-D printing will automate much of the restaurant process, making jobs vanish, and will also be common in homes as prices fall. The opening of “Is 3-D Printing the Next Microwave?” by Jane Dornbusch at the Boston Globe:

“Picture the dinner hour in a decade: As you leave work, you pull up an app (assuming we still use apps) on your phone (or your watch!) that will direct a printer in your kitchen to whip up a batch of freshly made ravioli, some homemade chicken nuggets for the kids, and maybe a batch of cookies, each biscuit customized to meet the nutritional needs of different family members. 

It sounds like science fiction, but scientists and engineers are working on 3-D printing, and the food version of the 3-D printer is taking shape. Don’t expect it to spin out fully cooked meals anytime soon. For now, the most popular application in 3-D food printing seems to be in the decidedly low-tech area of cake decoration.

Well, not just cake decoration, but sugary creations of all kinds. The Sugar Lab is the confectionary arm of 3-D printing pioneer 3D Systems, and it expects to have the ChefJet, a 3-D food printer, available commercially later this year. Though tinkerers have been exploring the possibilities of 3-D food printing for a few years, and another food printer, Natural Machines’ Foodini, is slated to appear by year’s end, 3D Systems says the ChefJet is the world’s first 3-D food printer.

Like so many great inventions, the ChefJet came about as something of an accident, this one performed by husband-and-wife architecture grad students Liz and Kyle von Hasseln a couple of years ago. At the Southern California Institute of Architecture, the von Hasselns used a 3-D printer to create models. Intrigued by the process, Liz von Hasseln says, ‘We bought a used printer and played around with different materials to see how to push the technology. One thing we tried was sugar. We thought if we altered the machine a bit and made it all food safe and edible, we could push into a new space.’ More tweaking ensued, and the ChefJet was born.”

___________________________

Walter Cronkite presents the kitchen of 2001 in 1967:


I haven’t yet read Walter Isaacson’s new Silicon Valley history, The Innovators, but I would be indebted if it answers the question of how instrumental Gary Kildall’s software was to Microsoft’s rise. Was Bill Gates and Paul Allen’s immense success built on intellectual thievery? Has the story been mythologized beyond realistic proportion? An excerpt from Brendan Koerner’s New York Times review of the book:

“The digital revolution germinated not only at button-down Silicon Valley firms like Fairchild, but also in the hippie enclaves up the road in San Francisco. The intellectually curious denizens of these communities ‘shared a resistance to power elites and a desire to control their own access to information.’ Their freewheeling culture would give rise to the personal computer, the laptop and the concept of the Internet as a tool for the Everyman rather than scientists. Though Isaacson is clearly fond of these unconventional souls, his description of their world suffers from a certain anthropological detachment. Perhaps because he’s accustomed to writing biographies of men who operated inside the corridors of power — Benjamin Franklin, Henry Kissinger, Jobs — Isaacson seems a bit baffled by committed outsiders like Stewart Brand, an LSD-inspired futurist who predicted the democratization of computing. He also does himself no favors by frequently citing the work of John Markoff and Tom Wolfe, two writers who have produced far more intimate portraits of ’60s counterculture.

Yet this minor shortcoming is quickly forgiven when The Innovators segues into its rollicking last act, in which hardware becomes commoditized and software goes on the ascent. The star here is Bill Gates, whom Isaacson depicts as something close to a punk — a spoiled brat and compulsive gambler who ‘was rebellious just for the hell of it.’ Like Paul Baran before him, Gates encountered an appalling lack of vision in the corporate realm — in his case at IBM, which failed to realize that its flagship personal computer would be cloned into oblivion if the company permitted Microsoft to license the machine’s MS-DOS operating system at will. Gates pounced on this mistake with a feral zeal that belies his current image as a sweater-clad humanitarian.”


Selfies, the derided yet immensely popular modern portraiture, draw ire because of narcissism and exhibitionism, of course, but also because anyone can take them and do so ad nauseam. It’s too easy and available, with no expertise or gatekeeper necessary. The act is megalomania, sure, but it’s also democracy, that scary, wonderful rabble. From, ultimately, a defense of the self-directed shot by Douglas Coupland in the Financial Times:

“Selfies are the second cousin of the air guitar.

Selfies are the proud parents of the dick pic.

Selfies are, in some complex way, responsible for the word ‘frenemy.’

I sometimes wonder what selfies would look like in North Korea.

Selfies are theoretically about control – or, if you’re theoretically minded, they’re about the illusion of self-control. With a selfie some people believe you’re buying into a collective unspoken notion that everybody needs to look fresh and flirty and young for ever. You’re turning yourself into a product. You’re abdicating power of your sexuality. Or maybe you’re overthinking it – maybe you’re just in love with yourself.

I believe that it’s the unanticipated side effects of technology that directly or indirectly define the textures and flavours of our eras. Look at what Google has already done to the 21st century. So when smartphones entered the world in 2002, I think that if you gathered a group of smart media-savvy people in a room with coffee and good sandwiches, before the end of the day, the selfie could easily have been forecast as an inevitable smartphone side effect. There’s actually nothing about selfies that feels like a surprise in any way. The only thing that is surprising is the number of years it took us to isolate and name the phenomenon. I do note, however, that once the selfie phenomenon was named and shamed, selfies exploded even further, possibly occupying all of those optical-fibre lanes of the internet that were once occupied by Nigerian princes and ads for penis enlargement procedures.”

 


From the August 18, 1889 Brooklyn Daily Eagle:

“‘Gloves which are sold as kid are often made of human skin,’ said Dr. Mark L. Nardyz, the Greek physician, of 716 Pine Street, yesterday. ‘The skin on the breast,’ continued the physician, ‘is soft and pliable and may be used for the making of gloves. When people buy gloves they never stop to question about the material of which they are made. The shopkeeper himself may be in ignorance, and the purchaser has no means of ascertaining whether the material is human skin or not. The fact is the tanning of human skin is extensively carried on in France and Switzerland. The product is manufactured into gloves, and these are imported into this country. Thus you see a person may be wearing part of a distant relative’s body and not know it.'”


The end is near, roughly speaking. A number of scientists and philosophers, most notably Martin Rees and Nick Bostrom, agonize over the existential risks to humanity that might obliterate us long before the sun dies. A school of thought has arisen over the End of Days. From Sophie McBain in the New Statesman (via 3 Quarks Daily):

“Predictions of the end of history are as old as history itself, but the 21st century poses new threats. The development of nuclear weapons marked the first time that we had the technology to end all human life. Since then, advances in synthetic biology and nanotechnology have increased the potential for human beings to do catastrophic harm by accident or through deliberate, criminal intent.

In July this year, long-forgotten vials of smallpox – a virus believed to be ‘dead’ – were discovered at a research centre near Washington, DC. Now imagine some similar incident in the future, but involving an artificially generated killer virus or nanoweapons. Some of these dangers are closer than we might care to imagine. When Syrian hackers sent a message from the Associated Press Twitter account that there had been an attack on the White House, the Standard & Poor’s 500 stock market briefly fell by $136bn. What unthinkable chaos would be unleashed if someone found a way to empty people’s bank accounts?

While previous doomsayers have relied on religion or superstition, the researchers at the Future of Humanity Institute want to apply scientific rigour to understanding apocalyptic outcomes. How likely are they? Can the risks be mitigated? And how should we weigh up the needs of future generations against our own?

The FHI was founded nine years ago by Nick Bostrom, a Swedish philosopher, when he was 32. Bostrom is one of the leading figures in this small but expanding field of study.”


Speaking of human laborers being squeezed: Open Source with Christopher Lydon has an episode called “The End of Work,” with two guests, futurist Ray Kurzweil and MIT economist Andrew McAfee. A few notes.

  • McAfee sees the Technological Revolution as doing for gray matter what the Industrial Revolution did for muscle fiber, but on the way to a world of wealth without toil–a Digital Athens–the bad news is the strong chance of greater income inequality and decreased opportunities for many. Kodak employed 150,000; Instagram a small fraction of that. With the new technologies, destruction (of jobs) outpaces creation. Consumers win, but Labor loses.
  • Kurzweil is more hopeful in the shorter term than McAfee. He says we have more jobs and more gratifying ones today than 100 years ago, and they pay better. We accomplish more. Technology will improve us, make us smarter, to meet the demands of a world without drudgery. It won’t be us versus the machines, but the two working together. The majority of jobs always go away; most of the jobs today didn’t exist so long ago. New industries will be invented to provide work. He doesn’t acknowledge a painful period of adjustment in distribution before abundance can reach all.


Sure, we have phones that are way nicer now, but the Technological Revolution has largely been injurious to anyone in the Labor market, and things are going to get worse, at least in the near and mid term. A free-market society that is highly automated isn’t really very free. Drive for Uber until autonomous cars can take over the wheel, you’re told, or rent a spare room on Airbnb–make space for yourself on the margins through the Sharing Economy. But there’s less to share for most people. From an Economist report:

“Before the horseless carriage, drivers presided over horse-drawn vehicles. When cars became cheap enough, the horses and carriages had to go, which eliminated jobs such as breeding and tending horses and making carriages. But cars raised the productivity of the drivers, for whom the shift in technology was what economists call ‘labour-augmenting.’ They were able to serve more customers, faster and over greater distances. The economic gains from the car were broadly shared by workers, consumers and owners of capital. Yet the economy no longer seems to work that way. The big losers have been workers without highly specialised skills.

The squeeze on workers has come from several directions, as the car industry clearly shows. Its territory is increasingly encroached upon by machines, including computers, which are getting cheaper and more versatile all the time. If cars and lorries do not need drivers, then both personal transport and shipping are likely to become more efficient. Motor vehicles can spend more time in use, with less human error, but there will be no human operator to share in the gains.

At the same time labour markets are hollowing out, polarising into high- and low-skill occupations, with very little employment in the middle. The engineers who design and test new vehicles are benefiting from technological change, but they are highly skilled and it takes remarkably few of them to do the job. At Volvo much of the development work is done virtually, from the design of the cars to the layout of the production line. Other workers, like the large numbers of modestly skilled labourers that might once have worked on the factory floor, are being squeezed out of such work and are now having to compete for low-skill and low-wage jobs.

Labour has been on the losing end of technological change for several decades.”

"My cat urinated on it."

“My cat urinated on it.”

Free couch (Linden)

I have a full size couch I’m giving away because my cat urinated on it. I know there are cleaning solutions available to eliminate the odor at the pet stores but I’m moving and don’t want to take it. Free if interested. A truck will be needed to take it out. It’s solid and comfortable.

"My cat urinated on it."

Space-travel enthusiast and labor organizer David Lasser was one of the first Americans to champion a mission to the moon, and one of the most influential. His 1931 book, The Conquest of Space, suggested such a rocket voyage was possible, not fanciful. In Lasser’s 1996 New York Times obituary, Arthur C. Clarke said of the then-65-year-old volume: “[It was] the first book in the English language to explain that space travel wasn’t just fiction…[it was] one of the turning points in my life — and I suspect not only of mine.”

While an article about the book’s publication in the October 6, 1931 Brooklyn Daily Eagle took seriously Lasser’s vision of rocket-powered airline travel–from New York to Paris in one hour!–it gave less credence to his moonshot scenario.

 


From the Overcoming Bias post in which economist Robin Hanson comments on Debora MacKenzie’s New Scientist article “The End of Nations,” a piece which wonders about, among other things, whether states in the modern sense predated the Industrial Revolution:

“An interesting claim: the nation-state didn’t exist before, and was caused by, the industrial revolution. Oh there were empires before, but most people didn’t identify much with empires, or see empires as much influencing their lives. In contrast people identify with nation-states, which they see as greatly influencing their lives. More:

Before the late 18th century there were no real nation states. … If you travelled across Europe, no one asked for your passport at borders; neither passports nor borders as we know them existed. People had ethnic and cultural identities, but these didn’t really define the political entity they lived in. …

Agrarian societies required little actual governing. Nine people in 10 were peasants who had to farm or starve, so were largely self-organising. Government intervened to take its cut, enforce basic criminal law and keep the peace within its undisputed territories. Otherwise its main role was to fight to keep those territories, or acquire more. … Many eastern European immigrants arriving in the US in the 19th century could say what village they came from, but not what country: it didn’t matter to them. … Ancient empires are coloured on modern maps as if they had firm borders, but they didn’t. Moreover, people and territories often came under different jurisdictions for different purposes.

Such loose control, says Bar-Yam, meant pre-modern political units were only capable of scaling up a few simple actions such as growing food, fighting battles, collecting tribute and keeping order. …

The industrial revolution … demanded a different kind of government. … ‘In 1800 almost nobody in France thought of themselves as French. By 1900 they all did.’ … Unlike farming, industry needs steel, coal and other resources which are not uniformly distributed, so many micro-states were no longer viable. Meanwhile, empires became unwieldy as they industrialised and needed more actual governing. So in 19th-century Europe, micro-states fused and empires split.

These new nation states were justified not merely as economically efficient, but as the fulfilment of their inhabitants’ national destiny. A succession of historians has nonetheless concluded that it was the states that defined their respective nations, and not the other way around.”


It’s a heartbreaker watching what’s happening to the New York Times these days, and the latest layoffs are just the most recent horrible headline. The Magazine is currently a green shoot, with its bright new editor, Jake Silverstein, and a boost to staffing, but that section is the outlier. The business can’t continue to suffer without being joined by the journalism. You just hope the company is ultimately sold to someone great.

At the Washington Post, not much has changed dramatically since Jeff Bezos bought the Graham family jewel, despite some executive shuffles and new hires. Does Bezos have a long-term plan? Does he have any plan? Does it really matter in the intervening period, since he can afford to wait for everyone else to fall and position his publication as the inheritor? From Dylan Byers at Politico:

“Despite expectations, Bezos himself had never promised a reinvention. ‘There is no map, and charting a path ahead will not be easy,’ he wrote in his first memo to Post staff in August of last year. Still, his reputation preceded him: With Amazon, he had revolutionized not just the book-selling business but the very means and standards of online shopping. He was planning ambitious new initiatives like drone delivery. Surely, this man had the silver bullet to save the Washington Post, and perhaps the newspaper industry.

Bezos, who declined to be interviewed for this story, is holding his cards close to his chest. He has no influence on the editorial side, according to [Executive Editor Martin] Baron, but is focused on ‘broader strategic efforts.’

If Bezos has any innovative digital initiatives in the works, they’re being formed not in Washington but in New York. In March, the Post launched a Manhattan-based design and development office called WPNYC, which is focused on improving the digital experience and developing new advertising products.

‘Jeff’s preoccupation isn’t editorial, it’s delivery,’ one Post staffer said of WPNYC. ‘He wants to change the way people receive, read and experience the news. The only problem is we still don’t know what that looks like.'”


Aiming to make invasive surgery even more minimal, researchers are devising robots that can slide into small openings and perform currently messy operations. From Sarah Laskow at the Atlantic:

“In the past few years, surgeons have been pushing to make these less invasive surgeries almost entirely invisible. Instead of cutting a tiny window in the outside of the body, they thought, why not cut one inside? Surgeons would first enter a person’s body through a ‘natural orifice’ and make one small incision, through which to access internal organs. The end result of this idea was that, in 2009, a surgeon removed a woman’s kidney through her vagina.

Few surgeons were convinced this was actually an improvement though. Instead, they have focused on minimizing the number of tiny incisions needed to perform surgery. Single-site surgery requires just one ‘port’ into a body.

A team of surgeons at Columbia, for instance, is working on a small robotic arm—minuscule, when compared to the da Vinci system—that can sneak into one 15 millimeter incision. And NASA is working on a robot that can enter the abdominal cavity through a person’s belly button, Matrix-like, to perform simple surgeries. It’s meant to be used in emergencies, but we know how this story goes: Soon enough, it’ll be routine for a robot to slide into a person’s body and pull her appendix back out.”


Elon Musk has said that Tesla will produce fully autonomous vehicles within six years, which doesn’t make complete sense to me because I think infrastructure would have to be modified before that’s possible, but he is now promising that 2015 models will reach the 90% autopilot threshold. From Chris Ziegler at The Verge:

“In an excerpt from a CNNMoney interview, Tesla boss Elon Musk says that the self-driving car — or ‘autopilot,’ the term he prefers — is basically just months away from retail. Here’s the language:

‘Autonomous cars will definitely be a reality. A Tesla car next year will probably be 90 percent capable of autopilot. Like, so 90 percent of your miles can be on auto. For sure highway travel.

How’s that going to happen?

With a combination of various sensors. You combine cameras with image recognition with radar and long-range ultrasonics, that’ll do it. Other car companies will follow.

But you guys are going to be the leader?

Of course. I mean, Tesla’s a Silicon Valley company. If we’re not the leader, shame on us.'”
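Musk’s sensor list (cameras plus image recognition, radar, long-range ultrasonics) describes a classic fusion problem: several noisy estimates of the same quantity combined into one better one. Production autopilot stacks use far more sophisticated machinery, Kalman filters and the like, but a toy inverse-variance weighting shows the basic idea. The readings and variances below are invented for illustration:

```python
# Toy sensor fusion: combine noisy distance estimates from the three sensor
# types Musk names. Inverse-variance weighting is the simplest consistent
# way to merge them; real systems use filters that also track motion over time.
def fuse(estimates):
    """estimates: list of (distance_m, variance) pairs, one per sensor."""
    weights = [1.0 / var for _, var in estimates]
    fused = sum(w * d for (d, _), w in zip(estimates, weights)) / sum(weights)
    return fused

readings = [
    (24.8, 4.0),   # camera + image recognition: works far out, but noisier
    (25.3, 0.5),   # radar: precise ranging at distance
    (25.1, 2.0),   # long-range ultrasonics
]
print(f"fused distance: {fuse(readings):.1f} m")  # ~25.2 m, dominated by radar
```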


Sand seems limitless, something we can almost disregard. But like water, its supply is currently under stress, owing in part to a growing world population requiring basic resources. A global building boom is stripping beaches bare, their sand carted off to make cement, disappearing them. From Laura Höflinger at Spiegel:

“The phenomenon of disappearing beaches is not unique to Cape Verde. With demand for sand greater than ever, it can be seen in most parts of the world, including Kenya, New Zealand, Jamaica and Morocco. In short, our beaches are disappearing. ‘It’s the craziest thing I’ve seen in the past 25 years,’ says Robert Young, a coastal researcher at Western Carolina University. ‘We’re talking about ugly, miles-long moonscapes where nothing can live anymore.’

The sand on our ocean shores, once a symbol of inexhaustibility, has suddenly become scarce. So scarce that stealing it has become attractive.

Never before has Earth been graced with the prosperity we are seeing today, with countries like China, India and Brazil booming. But that also means that demand for sand has never been so great. It is used in the production of computer chips, plates and mobile phones. More than anything, though, it is used to make cement. You can find it in the skyscrapers in Shanghai, the artificial islands of Dubai and in Germany’s autobahns.”


From the October 29, 1910 Brooklyn Daily Eagle:

“Kewanee, Ill. — Believing she would die unless snakes were in her home, Mrs. Ada Packard has received from New York a boa constrictor eleven feet long to be used as a pet.

Mrs. Packard, who claims to have a gift over reptiles, always has had a fondness for serpents, and when she failed to rally from a recent surgical operation satisfactorily she became convinced she would improve if she could get a snake to fondle. A poisonous copperhead was obtained three weeks ago, and it was believed her health immediately was benefited. The snake became chilled, however, and died, after which Mrs. Packard became ill again.

Her husband objected to the coming of the serpent, but believing it was a case of life or death with his wife, he consented to the order for another. Mrs. Packard declares she noticed great improvement in her health already. She had the snake placed in a tub of warm water, as the cold weather was feared.

She says her gift has been noticeable since she was a child, when snakes crawled to her from all directions whenever she was in their vicinity at picnics in the woods.”


Thanks to the excellent 3 Quarks Daily for pointing me to Maia Szalavitz’s Substance.com essay, “Most People With Addiction Simply Grow Out of It.” Hopelessly addicted is a phrase we’re all familiar with, but it’s an extreme outlier, not the rule, as most people kick after a few years. We likely believe addiction is terminal because we conjure the most extreme and dramatic examples to represent it; call it the Availability Heuristic of heroin and the like. Unfortunately, that misunderstanding influences laws and treatment. An excerpt:

“Why do so many people still see addiction as hopeless? One reason is a phenomenon known as ‘the clinician’s error,’ which could also be known as the ‘journalist’s error’ because it is so frequently replicated in reporting on drugs. That is, journalists and rehabs tend to see the extremes: Given the expensive and often harsh nature of treatment, if you can quit on your own you probably will. And it will be hard for journalists or treatment providers to find you.

Similarly, if your only knowledge of alcohol came from working in an ER on Saturday nights, you might start thinking that prohibition is a good idea. All you would see are overdoses, DTs, or car crash, rape or assault victims. You wouldn’t be aware of the patients whose alcohol use wasn’t causing problems. And so, although the overwhelming majority of alcohol users drink responsibly, your ‘clinical’ picture of what the drug does would be distorted by the source of your sample of drinkers.

Treatment providers get a similarly skewed view of addicts: The people who keep coming back aren’t typical—they’re simply the ones who need the most help. Basing your concept of addiction only on people who chronically relapse creates an overly pessimistic picture.

This is one of many reasons why I prefer to see addiction as a learning or developmental disorder, rather than taking the classical disease view.”
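Szalavitz’s point about skewed samples is easy to demonstrate with a toy simulation. The recovery and return rates below are invented purely to illustrate the mechanism, not drawn from her essay:

```python
# Toy Monte Carlo of the "clinician's error": if most people quit on their
# own but chronic relapsers keep returning to treatment, the clinic's sample
# badly overstates how hopeless addiction is. All rates here are invented.
import random

random.seed(0)
N = 100_000
population = [random.random() < 0.8 for _ in range(N)]  # True = quits on own

# Probability of turning up in a clinic sample: low for those who quit
# unaided, high for chronic relapsers who keep coming back.
clinic_sample = [quits for quits in population
                 if random.random() < (0.05 if quits else 0.9)]

true_chronic = 1 - sum(population) / N
observed_chronic = 1 - sum(clinic_sample) / len(clinic_sample)
print(f"chronic share in the population: {true_chronic:.0%}")      # ~20%
print(f"chronic share the clinic sees:   {observed_chronic:.0%}")  # ~80%
```

Same population, wildly different picture, which is exactly how “hopelessly addicted” became the default image.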


Take Control of Your Home – 47-year-old man for hire

My specialty is reining in unruly roommates and adult children who refuse to obey the rules or grow up. Effective, discreet. Initial consultation free.

You know that famous 1967 clip of a woman shopping online? Here’s a 21-minute segment of the film it’s from, “Year: 1999 A.D.” The Philco-Ford featurette follows the fictional Shaw family, led by the astrophysicist/botanist dad (played by game-show host Wink Martindale), who is employed on a Mars colonization project. Life tomorrow was to be computerized, monitored, networked, automated, centralized and quantified. It was supposed to be a bountiful technotopia “full of leisure.” If the Internet isn’t lying to me, McCabe & Mrs. Miller cinematographer Vilmos Zsigmond shot it.

Wink recalls the film:

I just want an apology from the geniuses who mocked me for predicting at the start of the aughts (in a published article that’s no longer online) that films would eventually be released on all screens simultaneously, on large theater screens as well as on TVs and computers. It hasn’t happened yet, but it may. Actually, it will, almost definitely. The opening of “Is Netflix Trying to Kill the Theater Once and for All?” by Grantland’s John Lopez:

“Next time you’re in Los Angeles, check out the historic Broadway theater district downtown: At the turn of the century, before the studios and theater chains were split apart, the stretch of Broadway between Third and Olympic boasted the highest concentration of cinemas in the world, the jeweled crown of L.A.’s burgeoning film industry. On any given night, studios premiered their latest films at sumptuous movie palaces like the Orpheum and the Million Dollar Theater. More recently, these temples of cinema, which wouldn’t look out of place at Versailles, have hosted Sunday revival churches and Spanish-language swap meets. Now they’re mostly ghosts of a bygone era when the theatrical experience was the undisputed king of American mass culture. It’s that ghost that streaked through modern-day multiplex owners’ nightmares Monday when Netflix (aided by the prince of darkness himself, Harvey Weinstein) announced that for the first time it would stream a major feature film, Crouching Tiger, Hidden Dragon: The Green Legend, simultaneous to its IMAX theater release next summer.

Predictably, by Tuesday morning a film business already battered by the worst box office summer since 1997 went apeshit. In fact, that nightmare freaked out theater owners so bad that Regal, Cinemark, and AMC — the nation’s three biggest theater chains — dropped the popcorn gauntlet Tuesday and announced they would refuse to carry the Crouching Tiger sequel at their theaters. In other words, as Netflix was announcing a historic new era when on-demand truly means on-demand, the nation’s theaters collectively said, ‘Over our swap-meet-hosting dead bodies.’ Obviously, your local cineplex isn’t going to shut down after next summer, but let’s answer some questions about what’s going on here before the revolution arrives.

Could this truly be the beginning of the much-foretold end of the moviegoing experience? And should you even care?

Yes. And yes.”

