Science/Tech

You are currently browsing the archive for the Science/Tech category.

Those who still believe privacy can be preserved by legislation either haven’t thought through the realities or are deceiving themselves. Get ready for your close-up because it’s not the pictures that are getting smaller, but the cameras. Tinier and tinier. Soon you won’t even notice them. And they can fly.

I have no doubt the makers of the Nixie, the wristwatch-drone camera, have nothing but good intentions, but not everyone using it will. From Joseph Flaherty at Wired:

“Being able to wear the drone is a cute gimmick, but its powerful software packed into a tiny shell could set Nixie apart from bargain Brookstone quadcopters. Expertise in motion-prediction algorithms and sensor fusion will give the wrist-worn whirlybirds an impressive range of functionality. A ‘Boomerang mode’ allows Nixie to travel a fixed distance from its owner, take a photo, then return. ‘Panorama mode’ takes aerial photos in a 360° arc. ‘Follow me’ mode makes Nixie trail its owner and would capture amateur athletes in a perspective typically reserved for Madden all-stars. ‘Hover mode’ gives any filmmaker easy access to impromptu jib shots. Other drones promise similar functionality, but none promise the same level of portability or user friendliness.

‘We’re not trying to build a quadcopter, we’re trying to build a personal photographer,’ says Jovanovic.

A Changing Perspective on Photography

[Jelena] Jovanovic and her partner Christoph Kohstall, a Stanford postdoc who holds a Ph.D. in quantum physics and a first-author credit in the journal Nature, believe photography is at a tipping point.

Early cameras were bulky, expensive, and difficult to operate. The last hundred years have produced consistently smaller, cheaper, and easier-to-use cameras, but future developments are forking. Google Glass provides the ultimate in portability, but leaves wearers with a fixed perspective. Surveillance drones offer unique vantage points, but are difficult to operate. Nixie attempts to offer the best of both worlds.”•
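The “sensor fusion” Flaherty mentions is the interesting bit: a hovering camera has to blend a gyroscope’s fast-but-drifting readings with an accelerometer’s noisy-but-stable tilt sense to know which way is up. A minimal sketch of one standard approach, the complementary filter (purely illustrative; Nixie hasn’t published its algorithms):

```python
def complementary_filter(prev_angle, gyro_rate, accel_angle, dt, alpha=0.98):
    """Fuse two imperfect sensors into one attitude estimate.

    gyro_rate: angular velocity (deg/s); responsive, but integrating it drifts.
    accel_angle: tilt inferred from gravity (deg); drift-free, but noisy in flight.
    alpha: how much to trust the gyro on each update.
    """
    return alpha * (prev_angle + gyro_rate * dt) + (1 - alpha) * accel_angle

# Toy loop at 100 Hz: the estimate follows the gyro in the short term
# while the accelerometer slowly corrects the accumulated drift.
angle = 0.0
for gyro_rate, accel_angle in [(10.0, 0.5), (9.8, 1.4), (10.1, 2.6)]:
    angle = complementary_filter(angle, gyro_rate, accel_angle, dt=0.01)
print(round(angle, 3))  # ~0.376
```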


I previously posted some stuff about driverless-car testing in a mock cityscape in Ann Arbor, Michigan, which might seem unnecessary given Google’s regular runs on actual streets and highways. But here’s an update on the progress from “Town Built for Driverless Cars,” by Will Knight at Technology Review:

“A mocked-up set of busy streets in Ann Arbor, Michigan, will provide the sternest test yet for self-driving cars. Complex intersections, confusing lane markings, and busy construction crews will be used to gauge the aptitude of the latest automotive sensors and driving algorithms; mechanical pedestrians will even leap into the road from between parked cars so researchers can see if they trip up onboard safety systems.

The urban setting will be used to create situations that automated driving systems have struggled with, such as subtle driver-pedestrian interactions, unusual road surfaces, tunnels, and tree canopies, which can confuse sensors and obscure GPS signals.

‘If you go out on the public streets you come up against rare events that are very challenging for sensors,’ says Peter Sweatman, director of the University of Michigan’s Mobility Transformation Center, which is overseeing the project. ‘Having identified challenging scenarios, we need to re-create them in a highly repeatable way. We don’t want to be just driving around the public roads.'”


For productivity to increase, labor costs must shrink. That’s fine provided new industries emerge to accommodate workers, but that really isn’t what we’ve seen so far in the Technological Revolution, the next great bend in the curve, as production and wages haven’t boomed. It’s been the trade of a taxi medallion for a large pink mustache. More convenient, if not cheaper (yet), for the consumer, but bad for the drivers.

Perhaps that’s because we’re at the outset of a slow-starting boom, the Second Machine Age, or perhaps what we’re going through refuses to follow the form of the Industrial Revolution. Maybe it’s the New Abnormal. The opening of the Economist feature, “Technology Isn’t Working”:

“IF THERE IS a technological revolution in progress, rich economies could be forgiven for wishing it would go away. Workers in America, Europe and Japan have been through a difficult few decades. In the 1970s the blistering growth after the second world war vanished in both Europe and America. In the early 1990s Japan joined the slump, entering a prolonged period of economic stagnation. Brief spells of faster growth in intervening years quickly petered out. The rich world is still trying to shake off the effects of the 2008 financial crisis. And now the digital economy, far from pushing up wages across the board in response to higher productivity, is keeping them flat for the mass of workers while extravagantly rewarding the most talented ones.

Between 1991 and 2012 the average annual increase in real wages in Britain was 1.5% and in America 1%, according to the Organisation for Economic Co-operation and Development, a club of mostly rich countries. That was less than the rate of economic growth over the period and far less than in earlier decades. Other countries fared even worse. Real wage growth in Germany from 1992 to 2012 was just 0.6%; Italy and Japan saw hardly any increase at all. And, critically, those averages conceal plenty of variation. Real pay for most workers remained flat or even fell, whereas for the highest earners it soared.

It seems difficult to square this unhappy experience with the extraordinary technological progress during that period, but the same thing has happened before. Most economic historians reckon there was very little improvement in living standards in Britain in the century after the first Industrial Revolution. And in the early 20th century, as Victorian inventions such as electric lighting came into their own, productivity growth was every bit as slow as it has been in recent decades.

In July 1987 Robert Solow, an economist who went on to win the Nobel prize for economics just a few months later, wrote a book review for the New York Times. The book in question, The Myth of the Post-Industrial Economy by Stephen Cohen and John Zysman, lamented the shift of the American workforce into the service sector and explored the reasons why American manufacturing seemed to be losing out to competition from abroad. One problem, the authors reckoned, was that America was failing to take full advantage of the magnificent new technologies of the computing age, such as increasingly sophisticated automation and much-improved robots. Mr Solow commented that the authors, ‘like everyone else, are somewhat embarrassed by the fact that what everyone feels to have been a technological revolution…has been accompanied everywhere…by a slowdown in productivity growth.'”
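Those OECD wage figures compound, so it’s worth doing the arithmetic the excerpt implies. A quick check of the cumulative effect (rates and spans taken from the quote above; the code is mine):

```python
def cumulative_growth(annual_rate, years):
    """Total growth from compounding a constant annual rate."""
    return (1 + annual_rate) ** years - 1

# Annual real-wage growth rates and spans as quoted above.
print(f"Britain, 1.5% a year, 1991-2012: {cumulative_growth(0.015, 21):.0%}")  # ~37%
print(f"America, 1.0% a year, 1991-2012: {cumulative_growth(0.010, 21):.0%}")  # ~23%
print(f"Germany, 0.6% a year, 1992-2012: {cumulative_growth(0.006, 20):.0%}")  # ~13%
```

Barely a third of cumulative growth in Britain over two decades of extraordinary computing progress, and far less in Germany, is exactly the Solow paradox the piece closes on.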

Anne Frank famously wrote, “Despite everything, I believe that people are really good at heart,” and if she could be hopeful, how can we feel grounded by despondency? A sneakily cheerful side to the growing cadre of scientists and philosophers focused on existential risks that could drive human extinction is that if we make it through the next century, we may have unbridled abundance. From Aaron Labaree at Salon:

“‘When we think about general intelligence,’ says Luke Muehlhauser, Executive Director at MIRI, ‘that’s a meta-technology that gives you everything else that you want — including really radical things that are even weird to talk about, like having our consciousness survive for thousands of years. Physics doesn’t outlaw those things, it’s just that we don’t have enough intelligence and haven’t put enough work into the problem … If we can get artificial intelligence right, I think it would be the best thing that ever happened in the universe, basically.’

A surprising number of conversations with experts in human extinction end like this: with great hope. You’d think that contemplating robot extermination would make you gloomy, but it’s just the opposite. As [Martin] Rees explains, ‘What science does is makes one aware of the marvelous potential of life ahead. And being aware of that, one is more concerned that it should not be foreclosed by screwing up during this century.’ Concern over humanity’s extermination at the hands of nanobots or computers, it turns out, often conceals optimism of the kind you just don’t find in liberal arts majors. It implies a belief in a common human destiny and the transformative power of technology.

‘The stakes are very large,’ [Nick] Bostrom told me. ‘There is this long-term future that could be so enormous. If our descendants colonized the universe, we could have these intergalactic civilizations with planetary-sized minds thinking and feeling things that are beyond our ken, living for billions of years. There’s an enormous amount of value that’s on the line.’

It’s all pretty hypothetical for now.”•


I don’t know if it will happen within ten years–though that’s not an outrageous time frame–but 3-D printing will automate much of the restaurant process, making jobs vanish, and will also be common in homes as prices fall. The opening of “Is 3-D Printing the Next Microwave?” by Jane Dornbusch at the Boston Globe:

“Picture the dinner hour in a decade: As you leave work, you pull up an app (assuming we still use apps) on your phone (or your watch!) that will direct a printer in your kitchen to whip up a batch of freshly made ravioli, some homemade chicken nuggets for the kids, and maybe a batch of cookies, each biscuit customized to meet the nutritional needs of different family members. 

It sounds like science fiction, but scientists and engineers are working on 3-D printing, and the food version of the 3-D printer is taking shape. Don’t expect it to spin out fully cooked meals anytime soon. For now, the most popular application in 3-D food printing seems to be in the decidedly low-tech area of cake decoration.

Well, not just cake decoration, but sugary creations of all kinds. The Sugar Lab is the confectionary arm of 3-D printing pioneer 3D Systems, and it expects to have the ChefJet, a 3-D food printer, available commercially later this year. Though tinkerers have been exploring the possibilities of 3-D food printing for a few years, and another food printer, Natural Machines’ Foodini, is slated to appear by year’s end, 3D Systems says the ChefJet is the world’s first 3-D food printer.

Like so many great inventions, the ChefJet came about as something of an accident, this one performed by husband-and-wife architecture grad students Liz and Kyle von Hasseln a couple of years ago. At the Southern California Institute of Architecture, the von Hasselns used a 3-D printer to create models. Intrigued by the process, Liz von Hasseln says, ‘We bought a used printer and played around with different materials to see how to push the technology. One thing we tried was sugar. We thought if we altered the machine a bit and made it all food safe and edible, we could push into a new space.’ More tweaking ensued, and the ChefJet was born.”

___________________________

Walter Cronkite presents the kitchen of 2001 in 1967:


I haven’t yet read Walter Isaacson’s new Silicon Valley history, The Innovators, but I would be indebted if it answers the question of how instrumental Gary Kildall’s software was to Microsoft’s rise. Was Bill Gates and Paul Allen’s immense success built on intellectual thievery? Has the story been mythologized beyond realistic proportion? An excerpt from Brendan Koerner’s New York Times review of the book:

“The digital revolution germinated not only at button-down Silicon Valley firms like Fairchild, but also in the hippie enclaves up the road in San Francisco. The intellectually curious denizens of these communities ‘shared a resistance to power elites and a desire to control their own access to information.’ Their freewheeling culture would give rise to the personal computer, the laptop and the concept of the Internet as a tool for the Everyman rather than scientists. Though Isaacson is clearly fond of these unconventional souls, his description of their world suffers from a certain anthropological detachment. Perhaps because he’s accustomed to writing biographies of men who operated inside the corridors of power — Benjamin Franklin, Henry ­Kissinger, Jobs — Isaacson seems a bit baffled by committed outsiders like ­Stewart Brand, an LSD-inspired futurist who predicted the democratization of computing. He also does himself no favors by frequently citing the work of John Markoff and Tom Wolfe, two writers who have produced far more intimate portraits of ’60s ­counterculture.

Yet this minor shortcoming is quickly forgiven when The Innovators segues into its rollicking last act, in which hardware becomes commoditized and software goes on the ascent. The star here is Bill Gates, whom Isaacson depicts as something close to a punk — a spoiled brat and compulsive gambler who ‘was rebellious just for the hell of it.’ Like Paul Baran before him, Gates encountered an appalling lack of vision in the corporate realm — in his case at IBM, which failed to realize that its flagship personal computer would be cloned into oblivion if the company permitted Microsoft to license the machine’s MS-DOS operating system at will. Gates pounced on this mistake with a feral zeal that belies his current image as a sweater-clad humanitarian.”


Selfies, the derided yet immensely popular modern portraiture, draw ire because of narcissism and exhibitionism, of course, but also because anyone can take them and do so ad nauseam. It’s too easy and available, with no expertise or gatekeeper necessary. The act is megalomania, sure, but it’s also democracy, that scary, wonderful rabble. From, ultimately, a defense of the self-directed shot by Douglas Coupland in the Financial Times:

“Selfies are the second cousin of the air guitar.

Selfies are the proud parents of the dick pic.

Selfies are, in some complex way, responsible for the word ‘frenemy.’

I sometimes wonder what selfies would look like in North Korea.

Selfies are theoretically about control – or, if you’re theoretically minded, they’re about the illusion of self-control. With a selfie some people believe you’re buying into a collective unspoken notion that everybody needs to look fresh and flirty and young for ever. You’re turning yourself into a product. You’re abdicating power of your sexuality. Or maybe you’re overthinking it – maybe you’re just in love with yourself.

I believe that it’s the unanticipated side effects of technology that directly or indirectly define the textures and flavours of our eras. Look at what Google has already done to the 21st century. So when smartphones entered the world in 2002, I think that if you gathered a group of smart media-savvy people in a room with coffee and good sandwiches, before the end of the day, the selfie could easily have been forecast as an inevitable smartphone side effect. There’s actually nothing about selfies that feels like a surprise in any way. The only thing that is surprising is the number of years it took us to isolate and name the phenomenon. I do note, however, that once the selfie phenomenon was named and shamed, selfies exploded even further, possibly occupying all of those optical-fibre lanes of the internet that were once occupied by Nigerian princes and ads for penis enlargement procedures.”



The end is near, roughly speaking. A number of scientists and philosophers, most notably Martin Rees and Nick Bostrom, agonize over the existential risks to humanity that might obliterate us long before the sun dies. A school of thought has arisen over the End of Days. From Sophie McBain in the New Statesman (via 3 Quarks Daily):

“Predictions of the end of history are as old as history itself, but the 21st century poses new threats. The development of nuclear weapons marked the first time that we had the technology to end all human life. Since then, advances in synthetic biology and nanotechnology have increased the potential for human beings to do catastrophic harm by accident or through deliberate, criminal intent.

In July this year, long-forgotten vials of smallpox – a virus believed to be ‘dead’ – were discovered at a research centre near Washington, DC. Now imagine some similar incident in the future, but involving an artificially generated killer virus or nanoweapons. Some of these dangers are closer than we might care to imagine. When Syrian hackers sent a message from the Associated Press Twitter account that there had been an attack on the White House, the Standard & Poor’s 500 stock market briefly fell by $136bn. What unthinkable chaos would be unleashed if someone found a way to empty people’s bank accounts?

While previous doomsayers have relied on religion or superstition, the researchers at the Future of Humanity Institute want to apply scientific rigour to understanding apocalyptic outcomes. How likely are they? Can the risks be mitigated? And how should we weigh up the needs of future generations against our own?

The FHI was founded nine years ago by Nick Bostrom, a Swedish philosopher, when he was 32. Bostrom is one of the leading figures in this small but expanding field of study.”


Speaking of human laborers being squeezed: Open Source with Christopher Lydon has an episode called “The End of Work,” with two guests, futurist Ray Kurzweil and MIT economist Andrew McAfee. A few notes.

  • McAfee sees the Technological Revolution as doing for gray matter what the Industrial Revolution did for muscle fiber, but on the way to a world of wealth without toil–a Digital Athens–the bad news is the strong chance of greater income inequality and decreased opportunities for many. Kodak employed 150,000; Instagram a small fraction of that. With the new technologies, destruction (of jobs) outpaces creation. Consumers win, but Labor loses.
  • Kurzweil is more hopeful in the shorter term than McAfee. He says we have more jobs and more gratifying ones today than 100 years ago and they pay better. We accomplish more. Technology will improve us, make us smarter, to meet the demands of a world without drudgery. It won’t be us versus the machines, but the two working together. The majority of jobs always go away, most of the jobs today didn’t exist so long ago. New industries will be invented to provide work. He doesn’t acknowledge a painful period of adjustment in distribution before abundance can reach all.


Sure, we have phones that are way nicer now, but the Technological Revolution has largely been injurious to anyone in the Labor market, and things are going to get worse, at least in the near and mid term. A free-market society that is highly automated isn’t really very free. Drive for Uber until autonomous cars can take over the wheel, you’re told, or rent a spare room on Airbnb–make space for yourself on the margins through the Sharing Economy. But there’s less to share for most people. From an Economist report:

“Before the horseless carriage, drivers presided over horse-drawn vehicles. When cars became cheap enough, the horses and carriages had to go, which eliminated jobs such as breeding and tending horses and making carriages. But cars raised the productivity of the drivers, for whom the shift in technology was what economists call ‘labour-augmenting.’ They were able to serve more customers, faster and over greater distances. The economic gains from the car were broadly shared by workers, consumers and owners of capital. Yet the economy no longer seems to work that way. The big losers have been workers without highly specialised skills.

The squeeze on workers has come from several directions, as the car industry clearly shows. Its territory is increasingly encroached upon by machines, including computers, which are getting cheaper and more versatile all the time. If cars and lorries do not need drivers, then both personal transport and shipping are likely to become more efficient. Motor vehicles can spend more time in use, with less human error, but there will be no human operator to share in the gains.

At the same time labour markets are hollowing out, polarising into high- and low-skill occupations, with very little employment in the middle. The engineers who design and test new vehicles are benefiting from technological change, but they are highly skilled and it takes remarkably few of them to do the job. At Volvo much of the development work is done virtually, from the design of the cars to the layout of the production line. Other workers, like the large numbers of modestly skilled labourers that might once have worked on the factory floor, are being squeezed out of such work and are now having to compete for low-skill and low-wage jobs.

Labour has been on the losing end of technological change for several decades.”

It’s not a shocker that the late psychologist and computer scientist Dr. Christopher Evans, who presented the great 1979 TV series The Mighty Micro, combined the two disciplines he was devoted to when trying to explain why he believed people dream. From Daniel Goleman’s 1984 New York Times article about the possible causes of eyes moving rapidly:

Dr. Evans, a psychologist and computer scientist, proposes that dreams are the brain’s equivalent of a computer’s inspection of its programs, allowing a chance to integrate the experiences of the day with the memories already stored in the brain. His theory is based in part on evidence that dreaming consolidates learning and memory.

The contents of a dream, according to Dr. Evans, are fragments of events and experiences during the day which are being patched into related previous memories. “Dreaming,” he writes, “might be our biological equivalent to the computer’s process of program inspection.”•



Space-travel enthusiast and labor organizer David Lasser was one of the first Americans to champion a mission to the moon, and one of the most influential. His 1931 book, The Conquest of Space, suggested such a rocket voyage was possible, not fanciful. In Lasser’s 1996 New York Times obituary, Arthur C. Clarke said of the then-65-year-old volume: “[It was] the first book in the English language to explain that space travel wasn’t just fiction…[it was] one of the turning points in my life — and I suspect not only of mine.”

While an article about the book’s publication in the October 6, 1931 Brooklyn Daily Eagle took seriously Lasser’s vision of rocket-powered airline travel–from New York to Paris in one hour!–it gave less credence to his moonshot scenario.



From the Overcoming Bias post in which economist Robin Hanson comments on Debora MacKenzie’s New Scientist article “The End of Nations,” a piece which wonders about, among other things, whether states in the modern sense predated the Industrial Revolution:

“An interesting claim: the nation-state didn’t exist before, and was caused by, the industrial revolution. Oh there were empires before, but most people didn’t identify much with empires, or see empires as much influencing their lives. In contrast people identify with nation-states, which they see as greatly influencing their lives. More:

Before the late 18th century there were no real nation states. … If you travelled across Europe, no one asked for your passport at borders; neither passports nor borders as we know them existed. People had ethnic and cultural identities, but these didn’t really define the political entity they lived in. …

Agrarian societies required little actual governing. Nine people in 10 were peasants who had to farm or starve, so were largely self-organising. Government intervened to take its cut, enforce basic criminal law and keep the peace within its undisputed territories. Otherwise its main role was to fight to keep those territories, or acquire more. … Many eastern European immigrants arriving in the US in the 19th century could say what village they came from, but not what country: it didn’t matter to them. … Ancient empires are coloured on modern maps as if they had firm borders, but they didn’t. Moreover, people and territories often came under different jurisdictions for different purposes.

Such loose control, says Bar-Yam, meant pre-modern political units were only capable of scaling up a few simple actions such as growing food, fighting battles, collecting tribute and keeping order. …

The industrial revolution … demanded a different kind of government. … ‘In 1800 almost nobody in France thought of themselves as French. By 1900 they all did.’ … Unlike farming, industry needs steel, coal and other resources which are not uniformly distributed, so many micro-states were no longer viable. Meanwhile, empires became unwieldy as they industrialised and needed more actual governing. So in 19th-century Europe, micro-states fused and empires split.

These new nation states were justified not merely as economically efficient, but as the fulfilment of their inhabitants’ national destiny. A succession of historians has nonetheless concluded that it was the states that defined their respective nations, and not the other way around.”


It’s a heartbreaker watching what’s happening to the New York Times these days, and the new round of layoffs is just the most recent horrible headline. The Magazine is currently a green shoot, with its bright new editor, Jake Silverstein, and a boost to staffing, but that section is the outlier. The business can’t continue to suffer without being joined by the journalism. You just hope the company is ultimately sold to someone great.

At the Washington Post, not much has changed dramatically since Jeff Bezos bought the Graham family jewel, despite some executive shuffles and new hires. Does Bezos have a long-term plan? Does he have any plan? Does it really matter in the intervening period, since he can afford to wait for everyone else to fall and position his publication as the inheritor? From Dylan Byers at Politico:

“Despite expectations, Bezos himself had never promised a reinvention. ‘There is no map, and charting a path ahead will not be easy,’ he wrote in his first memo to Post staff in August of last year. Still, his reputation preceded him: With Amazon, he had revolutionized not just the book-selling business but the very means and standards of online shopping. He was planning ambitious new initiatives like drone delivery. Surely, this man had the silver bullet to save the Washington Post, and perhaps the newspaper industry.

Bezos, who declined to be interviewed for this story, is holding his cards close to his chest. He has no influence on the editorial side, according to [Executive Editor Martin] Baron, but is focused on ‘broader strategic efforts.’

If Bezos has any innovative digital initiatives in the works, they’re being formed not in Washington but in New York. In March, the Post launched a Manhattan-based design and development office called WPNYC, which is focused on improving the digital experience and developing new advertising products.

‘Jeff’s preoccupation isn’t editorial, it’s delivery,’ one Post staffer said of WPNYC. ‘He wants to change the way people receive, read and experience the news. The only problem is we still don’t know what that looks like.'”


Aiming to make surgical invasion even more minimal, robots are being devised that can slide into small openings and perform currently messy operations. From Sarah Laskow at the Atlantic:

“In the past few years, surgeons have been pushing to make these less invasive surgeries almost entirely invisible. Instead of cutting a tiny window in the outside of the body, they thought, why not cut one inside? Surgeons would first enter a person’s body through a ‘natural orifice’ and make one small incision, through which to access internal organs. The end result of this idea was that, in 2009, a surgeon removed a woman’s kidney through her vagina.

Few surgeons were convinced this was actually an improvement though. Instead, they have focused on minimizing the number of tiny incisions needed to perform surgery. Single-site surgery requires just one ‘port’ into a body.

A team of surgeons at Columbia, for instance, is working on a small robotic arm—minuscule, when compared to the da Vinci system—that can sneak into one 15 millimeter incision. And NASA is working on a robot that can enter the abdominal cavity through a person’s belly button, Matrix-like, to perform simple surgeries. It’s meant to be used in emergencies, but we know how this story goes: Soon enough, it’ll be routine for a robot to slide into a person’s body and pull her appendix back out.”


Elon Musk has said that Tesla will produce fully autonomous vehicles within six years, which doesn’t make complete sense to me because I think infrastructure would have to be modified before that’s possible, but he is now promising that 2015 models will reach the 90% autopilot threshold. From Chris Ziegler at The Verge:

“In an excerpt from a CNNMoney interview, Tesla boss Elon Musk says that the self-driving car — or ‘autopilot,’ the term he prefers — is basically just months away from retail. Here’s the language:

‘Autonomous cars will definitely be a reality. A Tesla car next year will probably be 90 percent capable of autopilot. Like, so 90 percent of your miles can be on auto. For sure highway travel.

How’s that going to happen?

With a combination of various sensors. You combine cameras with image recognition with radar and long-range ultrasonics, that’ll do it. Other car companies will follow.

But you guys are going to be the leader?

Of course. I mean, Tesla’s a Silicon Valley company. If we’re not the leader, shame on us.'”


Sand seems limitless, something we can almost disregard. But like water, its supply is currently under stress, owing in part to a growing world population requiring basic resources. A global building boom is stripping beaches bare, their sand used to make cement, disappearing them. From Laura Höflinger at Spiegel:

“The phenomenon of disappearing beaches is not unique to Cape Verde. With demand for sand greater than ever, it can be seen in most parts of the world, including Kenya, New Zealand, Jamaica and Morocco. In short, our beaches are disappearing. ‘It’s the craziest thing I’ve seen in the past 25 years,’ says Robert Young, a coastal researcher at Western Carolina University. ‘We’re talking about ugly, miles-long moonscapes where nothing can live anymore.’

The sand on our ocean shores, once a symbol of inexhaustibility, has suddenly become scarce. So scarce that stealing it has become attractive.

Never before has Earth been graced with the prosperity we are seeing today, with countries like China, India and Brazil booming. But that also means that demand for sand has never been so great. It is used in the production of computer chips, plates and mobile phones. More than anything, though, it is used to make cement. You can find it in the skyscrapers in Shanghai, the artificial islands of Dubai and in Germany’s autobahns.”


A follow-up on yesterday’s post about films being released on all screens, not just theater ones: Netflix is continuing to transform itself into a studio that streams, inking a four-picture deal with the inexplicably popular Adam Sandler. (The first three are rumored to be a trilogy about a golfer who has violent diarrhea competing against another golfer who has violent diarrhea.) From Pamela McClintock at the Hollywood Reporter:

“Netflix has signed a deal to make four feature films with Adam Sandler as the streaming service continues its empire-building and moves into producing original movies that bypass the usual theatrical release.

Sandler’s Happy Madison Productions will work alongside Netflix in developing the yet-to-be announced titles, which will premiere exclusively in the nearly 50 countries where Netflix operates. It’s a significant move for Sandler, a longtime denizen of the Hollywood studio system — a system wedded to playing films first in theaters, not in the home. He’ll both star in and produce the Netflix projects.

‘When these fine people came to me with an offer to make four movies for them, I immediately said yes for one reason and one reason only … Netflix rhymes with Wet Chicks,’ Sandler said in a statement. ‘Let the streaming begin!!!!'”


Thanks to the excellent 3 Quarks Daily for pointing me to Maia Szalavitz’s Substance.com essay, “Most People With Addiction Simply Grow Out Of It.” Hopelessly addicted is a phrase we’re all familiar with, but it describes an extreme outlier, not the rule, as most people kick after a few years. We likely believe addiction is terminal because we conjure the most extreme and dramatic examples to represent it; call it the Availability Heuristic of heroin and the like. Unfortunately, that misunderstanding influences laws and treatment. An excerpt:

“Why do so many people still see addiction as hopeless? One reason is a phenomenon known as ‘the clinician’s error,’ which could also be known as the ‘journalist’s error’ because it is so frequently replicated in reporting on drugs. That is, journalists and rehabs tend to see the extremes: Given the expensive and often harsh nature of treatment, if you can quit on your own you probably will. And it will be hard for journalists or treatment providers to find you.

Similarly, if your only knowledge of alcohol came from working in an ER on Saturday nights, you might start thinking that prohibition is a good idea. All you would see are overdoses, DTs, or car crash, rape or assault victims. You wouldn’t be aware of the patients whose alcohol use wasn’t causing problems. And so, although the overwhelming majority of alcohol users drink responsibly, your ‘clinical’ picture of what the drug does would be distorted by the source of your sample of drinkers.

Treatment providers get a similarly skewed view of addicts: The people who keep coming back aren’t typical—they’re simply the ones who need the most help. Basing your concept of addiction only on people who chronically relapse creates an overly pessimistic picture.

This is one of many reasons why I prefer to see addiction as a learning or developmental disorder, rather than taking the classical disease view.”
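The sampling argument is easy to make concrete. Here’s a toy simulation (all percentages invented for illustration, not drawn from Szalavitz’s piece): suppose most users remit on their own within a few years, while chronic cases are far more likely to turn up in treatment. The clinic’s sample then badly overstates chronicity:

```python
import random

random.seed(0)

population = []
for _ in range(100_000):
    chronic = random.random() < 0.20          # assume 20% of users become chronic
    # Chronic cases are far more likely to land in a treatment sample.
    in_treatment = random.random() < (0.60 if chronic else 0.05)
    population.append((chronic, in_treatment))

true_rate = sum(c for c, _ in population) / len(population)
clinic = [c for c, t in population if t]
clinic_rate = sum(clinic) / len(clinic)
print(f"chronic share of all users:     {true_rate:.0%}")    # ~20%
print(f"chronic share of clinic sample: {clinic_rate:.0%}")  # ~75%
```

Same population, two very different pictures: the clinician’s error in miniature.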


You know that famous 1967 clip of a woman shopping online? Here’s a 21-minute segment of the film it’s from, “Year: 1999 A.D.” The Philco-Ford featurette follows the fictional Shaw family, led by the astrophysicist/botanist dad (played by game-show host Wink Martindale), who is employed on a Mars colonization project. Life tomorrow was to be computerized, monitored, networked, automated, centralized and quantified. It was supposed to be a bountiful technotopia “full of leisure.” If the Internet isn’t lying to me, McCabe & Mrs. Miller cinematographer Vilmos Zsigmond shot it.

Wink recalls the film:

I just want an apology from the geniuses who mocked me for predicting at the start of the aughts (in a published article that’s no longer online) that films would eventually be released on all screens simultaneously, large theater ones as well as TVs and computers. It hasn’t happened yet, but it may. Actually, it will, almost definitely. The opening of “Is Netflix Trying to Kill the Theater For Once and All?” by Grantland’s John Lopez:

“Next time you’re in Los Angeles, check out the historic Broadway theater district downtown: At the turn of the century, before the studios and theater chains were split apart, the stretch of Broadway between Third and Olympic boasted the highest concentration of cinemas in the world, the jeweled crown of L.A.’s burgeoning film industry. On any given night, studios premiered their latest films at sumptuous movie palaces like the Orpheum and the Million Dollar Theater. More recently, these temples of cinema, which wouldn’t look out of place at Versailles, have hosted Sunday revival churches and Spanish-language swap meets. Now they’re mostly ghosts of a bygone era when the theatrical experience was the undisputed king of American mass culture. It’s that ghost that streaked through modern-day multiplex owners’ nightmares Monday when Netflix (aided by the prince of darkness himself, Harvey Weinstein) announced that for the first time it would stream a major feature film, Crouching Tiger, Hidden Dragon: The Green Legend, simultaneous to its IMAX theater release next summer.

Predictably, by Tuesday morning a film business already battered by the worst box office summer since 1997 went apeshit. In fact, that nightmare freaked out theater owners so bad that Regal, Cinemark, and AMC — the nation’s three biggest theater chains — dropped the popcorn gauntlet Tuesday and announced they would refuse to carry the Crouching Tiger sequel at their theaters. In other words, as Netflix was announcing a historic new era when on-demand truly means on-demand, the nation’s theaters collectively said, “Over our swap-meet-hosting dead bodies.” Obviously, your local cineplex isn’t going to shut down after next summer, but let’s answer some questions about what’s going on here before the revolution arrives.

Could this truly be the beginning of the much-foretold end of the moviegoing experience? And should you even care?

Yes. And yes.”


An excerpt from “Future of Rail 2050,” an ARUP study which predicts that the demands of sprawling megacities will completely overhaul the nature of railway stations and that the typical person will be named “Nuno”:

“Hugo Dupont, 31 • Smart City Engineer

Hugo is rushing to catch the Metro train to work. Earlier, as he reached Rue Daval, he remembered that he had left a parcel on his kitchen counter and had to turn back to get it. Now, running a little late but parcel in hand, he pauses as a fleet of driverless pods pass by and then crosses the road at the signal, disappearing into the Metro station. 

He needs the package to be delivered that evening, as today is his friend Nuno’s birthday. At the entrance to the Metro, he drops the parcel into the International Express box next to the interactive tourist information wall. As he selects to receive freight alerts to track the progress of his package and pays for the shipping with a tap of a button, a message notifies him that his meeting with colleagues in Hong Kong via holographic software will start in 15 minutes. He hurries to the platform to catch his train.

The platform screen doors slide shut just before Hugo can board the Metro. However, he isn’t too worried as he knows the next train will arrive in under a minute. The driverless metro trains can travel in close succession as they constantly communicate with each other and with rail infrastructure and automatically respond to the movements of the other trains on the track, making the metro extremely safe and efficient. 

As he waits, Hugo notices other commuters buying groceries from the virtual shopping wall. As his fridge hasn’t sent him any alerts, he thinks he is stocked up well enough at home for the time being. He also glances at some of the artwork on platform screen doors — he enjoys seeing the changing digital exhibitions every day.

Meanwhile, at 08:46, Hugo’s parcel drops onto a conveyor belt and is transported to a pod on the underground freight pipeline. The routing code is scanned as it is loaded onto the pod, and the package is whisked away to Gare Centrale. The electric pod travels uninhibited at a steady pace, independent of traffic and weather conditions, and at 09:16 the package is loaded onto the mail carriage at the back of the waiting high-speed EuroTrain that carries both passengers and small express freight. At 10:35, the train leaves the station and runs directly to Berlin.

In his office, Hugo is testing a new system for analysing how much electricity from braking trains is fed back into the grid, when he receives a notification informing him that his package is on the train and is running on schedule. Hugo lives alone in an apartment in a large European megacity. Having studied abroad, he has returned to his home city and works as a Smart City Engineer for the City Authority, maintaining a network of sensors tracking electricity, traffic and people flows to create efficiencies across city systems. He likes gadgets and his wearable computers perform a variety of functions from wayfinding, to holographic communication, to the real-time monitoring of his health.”
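The system Hugo is testing rests on simple physics: a braking train’s kinetic energy, ½mv², can be partly fed back into the grid through regenerative braking. A rough worked example (mass, speed, and recovery efficiency are all assumed figures, not from the ARUP study):

```python
def recoverable_energy_kwh(mass_kg, speed_kmh, efficiency):
    """Kinetic energy of a moving train, scaled by regeneration efficiency."""
    v = speed_kmh / 3.6               # km/h to m/s
    joules = 0.5 * mass_kg * v ** 2 * efficiency
    return joules / 3.6e6             # joules to kilowatt-hours

# A 200-tonne metro train braking to a stop from 80 km/h, recovering 60%.
print(f"{recoverable_energy_kwh(200_000, 80, 0.6):.1f} kWh per stop")  # ~8.2 kWh
```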

When ten teams out of thirty reach the playoffs, as they do in baseball, not every club that makes the grade will be tremendous. So while neither the Orioles nor Royals boast any of the sport’s best players, it’s not a shocker that they’re still playing. The odds were decent that it would happen to them or to a couple of star-less teams like them.

But it’s still fun to figure out how they did it. The Economist chalks up their postseason presence to great defense, which is a little behind the curve, but I’m just happy that publication is covering baseball at all. It certainly is a part of the story, especially with the Royals. (I would say Buck Showalter is the O’s greatest “secret” weapon.) With the Pirates making it to October the last two seasons thanks to dramatic defensive shifting and the Royals doing so this year with great gloves, catching the ball has never been more in vogue. Of course, it’s still not easy to rank an individual player’s defense even with all the advanced stats and endless amount of video, so the teams with the best analytics and coaching can earn a few extra wins. From the Economist:

“It should come as little surprise that fielding has proved to be baseball’s next analytical frontier. Offence is almost mindlessly easy to measure: since every hitter gets roughly the same number of opportunities, faces a similar quality of opposition, and is entirely responsible for the outcome of his at-bat, all one has to do is count the results. Pitching has historically been somewhat harder to assess, since its effect on run prevention has to be disentangled from that of defence. But the advent of statistics like FIP has made it fairly straightforward to analyse pitchers based only on the factors under their control. Fielding, in contrast, is devilishly tough to evaluate quantitatively. The number and difficulty of the balls hit to each position vary wildly from team to team over the course of a season. Moreover, the visually spectacular plays likely to be voted Web Gems—leaping, diving or throwing off-balance—are often the product of poor initial positioning or circuitous routes. “Making it look easy” is in fact the highest praise for a fielder. It is also the best way to ensure that one’s skills never get noticed.

For most of baseball history, these obstacles have led teams to misjudge defensive contributions and to underweight the importance of glovework, particularly at ‘bat-first’ positions like first base and left and right field. But the advent of video scouting firms like STATS and Baseball Info Solutions (BIS), followed by the development of the Sportvision Fieldf/x system that digitally tracks batted balls in real time, has at last enabled clubs to give defence its due.”
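FIP, name-checked in the excerpt, is worth spelling out, because it’s the template for what fielding analytics wants to replicate: a metric built only from outcomes the defense cannot touch (home runs, walks, hit batters, strikeouts), rescaled to read like an ERA. A sketch of the standard formulation (the additive constant varies by league and season; 3.10 is a typical value):

```python
def fip(hr, bb, hbp, k, ip, league_constant=3.10):
    """Fielding Independent Pitching, on an ERA-like scale."""
    return (13 * hr + 3 * (bb + hbp) - 2 * k) / ip + league_constant

# A hypothetical season line: 15 HR, 40 BB, 5 HBP, 200 K in 190 innings.
print(round(fip(hr=15, bb=40, hbp=5, k=200, ip=190), 2))  # ~2.73
```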

Bayesian statistics–“Monty Hall Math,” you might call it–is a nontraditional method employed to interpret probability and improve the odds of being right. It’s not a sure thing but a surer one. From F.D. Flam in the New York Times:

“Take, for instance, a study concluding that single women who were ovulating were 20 percent more likely to vote for President Obama in 2012 than those who were not. (In married women, the effect was reversed.)

Dr. [Andrew] Gelman re-evaluated the study using Bayesian statistics. That allowed him to look at probability not simply as a matter of results and sample sizes, but in the light of other information that could affect those results.

He factored in data showing that people rarely change their voting preference over an election cycle, let alone a menstrual cycle. When he did, the study’s statistical significance evaporated. (The paper’s lead author, Kristina M. Durante of the University of Texas, San Antonio, said she stood by the finding.)

Dr. Gelman said the results would not have been considered statistically significant had the researchers used the frequentist method properly. He suggests using Bayesian calculations not necessarily to replace classical statistics but to flag spurious results.

A famously counterintuitive puzzle that lends itself to a Bayesian approach is the Monty Hall problem, in which Mr. Hall, longtime host of the game show Let’s Make a Deal, hides a car behind one of three doors and a goat behind each of the other two. The contestant picks Door No. 1, but before opening it, Mr. Hall opens Door No. 2 to reveal a goat. Should the contestant stick with No. 1 or switch to No. 3, or does it matter?

A Bayesian calculation would start with one-third odds that any given door hides the car, then update that knowledge with the new data: Door No. 2 had a goat. The odds that the contestant guessed right — that the car is behind No. 1 — remain one in three. Thus, the odds that she guessed wrong are two in three. And if she guessed wrong, the car must be behind Door No. 3. So she should indeed switch.”
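The bookkeeping in that last paragraph is easy to verify empirically. A short simulation of the game as described (nothing here beyond the standard rules):

```python
import random

def play(switch: bool) -> bool:
    doors = [0, 1, 2]
    car = random.choice(doors)
    pick = random.choice(doors)
    # Monty opens a door that hides a goat and isn't the contestant's pick.
    opened = random.choice([d for d in doors if d != pick and d != car])
    if switch:
        pick = next(d for d in doors if d != pick and d != opened)
    return pick == car

trials = 100_000
for switch in (False, True):
    wins = sum(play(switch) for _ in range(trials))
    print(f"switch={switch}: win rate ~ {wins / trials:.3f}")
# Staying wins about a third of the time; switching about two-thirds.
```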


Transforming Downtown Las Vegas into a technotopia always seemed a quixotic quest at best, but that was Zappos founder Tony Hsieh’s top-down attempt, a huge wager in the American capital of gambling. Unsurprisingly, the “house” has been unsparing. From Nellie Bowles at Recode:

“In a surprise all-hands meeting at the Inspire Theater a few weeks ago, Hsieh, whose $350 million in funding and vision turned 60 acres of Downtown Las Vegas into a growing tech city, told his staff he was stepping down and handing the reins over to his lawyer, Millie Chou. On Tuesday, the project laid off 30 percent of the staff.

News of the layoffs was first reported by KNPR News.

‘(Hsieh) said I see myself as advisor and investor, but I’m going to appoint someone as our strategy implementation lead,’ one source who attended the meeting said.

Another person close to Downtown Project said the new businesses — like an artisanal doughnut shop and a high-end flower vendor — were ‘bleeding money.’

‘It seems like it’s being run by kids, that’s because it’s being run by kids,’ one source said about the Downtown Project.

This person cited Hsieh’s hiring decisions, which included several family members, as a problem.

‘There are a lot of people in leadership at Downtown Project who have absolutely no business being there,’ the source said. ‘Tony is not always altogether the most wise judge of character. There’s a lot of family. There’s a lot of drinking buddies. And some poor choices were made.'”

