Sue Halpern


An unqualified sociopath was elected President of the United States with the aid of the FBI, fake news, Russian spies, white supremacists and an accused rapist who’s holed up inside the Ecuadorian embassy in London to avoid arrest. Writing that sentence a million times can’t make it any less chilling.

WikiLeaks’ modus operandi over the last couple of years probably wouldn’t be markedly different if it were in the hands of Steve Bannon rather than Julian Assange, so it’s not surprising the organization leaked a trove of (apparently overhyped) documents about CIA surveillance just as Trump was being lambasted from both sides of the aisle for baselessly accusing his predecessor of “wiretapping.” The timing is familiar if you recall that WikiLeaks began releasing Clinton campaign emails directly after the surfacing of a video that recorded Trump’s boasts of sexual assault. With all this recent history, is it any surprise Assange mockingly described himself as a “deplorable” when chiding Twitter for refusing to verify his account?

The decentralization of media, with powerful tools in potentially every hand, has changed the game, no doubt. We’re now in a permanent Spy vs. Spy cartoon, though one that isn’t funny, with feds and hackers perpetually at loggerheads. Which side can do the most damage? Voters have some recourse with regard to government snooping but not so with private-sector enterprises. In the rush to privatize and outsource long-established areas of critical services, from prisons to the military to intelligence work, we’ve also dispersed dangers.

From Sue Halpern’s New York Review of Books piece “The Assange Distraction”:

In his press conference, Assange observed that no cyber weapons are safe from hacking because they live on the Internet, and once deployed are themselves at risk of being stolen. When that happens, he said, “there’s a very easy cover for any gray market operator, contractor, rogue intelligence agent to take that material and start a company with it. Start a consulting company, a hacker for hire company.” Indeed, the conversation we almost never have when we’re talking about cyber-security and hacking is the one where we acknowledge just how privatized intelligence gathering has become, and what the consequences of this have been. According to the reporters Dana Priest, Marjorie Censer and Robert O’Harrow, Jr., at least 70 percent of the intelligence community’s “secret” budget now goes to private contractors. And, they write, “Never before have so many US intelligence workers been hired so quickly, or been given access to secret government information through networked computers. …But in the rush to fill jobs, the government has relied on faulty procedures to vet intelligence workers, documents and interviews show.” Much of this expansion occurred in the aftermath of the September 11 attacks, when the American government sought to dramatically expand its intelligence-gathering apparatus.

Edward Snowden was a government contractor; he had a high security clearance while working for both Dell and for Booz Allen Hamilton. Vault 7’s source, from what one can discern from Assange’s remarks, was most likely a contractor, too. The real connection between Snowden’s NSA revelations and an anonymous leaker handing off CIA malware to WikiLeaks, however, is this: both remind us, in different ways, that the expansion of the surveillance state has made us fundamentally less secure, not more.

Julian Assange, if he is to be believed, now possesses the entire cyber-weaponry of the CIA. He claims that they are safe with him while explaining that nothing is safe on the Internet. He says that the malware he’s published so far is only part of the CIA arsenal, and that he’ll reveal more at a later date. If that is not a veiled threat, then this is: Assange has not destroyed the source codes that came to him with Vault 7, the algorithms that run these programs, and he hasn’t categorically ruled out releasing them into the wild, where they would be available to any cyber-criminal, state actor, or random hacker. This means that Julian Assange is not just a fugitive, he is a fugitive who is armed and dangerous.•


  • Is it worse if algorithms fishing through the polluted waters of the Internet are great at perceiving our personalities or if they’re just so much bilge? Maybe the efficacy we grant them is the problem regardless of their level of accuracy.
  • After I purchased a copy of Mark Twain’s Roughing It from Amazon in 2014, the ghosts in Jeff Bezos’ machine recommended to me Prepper’s Pantry: The Survival Guide To Emergency Water & Food Storage. Missed by that much.
  • In 2012, Target had eerie success in using data analysis to divine personal details, famously suggesting maternity-type purchases to a teen whose parents did not even know yet she was pregnant. But surprising insight in some cases doesn’t mean universal proficiency has been achieved.
  • Online video advertising is a bubble–get people to watch three seconds and get paid!–but perhaps something similarly suspect is happening when Facebook sells your supposed personality profile and preferences. Are they really just pushing false assumptions?
  • Whether we’re talking about predicting criminality based on facial features, hiring an employee or picking the next book to read, machines may seem smarter than humans, but that may not be so, at least not yet.

In one passage from “They Have, Right Now, Another You,” one of her regularly excellent commentaries about our new tools and what they’ve wrought, Sue Halpern of the New York Review of Books looks into the black mirror and sees something strange–a stranger. An excerpt:

While Facebook appears to be making seriously wrong and misdirected assumptions about me, and then cashing in on those mistakes, it is hardly alone in using its raw data to come to strange and wildly erroneous assumptions. Researchers at the Psychometrics Centre at Cambridge University in England have developed what they call a “predictor engine,” fueled by algorithms using a subset of a person’s Facebook “likes” that “can forecast a range of variables that includes happiness, intelligence, political orientation and more, as well as generate a big five personality profile.” (The big five are extroversion, agreeableness, openness, conscientiousness, and neuroticism, and are used by, among others, employers to assess job applicants. The acronym for these is OCEAN.) According to the Cambridge researchers, “we always think beyond the mere clicks or Likes of an individual to consider the subtle attributes that really drive their behavior.” The researchers sell their services to businesses with the promise of enabling “instant psychological assessment of your users based on their online behavior, so you can offer real-time feedback and recommendations that set your brand apart.”

So here’s what their prediction engine came up with for me: that I am probably male, though “liking” The New York Review of Books page makes me more “feminine”; that I am slightly more conservative than liberal—and this despite my stated affection for Bernie Sanders on Facebook; that I am much more contemplative than engaged with the outside world—and this though I have “liked” a number of political and activist groups; and that, apparently, I am more relaxed and laid back than 62 percent of the population. (Questionable.)

Here’s what else I found out about myself. Not only am I male, but “six out of ten men with [my] likes are gay,” which gives me “around an average probability” of being not just male, but a gay male. The likes that make me appear “less gay” are the product testing magazine Consumer Reports, the tech blog Gizmodo, and another website called Lifehacker. The ones that make me appear “more gay” are The New York Times and the environmental group 350.org. Meanwhile, the likes that make me “appear less interested in politics” are The New York Times and 350.org.

And there’s more. According to the algorithm of the Psychometrics Centre, “Your likes suggest you are single and not in a relationship.” Why? Because I’ve liked the page for 350.org, an organization founded by the man with whom I’ve been in a relationship for thirty years!

Amusing as this is, it’s also an object lesson, yet again, about how easy it is to misconstrue and misinterpret data.•
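The like-based “predictor engine” Halpern describes can be pictured as a weighted-scoring model over page likes. The sketch below is purely illustrative–the page names and weights are invented for demonstration, not the Psychometrics Centre’s actual model, which is trained on large-scale survey-linked Facebook data:

```python
# Illustrative sketch of a like-based trait predictor.
# All pages and weights here are hypothetical stand-ins.

TRAIT_WEIGHTS = {
    # page: {trait: weight}; positive weights push the score up
    "The New York Review of Books": {"openness": 0.8, "extroversion": -0.2},
    "350.org": {"agreeableness": 0.5, "openness": 0.4},
    "Consumer Reports": {"conscientiousness": 0.6},
}

def predict_traits(likes):
    """Sum per-page weights into a crude big-five score for each trait."""
    scores = {}
    for page in likes:
        for trait, weight in TRAIT_WEIGHTS.get(page, {}).items():
            scores[trait] = scores.get(trait, 0.0) + weight
    return scores

print(predict_traits(["The New York Review of Books", "350.org"]))
```

Even this toy version shows where the misreadings Halpern documents come from: a like carries the average tendencies of everyone who clicked it, so any individual–a thirty-year 350.org partner, say–can be confidently mis-scored.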



The amazing, Zeitgeist-capturing photograph above, taken by Brett Gundlock of Bloomberg, shows drivers in Mexico City gridlock being peppered with advertisements floated by Uber drones. While you might think it dangerous that even slow-moving vehicles are besieged by hovering appeals sent from the heavens or thereabouts, Travis Kalanick, the leading ridesharer’s CEO, wants to remove that worry, eliminating the burden of drivers so they can instead plug their ears and eyes into other machines. Why stop and smell the roses when you can count the drones?

Autonomous vehicles are likely upon us, whether that means they arrive at high speed or merge more gradually with the Digital Age. While making the roads and highways safer was the early selling point for these cars, their establishment will have a profound effect on surveillance, employment, urban design, ethics, capitalism and even human nature itself. Of course, there will be unintended consequences we can’t yet even appreciate.

It’s also worthwhile to mention that the intervening period between fully human driving and fully automated control will not be without incident, in much the way that horse-drawn carts and internal combustion engines made for uneasy partners on the road during that earlier transition. One thing I’m sure of is that driverless cars will not create a “utopian society,” a promise often assigned to new technological tools at their outset, before we remember that the function they provide was never the main problem with us to start with.

In a New York Review of Books piece on Hod Lipson and Melba Kurman’s Driverless: Intelligent Cars and the Road Ahead, Sue Halpern looks at the industry’s dream scenario of fleets of autonomous taxis and the significant roadblocks to its realization. Even if the challenges are met, cheaper rides might not reduce wealth inequality but exacerbate the problem.

An excerpt:

The major car makers, rushing to make alliances with tech companies, understand their days of dominance are numbered. “We are rapidly becoming both an auto company and a mobility company,” Bill Ford, the chairman of Ford Motor Company, told an audience in Kansas City in February. He knows that if the fleet model prevails, Ford and other car manufacturers will be selling many fewer cars. More crucially, the winners in this new system will be the ones with the best software, and the best software will come from the most robust data, and the companies with the most robust data are the tech companies that have been hoovering it up for years: Google most of all.

“The mobility revolution is going to affect all of us personally and many of us professionally,” Ford said that day in Kansas City. He might have been thinking about car salespeople, whose jobs are likely to become obsolete, but before that it will be the taxi drivers and truckers who will be displaced by vehicles that drive themselves. Historically these have been the jobs that have provided incomes to recently arrived immigrants and to people without college degrees. Without them yet another trajectory into the middle class will be eliminated.

What of Uber drivers themselves? These are the poster people for the gig-economy, “entrepreneurs”—which is to say freelancers—who use their own cars to ferry people around. “Obviously the self-driving car thing is freaking people out a little bit,” an Uber driver in Pittsburgh named Ryan told a website called TechRepublic. And, he went on, he learned about Uber’s plans from the media, not from the company. “If it’s a negative thing, they let you find out for yourself.” As media critic Douglas Rushkoff has written, “Uber’s drivers are the R&D for Uber’s driverless future. They are spending their labor and capital investments (cars) on their own future unemployment.”

All economies have winners and losers. It does not take a sophisticated algorithm to figure out that the winners in the decades ahead are going to be those who own the robots, for they will have vanquished labor with their capital. In the case of autonomous vehicles, a few companies are now poised to control a necessary public good, the transportation of people to and from work, school, shopping, recreation, and other vital activities. This salient fact is often lost in the almost unanimously positive reception of the coming “mobility revolution,” as Bill Ford calls it.



In a series of articles in the New York Review of Books over the last couple of years, Sue Halpern has taken a thought-provoking look at the dubious side of the Digital Era, considering the impact of tech billionaires, technological unemployment and the Internet of Things.

Her latest salvo tries to locate the real legacy of Steve Jobs, who was mourned equally in office parks and Zuccotti Park. In doing so she calls on the two recent films on the Apple architect, Alex Gibney’s and Danny Boyle’s, and the new volume about him by Brent Schlender and Rick Tetzeli. Ultimately, the key truth may be that Jobs used a Barnum-esque “magic” and marketing myths to not only sell his new machines but to plug them into consumers’ souls.

An excerpt:

So why, Gibney wonders as his film opens—with thousands of people all over the world leaving flowers and notes “to Steve” outside Apple Stores the day he died, and fans recording weepy, impassioned webcam eulogies, and mourners holding up images of flickering candles on their iPads as they congregate around makeshift shrines—did Jobs’s death engender such planetary regret?

The simple answer is voiced by one of the bereaved, a young boy who looks to be nine or ten, swiveling back and forth in a desk chair in front of his computer: “The thing I’m using now, an iMac, he made,” the boy says. “He made the iMac. He made the Macbook. He made the Macbook Pro. He made the Macbook Air. He made the iPhone. He made the iPod. He’s made the iPod Touch. He’s made everything.”

Yet if the making of popular consumer goods was driving this outpouring of grief, then why hadn’t it happened before? Why didn’t people sob in the streets when George Eastman or Thomas Edison or Alexander Graham Bell died—especially since these men, unlike Steve Jobs, actually invented the cameras, electric lights, and telephones that became the ubiquitous and essential artifacts of modern life?* The difference, suggests the MIT sociologist Sherry Turkle, is that people’s feelings about Steve Jobs had less to do with the man, and less to do with the products themselves, and everything to do with the relationship between those products and their owners, a relationship so immediate and elemental that it elided the boundaries between them. “Jobs was making the computer an extension of yourself,” Turkle tells Gibney. “It wasn’t just for you, it was you.”•


In the latest excellent Sue Halpern NYRB piece, this one about Ashlee Vance’s Elon Musk bio, the critic characterizes the technologist as equal parts Iron Man and Tin Man, a person of otherworldly accomplishment who lacks a heart, his globe-saving goals having seemingly liberated him from a sense of empathy.

As Halpern notes, even Steve Jobs, given to auto-hagiography of stunning proportion, had ambitions dwarfed by Musk’s, who aims to not just save the planet but to also take us to a new one, engaging in a Space Race to Mars with NASA (while simultaneously doing business with the agency). The founder of Space X, Tesla, etc., may be parasitic on existing technologies, but he’s intent on revitalizing, not damaging, his hosts, doing so by bending giant corporations, entire industries and even governments to meet his will. An excerpt:

Two years after the creation of SpaceX, President George W. Bush announced an ambitious plan for manned space exploration called the Vision for Space Exploration. Three years later, NASA chief Michael Griffin suggested that the space agency could have a Mars mission off the ground in thirty years. (Just a few weeks ago, six NASA scientists emerged from an eight-month stint in a thirty-six-foot isolation dome on the side of Mauna Loa meant to mimic conditions on Mars.) Musk, ever the competitor, says he will get people to Mars by 2026. The race is on.

How are those Mars colonizers going to communicate with friends and family back on earth? Musk is working on that. He has applied to the Federal Communications Commission for permission to test a satellite-beamed Internet service that, he says, “would be like rebuilding the Internet in space.” The system would consist of four thousand small, low-orbiting satellites that would ring the earth, handing off services as they traveled through space. Though satellite Internet has been tried before, Musk thinks that his system, relying as it does on SpaceX’s own rockets and relatively inexpensive and small satellites, might actually work. Google and Fidelity apparently think so too. They recently invested $1 billion in SpaceX, in part, according to The Washington Post, to support Musk’s satellite Internet project.

While SpaceX’s four thousand circling satellites have the potential to create a whole new meaning for the World Wide Web, since they will beam down the Internet to every corner of the earth, the system holds additional interest for Musk. “Mars is going to need a global communications system, too,” he apparently told a group of engineers he was hoping to recruit at an event last January in Redmond, Washington. “A lot of what we do developing Earth-based communications can be leveraged for Mars as well, as crazy as that may sound.”


In her NYRB piece on Nicholas Carr’s The Glass Cage, Sue Halpern runs through periods of the twentieth century when fears of technological unemployment were raised before receding, mentioning a 1980 Time cover story about the Labor-destabilizing force of machines. These projections seemed to have been proved false as job creation increased considerably during the Reagan Administration, but as Halpern goes on to note, that feature article may have been prescient in ways we didn’t then understand. Income inequality began to boom during the last two decades of the previous century, a worrying trajectory that’s only been exacerbated as we’ve moved deeper into the Digital Revolution. Certainly there are other causes but automation is likely among them, with the new wealth in the hands of fewer, algorithms and robots managing a good portion of the windfall-creating toil. And if you happen to be working in many of the fields likely to soon be automated (hotels, restaurants, warehouses, etc.), you might want to ask some former travel agents and record-store owners for resume tips. 

Halpern zeroes in on a Carr topic often elided by economists debating whether the next few decades will be boon or bane for the non-wealthy: the hole left in our hearts when we’re “freed” of work. Is that something common to us because we were born on the other side of the transformation, or are humans marked indelibly with the need to produce beyond tweets and likes? Maybe it’s the work, not the play, that’s the thing. From Halpern:

Here is what that future—which is to say now—looks like: banking, logistics, surgery, and medical recordkeeping are just a few of the occupations that have already been given over to machines. Manufacturing, which has long been hospitable to mechanization and automation, is becoming more so as the cost of industrial robots drops, especially in relation to the cost of human labor. According to a new study by the Boston Consulting Group, currently the expectation is that machines, which now account for 10 percent of all manufacturing tasks, are likely to perform about 25 percent of them by 2025. (To understand the economics of this transition, one need only consider the American automotive industry, where a human spot welder costs about $25 an hour and a robotic one costs $8. The robot is faster and more accurate, too.) The Boston group expects most of the growth in automation to be concentrated in transportation equipment, computer and electronic products, electrical equipment, and machinery.

Meanwhile, algorithms are writing most corporate reports, analyzing intelligence data for the NSA and CIA, reading mammograms, grading tests, and sniffing out plagiarism. Computers fly planes—Nicholas Carr points out that the average airline pilot is now at the helm of an airplane for about three minutes per flight—and they compose music and pick which pop songs should be recorded based on which chord progressions and riffs were hits in the past. Computers pursue drug development—a robot in the UK named Eve may have just found a new compound to treat malaria—and fill pharmacy vials.

Xerox uses computers—not people—to select which applicants to hire for its call centers. The retail giant Amazon “employs” 15,000 warehouse robots to pull items off the shelf and pack boxes. The self-driving car is being road-tested. A number of hotels are staffed by robotic desk clerks and cleaned by robotic chambermaids. Airports are instituting robotic valet parking. Cynthia Breazeal, the director of MIT’s personal robots group, raised $1 million in six days on the crowd-funding site Indiegogo, and then $25 million in venture capital funding, to bring Jibo, “the world’s first social robot,” to market. …

There is a certain school of thought, championed primarily by those such as Google’s Larry Page, who stand to make a lot of money from the ongoing digitization and automation of just about everything, that the elimination of jobs concurrent with a rise in productivity will lead to a leisure class freed from work. Leaving aside questions about how these lucky folks will house and feed themselves, the belief that most people would like nothing more than to be able to spend all day in their pajamas watching TV—which turns out to be what many “nonemployed” men do—sorely misconstrues the value of work, even work that might appear to an outsider to be less than fulfilling. Stated simply: work confers identity. When Dublin City University professor Michael Doherty surveyed Irish workers, including those who stocked grocery shelves and drove city buses, to find out if work continues to be “a significant locus of personal identity,” even at a time when employment itself is less secure, he concluded that “the findings of this research can be summed up in the succinct phrase: ‘work matters.’”

How much it matters may not be quantifiable, but in an essay in The New York Times, Dean Baker, the codirector of the Center for Economic and Policy Research, noted that there was

a 50 to 100 percent increase in death rates for older male workers in the years immediately following a job loss, if they previously had been consistently employed.

One reason was suggested in a study by Mihaly Csikszentmihalyi, the author of Flow: The Psychology of Optimal Experience (1990), who found, Carr reports, that “people were happier, felt more fulfilled by what they were doing, while they were at work than during their leisure hours.”


 

Here are 25 pieces of journalism from this year, alphabetized by author name, which made me consider something new or reconsider old beliefs or just delighted me.

  • “Exodus” (Ross Andersen, Aeon) A brilliant longform piece that lifts off with Elon Musk’s mission to Mars and veers in deep and mysterious directions.
  • “Barack Obama, Ferguson, and the Evidence of Things Unsaid” (Ta-Nehisi Coates, The Atlantic) Nobody speaks truth to race in America quite like Coates, and the outrage of Ferguson was the impetus for this spot-on piece about the deeply institutionalized prejudice of government, national and local, in the U.S.
  • “The Golden Age of Journalism?” (Tom Engelhardt, TomDispatch) The landscape has never been more brutal for news nor more promising. The author luxuriates in the richness destabilization has wrought.
  • “Amazon Must Be Stopped” (Franklin Foer, The New Republic) Before things went completely haywire at the company, Foer returned some sanity to the publication in the post-Peretz period. This lucid article argues that Amazon isn’t becoming a monopoly but already qualifies as one.
  • “America in Decay” (Francis Fukuyama, Foreign Affairs) Strong argument that the U.S. public sector is so dysfunctional because of a betrayal of meritocracy in favor of special interests and lobbyists. The writer’s idea of what constitutes a merit-based system seems flawed, but he offers many powerful ideas.
  • “What’s the Matter With Russia?” (Keith Gessen, Foreign Affairs) An insightful meditation about Putin’s people, who opt to live in a fairy tale despite knowing such a thing can never have a happy ending.
  • “The Dying Russians” (Masha Gessen, New York Review of Books) Analysis of Russia’s high mortality rate suggests that the root cause is not alcohol, guns or politics, but simply hopelessness.
  • “Soak the Rich” (David Graeber, Thomas Piketty) Great in-depth exchange between two thinkers who believe capitalism has run amok, but only one of whom thinks it’s run its course.
  • “The First Smile” (Michael Graziano, Aeon) The Princeton psychology and neuroscience professor attempts to explain why facial expressions appear to be natural and universal.
  • “The Creepy New Wave of the Internet” (Sue Halpern, New York Review of Books) The author meditates on the Internet of Things, which may make the world much better and much worse, quantifying us like never before.
  • “Super-Intelligent Humans Are Coming” (Stephen Hsu, Nautilus) A brisk walk through the process of genetic modification, which would lead to heretofore unknown brain power.
  • “All Dressed Up For Mars and Nowhere to Go” (Elmo Keep, Matter) A sprawling look at the seeming futility of the MarsOne project ultimately gets at a more profound pointlessness–pursuing escape in a dying universe.
  • “The Myth of AI” (Jaron Lanier, Edge) Among other things, this entry draws a neat comparison between the religionist’s End of Days and the technologist’s Singularity, the Four Horsemen supposedly arriving in driverless cars.
  • “The Disruption Machine” (Jill Lepore, The New Yorker) The “D” word, its chief promulgator, Clayton M. Christensen, and its circuitous narratives receive some disruption of their own.
  • “The Longevity Gap” (Linda Marsa, Aeon) A severely dystopian thought experiment: Will the parallels of widening income disparity and innovations in medicine lead to two very different lifespans for the haves and have-nots?
  • “The Genetics Epidemic” (Jamie F. Metzl, Foreign Affairs) Genetic modification studied from an uncommon angle, that of national-security concerns.
  • “My Captivity” (Theo Padnos, The New York Times Magazine) A harrowing autobiographical account of an American journalist’s hostage ordeal in the belly of the beast in Syria.
  • “We Are a Camera” (Nick Paumgarten, The New Yorker) In a time of cheap, ubiquitous cameras, the image, merely an imitation, is ascendant, and any event unrecorded seemingly has less currency. The writer examines the strangeness of life in the GoPro flow.
  • “A Goddamn Death Dedication” (Alex Pappademas, Grantland) A knowing postmortem about Casey Kasem, America’s deejay when the world was hi-fi but before it became sci-fi.
  • “In Conversation: Chris Rock” (Frank Rich, New York) The exchange about “black progress” is an example of what comedy does at its best: It points out an obvious truth that so many have missed.
  • “The Mammoth Cometh” (Nathaniel Rich, The New York Times Magazine) A piece which points out that de-extinct animals won’t be exactly like their forebears, nor will augmented humans of the future be just like us. It’s progress, probably.
  • “Hello, My Name Is Stephen Glass, and I’m Sorry” (Hanna Rosin, The New Republic) Before the implosion of the publication, the writer wondered what it would mean to forgive her former coworker, an inveterate fabulist and liar, and what it would mean if she could not.
  • “Gilbert Gottfried: New York Punk” (Jay Ruttenberg, The Lowbrow Reader) Written by the only person on the list whom I know personally, but no cronyism is necessary for the inclusion of this excellent analysis of the polarizing comic, who’s likely more comfortable when at his most alienating.


The Internet of Things is wonderful–and terrible. Consider how valuable the aggregated information will be once all objects report back to the cloud and the network effect takes hold, how impossible it will be to opt out, and how unfortunately that information will sometimes be used. If we go from the ten million sensors currently connected to the Internet to 100 trillion by 2030, as theorist Jeremy Rifkin predicts, the next digital revolution will have taken place, with all the good and bad that entails. The opening of Sue Halpern’s New York Review of Books analysis of a slew of new titles about how tomorrow may find us all tethered:

“Every day a piece of computer code is sent to me by e-mail from a website to which I subscribe called IFTTT. Those letters stand for the phrase ‘if this then that,’ and the code is in the form of a ‘recipe’ that has the power to animate it. Recently, for instance, I chose to enable an IFTTT recipe that read, ‘if the temperature in my house falls below 45 degrees Fahrenheit, then send me a text message.’ It’s a simple command that heralds a significant change in how we will be living our lives when much of the material world is connected—like my thermostat—to the Internet.

It is already possible to buy Internet-enabled light bulbs that turn on when your car signals your home that you are a certain distance away and coffeemakers that sync to the alarm on your phone, as well as WiFi washer-dryers that know you are away and periodically fluff your clothes until you return, and Internet-connected slow cookers, vacuums, and refrigerators. ‘Check the morning weather, browse the web for recipes, explore your social networks or leave notes for your family—all from the refrigerator door,’ reads the ad for one.

Welcome to the beginning of what is being touted as the Internet’s next wave by technologists, investment bankers, research organizations, and the companies that stand to rake in some of an estimated $14.4 trillion by 2022—what they call the Internet of Things (IoT). Cisco Systems, which is one of those companies, and whose CEO came up with that multitrillion-dollar figure, takes it a step further and calls this wave ‘the Internet of Everything,’ which is both aspirational and telling. The writer and social thinker Jeremy Rifkin, whose consulting firm is working with businesses and governments to hurry this new wave along, describes it like this:

The Internet of Things will connect every thing with everyone in an integrated global network. People, machines, natural resources, production lines, logistics networks, consumption habits, recycling flows, and virtually every other aspect of economic and social life will be linked via sensors and software to the IoT platform, continually feeding Big Data to every node—businesses, homes, vehicles—moment to moment, in real time. Big Data, in turn, will be processed with advanced analytics, transformed into predictive algorithms, and programmed into automated systems to improve thermodynamic efficiencies, dramatically increase productivity, and reduce the marginal cost of producing and delivering a full range of goods and services to near zero across the entire economy.

In Rifkin’s estimation, all this connectivity will bring on the ‘Third Industrial Revolution,’ poised as he believes it is to not merely redefine our relationship to machines and their relationship to one another, but to overtake and overthrow capitalism once the efficiencies of the Internet of Things undermine the market system, dropping the cost of producing goods to, basically, nothing. His recent book, The Zero Marginal Cost Society: The Internet of Things, the Collaborative Commons, and the Eclipse of Capitalism, is a paean to this coming epoch.

It is also deeply wishful, as many prospective arguments are, even when they start from fact.”
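The IFTTT “recipe” Halpern quotes is, at bottom, a trigger-action rule. A minimal sketch of that pattern, in the spirit of her thermostat example–with hypothetical state and alert functions, not IFTTT’s actual API–might look like:

```python
# Minimal "if this then that" rule: a trigger predicate paired with an
# action, evaluated against a snapshot of sensor state. The state dict
# and alerts list are hypothetical stand-ins for a real sensor feed and
# messaging service.

def make_recipe(trigger, action):
    """Return a rule that fires the action whenever the trigger holds."""
    def run(state):
        if trigger(state):
            action(state)
    return run

alerts = []

recipe = make_recipe(
    trigger=lambda state: state["temp_f"] < 45,
    action=lambda state: alerts.append(f"House is cold: {state['temp_f']}F"),
)

recipe({"temp_f": 40})   # cold house: fires
recipe({"temp_f": 68})   # comfortable: does not fire
print(alerts)            # ['House is cold: 40F']
```

The simplicity is the point Halpern is making: each recipe is a trivial conditional, but it presumes a thermostat (or lock, or refrigerator) that continuously reports its state to a network someone else operates.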


Google is chiefly interested in accurately answering your requests because your questions have monetary potential, with predictive powers labeling you someone who is likely (or likely to become) a vegan or a yoga enthusiast, or, perhaps, a criminal. And so much the better if Big Data can figure this out before your first salad or downward dog or burglary. You aren’t just what you do but what the algorithms say you are likely to do. So, now, questions are treated like answers. From Sue Halpern in the New York Review of Books:

“The social Web celebrated, rewarded, routinized, and normalized this kind of living out loud, all the while anesthetizing many of its participants. Although they likely knew that these disclosures were funding the new information economy, they didn’t especially care. As John Naughton points out in his sleek history From Gutenberg to Zuckerberg: What You Really Need to Know About the Internet:

Everything you do in cyberspace leaves a trail, including the ‘clickstream’ that represents the list of websites you have visited, and anyone who has access to that trail will get to know an awful lot about you. They’ll have a pretty good idea, for example, of who your friends are, what your interests are (including your political views if you express them through online activity), what you like doing online, what you download, read, buy and sell.

In other words, you are not only what you eat, you are what you are thinking about eating, and where you’ve eaten, and what you think about what you ate, and who you ate it with, and what you did after dinner and before dinner and if you’ll go back to that restaurant or use that recipe again and if you are dieting and considering buying a Wi-Fi bathroom scale or getting bariatric surgery—and you are all these things not only to yourself but to any number of other people, including neighbors, colleagues, friends, marketers, and National Security Agency contractors, to name just a few.”

 
