Science/Tech


It would be great if all of us could grow smarter, but smart isn’t everything. Being wise and ethical are important, too.

PayPal co-founders Peter Thiel and Elon Musk have had access to elite educations, started successful businesses and amassed vast fortunes, but in this time of Trump they don’t seem particularly enlightened. Thiel ardently supported the bigoted, unqualified sociopath’s rise to the White House, while Musk’s situational ethics in dealing with the new abnormal are particularly amoral.

At SXSW, Ray Kurzweil said he believes technology has already made us much smarter and will improve us exponentially in that manner by 2029, when the Singularity arrives. While his views of the future are too aggressive, Kurzweil’s view of today seems oddly rose-colored. Why, if we’re so much brighter, do we have an unintelligent reality TV host in the White House? Why is there ever-deepening wealth inequality? Why are we ravaged by an opioid epidemic?

If we’re smarter now–a big if–and it’s divorced from basic morality and decency, are we any better off?

From Dyani Sabin’s Inverse piece about Kurzweil’s appearance in Austin:

The future isn’t going to look like a science fiction story with a few super intelligent A.I.s that attack us.

“That’s not realistic. We don’t have one or two A.I.s in the world. Today we have billions,” he says. And unlike Musk who imagines the rise of the A.I. as something that threatens human existence, Kurzweil says that doesn’t hold with how we interact with A.I.s today.

“What’s actually happening is they are powering all of us. They’re making us smarter. They may not yet be inside our bodies but by the 2030s we will connect our neocortex, the part of our brain where we do our thinking, to the cloud.”

This isn’t just a pipe dream to Kurzweil, who’s had reasonable luck predicting where the future is going to go. “There are people with computers in their brains today — Parkinson’s patients,” he points out. “That’s how these things start.” Following the path of steps from the technology we have now, to what will happen twenty years from now, Kurzweil says, “in the 2030’s there will be something you can take that will go inside your brain and help your memory.” And that’s just the beginning.

Uploading our brains into the cloud will allow humanity to waste less time on lower-level types of mental tasks, Kurzweil says. He’s very interested in the idea of uploading the neocortex because it’s responsible for things like art, music, and humor. By allowing our brains to connect more on that level, by melding with artificial intelligence, we will expand our ability to do these things and be better people. “Ultimately it will affect everything,” he says. “We’re going to be able to meet the physical needs of all humans. We’re going to expand our minds and exemplify these artistic qualities that we value.”•


  • Don’t blame Tim Berners-Lee, not for cat memes, spam or even the way his gift connected and emboldened the absolute worst among us. A hammer is a weapon or a tool depending on how you swing it, and like almost any invention, the World Wide Web is only as good as the people utilizing it. A lot of us aren’t very good right now.
  • Marshall McLuhan feared the Global Village even as he was heralding its arrival 50 years ago. He believed all this closeness, these worlds colliding, could explode. He encouraged us to study the new arrangement–“why not devote your powers to discerning patterns?”–lest we be overrun by them. Plenty among us know the issues at hand, but they’re not easy to address.
  • There’s no doubt the Internet does a lot of good and some of its worst excesses can be curbed, but the trouble with this tool isn’t that we’re not yet familiar enough with decentralized media and soon enough we’ll have a handle on the situation. The problems seem inherent to the medium, which is a large-scale experiment in anarchy, and just as surely as we correct some of the bugs, others will take flight.
  • During the Arab Spring there was much debate over whether the Internet was actually useful in toppling states. I think it is, regardless of whether the nation or usurpers happen to be good or bad.

In a Guardian essay, Berners-Lee offers biting criticism of his pet project, suggesting fixes. I wonder though, as with Facebook promising to address its shortcomings, if the system isn’t built for mayhem. That may be especially true since most citizens don’t seem very bothered by handing over their personal information in exchange for sating some psychological needs, offering their own Manhattan for some shiny beads.

An excerpt:

1) We’ve lost control of our personal data

The current business model for many websites offers free content in exchange for personal data. Many of us agree to this – albeit often by accepting long and confusing terms and conditions documents – but fundamentally we do not mind some information being collected in exchange for free services. But, we’re missing a trick. As our data is then held in proprietary silos, out of sight to us, we lose out on the benefits we could realise if we had direct control over this data and chose when and with whom to share it. What’s more, we often do not have any way of feeding back to companies what data we’d rather not share – especially with third parties – the T&Cs are all or nothing.

This widespread data collection by companies also has other impacts. Through collaboration with – or coercion of – companies, governments are also increasingly watching our every move online and passing extreme laws that trample on our rights to privacy. In repressive regimes, it’s easy to see the harm that can be caused – bloggers can be arrested or killed, and political opponents can be monitored. But even in countries where we believe governments have citizens’ best interests at heart, watching everyone all the time is simply going too far. It creates a chilling effect on free speech and stops the web from being used as a space to explore important topics, such as sensitive health issues, sexuality or religion.•


If Moby-Dick were the only Herman Melville book I’d ever read, I would have assumed that he was a mediocre writer with great ideas. Having gone through all of his shorter works, however, I know he could be a precise and cogent talent. He seemed to have reached for everything with his most famous novel–aiming to fashion a sort of Shakespearean Old Testament story of good and evil–and buckled under the weight of his ambitions.

The far better Moby-Dick is Cormac McCarthy’s 1992 Blood Meridian: Or the Evening Redness in the West, a horse opera of Biblical proportions, a medicine show peddling poison, which takes an unsparing look at our black hearts and leaves the reader with a purple bruise. Twenty-five years on, it remains as profound and disturbing as any American novel.

The British writer David Vann reveals he’s similarly admiring of this McCarthy work in a “Twenty Questions” interview in the Times Literary Supplement. He’s also despairing of what he believes is the bleak future of literature. I believe that as long as humans remain largely human, we’ll always be enamored of narratives. My fear is mainly that sometimes we choose the wrong ones.

An excerpt:

Question:

Is there any book, written by someone else, that you wish you’d written?

David Vann:

There are hundreds, but the foremost from this time is Cormac McCarthy’s Blood Meridian, which I think is the greatest novel ever written in English. He’s not a dramatist, and I write Greek tragedy, so I never could have imagined skipping the dramatic plane and going straight to vision. I do write in the same American landscape tradition, extending literal landscapes into figurative ones, but I’ll never do it as powerfully as he does.

Question:

What will your field look like twenty-five years from now?

David Vann:

Less money for sure. We’ve already lost so much to piracy and shrinking readerships and economic downturns. Publishers will be less brave, editors will edit less, more books will be published online for nothing, we’ll continue to lose experts and have to put up with even more reviews from unqualified idiots, and as entire generations learn to read without subtext about what someone had for lunch, we can expect literature to look more like an account of what someone had for lunch. There is absolutely no way in which the technology or literary theory of the past decades will enrich literature.  We should be honest about what is crap. …

Question:

What author or book do you think is most overrated? And why?

David Vann:

I should never answer this kind of question, because I’m only shooting myself in the foot, but when Jonathan Franzen appeared on the cover of Time as the Great American Novelist, who could not have thought of McCarthy, Proulx, Robinson, Morrison, Oates, Roth, DeLillo and at least a hundred others far better than Franzen?  And to call The Corrections the best book in ten years?  Really?•


Thomas E. Ricks of Foreign Policy asked one of the most horrifying questions about America you can pose: Will we have another civil war in the next ten to fifteen years? Keith Mines of the United States Institute of Peace and a career foreign service officer provided a sobering reply, estimating the chance for large-scale internecine violence at 60%. 

Things can change unexpectedly, sometimes for the better, but it sure does feel like we’re headed down a road to ruin, with the anti-democratic, incompetent Trump and company provoking us to a tipping point. The Simon Cowell-ish strongman may seem a fluke because of his sizable loss in the popular vote, but in many ways his political ascent is the culmination of the past four decades of dubious U.S. cultural, civic, economic, technological and political decisions. We’re not here by accident. 

· · ·

One of the criteria on which Mines bases his diagnosis: “Press and information flow is more and more deliberately divisive, and it’s increasingly easy to put out bad info and incitement.” That triggered in me a memory of a 2012 internal Facebook study, which, unsurprisingly, found that Facebook was an enemy of the echo chamber rather than one of its chief enablers. I’m not saying the scholars involved were purposely deceitful, but I don’t think even Mark Zuckerberg would stand by those results five years later. We’re worlds apart in America, and social media, and the widespread decentralization of all media, has hastened and heightened those divisions.

· · ·

An excerpt from Farhad Manjoo’s 2012 Slate piece “The End of the Echo Chamber,” about the supposed salubrious effects of social networks, is followed by Mines’ opening.


From Manjoo:

Today, Facebook is publishing a study that disproves some hoary conventional wisdom about the Web. According to this new research, the online echo chamber doesn’t exist.
 
This is of particular interest to me. In 2008, I wrote True Enough, a book that argued that digital technology is splitting society into discrete, ideologically like-minded tribes that read, watch, or listen only to news that confirms their own beliefs. I’m not the only one who’s worried about this. Eli Pariser, the former executive director of MoveOn.org, argued in his recent book The Filter Bubble that Web personalization algorithms like Facebook’s News Feed force us to consume a dangerously narrow range of news. The echo chamber was also central to Cass Sunstein’s thesis, in his book Republic.com, that the Web may be incompatible with democracy itself. If we’re all just echoing our friends’ ideas about the world, is society doomed to become ever more polarized and solipsistic?

It turns out we’re not doomed. The new Facebook study is one of the largest and most rigorous investigations into how people receive and react to news. It was led by Eytan Bakshy, who began the work in 2010 when he was finishing his Ph.D. in information studies at the University of Michigan. He is now a researcher on Facebook’s data team, which conducts academic-type studies into how users behave on the teeming network.

Bakshy’s study involves a simple experiment. Normally, when one of your friends shares a link on Facebook, the site uses an algorithm known as EdgeRank to determine whether or not the link is displayed in your feed. In Bakshy’s experiment, conducted over seven weeks in the late summer of 2010, a small fraction of such shared links were randomly censored—that is, if a friend shared a link that EdgeRank determined you should see, it was sometimes not displayed in your feed. Randomly blocking links allowed Bakshy to create two different populations on Facebook. In one group, someone would see a link posted by a friend and decide to either share or ignore it. People in the second group would not receive the link—but if they’d seen it somewhere else beyond Facebook, these people might decide to share that same link of their own accord.

By comparing the two groups, Bakshy could answer some important questions about how we navigate news online. Are people more likely to share information because their friends pass it along? And if we are more likely to share stories we see others post, what kinds of friends get us to reshare more often—close friends, or people we don’t interact with very often? Finally, the experiment allowed Bakshy to see how “novel information”—that is, information that you wouldn’t have shared if you hadn’t seen it on Facebook—travels through the network. This is important to our understanding of echo chambers. If an algorithm like EdgeRank favors information that you’d have seen anyway, it would make Facebook an echo chamber of your own beliefs. But if EdgeRank pushes novel information through the network, Facebook becomes a beneficial source of news rather than just a reflection of your own small world.

That’s exactly what Bakshy found. His paper is heavy on math and network theory, but here’s a short summary of his results. First, he found that the closer you are with a friend on Facebook—the more times you comment on one another’s posts, the more times you appear in photos together, etc.—the greater your likelihood of sharing that person’s links. At first blush, that sounds like a confirmation of the echo chamber: We’re more likely to echo our closest friends.

But here’s Bakshy’s most crucial finding: Although we’re more likely to share information from our close friends, we still share stuff from our weak ties—and the links from those weak ties are the most novel links on the network. Those links from our weak ties, that is, are most likely to point to information that you would not have shared if you hadn’t seen it on Facebook.•
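The design Manjoo describes — randomly withholding some links a user’s feed would otherwise have shown, then comparing share rates between the exposed and withheld groups — is a standard randomized holdout. A minimal simulation of that design (every rate, count and name below is made up for illustration, not a figure from the study):

```python
import random

random.seed(0)

# Hypothetical population: n_links shareable links reach one user's feed.
# base_rate: chance the user shares a link anyway, having seen it elsewhere.
# exposed_rate: chance the user shares after actually seeing it in the feed.
def run_trial(n_links=1000, base_rate=0.01, exposed_rate=0.05, holdout=0.5):
    exposed_shares = unexposed_shares = 0
    exposed_n = unexposed_n = 0
    for _ in range(n_links):
        if random.random() < holdout:           # link randomly withheld
            unexposed_n += 1
            if random.random() < base_rate:     # shared anyway (saw it off-Facebook)
                unexposed_shares += 1
        else:                                   # link shown in the feed
            exposed_n += 1
            if random.random() < exposed_rate:
                exposed_shares += 1
    # The gap between the two rates estimates how much "novel" sharing
    # the feed itself causes -- the quantity the study was after.
    return exposed_shares / exposed_n, unexposed_shares / unexposed_n

shown, withheld = run_trial()
print(f"share rate when shown: {shown:.3f}, when withheld: {withheld:.3f}")
```

Because exposure is assigned at random, any difference between the two rates can be attributed to the feed rather than to what kinds of links friends happen to share.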


From Mines:

What a great but disturbing question (the fact that you can even ask it). Weird question for me as for most of my career I have been traveling the world observing other countries in various states of dysfunction and answering this same question. In this case, if the standard is large-scale violence that requires the National Guard to deal with in the timeline you lay out, I would say about 60 percent.

I base that on the following factors:

— Entrenched national polarization of our citizenry with no obvious meeting place. (Not true locally, however, which could be our salvation; but the national issues are pretty fierce and will only get worse).

— Press and information flow is more and more deliberately divisive, and it’s increasingly easy to put out bad info and incitement.

— Violence is “in” as a method to solve disputes and get one’s way. The president modeled violence as a way to advance politically and validated bullying during and after the campaign.  Judging from recent events the left is now fully on board with this, although it has been going on for several years with them as well — consider the university events where professors or speakers are shouted down and harassed, the physically aggressive anti-Israeli events, and the anarchists during globalization events. It is like 1859, everyone is mad about something and everyone has a gun.

— Weak institutions — press and judiciary, that are being further weakened. (Still fairly strong and many of my colleagues believe they will survive, but you can do a lot of damage in four years, and your timeline gives them even more time).

— Total sellout of the Republican leadership, validating and in some cases supporting all of the above.•


Developing visual recognition in machines is helpful in performing visual tasks, of course, but this ability has the potential to open up Artificial Intelligence in much broader and more significant ways, providing AI with a context from which to more accurately “comprehend” the world. (I’m not even sure if the quotation marks in the previous sentence are necessary.)

In an interview conducted by Tom Simonite of Technology Review, Facebook’s Director of AI Research, Yann LeCun, explains that exposing machines to video will hopefully enable them to learn through observation as small children do. “That’s what would allow them to acquire common sense, in the end,” he says.

An excerpt:

Question:

Babies learn a lot about the world without explicit instruction, though.

Yann LeCun:

One of the things we really want to do is get machines to acquire the very large number of facts that represent the constraints of the real world just by observing it through video or other channels. That’s what would allow them to acquire common sense, in the end. These are things that animals and babies learn in the first few months of life—you learn a ridiculously large amount about the world just by observation. There are a lot of ways that machines are currently fooled easily because they have very narrow knowledge of the world.

Question:

What progress is being made on getting software to learn by observation?

Yann LeCun:

We are very interested in the idea that a learning system should be able to predict the future. You show it a few frames of video and it tries to predict what’s going to happen next. If we can train a system to do this we think we’ll have developed techniques at the root of an unsupervised learning system. That is where, in my opinion, a lot of interesting things are likely to happen. The applications for this are not necessarily in vision—it’s a big part of our effort in making progress in AI.•
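At its smallest, the objective LeCun describes is self-supervised sequence prediction: the training signal is free, because the “label” for each frame is simply the next frame. A toy one-dimensional sketch of that idea (a hand-rolled linear predictor on a made-up sequence — illustrative only, nothing like Facebook’s actual video models):

```python
# Toy self-supervised predictor: learn to predict the next value in a
# sequence from the current one. The target is just the next observation,
# so no human annotation is needed -- the essence of LeCun's proposal.
seq = [0.0, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0, 3.5]  # stand-in for video frames

w, b = 0.0, 0.0          # parameters of a linear one-step predictor
lr = 0.01
for _ in range(2000):    # SGD on squared prediction error
    for x, target in zip(seq[:-1], seq[1:]):
        pred = w * x + b
        grad = 2 * (pred - target)
        w -= lr * grad * x
        b -= lr * grad

print(round(w * 3.0 + b, 2))  # prediction for the "frame" after 3.0
```

The sequence steps by 0.5 each time, so the predictor converges toward w ≈ 1, b ≈ 0.5 without ever being told that rule — it falls out of the prediction objective alone.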


When it comes to technology, promises often sound like threats. 

In a very smart Edge piece, Chris Anderson, the former Wired EIC who’s now CEO of 3DRobotics, holds forth on closed-loop systems, which allow for processes to be monitored, measured and corrected–even self-corrected. As objects become “smart,” they can collect information about themselves, their users and their surroundings. In many ways, these feedback loops will be a boon, allowing (potentially) for smoother maintenance, a better use of resources and a cleaner environment. But the new arrangement won’t all be good.

The question Anderson posed, which I used as the headline, makes it sound like we’ll be able to control where such technology snakes, but I don’t think that’s true. It won’t get out of hand in a sci-fi thriller sense but in very quiet, almost imperceptible ways. There will hardly be a hum.

At any rate, Anderson’s story of how he built a drone company from scratch, first with the help of his children and then a 19-year-old kid with no college background from Tijuana, is amazing and a great lesson in globalized economics.

From Edge:

If we could measure the world, how would we manage it differently? This is a question we’ve been asking ourselves in the digital realm since the birth of the Internet. Our digital lives—clicks, histories, and cookies—can now be measured beautifully. The feedback loop is complete; it’s called closing the loop. As you know, we can only manage what we can measure. We’re now measuring on-screen activity beautifully, but most of the world is not on screens.                                 

As we get better and better at measuring the world—wearables, Internet of Things, cars, satellites, drones, sensors—we are going to be able to close the loop in industry, agriculture, and the environment. We’re going to start to find out what the consequences of our actions are and, presumably, we’ll take smarter actions as a result. This journey with the Internet that we started more than twenty years ago is now extending to the physical world. Every industry is going to have to ask the same questions: What do we want to measure? What do we do with that data? How can we manage things differently once we have that data? This notion of closing the loop everywhere is perhaps the biggest endeavor of our age.                                 

Closing the loop is a phrase used in robotics. Open-loop systems are when you take an action and you can’t measure the results—there’s no feedback. Closed-loop systems are when you take an action, you measure the results, and you change your action accordingly. Systems with closed loops have feedback loops; they self-adjust and quickly stabilize in optimal conditions. Systems with open loops overshoot; they miss it entirely. …

I use the phrase closing the loop because that’s the phrase we use in robotics. Other people might use the phrase big data. Before they called it big data, they called it data mining. Remember that? That was nuts. Anyway, we’re going to come up with a new word for it.                                 

It goes both ways: The tendrils of the Internet reach out through sensors, and then these sensors feed back to the Internet. The sensors get smarter because they’re connected to the Internet, and the Internet gets smarter because it’s connected to the sensors. This feedback loop extends beyond the industry that’s feeding back to the meta-industry, which is the Internet and the planet.•
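Anderson’s open- versus closed-loop distinction is easy to see in miniature. In the sketch below, a hypothetical heater either runs a fixed schedule (open loop) or adjusts its output from a temperature reading at every step (closed loop); all constants are invented for illustration:

```python
def simulate(closed_loop, steps=200, target=20.0):
    temp = 10.0  # starting room temperature
    for _ in range(steps):
        if closed_loop:
            # Closed loop: measure, compare to target, correct.
            power = 5.0 * (target - temp)   # proportional feedback
        else:
            # Open loop: fixed action, no measurement, no correction.
            power = 4.0
        # Heating effect minus heat loss to the outside.
        temp += 0.1 * power - 0.02 * (temp - 10.0)
    return temp

print(round(simulate(closed_loop=True), 1))   # settles near the 20.0 target
print(round(simulate(closed_loop=False), 1))  # sails past it; nothing corrects it
```

The closed-loop run stabilizes near the target because every step is adjusted by the measured error; the open-loop run overshoots, exactly the failure mode Anderson describes.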


In the latest round of what may be gamesmanship between the American Intelligence Community and Russia, Wikileaks released information about purported CIA spying techniques, which included, among other tricks of the trade, a way to remotely hack smart televisions so that the watchers would become the watched. Such methods should surprise no one.

What does startle me is how receptive Americans are to being watched, as if in this decentralized media age, we’ve accepted, finally and completely, that all the world actually is a stage. It goes far beyond the way we revel in the modern freak show of Reality TV or allow social networks access to our private lives in return for a cheap platform on which to peddle our personalities. As sensors and microchips proliferate, we’re gradually turning every object into a computer from which we can be monitored and quantified. Big Brother will eventually have several siblings in every room. The shock is that we’re so willing to be members of this non-traditional family.

Chance the gardener was all of us when he said, “I like to watch.” Apparently, we also like to be watched.

The opening of Sapna Maheshwari’s smart New York Times piece on the topic:

While Ellen Milz and her family were watching the Olympics last summer, their TV was watching them.

Ms. Milz, 48, who lives with her husband and three children in Chicago, had agreed to be a panelist for a company called TVision Insights, which monitored her viewing habits — and whether her eyes flicked down to her phone during the commercials, whether she was smiling or frowning — through a device on top of her TV.

“The marketing company said, ‘We’re going to ask you to put this device in your home, connect it to your TV and they’re going to watch you for the Olympics to see how you like it, what sports, your expression, who’s around,’” she said. “And I said, ‘Whatever, I have nothing to hide.’”

Ms. Milz acknowledged that she had initially found the idea odd, but that those qualms had quickly faded.

“It’s out of sight, out of mind,” she said, comparing it to the Nest security cameras in her home. She said she had initially received $60 for participating and an additional $230 after four to six months.

TVision — which has worked with the Weather Channel, NBC and the Disney ABC Television Group — is one of several companies that have entered living rooms in recent years, emerging with new, granular ways for marketers to understand how people are watching television and, in particular, commercials. The appeal of this information has soared as Americans rapidly change their viewing habits, streaming an increasing number of shows weeks or months after they first air, on devices as varied as smartphones, laptops and Roku boxes, not to mention TVs.

Through the installation of a Microsoft Kinect device, normally used for Xbox video games, on top of participants’ TVs, TVision tracks the movement of people’s eyes in relation to the television. The device’s sensors can record minute shifts for all the people in the room.•


Margaret Atwood is in an odd position: As our politics get worse, her stature grows. Right now, sadly (for us), she’s never towered higher.

Appropriate that on International Women’s Day and the A Day Without a Woman protests, the Handmaid’s Tale novelist conducted a Reddit Ask Me Anything to coincide with the soon-to-premiere Hulu version of her most famous work. Dystopia, feminism and literature are, of course, among the discussion topics. A few exchanges follow.


Question:

Thank you so much for writing The Handmaid’s Tale. It was the book that got me hooked on dystopian novels. What was your inspiration for the story?

Margaret Atwood:

Ooo, three main things: 1) What some people said they would do re: women if they had the power (they have it now and they are); 2) 17th C Puritan New England, plus history through the ages — nothing in the book that didn’t happen, somewhere and 3) the dystopian spec fics of my youth, such as 1984, Ray Bradbury’s Fahrenheit 451, etc. I wanted to see if I could write one of those, too.


Question:

What would you be doing right now if you were an American? Would you run for office? Would you protest? Would you be planning to resist ICE?

Margaret Atwood:

I would make a very bad politician, so no, I wouldn’t run for office. But I would support those who were running. I would certainly turn out for protests, as I did here in Toronto, wearing a rather strange pink hat. I don’t know what else I would do! We are in a time when reality seems to shift every day…


Question:

What is a book you keep going back to read and why?

Margaret Atwood:

This is going to sound corny but Shakespeare is my return read. He knew so much about human nature (+ and minus) and also was an amazing experimenter with language. But there are many other favourites. Wuthering Heights recently. In moments of crisis I go back to (don’t laugh) Lord of the Rings, b/c despite the EVIL EYE OF MORDOR it comes out all right in the end. Whew.


Question:

How, if at all, has your feminism changed over the last decade or so? Can you see these changes taking place throughout your literature? Lastly, can you offer any advice for feminists of the millennial generation? What mistakes are we making/repeating? What are our priorities in this political climate?

Margaret Atwood:

Hello: I am so shrieking old that my formative years (the 40s and 50s) took place before 2nd wave late-60’s feminist/women’s movement. But since I grew up largely in the backwoods and had strong female relatives and parents who read a lot and never told me I couldn’t do such and such because of being a girl, I avoided the agit-prop of the 50s that said women should be in bungalows with washing machines to make room for men coming back from the war. So I was always just very puzzled by some of the stuff said and done by/around women. I was probably a danger to myself and others! (joke) My interest was in women of all kinds — and they are of all kinds. They are interesting in and of themselves, and they do not always behave well. But then I learned more about things like laws and other parts of the world, and history… try Marilyn French’s From Eve to Dawn, pretty massive. We are now in what is being called the 3rd wave — seeing a lot of pushback against women, and also a lot of women pushing back in their turn. I’d say in general: be informed, be aware. The priorities in the US are roughly trying to prevent the roll-back that is taking place especially in the area of women’s health. Who knew that this would ever have to be defended? Childbirth care, pre-natal care, early childhood care — many people will not even be able to afford any of it. Dead bodies on the floor will result. It is frightful. Then there is the whole issue of sexual violence being used as control — it is such an old motif. For a theory of why now, see Eve’s Seed. It’s an unsettled time. If I were a younger woman I’d be taking a self-defense course. I did once take Judo, in the days of the Boston Strangler, but it was very lady-like then and I don’t think it would have availed. There’s something called Wen-Do. It’s good, I am told.


Question:

The Handmaid’s Tale gets thrown out as your current worst-case scenario right now but I read The Heart Goes Last a few months ago and I was surprised how possible it felt. Was there a specific news story or event that compelled you to write that particular story?

Margaret Atwood:

The Heart Goes Last — yes, came from my interest in what happens when a region’s economy collapses and people are really up against it, and the only “business” in which people can have jobs is a prison. It pushes the envelope (will there really be some Elvis robots?) but again, much of what was only speculation then is increasingly possible.


Question:

How did your experience with the 2017 version differ from the 1990 version of The Handmaid’s Tale?

Margaret Atwood:

Different times (that world is closer now!) and a 90 minute film is a different proposition from a 10 part 1st season series, which can build out and deep dive because it has more time. The advent of high-quality streamed or televised series has opened up a whole new set of possibilities for longer novels. We launched the 1990 film in West and then East Berlin just as the Wall was coming down… and I started writing book when the Wall was still there… Framed it in people’s minds in a different way. Also, then, many people were saying “It can’t happen here.” Now, not so much….•


Andrew Ng’s predictions about Artificial Intelligence carry more weight with me than the projections of many of his peers because he never seems driven by irrational exuberance. In fact, he often urges caution when talk about the imminent arrival of driverless cars and other landscape-changing tools becomes overheated. 

So, when Baidu’s Chief Scientist asserts AI will soon deliver to us a brave new world, one in which, for instance, speech recognition is all but perfected, it’s probably wise to take notice. Computer conversation that’s wholly convincing should give us pause, however. Any technology that becomes seamless should be met as much by concern as enthusiasm.

An excerpt from a smart Wall Street Journal interview Scott Austin conducted with Ng and Neil Jacobstein of Singularity University:

Andrew Ng:

In addition to strengthening our core business, AI is creating a lot of new opportunities. Just as about 100 years ago electrification changed every single major industry, I think we’re in the phase where AI will change pretty much every major industry.

So part of my work at Baidu is to systematically explore new verticals. We have built up an autonomous driving unit. We have a conversational computer, similar to Amazon’s Alexa and Google Home. And we’re systematically pursuing new industries where we think we can build an AI team to create and capture value.

Question:

Let’s talk about speech recognition. I believe someone in your program has said that the hope is to get to the point where it is 99% accurate. Where are you on that?

Andrew Ng:

A couple of years ago, we started betting heavily on speech recognition because we felt that it was on the cusp of being so accurate that you would use it all the time. And the difference between speech recognition that is 95% accurate, which is where we were several years ago, versus 99% accuracy isn’t just an incremental improvement.

It’s the difference between you barely using it, like a couple of years ago, versus you using it all the time and not even thinking about it. At Baidu we have passed the knee of that adoption curve. Over the past year, we’ve seen about 100% year-to-year growth in the daily active use of speech recognition across our assets, and we project that this will continue to grow.

In a few years everyone will be using speech recognition. It will feel natural. You’ll soon forget what it was like before you could talk to computers.•
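
Ng’s framing becomes clearer if you look at error rates rather than accuracy: the jump from 95% to 99% cuts misrecognized words fivefold, which is what pushes users past the “knee” of the adoption curve he describes. A minimal illustrative sketch (the percentages are from the interview; the function is just arithmetic):

```python
# Back-of-envelope arithmetic behind Ng's 95%-vs-99% point:
# accuracy gains look incremental, but the error rate is what users feel.

def expected_errors(accuracy_pct: int, words: int) -> float:
    """Expected number of misrecognized words in a dictation of `words`
    words at a given per-word accuracy (in percent)."""
    return (100 - accuracy_pct) * words / 100

# A 100-word dictated message:
old = expected_errors(95, 100)  # 5.0 errors -- you barely use it
new = expected_errors(99, 100)  # 1.0 error  -- you stop thinking about it
print(old, new, old / new)      # 5.0 1.0 5.0
```

At 95%, a 100-word dictated message needs roughly five corrections; at 99%, one. That is the gap between a tool you tolerate and one you no longer notice.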

Tags: , ,

Economist Tyler Cowen just did a fun Ask Me Anything at Reddit, discussing driverless cars, the Hyperloop, wealth redistribution, Universal Basic Income, the American Dream, etc.

Cowen also discusses Peter Thiel’s role in the Trump Administration, though his opinion seems too coy. We’re not talking about someone who just so happens to work for a “flawed” Administration but a serious supporter of a deeply racist campaign to elect a wholly unqualified President and empower a cadre of Breitbart bigots. Trump owns the mess he’s creating, but Thiel does also. The most hopeful thing you can say about the Silicon Valley billionaire, who was also sure there were WMDs in Iraq, is that outside of his realm he has no idea what he’s doing. The least hopeful is that he’s just not a good person.

A few exchanges follow.


Question:

What is an issue or concept in economics that you wish were easier to explain so that it would be given more attention by the public?

Tyler Cowen:

The idea that a sound polity has to be based on ideas other than just redistribution of wealth.


Question:

What do you think about Peter Thiel’s relationship with President Trump?

Tyler Cowen:

I haven’t seen Peter since his time with Trump. I am not myself a Trump supporter, but wish to reserve judgment until I know more about Peter’s role. I am not in general opposed to the idea of people working with administrations that may have serious flaws.


Question:

In a recent article by you, you spoke about who in the US was experiencing the American Dream, finding evidence that the Dream is still alive and thriving for Hispanics in the U.S. What challenges do you perceive now with the new Administration that might reduce the prospects for this group?

Tyler Cowen:

Breaking up families, general feeling of hostility, possibly damaging the economy of Mexico and relations with them. All bad trends. I am hoping the strong and loving ties across the people themselves will outweigh that. We will see, but on this I am cautiously optimistic.


Question:

Do you think convenience apps like Amazon grocery make us more complacent?

Tyler Cowen:

Anything shipped to your home — worry! Getting out and about is these days underrated. Serendipitous discovery and the like. Confronting the physical spaces we have built, and, eventually, demanding improvements in them.


Question:

Given that a universal basic income or similar scheme will become a necessity after large-scale automation kicks in, will these arguments about fiscal and budgetary crisis still hold true?

And with self-driving cars and tech like the Hyperloop, wouldn’t the rents in the cities go down?

Tyler Cowen:

Driverless cars are still quite a while away in their most potent form, as that requires redoing the whole infrastructure. But so far I see location only becoming more important, even in light of tech developments, such as the internet, that were supposed to make it less important. It is hard for me to see how a country with so many immigrants will tolerate a UBI. I think that idea is for Denmark and New Zealand, I don’t see it happening in the United States. Plus it can cost a lot too. So the arguments about fiscal crisis I think still hold.


Question:

What is the most underrated city in the US? In the world?

Tyler Cowen:

Los Angeles is my favorite city in the whole world, just love driving around it, seeing the scenery, eating there. I still miss living in the area.


Question:

I am a single guy. Can learning economics help me find a girlfriend?

Tyler Cowen:

No, it will hurt you. Run the other way!•

Tags: ,

In the Financial Times interview with Daniel Dennett I recently blogged about, a passage covers a compelling idea hatched by the philosopher and MIT’s Deb Roy in “Our Transparent Future,” a 2015 Scientific American article. The academics argue that the radical transparency now taking hold because of new technological tools, which will only grow more profound as we are lowered even further into a machine with no OFF switch, is akin to the circumstances that may have catalyzed the Cambrian explosion.

In that epoch, it might have been an abundance of light shining through a newly transparent atmosphere that forced organisms to adapt and led to tremendous growth–and also death. Dennett and Roy believe that society’s traditional institutions (government, marriage, education, etc.) are facing the same challenge–reinvent themselves or go extinct–due to the tremendous flow of information we have today at our fingertips. Now that privacy is all but impossible, what is the best way to arrange ourselves? 

The opening:

More than half a billion years ago a spectacularly creative burst of biological innovation called the Cambrian explosion occurred. In a geologic “instant” of several million years, organisms developed strikingly new body shapes, new organs, and new predation strategies and defenses against them. Evolutionary biologists disagree about what triggered this prodigious wave of novelty, but a particularly compelling hypothesis, advanced by University of Oxford zoologist Andrew Parker, is that light was the trigger. Parker proposes that around 543 million years ago, the chemistry of the shallow oceans and the atmosphere suddenly changed to become much more transparent. At the time, all animal life was confined to the oceans, and as soon as the daylight flooded in, eyesight became the best trick in the sea. As eyes rapidly evolved, so did the behaviors and equipment that responded to them. 

Whereas before all perception was proximal—by contact or by sensed differences in chemical concentration or pressure waves—now animals could identify and track things at a distance. Predators could home in on their prey; prey could see the predators coming and take evasive action. Locomotion is a slow and stupid business until you have eyes to guide you, and eyes are useless if you cannot engage in locomotion, so perception and action evolved together in an arms race. This arms race drove much of the basic diversification of the tree of life we have today.

Parker’s hypothesis about the Cambrian explosion provides an excellent parallel for understanding a new, seemingly unrelated phenomenon: the spread of digital technology. Although advances in communications technology have transformed our world many times in the past—the invention of writing signaled the end of prehistory; the printing press sent waves of change through all the major institutions of society—digital technology could have a greater impact than anything that has come before. It will enhance the powers of some individuals and organizations while subverting the powers of others, creating both opportunities and risks that could scarcely have been imagined a generation ago. 

Through social media, the Internet has put global-scale communications tools in the hands of individuals. A wild new frontier has burst open. Services such as YouTube, Facebook, Twitter, Tumblr, Instagram, WhatsApp and SnapChat generate new media on a par with the telephone or television—and the speed with which these media are emerging is truly disruptive. It took decades for engineers to develop and deploy telephone and television networks, so organizations had some time to adapt. Today a social-media service can be developed in weeks, and hundreds of millions of people can be using it within months. This intense pace of innovation gives organizations no time to adapt to one medium before the arrival of the next.

The tremendous change in our world triggered by this media inundation can be summed up in a word: transparency. We can now see further, faster, and more cheaply and easily than ever before—and we can be seen. And you and I can see that everyone can see what we see, in a recursive hall of mirrors of mutual knowledge that both enables and hobbles. The age-old game of hide-and-seek that has shaped all life on the planet has suddenly shifted its playing field, its equipment and its rules. The players who cannot adjust will not last long.

The impact on our organizations and institutions will be profound. Governments, armies, churches, universities, banks and companies all evolved to thrive in a relatively murky epistemological environment, in which most knowledge was local, secrets were easily kept, and individuals were, if not blind, myopic. When these organizations suddenly find themselves exposed to daylight, they quickly discover that they can no longer rely on old methods; they must respond to the new transparency or go extinct.•

Tags: ,

A little more on Cambridge Analytica, which Carole Cadwalladr reported on recently in the Guardian. The audience-targeting company is being given significant credit by some for powering the Trump campaign to Electoral College victory. My main hesitation with believing online fake news was a predominant factor in the recent election is that Trump won overwhelmingly with older Americans, who seem to have been more plugged into Fox News than Facebook. It may have played a role, but did it really have a greater impact than, say, a legacy-media company like the New York Times, running an un-skeptical headline above the fold about the FBI suspiciously reopening the investigation into the Clinton emails? Or Russia hacking the election? It’s hard to untangle what just went on, but looking for a single smoking gun will probably always prove unsatisfactory. 

From Nicholas Confessore and Danny Hakim of the New York Times:

Cambridge Analytica’s rise has rattled some of President Trump’s critics and privacy advocates, who warn of a blizzard of high-tech, Facebook-optimized propaganda aimed at the American public, controlled by the people behind the alt-right hub Breitbart News. Cambridge is principally owned by the billionaire Robert Mercer, a Trump backer and investor in Breitbart. Stephen K. Bannon, the former Breitbart chairman who is Mr. Trump’s senior White House counselor, served until last summer as vice president of Cambridge’s board.

But a dozen Republican consultants and former Trump campaign aides, along with current and former Cambridge employees, say the company’s ability to exploit personality profiles — “our secret sauce,” Mr. Nix once called it — is exaggerated.

Cambridge executives now concede that the company never used psychographics in the Trump campaign. The technology — prominently featured in the firm’s sales materials and in media reports that cast Cambridge as a master of the dark campaign arts — remains unproved, according to former employees and Republicans familiar with the firm’s work.

“They’ve got a lot of really smart people,” said Brent Seaborn, managing partner of TargetPoint, a rival business that also provided voter data to the Trump campaign. “But it’s not as easy as it looks to transition from being excellent at one thing and bringing it into politics. I think there’s a big question about whether we think psychographic profiling even works.”•

Tags: , ,

Late to Industrialization, China entered the process knowing what much of the Western world had to learn the hard way in the 1970s: Urbanizing and modernizing an entire nation brings with it tremendous economic growth, but it can’t be sustained by the same methods–or perhaps at all–when the mission is complete. It’s a one-time-only bargain.

A richer nation can’t grow endlessly on the production of cheap exports, so the newly minted superpower is pivoting more to domestic demand, a nuance no doubt lost in the Trump Administration’s ham-handed appreciation of global politics. In “Trump’s Most Chilling Economic Lie,” a Joseph Stiglitz Vanity Fair “Hive” article, the economist highlights the insanity of America engaging in a trade war with China and expecting to emerge the richer. An excerpt:

Trump’s team may be tempted to conclude, naively, that because China exports so much more to the U.S. than the U.S. exports to China, the loss of a huge export market would hurt them more than it would hurt us. This reasoning is too simplistic by half. China’s government has far more control over the country’s economy than our government has over ours; and it is moving from export dependence to a model of growth driven by domestic demand. Any restriction on exports to the U.S. would simply accelerate a process already underway. Moreover, China’s government has the resources (it’s still sitting on some $3 trillion of reserves) and instruments to help any sector that has been shut out—and in this respect, too, China is better placed than the U.S.

China has already shown how it is likely to respond if Trump should launch a trade war. At Davos, President Xi Jinping came out as the great supporter of globalization and the international rule of law—as well China should. China, with its large emerging middle class, is among the big beneficiaries of globalization. Critics have said that China does not always play fair. They complain that as China has grown, it has taken away some of the privileges, some of the tax preferences, that it gave to foreigners in earlier stages of development. They are unhappy, too, that some Chinese firms have learned quickly how to compete—some of them even appropriating ideas from others, just as we appropriated intellectual property from Europe more than a century ago.

It is worth noting that, although large multinationals complain, they are not leaving. And we tend to forget the extensive restrictions we impose on Chinese firms when they seek to invest in the U.S. or buy high-tech products. Indeed, the Chinese frequently point out that if the U.S. lifted those restrictions, America’s trade deficit with China would be smaller.

China’s first response will be to try to find areas of cooperation. They are experts in construction. They know how to build high-speed trains. They might even provide some financing for these projects. Given Trump’s rhetoric, though, I suspect that such cooperation is just a dream.

If Trump insists on an adversarial stance, China is likely to respond within the framework of international law even if Trump puts little weight on such agreements—and thus is not likely to retaliate in a naive, tit-for-tat way. But China has made it clear that it will respond. And if history is any guide, it will respond both forcefully and intelligently, hitting us where it hurts economically and politically—where, for instance, cutbacks in purchases by China will lead to more unemployment in congressional districts that are vulnerable, influential, or both. If Boeing’s order book is thin, it might, for instance, cancel its purchases of Boeing planes.•

Tags: ,

Some things take on a life of their own, even when they’re not actually living. AI will likely slot into this category.

When Bill Gates suggests America tax robots not only to provide opportunities to those squeezed from the middle class but also to slow down innovation, he’s analyzing the situation as if it existed in a vacuum. That’s not the way technology proceeds.

Developing new tools is part of a constant competition among and within states and corporations. Innovating is paramount to maintaining a competitive edge in the marketplace or battlefield. Of course, holding your position or gaining an advantage means a nation (or nations) may be careering down a perilous path. Being militarily prepared for danger can, paradoxically, be very dangerous. The same will be true of biotech.

From Morgan Chalfant’s The Hill story about the surprising speed with which robotic soldiers may be reporting for duty:

Peter Singer, a strategist at the New America Foundation, said that artificial intelligence is among potential “disruptions” being developed in the realm of cyber conflict. 

“It’s not just when is it going to happen, but we don’t yet know is it going to privilege the offense or defense, what are going to be the effects of it,” Singer said, recommending that Congress hold a classified hearing on where the U.S. stands in comparison to likely adversaries on this capability. 

“We don’t want to fall behind,” he said. 

Healey expressed concerns about the possibility of artificial intelligence augmenting our adversaries’ offensive capabilities more significantly than the United States’ defense of its critical infrastructure. 

“The part of it that particularly worries me the most is that on the defensive side many people are thinking that artificial intelligence, new heuristics, better analytics and automation are going to help the defense, that if only we can roll these things out faster we will be better and the system will be more stable,” Healey explained.

“I think that these technologies are going to aid the offense much more than it aids the defense because to defend against these kinds of attacks, you need your own super computer,” he continued. 

Healey warned that while the Pentagon can afford computer systems necessary to defend against adversaries using artificial intelligence, small- or mid-sized enterprises that own U.S. critical infrastructure cannot.

“It leaves much of America undefended,” he said.•

Tags:

Francis Ford Coppola’s The Conversation is the “little” 1974 psychological thriller he squeezed in between the first two Godfather films, which fast-forwarded the disquiet of Antonioni’s Blow-Up into the Watergate Era, even if the director has always considered it more a personal than political film. The movie, which hangs on San Francisco surveillance expert Harry Caul’s descent into madness, remains a classic and has actually grown in stature as the Digital Age replaced the analog one. When I wrote briefly about the cerebral movie six years ago, I concluded with this:

In the era that saw the downfall of an American President who listened to the tapes of others and erased his own, The Conversation was amazingly relevant, but in some ways it may be even more meaningful in this exhibitionist age, in which we gleefully hand over our privacy to satisfy our egos. As Caul and Nixon learned, and as we may yet, those who press PLAY don’t always get to choose when to press STOP.•

This weekend, we had a sitting American President (baselessly) accuse his predecessor of tapping his phone lines, all the while the Intelligence Community searches for real tapes of this Administration’s officials conspiring with the Kremlin during the campaign. Such collusion would be treasonous.

It’s not shocking that Trump’s viciously ugly brand of nostalgia has forced us backwards into a Cold War type of paranoia, in which 20th-century espionage is predominant. The greater insight to take from The Conversation may be more about the near future, however, when nobody has to hit PLAY because there’s no longer a STOP.

In an amazing find, the good people at Cinephilia & Beyond published a 1974 Filmmakers Newsletter interview in which Brian De Palma quizzed Coppola about this masterwork. It’s more a discussion of cinema than of Watergate, and there’s a very interesting exchange in which the subject reveals why he doesn’t regard Hitchcock with awe.

Here’s the opening:


Here’s a wonderful making-of featurette about The Conversation, which asked questions about a world where everyone is both spy and spied upon. The surprise more than 40 years later: Few seemed upset as we crept into the new order of the techno-society. We haven’t been trapped after all; we’ve logged on and signed up for it.

Tags: ,

When the idea of online personalization was first presented to me by a Web 1.0 entrepreneur, I was unimpressed and uninterested, saying I preferred newspapers and magazines introducing me to ideas I hadn’t been seeking. You could have your very own tailor-made newspaper or magazine, I was told, but I insisted that wasn’t what I would subscribe to. I naively believed others would feel the same way.

When the very young version of Mark Zuckerberg (who is still young) infamously said “a squirrel dying in your front yard may be more relevant to your interests right now than people dying in Africa,” he betrayed himself as stunningly callous at that point in his life but also demonstrated the myopia of personalization. Our new tools can enable us to live in a bubble, but that doesn’t mean they ennoble us.

Targeted information helps maximize online advertising revenue, but it’s lousy for democracy, paradoxically offering us greater freedom while simultaneously undermining it. Whether the Internet is inherently at odds with liberty is a valid question. And considering Fox News has been fake news for more than 20 years, the decentralized media in general has to be similarly analyzed.


The opening of Tim Harford’s latest Financial Times column:

“Our goal is to build the perfect personalised newspaper for every person in the world,” said Facebook’s Mark Zuckerberg in 2014. This newspaper would “show you the stuff that’s going to be most interesting to you.”

To many, that statement explains perfectly why Facebook is such a terrible source of news.

A “fake news” story proclaiming that Pope Francis had endorsed Donald Trump was, according to an analysis from BuzzFeed, the single most successful item of news on Facebook in the three months before the US election. If that’s what the site’s algorithms decide is interesting, it’s far from being a “perfect newspaper.”

It’s no wonder that Zuckerberg found himself on the back foot after Trump’s election. Shortly after his victory, Zuckerberg declared: “I think the idea that fake news on Facebook, which is a very small amount of the content, influenced the election in any way . . . is a pretty crazy idea.” His comment was greeted with a scornful response.

I should confess my own biases here. I despise Facebook for all the reasons people usually despise Facebook (privacy, market power, distraction, fake-smile social interactions and the rest). And, as a loyal FT columnist, I need hardly point out that the perfect newspaper is the one you’re reading right now.

But, despite this, I’m going to stand up for Zuckerberg, who recently posted a 5,700-word essay defending social media. What he says in the essay feels like it must be wrong. But the data suggest that he’s right. Fake news can stoke isolated incidents of hatred and violence. But neither fake news nor the algorithmically driven “filter bubble” is a major force in the overall media landscape. Not yet.•


From Eli Pariser in the New York Times in 2011:

Like the old gatekeepers, the engineers who write the new gatekeeping code have enormous power to determine what we know about the world. But unlike the best of the old gatekeepers, they don’t see themselves as keepers of the public trust. There is no algorithmic equivalent to journalistic ethics.

Mark Zuckerberg, Facebook’s chief executive, once told colleagues that “a squirrel dying in your front yard may be more relevant to your interests right now than people dying in Africa.” At Facebook, “relevance” is virtually the sole criterion that determines what users see. Focusing on the most personally relevant news — the squirrel — is a great business strategy. But it leaves us staring at our front yard instead of reading about suffering, genocide and revolution.

There’s no going back to the old system of gatekeepers, nor should there be. But if algorithms are taking over the editing function and determining what we see, we need to make sure they weigh variables beyond a narrow “relevance.” They need to show us Afghanistan and Libya as well as Apple and Kanye.

Companies that make use of these algorithms must take this curative responsibility far more seriously than they have to date. They need to give us control over what we see — making it clear when they are personalizing, and allowing us to shape and adjust our own filters. We citizens need to uphold our end, too — developing the “filter literacy” needed to use these tools well and demanding content that broadens our horizons even when it’s uncomfortable.

It is in our collective interest to ensure that the Internet lives up to its potential as a revolutionary connective medium. This won’t happen if we’re all sealed off in our own personalized online worlds.•
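
Pariser’s call for algorithms that weigh variables beyond a narrow “relevance” can be made concrete with a toy ranking function. What follows is a hypothetical sketch, not any platform’s actual algorithm; the item scores and the `diversity_weight` knob are invented purely for illustration:

```python
# Toy feed-ranking sketch (hypothetical): pure personal "relevance"
# versus a blend that also weights novelty, surfacing unfamiliar topics.

from typing import List, Tuple

def rank(items: List[Tuple[str, float, float]], diversity_weight: float) -> List[str]:
    """items: (title, relevance, novelty), scores in [0, 1].
    diversity_weight=0 reproduces pure-relevance ranking."""
    scored = sorted(
        items,
        key=lambda it: (1 - diversity_weight) * it[1] + diversity_weight * it[2],
        reverse=True,
    )
    return [title for title, _, _ in scored]

feed = [
    ("squirrel dies in your front yard", 0.9, 0.1),
    ("famine in a distant country", 0.3, 0.9),
]
print(rank(feed, 0.0))  # pure relevance: the squirrel wins
print(rank(feed, 0.6))  # diversity-weighted: the faraway story surfaces
```

With the weight at zero, the dying squirrel wins every time; a modest diversity term is enough to show us, in Pariser’s phrase, Afghanistan and Libya as well as Apple and Kanye.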

Tags: , ,

In his NYRB review of Daniel Dennett’s From Bacteria to Bach and Back: The Evolution of Minds, Thomas Nagel is largely laudatory, even though he believes his fellow philosopher is ultimately guilty of “maintaining a thesis at all costs,” writing:

Dennett believes that our conception of conscious creatures with subjective inner lives—which are not describable merely in physical terms—is a useful fiction that allows us to predict how those creatures will behave and to interact with them.

Nagel draws an analogy between Dennett’s ideas and the Behaviorism of B.F. Skinner and other mid-century psychologists, a theory that never was truly satisfactory in explaining the human mind. Dennett’s belief that we’re more machine-like than we want to believe is probably accurate, though his assertion that all consciousness is illusory–if that’s what he’s arguing–seems off.

Dennett’s life work on consciousness and evolution has certainly crested at the right moment, as we’re beginning to wonder in earnest about AI and non-human consciousness, which seems possible at some point if not on the immediate horizon. In a Financial Times interview conducted by John Thornhill, Dennett speaks to the nature and future of robotics.

An excerpt:

AI experts tend to draw a sharp distinction between machine intelligence and human consciousness. Dennett is not so sure. Where many worry that robots are becoming too human, he argues humans have always been largely robotic. Our consciousness is the product of the interactions of billions of neurons that are all, as he puts it, “sorta robots”.

“I’ve been arguing for years that, yes, in principle it’s possible for human consciousness to be realised in a machine. After all, that’s what we are,” he says. “We’re robots made of robots made of robots. We’re incredibly complex, trillions of moving parts. But they’re all non-miraculous robotic parts.” …

Dennett has long been a follower of the latest research in AI. The final chapter of his book focuses on the subject. There has been much talk recently about the dangers posed by the emergence of a superintelligence, when a computer might one day outstrip human intelligence and assume agency. Although Dennett accepts that such a superintelligence is logically possible, he argues that it is a “pernicious fantasy” that is distracting us from far more pressing technological problems. In particular, he worries about our “deeply embedded and generous” tendency to attribute far more understanding to intelligent systems than they possess. Giving digital assistants names and cutesy personas worsens the confusion.

“All we’re going to see in our own lifetimes are intelligent tools, not colleagues. Don’t think of them as colleagues, don’t try to make them colleagues and, above all, don’t kid yourself that they’re colleagues,” he says.

Dennett adds that if he could lay down the law he would insist that the users of such AI systems were licensed and bonded, forcing them to assume liability for their actions. Insurance companies would then ensure that manufacturers divulged all of their products’ known weaknesses, just as pharmaceutical companies reel off all their drugs’ suspected side-effects. “We want to ensure that anything we build is going to be a systemological wonderbox, not an agency. It’s not responsible. You can unplug it any time you want. And we should keep it that way,” he says.•

Tags: , ,

Ezra Klein of Vox has an excellent interview about meditation and much more with Yuval Noah Harari, though I don’t know that I’m buying the main premise, which is that the Israeli historian can so ably communicate such cogent ideas because of his adherence to this “mind-clearing” practice.

If that’s so, then I would have to suppose Harari was meditating far less while writing Homo Deus than when composing Sapiens, because the follow-up, while still worth reading, is not nearly as incisive or effective as his first book. (Jennifer Senior had a very good review of the sophomore effort in the New York Times.)

What separates Harari from other historians trying to communicate with a lay audience is his ability to brilliantly synthesize ideas in a very organic way. Even when I’m not sure if I’m totally buying one of these combinations (e.g., Alan Turing created a test in which a computer could pass for a human because he spent his brief, tragic life trying to pass for heterosexual), it still provokes me to think deeply on the subject.

I would assume this talent is more a quirk of his own brain chemistry and diligent development of natural gifts than anything else. Of course, meditation could be aiding in the process, or, perhaps, the practice of Vipassana is more correlation than causation. I doubt even Harari truly knows for sure.

The two opening exchanges:

Ezra Klein:

You told the Guardian that without meditation, you’d still be researching medieval military history — but not the Neanderthals or cyborgs. What changes has meditation brought to your work as a historian?

Yuval Noah Harari:

Two things, mainly. First of all, it’s the ability to focus. When you train the mind to focus on something like the breath, it also gives you the discipline to focus on much bigger things and to really tell the difference between what’s important and everything else. This is a discipline that I have brought to my scientific career as well. It’s so difficult, especially when you deal with long-term history, to get bogged down in the small details or to be distracted by a million different tiny stories and concerns. It’s so difficult to keep reminding yourself what is really the most important thing that has happened in history or what is the most important thing that is happening now in the world. The discipline to have this focus I really got from the meditation.

The other major contribution, I think, is that the entire exercise of Vipassana meditation is to learn the difference between fiction and reality, what is real and what is just stories that we invent and construct in our own minds. Almost 99 percent you realize is just stories in our minds. This is also true of history. Most people, they just get overwhelmed by the religious stories, by the nationalist stories, by the economic stories of the day, and they take these stories to be the reality.

My main ambition as a historian is to be able to tell the difference between what’s really happening in the world and what are the fictions that humans have been creating for thousands of years in order to explain or in order to control what’s happening in the world.

Ezra Klein:

One of the ideas that is central to your book Sapiens is that the central quality of Homo sapiens, what has allowed us to dominate the earth, is the ability to tell stories and create fictions that permit widespread cooperation in a way other species can’t. And what you count as fiction ranges all the way from early mythology to the Constitution of the United States of America.

I wouldn’t have connected that to the way meditation changes what you see as real, but it makes sense that if you’re observing the way your mind creates imaginary stories, maybe much more ends up falling into that category than you originally thought.

Yuval Noah Harari:

Yes, exactly. We seldom realize it, but all large-scale human cooperation is based on fiction. This is most clear in the case of religion, especially other people’s religion. You can easily understand that, yes, millions of people come together to cooperate in a crusade or a jihad or to build the cathedral or a synagogue because all of them believe some fictional story about God and heaven and hell.

What is much more difficult to realize is that exactly the same dynamic operates in all other kinds of human cooperation. If you think about human rights, human rights are a fictional story just like God and heaven. They are not a biological reality. Biologically speaking, humans don’t have rights. If you take Homo sapiens and look inside, you find the heart and the kidneys and the DNA. You don’t find any rights. The only place rights exist is in the stories that people have been inventing.

Another very good example is money. Money is probably the most successful story ever told. It has no objective value. It’s not like a banana or a coconut. If you take a dollar bill and look at it, you can’t eat it. You can’t drink it. You can’t wear it. It’s absolutely worthless. We think it’s worth something because we believe a story. We have these master storytellers of our society, our shamans — they are the bankers and the financiers and the chairperson of the Federal Reserve, and they come to us with this amazing story that, “You see this green piece of paper? We tell you that it is worth one banana.”

If I believe it and you believe it and everybody believes it, it works. It actually works. I can take this worthless piece of paper, go to a complete stranger who I never met before, give him this piece of paper, and he in exchange will give me a real banana that I can eat.

This is really amazing, and no other animal can do it. Other animals sometimes trade. Chimpanzees, for example, they trade. You give me a coconut. I’ll give you a banana. That can work with a chimpanzee, but you give me a worthless piece of paper and you expect me to give you a banana? That will never work with a chimpanzee.

This is why we control the world, and not the chimpanzees.•


Rebuilding the world has to rank at the top of the economic low-hanging fruit of the last century. America, its forces marshaled, played a leading role in piecing together the shattered globe in the wake of WWII. Yes, four decades of unfortunate tax rates, globalization, automation and the demise of unions have all abetted the decline of the U.S. middle class, but just as true is that the good times simply ended, the job completed (more or less), the outlier ran headlong into entropy. The contents of this half-empty glass finally spilled all over the world in 2016, provoking outrageously regressive political shifts, with perhaps more states becoming submerged this year.

As An Extraordinary Time author Marc Levinson wrote in 2016 in the Wall Street Journal: “The quarter-century from 1948 to 1973 was the most striking stretch of economic advance in human history. In the span of a single generation, hundreds of millions of people were lifted from penury to unimagined riches.” In “End of a Golden Age,” an Aeon essay, the economist and journalist further argues the global circumstances of the postwar era were a one-time-only opportunity for runaway productivity, a fortunate arrangement of stars likely to never align again.

Well, never is an extremely long stretch (we hope), but the economic-growth-rate promises brought to the trail by Sanders and Trump, which have made it to the White House with the unfortunate election of the latter candidate, were at best fanciful, though delusional might also be a fair assessment. If I had to guess, I would say someday we’ll see tremendous growth again, but when that happens and what precipitates it, I don’t know. Nobody really does.

An excerpt:

When it comes to influencing innovation, governments have power. Grants for scientific research and education, and policies that make it easy for new firms to grow, can speed the development of new ideas. But what matters for productivity is not the number of innovations, but the rate at which innovations affect the economy – something almost totally beyond the ability of governments to control. Turning innovative ideas into economically valuable products and services can involve years of trial and error. Many of the basic technologies behind mobile telephones were developed in the 1960s and ’70s, but mobile phones came into widespread use only in the 1990s. Often, a new technology is phased in only over time as old buildings and equipment are phased out. Moreover, for reasons no one fully understands, productivity growth and innovation seem to move in long cycles. In the US, for example, between the 1920s and 1973, innovation brought strong productivity growth. Between 1973 and 1995, it brought much less. The years between 1995 and 2003 saw high productivity gains, and then again considerably less thereafter.

When the surge in productivity following the Second World War tailed off, people around the globe felt the pain. At the time, it appeared that a few countries – France and Italy for a few years in the late 1970s, Japan in the second half of the ’80s – had discovered formulas allowing them to defy the downward global productivity trend. But their economies revived only briefly before productivity growth waned. Jobs soon became scarce again, and improvements in living standards came more slowly. The poor productivity growth of the late 1990s was not due to taxes, regulations or other government policies in any particular country, but to global trends. No country escaped them.

Unlike the innovations of the 1950s and ’60s, which were welcomed widely, those of the late 20th century had costly side effects. While information technology, communications and freight transportation became cheaper and more reliable, giant industrial complexes became dinosaurs as work could be distributed widely to take advantage of labour supplies, transportation facilities or government subsidies. Workers whose jobs were relocated found that their years of experience and training were of little value in other industries, and communities that lost major employers fell into decay. Meanwhile, the welfare state on which they had come to rely began to deteriorate, its financial underpinnings stressed due to the slow growth of tax revenue in economies that were no longer buoyant. The widespread sharing in the mid-century boom was not repeated in the productivity gains at the end of the century, which accumulated at the top of the income scale.

For much of the world, the Golden Age brought extraordinary prosperity. But it also brought unrealistic expectations about what governments can do to assure full employment, steady economic growth and rising living standards. These expectations still shape political life today.•

The recent Presidential election revealed that U.S. citizens either have a terrible understanding of economics or are willing to surrender their security in the name of identity politics. Both are likely true to a significant extent.

On the campaign trail, immigrants were blamed for the downgrading of the American worker while automation was never discussed, and Michigan voters swung to Trump, largely because Washington had supposedly forgotten about them, even after the Obama Administration wagered $79 billion on bailing out the Detroit auto industry. Not too long after that salvage job by the federal government, the state's populace elected an anti-union governor in Rick Snyder. The locals may have forgotten about themselves more than D.C. ever did.

Another vital topic that was never broached during the campaign was the role contracted work has played in shrinking the middle class. For several decades, American companies have been outsourcing mail-room work, maintenance, security and other "non-core" tasks to subcontractors who save them money by lowering salaries and reducing benefits. This shift created a separate class of workers: executive pay ballooned while those with more modest pay stubs took the elevator downward, further exacerbating wealth inequality.

I’ve written before about this destabilizing phenomenon. More from Eduardo Porter of the New York Times:

…Mr. Trump is missing a more critical workplace transformation: the vast outsourcing of many tasks — including running the cafeteria, building maintenance and security — to low-margin, low-wage subcontractors within the United States.

This reorganization of employment is playing a big role in keeping a lid on wages — and in driving income inequality — across a much broader swath of the economy than globalization can account for.

David Weil, who headed the Labor Department’s wage and hour division at the end of the Obama administration, calls this process the “fissuring” of the workplace. He traces it to the 1980s, when corporations under pressure to raise quarterly profits started shedding “noncore” tasks.

The trend grew as the spread of information technology made it easier for companies to standardize and monitor the quality of outsourced work. Many employers took to outsourcing to avoid the messy consequences — like unions and workplace regulations — of employing workers directly.

“It’s an incredibly important part of the story that we haven’t paid attention to,” Mr. Weil told me.

“Lead businesses — the firms that continue to directly employ workers who provide the goods and services in the economy recognized by consumers — remain highly profitable and may continue to provide generous pay for their work force,” he noted. “The workers whose jobs have been shed to other, subordinate businesses face far more competitive market conditions.”

The trend is hard to measure, since subcontracting can take many forms. But it is big.•


In the time before centralized mass media, a whole host of traveling cons and medicine-show mountebanks could pull wool and push potions. During the era of centralized mass media, it was occasionally possible to work the masses in a big way, but mostly gatekeepers batted down these lies. In our age of decentralized communications, which began with President Reagan signing away the Fairness Doctrine and became fully entrenched with cable news and the Internet, the sideshow, now residing in the center ring, has never fooled more people who should know better.

One of the Barnums of this bizarre moment is right-wing radio talker Alex Jones, a compulsive eater and apeshit peddler of strange conspiracy theories that usually have an anti-government or bigoted bent. He is the kind of kook who would have been calling into late-night radio shows about UFOs in decades past, only to be hung up on by annoyed hosts. Today his strange fake-news impulses have provided him with a direct line to a White House led by a President who would have been laughed off the campaign trail in any reasonably decent and enlightened age. 

All of this has been enabled by a new form of hyper-democracy that resists checks and balances, in which every idea is treated as equal, true or not. It's a scary moment in which anything–anything–is possible.

From Veit Medick’s great Spiegel profile of Jones, a Texas-sized F.O.T. (Friend of Trump):

Jones is stunned that not all Americans share his panicked view of the “jihadists.” Indeed, he believes the threat is so great that it would be best not to allow anyone at all to enter the United States anymore.

“Please forget the Statue of Liberty,” Jones says during a break. “It’s a symbol of propaganda. We should stop worshipping it and bending down to every Third World population that shows up with TB and leprosy.”

‘Foot Soldiers in the Trump Revolution’

Jones now plans to open an office in Washington. He says he might hire 10 people to report on the White House, almost like a traditional media organization. He will be getting help from Roger Stone, a radical adviser to the president, who wrote a book in which he described former President Bill Clinton as a serial rapist without providing any proof. Under a deal reached between the two men, Stone began hosting the Alex Jones show for one hour a week a short time ago. "Elitists may laugh at his politics," Stone says, but "Alex Jones is reaching millions of people, and they are the foot soldiers in the Trump revolution."

It’s afternoon, and Jones is walking through the studio, his adrenaline level high and his blood sugar low. He needs to get something to eat. Platters of BBQ – chicken, beef and sausages – are set out on a table in the conference room. “Good barbecue,” says Jones. “You tasted it already?” 

He piles up food onto a plastic plate, and then he suddenly takes off his shirt without explanation. With his bare torso, he sits there and shovels meat into his mouth, a caricature of manliness, but also a show of power to the reporter sitting in front of him. He can do as he pleases.

Then Jones gets up and holds out a sausage. “Wanna suck?” he asks.•


Futurist Thomas Frey has published “25 Shocking Predictions about the Coming Driverless Car Era in the U.S.,” a Linkedin article which envisions a time in the near future when almost all vehicles are autonomous and individual ownership has become a thing of the past.

Of course, the idea that driverless technology will be imminently perfected and come to dominate the streets, roads and highways in, say, the next 15 years is the most stunning of all his prognostications and certainly not a sure thing. Technological and legislative hurdles must be surmounted, and those unpredictable humans must be willing to let go of the wheel, if Frey's "explosive transformation" is to proceed.

It’s certainly not impossible. Think of the monumental changes the Internet has brought in its 20+ years of wide usage and the way smartphones have rearranged life in only a decade. Driverless may do the same or perhaps we’ll all be talking when we’re old and gray about how the revolution never quite arrived.

If it does materialize, one big headache in addition to the jobs that will be shed too quickly for society to absorb is that we’ll rest inside surveillance devices while surveillance devices rest inside our pockets. We’ll be too deep inside the machine to ever extricate ourselves.

The opening two items:

1.) Life expectancy of autonomous vehicles will be less than 1 year

I’ve been doing some math on driverless cars and came to the startling conclusion that autonomous cars will wear out in as little as 9-10 months.

Yes, car speeds will be slower in the beginning, but within ten years as speeds increase and cars begin to average 60-70 mph on open freeways, a single car could easily average 1,000 miles a day.

Over a 10-month period, a single car could travel as much as 300,000 miles.

Cars today are only in use 4% of the day, less than an hour a day. An electric autonomous vehicle could be operating as much as 20 hours a day or 21 times as much as the average car today.

For an electric autonomous vehicle operating 24/7, that still leaves plenty of time for recharging, cleaning, and maintenance.

It’s too early to know what the actual life expectancy of these vehicles will be, but it’s a pretty safe assumption that it will be far less than the 11.5 years cars are averaging today.

2.) One Autonomous Car will Replace 30 Traditional Cars

2028-2030 will be the years of peak messiness for the driverless car revolution. The number of autonomous vehicles will grow quickly but they will be intermingled with traditional driver-cars.

Drivers bring with them a hard-to-quantify human variable, and that's what makes driving today such a problem-riddled experience.

There are roughly 258 million registered cars in the U.S. and replacing them will be a long, drawn-out process. But here's what most people don't understand. One autonomous vehicle that can be summoned from a local fleet will replace 30 traditional cars.

For a city of 2 million people, a fleet of 30,000 autonomous vehicles will displace 50% of peak commuter traffic.**

During off-peak times, 30,000 autonomous vehicles will handle virtually all other transportation needs. Peak traffic times will be the hardest to manage.•
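The arithmetic behind the two excerpted items is easy to check. Here is a minimal sketch using Frey's own figures, plus one assumption of mine (a cars-per-capita estimate, used only to test the 50% peak-traffic claim):

```python
# Sanity-checking Frey's figures. The mileage, hours-per-day and
# service-life numbers are his; the cars-per-capita figure below
# is my assumption, not from the article.

# --- Item 1: vehicle life expectancy ---
miles_per_day = 1_000            # Frey's per-car daily mileage estimate
service_months = 10              # his projected service life
lifetime_miles = miles_per_day * service_months * 30   # ~30 days/month
# -> 300,000 miles, matching the article

# Utilization: today's car is in use ~4% of a 24-hour day
todays_hours = 0.04 * 24                     # ~0.96 hours/day
av_hours = 20                                # Frey's autonomous-vehicle estimate
utilization_ratio = av_hours / todays_hours  # ~21x, matching the article

# --- Item 2: fleet replacement ---
fleet_size = 30_000              # Frey's fleet for a city of 2 million
replacement_ratio = 30           # traditional cars replaced per AV
cars_replaced = fleet_size * replacement_ratio   # 900,000

# Assumption (mine): ~0.8 registered cars per capita,
# roughly 258M cars / 320M people nationally
city_cars = 2_000_000 * 0.8                      # ~1.6M cars in the city
peak_share = cars_replaced / city_cars           # ~0.56, near Frey's 50%

print(lifetime_miles, round(utilization_ratio), cars_replaced, round(peak_share, 2))
```

The numbers cohere internally, which is all this shows; the inputs themselves (a 30x replacement ratio, 20 operating hours a day) remain Frey's speculations.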


Bill Gates just conducted one of his wide-ranging Reddit AMAs, touching on Guaranteed Basic Income, philanthropy, neuroscience, etc.

Like Mark Zuckerberg, who seems to be basing his development as a businessperson and public person on the Gates template, the Microsoft founder says he hopes that digital tools can help bring more citizens together without mentioning that some of these people will form horrible and dangerous blocs. It sure seems like we were better off before fringe Americans–conspiracists, new-wave KKKs and kooks of every stripe–were able to congregate online and form cohesive national movements that could push the agenda into dark corners, especially in a time of dramatic wealth inequality, when billionaires like Robert Mercer can fund hatemongers and fuel disinformation.

A few exchanges follow.


Question:

Any thoughts on the current state of the U.S.?

Bill Gates:

Overall like Warren Buffett I am optimistic about the long run. I am concerned in the short run that the huge benefits of how the US works with other countries may get lost. This includes the aid we give to Africa to help countries there get out of the poverty trap.


Question:

I have a question pertaining to an issue in the U.S., and it's one that we're all getting sick of hearing about.

Do you think social media – and perhaps the internet in general – has played a role in helping divide this country?

Instead of expanding knowledge and obtaining greater understandings of the world, many people seem to use it to

1) seek and spread information – including false information – confirming their existing biases and beliefs, and

2) converse and interact only with others who share their worldview (these are things I’m guilty of doing myself)

Bill Gates:

This is a great question. I felt sure that allowing anyone to publish information and making it easy to find would enhance democracy and the overall quality of political debate. However the partitioning you talk about which started on cable TV and might be even stronger in the digital world is a concern. We all need to think about how to avoid this problem. It would seem strange to have to force people to look at ideas they disagree with so that probably isn’t the solution. We don’t want to get to where American politics partitions people into isolated groups. I am interested in anyone’s suggestion on how we avoid this.


Question:

What do you think is the most pressing issue that we could feasibly solve in the next ten years?

Bill Gates:

A lot of people feel a sense of isolation. I still wonder if digital tools can help people find opportunities to get together with others – not Tinder but more like adults who want to mentor kids or hang out with each other. It is great that kids go off and pursue opportunities but when you get communities where the economy is weak and a lot of young people have left then something should be done to help.


Question:

What kind of technological advancement do you wish to see in your lifetime?

Bill Gates:

The big milestone is when computers can read and understand information like humans do. There is a lot of work going on in this field – Google, Microsoft, Facebook, academia,… Right now computers don’t know how to represent knowledge so they can’t read a text book and pass a test.

Another whole area is vaccines. We need a vaccine for HIV, Malaria and TB and I hope we have them in the next 10-15 years.


Question:

If you could give 19 year old Bill Gates some advice, what would it be?

Bill Gates:

I would explain that smartness is not single dimensional and not quite as important as I thought it was back then. I would say you might explore the developing world before you get into your forties. I wasn’t very good socially back then but I am not sure there is advice that would fix that – maybe I had to be awkward and just grow up….


Question:

If you could create a new IP and business with Elon Musk, what would you make happen?

Bill Gates:

We need clean, reliable cheap energy – which we don’t have. It is too bad the sun doesn’t shine all the time and the wind doesn’t blow all the time. The Economist had a good piece on this this week. So we need some invention – perhaps miracle batteries or super safe nuclear or making sun into gasoline directly.


Question:

What are the limits of money when it comes to philanthropy?

Bill Gates:

Philanthropy is small as a part of the overall economy so it can’t do things like fund health care or education for everyone. Government and the private sector are the big players so philanthropy has to be more innovative and fund pilot programs to help the other sectors. A good example is funding new medicines or charter schools where non-obvious approaches might provide the best solution.

One thing that is a challenge for our Foundation is that poor countries often have weak governance – small budgets, and the people in the ministries don’t have much training. This makes it harder to get things done.

If we had more money we could do more good things – even though we are the biggest foundation we are still resource limited.


Question:

What do you think about Universal Basic Income?

Bill Gates:

Over time countries will be rich enough to do this. However we still have a lot of work that should be done – helping older people, helping kids with special needs, having more adults helping in education. Even the US isn’t rich enough to allow people not to work. Some day we will be but until then things like the Earned Income Tax Credit will help increase the demand for labor.


Question:

What are you most curious about, Bill?

Bill Gates:

I still find the creation of life and the way the brain works the most fascinating areas. Nick Lane has some great books exploring what we know about how life started. It is amazing how little we know about the brain still but I expect we will know a lot more in 10 years.•


In a vital Guardian article that ties together online-media manipulation, psychological profiling of voters, Trump and Brexit, reporter Carole Cadwalladr follows a byzantine trail that leads to the right-wing machinations of computer scientist and billionaire Robert Mercer, the single biggest donor in the 2016 U.S. Presidential race. 

Mercer, an old friend of fellow anti-government zealots Steve Bannon and Nigel Farage, has often futilely thrown money at his ultra-conservative causes, but when you have that much to wager, you can keep trying until you break the bank. In 2016, he did just that. 

The Renaissance Technologies CEO offered the services of data research firm Cambridge Analytica to both Trump and Leave.EU, which helped them game Facebook and Google in a large-scale way, create extensive individual profiles of citizens and destabilize genuine journalism. It permitted mud to be thrown with precision on both sides of the pond, creating propaganda to fit the Digital Age. As a ranking member of Leave.EU tells the Guardian: “What they were trying to do in the US and what we were trying to do had massive parallels.”

Mark Zuckerberg wants us to believe that Facebook, despite its many failings, should play a leading role in saving global democracy, but who will save it from Facebook?

From Cadwalladr:

Which is how, earlier this week, I ended up in a Pret a Manger near Westminster with Andy Wigmore, Leave.EU’s affable communications director, looking at snapshots of Donald Trump on his phone. It was Wigmore who orchestrated Nigel Farage’s trip to Trump Tower – the PR coup that saw him become the first foreign politician to meet the president elect.

Wigmore scrolls through the snaps on his phone. “That’s the one I took,” he says pointing at the now globally famous photo of Farage and Trump in front of his golden elevator door giving the thumbs-up sign. Wigmore was one of the “bad boys of Brexit” – a term coined by Arron Banks, the Bristol-based businessman who was Leave.EU’s co-founder.

Cambridge Analytica had worked for them, he said. It had taught them how to build profiles, how to target people and how to scoop up masses of data from people’s Facebook profiles. A video on YouTube shows one of Cambridge Analytica’s and SCL’s employees, Brittany Kaiser, sitting on the panel at Leave.EU’s launch event.

Facebook was the key to the entire campaign, Wigmore explained. A Facebook ‘like’, he said, was their most “potent weapon”. “Because using artificial intelligence, as we did, tells you all sorts of things about that individual and how to convince them with what sort of advert. And you knew there would also be other people in their network who liked what they liked, so you could spread. And then you follow them. The computer never stops learning and it never stops monitoring.”

It sounds creepy, I say.

“It is creepy! It’s really creepy! It’s why I’m not on Facebook! I tried it on myself to see what information it had on me and I was like, ‘Oh my God!’ What’s scary is that my kids had put things on Instagram and it picked that up. It knew where my kids went to school.”

They hadn’t “employed” Cambridge Analytica, he said. No money changed hands. “They were happy to help.”

Why?

“Because Nigel is a good friend of the Mercers. And Robert Mercer introduced them to us. He said, ‘Here’s this company we think may be useful to you.’ What they were trying to do in the US and what we were trying to do had massive parallels. We shared a lot of information. Why wouldn’t you?” Behind Trump’s campaign and Cambridge Analytica, he said, were “the same people. It’s the same family.”•


Secondary to the sheer un-American nature and moral failing of Muslim bans and refugee refusal is just how injurious such policies are for business.

Whether it’s because these moves have instilled fear in international travelers or because vacationers have chosen to vote with their pocketbooks, tourists from abroad have turned away sharply from the U.S. in the weeks since Trump’s attempt at an Executive Order banning visitors from seven predominantly Muslim countries. That’s a severe threat to the livelihoods of many Americans, who are already facing myriad pressures in the Digital Age.

The causes of unemployment, underemployment and stagnant wages are amazingly complex, especially among the “structurally unemployed,” a contingent unlikely to be aided by catchphrases on baseball caps.

Two excerpts follow, one on what the travel ban means for U.S. workers and another about the web of troubles keeping citizens who want to work from doing so.


From Michelle Baran at Travel Weekly:

International flight-booking data released this week confirmed concerns across the travel industry that president Trump’s 90-day travel ban on nationals from seven Muslim-majority countries is dealing a significant blow to inbound travel to the U.S.

Following the Jan. 27 ban on travel to the U.S. by nationals from Iran, Iraq, Libya, Yemen, Somalia, Sudan and Syria, net bookings from those seven countries were down 80% between Jan. 28 and Feb. 4, compared with the same period last year, according to flight reservation transactions analyzed by the travel data company ForwardKeys.

ForwardKeys also reported a 6.5% drop in overall international travel to the U.S. for the same period when compared with the equivalent eight-day period the year before.

Meanwhile, Hopper, a flight app that tracks GDS searches in order to analyze fare prices and demand, also found that the number of flight searches for travel from international points of origin to the U.S. — a key indicator of future travel intent — was down 17% the week of Jan. 27. Last year, there was only a 1.8% decline during the same time period.

“The data forces a compelling conclusion that Donald Trump’s travel ban immediately caused a significant drop in bookings to the USA and an immediate impact on future travel,” said Olivier Jager, CEO of ForwardKeys. “This is not good news for the U.S. economy.”•


From Jeanna Smialek’s Bloomberg portrait of a struggling Kentucky man named Tyler Moore:

His problems started in earnest in 2014. He had been living on his own for several years, having moved out at 18 after dropping out of high school, obtaining his GED, and going to work in security at a coal company. Moore is gay in an intensely conservative region, and he said he left school because of bullying.

Moore lost his job in late 2013 after smoking marijuana and failing a drug test. Though he found temporary work as a remote customer service representative, he lost that one when his mother died of a drug overdose in 2014 and he had to plan her funeral.

Deeply depressed and unemployed, he moved into an old Airstream camper propped on cinder blocks behind his father’s house, at the entrance to the litter-strewn trailer park that the older man owns in the misty hills of Lovely. There, surrounded by long-unemployed neighbors and rampant drug use, Moore began to abuse his medical prescriptions. “I guess I used it as my crutch, in a way,” he says.

Moore began getting in fights while drugged and was arrested twice. When he landed in jail for several months, he realized things needed to change. He graduated from a rehabilitation program in September, one year, one month, and 15 days after that last altercation. Since then, he’s deepened his friendship with Sister Therese Carew, a Catholic nun who ministers to the region, and dedicated his time to job seeking.

Opportunities are few. Coal mines have been closing, and they’ve taken most other businesses with them.

To employers outside the area, the fact that Moore is neatly groomed, soft-spoken, and polite can’t mask his history. What’s more, he’s the first to admit that the math skills he learned in the local public schools—where only eight in 10 students graduate—aren’t up to par, and his speaking patterns are colored by regional grammar.•

