Books


Lee Billings, author of the wonderful and touching 2014 book, Five Billion Years of Solitude, is interviewed on various aspects of exoplanetary exploration by Steve Silberman of h+ Magazine. An exchange about what contact might be like were it to occur:

Steve Silberman:

If we ever make contact with life on other planets, will they be the type of creatures that we could sit down and have a Mos Eisley IPA or Alderaan ale with — even if, by then, we’ve worked out the massive processing and corpus dataset problems inherent in building a Universal Translator that works much better than Google? And if we ever did make contact, what social problems would that meeting force us to face as a species?

Lee Billings:

Outside of the simple notion that complex intelligent life may be so rare as to never allow us a good chance of finding another example of it beyond our own planet, there are three major pessimistic contact scenarios that come to mind, though there are undoubtedly many more that could be postulated and explored. The first pessimistic take is that the differences between independently emerging and evolving biospheres would be so great as to prevent much meaningful communication occurring between them if any intelligent beings they generated somehow came into contact. Indeed, the differences could be so great that neither side would recognize or distinguish the other as being intelligent at all, or even alive in the first place. An optimist might posit that even in situations of extreme cognitive divergence, communication could take place through the universal language of mathematics.

The second pessimistic take is that intelligent aliens, far from being incomprehensible and ineffable, would be in fact very much like us, due to trends of convergent evolution, the tendency of biology to shape species to fit into established environmental niches. Think of the similar streamlined shapes of tuna, sharks, and dolphins, despite their different evolutionary histories. Now consider that in terms of biology and ecology humans are apex predators, red in tooth and claw. We have become very good at exploiting those parts of Earth’s biosphere that can be bent to serve our needs, and equally adept at utterly annihilating those parts that, for whatever reason, we believe run counter to our interests. It stands to reason that any alien species that managed to embark on interstellar voyages to explore and colonize other planetary systems could, like us, be a product of competitive evolution that had effectively conquered its native biosphere. Their intentions would not necessarily be benevolent if they ever chose to visit our solar system.

The third pessimistic scenario is an extension of the second, and postulates that if we did encounter a vastly superior alien civilization, even if they were benevolent they could still do us harm through the simple stifling of human tendencies toward curiosity, ingenuity, and exploration. If suddenly an Encyclopedia Galactica was beamed down from the heavens, containing the accumulated knowledge and history of one or more billion-year-old cosmic civilizations, would people still strive to make new scientific discoveries and develop new technologies? Imagine if solutions were suddenly presented to us for all the greatest problems of philosophy, mathematics, physics, astronomy, chemistry, and biology. Imagine if ready-made technologies were suddenly made available that could cure most illnesses, provide practically limitless clean energy, manufacture nearly any consumer good at the press of a button, or rapidly, precisely alter the human body and mind in any way the user saw fit. Imagine not only our world or our solar system but our entire galaxy made suddenly devoid of unknown frontiers. Whatever would become of us in that strange new existence is something I cannot fathom.

The late Czech astronomer Zdeněk Kopal summarized the pessimist outlook succinctly decades ago, in conversation with his British colleague David Whitehouse. As they were talking about contact with alien civilizations, Kopal grabbed Whitehouse by the arm and coldly said, “Should we ever hear the space-phone ringing, for God’s sake let us not answer. We must avoid attracting attention to ourselves.”•


For those who read Lolita after the Sexual Revolution of the ’60s and ’70s had ended, how can the book seem like anything but an amazing piece of writing about a horrifying “romance”? But I suppose for some of the young who came of age during the carnal tumult of that earlier time, the novel seemed like a different thing–or at least the culture told them it was. In the opening exchange of an interview conducted by Erik Morse of the Los Angeles Review of Books, Emily Prager, the novelist and journalist who briefly appeared on the original iteration of Saturday Night Live, astutely explains the generational differences in interpretations of the controversial work:

Erik Morse:

Do you remember when you first read Lolita? What were your initial impressions, both of Nabokov’s story and the character of Lo?

Emily Prager:

I don’t remember when I read Lolita but the idea of Lolita was a large part of the ’60s when I matured. Recently I saw the now 50ish-year-old woman whom Roman Polanski allegedly raped. She kept stammering that it was a different time, that you can’t judge Polanski by today’s standards. That’s because the Lolita idea was everywhere — there was a book with almost softcore photos of baby ballerinas that was on every coffee table, tons of very young women with much older men and it was okay. Men ruled after all. Many took Humbert Humbert as their role model. They liked him best of all. A few years ago, I went to dinner with some women who had grown up in the ’60s. It was when the new attitude toward sexual harassment in the workplace was surfacing. We had a great laugh because every single one of us had been importuned in the workplace constantly. When I was 17 and a prop girl off-Broadway, we had to kiss the house manager when we arrived at work. We rolled our eyes and did it. We thought it was ridiculous and those who asked it of us ludicrous. Lolita, the movie, came out in 1962, and it was with Peter Sellers and Stanley Kubrick directing and it was cool. We all wanted the heart-shaped sunglasses. You know, the myth of the ’60s is that it was all about sex. The truth is we knew nothing about sex except what society told us, which was it was bad. We just didn’t want anyone anymore saying anything to us about how to think about sex. So sexual liberation had to include Lolita. It was every girl for herself. You can’t believe how innocent we were. I doubt most of us registered that she might be being taken advantage of. The other thing was that very young boys were going to fight and die in Vietnam, not 12 but 18, which then was about 13. Young girls having sex didn’t seem that wrong. Of course you read Lolita now — I teach it in my fiction-writing course and modern girls are disgusted by it, horrified.•


In a Potemkin Review interview conducted by Antoine Dolcerocca and Gokhan Terzioglu, Thomas Piketty discusses what he believes is the source of long-term, even permanent, productivity growth, which, of course, can occur without the reasonably equitable distribution that is the main focus of his book, Capital in the Twenty-First Century. An excerpt:

Question:

What do you see as a source of perpetual productivity growth?

Thomas Piketty:

Simply accumulation of new knowledge. People go to school more, knowledge in science increases and that is the primary reason for productivity growth. We know more right now in semiconductors and biology than we did twenty years ago, and that will continue.

Question:

You argue in the book that while Marx made his predictions in the 19th century, we now know that sustained productivity growth is possible with knowledge (Solow residual, etc.). But do you think this can be a sustained process in the long run?

Thomas Piketty:

Yes, I do think that we can make inventions forever. The only thing that can make it non-sustainable is if we destroy the planet in the meantime, but I do not think that is what Marx had in mind. It can be a serious issue because we need to find new ways of producing energy in order to make it sustainable, or else this will come to a halt. However if we are able to use all forms of renewable energy, immaterial growth of knowledge can continue forever, or at least for a couple of centuries. There is no reason why technological progress should stop and population growth could also continue for a little more.•


There’s never been any material evidence linking Mohamedou Ould Slahi to the 9/11 terrorists, but he’s spent the past dozen years as a prisoner at Guantanamo Bay. Slahi’s just-published diary makes claims of sexual abuse, among other forms of torture, treatment which he says forced him to lie and implicate people he did not know in crimes he knew nothing about. An excerpt via Britta Sandberg at Spiegel:

As soon as I stood up, the two _______ took off their blouses, and started to talk all kind of dirty stuff you can imagine, which I minded less. What hurt me most was them forcing me to take part in a sexual threesome in the most degrading manner. What many _______ don’t realize is that men get hurt the same as women if they’re forced to have sex, maybe more due to the traditional position of the man. Both _______ stuck on me, literally one on the front and the other older _______ stuck on my back rubbing ____ whole body on mine.

At the same time they were talking dirty to me, and playing with my sexual parts. I am saving you here from quoting the disgusting and degrading talk I had to listen to from noon or before until 10 p.m. when they turned me over to _______, the new character you’ll soon meet.

To be fair and honest, the _______ didn’t deprive me from my clothes at any time; everything happened with my uniform on. The senior _______________ was watching everything _____________________________________________________. I kept praying all the time.

“Stop the fuck praying! You’re having sex with American _______ and you’re praying? What a hypocrite you are!” said ______________ angrily, entering the room.

I refused to stop speaking my prayers, and after that, I was forbidden to perform my ritual prayers for about one year to come. I also was forbidden to fast during the sacred month of Ramadan October 2003, and fed by force. During this session I also refused to eat or to drink, although they offered me water every once in a while. “We must give you food and water; if you don’t eat it’s fine.”

I was just wishing to pass out so I didn’t have to suffer, and that was really the main reason for my hunger strike; I knew people like these don’t get impressed by hunger strikes. Of course they didn’t want me to die, but they understand there are many steps before one dies. “You’re not gonna die, we’re gonna feed you up your ass,” said ____________.

I have never felt as violated in myself as I had since the DoD team started to torture me to get me admit to things I haven’t done. (…)•


Andrew McAfee and Erik Brynjolfsson’s The Second Machine Age, a deep analysis of the economic and political ramifications of Weak AI in the 21st century, was one of the five best books I read in 2014, a really rich year for titles of all kinds. I pretty much agree with the authors’ summation that there’s a plenitude waiting at the other end of the fast-approaching proliferation of automation, though the intervening decades will pose a serious societal challenge. In a post at his Financial Times blog, McAfee reconsiders, if only somewhat, his reluctance to join in with the Hawking-Bostrom-Musk AI anxiety. An excerpt:

The group came together largely to discuss AI safety — the challenges and problems that might arise if digital systems ever become superintelligent. I wasn’t that concerned about AI safety coming into the conference, for reasons that I have written about previously. So did I change my mind?

Maybe a little bit. The argument that we should be concerned about any potentially existential risks to humanity, even if they’re pretty far in the future and we don’t know exactly how they’ll manifest themselves, is a fairly persuasive one. However, I still feel that we’re multiple “Watson and Crick moments” away from anything we need to worry about, so I haven’t signed the open letter on research priorities that came out in the wake of the conference — at least not yet. But who knows how quickly things might change?

At the gathering, in answer to this question I kept hearing variations of “quicker than we thought.” In robotic endeavours as diverse as playing classic Atari video games, competing against the top human players in the Asian board game Go, creating self-driving cars, parsing and understanding human speech, folding towels and matching socks, the people building AI to do these things said that they were surprised at the pace of their own progress. Their achievements outstripped not only their initial timelines, they said, but also their revised ones.

Why is this? My short answer is that computers keep getting more powerful, the available data keeps getting broader (and data is the lifeblood of science), and the geeks keep getting smarter about how to do their work. This is one of those times when a field of human inquiry finds itself in a virtuous cycle and takes off.•


In the New York Review of Books, a contemporary American literary giant, Marilynne Robinson, takes on one of our greatest ever, Edgar Allan Poe, focusing mostly on his sole novel, The Narrative of Arthur Gordon Pym of Nantucket.

Poe died mysteriously and horribly, a sad and appropriate end, as if a premature burial had always awaited him. That he was able to squeeze so much genius into such a short life and so narrow an aesthetic is miraculous. An excerpt:

The word that recurs most crucially in Poe’s fictions is horror. His stories are often shaped to bring the narrator and the reader to a place where the use of the word is justified, where the word and the experience it evokes are explored or by implication defined. So crypts and entombments and physical morbidity figure in Poe’s writing with a prominence that is not characteristic of major literature in general. Clearly Poe was fascinated by popular obsessions, with crime, with premature burial. Popular obsessions are interesting and important, of course. Collectively we remember our nightmares, though sanity and good manners encourage us as individuals to forget them. Perhaps it is because Poe’s tales test the limits of sanity and good manners that he is both popular and stigmatized. His influence and his imitators have eclipsed his originality and distracted many readers from attending to his work beyond the more obvious of its effects.

Poe’s mind was by no means commonplace. In the last year of his life he wrote a prose poem, Eureka, which would have established this fact beyond doubt—if it had not been so full of intuitive insight that neither his contemporaries nor subsequent generations, at least until the late twentieth century, could make any sense of it. Its very brilliance made it an object of ridicule, an instance of affectation and delusion, and so it is regarded to this day among readers and critics who are not at all abreast of contemporary physics. Eureka describes the origins of the universe in a single particle, from which “radiated” the atoms of which all matter is made. Minute dissimilarities of size and distribution among these atoms meant that the effects of gravity caused them to accumulate as matter, forming the physical universe.

This by itself would be a startling anticipation of modern cosmology, if Poe had not also drawn striking conclusions from it, for example that space and “duration” are one thing, that there might be stars that emit no light, that there is a repulsive force that in some degree counteracts the force of gravity, that there could be any number of universes with different laws simultaneous with ours, that our universe might collapse to its original state and another universe erupt from the particle it would have become, that our present universe may be one in a series.

All this is perfectly sound as observation, hypothesis, or speculation by the lights of science in the twenty-first century. And of course Poe had neither evidence nor authority for any of it. It was the product, he said, of a kind of aesthetic reasoning—therefore, he insisted, a poem. He was absolutely sincere about the truth of the account he had made of cosmic origins, and he was ridiculed for his sincerity. Eureka is important because it indicates the scale and the seriousness of Poe’s thinking, and its remarkable integrity. It demonstrates his use of his aesthetic sense as a particularly rigorous method of inquiry.•


Kim Fowley is dead, and now they’re coming to take him away, ha-haaa! Such a bag of sleaze that he could make even record-industry professionals blanch, Fowley most famously formed and managed the teenage girl group the Runaways, and the nicest way to put it is that he certainly had an eye for young talent. The opening of David L. Ulin’s knowing 2013 Los Angeles Times review of Lord of Garbage, Fowley’s mental memoir:

Kim Fowley came out of a Hollywood that doesn’t exist anymore, the Hollywood of Kenneth Anger and Ed Wood. Best known for cooking up the Runaways, he began to work in the music business in the late 1950s and since then has turned up in more places than Woody Allen’s Zelig, producing for Gene Vincent, writing with Warren Zevon and introducing John Lennon and the Plastic Ono Band when they performed in Toronto in 1969.

Fowley turned 73 in 2012, and by his own admission has been suffering from bladder cancer, so it’s no surprise that he might choose this moment to look back. But his memoir, Lord of Garbage (Kicks Books: 150 pp., $13.95 paper) may be the weirdest rock ’n’ roll autobiography since … well, I can’t think of what.

The first of a projected three-volume set (Fowley claims the follow-ups have already been delivered), “Lord of Garbage” covers the first 30 years of its author’s life, from his early years bouncing between a model mother and a B-movie actor father, through a high school membership in the 1950s gang the Pagans and on to his involvement as a songwriter and producer in 1960s L.A.

How much of it is true is hard to say, exactly: Written in bombastic prose, it follows the broad parameters of Fowley’s biography while also insisting that, at the age of 1, his first words were: “I have a question. Why are you bigger than me?”

“Kim Fowley could talk at ten months,” he tells us, “could read and write by one and a half.” It’s no coincidence that he refers to himself in the third person, since Lord of Garbage is clearly the work of someone who considers himself larger than life. “You already know the genius music,” Fowley declares in a brief head note. “Now, know the genius man of letters.”

And yet, as self-congratulatory as that is, as sadly confrontational, it’s also, in its own weird way, slightly thrilling — not unlike Fowley himself.•


Reading a new Phys.org article about Google moving more quickly than anticipated with its driverless dreams reminded me of a passage from a Five Books interview with Robopocalypse author Daniel H. Wilson. An excerpt from each piece follows.

__________________________

From Five Books:

Question:

Isn’t machine learning still at a relatively early stage? 

Daniel H. Wilson:

I disagree. I think machine learning has actually pretty much ripened and matured. Machine learning arguably started in the 1950s, and the term artificial intelligence was coined by John McCarthy in 1956. Back then we didn’t know anything – but scientists were really convinced that they had this thing nipped in the bud, that pretty soon they were going to replace all humans. This was because whenever you are teaching machines to think, the lowest hanging fruit is to give them problems that are very constrained. For example, the rules of a board game. So if you have a certain number of rules and you can have a perfect model of your whole world and you know how everything works within this game, well, yes, a machine is going to kick the crap out of people at chess. 

What those scientists didn’t realise is how complicated and unpredictable and full of noise the real world is. That’s what mathematicians and artificial intelligence researchers have been working on since then. And we’re getting really good at it. In terms of applications, they’re solving things like speech recognition, face recognition, motion recognition, gesture recognition, all of this kind of stuff. So we’re getting there, the field is maturing.•

__________________________

From Phys.org:

The head of self-driving cars for Google expects real people to be using them on public roads in two to five years.

Chris Urmson says the cars would still be test vehicles, and Google would collect data on how they interact with other vehicles and pedestrians.

Google is working on sensors to detect road signs and other vehicles, and software that analyzes all the data. The small, bulbous cars without steering wheels or pedals are being tested at a Google facility in California.

Urmson wouldn’t give a date for putting driverless cars on roads en masse, saying that the system has to be safe enough to work properly.

He told reporters Wednesday at the Automotive News World Congress in Detroit that Google doesn’t know yet how it will make money on the cars.

Urmson wants to reach the point where his test team no longer has to pilot the cars. “What we really need is to get to the point where we’re learning about how people interact with it, how they are using it, and how can we best bring that to market as a product that people care for,” he said.•


Well, you can’t get a much more top-shelf Oscars moment than this passage from the 1977 ceremony, as Jane Fonda introduces Norman Mailer, who in turn presents the Best Original Screenplay award to Paddy Chayefsky for Network. Mailer sets up the announcing of the nominees with the famous anecdote about Voltaire visiting a gay bordello. Despite what Aquarius says, it was way more difficult for Chayefsky to write a great novel than a great screenplay.


Granta has an excerpt called “Drone” from a forthcoming Hari Kunzru novel, a dystopian nightmare about an India in which income inequality and runaway technology are extrapolated to extremes, with those of enormous wealth living in stupendous towers above the ruined earth. In this passage, a poor miner named Jai opts for human augmentation to enable survival:

One evening, he goes to buy himself an arm. It’s a common enough transaction. Most people on earth are augmented. You can increase your strength, overclock your reaction time or your lung capacity, multiply your attention span. You can cosmetically alter your face, reskin your body in the latest colours. You can augment your perception, overlaying the hideous environment of your mining camp with a pristine rainforest or an educational maze or a hypersexual forest of organs and limbs. Elsewhere in the world, people have changed themselves in ways these miners can only dream about. The rich are fantastical creatures, young gods living in a customized world, generating themselves and their environment out of the stuff of their desires. Not this, that. Not that, this. For the less fortunate there are wealth-sims and optical overlays that make cramped living spaces feel spacious, cosmetically luxurious. You may be exhausted and feeding yourself textured algae, but you’re doing it in a marble throne room.

Jai, like everyone, worries that he’s falling behind. Other miners stimulate their muscle growth, or use cheap mechanical prosthetics with docks for attaching tools. One or two have elaborate biomechanical grafts, though these many-armed, monstrously sized men are usually enslaved by the militias and are so psychologically alienated that they can’t properly be called human any more. Jai is young and strong. He has the body he was born with, a body which has been constructed entirely by chance, without selection or surgery or fetal therapies, with a variable food supply, patchy shelter and unrestricted exposure to diseases and swarms of all kinds. He is miraculously healthy, but can’t seem to make enough money to survive. Sometimes he goes hungry. He struggles to pay the water boys.

The prosthetician is based in a highly entropic zone of the camp, the informal red-light district known as the Cages. It’s a quarter that has spawned a hundred slang terms for process, words for every type and quality of peak, dip, spread and intensification. As Jai squeezes through a decaying alley, a flock of what look like geese with glandes instead of heads skitter past him. Who knows where they came from, but they’re ubiquitous in the Cages. The miners call them ‘dickchickens’. Whores grafted into the walls display available orifices or scroll out stims that grab the crotch or flicker and bounce off the eye like thrown business cards. Even the architecture is pink, moist to the touch; when it comes to overlays, miners tend to want the hard stuff. Cheap and heavy. Margaritaville. Pussytown. Jai is assaulted by a confusion of tacky skins and feelies, which override his permissions, come congaing through his field of vision, trying to trick him into giving out his credit strings. Phantom pudenda flourish and bloom. Semen spatters the optics of his sensorium. He is brushed by nipples, hair, lubricated hands.

He squeezes himself through a rectal crack into the limbmongers’ colony, the swarm of drones battering round him, thick and black. It fills the narrow alley. Machines get stuck underfoot or mashed into the deliquescent walls. The largest are the size of small birds, the tiniest mere hoverflies, with little iridescent solar sails for wings. As he is finally enclosed by the prosthetician’s stall, sheltered behind his firewall, the swarm forms a clicking, skittering crust on the transparent shell, jostling for a sight line.

The limbmonger is a sallow man with a double ridge of bone on his forehead and a cage of carbon fibre around his jaw, the platform for some kind of sensorium. As he shows Jai his wares he’s probably multitasking, climbing pre-thaw Everest or swapping feelies of cats. He has a telltale absence to his manner, a blankness. Of the various devices on offer, there’s only one Jai can afford, a contraption with a battered shovel, a claw, and some kind of twitch control that the man swears works perfectly, but which only seems to react intermittently to Jai’s instructing left shoulder.•


A debate (by proxy) between Nicholas Carr and Andrew McAfee, two leading thinkers about the spread of automation, takes place in Zoë Corbyn’s Guardian article about Carr’s most recent book, The Glass Cage. I doubt the proliferation of Weak AI will ultimately be contained much beyond niches, despite any good dissenting arguments. An excerpt:

As doctors increasingly follow automated diagnostic templates and architects use computer programs to generate their building plans, their jobs become duller. “At some point you turn people into computer operators – and that’s not a very interesting job,” Carr says. We now cede even moral choices to our technology, he says. The Roomba vacuum cleaning robot, for example, will unthinkingly hoover up a spider that we may have saved.

Not everyone buys Carr’s gloomy argument. People have always lamented the loss of skills due to technology: think about the calculator displacing the slide rule, says Andrew McAfee, a researcher at the MIT Sloan School of Management. But on balance, he says, the world is better off because of automation. There is the occasional high-profile crash – but greater automation, not less, is the answer to avoiding that.

Carr counters that we must start resisting the urge to increase automation unquestioningly. Reserving some tasks for humans will mean less efficiency, he acknowledges, but it will be worth it in the long run.•


In “They’re Watching You Read,” a NYRB post, Francine Prose wonders about the ramifications, both political and personal, of e-book retailers being able to tell “which books we’ve finished or not finished, how fast we have read them, and precisely where we snapped shut the cover of our e-books and moved on to something else.” With online tracking of page-turning and highlighting, the very personal joy of reading becomes public. An excerpt:

For the time being, the data being gathered concerns general patterns of behavior rather than what happens between each of us and our personal E-readers. But we have come to live with the fact that anything can be found out. Today “the information” is anonymous; tomorrow it may well be just about us. Will readers who feel guilty when they fail to finish a book now feel doubly ashamed because abandoning a novel is no longer a private but a public act? Will it ever happen that someone can be convicted of a crime because of a passage that he is found to have read, many times, on his e-book? Could this become a more streamlined and sophisticated equivalent of that provision of the Patriot Act that allowed government officials to demand and seize the reading records of public library patrons?

As disturbing may be the implications for writers themselves. Since Kobo is apparently sharing its data with publishers, writers (and their editors) could soon be facing meetings in which the marketing department informs them that 82 percent of readers lost interest in their memoir on page 272. And if they want to be published in the future, whatever happens on that page should never be repeated.

Will authors be urged to write the sorts of books that the highest percentage of readers read to the end? Or shorter books? Are readers less likely to finish longer books? We’ll definitely know that. Will mystery writers be scolded (and perhaps dropped from their publishers’ lists) because a third of their fans didn’t even stick around long enough to learn who committed the murder? Or, given the apparent lack of correlation between books that are bought and books that are finished, will this information ultimately fail to interest publishers, whose profits have, it seems, been ultimately unaffected by whether or not readers persevere to the final pages?•


In Gareth Cook’s New York Times Magazine profile of Princeton neuroscientist Sebastian Seung, who is trying to map the human brain with the aid of crowdsourcing online games, something akin to academia applied to Angry Birds, the writer makes a fundamental point about all attempts at cartography: charts and pictures are capable of obfuscating as well as elucidating. An excerpt:

In 1946, the Argentine man of letters Jorge Luis Borges wrote a short story about an empire, unnamed, that set out to construct a perfect map of its territory. A series of maps were drawn, only to be put aside in favor of more ambitious maps. Eventually, Borges wrote, ‘the Cartographers Guilds struck a Map of the Empire whose size was that of the Empire, and which coincided point for point with it. The following Generations, who were not so fond of the Study of Cartography as their Forebears had been, saw that that vast map was Useless, and . . . delivered it up to the Inclemencies of Sun and Winters.’

With time, Borges’s cautionary parable has become even more relevant for the would-be cartographers of the world, Seung among them. Technological progress has always brought novel ways of seeing the natural world and thus new ways of mapping it. The telescope was what allowed Galileo to sketch, in his book The Starry Messenger, a first map of Jupiter’s largest moons. The invention of the microscope, sometime in the late 16th century, led to Robert Hooke’s famous depiction of a flea, its body armored and spiked, as well as the discovery of the cell, an alien world unto itself. Today the pace of invention and the raw power of technology are shocking: A Nobel Prize was awarded last fall for the creation of a microscope with a resolution so extreme that it seems to defy the physical constraints of light itself.

What has made the early 21st century a particularly giddy moment for scientific mapmakers, though, is the precipitous rise of information technology. Advances in computers have provided a cheap means to collect and analyze huge volumes of data, and Moore’s Law, which predicts regular doublings in computing power, has shown little sign of flagging. Just as important is the fact that machines can now do the grunt work of research automatically, handling samples, measuring and recording data. Set up a robotic system, feed the data to the cloud and the map will practically draw itself. It’s easy to forget Borges’s caution: The question is not whether a map can be made, but what insights it will bring. Will future generations cherish a cartographer’s work or shake their heads and deliver it up to the inclemencies?•


One of the best books I read during 2014 was Lee Billings’ Five Billion Years of Solitude, a volume both extremely heady and deeply moving. It tells the story of the quest for exoplanets which resemble Earth, places which could possibly provide refuge for us when our mother planet finally dies. Even if we never manage to leave our solar system, just the intellectual odyssey itself is fascinating. From “Searching for Pale Blue Dots,” an Economist article about other-Earth discussions at this week’s American Astronomical Society meeting:

“IN 1990 Voyager took a photograph of Earth that was striking precisely because it showed so little. The spacecraft was six billion kilometres away at the time and the image it sent back was memorably described by Carl Sagan as a ‘pale blue dot.’ Imagine, then, how pale such a dot would be if the planet in the picture were 113,000 billion kilometres away. Yet this is the distance to the nearest confirmed exoplanet—a planet orbiting a star other than the sun. That gives some idea of the task faced by those who study these bodies. Only in the most special of circumstances can they actually see their quarry. Mostly, they have to work with indirect measurements, like watching for slight dips in the intensity of a star’s light when a planet passes in front of it, a phenomenon known as a transit.

But if indirect observation is all that is on offer, then astronomers must make the best of it. And, as numerous presentations to a meeting of the American Astronomical Society held in Seattle this week show, they have both done so, and have plans to do better in future.

The most successful planet-hunting mission so far has been Kepler, a satellite launched in 2009 by NASA, America’s space agency, which collected data using the transit method until 2013, when a mechanical failure disabled it. It has since been revived, but has only recently begun transmitting data. However, combing of the data it collected in its first incarnation continues, and Douglas Caldwell of the SETI Institute, in Mountain View, California, who is one of the mission’s chief scientists, announced to the meeting the discovery of eight new planets. Three of these lie in their solar systems’ habitable zones (that is, they are at a distance from their parent stars which makes them warm enough for water on their surfaces to be liquid, but cool enough for it not to be steam). One of these three, known as Kepler 438b, is thought particularly Earthlike. It is a bit bigger and a bit warmer than Earth, but is probably rocky. It is therefore likely to be the subject of intense future scrutiny.”


Zia Haider Rahman’s novel, In the Light of What We Know, is one of the dozen or so books I read in 2014 that really stays with me, for many good reasons, but also for one that isn’t so good. I’m not sure I can make the critique without revealing too much, so you should probably skip this post if you haven’t read it yet and plan to. 

As I’ve said before, the work is an amazing panorama, full of ideas, and contains a framed tale that’s worthy of Thomas Mann. That’s stunning for any novelist, especially a debuting one.

The trouble I have is that there’s a scene of horrific violence against a female character by a male one who sees her as an abstraction, a figure onto which he can project his dreams and anguish, rather than as an actual person. There’s nothing wrong with writing a character who sees another this way; we’re probably more likely to commit violence against those whose humanity we can erase, and it’s a worthwhile topic to meditate on and try to understand. The problem I have is that the female character isn’t only an abstraction to her fictional victimizer but to the readers as well because that’s the way the author has left her. And the victimizer, often a witty and sardonic guy, is a full-fledged character, who, while pitiful, probably is the more sympathetic creation. Doesn’t feel right.

In her essay at the Los Angeles Review of Books, Hannah Harris Green sums up the situation really well. An excerpt:

“The lack of human qualities in Emily — more so than the lack of appealing qualities — really disturbed me, especially in the context of a book that consistently views women with disdainful and predatory eyes. Zafar notes that women lose most of their beauty after the age of 18, and that older women in literature are given more credit than they deserve for being ‘feisty’ and ‘strongheaded’—’things which if found in a man would scarcely get a mention.’ Elsewhere, Zafar explains that many women wait too long to become mothers in order to focus on their careers. A reader begins to doubt that these passages are simply the thoughts of a misogynistic character when they are buttressed by long descriptions of body shape and unbuttoning of blouses that come whenever a female character is introduced. We are assured within a few sentences that any new female is both beautiful and thin — as if even the possibility of imagining another kind of woman would be offensive to the reader. It’s typical for a woman to be introduced in this way: ‘It would be disingenuous of me not to confess that what was most striking about Lauren were her breasts. I would have bet my bottom dollar it was a push-up bra that made for the flawless curves.’ The narrator, who is responsible for this one, at least admits several times that he has a shallow personality, but of course Zafar is no better. In Afghanistan, he meets the director of an international microfinance organization. We get a thorough description of how her outfit highlights her curves, before he notes that her ‘name was fit for a porn star.’

It’s tempting to think that the absence of a female perspective to combat this heavily male gaze is intentional — an extent of the novel’s conceit that all people have their blind spots, even someone as scrupulous and multifaceted as Zafar. But the evidence isn’t there — not in the book itself, nor in the author’s interactions with the press. In an interview with Guernica Magazine last month, Rahman admitted that his choice to use the first person did limit him somewhat, as it forced him to exclude anything that the narrator himself doesn’t perceive. But here is the example he gives of an element he regretted cutting: ‘For instance, I have a passage in which the narrator retells the story of a cherished bicycle he had as a boy that disappeared. He gets through the story without seeing that his mother was having an affair because his eye is on the bicycle.’ So the first person narrative forced Rahman to exclude not a female point of view, but yet another instance of a woman behaving in a deceptive way.”


Ken Kalfus, author of the wonderful short story collection Thirst, among other books, has written an n+1 piece in which he advocates for an unmanned mission to Alpha Centauri and strongly doubts the plausibility of a mission to Mars–soon or perhaps ever–let alone the planet’s colonization. Noting soaring costs, low political will and high radiation levels, he believes there will be no refuge for us in outer space from Earth’s ultimate rejection of our species. All of that is true, and pipe dreams like Mars One won’t get off the ground, though ever is a mighty long time. An excerpt:

A half-century after the conclusion of the Apollo mission, we have entered a new age of space fantasy—one with Mars as its ruling hallucination. Once again stirring goals have been set, determined timetables have been laid down, and artist’s renderings of futuristic spacecraft have been issued. The latest NASA Authorization Act projects Mars as the destination for its human spaceflight program. Last month’s successful test flight of the Orion space vehicle was called by NASA Administrator Charles Bolden ‘another extraordinary milestone toward a human journey to Mars.’ The space agency’s officials regularly justify the development of new rockets, like the Space Launch System, as crucial to an eventual Mars mission.

But human beings won’t be going to Mars anytime soon, if ever. In June, a congressionally commissioned report by the National Research Council, an arm of the National Academy of Sciences and the National Academy of Engineering, punctured any hope that with its current and anticipated level of funding NASA will get human beings anywhere within the vicinity of the red planet. To continue on a course for Mars without a sustained increase in the budget, the report said, “is to invite failure, disillusionment, and the loss of the longstanding international perception that human spaceflight is something the United States does best.”

The new report warns against making dates with Mars we cannot keep. It endorses a human mission to the red planet, but only mildly and without setting a firm timetable. Its “pathways” approach comprises intermediate missions, such as a return to the moon or a visit to an asteroid. No intermediate mission would be embarked upon without a budgetary commitment to complete it; each step would lead to the next. Each could conclude the human exploration of space if future Congresses and presidential administrations decide the technical and budgetary challenges for a flight to Mars are too steep.

The technical and budgetary challenges are very steep. A reader contemplating them may reasonably wonder if it’s worth sending people to Mars at all.•


Via the wonderful Longreads, I came across Geoff Manaugh’s 2013 Cabinet piece about Los Angeles’s 1990s reputation as bank robbery capital of the world, which includes an extended meditation on the inscrutable and illegal exploits of the “Hole in the Ground Gang,” which attempted to mole its way to millions. The opening:

“In the 1990s, Los Angeles held the dubious title of ‘bank robbery capital of the world.’ At its height, the city’s bank crime rate hit the incredible frequency of one bank robbed every forty-five minutes of every working day. As FBI Special Agent Brenda Cotton—formerly based in Los Angeles but now stationed in New York City—joked at an event hosted by Columbia University’s school of architecture in April 2012, the agency even developed its own typology of banks in the region, most notably the ‘stop and rob’: a bank, located at the bottom of both an exit ramp and an on-ramp of one of Southern California’s many freeways, that could be robbed as quickly and as casually as you might pull off the highway for gas.

In his 2003 memoir Where The Money Is: True Tales from the Bank Robbery Capital of the World, co-authored with Gordon Dillow, retired Special Agent William J. Rehder briefly suggests that the design of a city itself leads to and even instigates certain crimes—in Los Angeles’s case, bank robberies. Rehder points out that this sprawling metropolis of freeways and its innumerable nondescript banks is, in a sense, a bank robber’s paradise. Crime, we could say, is just another way to use the city.

Tad Friend, writing a piece on car chases in Los Angeles for the New Yorker back in 2006, implied that the high-speed chase is, in effect, a proper and even more authentic use of the city’s many freeways than the, by comparison, embarrassingly impotent daily commute—that fleeing, illegally and often at lethal speeds, from the pursuing police while being broadcast live on local television is, well, it’s sort of what the city is for. After all, Friend writes, if you build ‘nine hundred miles of sinuous highway and twenty-one thousand miles of tangled surface streets’ in one city alone, you’re going to find at least a few people who want to really put those streets to use. Indeed, Friend, like Rehder, seems to argue that a city gets the kinds of crime appropriate to its form—or, more actively, it gets the kinds of crime its fabric calls for.

Of course, there are many other factors that contribute to the high incidence of bank robbery in Los Angeles, not least of which is the fact that many banks, Rehder explains in his book, make the financial calculation of money stolen per year vs. annual salary of a full-time security guard—and they come out on the side of letting the money be stolen. The money, in economic terms, is not worth protecting.”


Before Barnes & Noble added couches and coffee and before those amenities were disappeared brick by brick and mortar by mortar by Amazon, there was a vast and very unwieldy version of the store near Rockefeller Center which sold remainder copies of Evergreen and Grove Press paperback plays for a buck. That’s how I came to Eugene Ionesco, Harold Pinter, Edward Albee, Bertolt Brecht and Samuel Beckett, the latter of whom was an absurdist as well as chauffeur to a pre-wrestling Andre the Giant. I can’t imagine a more trying dramatist to act for than Beckett, but Billie Whitelaw tried and succeeded. The go-to thespian for the Godot author just passed away. Here’s an excerpt from her Economist obituary:

“For 25 years she was the chosen conduit for the 20th century’s most challenging playwright, the author of Waiting for Godot. She played Winnie in Happy Days, buried up to her waist in sand, carefully turning out her bag as she babbled away; the Second Woman in Play, the role in which Beckett first saw her at the Old Vic in 1964, enveloped in an urn with her face slathered with oatmeal and glue; May in Footfalls, communing with her absent mother while endlessly pacing a thin strip of carpet; and, in Rockaby, an ancient woman listening to her own voice as she slowly rocked herself to death.

She never pretended to understand these plays. She just thought of them as a state of mind, something she could recognise in herself. That was what Sam wanted: no interpretation, just perfection. If, almost unwittingly—for she wasn’t good at words, couldn’t spell and seldom read books—she replaced an ‘Oh’ with an ‘Ah,’ or paused minutely too long, upsetting the rhythm of his music, she would hear his murmured ‘Oh Lord!’ from the stalls, and see his head fall to his hands. He was always her best, gentlest and most exacting friend. In a way they were like lovers, walking arm in arm when she visited him in Paris, and rehearsing in her kitchen close up, she speaking directly into his pale, pale, powder-blue eyes, as he whispered the lines along with her. When he died, in 1989, she felt that part of her had been cut away.

Stutterer, chatterbox

It seemed unbelievable that it was her voice in Beckett’s mind when he wrote. It was nothing special to her. She had a Yorkshire accent, reflecting her Bradford childhood, but after a run of early TV typecasting in ‘trouble at t’mill’ dramas it had become residual, like her fondness for meat pies and Ilkley Moor. Her northern roots showed mostly in her liking for blunt, straight talk. At 11, after her father died, she had developed a stutter, which her mother thought might be cured by taking up acting. The cure worked so well that she became a staple on BBC radio’s Children’s Hour playing rough-voiced boys at ten shillings a time, and at 14 started to act for Joan Littlewood’s Theatre Workshop. Any challenge or crisis, though, could bring the stutter back, together with paralysing stage-fright. When she played Desdemona to Laurence Olivier’s Othello at the National Theatre, in 1963, she could hardly stop her voice trembling.

Small wonder she was nervous. She had never read Shakespeare then, and had had no classical training. Her years in rep had mostly consisted of playing dizzy blondes, busty typists and maids.”

_____________________________

Whitelaw as “Winnie” in Happy Days:


A piece of Ken Kesey and Jerry Garcia being interviewed by Tom Snyder in 1981. Along with the author’s infamous Acid Tests, government-run LSD experiments in 1960s Palo Alto are also a topic of conversation. After some jesting, Kesey gives a very candid response to the question of whether drugs had injured him: “You don’t get anything for free.”


Margaret Atwood, a deservedly towering literary figure, just did an Ask Me Anything at Reddit. One question about the prophetic nature of her 1985 dystopian novel, The Handmaid’s Tale:

Question:

A lot of Dystopian Fiction from decades ago have had their fears in some ways realized in the modern day:

  • Fahrenheit 451 and the way modern people are glued to their forms of entertainment via smartphones, iPads, computers and television (and as a result there has been a very big turn towards soundbyte-simplified political and social discourse).
  • 1984 and the ubiquitous nature of government surveillance. People just shrug it off as expected with each new NSA scandal.

In what ways do you think the Handmaid’s Tale has been prophetic? What things are you sad to see come to fruition with regard to women’s rights and religious extremism in the Western/American world that you tried to warn us about?

Margaret Atwood:

Hmm, that’s a snake pit. The HM Tale was practically a meme during the last presidential election, due to the Four Unwise Republicans who opened their mouths and said what was on their minds in relation to Unreal Rape and the ability of a raped woman’s body to somehow Just Not Get Pregnant. (Tell that to the all the raped Bangladeshi women who hanged themselves at the Rape Camp where they were kept.) At this time, several states have enacted laws that make it quite dangerous for women to be pregnant in them, because if they lose the baby, or are even suspected of ThoughtCrime — being maybe About to lose the baby — they can be tried for some form of murder or attempted murder. That is, if the New York Times is to be believed. There will be ongoing contention in this area, because people hate to be forced to choose between two things, both of which they consider bad. Stay tuned. If motherhood really Were respected, of course, mothers-to-be would be offered free housing, proper nutrition, and ongoing care and support once the baby was born. But I don’t see any states standing ready to put that in place. With the poverty rates what they are, there would be a lineup for miles.•


A person isn’t merely a “satchel full of dung” as Bishop John Fisher argued in 1535, the year he was beheaded by King Henry VIII, but a surfeit of pride is just as bad as one of shame, maybe worse. In the middle of the last century, psychiatry began trying to convince us we weren’t sinners but winners, as the “self-esteem movement” kickstarted with good intentions by Dr. Abraham Maslow began to take hold, even if there wasn’t much hard data to support its efficacy. Dissent eventually came from controversial research psychologist Roy Baumeister, son to a father driven by immense self-importance. The opening of Will Storr’s Matter piece, “The Man Who Destroyed America’s Ego”:

“FOR MUCH OF HUMAN HISTORY, our beliefs have been based on the assumption that people are fundamentally bad. Strip away a person’s smile and you’ll find a grotesque, writhing animal-thing. Human instincts have to be controlled, and religions have often been guides for containing the demons. Sigmund Freud held a similar view: Psychotherapy was his method of making the unconscious conscious, helping people restrain their bestial desires and accord with the moral laws of civilization.

In the middle of the 20th century, an alternative school of thought appeared. It was popularized by Carl Rogers, an influential psychotherapist at the University of Chicago, and it reversed the presumption of original sin. Rogers argued that people are innately decent. Children, he believed, should be raised in an environment of ‘unconditional positive regard.’ They should be liberated from the inhibitions and restraints that prevented them from attaining their full potential.

It was a characteristically American idea—perhaps even the American idea. Underneath it all, people are good, and to get the best out of themselves, they just need to be free.

Economic change gave Rogers’s theory traction. It was the 1950s, and a nation of workmen was turning into a nation of salesmen. To make good in life, interpersonal sunniness was becoming essential. Meanwhile, rising divorce rates and the surge of women into the workplace were triggering anxieties about the lives of children born into the baby boom. Parents wanted to counteract the stresses of modern family life, and boosting their children’s self-esteem seemed like the solution.

By the early 1960s, wild thinkers in California were pushing Rogers’s idea even further. The ‘human potential movement’ argued that most people were using just 10 percent of their intellectual capacity. It leaned on the work of Abraham Maslow, who studied exceptional people such as Albert Einstein and Eleanor Roosevelt and said there were five human needs, the most important of which was self-actualization—the realization of one’s maximum potential. Number two on the list was esteem.

At the close of the decade, the idea that self-esteem was the key to psychological riches finally exploded. The trigger was Nathaniel Branden, a handsome Canadian psychotherapist who had moved to Los Angeles as a disciple of the philosopher Ayn Rand. One of Rand’s big ideas was that moral good would arise when humans ruthlessly pursued their own self-interest. She and Branden began a tortuous love affair, and her theories had an intense impact on the young psychotherapist. In The Psychology of Self-Esteem, published in 1969, Branden argued that self-esteem ‘has profound effects on a man’s thinking processes, emotions, desires, values and goals. It is the single most significant key to his behavior.’ It was an international bestseller, and it propelled the self-esteem movement out of the counterculture and into the mainstream.”

___________________________________

A 30-minute 1971 film about Maslow’s philosophical descendants.


When Thomas Pynchon, famously fame-resistant, won the 1974 National Book Award, Professor Irwin Corey accepted on his behalf, offering up his usual high-low mishegoss, the perfect patter to represent the novelist. The amazing Paul Thomas Anderson has based his latest film on the Pynchon novel Inherent Vice. In a new Guardian profile by Mark Kermode, the director is asked about his relationship with the incognizable author. An excerpt:

“One thing Pynchon doesn’t have is a public profile. He is famously camera-shy (even his fleeting Simpsons cameos placed a cartoon paper-bag on his head), and Anderson seems determined not to throw any light on his rumoured involvement with the movie. Although Joaquin Phoenix has stated that Anderson talked regularly with Pynchon, my questions about meeting the author are met with uncharacteristic evasion.

‘He doesn’t meet people,’ Anderson deadpans. ‘I don’t know if he even exists.’

So you don’t know what he thinks of the film?

‘I can only hope that he’s happy it…’

But you didn’t deal directly with him?

‘No, no, no. I just… I just stay out of it. I just try to work with the book, you know, and to treat the book as a collaborator.’

He looks me in the eye, daring me to try again. I mention the rumour (confirmed by Josh Brolin) that Pynchon visited the set and can in fact be glimpsed in the movie.

‘Well, that’s like those stories about B Traven [the mysterious author of Treasure of the Sierra Madre, who believed that “the creative person should have no other biography than his works”]. No one ever knew who Traven was, and these pages would supposedly appear under [the director] John Huston’s door with notes and stuff. Or they’d be on the set and look over and there’d be a guy with a hat and sunglasses, and they’d all be going, ‘Is that B Traven? Is that him?’ So it’s all very mysterious to talk about Pynchon, but I tread delicately because he doesn’t want anything to do with all this, and I just have so much respect for him. I hope I can be like him when I grow up.’”


I know writing–especially journalism and serious novels–is supposed to be dying, but I think 2014 was one of the richest years in memory for top-notch articles and books of all kinds. The following father-son conversation is one of my favorite passages from Zia Haider Rahman’s In the Light of What We Know, one of those aforementioned excellent pieces of long fiction. It’s about how narratives of just a few words can liberate or ensnare an animal or a people or a nation, fairly or unfairly.

_______________________________

I have a question for you. You know what the most dangerous thing in the world is?

What? I asked.

A story, replied my father. I’m not kidding. Stories are dangerous. And I don’t mean stories whose messages are capable of endangering. I mean that the form itself is dangerous, not the content. You know what a metaphor is? A story sent through the super distillation of the imagination. You know what a story is? An extended metaphor. We live in them. We live in this swirling mass of stories written by scribes hidden in some forgotten room up there in the towers. The day someone thought of calling pigeons flying rats was the day the fate of pigeons was sealed. Does anyone who hears them called flying rats stop to ask if pigeons actually carry disease? Or Plato’s cave. If a fellow knows nothing else about the man, he knows something about a cave and shadows. You’ve heard that good fences make good neighbors, but did you know when Robert Frost wrote those words he meant the opposite of what that phrase has come to stand for? Frost was being ironic; he was talking about the things that divide us. But the image contained in the bare words Good fences make good neighbors–that image is so good, so vibrant, that in our minds, in the minds of so many, it’s broken free of its unspoken ironies.•


Mark Twain, America’s second greatest comic ever in my estimation (after George Carlin), died of a heart attack 104 years ago. He lived a life writ large, won fame and lost fortunes, and, most importantly, reminded us what we could be if we chose to live as one, traveling as he did from Confederate sympathizer to a place of enlightenment. I think of Twain what I thought of Pete Seeger and Odetta when they died: You can’t really replace such people because they have the history and promise of the nation coursing through their veins. He was eulogized in the April 22, 1910 Brooklyn Daily Eagle; the opening sections excerpted below follow him from birth to his emergence as a “stand-up” and his shift to author of books.

If you put a gun to my head and asked what I thought was the best novel ever written in English, I would think you were crazy. Why are you pointing a gun at my head?!? Why not just ask me without the threat of murder?!? Do you want me to call the police?!?

After you were disarmed and arrested, I would think about the question again and just as likely choose Lolita, Vladimir Nabokov’s tale of monstrous love, as anything else. The language is impeccable, amazingly weighty and nimble all at once, and the book overall is both profoundly funny and sad.

Art is one thing, however, but life another. The book’s main inspiration may have been von Lichberg or it may have been a very real horror, a 1940s NYC child abduction perpetrated by a felon in a fedora named Frank La Salle. (Or perhaps it was a combination of the two.) Via Longreads, a passage from “The Real Lolita,” an historical inquiry by Sarah Weinman published at the Penguin Random House blog:

Nabokov said he conjured up the germ of the novel—a cultured European gentleman’s pedophilic passion for a 12-year-old girl resulting in a madcap, satiric cross-country excursion—’late in 1939 or early in 1940, in Paris, at a time when I was laid up with a severe attack of intercostal neuralgia.’ At that point it was a short story set in Europe, written in his first language, Russian. Not pleased with the story, however, he destroyed it. By 1949, Nabokov had emigrated to America, the neuralgia raged anew, and the story shifted shape and nagged at him further, now as a longer tale, written in English, the cross-country excursion transplanted to America.

Lolita is a nested series of tricks. Humbert Humbert, the confessing pervert, tries so hard to obfuscate his monstrosities that he seems unaware when he truly gives himself away, despite alleging the treatise is a full accounting of his crimes. Nabokov, however, gives the reader a number of clues to the literary disconnect, the most important being the parenthetical. It works brilliantly early on in Lolita, when Humbert describes the death of his mother—’My very photogenic mother died in a freak accident (picnic, lightning) when I was three’—or when he sights Dolores Haze in the company of her own mother, Charlotte, for the first time: ‘And, as if I were the fairy-tale nurse of some little princess (lost, kidnaped, discovered in gypsy rags through which her nakedness smiled at the king and his hounds), I recognized the tiny dark-brown mole on her side.’ The unbracketed narrative is what Humbert wants us to see; the asides reveal what is really inside his mind.

Late in Lolita, one of these digressions gives away the critical inspiration. Humbert, once more in Lolita’s hometown after five years away, sees Mrs. Chatfield, the “stout, short woman in pearl-gray,” in his hotel lobby, eager to pounce upon him with a “fake smile, all aglow with evil curiosity.” But before she can, the parenthetical appears like a pop-up thought balloon for the bewildered Humbert: “Had I done to Dolly, perhaps, what Frank Lasalle [sic], a fifty-year-old mechanic, had done to eleven-year-old Sally Horner in 1948?”•

_______________________________

“I think the book is shocking…I’m glad that it’s shocking.”


