In an article by Stuart Dredge at the Guardian, Google’s Eric Schmidt holds forth on totalitarian regimes trying to control what they cannot stop: the Internet. The opening:
“Dictators are taking a new approach in their responses to use of the internet in popular uprisings, according to Google’s executive chairman Eric Schmidt.
‘What’s happened in the last year is the governments have figured out you don’t turn off the internet: you infiltrate it,’ said Schmidt, speaking at the SXSW conference in Austin, Texas.
‘The new model for a dictator is to infiltrate and try to manipulate it. You’re seeing this in China, and in many other countries.’
Schmidt was interviewed on-stage alongside Jared Cohen, director of the company’s Google Ideas think tank. The session, moderated by Wired journalist and author Steven Levy, took the pair’s The New Digital Age book as its starting point.
Levy wondered whether their enthusiasm for technology’s potential role in popular uprisings has been dampened in the last year by events in Egypt, the Ukraine and elsewhere.
‘We’re very enthusiastic about the empowerment of mobile phones and connectivity, especially for people who don’t have it,’ said Schmidt. ‘In the book, we actually say that revolutions are going to be easier to start, but harder to finish.’
He suggested that governments have realised that simply trying to block internet access for citizens is unlikely to end well – partly because it shows that they’re ‘scared’ – which may encourage more people onto the streets, not less. Hence the infiltration approach.”
It’s hard to overemphasize just how much our world looks the way it does because of typographer Mike Parker, the “godfather of Helvetica,” who just passed away. From his obituary in the Economist:
“Of the more than 1,000 types he developed, his greatest success was Helvetica. It was he who adjusted it, or corralled it, to the needs of the obdurate, cranky, noisy Linotype machines which then printed almost everything in America. Originally it was the brainchild of a Swiss designer, Max Miedinger, who devised it in 1956. In contrast to the delicate exuberance of 16th-century types, Helvetica was plain, rigidly horizontal—and eminently readable. It became, in Mr Parker’s hands, the public typeface of the modern world: of the New York subway, of federal income-tax forms, of the logos of McDonald’s, Microsoft, Apple, Lufthansa and countless others. It was also, for its clarity, the default type on Macs, and so leapt smoothly into the desktop age.
Not everyone liked it. He did not always like it himself: as he roared around Brooklyn or Boston, opera pumping out at full volume from his car, he would constantly spot Helvetica being abused in some way, with rounded terminals or bad spacing, on shopfronts or the sides of trucks. But far from seeing Helvetica as neutral, vanilla or nondescript, he loved it for the relationship between figure and ground, its firmness, its existence in ‘a powerful matrix of surrounding space.’ Type gave flavour to words: and this was a typeface that gave people confidence to navigate through swiftly changing times.”
In his latest book, The Future of the Mind, physicist Michio Kaku focuses not on antimatter but on gray matter. He just did an Ask Me Anything at Reddit, answering several queries about the nature and future of consciousness and the corporeal. A few exchanges follow.
If I can make it 50 more years, will we be able to slice up my brain and cram my consciousness into a machine? That’d be swell.
Dr. Michio Kaku:
By midcentury, we may have Brain 2.0, a backup copy of the brain, the byproduct of the ambitious BRAIN project of Pres. Obama and the European Union. Hence, when we die, our Connectome and Genome still survive. So our consciousness does not have to die when we die. And this consciousness, I write, may be placed on laser beams and sent into outer space. This might be the most efficient way to explore the universe, as laser beams carrying our consciousness into outer space.
I remember watching a documentary you made for the BBC on extending life expectancy in humans – do you still follow recent advances in this field and if so, can you tell us what excites you most recently in this particular area?
Dr. Michio Kaku:
We are slowly isolating the genes involved with the aging process. We do not have the fountain of youth, but I think, in the coming decades, we will unravel the aging process at the genetic level. For example, we share 98.5% of our genes with the chimps, yet we live twice as long.
We will find these genes very soon that doubled our life span. However, I don’t [think] the current generation will be able to slow and stop aging. Our grandkids, however, may have a shot at it.
Of all the things you have covered, what are you looking forward to the most that you expect to happen within the next 20 years?
Dr. Michio Kaku:
There are so many wonders awaiting us. If we can upload memories, then we might be able to combat Alzheimer’s, as well as create a brain-net of memories and emotions to replace the internet, which would revolutionize entertainment, the economy, and our way of life. Maybe even to help us live forever, and send consciousness into outer space.
Dr. Kaku – do you think that consciousness is created entirely in the physical matter of the brain or does man possess a soul or some non-physical entity that survives death?
Dr. Michio Kaku:
A soul might very well exist, but we, as physicists, try to measure and quantify everything. So far, no one has been able to create an experiment to do this for the soul. Efforts have been made to weigh the body after death, but each time we find no evidence of a soul. So a soul may very well exist, but it is not a testable theory.
As Bryan Cranston takes on the drawl of a lifetime on Broadway, here’s LBJ in 1965 attending the Houston stop of a Rev. Billy Graham crusade, a tricked-out, latter-day revival show for the television-and-arena age.
From the LBJ Library, a transcript of a 1964 telephone call between the President and the preacher:
“[Graham is on hold 0:35 at beginning of call]
Secretary: Dr. Graham on nine-one
BG: Mr. President?
LBJ: Hello, Billy. How are you, my friend?
BG: Well, God bless you. I was telling Bill [Moyers] that last night I couldn’t sleep, and I got on my knees and prayed for you that the Lord would just give you strength.
LBJ: I told my sweet wife last night–we got mental telepathy–I said that if I didn’t think I’d embarrass him [Graham], I’d say, “Please, dear Lord, I need you more than I ever did in my life. I got the Russians on one side of me taking after, the Chinese are dropping bombs around contaminating the atmosphere, and the best man I ever knew, uh, had a stroke and disease hit him, and I’ve been tied in here with my Cabinet all day…and I’d have Him just make him come down and spend Sunday with me.”
BG: Well, bless your heart, I’ll be glad to. I told Bill that there were two things. One was I just felt terribly impressed to tell you to slow down a little bit. I’ve been awfully worried about you physically.
BG: And then the second thing–you’ve got this election, in my opinion, wrapped up and you’ve got it wrapped up big. There’s no doubt in my mind about that. And then the second thing, you know when Jesus dealt with people with moral problems, like dear Walter [Jenkins] had–and I was telling Bill I wanted to send my love and sympathy to him–
LBJ: Thank you–
BG: –He always dealt tenderly. Always. This is the way He handled it. And that’s the way I feel about it. I know the weaknesses of men, and the Bible says we’re all sinners and we’re all involved one way or another, and I just hope that if you have any contact with him, you’ll just give him my love and understanding.
LBJ: Well, that’ll mean more than anything. Come down here Saturday evening and have dinner with us, and let’s have a quiet visit and maybe have a little service Sunday morning in the White House itself.
BG: Well, I’ll be very happy to. I told Bill that my wife couldn’t come because she’s in bed sick with the flu.
LBJ: Oh, gosh, I’m sorry–
BG: I’m up in Maine and I’m traveling all over New England in different towns, preaching every night in a different town.
LBJ: Oh, wonderful, wonderful. Well, I know you’re doing a lot of good, and I’ll look forward to seeing you Saturday. You just come, bring your bag on in and call Bill and tell him what time you’ll be in so they can have a gate, and we’ll send a car for you.
BG: One of my associates, T.W. Wilson, the brother of the fellow that you met before–
LBJ: Bring him with you.
LBJ: I want him with you. I want anybody. And we’ll just have a good visit and I’ll feel stronger next week.”
Those jobs that both humans and robots can do will be ceded soon enough to the machines, which is good in the long term but worrisome right now for employment. New fields will be created in a post-manufacturing society, but when will they arrive? Regardless, it seems we have no choice but to explore this brave new world. It’s as compelling as Manifest Destiny or the Space Race. It seems evolutionary in an almost biological sense. From a Bloomberg report about Google and robots:
“Google Inc. Chairman Eric Schmidt said his company is experimenting with automation in ways that will ‘replace a lot of the repetitive behavior in our lives.’
‘We’re experimenting with what automation will lead to,’ Schmidt said yesterday at a conference in Santa Monica, California. ‘Robots will become omnipresent in our lives in a good way.’
Google is pushing ahead with products beyond its core search business for new sources of user traffic and revenue in areas such as mobile and online video. The company also has shown a willingness to make bets on longer-term projects, such as wearable technology, robotics and driverless cars.
‘The biggest thing will be artificial intelligence,’ Schmidt said at Oasis: The Montgomery Summit. ‘Technology is evolving from asking a question to making a relevant recommendation. It will figure out things you care about and make recommendations. That’s possible with today’s technology.’”
Growing up in New York City, you would hear periodically about Kitty Genovese, a Queens resident brutally murdered as she screamed for help in earshot and view of her neighbors who did nothing to aid her. None of the dozens called the police. It was a horrifying story, repeated again and again, about a desensitized city full of unfeeling citizens, except that many of these “facts” were erroneous and the larger hypothesis was likely wrong.
There weren’t nearly that many witnesses to the visible portion of the crime, likely a half-dozen at most who understood what was happening. One neighbor briefly frightened away the attacker (who later returned), a couple called the police and another went to the victim and cradled her until the ambulance arrived. And as I was reminded recently when I read Adam Alter’s very worthwhile book, Drunk Tank Pink, subsequent psychological studies have shown that strange non-reactions or limited reactions by numerous bystanders to distress aren’t necessarily a matter of apathy. The presence of so many eyewitnesses makes it less likely that any individual one will act. Everyone assumes somebody else will take care of things. Reaction is slowed and sometimes paralyzed by sheer numbers. It’s the “bystander effect.”
But it took many years for truth and good research to really challenge the narrative of the story, which had seemingly been written in stone and sold as a harrowing trend. How did it become so? One editor working from a high perch, A.M. Rosenthal of the New York Times, was largely responsible (or irresponsible). In a New Yorker review of just-published books about the murder, Nicholas Lemann reminds us that a journalistic disregard for context and proportion can cause a random event to be mistaken for a sign of the times. An excerpt:
“In 1964, Rosenthal was forty-one years old and relatively new on the job as the newspaper’s metropolitan editor, an important step in his ascent to a seventeen-year reign over the Times’ newsroom. Ten days after Genovese was killed, he went downtown to have lunch with New York City’s police commissioner, Michael Murphy. Murphy spent most of the lunch talking about how worried he was that the civil-rights movement, which was at its peak, would set off racial violence in New York, but toward the end Rosenthal asked him about a curious case, then being covered in the tabloids, in which two men had confessed to the same murder. He learned that one of the competing confessors, Winston Moseley, had definitely murdered a woman in Kew Gardens, Kitty Genovese. That killing had been reported at the time, including in a four-paragraph squib buried deep within the Times, but Murphy said that what had struck him about it was not the crime itself but the behavior of thirty-eight eyewitnesses. Over a grisly half hour of stabbing and screaming, Murphy said, none of them had called the police. Rosenthal assigned a reporter named Martin Gansberg to pursue the story from that angle. On March 27th, the Times ran a front-page story under a four-column headline:
The following day, the Times ran a reaction story in which a procession of experts offered explanations of what had happened, or said that it was inexplicable. From then on, the story—as they wouldn’t have said in 1964—went viral.”
Andy Greenwald has a really good article at Grantland about comedy in the time of Twitter and Instagram, inspired by Jimmy Fallon’s attempt to be crowned the new King of Late Night, but I don’t know that I agree with his conclusion about contemporary comics being “transparent.” The long tail of distribution and the decentralization of media have made for more opportunities to pursue our dreams, even if most of those positions pay far less or nothing at all. Comics, like anyone else in media, need to place advertisements for themselves on as many channels as possible. But I don’t think that means that we get to see the real person any more now than we have in the past, except for the rare slip-up. Ubiquity is one thing but reality another. And our so-called Reality TV era has very little to do with being real. It’s still scripted, just with worse writing.
Fallon seems to be a younger and handsomer version of Jay Leno: a machine-like dispenser of entertainment who reveals very little of his real self except for the aspects he wants to stress in order to connect with his audience. If anything, he’s smoother, not as rough around the edges, having knocked about less, never having been homeless and arrested for vagrancy the way Leno was when he was trying to make his way in the L.A. stand-up scene. That’s not an insult to Fallon. There’s nothing wrong with him creating an image for himself, but it never feels particularly revelatory on a personal level. We may see Fallon and his peers in the media constantly now, but constancy doesn’t necessarily reduce distance, and being more connected doesn’t really mean we’re any closer. From Greenwald:
“When Tina Fey was a guest on Comedians in Cars Getting Coffee, she was friendly but reticent, wondering aloud how she’d continue to ‘stay opaque.’ But these days, transparency is a requirement for a young comedian. Audiences don’t want to be told jokes, they want to be in on them. Flubs and falls are endearing; seeing the cracks is what cracks people up. Jimmy Fallon has proven himself to be the ideal comedian for this moment because he understands that being funny is now a full-time gig, that oversharing is just another way of being generous. Forget leaving them wanting more: Fallon can’t ever leave them at all. He always has to be on, and so too does his show, tweeting out gags, offering up videos, and, with the help of the incomparable Roots, making Studio 6A feel like a madcap launching pad for creativity and joy, not just a destination for A-listers with projects to push. From across a generational divide and, for now at least, several tax brackets, Jerry Seinfeld and Jimmy Fallon seem to have reached the same conclusion at exactly the right moment. Comedy has a new mantra and it’s working like gangbusters: Always let them see you sweat.”
Room service prices, high enough to make your brain explode, are a major money maker for hotels, right? Apparently that’s not the case. From Zachary Crockett’s Priceonomics post, which explains why this amenity costs so much and creates so little revenue:
“With numbers like this, you’d think hotels would make a killing off of late night hunger pains. On the contrary, most hotels actually lose money on the service; for major chains, it’s neither practical nor lucrative.
Robert Mandelbaum, director of information services for PKF Hospitality Research, says room service only accounts for 1% of the typical hotel’s revenue. In addition, room service is on a rampant decline: in 2007, average yearly revenue per room was $1,150; today, it’s only $866 — about $2.37 in room service charges per room per day. While the number of hotel guests overall has risen in the last six years, room service use has fallen off 25 percent.
Mandelbaum likens room service to other hotel offerings, like a pool:
‘Ninety percent of people will say they want to stay at a hotel with a pool, even though only 10 percent will actually use it.’
But room service isn’t just an economic non-factor for hotels — it’s grossly inefficient.”
“In 1895, when electric pleasure cars were new, a certain manufacturer noted with alarm that these strange vehicles running around through the streets frightened horses, then unused to such a spectacle. So this enterprising man, with a touch of imagination, constructed a model on the dashboard of which were attached the head and shoulders of a horse. This he believed would reassure his equine brothers.”
I don’t worry too much about the human loss of skill in driving cars or piloting planes if those things are made autonomous. Before America was fully developed, people used to know many home remedies to treat ailments, but eventually those practices, some of which weren’t quackery, were forgotten as we entrusted ourselves during emergencies to the technology of hospitals. But in order for us to give up “ownership” of our medical care, the technology had to reach a saturation point so that we could feel certain it would be there for us at moments of need.
Similarly, the great dream of many technologists who work in the driverless-car sector is that citizens won’t only give up the wheel but also ownership. A fleet of on-demand autonomous taxis can certainly replace cabs with human drivers and convoys of programmed delivery trucks are likely to be welcomed by most (though not by truck drivers), but will the American impulse to own, to possess, material goods be usurped by algorithms? From a post by Brad Templeton, a Google consultant, about what the government can do to aid in the development of robocars:
“Many of the big effects of this technology on cities, energy, parking, carsharing, delivery and more come about only when the vehicles can operate unmanned to deliver themselves to users, to store themselves and to refuel/recharge themselves. The lifesaving benefit of superior driving with passengers and the timesaving benefit of recovering productive time are great, but are only part of the story. Take special care to assure what you do doesn’t inhibit the deployment of safe empty vehicles.”
I’ve noticed that the newer subway cars on the track send out a strange beeping noise just before they pass by. This noise actually happens at just about the same time my computer crashes or the TV freezes. I was wondering if anyone else has noticed this intrusion of the MTA equipment.
John Adams wasn’t thinking specifically of technology when he said the following, but he might as well have been: “I must study politics and war, that my sons may have the liberty to study mathematics and philosophy, geography, natural history, and naval architecture, navigation, commerce, and agriculture, in order to give their children a right to study painting, poetry, music, architecture, statuary, tapestry and porcelain.”
Eventually, and not too far off in the future, Amazon won’t need any human pickers to pull items from their inventory shelves, 3-D printers will allow ideas to spring fully grown from our heads and no one will have much use for a human taxi driver. It will all be AI. That’s great, though it does pose some new problems. Chiefly, how do we reconcile what’s largely a free-market economy with one that’s “post-jobs” to a certain degree, at least if we’re talking about the type of traditional work that our society is built on? We may become wealthier as a people, but how does that wealth reach the people? That could make for a messy transition, and to some extent it already has. Another question is what do we do with ourselves if toil is a thing of the past, and other challenges we thought were our own are assumed by silicon?
From Ray Kurzweil’s site, an exchange between a reader and the futurist about molecular assemblers, which may take building out of our hands, freeing them perhaps for painting and statuary but more likely for some yet unknown tasks:
Suppose molecular assemblers are indeed proven to be feasible on a large scale and we are given an infinite abundance to produce as much as we want — limited only by the amount of matter in our vicinity — with minimal effort.
If this scenario comes to fruition, how will humans be able to cope with the lack of challenges in their lives? It seems like with assemblers there will be very little incentive to do anything.
Since everything could be obtained effortlessly through assemblers, there appears to be little purpose to hold a job, since all possessions could be obtained for free.
“Future molecular assemblers will make physical things, but not create new knowledge.
We are doubling knowledge about every year and that will remain a challenge requiring increasing levels of intelligence.”
In a 1996 Playboy interview, longtime Angeleno Ray Bradbury provided his predictably sci-fi vision for the future of urban life, in the wake of the Rodney King trial and the subsequent riots. Seemingly unaware that crime had begun to decrease precipitously in American cities (for reasons still impossible to pin down), he shared an awful solution for supposedly unmanageable metropolises. An excerpt:
“If Los Angeles is an indicator for the nation, what is the future of other big cities?
Along with man’s return to the moon, my biggest hope is that L.A. will show the way for all of our cities to rebuild, because they’ve gone to hell and the crime rate has soared. When we can repopulate them, the crime rate will plunge.
What will help?
We need enlightened corporations to do it; they’re the only ones who can. All the great malls have been built by corporate enterprises. We have to rebuild cities with the same conceptual flair that the great malls have. We can turn any bad section of town into a vibrant new community.
How do you convince corporate leaders and bureaucrats that you have the right approach?
They listen because they know my track record. The center of downtown San Diego was nonexistent until a concept of mine, the Horton Plaza, was built right in the middle of bleakest skid row. Civilization returned to San Diego upon its completion. It became the center of a thriving community. And the Glendale Galleria, based on my concept, changed downtown Glendale when it was built nearly 25 years ago. So if I live another ten years – please, God! – I’ll be around to witness a lot of this in Los Angeles and inspire the same thing in big cities throughout the country.”
The Fuller Brush Man, modest an individual though he was, disappeared for the same reason that grandiose World Fairs no longer resonated: The developed world became mobile, and it wasn’t necessary for anyone or anything to come to our doors anymore, even to our town. Nobody was home.
Now mobility itself isn’t even very necessary. We’re home, but it all reaches us through tubes and wires, and soon drones. So places we used to drive to, like this one and this one, keep disappearing. We don’t need here or there today because we’re everywhere and nowhere. A segment from Daniel H. Pink’s To Sell Is Human about the birth of the Fuller Brush company, posted on the very fun Delancey Place blog:
“It all began in 1903, when an eighteen-year-old Nova Scotia farm boy named Alfred Fuller arrived in Boston to begin his career. He was, by his own admission, ‘a country bumpkin, overgrown and awkward, unsophisticated and virtually unschooled’ — and he was promptly fired from his first three jobs. But one of his brothers landed him a sales position at the Somerville Brush and Mop Company — and days before he turned twenty, young Alfred found his calling. ‘I began without much preparation and I had no special qualifications, as far as I knew,’ he told a journalist years later, ‘but I discovered I could sell those brushes.’
After a year of trudging door-to-door peddling Somerville products, Fuller began, er, bristling at working for someone else. So he set up a small workshop to manufacture brushes of his own. At night, he oversaw the mini-factory. By day he walked the streets selling what he’d produced. To his amazement, the small enterprise grew. When he needed a few more salespeople to expand to additional products and new territories, he placed an ad in a publication called Everybody’s Magazine. Within a few weeks, the Nova Scotia bumpkin had 260 new salespeople, a nationwide business, and the makings of a cultural icon.”
While Muhammad Ali was suffering through his Vietnam Era walkabout, he “boxed” retired great Rocky Marciano in a fictional contest that was decided by a computer. Dubbed the “Super Fight,” it took place in 1970. Marciano dropped a lot of weight and donned a hairpiece to provide viewers with some semblance of his younger self. The fighters acted out the computer prognostications, and the filmed result was released in theaters. Marciano awkwardly stumbled onto a great description of this Singularity moment: “I’m glad you’ve got a computer being the man that makes the decision.”
The EconTalk podcast episode that Russ Roberts did with David Epstein, author of The Sports Gene, which I encouraged you to listen to last year, wound up tied for best show of 2013 in a listener vote. If you missed it and want to catch up, go here.
In the latest program, Roberts interviews Moises Velasquez-Manoff, author of An Epidemic of Absence, which examines whether what’s purported to be a sharp spike in autoimmune diseases and allergies in America has been caused by our fervent efforts to cleanse ourselves of parasites and worms. The Food and Drug Administration is considering treatments in which these organisms would be purposely introduced into patients. The host and guest discuss an underground scene that isn’t waiting for FDA approval, in which medicalized hookworms and such are being injected into the sick who wish to gamble on this counter-intuitive medicine.
As a layman, I find it difficult to process any of this without thinking about the recent furor over immunizations, in which junk science convinced some citizens that inoculations caused autism. And even more recently, the supposed advantage of breast feeding over bottle feeding, which has since been largely debunked, changed actual childcare policy in New York City. You have to wonder how much the increase in allergies and autoimmune diseases is the result of better statistical information about the incidences of these illnesses. And even if the rise is legitimate, there obviously could be a multitude of causes.
Listen to the podcast here. An excerpt about the so-called “worm therapy” underground:
“So, let’s talk about the hookworm underground and how it got started. Tell us what it is, this phenomenon of people injecting themselves deliberately with various types of parasites and why did anyone start to think that was a good idea?
Yes. Well, back up. So, in the 1990s, people started thinking about some of the parasite questions I’ve been talking about. Mostly because they understood the immunology. And they understood that parasites suppress the immune system. And they began–and they noticed also some populations that were parasitized, these diseases were far less prevalent. So they began to think: Well, how about we deliberately introduce parasites as a way to cure some of these diseases? It’s an outrageous idea. But then a gastroenterologist named Joel Weinstock, who is now at Tufts U., developed a parasite, and medicalized it so it was in theory safe. The parasite is native to pigs. And the reason he chose this parasite is it cannot reproduce sexually in humans. So that you give it to the person and no one else gets it. That’s the idea. The context, the historical context, is: we spent lots of money in this country getting rid of parasites. The last thing you want to do is reintroduce them to the population, right?
And you talk about how, when people would suggest these transmission mechanisms for allergies and autoimmune problems, the outrage that many in the medical profession, in the fields of science had to the idea that there was something beneficial about this scourge that we had eliminated.
It’s hard to–it’s difficult to accept. It’s emotionally unpleasant. But intellectually, it’s deeply disturbing. It’s like being told: Oh, we always were told to wash our hands, that that’s good for you. And doctors really should wash their hands. But it turns out maybe, sometimes, dirty hands are good for you. That’s horrifying.
As you say, it’s outrageous. So, what happened with this pig worm?
So, he developed it–this is actually in testing right now for FDA (Food and Drug Administration) approval; and I should point out that some of the results–the early results were amazing. They were so impressive. It was like 3 dozen people and a 75% remission rate for Crohn’s Disease. It was unbelievable. And now it’s in testing. And some of the results have been very lackluster, so far. So we don’t really know if it works yet. But in any case, a bunch of underground people are reading this science. I mean, this is published in reputable journals. It makes sense to a certain kind of mindset that’s kind of ecologically and holistically oriented.
And if you have a chronic disease, you’d love to try something different, if whatever you’ve been trying isn’t working reasonably. Right?
Absolutely. I mean, I think actually at some point it’s a rational–it’s a very rational choice.”
In the days before the telegraph and Morse code, let alone radio, TV and the Internet, reports about events that occurred in Europe wouldn’t reach America for several days. A newspaper in New York came up with a novel (and highly irresponsible) way to bridge the information gap: have a clairvoyant tell them what happened. An excerpt from a story in the April 19, 1860 Brooklyn Daily Eagle:
“The New York Daily News has been consulting a clairvoyant on the result of the Prize Fight which all suppose to have been fought by Heenan and Sayers on Monday, and says:
‘A clairvoyant in this city declares that one of the pugilists who yesterday fought for the championship of England has been killed. We have been unable to ascertain which; but the lady inclines to think it is the ‘larger man,’ whether as to the muscle or as to the pugilistic fame we know not. But she is positive one of them is killed. We are, therefore, all the more curious to know the result. It will affect either spiritual seeing or material hitting; which, a few days will tell. The old lady adds that the killed man is not the winner.’”
Video killed the radio star, and the technology of special effects (as well as the franchising of films, economic shifts and globalization) has seriously wounded the movie star. As Robert Downey Jr. heads for billionaire status, the next generation of leading men and women has become part of a “starless” system. If Charlie Hunnam had stayed as lead of the big-budget Fifty Shades of Grey project, he was set to earn $125,000, which would have been close to nothing after agent and manager fees, taxes, etc. And it’s no better for action heroes. Chris Hemsworth, star of the Thor films which made more than a billion dollars globally, was paid just $500,000 for the sequel.
You don’t need to cry for such people since they’re still doing well relative to most of us, but it’s telling that the diminution of the worker during our age of miracles and wonders has spread even to such rarefied air, even to the veritable lottery winners. From “The Last Disposable Action Hero,” by Alex French in the New York Times Magazine:
“Once upon a time, a movie poster needed to have only two words on it: the star’s last name and the title. Stallone: Rambo. Schwarzenegger: Terminator. In the new action-hero economy, though, actors rarely carry the franchise; more often, the franchise carries the actor. Chris Hemsworth was little known before Thor, and no one outside the industry was too familiar with Henry Cavill before Man of Steel. Lorenzo di Bonaventura, who produced Transformers and this winter’s Jack Ryan: Shadow Recruit, told me that studios were gambling on unproven actors for economic reasons. ‘These movies cost a lot to mount. Adding on the big movie star’s salary is the thing that makes you go, ‘Boy, I don’t know if I can afford it.’ Perhaps no movie typifies this model better than the 2006 mega-hit 300, an adaptation of Frank Miller’s popular comic-book series, which featured inexpensive and little-known actors like Gerard Butler and Michael Fassbender and then catapulted them to stardom. This week, the film’s producers are trying to replicate that success with a sequel, 300: Rise of an Empire, which is anchored by the unheralded Sullivan Stapleton and 299 other equally fit, anonymous men in leather skirts.”
The opening of “The Computer Girls,” Lois Mandel’s classic 1967 Cosmopolitan article about the promise of young women entering the field of computer programming, a promise which has only been partly fulfilled:
“Twenty years ago, a girl could be a secretary, a school teacher…maybe a librarian, a social worker or a nurse. If she was really ambitious, she could go into the professions and compete with men…usually working harder and longer to earn less pay for the same job.
Now have come the big, dazzling computers–and a whole new kind of work for women: programming. Telling the miracle machines what to do and how to do it. Anything from predicting the weather to sending out billing notices from the local department store.
And if it doesn’t sound like woman’s work–well, it just is.
‘I had this idea I’d be standing at a big machine and pressing buttons all day long,’ says a girl who programs for a Los Angeles bank. ‘I couldn’t have been further off the track. I figure out how the computer can solve a problem, and then instruct the machine to do it.’
‘It’s just like planning a dinner,’ explains Dr. Grace Hopper, now a staff scientist in systems programming for Univac. (She helped develop the first electronic digital computer, the Eniac, in 1946.) ‘You have to plan ahead and schedule everything so it’s ready when you need it. Programming requires patience and the ability to handle detail. Women are naturals at computer programming.’
What she’s talking about is aptitude–the one most important quality a girl needs to become a programmer. She also needs a keen, logical mind.”
Leonard Cohen, never sanguine about the present, spoke fearfully, in 1993, about the future. He thought that there would be a seismic shift, that privacy would be a thing of the past, and he was right. He didn’t, however, foresee that the centralization of media power–what he had decried earlier in “Tower of Song” as “the rich having their channels in the bedrooms of the poor”–would be overturned by new media. Privacy was largely sacrificed, sure, but the channels grew exponentially and were now in our hands. Things would grow more democratic and we would own the truth, though it’s still not clear if we’ll use it right.
Give me back my broken night
my mirrored room, my secret life
it’s lonely here,
there’s no one left to torture
Give me absolute control
over every living soul
And lie beside me, baby,
that’s an order!
Give me crack and anal sex
Take the only tree that’s left
and stuff it up the hole
in your culture
Give me back the Berlin wall
give me Stalin and St Paul
I’ve seen the future, brother:
it is murder.
Things are going to slide, slide in all directions
This is very cool: A 1971 Life magazine report about a Manhattan computer expo in which IBMs wowed visitors by merely playing games of 20 Questions, no chess expertise even necessary. Better yet, the exhibition was curated by Charles Eames, who was as comfortable with computers as he was with furniture. From “A Lively Show with a Robot as the Star,” written by Fortune editor Walter McQuade:
“The stroller steps off the sidewalk and into the IBM display room on 57th Street in Manhattan and approaches one of the four shiny input typewriters of an IBM System 360 computer. The game is ’20 Questions.’ The computer ‘thinks up’ one of the 12 stock mystery words, like ‘duck,’ ‘orange,’ ‘cloud,’ ‘helium,’ ‘knowledge.’ The stroller has 20 chances to guess and if, perhaps, the mystery word is ‘knowledge,’ the typical conversation could start like this:
Stroller: ‘Does it grow?’ Computer: ‘To answer that question might be misleading.’ Stroller: ‘Can I eat it? Is it edible?’ Computer: ‘Only as food for thought.’ Stroller: ‘Do computers have it?’ Computer: ‘Strictly speaking, no.’
Twenty Questions is only the pièce de résistance in what is probably the canniest and most successful exhibition on computers ever devised. It should be: its deviser, the protean Charles Eames–poet, architect, painter, mathematician, toymaker, furniture designer and film maker–has had ample exposure at expos. Here, he and his collaborators reach back into the history and prehistory of computers to show how and why calculating machines came about.
Most of the story evolves on a gigantic, 48-foot, three-dimensional wall tapestry. Woven into it are hundreds of souvenirs from 1890 to 1950, the computer’s gestation period. Here are artifacts, documents and photographs, dramatizing six decades of striving, when information began to explode on the world and nobody knew quite what to do with the fallout.
The devices range from ‘The Millionaire,’ one of the first calculators, made of brass, to Elmer Sperry’s gyroscope, to Vannevar Bush’s differential analyzer. Included are the work of such elegant minds as Alan Turing, Wallace Eckert, Norbert Wiener, John von Neumann. Even L. Frank Baum and his ‘clockwork copper man,’ Tik-Tok of Oz, is represented.
The military imperative to handle information quickly is underlined with a Norden bombsight and with ENIAC, an Army ballistics calculator and predecessor of UNIVAC. There are beautifully selected pieces of cultured debris to date it all; election literature in the years each of the Roosevelts ran for President, and one of the big old dollar bills, when they were worth 100 cents. Best of all are the evocations of mental battles fought and sometimes lost. Early in the century an English scientist, Lewis Fry Richardson, devoted many years to developing numerical models in which equations simulated physical systems to predict the weather. He was a dedicated visionary, but his widow wrote, ‘There came a time of heartbreak when those most interested in his ‘upper air’ research proved to be ‘poison gas’ experts. Lewis stopped his meteorological researches, destroying such as had not been published.’
The wall closes with the birth of the UNIVAC in 1950. Since then the computer has progressed so fast, with computers working their own evolution, that the souvenirs would be just print-out sheets. But Eames demonstrates with models and film displays that if this be witchcraft, there are no witches involved–just the 350,000 full-time programmers (in the U.S. alone) and about two million other nonwitches who operate the machines; in a multiple, rapid-fire slidefilm, they chew gum, scratch themselves, dye their hair and do their work.
And when the stroller, no warlock himself, wanders in off the street with his family (it’s a great show for kids) and confronts the System 360, he is well advised to watch his language and frame his questions well. Eames’ finale to the exhibition can be fairly cheeky. System 360, Model 40, is not above printing out, in response to a muddled thought: ‘Your grammar has me stumped.’”
“Yesterday afternoon, Officer Irwin was attracted by yells and drunken screams to the den No. 91 Degraw Street, occupied by Mrs. Duck. On entering the place, the officer found three women and a child in the place. The women were drunk, and tossing the child about ‘just like,’ said the officer, ‘as if it were a foot ball.’ The little child, who is scarcely three years old, presented a most pitiable sight. The officer, on ascertaining who the mother was, arrested her. The health authorities have been notified of the den which is described as the filthiest hole in Red Hook.”