Urban Studies


The murder rate has declined pretty much all over America, in cities enamored with the Broken Windows Theory and those not, and Los Angeles is no exception, having seen a remarkable drop since the bad old days of the early 1990s. One cloud in the silver lining of fewer murders is that those committed in L.A. have a very low arrest rate. From Hillel Aron in the LA Weekly:

The number of murders in Los Angeles County is down – way down. In 1993, the year after the Rodney King riots, there were 1,944 homicides. In 2014, there were 551. That’s nearly a 75 percent drop in just over 20 years, an astonishing reversal of fortune that occurred all over the country, though it was especially prominent in L.A., known as the gang capital of the world. 

But there’s a dark cloud to that silver lining, as it turns out. The Los Angeles News Group published a disturbing package on Sunday, “Getting Away With Murder,” the product of an 18-month study in which its Los Angeles Daily News and other papers found that between 2000 and 2010, 46 percent of all homicides went unsolved.

That’s a lot of murderers who got away clean. Over the same period, the national average was 37 percent. 

Perhaps even more depressing, 51 percent of murders where the victim was black went unsolved. When the victim was white, only 30 percent went unsolved. Over half of all unsolved homicide victims were Latino.

And actually, the real story might be even worse than the numbers suggest. The Daily News also found that LAPD listed some homicides as “cleared” (i.e., solved) even though no one had been convicted: 

596 homicide cases from 2000 through 2010 that the LAPD has classified as “cleared other” — cop speak for solving a crime without arresting and filing charges against a suspect….

The LAPD cleared some of these cases because the D.A. declined to prosecute, but when asked for the reason each case was cleared, police officials did not respond.

In other words, the police think they know who the killer is, in many cases, but knowing is not the same as being able to convict before a jury.•


Laurie Segall, correspondent for CNN, a paragon of journaltainment, just did an Ask Me Anything at Reddit in which she discussed alternative lifestyles among Silicon Valley technologists, particularly in regard to sex and drugs, from polyamory to biohacking. The Q&A is tied to a new TV special which needs to score ratings for Jeff Zucker’s party boat of reportage or he’s off to the Game Show Network. A few exchanges follow.

________________________________

Question:

What’s an example of the members of a sex party? For example, 3 silicon execs and 2 prostitutes and 4 average joes…or just 6 silicon execs.

Laurie Segall:

Lots of engineers, a mobile developer who worked at a big Silicon Valley company, a lawyer – all types. What was interesting was the guy who started the bronze party (swingers event I attended) said a lot of startup people would come to events and then help him with his website and getting around gmail filters.

Question:

Do you feel there was a difference between how men and women approached drugs, sex, etc. in Silicon Valley, and how do you feel it reflects on the gender disparity in STEM fields as a whole? 

Laurie Segall:

I learned about two different sex communities – polyamorous communities and the swinger crowd (they like to be called “lifestyle members,” not swingers). Polyamory is based more on emotions and the swinger community is based more on sex. The woman I spoke to who was polyamorous seemed very in touch with how men and women are looked at in general. She said people in her position could face discrimination at work and spoke a bit about the power dynamic in Silicon Valley. It was easier to find men experimenting with drugs than women. That’s just a start… we could talk for a long time about this.

________________________________

Question:

Did you try any of the smart drugs like piracetam, ciltep or modafinil? If so, did they have a smart effect?

Laurie Segall:

I have a bag full of them with me now actually and thought about doing a whole “I tried smart drugs” piece. I’d want to try them over a long period of time and not when I’m looking for a result if that makes any sense. I interviewed the guy who started ciltep and it’s interesting to hear about so many people trying to make their own versions and stacks of nootropics. I will try them at some point but haven’t yet. Dave Asprey, who we interviewed for this, gave us bulletproof coffee. It has some oil in there that’s supposed to make you more energetic. It certainly worked. For about 10 minutes I was talking a mile a minute. This morning I took a giant bag full of the smart drugs on the subway with me and got some interesting looks.

________________________________

Question:

You mention psychedelics. How are these being used? Are they used for a specific performance enhancement or rather recreationally?

Laurie Segall:

The entrepreneurs I spoke to said they used LSD for creative breakthrough moments. I’m sure LSD is used recreationally, but the Cisco engineer said he didn’t consider it recreational. He took it 4-5 times a year when he was looking to think of a creative solution. Tim Ferriss said that while smart drugs are used to work harder, stay up later, be more productive, psychedelics are often used to pour gasoline on the fire and solve hard problems. LSD is classified as highly addictive so there are quite a few risks associated with using it.

Question:

What are some of the results people have said they have gotten from taking the drugs?

Laurie Segall:

One of the entrepreneurs said he was able to get rid of his brain fog. He said different types of smart drugs and nootropics helped him with everything from memory to energy. He did mention side effects. That’s something people should know about if they’re going to try this stuff. There are no long-term studies on a lot of these things.

________________________________

Question:

Is there an element of elitism going on here? Does this group feel the need to exist in a place outside of “normal”? I can almost set the scene where a group of techies sit circled around their coffee table crunching stacks and laying out the partner rotation schedule on an excel sheet. Heads stuck squarely up their own asses. Am I off base here?

Laurie Segall:

I think a lot of these people have more money and live in a place where it’s a bit more ok to explore different lifestyles that might not be acceptable in more traditional places. I do think that in order to have access to a lot of this stuff you’ve got to have more money, so that’s something to consider. I didn’t get the feeling that some of the people I interviewed had that element of elitism.•


The American middle class is thinner and grayer than it used to be, and that’s not just due to the Great Recession. The trend lines have been headed in that direction for decades. Thomas Piketty has suggested that the green shoots of widespread prosperity, like the period the U.S. enjoyed in the aftermath of WWII, are more exception than rule. But belief in the American Dream, that those who work hard will be rewarded, is difficult to shake no matter what the numbers say, and many still vote their aspirations rather than their actuality, which can lead to policy mismatched to reality. From Dionne Searcey and Robert Gebeloff at the New York Times:

The middle class that President Obama identified in his State of the Union speech last week as the foundation of the American economy has been shrinking for almost half a century.

In the late 1960s, more than half of the households in the United States were squarely in the middle, earning, in today’s dollars, $35,000 to $100,000 a year. Few people noticed or cared as the size of that group began to fall, because the shift was primarily caused by more Americans climbing the economic ladder into upper-income brackets.

But since 2000, the middle-class share of households has continued to narrow, the main reason being that more people have fallen to the bottom. At the same time, fewer of those in this group fit the traditional image of a married couple with children at home, a gap increasingly filled by the elderly.

This social upheaval helps explain why the president focused on reviving the middle class, offering a raft of proposals squarely aimed at concerns like paying for a college education, taking parental leave, affording child care and buying a home. …

According to a New York Times poll in December, 60 percent of people who call themselves middle class think that if they work hard they will get rich. But the evidence suggests that goal is increasingly out of reach. When middle class people look up, they see the rich getting richer while they spin their wheels.

“The middle has basically stayed the same; it hasn’t improved,” said Lawrence F. Katz, an economist at Harvard University. “You’ve got an iPhone now and a better TV, but your median income hasn’t changed. What’s really changed is the penthouse has become supernice.”•


From the July 9, 1897 Brooklyn Daily Eagle:

A meeting of the employees of Dennett’s Fourteenth Street lunch room, where William C. Keeble, the latest bridge jumper, was formerly employed, was held last night to consider means of obtaining possession of the body. Word had been received that W.E. Holmes, the Bowery dime museum proprietor and manager, to whom Keeble had willed his body and all his personal effects before he made the fatal dive into the East River, was contemplating the exhibition of the corpse as a star attraction of his museum. According to the rumor that was unhesitatingly accredited by the employees of Dennett’s, Keeble’s body was to be put in a glass case and placed on exhibition for the usual admittance fee of 10 cents.•


In an Atlantic piece, Derek Thompson notes that while CDs are under siege, digital music is itself being disrupted, with abundance making profits scarce. Music is desired, but the record store–in any form–is not. The opening:

CDs are dead.

That doesn’t seem like such a controversial statement. Maybe it should be. The music business sold 141 million CDs in the U.S. last year. That’s more than the combined number of tickets sold to the most popular movies in 2014 (Guardians) and 2013 (Iron Man 3). So “dead,” in this familiar construction, isn’t the same as zero. It’s more like a commonly accepted short-cut for a formerly popular thing is now withering at a commercially meaningful rate.

And if CDs are truly dead, then digital music sales are lying in the adjacent grave. Both categories are down double-digits in the last year, with iTunes sales diving at least 13 percent.

The recorded music industry is being eaten, not by one simple digital revolution, but rather by revolutions inside of revolutions, mouths inside of mouths, Alien-style. Digitization and illegal downloads kicked it all off. MP3 players and iTunes liquified the album. That was enough to send recorded music’s profits cascading. But today the disruption is being disrupted: Digital track sales are falling at nearly the same rate as CD sales, as music fans are turning to streaming—on iTunes, SoundCloud, Spotify, Pandora, iHeartRadio, and music blogs. Now that music is superabundant, the business (beyond selling subscriptions to music sites) thrives only where scarcity can be manufactured—in concert halls, where there are only so many seats, or in advertising, where one song or band can anchor a branding campaign.•

_____________________________

“In your home or in your car, protect your valuable tapes.”


The Peer Economy may be a good idea whose time has come, but that doesn’t mean it’s good for workers. In America, it’s a crumb tossed to those squeezed from the middle class by globalization, automation, etc. Keeping employees happy isn’t a goal of Uber and others because it treats labor like a dance marathon, the music never stopping, new “employee-partners” continually being supplied by a whirl of desperation. From Douglas MacMillan at the WSJ:

The sheer numbers of Uber’s labor pool and rate of growth are hard to fathom. The company added 40,000 new drivers in the U.S. in the month of December alone. The authors of the paper say the number of new drivers is doubling every six months. At the same time, Uber says nearly half its drivers become inactive after a year – either because they quit or are terminated.

If those trends continue, Uber could end this year with roughly half-a-million drivers in the U.S. alone.

That growth is being driven mainly by UberX, the company’s service for non-professional drivers that first rolled out in 2012. UberX has created a new part-time job opportunity for people who have never driven professionally, who account for 64% of Uber’s total number of drivers.

Most Uber drivers, 62%, have at least one additional source of income, which could mean that, at least for some, Uber is not economically feasible as a full-time job.

Uber claims an average driver makes $19.04 an hour, after paying Uber a commission, higher than the $12.90 average hourly wage (including tips, Uber says) that the U.S. Bureau of Labor Statistics estimates for taxis and chauffeurs. Uber drivers make the most average pay in New York, followed by San Francisco and Boston.

The average pay for former taxi drivers on Uber is $23 per hour; for former black car drivers it’s $27 per hour.

But the paper’s authors admit these figures don’t include expenses that come out of drivers’ own pockets, including gas, maintenance and insurance. And a number of people with experience driving for the company say Uber has made it more difficult to make a good wage because it frequently cuts prices as a way to entice new passengers.

A drop in prices can have a profound effect on driver pay.•


GOAT SKELETON IN A COFFIN – $325 (Green Lane, PA)

Very unusual real juvenile goat skeleton in a coffin. It’s on my refrigerator and the wife (of 48 years) says I have to pass it on to someone else. She won’t let me keep it anymore. I don’t know exactly why. That’s just the way it is and I don’t know where clean underwear or food comes from so it has to go. About 4′ tall all together.

The 2015 version of the Gates Annual Letter makes bold and hopeful predictions for the world by 2030 (infant mortality halved, an HIV vaccine, Africa a prosperous continent, etc.). In the spirit of the missive, Politico invited other thinkers to consider life 15 years hence. Below are two examples representing polar opposites, neither of which seems particularly likely.

_____________________________

Technology for the good

By Vivek Wadhwa, fellow at the Arthur & Toni Rembe Rock Center for Corporate Governance at Stanford University

Technology is advancing faster than people think and making amazing things possible. Within two decades, we will have almost unlimited energy, food and clean water; advances in medicine will allow us to live longer and healthier lives; robots will drive our cars, manufacture our goods and do our chores. It will also become possible to solve critical problems that have long plagued humanity such as hunger, disease, poverty and lack of education. Think of systems to clean water; sensors to transform agriculture; digital tutors that run on cheap smartphones to educate children; medical tests on inexpensive sensor-based devices. The challenge is to focus our technology innovators on the needs of the many rather than the elite few so that we can better all of humanity.•

_____________________________

No breakthroughs for the better

By Leslie Gelb, president emeritus and board senior fellow at the Council on Foreign Relations

The world of 2030 will be an ugly place, littered with rebellion and repression. Societies will be deeply fragmented and overwhelmed by irreconcilable religious and political groups, by disparities in wealth, by ignorant citizenry and by states’ impotence to fix problems. This world will resemble today’s, only almost everything will be more difficult to manage and solve.

Advances in technology and science won’t save us. Technology will both decentralize power and increase the power of central authorities. Social media will be able to prompt mass demonstrations in public squares, even occasionally overturning governments as in Hosni Mubarak’s Egypt, but oligarchs and dictators will have the force and power to prevail as they did in Cairo. Almost certainly, science and politics won’t be up to checking global warming, which will soon overwhelm us.

Muslims will be the principal disruptive factor, whether in the Islamic world, where repression, bad governance and economic underperformance have sparked revolt, or abroad, where they are increasingly unhappy and disdained by rulers and peoples. In America, blacks will become less tolerant of their marginalization, as will other persecuted minorities around the world. These groups will challenge authority, and authority will slam back with enough force to deeply wound, but not destroy, these rebellions.

A long period of worldwide economic stagnation and even decline will reinforce these trends. There will be sustained economic gulfs between rich and poor. And the rich will be increasingly willing to use government power to maintain their advantages.

Unfortunately, the next years will see a reversal of the hopes for better government and for effective democracies that loomed so large at the end of the Cold War.•


Joe Franklin had a talk show on television for more than 40 years, until somebody found out. 

Like a Ben Katchor character come to life, he possessed an encyclopedic memory for things nobody cared about anymore, hoarding popular culture which had grown unpopular and doggedly appreciating vanishing entertainments: platters of wax, alleys of tin pan, anecdotes of Eddie Cantor. That he occasionally welcomed a cutting-edge guest seemed almost an accident of mathematical possibility, the stopped clock being right twice a day. Franklin always insisted that he had never dyed his hair; the bottle of shoe polish declined comment. He was apparently married to a woman who did some Bettie Page-ish bondage modeling back in the day. Who knew? An excerpt from his New York Times obituary penned by James Barron:

What came to be considered campy began as pioneering programming: the first regular program that Channel 7 had ever broadcast at noon. WJZ-TV, as the station was known then, had not been signing on until late afternoon before the premiere of “Joe Franklin — Disk Jockey” on Jan. 8, 1951.

Soon celebrities like Elvis Presley, Bing Crosby and John F. Kennedy were making their way to the dingy basement studio on West 67th Street — a room with hot lights that was “twice the size of a cab,” Mr. Franklin recalled in 2002. He booked Woody Allen, Dustin Hoffman, Barbra Streisand, Bill Cosby and Liza Minnelli as guests when they were just starting out, and hired two other young performers, Bette Midler and Barry Manilow, as his in-house singer and accompanist.

“My show was often like a zoo,” he said in 2002. “I’d mix Margaret Mead with the man who whistled through his nose, or Richard Nixon with the tap-dancing dentist.”

Mr. Franklin claimed a perfect attendance record: He said he never missed a show. Bob Diamond, his director for the last 18 years of his television career, said that there were a few times in the days of live broadcasts when the show had to start without Mr. Franklin. But Mr. Franklin always got there eventually.

And he always seemed to have a gimmick. He celebrated his 40th anniversary on television by interviewing himself, using a split-screen arrangement. “I got a few questions I’m planning to surprise myself with,” he said before he began.

Had he been asked, he could have told viewers that he was born Joe Fortgang in the Bronx. He explained in his memoir, “Up Late With Joe Franklin,” written with R. J. Marx, that his press materials had long said that he had been born in 1928, “but I’m going to come clean and admit that my real birth date was March 9, 1926.” He was the son of Martin and Anna Fortgang; his father was a paper-and-twine dealer who had gone to Public School 158 with James Cagney.•


Dr. Frederic Wertham did some wonderful things in his career, but his anti-comic book crusade was not among them. In 1954, when the fear of panels had gone worldwide, he squared off in Washington at congressional hearings with Mad magazine publisher William Gaines, who did not yet resemble a plate of spaghetti that had fallen to the floor.


From the April 18, 1910 Brooklyn Daily Eagle:

A leg of lamb which exploded when placed on the table before several boarders, resulted in the arraignment of David Kahn, the butcher, who sold it to the boarding house keeper, before Magistrate Hylan, in the New Jersey Avenue police court, this morning. The boarding house keeper, Mrs. Elizabeth Jones, of 1467 Broadway, and her star boarder were the principal witnesses.

Mr. Kahn was held on $500 bail by the magistrate, who will hear both sides next Wednesday.

Mrs. Jones claimed that she cooked the leg of lamb and placed it on the table, much to the delight of her anticipatory boarders. She cut off a couple of slices and was just preparing to cut another when the leg exploded right before her eyes. She then discovered the meat was bad.•


Say what you will about Jill Abramson, but she gave the New York Times enduring gifts with the hires of Jake Silverstein and Deborah Needleman, editors respectively of the Magazine and the T Magazine. They’ve both done a lot of excellent work early in their tenures.

Her successor, Dean Baquet, amateur proctologist, is a talented person with a huge job ahead of him at the venerable and wobbly news organization, and he may yet call Mike Bloomberg boss because such a transaction makes a lot of sense financially. In a new Spiegel interview conducted by Isabell Hülsen and Holger Stark, Baquet addresses the technological “Space Race” he’s trying to win–or at least not lose. An excerpt:

Spiegel:

Digital competitors like BuzzFeed and the Huffington Post offer an extremely colorful mix of stories and have outperformed the New York Times website with a lot of buzz.

Dean Baquet:

Because they’re free. You’re always going to have more traffic if you’re a free website. But we’ve always admitted that we were behind other news organizations in making our stories available to people on the web. BuzzFeed and the Huffington Post are much better than we are at that, and I envy them for this. But I think the trick for the New York Times is to stick to what we are. That doesn’t mean: Don’t change. But I don’t want to be BuzzFeed. If we tried to be what they are, we would lose.

Spiegel:

In May, your internal innovation report was leaked along with its harsh conclusion that the New York Times’ “journalistic advantage” is shrinking. Did you underestimate your new digital competitors?

Dean Baquet:

Yes, I think we did. We assumed wrongly that these new competitors, whether it was BuzzFeed or others, were doing so well just because they were doing something journalistically that we chose not to do. We were arrogant, to be honest. We looked down on those new competitors, and I think we’ve come to realize that was wrong. They understood before we did how to make their stories available to people who are interested in them. We were too slow to do it.

Spiegel:

The report was disillusioning for many newspaper executives because the Times is widely seen as a role model when it comes to the question of making money on the web. The report, instead, pointed out that the Times lacks a digital strategy and the newsroom is far away from a “digital first” culture.

Dean Baquet:

First, the Times is and has always been a digital leader. The report only cited some areas where we fell down. Second: Half of the report is critical, and half of it has ideas for things you can do to fix the problem. A lot of things have been done already.

Spiegel:

What has changed?

Dean Baquet:

We have, for example, built a full-bodied audience development team that engages with our readers through social networks. The team has been in operation for three months now and we already have a pretty consistent 20 percent increase in traffic.

Spiegel:

How does this influence the work of your journalists?

Dean Baquet:

It used to be, if you were a reporter, you wrote a story and then you moved on to the next one. We were used to people coming to us. We waited for them to turn on our website or to pick up our print paper and see what we have. We now understand that we have to make our stories available to our readers. A lot of people get their news from Facebook or Twitter and we want to make sure that they see some of our best stories there, too. We do this more aggressively now than we did before.•


I was reading the “Hey Bill” Q&A section of Bill James’ site, and this question was posed:

Hey Bill, I thought it was interesting in 1939 the National Professional Indoor Baseball League was launched with Tris Speaker and franchises managed by guys like Bill Wambsganss, Moose McCormick, and Harry Davis. It went one and done though, disbanded after that season. Do you know how the actual play was set up?

Answered: 1/21/2015

Don’t know that I’ve ever heard of it.•

Indoor baseball, while clearly nowhere near as popular as its outdoor counterpart, was a dogged part of the American sporting scene from the 1890s till the late ’30s. Before the game essentially became softball, it was played on a pro level during the winter months at armories in front of crowds of up to 1,500 fans. Decidedly different from the summer game were the rules (e.g., 35-foot basepaths) and the equipment (the ball was larger and softer). The most famous iteration of the game was the short-lived 1939 National Professional Indoor Baseball League, which was presided over by the former MLB great Tris Speaker. Below are several pieces from the Brooklyn Daily Eagle about various versions of the game.

____________________________

From May 5, 1912:

____________________________

From November 20, 1939:



The arduousness of parallel parking is one way young egos of the technological world learn humility and patience. Try and fail and try and fail and try. Soon enough, those lessons will be learned by other means, if they are to be learned, as cars will be deposited solely by sensors and such in the near future. From John R. Quain at the New York Times:

TECHNOLOGY may soon render another skill superfluous: parking a car.

Sensors and software promise to free owners from parking angst, turning vehicles into robotic chauffeurs, dropping off drivers and then parking themselves, no human intervention required.

BMW demonstrated such technical prowess this month with a specially equipped BMW i3 at the International CES event. At a multilevel garage of the SLS Las Vegas hotel, a BMW engineer spoke into a Samsung Gear S smartwatch.

“BMW, go park yourself,” and off the electric vehicle scurried to an empty parking spot, turning and backing itself perfectly into the open space. To retrieve the car, a tap on the watch and another command, “BMW, pick me up,” returned the car to the engineer.•

Technology can render things faster and cheaper but also, sometimes, out of control. Embedded in our trajectory of a safer and more bountiful world are dangers enabled by the very mechanisms of progress. In “A Fault in Our Design,” a typically smart and thoughtful Aeon essay, Colin Dickey meditates on nautical advances which allowed the Charles Mallory to deliver devastating disease in 1853 to a formerly far-flung Hawaii and considers it a cautionary tale for how modern wonders may be hazardous to our health. An excerpt:

It’s hard not to feel as though history is progressing forward, along a linear trajectory of increased safety and relative happiness.

Even a quick round-up of the technological advances of the past few decades suggests that we’re steadily moving forward along an axis of progress in which old concerns are eliminated one by one. Even once-feared natural disasters are now gradually being tamed by humanity: promising developments in the field of early warning tsunami detection systems might soon be able to prevent the massive loss of life caused by the 2004 Indian Ocean Tsunami and similar such catastrophes.

Technology has rendered much of the natural world, to borrow a term from Edmund Burke and Immanuel Kant, sublime. For Kant, nature becomes sublime once it becomes ‘a power that has no dominion over us’; a scene of natural terror that, viewed safely, becomes an enjoyable, almost transcendental experience. The sublime arises from our awareness that we ourselves are independent from nature and have ‘a superiority over nature’. The sublime is the dangerous thing made safe, a reaffirmation of the power of humanity and its ability to engineer its own security. And so with each new generation of technological innovation, we edge closer and closer towards an age of sublimity.

What’s less obvious in all this are the hidden, often surprising risks. As the story of the Charles Mallory attests, sometimes hidden in the latest technological achievement are unexpected dangers. Hawaii had been inoculated from smallpox for centuries, simply by virtue of the islands’ distance from any other inhabitable land. Nearly 2,400 miles from San Francisco, Hawaii is far enough away from the rest of civilisation that any ships that headed towards its islands with smallpox on board wouldn’t get there before the disease had burned itself out. But the Charles Mallory was fast enough that it had made the trip before it could rid itself of its deadly cargo, and it delivered unto the remote island chain a killer never before known.

Which is to say, the same technologies that are making our lives easier are also bringing new, often unexpected problems.•


Haircut

Looking for a haircut by a professional nude hairdresser. Pls provide picture and your rate. Your location. Thanks. 

No one is more moral for eating pigs and cows rather than dogs and cats, just more acceptable. 

Dining on horses, meanwhile, has traditionally fallen into a gray area in the U.S. Americans have historically had a complicated relationship with equine flesh, often publicly saying nay to munching on the mammal, though the meat has had its moments–lots of them, actually. From a Priceonomics post by Zachary Crockett, a passage about the reasons the animal became a menu staple in the fin de siècle U.S.: 

Suddenly, at the turn of the century, horse meat gained an underground cult following in the United States. Once only eaten in times of economic struggle, its taboo nature now gave it an aura of mystery; wealthy, educated “sirs” indulged in it with reckless abandon.

At Kansas City’s Veterinary College’s swanky graduation ceremony in 1898, “not a morsel of meat other than the flesh of horse” was served. “From soup to roast, it was all horse,” reported the Times. “The students and faculty of the college…made merry, and insisted that the repast was appetizing.”

Not to be left out, Chicagoans began to indulge in horse meat to the tune of 200,000 pounds per month — or about 500 horses. “A great many shops in the city are selling large quantities of horse meat every week,” then-Food Commissioner R.W. Patterson noted, “and the people who are buying it keep coming back for more, showing that they like it.”

In 1905, Harvard University’s Faculty Club integrated “horse steaks” into their menu. “Its very oddity — even repulsiveness to the outside world — reinforced their sense of being members of a unique and special tribe,” wrote the Times. (Indeed, the dish was so revered by the staff, that it continued to be served well into the 1970s, despite social stigmas.)

The mindset toward horse consumption began to shift — partly in thanks to a changing culinary landscape. Between 1900 and 1910, the number of food and dairy cattle in the US decreased by nearly 10%; in the same time period, the US population increased by 27%, creating a shortage of meat. Whereas animal rights groups once opposed horse slaughter, they now began to endorse it as more humane than forcing aging, crippled animals to work. 

With the introduction of the 1908 Model-T and the widespread use of the automobile, horses also began to lose their luster a bit as man’s faithful companions; this eased apprehension about putting them on the table with a side of potatoes (“It is becoming much too expensive a luxury to feed a horse,” argued one critic).

At the same time, the war in Europe was draining the U.S. of food supplies at an alarming rate. By 1915, New York City’s Board of Health, which had once rejected horse meat as “unsanitary,” now touted it as a sustainable wartime alternative for meatless U.S. citizens. “No longer will the worn out horse find his way to the bone-yard,” proclaimed the board’s Commissioner. “Instead, he will be fattened up in order to give the thrifty another source of food supply.”

Prominent voices began to sprout up championing the merits of the meat.•

Tags:

I’m not a geneticist, but I doubt successful, educated parents are necessarily more likely to have preternaturally clever children than their poorer counterparts, as is argued in a new Economist article about the role of education in America’s spiraling wealth inequality. Of course, monetary resources can help provide a child every chance to realize his or her abilities, ensuring opportunities often denied to those from families of lesser material means. That, rather than genes, is the main threat to meritocracy. An excerpt:

Intellectual capital drives the knowledge economy, so those who have lots of it get a fat slice of the pie. And it is increasingly heritable. Far more than in previous generations, clever, successful men marry clever, successful women. Such “assortative mating” increases inequality by 25%, by one estimate, since two-degree households typically enjoy two large incomes. Power couples conceive bright children and bring them up in stable homes—only 9% of college-educated mothers who give birth each year are unmarried, compared with 61% of high-school dropouts. They stimulate them relentlessly: children of professionals hear 32m more words by the age of four than those of parents on welfare. They move to pricey neighbourhoods with good schools, spend a packet on flute lessons and pull strings to get junior into a top-notch college.

The universities that mould the American elite seek out talented recruits from all backgrounds, and clever poor children who make it to the Ivy League may have their fees waived entirely. But middle-class students have to rack up huge debts to attend college, especially if they want a post-graduate degree, which many desirable jobs now require. The link between parental income and a child’s academic success has grown stronger, as clever people become richer and splash out on their daughter’s Mandarin tutor, and education matters more than it used to, because the demand for brainpower has soared. A young college graduate earns 63% more than a high-school graduate if both work full-time—and the high-school graduate is much less likely to work at all. For those at the top of the pile, moving straight from the best universities into the best jobs, the potential rewards are greater than they have ever been.

None of this is peculiar to America, but the trend is most visible there. This is partly because the gap between rich and poor is bigger than anywhere else in the rich world—a problem Barack Obama alluded to repeatedly in his state-of-the-union address on January 20th (see article). It is also because its education system favours the well-off more than anywhere else in the rich world.•

From the February 23, 1919 Brooklyn Daily Eagle:

One little girl in Burford Bridge, England, has made a record of having killed 1,415 butterflies in a butterfly killing contest held in the schools of that district. We wonder what there may be about so beautiful and harmless an insect as a butterfly to warrant engaging school children in such a murderous employment, and we hope there were more than a few children who made a very poor showing in the competition, believing that for the most of them the contest offered little inspiration.•

For those who read Lolita after the Sexual Revolution of the ’60s and ’70s had ended, how can the book read as anything but an amazing piece of writing about a horrifying “romance”? But I suppose for some of the young who came of age during the carnal tumult of that earlier time, the novel seemed like a different thing–or at least the culture told them it was. In the opening question of an interview conducted by Erik Morse of the Los Angeles Review of Books, Emily Prager, the novelist and journalist who briefly appeared on the original iteration of Saturday Night Live, astutely explains the generational differences in interpretations of the controversial work:

Erik Morse:

Do you remember when you first read Lolita? What were your initial impressions, both of Nabokov’s story and the character of Lo?

Emily Prager:

I don’t remember when I read Lolita but the idea of Lolita was a large part of the ’60s when I matured. Recently I saw the now 50ish-year-old woman whom Roman Polanski allegedly raped. She kept stammering that it was a different time, that you can’t judge Polanski by today’s standards. That’s because the Lolita idea was everywhere — there was a book with almost softcore photos of baby ballerinas that was on every coffee table, tons of very young women with much older men and it was okay. Men ruled after all. Many took Humbert Humbert as their role model. They liked him best of all. A few years ago, I went to dinner with some women who had grown up in the ’60s. It was when the new attitude toward sexual harassment in the workplace was surfacing. We had a great laugh because every single one of us had been importuned in the workplace constantly. When I was 17 and a prop girl off-Broadway, we had to kiss the house manager when we arrived at work. We rolled our eyes and did it. We thought it was ridiculous and those who asked it of us ludicrous. Lolita, the movie, came out in 1962, and it was with Peter Sellers and Stanley Kubrick directing and it was cool. We all wanted the heart-shaped sunglasses. You know, the myth of the ’60s is that it was all about sex. The truth is we knew nothing about sex except what society told us, which was it was bad. We just didn’t want anyone anymore saying anything to us about how to think about sex. So sexual liberation had to include Lolita. It was every girl for herself. You can’t believe how innocent we were. I doubt most of us registered that she might be being taken advantage of. The other thing was that very young boys were going to fight and die in Vietnam, not 12 but 18, which then was about 13. Young girls having sex didn’t seem that wrong. Of course you read Lolita now — I teach it in my fiction-writing course and modern girls are disgusted by it, horrified.•

Tags: ,

In a Backchannel interview largely about strategies for combating global poverty, Steven Levy asks Bill Gates about the existential threat of superintelligent AI. The Microsoft founder sides more with Musk than Page. The exchange:

Steven Levy:

Let me ask an unrelated question about the raging debate over whether artificial intelligence poses a threat to society, or even the survival of humanity. Where do you stand?

Bill Gates:

I think it’s definitely important to worry about. There are two AI threats that are worth distinguishing. One is that AI does enough labor substitution fast enough to change work policies, or [affect] the creation of new jobs that humans are uniquely adapted to — the jobs that give you a sense of purpose and worth. We haven’t run into that yet. I don’t think it’s a dramatic problem in the next ten years but if you take the next 20 to 30 it could be. Then there’s the longer-term problem of so-called strong AI, where it controls resources, so its goals are somehow conflicting with the goals of human systems. Both of those things are very worthy of study and time. I am certainly not in the camp that believes we ought to stop things or slow things down because of that. But you can definitely put me more in the Elon Musk, Bill Joy camp than, let’s say, the Google camp on that one.•

Tags: ,

Along with the progress being made with driverless cars and 3D bio-printers, the thing that has amazed me the most–alarmed me also–since I’ve been doing this blog has been the efforts of Boston Dynamics, the robotics company now owned by Google. The creations are so stunning that I hope the creators will remember that the applications of their machines are at least as important as the accomplishment of realizing the designs. At any rate, the Atlas robot is now untethered, liberated from its safety cord, operating freely via batteries.

At 3 Quarks Daily, Thomas Rodham Wells, that bearded philosopher, delivers a spanking to adults who imbue small children with greater value than they would others. At first blush, it seems a pedestrian argument. Little ones, still dependent, need us more, so they are prioritized. Pretty sensible. But as Wells makes clear, the cult of children informs moral decisions and perspectives in ways that may be out of proportion.

I think at the heart of the issue is that we hold out hope that babies will turn out better than the rest of us did, and we’d like to enable that opportunity. Once they’ve grown and fallen into the middle of the pack like most do, that hope is extinguished. It certainly can’t just be that we like to make ourselves feel good by protecting those who are more defenseless, because a lot of adults, impoverished or ill, also fit that category. An excerpt:

Children are special in one particular, their extreme neediness. They have quite specific, often urgent needs that only suitably motivated adults can meet, and the younger they are, the greater their neediness. That makes children’s care and protection a moral priority in any civilised society – there are lots of things that aren’t as important and should rightly give way to meeting children’s needs. As a result, children create multiple obligations upon their care-givers, as well as second-order obligations on society in general, to ensure those needs are met. 

Yet the fact that you should give way to an ambulance attending an emergency doesn’t mean that the person in the ambulance is more important than you; only that her needs right now are more important than you getting to work on time. Likewise, the imminence of children’s neediness should often determine how we rank the priorities of actions we want to do, such as interrupting a movie to attend to a baby’s cries. But such an action ranking is not a guide to the relative worth of children and adults, or of babies and teenagers. There will surely be times when something even more urgent occurs – such as someone having a heart-attack in front of you – that requires a baby’s cries be neglected for the moment. 

II

The confusion of neediness with worth is only one source of the confusion though. The other major source is rather more blameworthy: the valorisation of psychological immaturity. For a peculiarity of the moral priority we grant to the neediness of children is that we do not apply it to equally needy adults, most obviously those whose mental and physical faculties decline in old age in a somewhat symmetrical way to the development of those faculties in children. If we only cared about neediness we would care more about, and take on more personal responsibility for, meeting the needs of the disabled in general without regard for their age. 

Of course we don’t do that. We seem to place a special value on children because of their blankness, the fact that they have not thought or done anything interesting or important yet and that their identity – their relationship to themselves and to others – is still unformed. (Some abortion activists make a great deal of the innocence of foetuses, the ultimate non-achievers.) As children grow up and become more like people, with a life of their own – friends and favourite things and secrets and dreams and ideas of their own – they seem to become less valuable. 

I can’t explain this bizarre phenomenon.•

Tags:

In a Business Insider piece, tech entrepreneur Hank Williams neatly dissects the problem of the intervening period between the present and that future moment when material plenty arrives, which is hopefully where technology is taking us. How hungry will people get before the banquet is served? I don’t know that I agree with his prediction that more jobs will move to China; outsourcing will likely come to mean out of species more than out of country. An excerpt:

When you read in the press the oft-quoted concept that “those jobs aren’t coming back” this “reduction of need” is what underlies all of it. Technology has reduced the need for labor. And the labor that *is* needed can’t be done in more developed nations because there are people elsewhere who will happily provide that labor less expensively.

In the long term, technology is almost certainly the solution to the problem. When we create devices that individuals will be able to own that will be able to produce everything that we need, the solution will be at hand. This is *not* science fiction. We are starting to see that happen with energy with things like rooftop solar panels and less expensive wind turbines. We are nowhere near where we need to be, but it is obvious that eventually everyone will be able to produce his or her own energy.

The same will be true for clothing, where personal devices will be able to make our clothing in our homes on demand. Food will be commoditized in a similar way, making it possible to have the basic necessities of life with a few low cost source materials.

The problem is that we are in this awful in-between phase of our planet’s productivity curve. Technology has vastly reduced the number of workers and resources that are required to make what the planet needs. This means that a small number of people, the people in control of the creation of goods, get the benefit of the increased productivity. When we get to the end of this curve and everyone can, in essence, be their own manufacturer, things will be good again. But until we can ride this curve to its natural stopping point, there will be much suffering, as the jobs that technology kills are not replaced.

The political implications of this are staggering.•

Tags:

I’m sure the advent of commercial aviation was met with prejudices about the new-fangled machines, but it took quite a while to perfect automated co-pilots and the navigation of wind shears, so horrifying death was probably also a deterrent. In the article below from the September 22, 1929 Brooklyn Daily Eagle (which is sadly chopped off a bit in the beginning), the unnamed author looks at a selected history of technophobia. 
