
It’s not really stunning that a patriarchal institution like Oral Roberts University is doing something remarkably invasive, but the question is whether the school is an outlier for long or just for now. ORU will require incoming freshmen to wear Fitbits in order to monitor their exercise, food, sleep, location and body weight. A school founded by a “faith healer” that’s utilizing new technologies is bound to “lay its hands” on others in off-putting ways, though wholly secular bodies will likely attempt similar things in the not-too-distant future.

From Elizabeth Chuck at NBC News:

An Oklahoma university is taking a novel approach to fighting the “Freshman 15”: Require all incoming students to wear fitness trackers.

Oral Roberts University, a Christian university in Tulsa, announced earlier this month that all first-years must wear Fitbits — watches that track how much activity a person does. Their fitness data will be tracked by the school and will affect students’ grades.

While mandatory for all incoming freshmen this year, Oral Roberts said it “has opened the program up to all students,” and said the campus bookstores have already sold more than 550 of the popular gadgets.

The university has always included a fitness component in its curriculum, requiring students to “manually log aerobics points in a fitness journal” in past years. The students get graded on their level of aerobic activity.

Now, instead of tediously entering the data by hand, it will be automatically tracked and submitted by the Fitbits, which retail for about $150.

“ORU offers one of the most unique educational approaches in the world by focusing on the Whole Person — mind, body and spirit,” ORU President William M. Wilson said in a statement. “The marriage of new technology with our physical fitness requirements is something that sets ORU apart.”

The Fitbit requirement is a first of its kind for colleges and universities, Oral Roberts said.•



Nothing has been better than the New York Times’ day-to-day coverage of the 2016 Presidential race, which kicks off in earnest tonight in Iowa. The work by reporters like Maggie Haberman, Michael Barbaro and Trip Gabriel has been lively, lucid and layered, a Herculean task in the new normal of the nonstop churn. Gabriel, who’s been stationed in the first-in-the-nation state for a year, just did an Ask Me Anything at Reddit. A few exchanges follow.

_____________________________

Question:

As an Iowan, I’ve long been skeptical of our First in the Nation status when it comes to narrowing the field of presidential candidates, largely because of the small (tiny) non-representative population. Of course, that’s a risky opinion to have here, so I was wondering, do you feel Iowa perhaps has too much influence on the process? Or has your time in Iowa helped justify its position as FITN in your mind?

Trip Gabriel:

I change my mind about Iowa’s role weekly. On the plus side, it’s a state where a candidate without money can spend a lot of time doing retail campaigning. If I was reporting from Florida today, the race would be much more about who has the millions for TV ads. But yes, Iowa is unrepresentative of America, not just demographically (very white) but also ideologically. Republicans are very conservative here, and Democrats are very liberal — 43 percent called themselves “socialists” in a Des Moines Register poll this month.

_____________________________

Question:

What, exactly, is stopping a big state like NY, Texas, California or Florida from just moving up their primaries to before Iowa and simply beating back party leaders through their sheer importance in population/delegates?

Trip Gabriel:

The national parties, which control the nominating convention, write the rules, and they can — and have — discounted the delegates from states that try to jump ahead of the traditional early-voting states. That said, the GOP chairman Reince Priebus is not a fan of the four early “carve-out states” and wants to see a regional primary system that would spread the responsibility for choosing the nominee more broadly.

_____________________________

Question:

Who on the GOP side has the most extensive field operation in Iowa? I went to a couple of rallies this past weekend and didn’t see much volunteer recruitment at Trump or Rubio rallies.

Trip Gabriel:

Ted Cruz has the biggest field operation on the GOP side. He has a college dorm in Des Moines that has housed waves of volunteers from out of state. Jeb Bush, Trump and Rubio have lighter footprints, but they are still playing. I met a Rubio volunteer from Chicago at one of his events over the weekend, asking people to sign “commit to caucus” cards. You wouldn’t see much fresh recruitment of volunteers at this point. It’s all about GOTV — getting out the voters who you know support you.

_____________________________

Question:

Why do you think Rubio has failed to consolidate support? I think a lot of people expected he would emerge as the alternative to the anti-establishment Trump and Cruz, but his performance has been pretty lackluster. What is he doing wrong, and do you expect to see an “establishment” candidate emerge eventually?

Trip Gabriel:

I think Rubio sent confusing messages about who he was running against. For a long time he contrasted himself with Cruz, trying to look equally conservative on immigration, promoting a “dark days for America’s future” message. Lately he has returned to his message of optimism. I do expect the anti-Trump, anti-Cruz voters to rally around one candidate eventually. The question is whether it will be too late, i.e., after Super Tuesday.

_____________________________

Question:

How much do you think Hillary and Bernie’s positions on climate change and fossil fuels will play into the Democratic winner?

Trip Gabriel:

I was interested to see in the new Q-Pac poll that 11 percent of Dems ranked climate change as their top issue, a pretty strong showing (only health care and the economy ranked higher). Sanders was earlier and stronger on climate change, opposing the Keystone XL pipeline for example, but Clinton has since rolled out strong proposals, going beyond even the Obama administration. If climate change is your top issue, you’d probably be happy with either candidate at this point and might be also asking about who would be most effective in getting something done.

_____________________________

Question:

Who do you think is most likely to win the Iowa Caucuses on either side? Will you be in a caucus room while it is happening? If so, what will it be like?

Trip Gabriel:

As of this moment (and this really does change moment to moment), I’m expecting good nights for Trump and Clinton. Take it with a grain of salt, or maybe a whole shaker — we “experts” have been wrong over and over about the races this year.

_____________________________

Question:

Something I haven’t seen anywhere: What is your plan after Iowa?

Trip Gabriel:

Heading to South Carolina, the first-in-the-South primary.•



From the January 18, 1955 Brooklyn Daily Eagle.




I don’t think earthlings should travel to Mars by 2025. We’re in a rush, sure, but probably not in that much of a hurry. My own hope would be that in the near-term future we send unpeopled probes to our neighbor, loaded with 3D printers that begin experimenting with building a self-sustaining colony.

Of course, I’m not a billionaire, so my vote really won’t amount to much. The best argument that Elon Musk and other nouveau space entrepreneurs have for leading us at warp speed into being a multi-planet species isn’t only existential risk but also that the next generation of fabulously wealthy technologists may turn their attention from the skies. It wouldn’t be the first time the stars lost our interest.

A transcript of Musk discussing space exploration at last week’s 2016 StartmeupHK Venture Forum in Hong Kong:

Question:

Let’s get even more way out there and talk about SpaceX. You’ve said that your ultimate goal is getting to Mars. Why is Mars important? Why does Mars matter?

Elon Musk:

It’s really a fundamental decision we need to make as a civilization. What kind of future do we want? Do we want a future where we’re forever confined to one planet until some eventual extinction event, however far in the future that might occur? Or do we want to become a multi-planet species, and then ultimately be out there among the stars, among many planets, many star systems? I think the latter is a far more exciting and inspiring future than the former.

Mars is the next natural step. In fact, it’s really the only planet we have a shot of establishing a self-sustaining city on. I think once we do establish such a city, there will be a strong forcing function for the improvement of spaceflight technology that will then enable us to establish colonies elsewhere in the solar system and ultimately extend beyond our solar system.

There’s the defensive reason of protecting the future of humanity, ensuring that the light of consciousness is not extinguished should some calamity befall Earth. That’s the defensive reason, but personally I find what gets me more excited is that this would be an incredible adventure–like the greatest adventure ever. It would be exciting and inspiring, and there needs to be things that excite and inspire people. There have to be reasons why you get up in the morning. It can’t just be solving problems. It’s got to be something great is going to happen in the future.

Question:

It’s not an exit strategy or back-up plan for when Earth fails. It’s also to inspire people and to transcend and go beyond our mental limits of what we think we can achieve.

Elon Musk:

Think of how sort of incredible the Apollo program was. If you ask anyone to name some of humanity’s greatest achievements of the 20th century, the Apollo program, landing on the moon, would in many places be number one.

Question:

When will there be a manned SpaceX mission and when will you go to Mars?

Elon Musk:

We’re pretty close to sending crew up to the Space Station. That’s currently scheduled for the end of next year. So that will be exciting, with our Dragon 2 spacecraft. Then we’ll have a next-generation rocket and spacecraft beyond the Falcon-Dragon series, and I’m hoping to describe that architecture later this year at the International Astronautical Congress, which is the big international space event every year. I think that will be quite exciting.

In terms of me going, I don’t know, maybe four or five years from now. Maybe going to the Space Station would be nice. In terms of the first flights to Mars, we’re hoping to do that around 2025. Nine years from now or thereabouts. 

Question:

Oh my goodness, that’s right around the corner.

Elon Musk:

Well, nine years. Seems like a long time to me.

Question:

Are you doing the zero-gravity training?

Elon Musk:

I’ve done the parabolic flights. Those are fun.

Question:

You must be reading up and doing the physical work to get ready for the ultimate flight of your life.

Elon Musk:

Umm, I don’t think it’s that hard, honestly. Just float around. It’s not that hard to float around. [Laughter] Well, going to Mars is going to be hard and dangerous and difficult in every way, and if you care about being safe and comfortable, going to Mars would be a terrible choice.



It’s not that there’s nothing of use in John O’Sullivan’s Wall Street Journal “Saturday Essay” about this upside-down American election season, but it’s built, in part, on shaky and partisan foundations. It argues that President Obama’s use of executive orders is an unprecedented outlier that has caused the nation to be torn asunder. Except that both Presidents George W. Bush and Bill Clinton issued far more during their terms in office. The elder President Bush was on pace to do so as well had he won a second term. The same goes for many earlier Commanders in Chief.

With regard to the Affordable Care Act, O’Sullivan uses the phrase “pushed through,” language that makes it seem as if something unfair or uncommon occurred. Pushing agendas through Congress is something the Oval Office has always done.

Let’s recall that the GOP was holding meetings prior to Obama’s inauguration to plan to torpedo his Presidency. The divisiveness wasn’t a reaction but a preemptive strike.

O’Sullivan is correct in saying the Left and Right alike have been disappointed with Obama for different reasons, though you have to wonder in those cases if the fault lies with him or if no President could satisfy such a factious moment in our nation’s history. An excerpt:

President Barack Obama is the catalyst that made everything boil over. It shouldn’t be surprising. He proclaimed that he wanted to transform America fundamentally. While the Democrats controlled Congress, he pushed through the semi-nationalization of health care. Since the Democrats lost control, he has pushed his presidential authority to the very limits of the Constitution to secure his agenda on immigration, treaty-making with Iran, global warming and much else.

Mr. Obama has succeeded in getting a majority-Republican Congress to eschew its power of the purse and finance almost his entire agenda. Only the courts have effectively blocked his extensions of lawmaking and regulatory power, and that battle is still being waged. So it would be very odd if people didn’t conclude that a determined president could achieve almost anything he wanted if he were bold enough—and that Mr. Obama has done so.

As a result, his period in office has provoked rebellious popular movements outside Washington on the right and, more surprisingly, on the left.•



Do you want a digital assistant 10,000 times more useful than Siri? A voice-activated universal remote that runs your life? I suppose the answer is “yes.”

Moore’s Law made supercomputers of yore affordable and portable for almost everyone, stealing them from the domain of superwealthy corporations and states and sliding them into our shirt pockets. Similarly, efforts are being made to create AI that acts as a voice-activated universal remote for our lives, anticipating and satisfying our needs. We may soon be able to enjoy the benefits of a “staff” the way our richer brethren do. 

The thing is, most of the new technologies have not created more leisure. Will these tools, if realized, be the same? If they do actually reduce toil, what will we use the extra bandwidth for?

From Zoë Corbyn’s Guardian article about Dag Kittlaus’ attempts to create not Frankenstein but Igor:

Kittlaus is the co-founder and CEO of Viv, a three-year-old AI startup backed by $30m, including funds from Iconiq Capital, which helps manage the fortunes of Mark Zuckerberg and other wealthy tech executives. In a blocky office building in San Jose’s downtown, the company is working on what Kittlaus describes as a “global brain” – a new form of voice-controlled virtual personal assistant. With the odd flashes of personality, Viv will be able to perform thousands of tasks, and it won’t just be stuck in a phone but integrated into everything from fridges to cars. “Tell Viv what you want and it will orchestrate this massive network of services that will take care of it,” he says.

It is an ambitious project but Kittlaus isn’t without a track record. The last company he co-founded invented Siri, the original virtual assistant now standard in Apple products. Siri Inc was acquired by the tech giant for a reported $200m in 2010. The inclusion of the Siri software in the iPhone in 2011 introduced the world to a new way to interact with a mobile device. Google and Microsoft soon followed with their versions. More recently they have been joined by Amazon, with the Echo you can talk to, and Facebook, with its experimental virtual assistant, M.

But, Kittlaus says, all these virtual assistants he helped birth are limited in their capabilities. Enter Viv. “What happens when you have a system that is 10,000 times more capable?” he asks. “It will shift the economics of the internet.”•



10 search-engine keyphrases bringing traffic to Afflictor this week:

  1. native american vice president
  2. menachem begin’s death
  3. andre the giant and samuel beckett
  4. insanity people who think they’re napoleon
  5. dead communards
  6. tama janowitz in the 1980s
  7. can you sell body parts?
  8. edward snowden describing moscow
  9. mike gimbel moneyball
  10. helen gurley brown at cosmo

This week, Republican Iowans must choose between two large, orange objects equally qualified to be President.


  • Tyler Cowen reviews Robert Gordon’s The Rise and Fall of American Growth.
  • Ed Miliband offers potential prescriptions for the world’s wealth-inequality woes.
  • Steve Jobs was mourned in office parks and Zuccotti Park. Why.
  • Stack Fallacy may explain why elephantine businesses are slain by mice.


The rule changes spearheaded by President Teddy Roosevelt in 1905 to make football a safer game didn’t have the desired impact immediately, and in the long run seem to have done more harm than good. In fact, it’s essentially the gridiron version of the “Drunken Stagger.”

The game’s first existential crisis, with players regularly dying and incurring serious injuries, led to calls for its abolition. Roosevelt, a fan of the rough pastime among other macho endeavors that ran counter to his genteel upbringing, stepped in to reform the sport, imploring influential college officials to reduce brutality and institute the forward pass. No short-range benefit was observed, however, as deaths actually spiked by the end of the decade.

A combination of continued tweaking of rules and improved equipment did eventually make football largely free of fatality, ending the cries for its ban. That, of course, allowed for its continuance and enabled the quiet devastation of brain injuries, something that has only recently begun receiving the necessary attention.

An article in the November 21, 1909 New York Times addressed the carnage. An excerpt:

CHICAGO — Twenty-six killed, seventy seriously injured, and scores of others painfully hurt has been the cost of football to the United States thus far this year, according to figures collected by the Chicago Tribune. The list of the dead seems to be a decisive answer, the Chicago paper says, to the assertion of the football experts that the development of the open game would lead to the lessening of the perils of the gridiron.

The number of deaths is the highest it has been in years, and is almost double that of either of the two seasons recently passed. In 1907 there were only fourteen deaths, and in 1908 only thirteen.

It should be noted that The Tribune’s total includes a number of players hurt in games played during the past year or even earlier, who have died during the current twelvemonth. 

The facts also seem to disprove the claim of the game’s supporters that it is the games of untrained boys and the athletic clubs that cause the fatalities. Of this year’s dead the majority were college players, supposed to have been hardened and made fit for the contests on the gridiron by expert coaches and long preparation.

As a result of the numerous fatalities and the agitation which they have stirred up, several colleges have disbanded their teams, and many of the city High Schools in various parts of the country have been forced to give up the sport.

Virginia May Forbid the Game

The State of Virginia will probably be the one which will give the heaviest blow to football. Following the death of one of the State University players and the injury of several of her youths within the State, a bill will be introduced into the Legislature at the next session to forbid such contests in the future. It is expected that this bill will be passed. Already the City Council of Norfolk and Portsmouth have forbidden all the contests within the city limits.

The death which attracted the most attention throughout the country, and which revived to a large extent the movement for the suppression of football, was that of Cadet Byrne, a West Point cadet. Byrne was an upper classman, 22 years old, when he was fatally injured during the contest with Harvard University. His neck was broken during a mass play, and despite the fact that every attempt was made to save his life, he died soon after.•

Some of the things contemporary consumers most desire to possess are tangible (smartphones) and others not at all (Facebook, Instagram, etc.). In fact, many want the former mainly to get the latter. A social media “purchase” requires no money but is a trade of information for attention, a dynamic that’s been widely acknowledged, but one that still stuns me. Our need to share ourselves–to write our names Kilroy-like on a wall, as Hunter S. Thompson once said–is etched so deeply in our brains. Manufacturers have used psychology to sell for at least a century, but the transaction has never been purer, never required us to not only act on impulse but to publish that instinct as well. Judging by the mood of America, this new thing, while it may provide some satisfaction, also promotes an increased hunger in the way sugar does. And while the Internet seems to encourage individuality, its mass use and many memes suggest something else.

On a somewhat related topic: Rebecca Spang’s Financial Times article analyzes a new book which argues that a consumerist shift is more a political movement than we’d like to believe, often a culmination of large-scale state decisions rather than of personal choice. The passage below refers to material goods, but I think the implications for the immaterial are the same. The excerpt:

In Empire of Things, Frank Trentmann brings history to bear on all these questions. His is not a new subject, per se, but his thick volume is both an impressive work of synthesis and, in its emphasis on politics and the state, a timely corrective to much existing scholarship on consumption. Based on specialist studies that range across five centuries, six continents and at least as many languages, the book is encyclopedic in the best sense. In his final pages, Trentmann intentionally or otherwise echoes Diderot’s statement (in his own famous Encyclopédie) that the purpose of an encyclopedia is to collect and transmit knowledge “so that the work of preceding centuries will not become useless to the centuries to come”. Empire of Things uses the evidence of the past to show that “the rise of consumption entailed greater choice but it also involved new habits and conventions . . . these were social and political outcomes, not the result of individual preferences”. The implications for our current moment are significant: sustainable consumption habits are as likely to result from social movements and political action as they are from self-imposed shopping fasts and wardrobe purges.

When historians in the 1980s-1990s first shifted from studying production to consumption, our picture of the past became decidedly more individualist. In their letters and diaries, Georgian and Victorian consumers revealed passionate attachments to things — those they had and those they craved. Personal tastes and preferences hence came to rival, then to outweigh, abstract processes (industrialisation, commodification, etc) as explanations for historical change. The world looked so different! Studied from the vantage point of production, the late 18th and 19th centuries had appeared uniformly dark and dusty with soot; imagined from the consumer’s perspective, those same years glowed bright with an entire spectrum of strange, distinct colours (pigeon’s breast, carmelite, eminence, trocadero, isabella, Metternich green, Niagra [sic] blue, heliotrope). At the exact moment when Soviet power seemed to have collapsed chiefly from the weight of repressed consumer desire, consumption emerged as a largely positive, almost liberating, historical force. “Material culture” became a common buzzword; “thing theory” — yes, it really is a thing — was born.•



Asking if innovation is over is no less narcissistic than suggesting that evolution is done. It flatters us to think that we’ve already had all the good ideas, that we’re the living end. More likely, we’re always closer to the beginning.

Of course, when looking at relatively short periods of time, there are ebbs and flows in invention that have serious ramifications for the standard of living. In Robert Gordon’s The Rise and Fall of American Growth, the economist argues that the 1870-1970 period was a golden age of productivity and development unknown previously and unmatched since.

In an excellent Foreign Affairs review, Tyler Cowen, who himself has worried that we’ve already picked all the low-hanging fruit, lavishly praises the volume–“likely to be the most interesting and important economics book of the year.” But in addition to acknowledging a technological slowdown in the last few decades, Cowen also wisely counters the book’s downbeat tone while recognizing the obstacles to forecasting, writing that “predicting future productivity rates is always difficult; at any moment, new technologies could transform the U.S. economy, upending old forecasts. Even scholars as accomplished as Gordon have limited foresight.” In fact, he points out that the author, before his current pessimism, predicted earlier this century very healthy growth rates.

My best guess is that there will always be transformational opportunities, ripe and within arm’s length, waiting for us to pluck them.

An excerpt:

In the first part of his new book, Gordon argues that the period from 1870 to 1970 was a “special century,” when the foundations of the modern world were laid. Electricity, flush toilets, central heating, cars, planes, radio, vaccines, clean water, antibiotics, and much, much more transformed living and working conditions in the United States and much of the West. No other 100-year period in world history has brought comparable progress. A person’s chance of finishing high school soared from six percent in 1900 to almost 70 percent, and many Americans left their farms and moved to increasingly comfortable cities and suburbs. Electric light illuminated dark homes. Running water eliminated water-borne diseases. Modern conveniences allowed most people in the United States to abandon hard physical labor for good.

In highlighting the specialness of these years, Gordon challenges the standard view, held by many economists, that the U.S. economy should grow by around 2.2 percent every year, at least once the ups and downs of the business cycle are taken into account. And Gordon’s history also shows that not all GDP gains are created equal. Some sources of growth, such as antibiotics, vaccines, and clean water, transform society beyond the size of their share of GDP. But others do not, such as many of the luxury goods developed since the 1980s. GDP calculations do not always reflect such differences. Gordon’s analysis here is mostly correct, extremely important, and at times brilliant—the book is worth buying and reading for this part alone.

Gordon goes on to argue that today’s technological advances, impressive as they may be, don’t really compare to the ones that transformed the U.S. economy in his “special century.” Although computers and the Internet have led to some significant breakthroughs, such as allowing almost instantaneous communication over great distances, most new technologies today generate only marginal improvements in well-being. The car, for instance, represented a big advance over the horse, but recent automotive improvements have provided diminishing returns. Today’s cars are safer, suffer fewer flat tires, and have better sound systems, but those are marginal, rather than fundamental, changes. That shift—from significant transformations to minor advances—is reflected in today’s lower rates of productivity.•



An Economist article looks at the latest report on automation by Carl Benedikt Frey, Michael Osborne and Craig Holmes, which argues that poorer nations are more likely than, say, America, to be prone to technological unemployment despite the U.S. holding an advantage in AI.

Because such countries are not yet as widely engaged in information work, their Industrial Age could be interrupted mid-epoch before they arrive at the Information Age. It’s like being pushed down a ladder when you’ve only scaled it part of the way. The academics acknowledge, though, that everything from policy to consumer preference may forestall the rise of the machines in India, China and elsewhere. After all, Foxconn’s promised one-million-robot factory workforce has yet to be realized.

An excerpt:

BILL BURR, an American entertainer, was dismayed when he first came across an automated checkout. “I thought I was a comedian; evidently I also work in a grocery store,” he complained. “I can’t believe I forgot my apron.” Those whose jobs are at risk of being displaced by machines are no less grumpy. A study published in 2013 by Carl Benedikt Frey and Michael Osborne of Oxford University stoked anxieties when it found that 47% of jobs in America were vulnerable to automation. Machines are mastering ever more intricate tasks, such as translating texts or diagnosing illnesses. Robots are also becoming capable of manual labour that hitherto could be carried out only by dexterous humans.

Yet America is the high ground when it comes to automation, according to a new report* from the same pair along with other authors. The proportion of threatened jobs is much greater in poorer countries: 69% in India, 77% in China and as high as 85% in Ethiopia. There are two reasons. First, jobs in such places are generally less skilled. Second, there is less capital tied up in old ways of doing things. Driverless taxis might take off more quickly in a new city in China, for instance, than in an old one in Europe.

Attracting investment in labour-intensive manufacturing has been a route to riches for many developing countries, including China. But having a surplus of cheap labour is becoming less of a lure to manufacturers. An investment in industrial robots can be repaid in less than two years. This is a particular worry for the poor and underemployed in Africa and India, where industrialisation has stalled at low levels of income—a phenomenon dubbed “premature deindustrialisation” by Dani Rodrik of Harvard University.•
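The claim that an industrial robot pays for itself in under two years is simple arithmetic, and worth making concrete. A minimal sketch follows; every figure in it is an illustrative assumption of mine, not a number from the report:

```python
# Payback period for an industrial robot: years until cumulative savings
# cover the purchase. All figures are assumed for illustration only;
# the Economist piece cites no specific numbers.

robot_cost = 250_000          # assumed: purchase plus integration, USD
annual_running_cost = 10_000  # assumed: power and maintenance, USD/year
displaced_wages = 150_000     # assumed: annual wages the robot displaces, USD

annual_saving = displaced_wages - annual_running_cost
payback_years = robot_cost / annual_saving
print(f"payback: {payback_years:.1f} years")  # 1.8 years with these inputs
```

Plug in lower local wages and the payback stretches, which is exactly the margin poorer countries have been competing on, and which falling robot prices keep eroding.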



We know football is horrible for the game’s players, the head injuries traumatic and unavoidable regardless of the equipment. The question is whether this truth is an existential threat for the most popular team sport in America. It was for boxing, once not that long ago the king of U.S. athletics. But prizefighting was an ever-changing hodge-podge of crooked promoters and money men, whereas the NFL is a unified–and crooked–billion-dollar corporation. Can it find some way to keep kids playing a game that will ruin them?

Two recent tragic examples underline the seriousness of the crisis: The physical and mental deterioration at 36 of former wide receiver Antwaan Randle-El and the troubling post-mortem of ex-Giant Tyler Sash. In the latter case, a study of brain tissue conducted after the fatal overdose of the increasingly erratic retired safety proved he suffered from CTE (Chronic Traumatic Encephalopathy), a degenerative condition caused by repeated concussions and (most likely) sub-concussive impacts. 

CTE has thus far shown up in the tissue of many former football players who’ve died, but the rub is that there’s no way to test for it in the living. That may soon change, and if it does, it could be a game-changer for football and other contact sports. From Jack Encarnacao at the Boston Herald:

As it stands, an athlete has to be dead before he can be diagnosed with Chronic Traumatic Encephalopathy, the trauma-induced brain disease prominent in ex-football players. The disease manifests in a way that standard scans can’t detect, so there’s no way to advise a player to hang it up before irreversible damage is done.

Leading concussion researcher Dr. Robert Cantu of Boston University sees a day when this will change.

“I think we’re within a fairly short window, I hope no more than a few years, of being able to detect CTE in living people with almost 100 percent certainty,” Cantu told me in a sit-down interview for the second installment of my podcast series “Unfiltered,” which continues this week on Boston Herald Radio.

The key, Cantu said, is identifying a marker specific to CTE that a brain scan can pick up. A radioactive substance in tau — the protein at the heart of CTE — may be that marker, but current tests produce smudgy images that make it hard to discern, he said.

“Images will only get better over time, and hopefully soon it will be ready for prime time,” Cantu said.•



The late, great AI pioneer Marvin Minsky referred to us as “meat machines,” which irked many (very biased) humans. The more polite phrase subsequently coined to describe our brains in computer terms is “wetware.” Regardless of the vernacular, I think we’re essentially machines, though (for a little while longer) easily the most complex ones.

On that topic, John Pavlus of Quanta has an interesting interview with Harvard computer scientist Leslie Valiant, who believes all biology is computational, that “ecorithms” underlie life the way algorithms do machines. To the researcher, learning is learning, human or AI, though there are significant differences in stimuli (external, unpredictable vs. internal, predictable). Not everyone may agree with Valiant, but we’re a far cry from the brickbats he would have received for his beliefs in the 1980s when he began working on machine learning, a field then very belittled if not verboten.

An excerpt:

Question:

So what is learning? Is it different from computing or calculating?

Leslie Valiant:

It is a kind of calculation, but the goal of learning is to perform well in a world that isn’t precisely modeled ahead of time. A learning algorithm takes observations of the world, and given that information, it decides what to do and is evaluated on its decision. A point made in my book is that all the knowledge an individual has must have been acquired either through learning or through the evolutionary process. And if this is so, then individual learning and evolutionary processes should have a unified theory to explain them.

Question:

And from there, you eventually arrived at the concept of an “ecorithm.” What is an ecorithm, and how is it different from an algorithm?

Leslie Valiant:

An ecorithm is an algorithm, but its performance is evaluated against input it gets from a rather uncontrolled and unpredictable world. And its goal is to perform well in that same complicated world. You think of an algorithm as something running on your computer, but it could just as easily run on a biological organism. But in either case an ecorithm lives in an external world and interacts with that world.

Question:

So the concept of an ecorithm is meant to dislodge this mistaken intuition many of us have that “machine learning” is fundamentally different from “non-machine learning”?

Leslie Valiant:

Yes, certainly. Scientifically, the point has been made for more than half a century that if our brains run computations, then if we could identify the algorithms producing those computations, we could simulate them on a machine, and “artificial intelligence” and “intelligence” would become the same. But the practical difficulty has been to determine exactly what these computations running on the brain are. Machine learning is proving to be an effective way of bypassing this difficulty.•
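For the code-minded, the loop Valiant describes is easy to make concrete: observe a world you haven’t modeled, act, get evaluated, adjust. Below is a toy sketch of that idea, assuming nothing from Valiant’s book; the noisy threshold environment and the perceptron-style update are my own illustrative choices:

```python
import random

# A toy "ecorithm" loop: the learner's algorithm is fixed, but its
# performance is judged against an environment it did not model ahead
# of time. The hidden rule and the noise rate are assumptions.

def environment(x, noise=0.1):
    """The uncontrolled world: labels points by a hidden rule, imperfectly."""
    label = 1 if 2.0 * x - 1.0 > 0 else -1           # hidden rule: x > 0.5
    return -label if random.random() < noise else label

def run_learner(rounds=10_000):
    w, b = 0.0, 0.0                                   # learner's internal state
    mistakes = 0
    for _ in range(rounds):
        x = random.uniform(-1.0, 1.0)                 # observation arrives
        guess = 1 if w * x + b > 0 else -1            # learner acts
        truth = environment(x)                        # world evaluates the act
        if guess != truth:
            mistakes += 1
            w += truth * x                            # perceptron-style update
            b += truth
    return mistakes / rounds

print(f"error rate: {run_learner():.3f}")  # settles somewhat above the 10% noise
```

The same loop runs unchanged whether the state lives in silicon or, as Valiant argues, in something biological; that indifference is the point of the ecorithm framing.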


From the October 12, 1885 Brooklyn Daily Eagle.



“L.A. 2013” is a 1988 Los Angeles Times feature that imagined life in the future for a family of four–and their robots. The feature dreamed too big in some cases and not enough in others, though it did see smart homes, quantified health, personalization, etc. An excerpt:

6 A.M.

WITH A BARELY perceptible click, the Morrow house turns itself on, as it has every morning since the family had it retrofitted with the Smart House system of wiring five years ago. Within seconds, warm air whooshes out of heating ducts in the three bedrooms, while the water heater checks to make sure there’s plenty of hot water. In the kitchen, the coffee maker begins dripping at the same time the oven switches itself on to bake a fresh batch of cinnamon rolls. Next door in the study, the family’s personalized home newspaper, featuring articles on the subjects that interest them, such as financial news and stories about their community, is being printed by laser-jet printer off the home computer–all while the family sleeps.

6:30 A.M.

With a twitch, “Billy Rae,” the Morrows’ mobile home robot, unplugs himself from the kitchen wall outlet–where he has been recharging for the past six hours–then wheels out of the kitchen and down the hall toward the master bedroom for his first task of the day. Raising one metallic arm, Billy Rae gently knocks on the door, calling out the Morrows’ names and the time in a pleasant, if Southern drawl: ‘Hey, y’all–rise an’ shine!’

On the other side of the door is Alma Morrow, a 44-year-old information specialist. Pulling on some sweats, Alma heads for the tiny home gym, where she slips a credit–card-size X–ER Script–her personal exercise prescription–into a slot by the door. Electronic weights come out of the wall, and Alma begins her 20-minute workout.

Meanwhile, her husband, Bill, 45, a senior executive at a Los Angeles–based multinational corporation, is having a harder time. He’s still feeling exhausted from the night before, when his 70-year-old mother, Camille, who lives with the family, accidentally fell asleep with a lighted cigarette. Minutes after the house smoke detector notified them of a potential hazard, firefighters from the local station were pounding on the front door. Camille, one of the last of the old–time smokers, had blamed the accident on these “newfangled Indian cigarettes” she’s been forced to buy since India has overtaken the United States in cigarette production. Luckily, she only singed a pillowcase–and her considerable pride. Bill, however, had been unable to fall back asleep and had spent a couple of hours in the study at the personal computer, teleconferring with his counterparts in the firm’s Tokyo office. But this morning, he can’t afford to be late. With a grunt, he rolls out of bed and heads for the bathroom, where he swishes and swallows Denturinse–much easier and more effective than toothbrushing–and then hurries to get dressed. As he does, the video intercom buzzes. Camille’s collagen-improved face appears on the video screen, her gravelly voice booming over the speaker. Bill clicks off the camera on his side so Camille can’t see him in his boxer shorts, then talks to her. She tells him she wants him to drive her downtown to finalize her retirement plan with her attorney. Knowing this will make him late, he suggests that Alma could drop Camille off at the law firm’s branch office in the Granada Hills Community Center. Camille reluctantly agrees– much to Alma’s chagrin–then buzzes off. When the couple heads for the kitchen, they leave the bed unmade: Billy Rae can change the sheets.•

In his great song “Pretty Boy Floyd,” Woody Guthrie, knowing that when it comes to crime a collar can be white just as easily as blue, sang these words:

Yes, as through this world I’ve wandered
I’ve seen lots of funny men;

Some will rob you with a six-gun,
And some with a fountain pen.

For those who employ the latter modus operandi, not even a stylus, let alone a pen, is necessary anymore. Over the last four decades in the U.S. (and much of the rest of the developed world), money has mysteriously moved from the middle class into the accounts of the 1%, and no one seems completely sure how it was transferred. We only know that it’s shifted, that it’s been shifty. Maybe it was the manipulation of tax codes or the decline of unions or the rise of the machines or the forces of globalization or the invention of outlandish Wall Street products. Probably it was all of that and more. The result is the disappearance of the prosperity enjoyed by a far greater percentage of Americans in the aftermath of WWII through the early 1970s, which was created by a humming capitalist engine paired with severe progressive tax rates that redistributed the wealth. No one need want to return to the pre-Civil Rights United States–wildly uneven in other odious ways–but there are some economic lessons to be learned there.

One thing that seems sure is the vast accumulation of riches at the top isn’t the end result of a successful experiment in meritocracy. These are not uniformly the best, the brightest and the most deserving. Similarly, the shit-out-of-luck souls aren’t on the ever-widening bottom because of any defect of character or lack of work ethic. Some may drink or use drugs or divorce, but so do those whose wealth provides a cushion for such failings common to mere mortals. The main reason that poor people are so is because, at long last, they don’t have any money. They haven’t failed the system. Quite the contrary.

In a London Review of Books essay, Ed Miliband, the leader of the British Labour Party prior to Jeremy Corbyn, opines on the haves, the have-nots and the what-the-fuck situation we all find ourselves in, the eclipsed and the sun-kissed alike. The politician, who believes that beyond sheer unfairness, inequality ultimately inhibits economic growth, offers some prescriptions. The opening:

‘What do I see in our future today you ask? I see pitchforks, as in angry mobs with pitchforks, because while … plutocrats are living beyond the dreams of avarice, the other 99 per cent of our fellow citizens are falling farther and farther behind.’ Who said this? Jeremy Corbyn? Thomas Piketty? In fact it was Nick Hanauer, an American entrepreneur and multibillionaire, who in a TED talk in 2014 confessed to living a life that the rest of us ‘can’t even imagine’. Hanauer doesn’t believe he’s particularly talented or unusually hardworking; he doesn’t believe he has a great technical mind. His success, he says, is a ‘consequence of spectacular luck, of birth, of circumstance and of timing’. Just as his own extraordinary wealth can’t be explained by his unique talents, neither, he says, can rising inequality in the United States be justified on the grounds that it is a side effect of a broader economic success from which everyone benefits. As Henry Ford recognised, if you don’t pay ordinary workers decent wages, the economy will lack the demand to sustain economic growth.

Hanauer is in the vanguard of the ‘Fight for 15’, the campaign for a $15 minimum wage. Like Bill Gates and Warren Buffett, who have also issued loud warnings about inequality, he is heir to a long tradition of social concern among the wealthy in the US. They have reason to be worried. The last time inequality reached comparable levels was shortly before the Wall Street Crash. As Anthony Atkinson shows in Inequality: What Can Be Done?, inequality in the US fell for decades after the crash, before beginning to rise again in the 1970s. Since then the gap between the wealthy and the rest has grown steadily wider. The top 1 per cent now has nearly 20 per cent of total US personal income. In the 1980s, inequality in the UK went up even more sharply than in the US. Since then, overall UK inequality has been relatively stable but the income share of the top 1 per cent has increased significantly and now accounts for about 12 per cent of UK personal income. The important factors are rising inequality in wages, a decline in the share of the national income that wages represent as more money goes to corporate profits and dividends, and a reversal of redistribution from the rich to the poor.

The rise in inequality should not, Atkinson insists, be brushed aside as an inevitable effect of irresistible forces such as globalisation or developments in technology. It is driven by political choices.•



AI cracked backgammon in 1979, putting all other games on notice. But today’s announcement about a Google computer system besting a human Go champion was still surprising since most researchers thought we were years, perhaps a decade, from machine intelligence accomplishing such a feat in the complex, ancient game. What does this mean for Artificial General Intelligence and where does research head next? In a Conversation piece, Peter Cowling and Sam Devlin try to answer. An excerpt:

However the real world is a step up, full of ill-defined questions that are far more complex than even the trickiest of board games. The techniques which conquered Go can certainly be applied in medicine, education, science or any other domain where data is available and outcomes can be evaluated and understood.

The big question is whether Google just helped us towards the next generation of Artificial General Intelligence, where machines learn to truly think like – and beyond – humans. Whether we’ll see AlphaGo as a step towards Hollywood’s dreams (and nightmares) of AI agents with self-awareness, emotion and motivation remains to be seen. However the latest breakthrough points to a brave new future where AI will continue to improve our lives by helping us to make better-informed decisions in a world of ever-increasing complexity.

Now that Go has seemingly been cracked, AI needs a new grand challenge – a new “lab rat” – and it seems likely that many of these challenges will come from the $100 billion digital games industry. The ability to play alongside or against millions of engaged human players provides unique opportunities for AI research. At York’s centre for Intelligent Games and Game Intelligence, we’re working on projects such as building an AI aimed at player fun (rather than playing strength), for instance, or using games to improve well-being of people with Alzheimer’s. Collaborations between multidisciplinary labs like ours, the games industry and big business are likely to yield the next big AI breakthroughs.•
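As a rough illustration of the family of techniques involved: AlphaGo couples deep neural networks with Monte Carlo tree search, whose simplest ancestor just scores each legal move by random playouts. The sketch below applies that flat Monte Carlo idea to Nim, a deliberately tiny stand-in for Go; the choice of game and the playout count are my assumptions, not anything from DeepMind’s system:

```python
import random

# Flat Monte Carlo move selection, the simplest relative of the tree
# search inside AlphaGo: estimate each move's value by random playouts,
# then pick the best. Toy game: Nim, take 1-3 stones, last stone wins.

def legal_moves(stones):
    return [m for m in (1, 2, 3) if m <= stones]

def random_playout(stones, my_turn):
    """Finish the game with random moves; return True if we take the last stone."""
    while stones > 0:
        stones -= random.choice(legal_moves(stones))
        my_turn = not my_turn
    return not my_turn            # the side that just moved took the last stone

def monte_carlo_move(stones, n_playouts=2000):
    scores = {}
    for m in legal_moves(stones):
        wins = sum(random_playout(stones - m, my_turn=False)
                   for _ in range(n_playouts))
        scores[m] = wins / n_playouts                # estimated win rate
    return max(scores, key=scores.get)

print(monte_carlo_move(10))  # usually 2: leaving a multiple of 4 is optimal
```

Roughly speaking, AlphaGo’s advance was to replace the uniform random choices above with a policy network that proposes plausible moves and a value network that judges positions, so the search spends its playouts where they matter.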

____________________________

“The possibilities of game play are endless.”



There’s never been greater access to books than there is right now, but all progress comes with a price. If print fiction and histories and such should disappear or become merely a luxury item, digital media would change the act of reading in unexpected ways over time.

Some see screen reading promoting a decline in analytical skills, but the human brain sure seems able to adapt to new forms once it becomes acclimated. Even as someone raised on paper books, I’m not worried that what’s lost in translation will be greater than what’s gained. Of course, I say that while still primarily using dead-tree volumes.

In a smart BBC Future article, Rachel Nuwer traces the fuzzy history of e-books and considers the future of reading. Some experts she interviews hope for a “bi-literate” society that values both the paperback and the Kindle. That would be a great outcome, but I don’t know how realistic a scenario it is. The opening:

When Peter James published his novel Host on two floppy disks in 1993, he was ill-prepared for the “venomous backlash” that would follow. Journalists and fellow writers berated and condemned him; one reporter even dragged a PC and a generator out to the beach to demonstrate the ridiculousness of this new form of reading. “I was front-page news of many newspapers around the world, accused of killing the novel,” James told pop.edit.lit. “[But] I pointed out that the novel was already dying at an alarming rate without my assistance.”

Shortly after Host’s debut, James also issued a prediction: that e-books would spike in popularity once they became as easy and enjoyable to read as printed books. What was a novelty in the 90s, in other words, would eventually mature to the point that it threatened traditional books with extinction. Two decades later, James’ vision is well on its way to being realised.

That e-books have surged in popularity in recent years is not news, but where they are headed – and what effect this will ultimately have on the printed word – is unknown. Are printed books destined to eventually join the ranks of clay tablets, scrolls and typewritten pages, to be displayed in collectors’ glass cases with other curious items of the distant past?

And if all of this is so, should we be concerned?•



Donald Trump, Pol Pot with hair plugs, is, like all bullies, a coward. His behavior stems from weakness and insecurity, so he can be handled. Jeb!, the favored son of a privileged family, doesn’t have adequate experience neutralizing such toxic types, but there are sure ways to deal. Lecture him about it in a mature way, like Bernie Sanders, and the hideous hotelier looks small. Aggressively return his obnoxious behavior, like Megyn Kelly, and he positively wilts.

Women in particular throw Trump for a loss because he’s spent his life making sure he’s in a superior position to the women in his life. He controls the purse strings and they should bleed in silence. Edward Luce’s latest Financial Times column about the 2016 race looks at this particular Trump shortcoming. The opening:

Hillary Clinton should be celebrating. Donald Trump’s decision to boycott the Fox News debate was ostensibly about ratings. How can the cable network make money without his celebrity pull? Mr Trump may prove his point when Thursday night’s viewership numbers come in.

But switching channels is not the same thing as showing up at a polling booth. More than half America’s electorate is female — they accounted for 53 per cent of the vote in the last election. Even the most apathetic will by now have heard Mr Trump’s opinions about Megyn Kelly, the Fox anchor, who will co-host the debate. Ms Kelly is a “bimbo”, according to Mr Trump, who is incapable of objectivity when there is “blood coming out of her whatever”.

So that is settled. Mr Trump thinks the menstrual cycle is a handicap. He also recoils at other female bodily functions. When Mrs Clinton took a bathroom break at a recent Democratic debate, Mr Trump described her as “disgusting”. He used the same word about an opposing lawyer in a 2011 hearing when she asked for a short break to pump breast milk. Looks are also fair game. Among those attacked for their appearance are the actress Bette Midler (“extremely unattractive”), Angelina Jolie (“she’s been with so many guys she makes me look like a baby”), media figure Arianna Huffington (“unattractive both inside and out”), fellow Republican candidate Carly Fiorina (“look at that face. Would anyone vote for that?”) and comedian Rosie O’Donnell (“fat pig”).

None of which has done Mr Trump’s ratings any harm. The more controversial a celebrity, the bigger audiences they attract. The question is whether there is any longer a meaningful distinction between show business and US politics. Do ratings equal votes?•



In a series of articles in the New York Review of Books over the last couple of years, Sue Halpern has taken a thought-provoking look at the dubious side of the Digital Era, considering the impact of tech billionaires, technological unemployment and the Internet of Things.

Her latest salvo tries to locate the real legacy of Steve Jobs, who was mourned equally in office parks and Zuccotti Park. In doing so she calls on the two recent films on the Apple architect, Alex Gibney’s and Danny Boyle’s, and the new volume about him by Brent Schlender and Rick Tetzeli. Ultimately, the key truth may be that Jobs used a Barnum-esque “magic” and marketing myths to not only sell his new machines but to plug them into consumers’ souls.

An excerpt:

So why, Gibney wonders as his film opens—with thousands of people all over the world leaving flowers and notes “to Steve” outside Apple Stores the day he died, and fans recording weepy, impassioned webcam eulogies, and mourners holding up images of flickering candles on their iPads as they congregate around makeshift shrines—did Jobs’s death engender such planetary regret?

The simple answer is voiced by one of the bereaved, a young boy who looks to be nine or ten, swiveling back and forth in a desk chair in front of his computer: “The thing I’m using now, an iMac, he made,” the boy says. “He made the iMac. He made the Macbook. He made the Macbook Pro. He made the Macbook Air. He made the iPhone. He made the iPod. He’s made the iPod Touch. He’s made everything.”

Yet if the making of popular consumer goods was driving this outpouring of grief, then why hadn’t it happened before? Why didn’t people sob in the streets when George Eastman or Thomas Edison or Alexander Graham Bell died—especially since these men, unlike Steve Jobs, actually invented the cameras, electric lights, and telephones that became the ubiquitous and essential artifacts of modern life?* The difference, suggests the MIT sociologist Sherry Turkle, is that people’s feelings about Steve Jobs had less to do with the man, and less to do with the products themselves, and everything to do with the relationship between those products and their owners, a relationship so immediate and elemental that it elided the boundaries between them. “Jobs was making the computer an extension of yourself,” Turkle tells Gibney. “It wasn’t just for you, it was you.”•



The odd game of Auto Polo was popularized in the summer of 1912 because of a marketing ploy by a Kansas Ford dealer trying to sell Model T’s. It soon became a craze in New York City, headlining at Madison Square Garden for most of December. Although the activity had initially been devised a decade earlier, it was this moment when the game got (relatively) big.

Dangerous as all fuck, the sport squared off teams of vehicles, each car holding two players–the driver and the mallet-wielder–trying to propel a ball between two posts. It thrived in New York and Chicago for most of the 1920s but disappeared before the arrival of the Great Depression. By then, cars were largely stable enough to sell themselves, even if most Americans couldn’t afford them. The photographs above are not from the MSG contests, but an article in the December 8, 1912 New York Times recalls that particular series. An excerpt:

Not a few of the dwellers or toilers along Automobile Row have been predicting a popular future for auto polo, the game from the South and West which gave the public a number of thrills as a game and furnished food for thought for the motor enthusiast at Madison Square Garden for the week that just ended. There had been rumors of the game from time to time, and people heard that the four-wheel “ponies” on which it was played provided as many sensational moments as the four-legged ones of the horse-polo match. But no one was quite prepared for the exhibition which took place in the arena still covered, oddly enough, with the tanbark of the Horse Show. 

As in regulation polo, the mallet is only a factor in the newer game. The horse, or in this case the car, is quite as important to success, if not more so. It was on the performance of the cars that the interest of automobile men naturally centered. Occasionally there was a bit of engine trouble, but for the most part the little machines, stripped to the bare frames and lacking even bonnets, stood up manfully under conditions that were grueling to say the least. Every canon of good motor car driving, from the viewpoint of the car, was broken time and again as the drivers sought to block the bounding leather ball or fed gas to their motors until the pop of explosions became an almost continuous roar in an effort to be the first “on” the elusive prize. Turns so short that they resulted in turnovers were made several times, but still the motors remained operable, to the surprise of the onlookers.

Whether the game can ever become general–even as general as pony polo–is a moot question. It involves, in the first place, a deal of expense, for, played in earnest and in the heat of the desire to win, a big repair bill would be inevitable. In other words, it would be an expensive thing to promote in a professional way.

It would be hard to devise a game in which the players took bigger chances of mishap. The factor of danger may prove either a damper or a stimulus. At any rate the game has definitely taken its place as a circus stunt crowded with thrills, and a demonstration of car ability which is a revelation even to the man who has driven his hundreds of miles at a mile-a-minute clip.•

[Images: Reed Hastings, who, ironically, offered to sell forty-nine percent of Netflix to Blockbuster in 2000 to act as an online arm for the video-rental chain.]

The particular rules Clayton Christensen laid down for disruptive innovation probably don’t much matter, since the world doesn’t operate within his constructs, but ginormous companies (even entire industries) being done in by much smaller ones has become an accepted part of life in the Digital Age.

In trying to explain this phenomenon, Christopher Mims of the Wall Street Journal explores the ideas in Anshu Sharma’s much-debated article about the “stack fallacy,” which argues that companies moving up beyond their core businesses are likely to fail (Google+, anyone?), while those moving down into the guts of what they know have a far better chance. For an example of the latter, Mims writes of the ride-sharing sector. An excerpt:

To really understand the stack fallacy, it helps to recognize that companies move “down” the stack all the time, and it often strengthens their position. It is the same thing as vertical integration. For example, engineers of Apple’s iPhone know exactly what they want in a mobile chip, so Apple’s move to make its own chips has yielded enormous dividends in terms of how the iPhone performs. In the same way, Google’s move down its own stack—creating its own servers, designing its own data centers, etc.—allowed it to become dominant in search. Similarly, Tesla’s move to build its own batteries could—as long as it allows Tesla to differentiate its products in terms of price and/or performance—be a deciding factor in whether or not it succeeds.

Of course, the real test of a sweeping business hypothesis is whether or not it has predictive power. So here’s a prediction based on the stack fallacy: We’re more likely to see Uber succeed at making cars than to see General Motors succeed at creating a ride-sharing service like Uber. Both companies appear eager to invade each other’s territory. But, assuming that ride sharing becomes the dominant model for transportation, Uber has the advantage of knowing exactly what it needs in a vehicle for such a service.

It is also worth noting that the stack fallacy is just that: a fallacy and not a law of nature. There are ways around it. The key is figuring out how to have true, firsthand empathy for the needs of the customer for whatever product you’re trying to build next.•

Tags:

In addition to yesterday’s trove of posts about the late Marvin Minsky, I want to refer you to a Backchannel remembrance of the AI pioneer by Steven Levy, the writer who had the good fortune to arrive on the scene at just the right moment in the personal-computer boom and the great talent to capture it. The journalist recalls Minsky’s wit and conversation almost as much as his contributions to tech. Just a long talk with the cognitive scientist was a perception-altering experience, even if his brilliance was intimidating.

[Editor’s note: It should be stated that Levy’s article appeared three years before Minsky was accused of participating in the rape of minor children as part of Jeffrey Epstein’s web of shocking criminality.]

The opening:

There was a great contradiction about Marvin Minsky. As one of the creators of artificial intelligence (with John McCarthy), he believed as early as the 1950s that computers would have human-like cognition. But Marvin himself was an example of an intelligence so bountiful, unpredictable and sublime that not even a million Singularities could conceivably produce a machine with a mind to match his. At the least, it is beyond my imagination to conceive of that happening.

But maybe Marvin could imagine it. His imagination respected no borders.

Minsky died Sunday night, at 88. His body had been slowing down, but that mind had kept churning. He was more than a pioneering computer scientist — he was a guiding light for what intellect itself could do. He was also our Yoda. The entire computer community, which includes all of us, of course, is going to miss him. 

I first met him in 1982; I had written a story for Rolling Stone about young computer hackers, and it was optioned by Jane Fonda’s production company. I traveled to Boston with Fonda’s producer, Bruce Gilbert; and Susan Lyne, who had engineered my assignment to begin with. It was my first trip to MIT; my story had been about Stanford hackers.

I was dazzled by Minsky, an impish man of clear importance whose every other utterance was a rabbit’s hole of profundity and puzzlement.•

Tags:

[Image: Dr. Wernher von Braun suited up in a space suit prior to entering Marshall Space Flight Center’s Neutral Buoyancy Simulator, 1967.]

Five Books did an excellent interview with geneticist Matthew Cobb on the topic of the “History of Science.” In discussing William E. Burrows’s really fun 1998 title, This New Ocean: The Story of the First Space Age, Cobb comments on Wernher von Braun, an erstwhile Nazi and American hero who directly oversaw the murders of Jewish prisoners and who wanted to gas monkey astronauts in outer space (I swear!). An excerpt:

Question:

You just mentioned Enceladus so, talking of space missions, we’ll go on to your next book: William Burrows’s This New Ocean: The Story of the First Space Age, published in 1998. What do you like about this book?

Matthew Cobb:

Space! Rockets! When it came out I was about to go on holiday and wanted a thick book to read. Burrows is a science journalist: not a historian or a scientist. I find it incredibly readable, very exciting. Although it was written by an American, it didn’t cover up the fact that Wernher von Braun, the brains behind the Apollo programme, was a Nazi Party member who was absolved for his involvement with the Hitler regime because he could build ICBMs. The book contains a good account—as good as there could be at the time, given the archives in the USSR hadn’t fully opened—of the huge advances the Russians made, which became obvious as they first flew up the Sputnik and then put the first man in space. I find it an extremely readable account of a time I grew up in—almost like a novel. I wasn’t reading it with a professional eye because I don’t know much about space history.

Question:

Burrows’s book is very dramatic—especially some of the moments like the first moon landing.

Matthew Cobb:

I remember it! I was 11 years old at the time. I was watching it with my uncle Brian in the middle of the night. Although I remember the excitement of seeing Neil Armstrong’s feet stepping down on to the ground, I was equally amazed by the fact that Brian was eating four Weetabix at three o’clock in the morning. We have lost a lot of the excitement about space flight. A year ago NASA trialled the Orion space capsule, which they may use to fly to Mars. The launch was in the middle of one of my lectures, so I decided to take a brief break and show the students the NASA live stream. You don’t see rocket launches on live TV anymore. The space shuttle has been scrapped and although there are rockets going to the Space Station, and private companies like SpaceX and Blue Origin developing reusable rockets, they don’t enjoy the same media attention as in the 60s and 70s. So we all sat and watched it—the students were very excited.•

Tags:
