Science/Tech


One of the best things I’ve read this year is an excellent longform conversation at the Baffler between Thomas Piketty and David Graeber, both of whom believe the modern financial system is passé, but only one of whom (Graeber) believes it’s certain to collapse. An exchange:

Moderators:

Is capitalism itself the cause of the problem, or can it be reformed?

Thomas Piketty:

One of the points that I most appreciate in David Graeber’s book is the link he shows between slavery and public debt. The most extreme form of debt, he says, is slavery: slaves belong forever to somebody else, and so, potentially, do their children. In principle, one of the great advances of civilization has been the abolition of slavery.

As Graeber explains, the intergenerational transmission of debt that slavery embodied has found a modern form in the growing public debt, which allows for the transfer of one generation’s indebtedness to the next. It is possible to picture an extreme instance of this, with an infinite quantity of public debt amounting to not just one, but ten or twenty years of GNP, and in effect creating what is, for all intents and purposes, a slave society, in which all production and all wealth creation is dedicated to the repayment of debt. In that way, the great majority would be slaves to a minority, implying a reversion to the beginnings of our history.

In actuality, we are not yet at that point. There is still plenty of capital to counteract debt. But this way of looking at things helps us understand our strange situation, in which debtors are held culpable and we are continually assailed by the claim that each of us “owns” between thirty and forty thousand euros of the nation’s public debt.

This is particularly crazy because, as I say, our resources surpass our debt. A large portion of the population owns very little capital individually, since capital is so highly concentrated. Until the nineteenth century, 90 percent of accumulated capital belonged to 10 percent of the population. Today things are a little different. In the United States, 73 percent of capital belongs to the richest 10 percent. This degree of concentration still means that half the population owns nothing but debt. For this half, the per capita public debt thus exceeds what they possess. But the other half of the population owns more capital than debt, so it is an absurdity to lay the blame on populations in order to justify austerity measures.

But for all that, is the elimination of debt the solution, as Graeber writes? I have nothing against this, but I am more favorable to a progressive tax on inherited wealth along with high tax rates for the upper brackets. Why? The question is: What about the day after? What do we do once debt has been eliminated? What is the plan? Eliminating debt implies treating the last creditor, the ultimate holder of debt, as the responsible party. But the system of financial transactions as it actually operates allows the most important players to dispose of letters of credit well before debt is forgiven. The ultimate creditor, thanks to the system of intermediaries, may not be especially rich. Thus canceling debt does not necessarily mean that the richest will lose money in the process.

David Graeber:

No one is saying that debt abolition is the only solution. In my view, it is simply an essential component in a whole set of solutions. I do not believe that eliminating debt can solve all our problems. I am thinking rather in terms of a conceptual break. To be quite honest, I really think that massive debt abolition is going to occur no matter what. For me the main issue is just how this is going to happen: openly, by virtue of a top-down decision designed to protect the interests of existing institutions, or under pressure from social movements. Most of the political and economic leaders to whom I have spoken acknowledge that some sort of debt abolition is required.


Long before we used electricity to create friends (and make monsters) on Facebook, Mary Shelley jolted to life a new pal in Victor Frankenstein’s laboratory. Was her creature inspired by the terror of childbirth, or was he delivered due to a radically odd weather event? Perhaps both. From Hannah Gersen at The Millions:

“In winter of 1814, British sailors recorded seeing ‘clouds of ashes’ at the peak of Mount Tambora, a volcanic mountain in the East Indies. A few months later, in the spring of 1815, Tambora exploded with huge, jet-like flames, a column of fire known as a ‘Plinian’ eruption, after Pliny the Younger, who witnessed the eruption of Mount Vesuvius. But Tambora burned hotter than Vesuvius, and it was so powerful that it ejected rock, ash, and other materials into the stratosphere, where they remained suspended, wreaking havoc on global weather patterns for the next three years. 1816 was known as ‘The Year Without Summer’—a relatively mild title for a year that brought famine, disease, and poverty. In the United States, there was snow in June, destroying crops and bringing the country’s first economic depression. In Ireland and China, unremitting rains flooded fields; while in India, monsoon season never arrived. Bacteria flourished in these stagnant, impoverished conditions, and outbreaks of typhus and cholera can be traced back to that dreary, volcanic winter.

I learned these and many other historical details from Gillen D’arcy Wood’s Tambora: The Eruption That Changed The World. Tambora is a new book, but one I discovered haphazardly, through that great portal of haphazardness: Wikipedia. I was fact-checking an overwrought simile (re: procrastinating) and landed on the Wikipedia entry for Frankenstein, where I learned that the great fictional monster was the indirect result of “The Year Without Summer.” I’d never heard of “The Year Without Summer” and in its addictive way, Wikipedia provided a link to an article on the subject, which in turn provided a link to the 1815 Eruption of Mount Tambora, which in turn provided a link to the Pacific Ring of Fire, which in turn led to an article about plate tectonics, which in turn led to a page about super-Earths, which in turn led me to wonder about the origin of the universe and what is the meaning of life on Earth, which I believe is that state of existential confusion to which all Wikipedia rabbit holes eventually lead. I am grateful that on this particular foray, it only took six steps—and also, of course, that it led me to read Tambora, which gave me a glimpse into a startlingly dramatic period in history.

To get back to ‘The Year Without Summer’ (which at this point in July sounds like a marvelous situation) and the creation of Frankenstein, you must transport yourself to a storm-lashed villa on Switzerland’s Lake Geneva. There, sitting in front of a roaring fire, is Percy Shelley, Mary soon-to-be-Shelley Godwin, Mary’s stepsister Claire Clairmont, Lord Byron, and also, Lord Byron’s doctor (whose presence is somewhat irrelevant, but who I will include, anyway, in the spirit of Wikipedia). This privileged, literary bunch has been driven indoors by unseasonably cold weather, driving rain, and spectacular thunderstorms—all due to Mount Tambora, although of course they don’t know it. Bored and perhaps tired of reciting poetry, they decide to have a contest for who can tell the best ghost story. Mary’s late entry is a tale about a student, Victor Frankenstein, who discovers how to bring life to inanimate material. Frankenstein uses this power to create an eight-foot tall “creature” who is never given a name, but who eventually kills Frankenstein’s wife and escapes to the North Pole. It’s not a ghost story but a monster story, one inspired by Shelley’s extensive readings into science and myth.

Wood argues that Frankenstein was also inspired by the stormy, Tambora-induced weather, and that ‘the pyrotechnical lightning displays’ raging outside Shelley’s villa windows were written into the novel.”


Here’s the 1962 film of Stanley Milgram’s “Obedience to Authority” experiments at Yale, which were shocking in more than one sense: a supposed study of memory that was really just a measurement of complicitous savagery. It’s a companion, in spirit at least, to Philip Zimbardo’s “Stanford Prison Experiment,” conducted a decade later, which was also ethically dubious and yielded surprisingly sad results.


Education in and of itself is something American universities do very well, however exorbitant the price at many of them. But education is not the only goal of the education system in the U.S. (or pretty much anywhere else); it’s also about utility, about getting jobs. When very difficult economic times roll along, as they have now, with threats of massive automation in the future, the follies of the system’s cost structure come under attack. From David Bromwich at the New York Review of Books:

“Andrew Rossi’s documentary Ivory Tower prods us to think about the crisis of higher education. But is there a crisis? Expensive gambles, unforeseen losses, and investments whose soundness has yet to be decided have raised the price of a college education so high that today on average it costs eleven times as much as it did in 1978. Underlying the anxiety about the worth of a college degree is a suspicion that old methods and the old knowledge will soon be eclipsed by technology.

Indeed, as the film accurately records, our education leaders seem to believe technology is a force that—independent of human intervention—will help or hurt the standing of universities in the next generation. Perhaps, they think, it will perform the work of natural selection by weeding out the ill-adapted species of teaching and learning. A potent fear is that all but a few colleges and universities will soon be driven out of business.

It used to be supposed that a degree from a respected state or private university brought with it a job after graduation, a job with enough earning power to start a life away from one’s parents. But parents now are paying more than ever for college; and the jobs are not reliably waiting at the other end. ‘Even with a master’s,’ says an articulate young woman in the film, a graduate of Hunter College, ‘I couldn’t get a job cleaning toilets at a local hotel.’ The colleges are blamed for the absence of jobs, though for reasons that are sometimes obscure. They teach too many things, it is said, or they impart knowledge that is insufficiently useful; they ask too much of students or they ask too little. Above all, they are not wired in to the parts of the economy in which desirable jobs are to be found.”

I don’t use illegal drugs, and I don’t think you should, either. They’re bad for you. But that doesn’t mean I support any cockamamie “War on Drugs.” That’s just bad policy crashing into stark reality. I think if someone sells drugs to a minor, they should be given a prison sentence. Otherwise, the whole thing should be decriminalized, which isn’t the same as legalized across the board: relatively mild substances like marijuana should be legal, while arrests for harder drugs should be met with outpatient rehab and community-service sentences, for both dealers and buyers.

Of course, the situation is further complicated because you don’t have to do anything illegal to get a dangerous high. The number of Americans obtaining painkillers, Oxy and others, with prescriptions is staggering. I don’t doubt these folks have pain, though usually it’s more mental than physical. The pusher got pushed by Big Pharma, and attempting to cage that monster will only cause more problems, especially with the Internet opening up global sales far too large to be prosecuted with precision.

Mike Jay, who wrote this brilliant article for Aeon last year, returns to the same publication with a piece that doesn’t try to make sense of this unwinnable war but shows instead how senseless it is in the light of history and the new normal. The opening:

“When the US President Richard Nixon announced his ‘war on drugs’ in 1971, there was no need to define the enemy. He meant, as everybody knew, the type of stuff you couldn’t buy in a drugstore. Drugs were trafficked exclusively on ‘the street’, within a subculture that was immediately identifiable (and never going to vote for Nixon anyway). His declaration of war was for the benefit of the majority of voters who saw these drugs, and the people who used them, as a threat to their way of life. If any further clarification was needed, the drugs Nixon had in his sights were the kind that was illegal.

Today, such certainties seem quaint and distant. This May, the UN office on drugs and crime announced that at least 348 ‘legal highs’ are being traded on the global market, a number that dwarfs the total of illegal drugs. This loosely defined cohort of substances is no longer being passed surreptitiously among an underground network of ‘drug users’ but sold to anybody on the internet, at street markets and petrol stations. It is hardly a surprise these days when someone from any stratum of society – police chiefs, corporate executives, royalty – turns out to be a drug user. The war on drugs has conspicuously failed on its own terms: it has not reduced the prevalence of drugs in society, or the harms they cause, or the criminal economy they feed. But it has also, at a deeper level, become incoherent. What is a drug these days?

Consider, for example, the category of stimulants, into which the majority of ‘legal highs’ are bundled. In Nixon’s day there was, on the popular radar at least, only ‘speed’: amphetamine, manufactured by biker gangs for hippies and junkies. This unambiguously criminal trade still thrives, mostly in the more potent form of methamphetamine: the world knows its face from the US TV series Breaking Bad, though it is at least as prevalent these days in Prague, Bangkok or Cape Town. But there are now many stimulants whose provenance is far more ambiguous.

Pharmaceuticals such as modafinil and Adderall have become the stay-awake drugs of choice for students, shiftworkers and the jet-lagged: they can be bought without prescription via the internet, host to a vast and vigorously expanding grey zone between medical and illicit supply. Traditional stimulant plants such as khat or coca leaf remain legal and socially normalised in their places of origin, though they are banned as ‘drugs’ elsewhere. La hoja de coca no es droga! (the coca leaf is not a drug) has become the slogan behind which Andean coca-growers rally, as the UN attempts to eradicate their crops in an effort to block the global supply of cocaine. Meanwhile, caffeine has become the indispensable stimulant of modern life, freely available in concentrated forms such as double espressos and energy shots, and indeed sold legally at 100 per cent purity on the internet, with deadly consequences. ‘Legal’ and ‘illegal’ are no longer adequate terms for making sense of this hyperactive global market.”


From Peter Cheney’s Globe and Mail piece about the rapid rise of the robocar, a passage about some of the conversion’s consequences, intended or not:

“Eliminating human drivers will have far-reaching social and economic implications. Entire industries (like truck and cab driving) may be wiped out. AVs will also dramatically reduce (and possibly eliminate) crashes – as safety experts can tell you, almost all accidents are caused by human error. This will shift the landscape for industries like body repair and auto insurance.

‘There won’t be very many claims,’ says [Canadian Automated Vehicles Centre Of Excellence Director Barry] Kirk. ‘But there won’t be much revenue, either. There’s not much risk to underwrite.’

There will also be a direct impact on the medical system. Treating car crash victims is a major industry. A decline in crashes would sharply reduce the supply of human donor organs available for transplant – the largest supply comes from drivers aged 18 to 30.

Autonomous cars will have a positive impact on congestion – they can operate at optimum speed and spacing, maximizing traffic flow. They can also be used with networked control systems that optimize traffic flow by commanding cars to take optimum routes, and letting each car know what other vehicles are doing. This type of networked traffic system has already been developed for aviation – the Next Generation Air Transportation System (NexGen) is starting to be phased in across the United States.

Google has studied the impact of human drivers on road congestion by using what’s known as Agent-Based Simulation – computers model traffic on a road system, and determine how flow is affected when a percentage of drivers engage in behaviours like tailgating, speeding and rapid lane switching. As the research has shown, these drivers have a significant impact on traffic flow.”
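The agent-based approach described above is simple enough to sketch. What follows is a toy, hypothetical model in Python (Google’s actual simulations are not public), in the spirit of the classic Nagel-Schreckenberg cellular automaton: cars circle a one-lane road, a tunable fraction of “aggressive” drivers speed and brake erratically, and the mean speed of the whole system stands in for traffic flow.

```python
import random

def simulate(n_cells=200, n_cars=40, v_max=5, p_slow=0.25,
             aggressive_frac=0.0, steps=500, seed=1):
    """Toy agent-based traffic model (Nagel-Schreckenberg style).

    Cars occupy cells of a circular one-lane road. Each step, every
    car: (1) accelerates toward its top speed, (2) brakes to the gap
    behind the car ahead so it can't collide, (3) randomly slows with
    some probability (human error), then (4) moves. "Aggressive" cars
    speed (higher top speed) and brake erratically (double the random
    slowdown). Returns the mean speed over the run, a proxy for flow.
    """
    rng = random.Random(seed)
    positions = sorted(rng.sample(range(n_cells), n_cars))
    speeds = [0] * n_cars
    aggressive = [i < int(aggressive_frac * n_cars) for i in range(n_cars)]
    total = 0
    for _ in range(steps):
        new_positions = list(positions)
        for i in range(n_cars):
            top = v_max + 2 if aggressive[i] else v_max
            p = min(1.0, 2 * p_slow) if aggressive[i] else p_slow
            # Distance to the next car ahead on the ring road.
            gap = (positions[(i + 1) % n_cars] - positions[i] - 1) % n_cells
            v = min(speeds[i] + 1, top, gap)   # accelerate, then brake
            if v > 0 and rng.random() < p:     # random human slowdown
                v -= 1
            speeds[i] = v
            new_positions[i] = (positions[i] + v) % n_cells
        positions = new_positions
        total += sum(speeds)
    return total / (steps * n_cars)

calm = simulate(aggressive_frac=0.0)   # all drivers well-behaved
rowdy = simulate(aggressive_frac=0.3)  # 30% tailgate-and-speed types
```

Comparing the two runs shows, in miniature, the qualitative finding the article reports: a minority of erratic drivers drags down everyone’s average speed.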


I wonder if it’s necessity or ego telling us that AI has to think the same way we do to be on our level. Couldn’t it operate otherwise and best us the way animals on four legs outrun humans on two? Is thinking only one thing, or can it be something else again? From “Unthinking Computers Perform Clever Parlor Tricks,” Richard Waters’ expression of middling enthusiasm for deep learning in the Financial Times:

“The success of deep learning is a product of the times. The idea is decades old: that a batch of processors, fed with enough data, could be made to function like a network of artificial neurons. Grouping and sorting information in progressively more refined ways, they could ‘learn’ how to parse it in something akin to the way the human brain is believed to function.

It has taken the massive computing power concentrated in cloud data centres to train neural networks enough to make them useful. It sounds like a dream of artificial intelligence as conjured up by Google: ingest all the world’s data and apply enough processing power, and the secrets of the universe will reveal themselves to you.

Deep learning has produced some impressive results. In a project known as DeepFace, Facebook recently reported that it had reached 97.35 per cent accuracy in identifying the faces of 4,000 people in a collection of 4m images, far better than had been achieved before. Such feats of pattern recognition come naturally to humans, but they are hard for computer scientists to copy. Even trite-sounding results can point to important advances. Google’s report two years ago that it had designed a system that identified cats in YouTube videos still reverberates around the field.

Using the same techniques to ‘understand’ language or solve other problems that rely on pattern recognition could make machines far better at interpreting the world around them. By analysing what people are doing and comparing it to what they (and thousands of others) have done in similar situations in the past, they could also anticipate what they might do next.

The result could be behavioural systems that truly understand your behaviour and recommendation engines capable of suggesting things you actually want. These may sound eerie. But done properly, machines could come to anticipate our needs and act as lifetime guides.

But there is a risk of equating the output of systems such as these with the products of actual human intelligence. In reality, they are parlour tricks, albeit impressive ones. The important thing will be to know where to apply their skills – and how far to trust them.”
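The “network of artificial neurons” Waters describes can be seen in miniature in a toy example (my own illustrative sketch in Python with NumPy, obviously nothing like the scale of DeepFace): a two-layer network learns XOR, a pattern no single artificial neuron can represent, because its hidden layer “groups” the inputs into intermediate features that the output layer then “refines.”

```python
import numpy as np

def sigmoid(z):
    # Logistic activation: squashes any real number into (0, 1).
    return 1.0 / (1.0 + np.exp(-z))

# Toy training set: XOR, a pattern no single artificial neuron can learn.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

rng = np.random.default_rng(0)
W1 = rng.normal(size=(2, 8))   # input  -> hidden layer (8 "neurons")
b1 = np.zeros(8)
W2 = rng.normal(size=(8, 1))   # hidden -> output layer
b2 = np.zeros(1)

losses = []
for _ in range(20000):
    h = sigmoid(X @ W1 + b1)   # hidden layer groups the raw inputs
    out = sigmoid(h @ W2 + b2) # output layer refines the grouping
    losses.append(float(np.mean((out - y) ** 2)))
    # Backpropagation: push the error back through both layers
    # and nudge all weights downhill.
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= h.T @ d_out
    b2 -= d_out.sum(axis=0)
    W1 -= X.T @ d_h
    b1 -= d_h.sum(axis=0)
```

The point is not the arithmetic but the division of labor: each layer re-represents the data in a progressively more useful form, and stacking many such layers, plus the cloud-scale compute to train them, is what the article calls deep learning.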



In 1951, Hollywood director Edward Ludwig predicted computers would soon automatically write screenplays, and it’s difficult to see how they wouldn’t be capable of managing the flat dialogue of today’s globalized blockbusters. But machines don’t only want the starring roles–they’re also after us bit players. From Rob Enderle’s CIO essay on the so-called “robot apocalypse” and what it will mean for your job:

“It’s time for a discussion about what the future will bring. It won’t be a world of lollipops and rainbows that [Marc] Andreessen and [Larry] Page will live in. The world of the rich won’t apply to the rest of us. Interestingly, Google Chairman Eric Schmidt better anticipates the ‘jobs and robots’ problem, but his solution is investing in startups, which is where we’ll all work while the robots do our existing jobs.

Sure, robots already do some jobs: Assembly lines, self-driving cars, delivery drones and cleaning robots, both the consumer Roomba and larger, industrial vacuums. There’s a bigger threat: Workers who basically look at numbers and draw conclusions. Robots are surprisingly good at this, too. Robots could do a range of jobs – including analysis, purchasing, consulting and journalism – because they can look at more real-time information in less time and with better recommendations than people.

This is one downside to big data analytics. Once you have the information, Watson, Siri, Cortana or any other artificial intelligence-like system can do a pretty decent job of identifying the best path. In the near term, at least, people will remain in the loop, but they’ll increasingly serve as little more than quality control – and, unfortunately, won’t operate fast enough to do the job properly.

Sheehy also created a spreadsheet that ranks the jobs that robots are most and least likely to take from people. The top jobs at risk: Financial analyst, financial advisor, industrial buyer, administrator, chartered legal executive (compliance officer) and financial trader. Least at risk: Clinical embryologist, bar manager, diplomatic services officer, community arts worker, international aid worker, dancer, aid/development worker and osteopath.

What’s interesting is that jobs that focus on dealing with people are relatively safe, while jobs that focus on analyzing things aren’t. Now if the people you focus on are increasingly unemployed, I have to wonder where the money’s coming from to pay the salaries of the people-focused folks. (Given that folks who write about technology need an audience to consume things to pay our salaries, we shouldn’t be sleeping that well, even though we aren’t on the list.)”


Long-form 1986 interview with J. G. Ballard. (A little Swedish, mostly English.) 

“The only point of reality we have is inside our own heads,” Ballard said, though that feels like a very different time, our heads no longer really our own, their contents now increasingly quantified and commodified.

The writer also feared a “boring, event-less future.” No such luck.


I’ve yet to meet a single McKinsey consultant who didn’t seem to have a head full of gunpowder, but I’ll trust the firm’s think-tank wing, the MGI, which reports that in China, for all its crush of modernization and smartphone ubiquity, a surprising majority of businesses remain unplugged. From the Economist:

“AT FIRST glance it would appear that China has gone online, and gone digital, with great gusto. The spectacular rise of internet stars such as Alibaba, Tencent and JD would certainly suggest so. The country now has more smartphone users and households with internet access than any other. Its e-commerce industry, which turned over $300 billion last year, is the world’s biggest. The forthcoming stockmarket flotation of Alibaba may be the largest yet seen.

So it is perhaps surprising to hear it argued that much of Chinese business has still not plugged in to the internet and to related trends such as cloud computing and ‘big data’ analysis; and therefore that these technologies’ biggest impact on the country’s economy is still to come. That is the conclusion of a report published on July 24th by the McKinsey Global Institute (MGI), a think-tank run by the eponymous consulting firm. It finds that only one-fifth of Chinese firms are using cloud-based data storage and processing power, for example, compared with three-fifths of American ones. Chinese businesses spend only 2% of their revenues on information technology, half the global average. Even the biggest, most prestigious state enterprises, such as Sinopec and PetroChina, two oil giants, are skimping on IT. Much of the benefit that the internet can bring in such areas as marketing, managing supply chains and collaborative research is passing such firms by, the people from McKinsey conclude.”

Speaking of chess prodigies who declined young: Bobby Fischer, who was profiled in Life by Brad Darrach in 1971 prior to his Cold War showdown with champion Boris Spassky, was the subject of a piece by the same writer for sister publication People in 1974, two years after dispatching his Soviet opponent and becoming one of the most famous people on Earth. Darrach wrote of Fischer as a man who’d shaken off the world’s embrace, who had briefly found God–one of them, anyhow–and had entered into an exile from the game. What the piece couldn’t have predicted is that he would never really play another meaningful match. The opening of “The Secret Life of Bobby Fischer”:

Whatever happened to Bobby Fischer? Six weeks after winning the world chess championship on Sept. 1, 1972, he abruptly vanished without a trace into the brown haze of Greater Los Angeles. Rumors flew, but the truth was weirder than the rumors.

At the pinnacle of chess success, Bobby abandoned the game that had made him famous and took up residence in a closed California community of religious extremists. With rare exceptions, the world outside has not seen or heard of him for more than 16 months. Reporters who tried to track him down were turned back by the private police force that patrols the church property in Pasadena.

On the day he finished off his great Russian opponent, Boris Spassky, in Iceland, Bobby had realized the first of his three main ambitions. The second, he said, was ‘to make chess a major sport in the United States.’ The third was to be ‘the first chess millionaire.’ As history’s first purely intellectual superstar, Bobby was offered record deals, TV specials, book contracts, product endorsements. ‘He could make $10 million in the next two, three years,’ his lawyer said after Bobby’s victory at Reykjavik. And to promote chess, Bobby promised to put his title on the line ‘at least twice a year, maybe more.’ Millions of chess amateurs enthused at the prospect of a Fischer era of storm, stress and magnificent competition.

But it didn’t happen quite like that. After curtly declining New York Mayor Lindsay’s offer of a ticker-tape parade (“I don’t believe in hero worship”), Bobby made impulsive appearances on the Bob Hope and Johnny Carson shows—and then was swallowed up by the Worldwide Church of God, a fundamentalist sect founded in 1934 by a former adman named Herbert W. Armstrong. Well advertised on radio and television by Armstrong’s hellfire preaching—and more recently by the charm-drenched sermons of Garner Ted Armstrong, the founder’s son—the church now claims 85,000 members. They celebrate the sabbath on Saturday and observe the dietary laws of the Old Testament—no pork, no shellfish. Smoking, divorce and cheek-to-cheek dancing are forbidden. Necking is the worst kind of sin. Tithing is mandatory—the church’s annual income probably exceeds $50 million. Church leaders live palatially and gad about the world in three executive jets provided by the faithful. Recently, however, scandals and schisms have shaken the flock.

Bobby Fischer, the child of a Jewish mother and a Gentile father, first tuned in on the elder Armstrong while still in his late teens. Lonely and despairing after he muffed his chance to become world champion at 19, Bobby found strength in the church’s teachings and has adhered to them closely ever since. He turned to the church in the crisis he faced after Reykjavik. Verging on nervous exhaustion after his two-month battle with Spassky and the match organizers, Bobby decided that the last thing he wanted after his triumph was the world that lay at his feet. In the large and outwardly peaceful community that surrounds the Armstrong headquarters, he saw a safe setting where he could unsnarl his nerves and find the normal life that he had sacrificed to competition and monomania.

The church welcomed him. Though Bobby is not a full church member—he is listed as a ‘coworker’—he offered Armstrong a double tithe (20%) of his $156,250 winnings. ‘Ah, my boy, that’s just as God would have it!’ Armstrong replied, and passed the word that Bobby was to be given VIP treatment. A pleasant three-bedroom apartment in a church-owned development was made available. So were the gymnasium, squash courts and swimming pool of the church’s Ambassador College. Leaders of the Armstrong organization were told to make sure that Bobby had plenty of dinner invitations. “The word went out,” says a church member, “that Bobby should never be left alone, or allowed to feel neglected.”

To make doubly sure, the church assigned a friendly weightlifter in the phys. ed. department as Bobby’s personal recreation director. The two of them played paddle tennis almost every day, and Bobby worked out with weights to build up his arms and torso.

Not long after he arrived in Pasadena, the 31-year-old Bobby confessed to a high church official that he wanted to meet some girls. There is a rigid rule against dating between church members and nonmembers like Bobby, but the official allowed that in Bobby’s case the rule would be suspended. What sort of girls did he like? Bobby said that he liked “vivacious” girls with “big breasts.” A suitable girl was discovered and Bobby began to date her frequently, taking the weightlifter and his wife along as chaperones.•


“Pillsbury has for a long time been insane, becoming violent at times through blindfold chess playing.”

A great light of the nineteenth-century chess world who burned briefly, Harry Nelson Pillsbury was a brilliant player as well as an accomplished mnemonist capable of quickly absorbing and regurgitating seemingly endless strings of facts. Pillsbury never had the opportunity to become world champion because his mental health deteriorated, the result of syphilis, which he contracted in his twenties. An article in the April 9, 1906 Brooklyn Daily Eagle assigned his decline to more genteel origins. The text:

“Harry Nelson Pillsbury, the greatest chess player since the days of Paul Morphy, is to be taken from the Battle Creek Sanitarium, where he is at present, to a sanitarium at Atlantic City, N.J. Pillsbury has for a long time been insane, becoming violent at times through blindfold chess playing. The fact became known through a letter from William Penn Shipley, of the Pennsylvania Chess Association, to a friend at the Brooklyn Chess Club.

The game of blindfold chess requires intense concentration of the mind, and, according to the physicians who have been working on Pillsbury’s case, ultimately destroys the memory cells of the brain, if carried on to excess. A player is placed in a room by himself and plays the game, entirely from memory, while his opponent moves for him at the table.

One instance of Pillsbury’s remarkable skill was shown when he played for thirteen hours, sitting all alone in the little anteroom which leads into the main rooms of the Brooklyn Chess Club. He did not stop even to eat, and bore in mind twenty-four games during that time. Blackburn and Morphy kept no more than fifteen games in their mind at once. Physicians state that the ability to play blindfold is a gift and cannot be acquired.

While Pillsbury’s case is considered practically hopeless, every effort that can be brought to bear to bring the former champion into the knowledge of the world again will be made.”



I’ve complained in the recent past about physicists bashing philosophy, believing this technological epoch an ideal time for thinking deeply about ethical questions. I also believe that philosophers can reach truths before scientists can, even if they can’t prove their assertions. Those beliefs can be guideposts for others making scientific progress. The physicist George Ellis agrees, as he states in a Scientific American dialogue with journalist John Horgan. (Thanks to The Browser for pointing it out.) An excerpt:

John Horgan:

[Lawrence] Krauss, Stephen Hawking and Neil deGrasse Tyson have been bashing philosophy as a waste of time. Do you agree?

George Ellis:

If they really believe this they should stop indulging in low-grade philosophy in their own writings. You cannot do physics or cosmology without an assumed philosophical basis. You can choose not to think about that basis: it will still be there as an unexamined foundation of what you do. The fact you are unwilling to examine the philosophical foundations of what you do does not mean those foundations are not there; it just means they are unexamined.

Actually, philosophical speculations have led to a great deal of good science. Einstein’s musings on Mach’s principle played a key role in developing general relativity. Einstein’s debate with Bohr and the EPR paper have led to a great deal of good physics testing the foundations of quantum physics. My own examination of the Copernican principle in cosmology has led to exploration of some great observational tests of spatial homogeneity that have turned an untested philosophical assumption into a testable – and indeed tested – scientific hypothesis. That’s good science.

Tags: ,

Now you can put the Encyclopaedia Britannica on the head of a pin, and you can slide a war into your pocket. Or at least a drone. That’s what American soldiers may soon carry to conduct remote reconnaissance. Of course, it’s just a matter of time–and not much time–until the “nano air vehicles” are in your neighborhood. Just try to legislate that; attempt to manage that cheapness and smallness. From Douglas Ernst at the Washington Times:

“Future U.S. Army soldiers sent into combat may have a brand new tool at their disposal: the pocket drone.

The U.S. Army Natick Soldier Research, Development and Engineering Center in Massachusetts is developing a “pocket-sized aerial surveillance device” for soldiers assigned to small units in dangerous environments.

When the Army’s efforts come to fruition, the Cargo Pocket Intelligence, Surveillance and Reconnaissance program will provide dismounted troops with real-time surveillance of threats in their environment.

‘The Cargo Pocket ISR is a true example of an applied systems approach for developing new Soldier capabilities,’ said Dr. Laurel Allender, acting NSRDEC technical director, Army.mil reported July 21.”

__________________________

“Just about 10 cm x 2.5 cm”:

Tags: ,

The Internet is a grand experiment in the macro, and within that framework there are many smaller tests being run on us, some unethical. The question is why is there no real comeuppance for companies, Facebook and OKCupid included, which abuse the rules–abuse us. I guess the answer is twofold: 1) It’s difficult to uncouple our lives from a social network when we’ve been unpacking it there for years, and 2) There seems to be something tacit in the new-media bargain that tells us that we’re not paying with money so there will be some other type of payment. And there is. From Dan Gillmor at the Guardian:

“If you thought the internet industry was chastened by the public firestorm after Facebook revealed it had manipulated the news feeds of its own users to affect their emotions, think again: OKCupid.com, the dating site, is now bragging that it deliberately arranged matches between people whom its algorithms determined were not compatible – just to get data on how well the site was working.

In a Monday blog post entitled – I’m not making this up – ‘We Experiment On Human Beings!’ the site’s co-founder, Christian Rudder, essentially told us to face the facts of our modern world … at least as he sees them:

[G]uess what, everybody: if you use the Internet, you’re the subject of hundreds of experiments at any given time, on every site. That’s how websites work.

Human experimentation is definitely part of how websites work, in a way, because all online services of considerable size do something called A/B testing – seeing how users respond to tweaks, then adjusting accordingly. But that doesn’t mean sites can, do or should routinely and deliberately deceive their users or customers.”
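The A/B testing Gillmor mentions is, at bottom, just randomized assignment plus comparison. A minimal sketch of the idea, with invented conversion rates–nothing here reflects any real site’s numbers or code:

```python
import random

# Toy A/B test: users are split at random between two variants,
# and the site compares how each group responds. The "true" rates
# below are hypothetical, chosen only to make the demo concrete.
random.seed(0)

true_rates = {"A": 0.10, "B": 0.12}   # invented underlying effect of each variant
counts = {"A": [0, 0], "B": [0, 0]}   # per variant: [conversions, impressions]

for _ in range(10_000):
    variant = "A" if random.random() < 0.5 else "B"   # random 50/50 assignment
    counts[variant][1] += 1
    if random.random() < true_rates[variant]:         # did this user "convert"?
        counts[variant][0] += 1

for variant, (conversions, impressions) in counts.items():
    print(f"variant {variant}: {conversions}/{impressions} = {conversions / impressions:.3f}")
```

In practice a site would also run a significance test before acting on the measured difference; the sketch stops at the raw rates, which is exactly the “seeing how users respond to tweaks” step.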

Tags: ,

From the July 26, 1933 Brooklyn Daily Eagle:

“Philadelphia--A clap of thunder during a severe electrical storm here last night caused a well-dressed young man of about 26 to lose his memory. He was taken to a hospital.”


In the same science-centric 1978 issue of Penthouse that had a futuristic look at labor during a time of automation, there’s an interview by Richard Ballad with the late NASA astronomer Robert Jastrow, who possessed an interesting mix of beliefs. A staunch supporter of the Singularity, he saw computers as a new lifeform, and he was also a denier of human-made climate change. A few exchanges follow. 

________________________

Robert Jastrow:

I say that computers, as we call them, are a newly emerging form of life, one made out of silicon rather than carbon. Silicon is chemically similar to carbon, but it can enter into a sort of metal structure in which it is relatively invulnerable to damage, is essentially immortal, and can be extended to an arbitrarily large brain size. Such new forms of life will have neither human emotions nor any of the other trappings we associate with human life.

Penthouse:

You use the term life to describe what we usually think of as lifeless creatures. One might call them “computers with delusions of grandeur.” How can you say they are a form of life?

Robert Jastrow:

They are new forms of life. They react to stimuli, they think, they reason, they learn by experience. They don’t, however, procreate by sexual union or die — unless we want them to die. We take care of their reproduction for them. We also take care of their food needs, which are electrical. They are evolving at a dynamite speed. They have increased in capabilities by a power of ten every seven years since the dawn of the computer age, in 1950. Man, on the other hand, has not changed for a long time. By the end of the twentieth century, the curves of human and computer growth will intersect, and by that time, I am confident, quasi-human intelligences will be with us. They will be similar in mentality to a freshly minted Ph.D.: very strong, very narrow, with no human wisdom, but very powerful in brute reasoning strength. They will be working in combination with our managers, who will be providing the human intuition. Silicon entities will be controlling and regulating the complex affairs of our twenty-first-century society. The probability is that this will happen virtually within our own lifetime. What happens in the thirtieth century, or the fortieth? There are 6 billion years left before the sun dies, and over that long period I doubt whether biological intelligence will continue to be the seat of intelligence for the highest forms of life on this planet. Nor do I think that those advanced beings on other planets, who are older than we are, if they exist, are housed in shells of bone on a fish model of carbon chemistry. Silicon, I think, is the answer.

________________________

Penthouse:

Are concepts like ethics, morals, and spirituality irrelevant to these silicon beings?

Robert Jastrow:

No, not at all. I think that such intelligent beings may be capable of aesthetic perceptions far beyond our imagining. The only thing is that their aesthetics will not be human. These beings will not have a large baggage of emotions. You see, our musical and verbal perceptions were bred into us for their survival value back in the time when we were evolving out of the savannah apes. Forms of life in the African savannah environment developed selected traits for survival that have nothing to do with the needs of today or with those of a billion years in the future.

Penthouse:

What would be the purpose of the existence of these computer beings?

Robert Jastrow:

That’s a big question. I don’t know what the “kicks” of these computer beings will be. No human emotions will be involved, but there may be other emotions. I think that they may find their pleasure in aesthetic perceptions akin to our delight in music and art and design, and also find it in the larger search for order and harmony in the universe. Their understanding of the harmony of the cosmos and the nature of physical reality may transcend ours. Their curiosity to discover may be what drives them.

________________________

Penthouse:

Will humans as we know them die out like the dodo?

Robert Jastrow:

It may be that a symbiotic union will exist between humans and new forces of life, between biological and nonbiological intelligence — and it may now exist on other planets. We might continue to serve the needs of the silicon brain while it serves ours.

Penthouse:

Do you think that the computer beings will triumph in the end?

Robert Jastrow:

Yes. Not “triumph” in the sense of a war but triumph in the same sense that the mammals triumphed over the dinosaurs. It will be the next stage of perfection.•


Tags: ,

Guglielmo Marconi may or may not have been the very first to create the wireless, as he’s often credited, but he was certainly a passionate supporter of Fascist madman Benito Mussolini, and that wasn’t the inventor’s only strange idea. The text of the announcement of Marconi’s death from the July 20, 1937 Brooklyn Daily Eagle:

Rome–The Marquis Guglielmo Marconi, who invented wireless when he was only 21, died suddenly at 3:45 a.m. today (10:45 p.m. Monday, E.D.T.) at the ancient palace in downtown Rome where he lived and worked.

The 63-year-old conqueror of the ether died of heart paralysis. His widow, the Countess Cristina Bezzi-Scali, was at his bedside. She had been called back from the seaside resort of Viareggio when he began to feel ill yesterday.

Their daughter, Elettra Elena, whose godmother is Queen Elena, remained at the resort and will not return to Rome until time for the state funeral. Today is her eighth birthday.

Duce Pays Respects

Premier Mussolini, whose ardent supporter Marconi had been, was notified of the death immediately. He dispatched a telegram of condolences and later went to Marconi’s home in the Via Condotti and paid his respects beside the body.•


Tags: , , ,

An excerpt from “How to Forward a New Global Age,” economist Carlota Perez’s Financial Times piece, which argues that embarking on a green revolution would allow us to do well by doing good:

Focus on intangible growth

Green growth is not just about climate change. It is about shifting production and consumption patterns towards intangible goods, materials and energy saving, multiplying the productivity of resources and creating new markets for special materials, renewable energy, really durable products for business models based on rental rather than possession, a huge increase in personal (quality of life) services and so on.

It implies a redefinition of the aspirational ‘good life’ towards the health of the individual and the environment, imitating the educated elites (as has happened historically).

And full global development, why? Because that’s what would create growing demand for equipment, infrastructure and engineering, all redesigned in a green and sustainable direction. Accelerating the already existing shifts in those directions, would require a major set of policy innovations, including a radical reform of the tax system to change relative profitability.

For instance, instead of salaries, profits and VAT, we might need to tax materials, energy and transactions. Does that sound like a major change? Yes, it needs to be!

These are times for as much institutional imagination and bold leadership as were displayed to shape the previous revolution. Putting patches on the old policies won’t do the job! As for finance, the opportunities for profitable innovation would then be innumerable. New models would be needed to fund the green transformation, plus the knowledge-intensive enterprises, the new social economy practices, the investment needs of global development and so on.

Tags:

There are those with unique flair who innovate. Yes, if one person didn’t invent the light bulb–and one didn’t–another would have. But I don’t think too many Americans decry rewarding someone who’s truly clever, even if that person had help–and they almost always have help. But the myth of the solitary genius has been bastardized in our economy, where CEOs are paid exorbitant sums for often doing a poor job, rewarded for the throne rather than their rule, compensated lavishly while they have the floor and even more when they’re shown the door. The idea has proved hurtful. The opening of “The End of Genius?” by Jonathan Low:

“For an economy so committed to collaboration, cooperation and partnership, we demonstrate a persistent fascination with the myth of the lone genius.

Particularly in fields where innovation and creativity are so often successfully translated into cash, the ‘my way or the highway’ ethos prevails despite ample evidence that it takes, if not a village, then at least a couple of buddies.

Even in tech, where Steve had Woz, Larry had Sergey and Bill had, well, he really did have a village, maybe even a city, the believers cling to the revealed truth. ‘We invest in people, not in companies’ huff the venture capitalists. Not systems, not processes, not teams, not intellectual capital, but ‘people,’ however that may be defined, the implication being that the Alpha Dog controls the biological survival imperative.

But even as the strains of Frank Sinatra singing ‘I Did It My Way,’ continue to waft from entrepreneurs’ ear buds, the reality is that the world is becoming too complex for this belief to endure.”

Tags:

Privacy as we knew it is gone and the next-generation tools will decide, far more than any legislation, how far things will go. I’m not saying I’m in favor of that, private person that I am, but that’s just how it is. We’ll never be truly alone, though that doesn’t necessarily mean we won’t be lonely. The opening of “The Internet of Things – the Next Big Challenge to Our Privacy,” by Jat Singh and Julia Powles at the Guardian:

“If there’s a depressing slogan for the early era of the commercial internet, it’s this: ‘Privacy is dead – get over it.’

For most of us, the internet is complex and opaque. Some might be vaguely aware that their personal data are getting sucked, their search histories tracked, and their digital journeys scoured.

But the current nature of online services provides few mechanisms for individuals to have oversight and control of their information, particularly across tech-vendors.

An important question is whether privacy will change as we enter the era of pervasive computing. Underpinned by the Internet of Things, pervasive computing is where technology is seamlessly embedded within the real world, intrinsically tied to the physical environment.

If the web is anything to go by, the new hyperconnected world will only make things worse for privacy. Potentially much worse.”

Tags: ,


In 1978, Penthouse, a magazine that wanted to pee on you or someone, anyone, took a look at the automated future of our workforce in a good article, “Robot Lib,” by Bob Schneider. Quaint that the piece predicted Big Labor would delay factory automation by seventy years. An excerpt:

In fact several roboticists believe that the day when human blue collar workers are entirely replaced by solid-state slaves is not very far off. “With the spectrum of technology available now, it would be possible to eliminate most of the blue-collar jobs today performed by humans within the next twenty or thirty years,” [Joseph F.] Engelberger maintains. “But,” he adds, “because of the social, political, and economic factors involved, a more reasonable time is likely to be a hundred years.” These three factors can be reduced to two words: Big Labor. The unions know that robots will be replacing their people on the assembly lines as well as in the foundries–and they don’t like it. They’re already fighting a holding action: as of now a robot can only replace a worker who retires or dies.

Tom Binford believes that 30 percent of the human labor force could be replaced by intelligent sensitive automata within thirty years. And Robert Malone forecasts totally roboticized factories that will need practically no human supervision: fully autonomous robots will oversee production, and robot managers and foremen will direct blue-collar robots to best meet pre-programmed quotas. A single human could probably manage several factories at the same time.•

Tags: , ,

Vice Media was built on the back of people who should have known better going someplace dangerous and doing something stupid. But some of the site’s extreme science stuff is interesting. An excerpt from Blanca Talavera’s Q&A with an anonymous sufferer of hypnophobia, the fear of sleep:

Question:

What was it that triggered it in your case?

J.: 

It all began with a vermian injury [a brain injury that causes loss of balance and dizziness]. One night in August 2010, while having dinner and watching television, I suddenly lost consciousness for a few seconds. I fell off the couch. Immediately after I came to, alone and unaided, I went to the hospital.

The treatment I received was very bad and the doctors thought my problem was a ‘mania’ or something ‘invented.’ The psychiatrist and the doctor diagnosed me with ‘hypochondria and a psychosomatic problem.’ This was the starting point of my hypnophobia. …

Question:

You’ve said that you do everything you can in order to not to fall asleep. What do you usually do?

J.: 

When I prepare to sleep I suffer a gradual increase of anxiety. My body triggers episodes of panic and choking, to prevent me from falling asleep. It’s hard to explain; you have to feel it: My pulse quickens, I tremble, I don’t know what to do. You feel powerless. The situation, your subconscious, dominates you.

Besides that, I sometimes consciously get out of bed and go out desperately seeking help. I’ve gone to mental health centers, where instead of helping me, what they did was aggravate my condition with pills and drugs. I have thought about ending it all, but let’s say I am a strong person. I have an inner strength that keeps me from doing that.

Question:

When you do fall asleep, do you rest well?

J.: 

When I sleep, it is because I fall asleep. Still, my mind plays tricks on me, reacting as a self-defense mechanism to keep my consciousness from relaxing and disconnecting from reality to have a restful sleep. I guess the brain disconnects because it knows that if you do not sleep, you die.”

Tags:

Two drivers, with different results: Stirling Moss, who crashed and burned, and Ray Harroun, who made it to the finish line. Who learned more about life from their experience?

_______________________

Life speeds in one direction, and how can anything ever be different? Then events occur. Similar traumas in the past haven’t caused a break, but this one takes hold. The brain rewires itself. All is different now. You can never return.

Moss in his career-ending run in 1962.

_______________________

In 1961, newly crowned Indianapolis 500 champ A.J. Foyt appeared on I’ve Got a Secret with Ray Harroun, who won the inaugural Indy 500 in 1911.

Robot Olympics, sure, are numerous, but there have never been robots in the Olympics, and Japanese Prime Minister Shinzo Abe wants to change that at the 2020 Summer Games. If nothing else, it’s instructive to know that Japan, thought to be an unstoppable tech powerhouse just a few decades ago, is now desperately trying to establish itself as a premier player in robotics. In what areas will China not be able to sustain its momentum? From Eric Geller at the Daily Dot:

“Japan is set to host the 2020 Summer Olympics, and Prime Minister Shinzo Abe is looking for ways to turn it up a notch. His solution? Robots, of course.

According to Agence France-Presse, Abe expressed his interest in hosting an Olympic event specifically for robots as part of the international athletic competition in 2020.

‘I would like to gather all of the world’s robots and aim to hold an Olympics where they compete in technical skills,’ Abe said. ‘We want to make robots a major pillar of our economic growth strategy.’

Abe’s focus on robots for the Olympics came as part of a visit last Thursday to robot production facilities in the Japanese city of Saitama, where factories churn out robots that both assist humans and operate autonomously in a diverse array of workplaces, including daycare.”

Tags: ,
