Gordon Moore

Moore’s Law continues apace fifty years on, which is stunning and great and challenging. The computer chip, growing in power even as it shrinks in size, has allowed for everything from smartphones to sensors to Siri to driverless cars, things which are remaking society and economics in fundamental ways, quantifying behavior and erasing jobs. They may ultimately do more to reorder the way we live than politics ever could.

Since Gordon Moore recognized the pattern in 1965, there’s been a continuous guessing game about when the rule would run into entropy. In 2006, Moore himself said this:

“I think Moore’s Law will continue as long as Moore does anyhow! Ha ha ha… I’m periodically amazed at how we’re able to make progress. Several times along the way, I thought we reached the end of the line, things taper off, and our creative engineers come up with ways around them…Materials are made of atoms, and we’re getting suspiciously close to some of the atomic dimensions with these new structures, but I’m sure we’ll find ways to squeeze even further than we think we presently can.”•

I think futurists get ahead of themselves, however, when they apply Moore’s Law to seemingly everything; it really only applies to integrated circuits. Chemical reactions are certainly not amenable to its rules, which is why battery progress badly trails that of the computer chip. Immortality or a-mortality in any physical sense is not right around the corner because of Moore’s Law.

From Annie Sneed at Scientific American:

Of course, Moore’s law is not really a law like those describing gravity or the conservation of energy. It is a prediction that the number of transistors (a computer’s electrical switches used to represent 0s and 1s) that can fit on a silicon chip will double every two years as technology advances. This leads to incredibly fast growth in computing power without a concomitant expense and has led to laptops and pocket-size gadgets with enormous processing ability at fairly low prices. Advances under Moore’s law have also enabled smartphone verbal search technologies such as Siri—it takes enormous computing power to analyze spoken words, turn them into digital representations of sound and then interpret them to give a spoken answer in a matter of seconds.

Another way to think about Moore’s law is to apply it to a car. Intel CEO Brian Krzanich explained that if a 1971 Volkswagen Beetle had advanced at the pace of Moore’s law over the past 34 years, today “you would be able to go with that car 300,000 miles per hour. You would get two million miles per gallon of gas, and all that for the mere cost of four cents.”•
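A quick way to see what that doubling rule implies is to run the numbers yourself. Here is a minimal sketch (mine, not the article’s) that projects the two-year doubling forward from the Intel 4004, the 1971 chip often used as a baseline; its widely cited count of roughly 2,300 transistors is the only outside figure assumed:

```python
# Back-of-the-envelope check of Moore's law: transistor counts
# doubling every two years, projected forward from the Intel 4004.

def project_transistors(start_count, start_year, end_year, doubling_period=2):
    """Project a transistor count forward under a fixed doubling period."""
    doublings = (end_year - start_year) / doubling_period
    return start_count * 2 ** doublings

# The Intel 4004 (1971) had roughly 2,300 transistors.
projected = project_transistors(2_300, 1971, 2015)
print(f"Projected 2015 count: {projected:,.0f} transistors")
# ~9.6 billion -- the same order of magnitude as the billion-transistor
# chips actually shipping around 2015.
```

Twenty-two doublings turn a few thousand transistors into a few billion, which is the whole story of the excerpt above in one line of arithmetic.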

In a belated London Review of Books assessment of The Second Machine Age and Average Is Over, John Lanchester doesn’t really break new ground in considering Deep Learning and technological unemployment, but in his customarily lucid and impressive prose he crystallizes how quickly AI may remake our lives and labor in the coming decades. Two passages follow: the opening, in which he charts how the power of a supercomputer ended up inside a child’s toy in a few short years, and a sequence about the way automation obviates workers and exacerbates income inequality.

__________________________________

In 1996, in response to the 1992 Russo-American moratorium on nuclear testing, the US government started a programme called the Accelerated Strategic Computing Initiative. The suspension of testing had created a need to be able to run complex computer simulations of how old weapons were ageing, for safety reasons, and also – it’s a dangerous world out there! – to design new weapons without breaching the terms of the moratorium. To do that, ASCI needed more computing power than could be delivered by any existing machine. Its response was to commission a computer called ASCI Red, designed to be the first supercomputer to process more than one teraflop. A ‘flop’ is a floating point operation, i.e. a calculation involving numbers which include decimal points (these are computationally much more demanding than calculations involving binary ones and zeros). A teraflop is a trillion such calculations per second. Once Red was up and running at full speed, by 1997, it really was a specimen. Its power was such that it could process 1.8 teraflops. That’s 18 followed by 11 zeros. Red continued to be the most powerful supercomputer in the world until about the end of 2000.

I was playing on Red only yesterday – I wasn’t really, but I did have a go on a machine that can process 1.8 teraflops. This Red equivalent is called the PS3: it was launched by Sony in 2005 and went on sale in 2006. Red was only a little smaller than a tennis court, used as much electricity as eight hundred houses, and cost $55 million. The PS3 fits underneath a television, runs off a normal power socket, and you can buy one for under two hundred quid. Within a decade, a computer able to process 1.8 teraflops went from being something that could only be made by the world’s richest government for purposes at the furthest reaches of computational possibility, to something a teenager could reasonably expect to find under the Christmas tree.

The force at work here is a principle known as Moore’s law. This isn’t really a law at all, but rather the extrapolation of an observation made by Gordon Moore, one of the founders of the computer chip company Intel. By 1965, Moore had noticed that silicon chips had for a number of years been getting more powerful, in relation to their price, at a remarkably consistent rate. He published a paper predicting that they would go on doing so ‘for at least ten years’. That might sound mild, but it was, as Erik Brynjolfsson and Andrew McAfee point out in their fascinating book, The Second Machine Age, actually a very bold statement, since it implied that by 1975, computer chips would be five hundred times more powerful for the same price. ‘Integrated circuits,’ Moore said, would ‘lead to such wonders as home computers – or at least terminals connected to a central computer – automatic controls for automobiles and personal portable communications equipment’. Right on all three. If anything he was too cautious.•
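One way to restate Lanchester’s Red-to-PS3 story is as cost per unit of computation. The sketch below uses only the figures quoted above; note that it mixes dollars and pounds (the PS3 price is rounded to 200 from “under two hundred quid”), so read the results as order-of-magnitude comparisons only:

```python
# Cost per gigaflop for ASCI Red (1997) vs. the PS3 (2006), using only
# the figures quoted in the excerpt. Currencies are mixed (USD vs. GBP),
# so treat the results as order-of-magnitude comparisons.

PERFORMANCE_GFLOPS = 1.8e12 / 1e9  # both machines: 1.8 teraflops

asci_red_cost = 55_000_000  # USD, per the excerpt
ps3_cost = 200              # GBP, "under two hundred quid", rounded up

print(f"ASCI Red: {asci_red_cost / PERFORMANCE_GFLOPS:,.0f} per gigaflop")  # ~30,556
print(f"PS3:      {ps3_cost / PERFORMANCE_GFLOPS:.2f} per gigaflop")        # ~0.11
print(f"Price ratio: {asci_red_cost / ps3_cost:,.0f}x cheaper")             # 275,000x
```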

__________________________________

Note that in this future world, productivity will go up sharply. Productivity is the amount produced per worker per hour. It is the single most important number in determining whether a country is getting richer or poorer. GDP gets more attention, but is often misleading, since other things being equal, GDP goes up when the population goes up: you can have rising GDP and falling living standards if the population is growing. Productivity is a more accurate measure of trends in living standards – or at least, it used to be. In recent decades, however, productivity has become disconnected from pay. The typical worker’s income in the US has barely gone up since 1979, and has actually fallen since 1999, while her productivity has gone up in a nice straightish line. The amount of work done per worker has gone up, but pay hasn’t. This means that the proceeds of increased profitability are accruing to capital rather than to labour. The culprit is not clear, but Brynjolfsson and McAfee argue, persuasively, that the force to blame is increased automation.

That is a worrying trend. Imagine an economy in which the 0.1 per cent own the machines, the rest of the 1 per cent manage their operation, and the 99 per cent either do the remaining scraps of unautomatable work, or are unemployed. That is the world implied by developments in productivity and automation. It is Pikettyworld, in which capital is increasingly triumphant over labour. We get a glimpse of it in those quarterly numbers from Apple, about which my robot colleague wrote so evocatively. Apple’s quarter was the most profitable of any company in history: $74.6 billion in turnover, and $18 billion in profit. Tim Cook, the boss of Apple, said that these numbers are ‘hard to comprehend’. He’s right: it’s hard to process the fact that the company sold 34,000 iPhones every hour for three months. Bravo – though we should think about the trends implied in those figures. For the sake of argument, say that Apple’s achievement is annualised, so their whole year is as much of an improvement on the one before as that quarter was. That would give them $89.9 billion in profits. In 1960, the most profitable company in the world’s biggest economy was General Motors. In today’s money, GM made $7.6 billion that year. It also employed 600,000 people. Today’s most profitable company employs 92,600. So where 600,000 workers would once generate $7.6 billion in profit, now 92,600 generate $89.9 billion, an improvement in profitability per worker of 76.65 times. Remember, this is pure profit for the company’s owners, after all workers have been paid. Capital isn’t just winning against labour: there’s no contest. If it were a boxing match, the referee would stop the fight.•
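The profit-per-worker claim at the end is easy to verify from the numbers given (the Apple figure is Lanchester’s annualised estimate, not a reported annual result):

```python
# Check of the profit-per-worker comparison in the excerpt above.

gm_profit, gm_workers = 7.6e9, 600_000        # GM, 1960, in today's money
apple_profit, apple_workers = 89.9e9, 92_600  # Apple, annualised estimate

gm_per_worker = gm_profit / gm_workers            # ~$12,700 per worker
apple_per_worker = apple_profit / apple_workers   # ~$970,800 per worker

print(f"GM (1960): ${gm_per_worker:,.0f} profit per worker")
print(f"Apple:     ${apple_per_worker:,.0f} profit per worker")
print(f"Improvement: {apple_per_worker / gm_per_worker:.2f}x")  # 76.65x
```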

Gordon Moore didn’t publish his Law until 1965, so you can forgive U.S. space research bigwig J. Gordon Baethe for underestimating how quickly we would take ourselves from atmosphere to stratosphere when he was interviewed on Longines Chronoscope in 1954. As usual, that Nazi Wernher von Braun was the more accurate prognosticator.

"Hewlett-Packard introduced the first programmable desktop calculator."

In Paul Allen’s forthcoming memoir, Idea Man, which is excerpted in the new Vanity Fair, the Microsoft co-founder pinpoints ten months when the technology we know today first became possible:

“That year, 1968, would be a watershed in matters digital. In March, Hewlett-Packard introduced the first programmable desktop calculator. In June, Robert Dennard won a patent for a one-transistor cell of dynamic random-access memory, or DRAM, a new and cheaper method of temporary data storage. In July, Robert Noyce and Gordon Moore co-founded Intel Corporation. In December, at the legendary ‘mother of all demos’ in San Francisco, the Stanford Research Institute’s Douglas Engelbart showed off his original versions of a mouse, a word processor, e-mail, and hypertext. Of all the epochal changes in store over the next two decades, a remarkable number were seeded over those 10 months: cheap and reliable memory, a graphical user interface, a ‘killer’ application, and more.”
