Science/Tech


Drugs have always been for polite people, too, though the packaging is often nicer. Prescriptions written on clean, white sheets of paper dispense pain killers with alarming regularity now, but it’s always been one high or another. From “White-Collar Pill Party,” Bruce Jackson’s 1966 Atlantic article:

Think for a moment: how many people do you know who cannot stop stuffing themselves without an amphetamine and who cannot go to sleep without a barbiturate (over nine billion of those produced last year) or make it through a workday without a sequence of tranquilizers? And what about those six million alcoholics, who daily ingest quantities of what is, by sheer force of numbers, the most addicting drug in America?

The publicity goes to the junkies, lately to the college kids, but these account for only a small portion of the American drug problem. Far more worrisome are the millions of people who have become dependent on commercial drugs. The junkie knows he is hooked; the housewife on amphetamine and the businessman on meprobamate hardly ever realize what has gone wrong.

Sometimes the pill-takers meet other pill-takers, and an odd thing happens: instead of using the drug to cope with the world, they begin to use their time to take drugs. Taking drugs becomes something to do. When this stage is reached, the drug-taking pattern broadens: the user takes a wider variety of drugs with increasing frequency. For want of a better term, one might call it the white collar drug scene.

I first learned about it during a party in Chicago last winter, and the best way to introduce you will be to tell you something about that evening, the people I met, what I think was happening.

There were about a dozen people in the room, and over the noise from the record player scraps of conversation came through:

“Now the Desbutal, if you take it with this stuff, has a peculiar effect, contraindication, at least it did for me. You let me know if you … “

“I don’t have one legitimate prescription, Harry, not one! Can you imagine that?” “I’ll get you some tomorrow, dear.”

“… and this pharmacist on Fifth will sell you all the leapers [amphetamines] you can carry—just like that. Right off the street. I don’t think he’d know a prescription if it bit him.” “As long as he can read the labels, what the hell.”

“You know, a funny thing happened to me. I got this green and yellow capsule, and I looked it up in the Book, and it wasn’t anything I’d been using, and I thought, great! It’s not something I’ve built a tolerance to. And I took it. A couple of them. And you know what happened? Nothing! That’s what happened, not a goddamned thing.”

The Book—the Physicians’ Desk Reference, which lists the composition and effects of almost all commercial pharmaceuticals produced in this country—passes back and forth, and two or three people at a time look up the contents and possible values of a drug one of them has just discovered or heard about or acquired or taken. The Book is the pillhead’s Yellow Pages: you look up the effect you want (“Sympathomimetics” or “Cerebral Stimulants,” for example), and it tells you the magic columns. The pillheads swap stories of kicks and sound like professional chemists discussing recent developments; others listen, then examine the PDR to see if the drug discussed really could do that.

Eddie, the host, a painter who has received some recognition, had been awake three or four days, he was not exactly sure. He consumes between 150 and 200 milligrams of amphetamine a day, needs a large part of that to stay awake, even when he has slipped a night’s sleep in somewhere. The dose would cause most people some difficulty; the familiar diet pill, a capsule of Dexamyl or Eskatrol, which makes the new user edgy and overenergetic and slightly insomniac the first few days, contains only 10 or 15 milligrams of amphetamine. But amphetamine is one of the few central nervous system stimulants to which one can develop a tolerance, and over the months and years Ed and his friends have built up massive tolerances and dependencies. “Leapers aren’t so hard to give up,” he told me. “I mean, I sleep almost constantly when I’m off, but you get over that. But everything is so damned boring without the pills.”

I asked him if he knew many amphetamine users who have given up the pills.

“For good?”

I nodded.

“I haven’t known anybody that’s given it up for good.” He reached out and took a few pills from the candy dish in the middle of the coffee table, then washed them down with some Coke.•


Here’s a 1971 episode of the British version of This Is Your Life, which features soccer legend George Best, then in his prime: a remarkable athlete who looked like a member of Led Zeppelin, a swashbuckling playboy envied by all. But he was already dying–he’d been dying almost from the beginning. Like his mother, who is seen in this program, Best was an inveterate alcoholic who wrecked his career and himself at an early age. He had it all, except what he needed most, whatever that was. It was a miracle that he made it to 59. Was it nurture or nature?

We never know what’s inside of somebody else–sometimes even inside of ourselves. Of course, that’s not only true in the negative sense. People can unfold in beautiful and surprising ways also. And why do some people keep growing, changing, evolving? Again: Is it nature or nurture? Probably something innate that may need to be unlocked by experience.


Obamacare is far from perfect, but for tens of millions of Americans it’s the difference between life and death. I can’t believe how often that point gets lost in the discussion. As if we aren’t all unique people who mean something special to those close to us. People with health insurance don’t face death panels, but people without it potentially face them every day.

I think there are a few economists who read the blog, and it would be appreciated if you could refer me to any studies of what tens of millions of newly insured people will mean to the economy. I would think it would be a boon, but I’d like to read what non-demagogue professionals have to say.

From a new Michael Moore op-ed in the New York Times that looks at both sides of the Affordable Care Act:

“TODAY marks the beginning of health care coverage under the Affordable Care Act’s new insurance exchanges, for which two million Americans have signed up. Now that the individual mandate is officially here, let me begin with an admission: Obamacare is awful.

That is the dirty little secret many liberals have avoided saying out loud for fear of aiding the president’s enemies, at a time when the ideal of universal health care needed all the support it could get. Unfortunately, this meant that instead of blaming companies like Novartis, which charges leukemia patients $90,000 annually for the drug Gleevec, or health insurance chief executives like Stephen Hemsley of UnitedHealth Group, who made nearly $102 million in 2009, for the sky-high price of American health care, the president’s Democratic supporters bought into the myth that it was all those people going to get free colonoscopies and chemotherapy for the fun of it. …

And yet — I would be remiss if I didn’t say this — Obamacare is a godsend. My friend Donna Smith, who was forced to move into her daughter’s spare room at age 52 because health problems bankrupted her and her husband, Larry, now has cancer again. As she undergoes treatment, at least she won’t be in terror of losing coverage and becoming uninsurable. Under Obamacare, her premium has been cut in half, to $456 per month.

Let’s not take a victory lap yet, but build on what there is to get what we deserve: universal quality health care.”


In 1844, Samuel Morse tapped out his first coded sentence: “What hath God wrought!” And in the 170 years since then, the tools that have been wrought have been increasingly wonderful and terrifying. You really can’t legislate the worst effects away, but the bright side is that these tools are double-edged swords, and those who misuse them are also vulnerable to them.

On the topic of tools run amok: A passage from a Cleveland Plain Dealer article by Paul Hoynes explaining how the Indians’ signing of outfielder David Murphy, meant to be kept a secret for a while, spread accidentally at first and then virally:

“The Indians’ signing of free agent outfielder David Murphy to a two-year, $12 million deal didn’t belong in the same airspace, let alone the flight path, of Seattle’s deal with Cano. Still, it will go down as the most intriguing of the winter because the story was first reported by Murphy’s five-year-old daughter, Faith, at her Dallas-area preschool.

The deal wasn’t officially announced until Nov. 25 even though it hit Twitter on Nov. 19. The trigger to the story – a lesson on the meaning of Thanksgiving at Faith’s preschool.

‘She was in preschool and they were learning about Pilgrims and Indians,’ Murphy told reporters on the day his deal became official. ‘She spoke up that her dad was going to the Indians. Obviously, the word spreads quickly because of social media. It’s not the best situation, but it’s a good story to tell her when she gets older.’

There are no more scoops in the news business — at least not in the traditional sense. Breaking news hits the Internet in a matter of seconds. No one knows that better than a general manager of a big league baseball team, but even Chris Antonetti was taken aback by a text he received from a reporter concerning Murphy.

‘Initially, I didn’t know how it broke,’ said Antonetti, entering his fourth year as Indians general manager. ‘Then I got a text from a writer and it said, ‘There is a kindergarten teacher in Texas Tweeting that David Murphy is going to be an Indian.’ I said, OK.’

Some backtracking was needed to see how the story leaked.”


Ray Kurzweil’s prognostications always seem too optimistic and aggressive to me. It’s not that I don’t think we’ll accomplish most of what he says we will–if we don’t destroy ourselves first, that is–but I think it will take longer, in some cases much longer. The opening of his CNN piece, which predicts the short-term future of science and technological development:

“By the early 2020s, we will have the means to program our biology away from disease and aging.

Up until recently, health and medicine was basically a hit or miss affair. We would discover interventions such as drugs that had benefits, but also many side effects. Until recently, we did not have the means to actually design interventions on computers.

All of that has now changed, and will dramatically change clinical practice by the early 2020s.

We now have the information code of the genome and are making exponential gains in modeling and simulating the information processes they give rise to.

We also have new tools that allow us to actually reprogram our biology in the same way that we reprogram our computers.

RNA interference, for example, can turn genes off that promote disease and aging. New forms of gene therapy, especially in vitro models that do not trigger the immune system, have the ability to add new genes.

Stem cell therapies, including the recently developed method to create ‘induced pluripotent cells’ (IPCs) by adding four genes to your own skin cells to create the equivalent of an embryonic stem cell but without use of an embryo, are being developed to rejuvenate organs and even grow them from scratch.

There are now hundreds of drugs and processes in the pipeline using these methods to modify the course of obesity, heart disease, cancer, and other diseases and aging processes.

As one of many examples, we can now fix a broken heart — not (yet) from romance — but from a heart attack, by rejuvenating the heart with reprogrammed stem cells.

Health and medicine is now an information technology and is therefore subject to what I call the ‘law of accelerating returns,’ which is a doubling of capability (for the same cost) about each year that applies to any information technology.

As a result, technologies to reprogram the ‘software’ that underlie human biology are already a thousand times more powerful than they were when the genome project was completed in 2003, and will again be a thousand times more powerful than they are today in a decade, and a million times more powerful in two decades.”
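A back-of-the-envelope note of my own, not Kurzweil’s: those thousandfold and millionfold figures are just compound doubling. A minimal illustrative calculation, assuming one doubling per year at constant cost:

    # Illustrative only: compounding at Kurzweil's stated rate of one doubling per year.
    for years in (10, 20):
        print(f"{years} years -> {2 ** years:,}x")
    # 10 years -> 1,024x      (roughly "a thousand times more powerful")
    # 20 years -> 1,048,576x  (roughly "a million times more powerful")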


Whenever someone frets about us using computers to augment memory, I think back to Socrates agonizing over the effect of the written word on the same. I think the gain is far greater than the loss. Chris Ware, that brilliant fellow, isn’t so sure, at least when it comes to capturing special moments on smartphones. An excerpt from an essay he wrote to explain his newest New Yorker cover:

“Steve Jobs, along with whatever else we’re crediting to him, should be granted the patent on converting the universal human gesture for trying to remember something from looking above one’s head to fumbling in one’s pants pocket. I’m pretty sure I read somewhere that most pre-industrial composers could creditably reproduce an entire symphony after hearing it only once, not because they were autistic but simply because they had to. We’ve all heard Bach’s Brandenburg Concertos hundreds of times more than Bach ever did, and where our ancestors might have had only one or two images by which to remember their consumptive forebears, we have hours of footage of ours circling the luxury-cruise midnight buffet tables.

Sometimes, I’ve noticed with horror that the memories I have of things like my daughter’s birthday parties or the trips we’ve taken together are actually memories of the photographs I took, not of the events themselves, and together, the two somehow become ever more worn and overwrought, like lines gone over too many times in a drawing. The more we give over of ourselves to these devices, the less of our own minds it appears we exercise, and worse, perhaps even concomitantly, the more we coddle and covet the devices themselves. The gestures necessary to operate our new touch-sensitive generation of technology are disturbingly similar to caresses.”


I wish I had more of a feel for pop culture than I do, but most of it leaves me cold, from comic-book film adaptations to reality TV to pop music. I just don’t care. I don’t think I’m better than it–just separate from it. 

For instance: I’ve never had any interest in Star Trek, the TV series or films. I actually feel physical pain if I have to sit through it. But creator Gene Roddenberry was obviously a special guy and not only for his progressive outlook on race and gender. In a 1976 Penthouse interview conducted by Linda Merinoff, Roddenberry laid out the next 40 years of our society, from the Internet to email to swarms and crowdsourcing to the decline of the traditional postal service to online learning to the telecommunications revolution. Three excerpts follow.

_________________________

Penthouse:

What is happening to television as a piece of mechanical equipment?

Gene Roddenberry:

I think there is little doubt that we’re probably on the threshold of a whole new revolution in telecommunications. We are now experimenting with mating television sets with print-out devices, think of TV mated with a Xerox-type machine in which probably our newspapers will ultimately be delivered. It’s a much more efficient system. The minute you put the newspaper to bed electronically, you can then push a button and any house that subscribes to the service can have the thing rolled right out of the TV set. We’re also experimenting, in some cities already, with mating television with simple computers and the home will be run by a home-computing feature. You’ll do your billing on it, your banking, probably a great part of your shopping. I think it is inescapable that we mate TV with reproducing devices, that it will become our postal system of the future, almost certainly our telephone or videophone. So I see television going in either of two directions. One is that it can become that opiate we fear. Or, used properly, it can be a way for all people, everywhere, to have access to all the recorded knowledge of all humanity.

_________________________

Penthouse:

Where do you think mankind is heading?

Gene Roddenberry:

There’s a theory I have that I’ve been making notes on for a couple of years now and intend to write a book on it sometime in the future. You often hear the question, “I wonder what the next dominant species will be?” I think that completely unnoticed by practically all people is the fact that the next dominant species on earth has already arrived and has been with us for some time. And this is a species that I call socio-organism. It first began to make its appearance when men started to gather together in tribal groups, and then city-states, and more lately in nations, giant corporations, and so on. The socio-organism is a living organism that is made up of individual cells–which are human beings. In other words, the United States of America is a socio-organism. It is made up of 200 million cells, many of which become increasingly specialized just as the cells in our body do. Some furnish food, take away waste products, or serve as the nerves–the sight, the thinking, the planning. Your local PTA is also a small socio-organism. General Motors and ITT are socio-organisms. The interesting thing about this new creature is that unlike all the past life forms, one cell in a socio-organism can be a member of several of these socio-organisms. Also, they do not have to live in physical proximity with each other as in our bodies. It sounds a rather foolish sci-fi thing to say that General Motors is a living organism. But if you take a few steps back and view it from this point of view, you begin to discover that the evolution of this socio-organism almost exactly parallels everything we know about Darwinian evolution.

Briefly, Darwinian evolution is fairly generally accepted: the first life forms on earth were individual cells floating on the warm soup seas of the time. Finally, through chance and other factors, groups of these cells discovered that by being gathered together they could get their food more efficiently, protect themselves, and become dominant over the single-cell amoebas. With humans, exactly the same thing happened. More and more individual units began to get more and more specialized. As it became more complex, with more and more highly specialized units, the creature became more and more powerful, was capable of protecting itself, taking care of its individual cells. This is a process of accumulating interdependence. The frightening thing about viewing humankind now, this way, is that the socio-organisms are really becoming more dominant than the individual. In Red China they are teaching the very lessons that our bodies have, over the centuries, taught to their cells–that we can no longer exist for ourselves. We must exist for the whole. But you can see the same thing in the United States. People now live the corporate morality. If I join a corporation, my duty is to the corporation. If the corporation says lie, cheat, steal, move here, do that, I must do it because my duty is to the whole. So if indeed civilization is following the laws of Darwinian evolution, you can predict ahead a few centuries or a few dozen or hundred centuries, until a time in which the independent individual will have totally vanished and this planet will be inhabited by totally specialized cells who function as part of these giant, living things. The great battle and great decision we humans face is whether to let this continue until we become faceless, totally interdependent organisms. Whether this is good or bad I don’t know. You might, if it were possible, talk to a cell of my heart and say, “Look, cell, are you happy?” It seems to have adapted well. Maybe this is the way it’s supposed to be. Maybe there is some form of mass mind, mass consciousness, when a socio-organism reaches its final form, and we will be part of it and perfectly happy to be part of it. There may be contentments and happiness in this that we presently can’t visualize. I fear it because I can’t visualize it being better than remaining a free individual. I also fear that if I remain, and insist on remaining totally independent and free, the way things are going I am to be treated as a cancer cell by the socio-organisms around me, which will find it necessary to eradicate me because I endanger the organism.

Penthouse:

What is one’s purpose in this socio-organism? Just to survive?

Gene Roddenberry:

No. My purpose… that’s a hard question. I’ll try to answer it. My purpose is to live out whatever my function may be as a part of the whole that is God. I am a piece of Him. I believe that all intelligence is a part of the whole and it may be a great cyclical thing in which we have to go on, evolving, perfecting, until we reach the point where we are God, so that we can create ourselves so that we know we existed in the first place.

_________________________

Penthouse:

You’ve said that you felt that Star Trek was a very optimistic show. Are you still that optimistic in the 70’s about the future of mankind? 

Gene Roddenberry:

Yes, but I think that if we have an earth of the Star Trek century, it will not be an unbroken, steady rise to that kind of civilization. We’re in some very tough times. Our twentieth-century technological civilization has no guarantees that it is going to stay around for a long time. But I think man is really an incredible creature. We’ve had civilizations fall before and we build a somewhat better one on the ashes every time. And I’d never consider the society we depicted in Star Trek necessarily a direct, uninterrupted outgrowth of our present civilization, with its heavy emphasis on materialism. But my optimism is not for our society. It’s for the essential ingredient in humankind. And I think we humans will rebuild and, if necessary, we’ll lose another civilization and rebuild again on top of that until slowly, bit by bit, we’ll get there.•


It’s always been a difficult balance for newspapers–and never more than it is now–to give readers what they want and what they seemingly need. From Eugene L. Meyer’s Bethesda Magazine interview with Katherine Weymouth, the Graham family member who has stayed aboard the Washington Post as publisher at the behest of its new owner, Jeff Bezos:

Question:

What can Jeff Bezos do that the Grahams couldn’t?

Katherine Weymouth:

I personally believe there’s no magic bullet. If there were, someone would’ve found it, how to transform for the digital era. But we are in a great position. We have a credible brand, deeply engaged readers, [and we] cover Washington. And now we are owned by someone with deep pockets who cares what we do and is willing to invest for the long term.

Question:

What has changed now that the Post newspaper is owned by Jeff Bezos?

Katherine Weymouth:

People have stopped wearing ties, that’s the biggest change around here. …He hasn’t yet told us what to do, not that he would. He’s buying it for all the right reasons: It’s an important institution. He said, ‘I’m an optimist by nature and, yes, I’m optimistic about the future of the Post. If not, I wouldn’t join you.’ Can he bring something to the table? He clearly does have deep pockets. By itself, that’s not enough. He is obsessively focused on the reader’s experience.

 Question:

Have you and he discussed changes you might make under his ownership that you were unable to or didn’t make before?

Katherine Weymouth:

I do not anticipate any dramatic changes. He has made it clear that he wants to build on what we do best, with a deep focus on serving our readers…[while] experimenting with new ways of presenting our journalism digitally that will create even more compelling experiences for our readers and users.”


At Practical Ethics, Joao Fabiano has a smart consideration of some of the perils of neuro-modification of morality, which we will probably delay dealing with for as long as we can. But what if a violent serial criminal could be “adjusted” to no longer behave aberrantly? Sounds great and frightening. The opening:

It is 2025. Society has increasingly realised the importance of breaking evolution’s chains and enhancing the human condition. Large grants are awarded for building sci-fi-like laboratories to search for and create the ultimate moral enhancer. After just a few years, humanity believes it has made one of its most major breakthroughs: a pill which will rid our morality of all its faults. Without any side-effects, it vastly increases our ability to cooperate and to think rationally on moral issues, while also enhancing our empathy and our compassion for the whole of humanity. By shifting individuals’ socio-value orientation towards cooperation, this pill will allow us to build safe, efficient and peaceful societies. It will cast a pro-social paradise on earth, the moral enhancer kingdom come.

I believe we better think twice before endeavouring ourselves into this pro-social paradise on the cheap. Not because we will lose ‘the X factor,’ not because it will violate autonomy, and not because such a drug would cause us to exit our own species. Even if all those objections are refuted, even if the drug has no side-effects, even if each and every human being, by miracle, willingly takes the drug without any coercion whatsoever, even then, I contend we could still have trouble.

Surprisingly, the scenario imagined in the first paragraph is not that far-fetched. The field of cognitive moral neuroscience and the study of moral cognition have been flourishing; we have already found many neurochemical manipulations which seem to alter our social and moral preferences.”


A passage from a new Wired interview by Alex Pasternack with security expert Bruce Schneier about security vulnerabilities, both physical and virtual:

Wired:

What about attacks that affect infrastructure? Obviously the past few years have shown that industry, cities, utilities, even vehicles are vulnerable to hacking. Are those serious threats?

 Bruce Schneier:

There are threats to all embedded systems. We’ve seen groups mostly at universities hacking into medical devices, hacking into automobiles, into various security cameras, and demonstrating the vulnerabilities. There’s not a lot of fixing at this time. The industries are still largely ignoring the problem, maybe very much like the computer industry did maybe twenty years ago, when they belittled the problem, pretended it wasn’t there. But we’ll get there.

When I look at the bigger embedded systems, the power grids, various infrastructure systems in cities, there are vulnerabilities. I worry about them a little less because they’re so obscure. But I still think we need to start figuring out how to fix them, because I think there are a lot of hidden vulnerabilities in embedded systems.

 Wired:

Are there particular security concerns right now that you think the public, given its misunderstanding about security, doesn’t appreciate enough?

 Bruce Schneier:

I’m most worried about potential security vulnerabilities in the powerful institutions we’re trusting with our data, with our security. I’m worried about companies like Google and Microsoft and Facebook. I’m worried about governments, the US and other governments. I’m worried about how they are using our data, how they’re storing our data, and what happens to it. I’m less worried about the criminals. I think we’ve kinda got cyber-crime under control, it’s not zero but it never will be. I’m much more worried about the powerful abusing us than the un-powerful abusing us.”


The 1970s sensation of the King Tut exhibit obviously had its roots in ancient times, but its modern story began in 1922 when Howard Carter unearthed the unimaginable trove, wonderfully preserved. Soon after the discovery, the New York Times sent a reporter to Egypt to document the find that stunned the world. The article’s opening:

Through the courtesy of Howard Carter, the American Egyptologist, who, as director of Lord Carnarvon’s expedition, has, after thirty-three years’ search dug up the tomb of King Tutankhamen of the eighteenth dynasty, the correspondent of The New York Times was allowed today an exclusive view of the interior of the two ante-chambers of the tomb in the Valley of the Kings in Upper Egypt. The rest of the chambers of the tomb are still sealed.

Down a steep incline we entered straight to the first chamber. In the middle of the wall to the right is a doorway evidently leading to the chamber or chambers wherein doubtless are the sarcophagus and mummy of the King, and perhaps other treasures, since the antechambers are merely a hallway with a drawing room concealed behind a tantalizing sealed door, which will not be opened before the return of Lord Carnarvon from London, which will be about the middle of February.

Against this doorway are two life-size statues of the King made of bitumenized wood–not ebony, as at first reported. They are still standing on reed mats, just as they stood in the King’s palace and exactly as laid down on the Pharaonic funerary ritual. This again is evidence that this is the tomb and not the cache of Tutankhamen, as, if it were the cache the statues would be standing anywhere and anyhow, certainly not in exact accordance with the ritual.

The feet of each statue are shod with solid gold sandals of inestimable value. Each statue is crowned with a golden crown, bearing in front the royal serpent, or uraeus. As Thebes was the shrine of the cult of the serpent this is not unusual.

Incidentally, the day the tomb was opened and the party found these golden serpents in the crowns of the two statues, there was an interesting incident at Carter’s house. He brought a canary with him this year to relieve his loneliness. When the party was dining that night, there was a commotion outside on the veranda. The party rushed out and found that a serpent of similar type to that in the crowns had grabbed the canary. They killed the serpent, but the canary died, probably from fright.

The incident made an impression on the native staff, who regard it as a warning from the spirit of the departed King against further intrusion on the privacy of his tomb.

But the most notable thing about the statues is the rare beauty of the faces. They have evidently been made from plaster casts such as were made by the ancient Egyptians a thousand years before the Greeks or Romans ever thought of them. They show the King as a man of royal mien. Gazing on the beautiful, calm, kindly and strong countenance on the left-hand statue, which is undamaged, one finds it difficult to realize that such a monarch could have succumbed to the overwhelming influence among the priests as he did, to become again an adherent of the orthodox religion. The explanation is probably that he realized the futility of opposition to pressure so strong that it even forced the Queen to change her name from Ankhosenaten to Ankhosenamen.

It is certain that the King would not have agreed to his humiliation unless there was no alternative. This fact is historically most interesting as indicating that the power of the Hierarchy of Amon in the days of Tutankhamen was greater than that of Pharaoh, though these sacerdotal Princes did not seize the throne from the Pharaohs until more than 300 years later.

As works of art those statues reach a plane of excellence probably higher than has been reached in any subsequent period of the world.

On the other side of the chamber is a throne incomparably magnificent and wondrously beautiful. One must note its infallible evidence of the wholly unsuspected height reached by ancient Egyptian art. The innate refinement, pure lofty estheticism and amazing skill of the craftsman constitute a startling revelation. It shows not only that the imperial splendor of ancient Egypt was far more delicate and magnificent than was imagined or equaled in the world’s history, but also that the late greatest craftsmen of ancient Greece were mere hacks compared to the master who designed and adorned the throne.•


You could tell me anything really far-fetched about technology right now, and I couldn’t readily dismiss it, even if I thought you were probably lying. So reports about gigantic vending machines in China dispensing electric cars didn’t really make me blink. Unlike Mark Rogowsky of Forbes, however, I’m not high on the potential of this disruptive business model. The opening of his recent breathless article about Kandi Technologies:

“China is growing so fast it’s sometimes difficult to get different sources to even agree which the biggest cities are and how many people live in them. But that said, among them is a name unfamiliar to most Americans, the city of Hangzhou, located in eastern China, and home to 8.7 million as of 2010. That would make it the biggest city in the U.S. even though it’s barely a third the size of Shanghai, the world’s largest. But Hangzhou isn’t just big, it’s also home to an ambitious experiment that combines electric vehicles, giant vending machines and a Zipcar-like business model. Oh, and if it works, private car ownership as we know it is probably going to disappear in the world’s biggest cities.”

My one Libertarian streak is that I’ve always believed that consenting adults shouldn’t be limited in what they can do with their time and money and bodies. Children should be protected–I don’t see why grade schoolers are even allowed to play tackle football or eat at fast-food restaurants–but grown-ups are grown-ups and should be treated as such.

But it’s tougher for me to maintain this stance over time, simply because some behaviors have costs (financial and social) that can plague us for generations, whether we’re talking about drugs, gambling, or something else. The crack epidemic in NYC led to broken homes that sadly reverberate to this day, damaging children who weren’t even alive during the crisis. Of course, the War on Drugs does little to combat these problems and just creates a black market, so I don’t know if there’s any good answer. But whenever there’s a ballot initiative regarding casinos, which are supposedly going to boost the economy, I know it’s fool’s gold. The attendant problems of such establishments take from the economy at least as much as they give back. From Elisabetta Povoledo in the New York Times:

“PAVIA, Italy — Renowned for its universities and a celebrated Renaissance monastery, this Lombardy town about 25 miles south of Milan has in recent years earned another, more dubious, distinction: the gambling capital of Italy.

Slot machines and video lottery terminals, known as V.L.T.s, can be found all over in coffee bars and tobacco shops, gas stations, mom-and-pop shops and shopping malls, not to mention 13 dedicated gambling halls. By some counts, there is one slot machine or V.L.T. for every 104 of the city’s 68,300 residents.

Critics blame the concentration of the machines for an increase in chronic gambling — and debt, bankruptcies, depression, domestic violence and broken homes — recorded by social service workers in Pavia.

But in many ways, Pavia is merely the most extreme example of the spread of gambling throughout Italy since lawmakers significantly relaxed regulation of the gambling industry a decade ago.”


From “A.I. Has Grown Up and Left Home,” David Auerbach’s Nautilus article about how the field may be hamstrung by too much concern over how thinking “works,” a passage about the frustrations of the Cyc project:

“Unfortunately, not all facts are so clear-cut. Take the statement ‘Cats have four legs.’ Some cats have three legs, and perhaps there is some mutant cat with five legs out there. (And Cat Stevens only has two legs.) So Cyc needed a more complicated rule, like ‘Most cats have four legs, but some cats can have fewer due to injuries, and it’s not out of the realm of possibility that a cat could have more than four legs.’ Specifying both rules and their exceptions led to a snowballing programming burden.

After more than 25 years, Cyc now contains 5 million assertions. Lenat has said that 100 million would be required before Cyc would be able to reason like a human does. No significant applications of its knowledge base currently exist, but in a sign of the times, the project in recent years has begun developing a ‘Terrorist Knowledge Base.’ Lenat announced in 2003 that Cyc had ‘predicted’ the anthrax mail attacks six months before they had occurred. This feat is less impressive when you consider the other predictions Cyc had made, including the possibility that Al Qaeda might bomb the Hoover Dam using trained dolphins.”
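To make that “snowballing programming burden” concrete, here is a toy sketch of my own (it is not CycL, Cyc’s actual representation language) showing how a single commonsense default immediately drags its exceptions along with it:

    # A hypothetical toy, not Cyc's real machinery: one default fact plus the
    # exception rules needed to keep it honest.
    DEFAULT_LEGS = {"cat": 4}
    EXCEPTIONS = {
        ("cat", "injured"): lambda n: n - 1,       # the three-legged cat
        ("cat", "mutant"): lambda n: n + 1,        # the five-legged mutant
        ("cat", "is Cat Stevens"): lambda n: 2,    # the article's joke case
    }

    def leg_count(kind, qualifiers=()):
        """Start from the default, then apply every matching exception."""
        n = DEFAULT_LEGS[kind]
        for q in qualifiers:
            rule = EXCEPTIONS.get((kind, q))
            if rule:
                n = rule(n)
        return n

    print(leg_count("cat"))                # 4
    print(leg_count("cat", ["injured"]))   # 3

Every additional everyday fact needs its own pile of qualifiers like these, which is one way to read the gap between the 5 million assertions Cyc has and the 100 million Lenat says it would need.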


I know it’s been bandied about that there’s no low-hanging fruit left for the American economy, but what will the impact be job-wise if we end up having universal health coverage (or near-universal)? Will it create a large amount of employment opportunities and have residual positive effects on the economy? I have to think tens of millions of newly insured people will be a boon not only from a humanistic viewpoint but from a financial one as well. But I’m not an economist, so I can’t answer that. What I can tell you from my own experience of having worked for Internet companies is that it’s ludicrous to think that the rollout was bumpy because it was done by the public sector. The same thing happens regularly in the private sector. There are tons of IT workers in America, and most of them are mediocre at best. At any rate, it seems that many of the bugs in the ACA site have been worked out. From USA Today:

“WASHINGTON–The federal health exchange, Healthcare.gov, received 880,000 visitors Dec. 24, the last day people could enroll to receive health coverage on Jan. 1, officials say.

‘We’re going to do everything we can to ensure a smooth transition period for consumers whose coverage begins on January 1,’ Julie Bataille wrote in a blog Friday. Bataille serves as director of the office of communications for the Centers for Medicare and Medicaid Services. ‘And we’re going to continue to work to ensure every American who still wants to enroll in Marketplace coverage by the end of the open enrollment period is able to do so.’

Consumers have until March 31 to enroll on the health insurance exchanges to avoid paying a fine with their 2015 taxes for not having health insurance.

More than a million people visited the site over the weekend, and 600,000 had hit the page by mid-day Monday — the original deadline for Jan. 1 coverage.”

Yes, the good stuff you can do with drones is endless, though you could say the same about the bad stuff as well. From a new Economist report on domestic drones, a prognostication on what will be the initial roles of these robots:

“There could be 10,000 drones buzzing around America’s skies by 2017, reckons the FAA. ‘The good stuff you can do is endless,’ says Lucien Miller of Innov8tive Designs, a UAS firm in San Diego county. Estate agents and architects can use them for aerial photography. Energy firms will be able to monitor pieces of vital infrastructure, such as pipelines. Amazon recently caused a stir by saying it was looking into delivery-by-drone, releasing a video of a test run. However, the prospect of automated aircraft being allowed to carry heavy parcels along crowded city streets still seems distant.

At first drones’ main civilian uses, AUVSI predicts, will be in agriculture, followed distantly by public safety. Farmers will be able to monitor their land in detail, pinpointing outbreaks of disease and infestation, for example, or checking soil humidity. They will also be able to apply nutrients and pesticides more precisely. Besides Mr Loh’s drones for fire-and-rescue workers, other potential public-safety uses include police tracking of suspects. Ben Kimbro of Tactical Electronics, a technology firm, says they will find uses in various other ‘dull, dirty and dangerous’ public-service jobs.”

Posting the Christopher Evans interview with J.G. Ballard earlier reminded me of an excellent 1979 TV show, presented by the British computer scientist, that I watched a couple of years ago. A six-part series about how microprocessors were going to change the world, it was based on Evans’ book, The Mighty Micro (retitled The Micro Millennium in the United States). It succinctly journeys from Blaise Pascal to ATMs, aptly calling the coming epoch the “Second Industrial Revolution.” It never explicitly discusses the advent of the Internet but suggests many of its successes and perils.

There are just two things that the show seemed naive about: 1) That paper money disappearing would lead to the end of theft, and 2) That powerful technology would make war unappealing (which is a mistake that Nikola Tesla began making at the end of the 1800s).

But there’s so much that’s prescient: robots ending drudgery but causing unease about employment, online shopping, telecommuting and potential transformations in education. (It’s odd and unfortunate that this decades-old show reminds us that we still haven’t taken advantage of gaming’s capacity for revolutionizing learning.)

It’s a future, the host asserts, that no country can afford to abstain from, even with all its disruption: “Those who lag back will become steadily less competitive, just the way that those countries that missed out on the Industrial Revolution remain locked in medieval standards of living.”

All six are embedded below, but if you only have time for a couple, Parts 4 (“The Introverted Society”) and 6 (“All Our Tomorrows”) are my two favorites. In 4, there’s a stunning prototype of what we recognize today as a Kindle. Part 6 presents four scientists (I.J. Good, James Martin, Barrie Sherman, Tom Stonier) discussing the promise and problems of the future as if they had just read 2013 newspapers (online versions, of course).

Final note: Evans was battling cancer while filming this series and passed away before it was completed, so the producer Lawrence Moore and his guests handle the finale.


Let’s give monkeys prosthetic noses so that they can talk like humans, thought drunk scientists in 1905. From a story in that year’s New York Times:

“W. Reed Blair, the animal physician at the Bronx Park Zoo, and several other scientists have come to the conclusion that the only reason a monkey cannot talk like a human being is his nose. They have found that a monkey’s vocal chords and the general contour of his head are the same as a man’s, but the nose is different. They say that it is too flat to allow a monkey to articulate like a man. They propose to remedy this by a gutta-percha nose and to experiment with the artificial nose on August, the latest orangutang which has arrived at the Bronx Zoo. Later similar experiments will be tried on Duhong, another orangutang, and Soko and Polly, two chimpanzees.

Keeper Reilly, who says he has taught the monkeys to do everything but talk, has volunteered to be their language teacher. The keeper will begin to teach August his A B C as soon as the new nose arrives. Monkeys are very quick in imitating, and it is believed that with the right kind of nose they will be able to imitate the sound of the human voice. August will be taught to talk just the same as a child in school.

The scientists got the idea of a gutta percha nose from a well-known professor who has studied monkeys and the supposed monkey language for the last fifteen years in the Congo. Some years ago the professor met a man whose nose had been shot off in a battle. The man was able to talk only by forming a cone with his hands over the place where his nose had been. The professor reasoned that a monkey was in about the same condition as a man with his nose shot off, and has been working on the theory of an artificial nose since.”


A few excerpts from computer scientist and TV presenter Dr. Christopher Evans’ 1979 interview of J.G. Ballard in the UK version of Penthouse, which was much classier than its US counterpart because all the beaver shots wore bowler hats and had the quaintest accents. 

__________________________

On the transition from the Space Age to the Personal Computer Age:

J.G. Ballard:

In the summer of ’74 I remember standing out in my garden on a bright, clear night and watching a moving dot of light in the sky which I realised was Skylab. I remember thinking how fantastic it was that there were men up there, and I felt really quite moved as I watched it. Through my mind there even flashed a line from every Hollywood aviation movie of the 40s, ‘it takes guts to fly those machines.’ But I meant it. Then my neighbour came out into his garden to get something and I said, ‘Look, there’s Skylab,’ and he looked up and said, ‘Sky-what?’ And I realised that he didn’t know about it, and he wasn’t interested. No, from that moment there was no doubt in my mind that the space age was over.

Dr. Christopher Evans:

What is the explanation for this? Why are people so indifferent?

J.G. Ballard:

I think it’s because we’re at the climactic end of one huge age of technology which began with the Industrial Revolution and which lasted for about 200 years. We’re also at the beginning of a second, possibly even greater revolution, brought about by advances in computers and by the development of information-processing devices of incredible sophistication. It will be the era of artificial brains as opposed to artificial muscles, and right now we stand at the midpoint between these two huge epochs. Now it’s my belief that people, unconsciously perhaps, recognise this and also recognise that the space programme and the conflict between NASA and the Soviet space effort belonged to the first of these systems of technological exploration, and was therefore tied to the past instead of the future. Don’t misunderstand me – it was a magnificent achievement to put a man on the moon, but it was essentially nuts and bolts technology and therefore not qualitatively different from the kind of engineering that built the Queen Mary or wrapped railroads round the world in the 19th century. It was a technology that changed people’s lives in all kinds of ways, and to a most dramatic extent, but the space programme represented its fast guttering flicker.

__________________________

On the PC bringing the world into the home, from social to pornography:

Dr. Christopher Evans:

How do you see the future developing?

J.G. Ballard:

I see the future developing in just one way – towards the home. In fact I would say that if one had to categorise the future in one word, it would be that word ‘home.’ Just as the 20th century has been the age of mobility, largely through the motor car, so the next era will be one in which instead of having to seek out one’s adventures through travel, one creates them, in whatever form one chooses, in one’s home. The average individual won’t just have a tape recorder, a stereo HiFi, or a TV set. He’ll have all the resources of a modern TV studio at his fingertips, coupled with data processing devices of incredible sophistication and power. No longer will he have to accept the relatively small number of permutations of fantasy that the movie and TV companies serve up to him, but he will be able to generate whatever he pleases to suit his whim. In this way people will soon realise that they can maximise the future of their lives with new realms of social, sexual and personal relationships, all waiting to be experienced in terms of these electronic systems, and all this exploration will take place in their living rooms.

But there’s more to it than that. For the first time it will become truly possible to explore extensively and in depth the psychopathology of one’s own life without any fear of moral condemnation. Although we’ve seen a collapse of many taboos within the last decade or so, there are still aspects of existence which are not counted as being legitimate to explore or experience mainly because of their deleterious or irritating effects on other people. Now I’m not talking about criminally psychopathic acts, but what I would consider as the more traditional psychopathic deviancies. Many, perhaps most of these, need to be expressed in concrete forms, and their expression at present gets people into trouble. One can think of a million examples, but if your deviant impulses push you in the direction of molesting old ladies, or cutting girls’ pigtails off in bus queues, then, quite rightly, you find yourself in the local magistrates court if you succumb to them. And the reason for this is that you’re intruding on other people’s life space. But with the new multi-media potential of your own computerised TV studio, where limitless simulations can be played out in totally convincing style, one will be able to explore, in a wholly benign and harmless way, every type of impulse – impulses so deviant that they might have seemed, say to our parents, to be completely corrupt and degenerate.

__________________________

On media decentralization, the camera-saturated society, Reality TV, Slow TV:

Dr. Christopher Evans:

Will people really respond to these creative possibilities themselves? Won’t the creation of these scenarios always be handed over to the expert or professional?

J.G. Ballard:

I doubt it. The experts or professionals only handle these tools when they are too expensive or too complex for the average person to manage them. As soon as the technology becomes cheap and simple, ordinary people get to work with it. One’s only got to think of people’s human responses to a new device like the camera. If you go back 30 or 40 years, the Baby Brownie gave our parents a completely new window on the world. They could actually go into the garden and take a photograph of you tottering around on the lawn, take it down to the chemists, and then actually see their small child falling into the garden pool whenever and as often as they wanted to. I well remember my own parents’ excitement and satisfaction when looking at these blurry pictures, which represented only the simplest replay of the most totally commonplace. And indeed there’s an interesting point here. Far from being applied to mammoth productions in the form of personal space adventures, or one’s own participation in a death-defying race at Brands Hatch, it’s my view that the incredibly sophisticated hook-ups of TV cameras and computers which we will all have at our fingertips tomorrow will most frequently be applied to the supremely ordinary, the absolutely commonplace. I can visualise for example a world ten years from now where every activity of one’s life will be constantly recorded by multiple computer-controlled TV cameras throughout the day so that when the evening comes instead of having to watch the news as transmitted by BBC or ITV – that irrelevant mixture of information about a largely fictional external world – one will be able to sit down, relax and watch the real news. And the real news of course will be a computer-selected and computer-edited version of the day’s rushes. ‘My God, there’s Jenny having her first ice cream!’ or ‘There’s Candy coming home from school with her new friend.’ Now all that may seem madly mundane, but, as I said, it will be the real news of the day, as and how it affects every individual. Anyone in doubt about the compulsion of this kind of thing just has to think for a moment of how much is conveyed in a simple family snapshot, and of how rivetingly interesting – to oneself and family only of course – are even the simplest of holiday home movies today. Now extend your mind to the fantastic visual experience which tomorrow’s camera and editing facilities will allow. And I am not just thinking about sex, although once the colour 3-D cameras move into the bedroom the possibilities are limitless and open to anyone’s imagination. But let’s take another level, as yet more or less totally unexplored by cameras, still or movie, such as a parent’s love for one’s very young children. That wonderful intimacy that comes on every conceivable level – the warmth and rapport you have with a two-year-old infant, the close physical contact, his pleasure in fiddling with your tie, your curious satisfaction when he dribbles all over you, all these things which make up the indefinable joys of parenthood. Now imagine these being viewed and recorded by a very discriminating TV camera, programmed at the end of the day, or at the end of the year, or at the end of the decade, to make the optimum selection of images designed to give you a sense of the absolute and enduring reality of your own experience. With such technology interfaced with immensely intelligent computers I think we may genuinely be able to transcend time.
One will be able to indulge oneself in a kind of continuing imagery which, for the first time, will allow us to dominate the awful finiteness of life. Great portions of our waking state will be spent in a constant mood of self-awareness and excitement, endlessly replaying the simplest basic life experiences.•


The quote in the headline comes from a 1996 comment made by Colin Wilson, the celebrated and derided British writer who passed away earlier this month. It can’t be true, can it? In the interview, he claims that no crimes of a sexual nature were committed before Jack the Ripper, citing how during the Victorian Era, inexpensive prostitutes made sex crimes “unnecessary.” But I’m sure there was plenty of cheap sex to be had at the time of the Whitechapel slayings, and there certainly was during Ted Bundy’s life, so that couldn’t be the motivation. Wilson further claims that so-called “self-esteem killings” began in the 1960s, but I think you can fit Leopold and Loeb in the category without too much of a stretch. At any rate, Wilson was at the time promoting his book, A Plague of Murder.


Grace Hopper, who hated bugs and was wittier than Letterman, was one of the true pioneers in modern computing. A TV appearance from 1986, a couple of months after her involuntary retirement from the Navy.


From “Endless Fun,” neuroscientist Michael Graziano’s excellent Aeon article about the implications, many worrisome, of immortality through computer uploading, which sidesteps cryogenics and its frozen heads and gets to the essence beneath–the brain’s data:

Imagine a future in which your mind never dies. When your body begins to fail, a machine scans your brain in enough detail to capture its unique wiring. A computer system uses that data to simulate your brain. It won’t need to replicate every last detail. Like the phonograph, it will strip away the irrelevant physical structures, leaving only the essence of the patterns. And then there is a second you, with your memories, your emotions, your way of thinking and making decisions, translated onto computer hardware as easily as we copy a text file these days.

That second version of you could live in a simulated world and hardly know the difference. You could walk around a simulated city street, feel a cool breeze, eat at a café, talk to other simulated people, play games, watch movies, enjoy yourself. Pain and disease would be programmed out of existence. If you’re still interested in the world outside your simulated playground, you could Skype yourself into board meetings or family Christmas dinners.

This vision of a virtual-reality afterlife, sometimes called ‘uploading’, entered the popular imagination via the short story ‘The Tunnel Under the World’ (1955) by the American science-fiction writer Frederik Pohl, though it also got a big boost from the movie Tron (1982). Then The Matrix (1999) introduced the mainstream public to the idea of a simulated reality, albeit one into which real brains were jacked. More recently, these ideas have caught on outside fiction. The Russian multimillionaire Dmitry Itskov made the news by proposing to transfer his mind into a robot, thereby achieving immortality. Only a few months ago, the British physicist Stephen Hawking speculated that a computer-simulated afterlife might become technologically feasible.”


In a post at Practical Ethics, Dominic Wilkinson asks a thorny question that seems like a simple one at first blush: Should some people, who are considered exceptional, receive health care that others don’t? Of course not, we all would say. Human lives are equal in importance, and our loved ones are just as valuable as the most famous or successful among us. But Wilkinson quickly points out that Nelson Mandela, probably the most beloved among us during his life, received expensive and specialized care that would have been denied almost anyone else in South Africa. But how could we deny Mandela anything, after he sacrificed everything and ultimately led a nation 180 degrees from a civil war that could have cost countless lives? You can’t, really, though I would wager that Peter Singer disagrees with me. The opening of Wilkinson’s post:

There are approximately 150,000 human deaths each day around the world. Most of those deaths pass without much notice, yet in the last ten days one death has received enormous, perhaps unprecedented, attention. The death and funeral of Nelson Mandela have been accompanied by countless pages of newsprint and hours of radio and television coverage. Much has been made of what was, by any account, an extraordinary life. There has been less attention, though, on Mandela’s last months and days. One uncomfortable question has not been asked. Was it ethical for this exceptional individual to receive treatment that would be denied to almost everyone else?

At the age of almost 95, and physically frail, Mandela was admitted to a South African hospital intensive care unit with pneumonia. He remained there for three months before being transferred for ongoing intensive care in a converted room in his own home. Although there are limited details available from media coverage it appears that Mandela received in his last six months a very large amount of highly expensive and invasive medical treatment. It was reported that he was receiving ventilation (breathing machine support) and renal dialysis (kidney machine). This level of treatment would be unthinkable for the vast majority of South Africans, and, indeed, the overwhelming majority of the people with similar illnesses even in developed countries. Frail elderly patients with pneumonia are not usually admitted to intensive care units. They do not have the option of prolonged support with breathing machines and dialysis at home.

Ima Hogg, 1900.

Scientific studies (which I mostly don’t believe) have long shown that those with more common names fare better in life than those with unique ones. Barack Obama is a sample of one, but he’s done fairly well personally and professionally. And then there’s Ima Hogg, who was the celebrated belle of Texas as well as un-porcine. Then again, she did have family connections, so she may just be another exception. I suppose the one area in which a name can have an impact is when it allows an employer with biased hiring practices to infer the race or ethnicity of an applicant. That does have a bearing on happiness.

The opening of an interesting New Yorker blog post on the topic by Maria Konnikova:

In 1948, two professors at Harvard University published a study of thirty-three hundred men who had recently graduated, looking at whether their names had any bearing on their academic performance. The men with unusual names, the study found, were more likely to have flunked out or to have exhibited symptoms of psychological neurosis than those with more common names. The Mikes were doing just fine, but the Berriens were having trouble. A rare name, the professors surmised, had a negative psychological effect on its bearer.

Since then, researchers have continued to study the effects of names, and, in the decades after the 1948 study, these findings have been widely reproduced. Some recent research suggests that names can influence choice of profession, where we live, whom we marry, the grades we earn, the stocks we invest in, whether we’re accepted to a school or are hired for a particular job, and the quality of our work in a group setting. Our names can even determine whether we give money to disaster victims: if we share an initial with the name of a hurricane, according to one study, we are far more likely to donate to relief funds after it hits.

Much of the apparent influence of names on behavior has been attributed to what’s known as the implicit-egotism effect: we are generally drawn to the things and people that most resemble us. Because we value and identify with our own names, and initials, the logic goes, we prefer things that have something in common with them. For instance, if I’m choosing between two brands of cars, all things being equal, I’d prefer a Mazda or a Kia.

That view, however, may not withstand closer scrutiny.

In promoting his new book about Twitter, Nick Bilton sat for an excellent interview with Shaun Randol of the Los Angeles Review of Books. A passage in which the Times reporter describes the changes in journalistic portraits in the Information Age:

Shaun Randol:

Can you speak about writing a narrative using at least four competing memories?

Nick Bilton:

There were over 100 competing memories. Everyone has a different viewpoint of what happened. I interviewed not only the founders and the board members; I interviewed also the people who worked there in the early days, their spouses, their ex-spouses, ex-girlfriends, and ex-boyfriends. I found people who worked at nearby coffee shops. I spoke with anyone who I could have a conversation with.

What I found the most fascinating was that I could go back to social media and use that in my reporting. For example: There’s a moment in the book when Twitter launches at the Love Parade, a rave in San Francisco. Everyone I spoke to believed it happened in June or July, or the beginning of summer. I looked up the Love Parade online and discovered it was in September. So then I went through and searched Jack’s tweets and those of other people from that time, and I ended up finding references of them at the Love Parade in September. Their memories believed it was the beginning of the summer, but they had actually documented it was the end of the summer.

That was the moment I realized that I could use these tweets and social media as a reporting tool for this book. There was a treasure trove of stuff that existed online, whether it was tweets, Flickr photos, videos on YouTube, Facebook updates, or Foursquare updates. These things existed everywhere and allowed me to pinpoint almost with exact accuracy where people were at certain points in time. I was able to untangle all of the somewhat different memories.

Shaun Randol:

There are significant implications of leaving that digital breadcrumb.

Nick Bilton:

Yes. If you want to write a book about me and I won’t let you interview me, you could potentially say what I was doing at certain points in time just by looking at my social media feeds: Foursquare, Facebook, Twitter. For this book, I had access to thousands of emails and other documents, but there were certain events that I could find via social media. The places people had gone. Videos of boat trips they took. Writing and reporting this story was a real eye-opening experience.

Shaun Randol:

I’m reminded of Gay Talese’s famous portrait of Frank Sinatra, ‘Frank Sinatra Has a Cold,’ in which a vivid portrait of the singer was drawn without ever speaking to him.

Nick Bilton:

That’s the piece that everyone aspires to when they write a story like this. Imagine how Talese’s piece would have looked if Frank Sinatra had been on Twitter and there had been photographs of him on Instagram. As you see in my book, there are incredible details about what people were wearing, the temperature that day, and even the gusty wind. I used the internet to find these things. I could look at almanacs to find out what the weather was that day and at photos on Flickr to see what people were wearing.
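
Bilton’s trick of checking memories against timestamps is easy to picture as a minimal sketch in Python. Everything below is a hypothetical stand-in of my own, not his actual sources or tooling: a handful of made-up post records and a made-up search term, just to show how sorting timestamped posts that mention an event settles a date dispute like the Love Parade one at a glance.

from datetime import datetime

# Hypothetical stand-ins for archived tweets, Flickr captions, and the like.
posts = [
    {"author": "jack", "time": "2006-09-09T22:14:00", "text": "At the Love Parade, downtown SF"},
    {"author": "biz", "time": "2006-09-09T23:02:00", "text": "Love Parade was wild tonight"},
    {"author": "ev", "time": "2006-06-15T10:30:00", "text": "Planning the summer launch"},
]

def mentions_of(keyword, records):
    # Return (timestamp, author, text) for every record mentioning the keyword, oldest first.
    hits = [
        (datetime.fromisoformat(r["time"]), r["author"], r["text"])
        for r in records
        if keyword.lower() in r["text"].lower()
    ]
    return sorted(hits)

# Everyone remembered early summer; the timestamps say September.
for when, who, text in mentions_of("love parade", posts):
    print(f"{when:%Y-%m-%d} {who}: {text}")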
