Science/Tech


Even if we were treating our environment well–and we’re not–eventually weather patterns will shift and the agrarian culture we’ve enjoyed will be imperiled. So it makes sense for humans to experiment with genetically modified foods. I know it seems intuitive that natural food is good and laboratory modifications are bad, but there is plenty of poison in nature. We should demand transparency from corporations involved with our food chain, but we should proceed. We need to have an honest discussion, not disinformation. Unfortunately, we’re often getting the latter. From Bjørn Lomborg at Slate:

“French researcher Gilles-Eric Séralini attempted to fuel public opposition to genetically modified foods by showing the public how GM corn, with and without the pesticide Roundup, caused huge tumors and early death in 200 rats that had consumed it over two years.

Supplying an abundance of pictures of rats with tumors the size of ping-pong balls, Séralini certainly captured the public’s attention. France’s health, ecology, and agriculture ministers promised a prompt investigation and threatened to ban imports of Monsanto’s GM corn to the European Union. Russia actually did block imports of Monsanto corn. 

But Séralini’s research posed many problematic issues. For starters, the Sprague-Dawley strain of rats that he used is naturally prone to tumors. Studies of Sprague-Dawley rats show that 88 percent to 96 percent of those that serve as experimental controls develop tumors before they reach two years of age. But the public saw only pictures of tumorous rats that had consumed GM corn and Roundup. If the public had seen the similarly grotesque tumors that grow on untreated rats, officials most likely would not have acted so hastily.”


Two teams hope to send DNA-sequencing machines to Mars to prove the Red Planet harbors life. One of the groups is led by Craig Venter, who believes in building better bugs. From Antonio Regalado at MIT Technology Review:

“Although neither team yet has a berth on a Mars rocket, their plans reflect the belief that the simplest way to prove there is life on Mars is to send a DNA sequencing machine.

‘There will be DNA life forms there,’ Venter predicted Tuesday in New York, where he was speaking at the Wired Health Conference.

Venter said researchers working with him have already begun tests at a Mars-like site in the Mojave Desert. Their goal, he said, is to demonstrate a machine capable of autonomously isolating microbes from soil, sequencing their DNA, and then transmitting the information to a remote computer, as would be required on an unmanned Mars mission. Heather Kowalski, a spokeswoman for Venter, confirmed the existence of the project but said the prototype system was ‘not yet 100 percent robotic.'”


I’ve mentioned before that automatic, driverless cars, for all the good they can bring, will become a magnet for hackers, even terrorists. But the future may have arrived before computers have become designated drivers. From Nathan Willis at LWN:

“There was no security track at the 2012 Automotive Linux Summit, but numerous sessions and the ‘hallway track’ featured anecdotes about the ease of compromising car computers. This is no surprise: as Linux makes inroads into automotive computing, the security question takes on an urgency not found on desktops and servers. Too often, though, Linux and open source software in general are perceived as insufficiently battle-hardened for the safety-critical needs of highway speed computing — reading the comments on an automotive Linux news story it is easy to find a skeptic scoffing that he or she would not trust Linux to manage the engine, brakes, or airbags. While hackers in other embedded Linux realms may understandably feel miffed at such a slight, the bigger problem is said skeptic’s presumption that a modern Linux-free car is a secure environment — which is demonstrably untrue.

First, there is a mistaken assumption that computing is not yet a pervasive part of modern automobiles. Likewise mistaken is the assumption that safety-critical systems (such as the aforementioned brakes, airbags, and engine) are properly isolated from low-security components (like the entertainment head unit) and are not vulnerable to attack. It is also incorrectly assumed that the low-security systems themselves do not harbor risks to drivers and passengers. In reality, modern cars have shipped with multiple embedded computers for years (many of which are mandatory by government order), presenting a large attack surface with numerous risks to personal safety, theft, eavesdropping, and other exploits. But rather than exacerbating this situation, Linux and open source adoption stand to improve it.”


A Pacific Northwest man has repurposed a decommissioned Boeing airplane into a home. From Inhabitat: “What looks like a jetliner that has miraculously landed in the woods is actually one man’s dream retreat! Inspired by his passion for the aircraft as well as the need for shelter, Oregonian Bruce Campbell converted a Boeing 727-200 into a home. Campbell is not looking to be in Better Homes and Gardens – instead of turning the airplane into a full-fledged house he has adapted his daily life to live onboard an airplane.”


Pushing the human body beyond what seems normal might not be healthy but it is fascinating, whether we’re talking about professional pedestrians in the nineteenth century or today’s ultramarathoners. Harvard evolutionary scientist Daniel Lieberman has an excellent post at Edge about the origins and development of endurance in humans. An excerpt: 

“We have this notion that humans are terrible natural athletes. But we’ve been looking at the wrong kind of athleticism. What we’re really good at is not power, what we’re really phenomenal at is endurance. We’re the tortoises of the animal world, not the hares of the animal world. Humans can actually outrun most animals over very, very long distances.

The marathon, of course, is a very interesting example. A lot of people think marathons are extraordinary, and they wonder how many people can run marathons. At least a million people run a marathon every year. If you watch any major marathon, you realize that most of those folks aren’t extraordinary athletes, they’re just average moms and dads. A lot of them are charity runners who decided to raise money for some cancer cause or diabetes or something. I think that proves that really your average human being can run 26.2 miles without that much training, or much ability to be a great athlete. Of course, to run a marathon at really fast speeds is remarkable, but again, it just takes some practice and training. It’s not something that’s really extraordinary.

We’re actually remarkable endurance athletes, and that endurance athleticism is deeply woven into our bodies, literally from our heads to our toes. We have adaptations in our feet and our legs and our hips and pelvises and our heads and our brains and our respiratory systems. We even have neurobiological adaptations that give us a runner’s high, all of which help make us extraordinary endurance athletes. We’ve lost sight at just how good we are at endurance athleticism, and that’s led to a perverse idea that humans really aren’t very good athletes.

A good example is that every year they have races where they actually compare humans and horses. In Wales, this started a few years ago, I guess it started out as a typical sort of drunken pub bet, where some guy bet that a human couldn’t beat a horse in a marathon. They’ve been running a marathon in Wales for the last, I think 15-20 years. To be fair, most years, the horses beat the humans, but the humans often come very close. Whenever it’s hot, the humans actually beat the horses. They also have now ultramarathons in Arizona, where humans race horses. Again, most years, the horses beat the humans, but every once in a while, the humans do beat the horses. The point is not that humans are poor athletes, because the horses occasionally beat us, but humans can actually compete with and often beat horses at endurance races. Most people are surprised at that.”


Even many of those plastic bottles we dutifully place in recycling bins end up in the ocean, as barges accidentally drop tons and tons of the discards into the water. It’s obviously a threat to the food chain. The opening of Bettina Wassener’s New York Times story about a British company using diesel made from plastic to fuel a long plane trip:

“Sometime in the next few months, a single-engine Cessna will fly from Sydney to London. Converted to be able to carry extra amounts of fuel, the small plane will take 10 days for its journey, making 10 or so stops along the way.

What will make this journey special is not the route or the identity of the pilot — a 41-year-old British insurance industry executive who lives in Australia — but the fuel that the aircraft will be using: diesel processed from discarded plastic trash.

‘I’m not some larger-than-life character, I’m just a normal bloke,’ the pilot, Jeremy Rowsell, said by phone. ‘It’s not about me — the story is the fuel.’

The fuel in question will come from Cynar, a British company that has developed a technology that makes diesel out of so-called end-of-life plastics — material that cannot be reused and would otherwise end up in landfills.

Batches of the fuel will be prepositioned along the 17,000-kilometer, or 10,500-mile, route.

‘The idea is to fly the whole route on plastic fuel alone and to prove that this technology works,’ Mr. Rowsell said. ‘I’m a kind of carrier pigeon, carrying a message.'”


Dendrochronology, not to be confused with Dendrophilia, is the science of tree rings. The opening of Ross Andersen’s new Aeon piece on the topic of ring-related research, which compares the past century of fervent deforestation with the burning of another set of valuable leaves, the Library of Alexandria:

“No event, however momentous, leaves an everlasting imprint on the world. Take the cosmic background radiation, the faint electromagnetic afterglow of the Big Bang. It hangs, reassuringly, in every corner of our skies, the firmest evidence we have for the giant explosion that created our universe. But it won’t be there forever. In a trillion years’ time it is going to slip beyond what astronomers call the cosmic light horizon, the outer edge of the observable universe. The universe’s expansion will have stretched its wavelength so wide that it will be undetectable to any observer, anywhere. Time will have erased its own beginning.

On Earth, the past is even quicker to vanish. To study geology is to be astonished at how hastily time reorders our planet’s surface, filling its craters, smoothing its mountains and covering its continents in seawater. Life is often the fastest to disintegrate in this constant churn of water and rock. The speed of biological decomposition ensures that only the most geologically fortunate of organisms freeze into stone and become fossils. The rest dissolve into sediment, leaving the thinnest of molecular traces behind.

Part of what separates humans from nature is our striving to preserve the past, but we too have proved adept at its erasure. It was humans, after all, who set fire to the ancient Library of Alexandria, whose hundreds of thousands of scrolls contained a sizable fraction of classical learning. The loss of knowledge at Alexandria was said to be so profound that it set Western civilisation back 1,000 years. Indeed, some have described the library’s burning as an event horizon, a boundary in time across which information cannot flow.

The burning of books and libraries has perhaps fallen out of fashion, but if you look closely, you will find its spirit survives in another distinctly human activity, one as old as civilisation itself: the destruction of forests. Trees and forests are repositories of time; to destroy them is to destroy an irreplaceable record of the Earth’s past. Over this past century of unprecedented deforestation, a tiny cadre of scientists has roamed the world’s remaining woodlands, searching for trees with long memories, trees that promise science a new window into antiquity. To find a tree’s memories, you have to look past its leaves and even its bark; you have to go deep into its trunk, where the chronicles of its long life lie, secreted away like a library’s lost scrolls. This spring, I journeyed to the high, dry mountains of California to visit an ancient forest, a place as dense with history as Alexandria. A place where the heat of a dangerous fire is starting to rise.”


From Robert L. Blum’s post at Kurzweil AI about extreme long-term planning for humanity–like a billion years or so–which he feels would be better accomplished by intelligent machines than people:

“Long-term, humanity (whether augmented, re-engineered, or uploaded) will be left in the dust by the machines, who will stand in relation to us as we to bacteria. OK, that has a heavy-metal Skynet ring to it, so let me replace it immediately by a term I’ve come to love (from David Grinspoon’s book Lonely Planets): the Immortals.

Who are the Immortals? Perhaps we know who we want them to be: wise, superintelligent, compassionate, and just. And powerful! More powerful than a light-speed rocket, able to leap into intergalactic space in a single bound, and imbued with truth, justice, and the Western democratic way!

Whatever we choose to call them, further evolution of themselves and their tools will be in their hands and not ours. While future advances will greatly benefit humans, humans will be replaced as the helmsmen of a space-faring civilization before the Singularity — probably by 2040 (Philip K. Dick nailed this prediction in Blade Runner).

The evolving prototypes that will eventually leap to the stars will be electronic — informed by human design and concerns, but not constrained by them. Their decisions and wisdom will encompass all that is on the Web and all that is perceived by the world’s sensors. With a solar system full of effectors they will accomplish engineering that we cannot imagine. That is how they will begin their evolution and their journey to the stars.

So let’s leave the really long term planning (post-Singularity) to the Immortals.

Sometime before the Cambrian era 500 million years ago, the first differentiated, multicellular creatures arose. As the reproductive unit changed from a single cell to a multicellular organism, the individual cells had surrendered their autonomy for a greater chance of survival.

I think about the coming superorganism as something that will (at least initially) encompass human beings and confer upon them greater survival and quality of life.”


Michael Pollan, who tells us to “eat food, not too much, mostly plants,” has a new Ask Me Anything on Reddit. Some excerpts follow.

________________________________

Question:

If it turns out that a diet based almost entirely on animal fat and protein is best for us and at the same time that such a diet is the worst for the planet, what should we do?

Michael Pollan:

We can’t be eating more meat — the planet can’t take it. Plant-based diets are the key, meat should be treated more as a flavor principle –as it is in traditional Asian diets– and not the center of the plate. To exonerate saturated fat is not to say we should all eat 16-ounce steaks! Plants are still better for you, and lots of red meat is correlated with higher rates of cancer.

________________________________

Question:

How do you feel about Mayor Bloomberg’s soda ban?

Michael Pollan:

On Mayor Bloomberg’s ban– it’s not on soda, but only big cups. And I think it’s an experiment we need to try. The proof is in the pudding. But the soda companies are trying to stop him, as they are trying to stop soda tax proposals nationwide. Can we just give one of these plans a real-world test? Reducing soda consumption will do more for the public health than just about anything else we can do in the food area– let’s test to see if that’s true!

________________________________

Question:

I am not sure how to word my question but I would like to know exactly how you came to write about food? This is a very selfish question because I myself have a yearning to do the same as you. I began thinking I should major in Food Science but after taking an English class and writing many papers on GMOs I’m not sure if that was the right choice. So I was wondering if you had any advice for a hopeful food writer, or specifically what majors I should look into. I was thinking about possibly double majoring in English and Anthropology.

Michael Pollan:

I was an English major, so obviously had no plan to write about food– I might have taken biology if I did. But I followed my curiosity, and a passion for gardening –and growing food– turned into a series of essays on what gardens have to teach us and this eventually brought me around to looking at agriculture. You can’t predict these things. But English and Anthro is a great prep for just about anything.


The opening of “From Cooling System to Thinking Machine,” Carl Zimmer’s excellent Being Human essay about historical attempts to understand brain function, from anatomical experiments to thought experiments:

“Hilary Putnam is not a household name. The Harvard philosopher’s work on the nature of reality, meaning, and language may be required reading in graduate school, but Putnam’s fame hasn’t extended far beyond the academy. But one of Putnam’s thought experiments is familiar to millions of people: what would it be like to be a brain in a vat?

Here’s how Putnam presented the idea in his 1981 book, Reason, Truth, and History:

Imagine that a human being…has been subjected to an operation by an evil scientist. The person’s brain…has been removed from the body and placed in a vat of nutrients which keeps the brain alive. The nerve endings have been connected to a super-scientific computer which causes the person whose brain it is to have the illusion that everything is perfectly normal. There seem to be people, objects, the sky, etc.; but really, all the person…is experiencing is the result of electronic impulses travelling from the computer to the nerve endings.

Philosophers have wondered for thousands of years how we can be sure whether what we’re experiencing is reality or some shadowy deception. Plato imagined people looking at shadows cast by a fire in a cave. Descartes imagined a satanic genius. Starting in the 1960s, philosophers began to muse about what it would be like to be a brain in a vat, with reality supplied by a computer. The story circulated in obscure philosophy journals for over a decade before Putnam laid it out in his book.” (Thanks Browser.)


The Chiba Institute of Technology has created a wheelchair that climbs steps.

An Israeli inventor has successfully designed and built a cardboard bicycle which figures to retail for about $20. For poor people in desperate need of transportation it could improve lives–even save them. From Reuters:

“Izhar Gafni, 50, is an expert in designing automated mass-production lines. He is an amateur cycling enthusiast who for years toyed with an idea of making a bicycle from cardboard.

He told Reuters during a recent demonstration that after much trial and error, his latest prototype has now proven itself and mass production will begin in a few months.

‘I was always fascinated by applying unconventional technologies to materials and I did this on several occasions. But this was the culmination of a few things that came together. I worked for four years to cancel out the corrugated cardboard’s weak structural points,’ Gafni said.

‘Making a cardboard box is easy and it can be very strong and durable, but to make a bicycle was extremely difficult and I had to find the right way to fold the cardboard in several different directions. It took a year and a half, with lots of testing and failure until I got it right,’ he said.”


A classic Al Jarnow animated short, “Cubits,” made in 1978, uses a geometrical shape to make a paper representation of a computer. It plays in a loop in my head. From Donna Shepper in Film Quarterly Library: “‘Cubits’ fuses the use of plastic visual concerns and logic with animation. Every aspect of the animation is measured and systematic. The filmmaker has set up a closed visual system based upon a total visual analysis of the single, simple geometric form, the cube. The music, composed especially for ‘Cubits’ by Brenda Murphy, lends a feeling of humor.”


I posted a brief Jeremy Bernstein New Yorker piece about Stanley Kubrick that was penned in 1965 during the protracted production of 2001: A Space Odyssey. The following year the same writer turned out a much longer profile for the same magazine about the director and his sci-fi masterpiece. Among many other interesting facts, it mentions that MIT AI legend Marvin Minsky, who’s appeared on this blog many times, was a technical consultant for the film. An excerpt from “How About a Little Game?” (subscription required):

By the time the film appears, early next year, Kubrick estimates that he and [Arthur C.] Clarke will have put in an average of four hours a day, six days a week, on the writing of the script. (This works out to about twenty-four hundred hours of writing for two hours and forty minutes of film.) Even during the actual shooting of the film, Kubrick spends every free moment reworking the scenario. He has an extra office set up in a blue trailer that was once Deborah Kerr’s dressing room, and when shooting is going on, he has it wheeled onto the set, to give him a certain amount of privacy for writing. He frequently gets ideas for dialogue from his actors, and when he likes an idea he puts it in. (Peter Sellers, he says, contributed some wonderful bits of humor for Dr. Strangelove.)

In addition to writing and directing, Kubrick supervises every aspect of his films, from selecting costumes to choosing incidental music. In making 2001, he is, in a sense, trying to second-guess the future. Scientists planning long-range space projects can ignore such questions as what sort of hats rocket-ship hostesses will wear when space travel becomes common (in 2001 the hats have padding in them to cushion any collisions with the ceiling that weightlessness might cause), and what sort of voices computers will have if, as many experts feel is certain, they learn to talk and to respond to voice commands (there is a talking computer in 2001 that arranges for the astronauts’ meals, gives them medical treatments, and even plays chess with them during a long space mission to Jupiter–‘Maybe it ought to sound like Jackie Mason,’ Kubrick once said), and what kind of time will be kept aboard a spaceship (Kubrick chose Eastern Standard, for the convenience of communicating with Washington). In the sort of planning that NASA does, such matters can be dealt with as they come up, but in a movie everything is visible and explicit, and questions like this must be answered in detail. To help him find the answers, Kubrick has assembled around him a group of thirty-five artists and designers, more than twenty-five special effects people, and a staff of scientific advisers. By the time this picture is done, Kubrick figures that he will have consulted with people from a generous sampling of the leading aeronautical companies in the United States and Europe, not to mention innumerable scientific and industrial firms. One consultant, for instance, was Professor Marvin Minsky, of M.I.T., who is a leading authority on artificial intelligence and the construction of automata. (He is now building a robot at M.I.T. that can catch a ball.) Kubrick wanted to learn from him whether any of the things he was planning to have his computers do were likely to be realized by the year 2001; he was pleased to find out that they were.•


The wild card in the future of technology is quantum computing, which would allow us to place heretofore unimaginable power in every shirt pocket. From Adam Frank in the New York Times:

“Classical computers use ‘bits’ of information that can be either 0 or 1. But quantum-information technologies let scientists consider ‘qubits,’ quantum bits of information that are both 0 and 1 at the same time. Logic circuits, made of qubits directly harnessing the weirdness of superpositions, allow a quantum computer to calculate vastly faster than anything existing today. A quantum machine using no more than 300 qubits would be a million, trillion, trillion, trillion times faster than the most modern supercomputer.

Going even further is the seemingly science-fiction possibility of ‘quantum teleportation.’ Based on experiments going on today with simple quantum systems, it is at least a theoretical possibility that one day objects could be reconstituted — beamed — across a space without ever crossing the distance.

When a revolution in science yields powerful new technologies, its effect on human culture is multiplied exponentially. Think of the relation between thermodynamics, steam engines and the onset of the industrial era. Quantum information could well be the thermodynamics of the next technological revolution.”
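Frank’s “300 qubits” figure is easier to appreciate with a little arithmetic: describing the full state of an n-qubit register classically takes 2^n amplitudes, so at 300 qubits the bookkeeping alone outstrips the commonly cited estimate of roughly 10^80 atoms in the observable universe. A minimal back-of-the-envelope sketch (the atom count is a rough outside estimate, not a figure from Frank’s piece):

```python
# Back-of-the-envelope sketch: why ~300 qubits is such a striking number.
# A classical description of n qubits needs 2**n complex amplitudes.

n_qubits = 300
amplitudes = 2 ** n_qubits            # size of the classical description
atoms_in_universe = 10 ** 80          # rough, commonly cited estimate (assumption)

print(f"2^{n_qubits} is about {float(amplitudes):.1e}")
print("Exceeds the estimated atom count of the observable universe:",
      amplitudes > atoms_in_universe)
```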


More people are opting to live alone, choosing to not have families. It’s mostly because we suck, and being in close proximity to other people who are similar to us reminds us of this fact. But there are other reasons. The opening of a piece about this significant societal shift by Joel Kotkin at New Geography:

“For most of human history, the family — defined by parents, children and extended kin — has stood as the central unit of society. In Europe, Asia, Africa and, later, the Americas and Oceania, people lived, and frequently worked, as family units.

Today, in the high-income world and even in some developing countries, we are witnessing a shift to a new social model. Increasingly, family no longer serves as the central organizing feature of society. An unprecedented number of individuals — approaching upwards of 30% in some Asian countries — are choosing to eschew child bearing altogether and, often, marriage as well.

The post-familial phenomenon has been most evident in the high income world, notably in Europe, North America and, most particularly, wealthier parts of East Asia. Yet it has bloomed as well in many key emerging countries, including Brazil, Iran and a host of other Islamic countries.

The reasons for this shift are complex, and vary significantly in different countries and cultures.” (Thanks Browser.)


Before 2001: A Space Odyssey became a screen legend in 1968, Stanley Kubrick and Arthur C. Clarke struggled forever to complete the project that was originally entitled Journey Beyond the Stars. From a 1965 “Talk of the Town” piece by Jeremy Bernstein in the New Yorker (subscription required) about the work-in-progress three years before its release:

Our briefing session took place in the living room of Mr. Kubrick’s apartment. When we got there, Mr. Kubrick was talking on a telephone in the next room, Mr. Clarke had not yet arrived, and three lively Kubrick daughters–the eldest is eleven–were running in and out with several young friends. We settled ourselves in a large chair, and a few minutes later the doorbell rang. One of the little girls went to the door and asked, “Who is it?” A pleasantly English-accented voice answered, through the door, “It’s Clarke,” and the girls began jumping up and down and saying, “It’s Clark Kent!”–a reference to another well-known science-fiction personality. They opened the door, and in walked Mr. Clarke, a cheerful-looking man in his forties. He was carrying several manila envelopes, which, it turned out, contained parts of Journey Beyond the Stars. Mr. Kubrick then came into the room carrying a thick pile of diagrams and charts, and looking like the popular conception of a nuclear physicist who has been interrupted in the middle of some difficult calculations. Mr. Kubrick and Mr. Clarke sat down side by side on a sofa, and we asked them about their joint venture.

Mr. Clarke said that one of the basic problems they’ve had to deal with is how to describe what they are trying to do. “Science-fiction films have always meant monsters and sex, so we have tried to find another term for our film,” said Mr. C.

“About the best we’ve been able to come up with is a space Odyssey–comparable in some ways to Homer’s Odyssey,” said Mr. K. “It occurred to us that for the Greeks the vast stretches of the sea must have had the same sort of mystery and remoteness that space has for our generation, and that the far-flung islands Homer’s wonderful characters visited were no less remote to them than the planets our spacemen will soon be landing on are to us. Journey also shares with the Odyssey a concern for wandering, and adventure.”•


As Argo is released, documentarian Judd Ehrlich has taken to Kickstarter to raise money to finish a stranger and truer film on the topic. It was begun long before the Hollywood version.



From a 1979 People Q&A with Wilbur Thain, who was the final doctor to treat Howard Hughes, a singular American character who lived in fear of the outside world but was betrayed from within:

People:

Was Hughes an impossible patient?

Dr. Wilbur Thain:

That’s a masterpiece of understatement. He wanted doctors around, but he didn’t want to see them unless he had to. He would allow no X-rays—I never saw an X-ray of Hughes until after he died—no blood tests, no physical exams. He understood his situation and chose to live the way he lived. Rather than listen to a doctor, he would fall asleep or say he couldn’t hear.

People:

Is that why you didn’t accept his job offer after you got out of medical school?

Dr. Wilbur Thain:

No, I just wanted to practice medicine on my own. I understand that Hughes was quite upset. I didn’t see him again for 21 years. He was 67 then. He had grown a beard, his hair was longer. He had some hearing loss partially due to his work around aircraft. That’s why he liked to use the telephone: It had an amplifier. He was very alert and well-informed. His toenails and fingernails were pretty long, but he had a case of onychomycosis—a fungus disease of the nails which makes them thick and very sensitive. It hurt like hell to trim them. For whatever reason, he only sponge-bathed his body and hair.

People:

What was the turning point?

Dr. Wilbur Thain:

After his successful hip surgery in August of 1973 he chose never to walk again. Once—only once—he walked from the bedroom to the bathroom with help. That was the beginning of the end for him. I told him we’d even get him a cute little physical therapist. He said, “No, Wilbur, I’m too old for that.”

People:

Why did he decide not to walk?

Dr. Wilbur Thain:

I never had the chance to pry off the top of his head to see what motivated decisions like this. He would never get his teeth fixed, either. Worst damn mouth I ever saw. When they operated on his hip, the surgeons were afraid his teeth were so loose that one would fall into his lung and kill him!

People:

What kinds of things did he talk about toward the end of his life?

Dr. Wilbur Thain:

The last year we would talk about the Hughes Institute medical projects and his earlier life. All the reporting on Hughes portrayed him as a robot. This man had real feelings. He talked one day about his parents, whom he loved very much, and his movies and his girls. He said he finally gave up stashing women around Hollywood because he got tired of having to talk to them. In our last conversation, he told me how much he still loved his ex-wife Jean Peters. But he was also always talking about things 10 years down the road. He was an optimist in that sense. If it hadn’t been for the kidney failure, Hughes might have lasted a lot longer.

People:

Do you have any regrets?

Dr. Wilbur Thain:

Sure, sure. I wish I could have treated him the way I wanted: Fix his teeth—that would have been Number One. It would have helped his diet. I wish I could have treated him just like any patient in a county hospital who comes in with a broken hip, bad teeth and rundown health. At the end Hughes was shrunken, wasted—he was 6’1″ and weighed 93 pounds. When his kidneys failed in Acapulco, a major medical center like Houston was the only hope. But knowing Hughes, he would have refused to be placed on dialysis. He always said, “I don’t want to be kept alive by machines.” Howard Hughes was still imposing that tremendous will of his—right up to the last.•


I know that some health clubs have rigged their treadmills so that exercisers produce energy to help run the gyms, and experiments show that smart sidewalks can turn pedestrians into a power source. So, it only makes sense that automobiles could be used to create, not just consume, energy. From the Futurist:

“A scheme envisioned at the Technology University of Delft would use fuel cells of parked electric vehicles to convert biogas or hydrogen into more electricity. And the owners would be paid for the energy their vehicles produce. 

Another project at the university is the Energy Wall, a motorway whose walls generate energy for roadside lighting and serve as a support for a people mover on top.”

Connected personal computers mostly disrupted pure-information businesses like music and travel agencies. But 3D printers will strain even endeavors that require a physical component. From Peter Frase at Jacobin:

“Like the computer, the 3-D printer is a tool that can rapidly dis-intermediate a production process. Computers allowed people to turn a downloaded digital file into music or movies playing in their home, without the intermediary steps of manufacturing CDs or DVDs and distributing them to record stores. Likewise, a 3-D printer could allow you to turn a digital blueprint (such as a CAD file) into an object, without the intermediate step of manufacturing the object in a factory and shipping it to a store or warehouse. While 3-D printers aren’t going to suddenly make all of large-scale industrial capitalism obsolete, they will surely have some very disruptive effects.

The people who were affected by the previous stage of the file-sharing explosion were cultural producers (like musicians) who create new works, and the middlemen (like record companies) who made money selling physical copies of those works. These two groups have interests that are aligned at first, but are ultimately quite different. Creators find their traditional sources of income undermined, and thus face the choice of allying with the middlemen to shore up the existing regime, or else attempting to forge alternative ways of paying the people who create culture and information. But while the creators remain necessary, a lot of the middlemen are being made functionally obsolete. Their only hope is to maintain artificial monopolies through the draconian enforcement of intellectual property, and to win public support by presenting themselves as the defenders of deserving artists and creators.” (Thanks Browser.)


From a new Maria Popova piece at Slate about Vannevar Bush’s 1945 Atlantic essay, “As We May Think,” a passage about the compression of information:

“Marveling at the rapid rate of technological progress, which has made possible the increasingly cheap production of increasingly reliable machines, Bush makes an enormously important—and timely—point about the difference between merely compressing information to store it efficiently and actually making use of it in the way of gleaning knowledge. (This, bear in mind, despite the fact that 90 percent of data in the world today was created in the last two years.)

Assume a linear ratio of 100 for future use. Consider film of the same thickness as paper, although thinner film will certainly be usable. Even under these conditions there would be a total factor of 10,000 between the bulk of the ordinary record on books, and its microfilm replica. The Encyclopoedia Britannica could be reduced to the volume of a matchbox. A library of a million volumes could be compressed into one end of a desk. If the human race has produced since the invention of movable type a total record, in the form of magazines, newspapers, books, tracts, advertising blurbs, correspondence, having a volume corresponding to a billion books, the whole affair, assembled and compressed, could be lugged off in a moving van. Mere compression, of course, is not enough; one needs not only to make and store a record but also be able to consult it, and this aspect of the matter comes later. Even the modern great library is not generally consulted; it is nibbled at by a few.”
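Bush’s “total factor of 10,000” follows directly from his assumptions: if the film is no thinner than the paper it replaces, the saving comes from area, so a 100-to-1 linear reduction squares into a 10,000-to-1 reduction in bulk. A quick sketch of that arithmetic (the ratio is Bush’s own; the rest is just illustration):

```python
# Bush's microfilm arithmetic from "As We May Think": shrink each page by a
# linear factor of 100 at the same material thickness, and the bulk of the
# record falls by the square of that factor.

linear_ratio = 100                  # Bush's assumed linear reduction
bulk_factor = linear_ratio ** 2     # area scales with the square of length
print(bulk_factor)                  # 10000 -- his "total factor of 10,000"
```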


Ikea furniture assembled by bots created by your friends at Motoman.

In her New Atlantis article about Ray Bradbury, Lauren Weiner provides a tidy description of the Space Age sage’s youthful education:

“Bradbury spent his childhood goosing his imagination with the outlandish. Whenever mundane Waukegan was visited by the strange or the offbeat, young Ray was on hand. The vaudevillian magician Harry Blackstone came through the industrial port on Lake Michigan’s shore in the late 1920s. Seeing Blackstone’s show over and over again marked Bradbury deeply, as did going to carnivals and circuses, and watching Hollywood’s earliest horror offerings like Dracula and The Phantom of the Opera. He read heavily in Charles Dickens, George Bernard Shaw, Edgar Allan Poe, H. G. Wells, Arthur Conan Doyle, L. Frank Baum, and Edgar Rice Burroughs; the latter’s inspirational and romantic children’s adventure tales earned him Bradbury’s hyperbolic designation as ‘probably the most influential writer in the entire history of the world.’

Then there was the contagious enthusiasm of Bradbury’s bohemian, artistic aunt and his grandfather, Samuel, who ran a boardinghouse in Waukegan and instilled in Bradbury a kind of wonder at modern life. He recounted: ‘When I was two years old I sat on his knee and he had me tickle a crystal with a feathery needle and I heard music from thousands of miles away. I was right then and there introduced to the birth of radio.’

His family’s temporary stay in Arizona in the mid-1920s and permanent relocation to Los Angeles in the 1930s brought Bradbury to the desert places that he would later reimagine as Mars. As a high-schooler he buzzed around movie and radio stars asking for autographs, briefly considered becoming an actor, and wrote and edited science fiction ‘fanzines’ just as tales of robots and rocket ships were gaining in popularity in wartime America. He befriended the staffs of bicoastal pulp magazines like Weird Tales, Thrilling Wonder Stories, Dime Mystery, and Captain Future by bombarding them with submissions, and, when those were rejected, with letters to the editor. This precocity was typical. Science fiction and ‘fantasy’ — a catchall term for tales of the supernatural that have few or no fancy machines in them — drew adolescent talent like no other sector of American publishing. Isaac Asimov was in his late teens when he began writing for genre publications; Ursula K. Le Guin claimed to have sent in stories from the age of eleven.”

••••••••••

Harry Blackstone, Sr. with his classic bit, “The Bunny Trick”:


Some 1934 footage of Alpha the Robot, armed and dangerous, in action.
