Science/Tech


From Robert Walker’s well-considered Science 2.0 article explaining why terraforming Mars is a far more fraught operation than merely building working Biospheres, which themselves aren’t easy assignments:

“Our only attempt at making a closed Earth-like ecosystem so far on Earth, in Biosphere 2, failed. There, it was because of an interaction of a chemical reaction with the concrete in the building, which indirectly removed oxygen from the habitat. Nobody predicted this and it was only detected after the experiment was over. The idea itself doesn’t seem to be fundamentally flawed, it was just a mistake of detail.

In the future perhaps we will try a Biosphere 3 or 4, and eventually get it right. When we build self-enclosed settlements in space such as the Stanford Torus, they will surely go wrong too from time to time in the early stages. But again, you can purge poisonous gases from the atmosphere, and replenish its oxygen. In the worst case, you can evacuate the colonists from the space settlement, vent all the atmosphere, sterilize the soil, and start again.

It is a similar situation with Mars, there are many interactions that could go wrong, and we are sure to make a few mistakes to start with. The difference is, if you make a mistake when you terraform a planet, it is likely that you can’t ‘turn back the clock’ and undo your mistakes.

With Mars, we can’t try again with a Mars 2, Mars 3, Mars 4 etc. until we get it right.”


From a Wired piece by Liz Stinson about a printable paper speaker by a French product designer: 

“If you’re the tinkering type, you’ve probably deconstructed a fair number of electronics. It doesn’t take a genius to tear apart a radio, but once you get past the bulk of plastic packaging and down to the guts, you begin to realize that reading the mess of circuits, chips and components is like trying to navigate your way through a foreign country with a map from the 18th century.

But it doesn’t have to be so complicated, says Coralie Gourguechon. ‘Nowadays, we own devices that are too complicated considering the way we really use them,’ she says. Gourguechon, maker of the Craft Camera, believes that in order to understand our electronics, they need to be vastly simpler and more transparent than they currently are.”


Libertarian billionaire Peter Thiel is an interesting guy, though I don’t agree with most of what he says. I’d love, for instance, to see him apply some of his know-how to coming up with solutions for poverty. Like a lot of people in Silicon Valley, he seems to exist on an island where such messy problems don’t register.

From a new Financial Times profile of Thiel by Richard Waters, in which the subject rails against government regulation, some of which might have come in handy on Wall Street during the aughts:

“He sounds equally uncomfortable discussing himself. The ‘ums’ multiply as he tries to explain why he threw in law and banking and came to Silicon Valley to pursue something far more world-changing. ‘There was this decision to move back to California and try something new and different,’ he says as though it were something that happened to someone else.

He is similarly vague when talking about the origins of his personal philosophy. ‘I’ve always been very interested in ideas and trying to figure things out.’ His undergraduate degree, from Stanford University, was in philosophy but his stance against the dominant political philosophy on many issues seems more visceral than intellectual. ‘I think that one of the most contrarian things one can do in our society is try to think for oneself,’ he says.

He only really regains his stride when talking about how technological ambition has gone from the world, leaving what he calls an ‘age of diminished expectations that has slowly seeped into the culture.’ Predictably, given his libertarian bent, much of this is traced back to regulation.

This is his explanation for why the computer industry (which inhabits ‘the world of bits’) has thrived while so many others (‘the world of atoms’) have not. ‘The world of bits has not been regulated and that’s where we’ve seen a lot of progress in the past 40 years, and the world of atoms has been regulated, and that’s why it’s been hard to get progress in areas like biotechnology and aviation and all sorts of material science areas.'”


Karaoke, the smallpox of singing, the pained sounds people make when success is no longer an option, was not a naturally occurring, viral phenomenon, but an invention, originally called a Juke 8, that was marketed more than 40 years ago. Via Alexis C. Madrigal’s Atlantic article, Daisuke Inoue, the enabler of drunken-salarymen warbling, recalls his devilry:

“Inoue recounted his adventures in 2005 to Topic Magazine, which allowed the Atlantic-favorite history site The Appendix to reprint his first person account of creating a modern sensation.

One day, the president of a small company came to the club where I was playing to ask a favor. He was meeting business clients in another town and knew they would all end up at a drinking establishment and that he would be called on to sing. ‘Daisuke, your keyboard playing is the only music that I can sing to! You know how my voice is and what it needs to sound good.’

So at his request I taped a number of his favorite songs onto an open-reel tape recorder in the keys that would best suit his voice. A few days later he came back full of smiles and asked if I could record some more songs. At that moment the idea for the Juke 8 dawned on me: You would put money into a machine with a microphone, speaker and amplifier, and it would play the music people wanted to sing.

As I had attended a Denko (or Electric Industry) High School, you’d think I could have built the machine myself. But I was always scared of electricity and so graduated without much of an ability to put things together. A member of my band introduced me to a friend of his who had an electronics shop. I took my idea to him, and he understood exactly what I’d envisioned. With my instruction, he built eleven Juke 8s. Each machine consisted of an amplifier, a microphone, a coin box and an eight-track car stereo. Putting the machines together took about two months and cost around $425 per unit.

That was in 1969, but the machines did not actually hit the market until 1971. At first, people weren’t all that interested, but once they figured out how they worked, they started to take off with an Atari-like speed.”


Google isn’t striving for driverless cars just so its employees can get to work more easily; it wants to sell the software to every automaker. The same likely goes for the robots it plans to build. From Illah Nourbakhsh’s New Yorker blog post about the company’s recent robotics-buying binge:

“While some analysts initially suggested that Google’s goal was to more thoroughly automate factories—highly controlled environments that are well-suited for a fleet of semi-independent robots—it’s now clear that the company’s team of engineers and scientists has a vision of truly dexterous, autonomous robots that can walk on sidewalks, carry packages, and push strollers. (A disclosure: Carnegie Mellon’s Community Robotics, Education, and Technology Empowerment Lab, which I direct, has partnered with Google on a number of mapping projects that utilize our GigaPan technology.) Before its acquisition of Boston Dynamics, Google ingested seven start-ups in just six months—companies that have created some of the best-engineered arms, hands, motion systems, and vision processors in the robotics industry. The companies Meka and Schaft, two of Google’s recent acquisitions, designed robot torsos that can interact with humans at work and at home. Another, Redwood Robotics, created a lightweight, strong arm that uses a nimble-control system. Boston Dynamics, Google’s eighth acquisition, is one of the most accomplished robotics companies on the planet, having pioneered machines that can run, jump, and lift, often better than humans. Its latest projects have focussed on developing full-body robot androids that can wear hazmat clothing and manipulate any tool designed for humans, like front loaders or jackhammers. The potential impact of Google’s robot arsenal, already hinted at by its self-driving car effort, is stunning: Google could deploy human-scale robots throughout society. And, while Amazon is busy optimizing delivery logistics, Google bots could roboticize every Amazon competitor, from Target to Safeway.

If robots pervade society, how will our daily experiences change?”


Al Goldstein was a horrible man, so it’s a shame he was right. The Screw publisher, who just passed away at 77, had a dream–a wet one–and lived long enough to see it become reality. Goldstein envisioned a world in which porn was ubiquitous and acceptable, and now it’s available on every screen in our homes and shirt pockets. He was the McLuhan of smut, not envisioning a Global Village so much as a universal circle jerk. He won.

One of my favorite video clips of all time: Smartmouth Stanley Siegel interviews Goldstein and comedian Jerry Lewis in 1976. When not busy composing the world’s finest beaver shots, Goldstein apparently had a newsletter about tech tools. He shows off a $3900 calculator watch and a $2200 portable phone. Lewis, easily the biggest tool on the stage, flaunts his wealth the way only a truly insecure man can.

The opening of a post at the Lefsetz Letter which offers a perfectly reasonable takedown of those who see Beyoncé’s record sales last week as anything but an extreme outlier, just a brief flash when an old paradigm still worked, a singular moment of calm before the sharks again turn the water red:

“It’s a stunt. No different from Radiohead’s In Rainbows. Unrepeatable by mere mortals, never mind wannabes and also-rans.

That’s how desperate Apple is. It lets Beyonce circumvent its rules and release a ‘video album,’ so the record industry can have its bundle and the Cupertino company can delude itself into believing that it’s got a solution to Spotify, when the Swedish streaming company is chasing YouTube, not iTunes.

And the media is so impressed by numbers that it trumpets the story, believing its role is to amplify rather than analyze.

Yes, it was a story. The same way a bomb or SpaceX or anything new gets people’s attention. Only in this case, there was something to buy. Whoo-hoo! We got lemmings and fans to lay down their credit cards to spend money for the work of a superstar, as if this is a new paradigm.

And we’ve got Rob Stringer and the rest of the inane music business slapping its back, declaring victory.

What a bunch of hogwash.

The story of 2013 is cacophony.”


Walter Isaacson, who’s writing a book about Silicon Valley creators, knows firsthand that sometimes such people take credit that may not be coming to them. So he’s done a wise thing and put a draft of part of his book online, so that crowdsourcing can do its magic. As he puts it: “I am sketching a draft of my next book on the innovators of the digital age. Here’s a rough draft of a section that sets the scene in Silicon Valley in the 1970s. I would appreciate notes, comments, corrections.” The opening paragraphs of his draft at Medium:

“The idea of a personal computer, one that ordinary individuals could own and operate and keep in their homes, was envisioned in 1945 by Vannevar Bush. After building his Differential Analyzer at MIT and helping to create the military-industrial-academic triangle, he wrote an essay for the July 1945 issue of the Atlantic titled ‘As We May Think.’ In it he conjured up the possibility of a personal machine, which he dubbed a memex, that would not only do mathematical tasks but also store and retrieve a person’s words, pictures and other information. ‘Consider a future device for individual use, which is a sort of mechanized private file and library,’ he wrote. ‘A memex is a device in which an individual stores all his books, records, and communications, and which is mechanized so that it may be consulted with exceeding speed and flexibility. It is an enlarged intimate supplement to his memory.’

Bush imagined that the device would have a ‘direct entry’ mechanism so you could put information and all your records into its memory. He even predicted hypertext links, file sharing, and collaborative knowledge accumulation. ‘Wholly new forms of encyclopedias will appear, ready made with a mesh of associative trails running through them, ready to be dropped into the memex and there amplified,’ he wrote, anticipating Wikipedia by a half century.

As it turned out, computers did not evolve the way that Bush envisioned, at least not initially. Instead of becoming personal tools and memory banks for individuals to use, they became hulking industrial and military colossi that researchers could time share but the average person could not touch. In the early 1970s, companies such as DEC began to make minicomputers, the size of a small refrigerator, but they dismissed the idea that there would be a market for even smaller ones that could be owned and operated by ordinary folks. ‘I can’t see any reason that anyone would want a computer of his own,’ DEC president Ken Olsen declared at a May 1974 meeting where his operations committee was debating whether to create a smaller version of its PDP-8 for personal consumers. As a result, the personal computer revolution, when it erupted in the mid-1970s, was led by scruffy entrepreneurs who started companies in strip malls and garages with names like Altair and Apple.

Once again, innovation was spurred by the right combination of technological advances, new ideas, and social desires. The development of the microprocessor, which made it technologically possible to invent a personal computer, occurred at a time of rich cultural ferment in Silicon Valley in the late 1960s, one that created a cauldron suitable for homebrewed machines. There was the engineering culture that arose during World War II with the growth of defense contractors, such as Westinghouse and Lockheed, followed by electronics companies such as Fairchild and its fairchildren. There was the startup culture, exemplified by Intel and Atari, where creativity was encouraged and stultifying bureaucracies disdained. Stanford and its industrial park had lured west a great silicon rush of pioneers, many of them hackers and hobbyists who, with their hands-on imperative, had a craving for computers that they could touch and play with. In addition there was a subculture populated by wireheads, phreakers, and cyberpunks, who got their kicks hacking into the Bell System’s phone lines or the timeshared computers of big corporations.

Added to this mix were two countercultural strands: the hippies, born out of the Bay Area’s beat generation, and the antiwar activists, born out of the Free Speech Movement at Berkeley. The antiauthoritarian and power-to-the-people mindset of the late 1960s youth culture, along with its celebration of rebellion and free expression, helped lay the ground for the next wave of computing. As John Markoff wrote in What the Dormouse Said, ‘Personal computers that were designed for and belonged to single individuals would emerge initially in concert with a counterculture that rejected authority and believed the human spirit would triumph over corporate technology.'”


Apollo astronauts knew they’d always have a job in government or aviation or academia or corporate America if they made it back to Earth alive from their missions, but the actual job didn’t pay very well, even by the standards of the 1960s. From Norman Mailer’s Of a Fire on the Moon: “Of course, most of the astronauts worked for only thirteen thousand dollars a year in base pay. Not much for an honored profession. There are, of course, increments and insurance policies and collective benefits from the Life Magazine contract, but few earn more than twenty thousand dollars a year.”


While Muhammad Ali was suffering through his Vietnam Era walkabout, he “boxed” retired great Rocky Marciano in a fictional contest that was decided by a computer. Dubbed the “Super Fight,” it took place in 1970. Marciano dropped a lot of weight and donned a hairpiece to provide viewers with some semblance of his younger self. The fighters acted out the computer prognostications and the filmed result was released in theaters. Marciano awkwardly stumbled onto a great description of this Singularity moment: “I’m glad you’ve got a computer being the man that makes the decision.”

I frequently post videos from Boston Dynamics, the best and scariest robotics company on the planet. I’ve been surprised that Google or Amazon, with such deep pockets, didn’t acquire it, instantly becoming a leader in a sector that could help it with order processing and things far beyond that. But recently Google took the plunge and is now the company’s owner. What does it want from its newest division? From Samuel Gibbs at the Guardian:

“Boston Dynamics is not the only robotics company Google has bought in recent years. Put under the leadership of Andy Rubin, previously Google’s head of Android, the search company has quietly acquired seven different technology companies to foster a self-described ‘moonshot’ robotics vision.

The acquired companies included Schaft, a small Japanese humanoid robotics company; Meka and Redwood Robotics, San Francisco-based creators of humanoid robots and robot arms; Bot & Dolly, who created the robotic camera systems recently used in the movie Gravity; Autofuss, an advertising and design company; Holomni, a high-tech wheel designer; and Industrial Perception, a startup developing computer vision systems for manufacturing and delivery processes.

Sources told the New York Times that Google’s initial plans are not consumer-focused, instead aimed at manufacturing and industry automation, although products are expected within the next three to five years.”

___________________________________

From Boston Dynamics.

Petman:

Petman’s best friend:


The opening of a Quartz article by Christopher Mims detailing what needs to be established before the Internet of Things can take off:

“As Quartz has already reported, the Internet of Things is already here, and in the not too distant future it will replace the web. Many enabling technologies have arrived which will make the internet of things ubiquitous, and thanks to smartphones, the public is finally ready to accept that it will become impossible to escape from the internet’s all-seeing eye.

But a critical piece of the internet of things puzzle remains to be solved. What engineers lack is a universal glue to bind all of the ‘things’ in the internet of things to each other and to the cloud.

To understand how important these standards will be, it helps to know a bit about the history of the web. When the internet was born, it was a mishmash of now mostly-forgotten protocols designed to accomplish different tasks—gopher for retrieving documents, FTP for sending and receiving files, and no standard for social networking other than email. Then the web came along and unified those protocols, and made them accessible to non-geeks. All of this magic was possible because the internet is built on open standards: transparent, agreed-upon ways that devices should communicate with one another and share data.”


Google certainly aspires to be the Bell Labs of our age, but is it doing that level of work? Two contrasting opinions: David Litwak (who is pro) and Zak Kukoff (who is con).

From Litwak:

“Bell Labs was the research division of AT&T and Western Electric Research Laboratories, originally formed in 1925 to work on telephone exchange switches. However, over the next 50 years or so, their research won 7 Nobel Prizes, for things very loosely connected to telephone switches, if at all. Among their inventions are the transistor, the laser, UNIX, radio astronomy and the C and C++ programming languages.

Under various ownership structures and names, Bell Labs spit out truly groundbreaking inventions for 50+ years. They still enjoy a measure of success, but by most opinions their best days are behind them, and many of their ~20 locations have been shuttered.

Google is the only tech company who has devoted significant resources to not just figuring out what the next big thing is, but figuring out what the big thing will be 15 years from now, much like Bell Labs used to.”

From Kukoff:

“I won’t argue with much of the article, because I think David makes some compelling points. Google is doing some compelling and interesting work, especially at Google X. But one big point missed by David (and many who agree with him) is that Bell Labs operated in no small part for the public good, producing IP like UNIX and C that entered the public domain. In fact, despite being a part of a state sanctioned monopoly, Bell Labs produced a staggering amount of freely-available knowledge that moved entire industries forward.”


The opening of “A Model World,” Jon Turney’s Aeon article about computer models, which he reminds are not all created equal:

“Here’s a simple recipe for doing science. Find a plausible theory for how some bits of the world behave, make predictions, test them experimentally. If the results fit the predictions, then the theory might describe what’s really going on. If not, you need to think again. Scientific work is vastly diverse and full of fascinating complexities. Still, the recipe captures crucial features of how most of it has been done for the past few hundred years.

Now, however, there is a new ingredient. Computer simulation, only a few decades old, is transforming scientific projects as mind-bending as plotting the evolution of the cosmos, and as mundane as predicting traffic snarl-ups. What should we make of this scientific nouvelle cuisine? While it is related to experiment, all the action is in silico — not in the world, or even the lab. It might involve theory, transformed into equations, then computer code. Or it might just incorporate some rough approximations, which are good enough to get by with. Made digestible, the results affect us all.

As computer modelling has become essential to more and more areas of science, it has also become at least a partial guide to headline-grabbing policy issues, from flood control and the conserving of fish stocks, to climate change and — heaven help us — the economy. But do politicians and officials understand the limits of what these models can do? Are they all as good, or as bad, as each other? If not, how can we tell which is which?”


I don’t agree with Malcolm Gladwell’s logic in diminishing the importance of satire, but I’m on board with him in this Grantland exchange with Bill Simmons about the hypocrisies in the discussion of performance-enhancing drugs:

Malcolm Gladwell:

As you know, I’ve had mixed feelings for years about doping. It’s not that I’m in favor of it. It’s just that I’ve never found the standard arguments against doping to be particularly compelling. So professional cyclists take EPO because they can rebuild their red blood cell count, in order to step up their training. I’m against ‘cheating’ when it permits people to take shortcuts. But remind me why I would be against something someone takes because they want to train harder?

Bill Simmons:

Or why blood doping is any different from ‘loading your body with tons of Toradol’ or ‘getting an especially strong cortisone shot’? I don’t know.

Malcolm Gladwell:

Exactly! Or take the so-called ‘treatment/enhancement’ distinction. The idea here is that there is a big difference between the drug that ‘treats’ some kind of illness or medical disorder and one, on the other hand, that ‘enhances’ some preexisting trait. There is a huge amount of literature on treatment/enhancement among scholars, and with good reason. Your health insurance company relies on this distinction, for example, when it decides what to cover. Open heart surgery is treatment. A nose job, which you pay for yourself, is enhancement. This principle is also at the heart of most anti-doping policies. Treatment is OK. Enhancement is illegal. That’s why Tommy John surgery is supposed to be OK. It’s treatment: You blow out your ulnar collateral ligament so you get it fixed.

But wait a minute! The tendons we import into a pitcher’s elbow through Tommy John surgery are way stronger than the ligaments that were there originally. There’s no way Tommy John pitches so well into his early forties without his bionic elbow. Isn’t that enhancement?”


From Mark Pack’s well-rounded take on autonomous vehicles, which are being developed at an ever-accelerating pace and must now be a consideration during the planning of all long-range transportation projects:

“Think just how quickly driverless cars have developed in the last five years alone – and then think how long it takes to get planning permission, let alone build or fit out, a big public transport project. Public transport plans now should already be factoring in the high likelihood of a near future in which cars no longer need humans to drive them.

“Some of the benefits likely to accrue from this are brilliant – but do not require policy changes. A further improvement in road safety is likely for, as we have seen in other areas where automated machinery replaces humans in repetitive tasks, computers are more reliable, less sleepy and never drunk. Brilliant news for humanity (road deaths killed more people than genocides during the twentieth century after all), a useful saving for the NHS, but not something with many knock-on policy impacts.

Other changes are likely to be more troubling.”


We often measure the intelligence of animals based on how much they’re like humans or how closely they can follow tricks we try to teach them. But their talents are really Other and often amazing. They have superpowers we can only dream of. Via the ever-wonderful Browser, here’s an excerpt from “The Brains of Animals,” Amit Majmudar’s Kenyon Review post:

“There may come a time when we cease to regard animals as inferior, preliminary iterations of the human—with the human thought of as the pinnacle of evolution so far—and instead regard all forms of life as fugue-like elaborations of a single musical theme.

Animals are routinely superhuman in one way or another. They outstrip us in this or that perceptual or physical ability, and we think nothing of it. It is only our kind of superiority (in the use of tools, basically) that we select as the marker of ‘real’ superiority. A human being with an elephant’s hippocampus would end up like Funes the Memorious in the story by Borges; a human being with a dog’s olfactory bulb would become a Vermeer of scent, but his art would be lost on the rest of us, with our visually dominated brains. The poetry of the orcas is yet to be translated; I suspect that the whale sagas will have much more interesting things in them than the tablets and inscriptions of Sumer and Akkad.

If science should ever persuade people of this biological unity, it would be of far greater benefit to the species than penicillin or cardiopulmonary bypass; of far greater benefit to the planet than the piecemeal successes of environmental activism. We will have arrived, by study and reasoning, at the intuitive, mystical insights of poets.”


The opening of “Where Are They?” philosopher Nick Bostrom’s 2008 essay explaining why the discovery of extraterrestrial life may spell doom for earthlings:

“When water was discovered on Mars, people got very excited. Where there is water, there may be life. Scientists are planning new missions to study the planet up close. NASA’s next Mars rover is scheduled to arrive in 2010. In the decade following, a Mars Sample Return mission might be launched, which would use robotic systems to collect samples of Martian rocks, soils, and atmosphere, and return them to Earth. We could then analyze the sample to see if it contains any traces of life, whether extinct or still active. Such a discovery would be of tremendous scientific significance. What could be more fascinating than discovering life that had evolved entirely independently of life here on Earth? Many people would also find it heartening to learn that we are not entirely alone in this vast cold cosmos.

But I hope that our Mars probes will discover nothing. It would be good news if we find Mars to be completely sterile. Dead rocks and lifeless sands would lift my spirit.”


A follow-up post to the earlier one about Time‘s Michael Scherer seemingly being contacted by a telemarketing robot that was programmed to deceive him and deny being A.I. It may not have been a robot but a human telemarketer trying to hide a foreign accent by choosing pre-recorded answers. Alexis C. Madrigal of the Atlantic did a nice job in (perhaps) unraveling the mystery. From his article:

“The theory I heard — and keep in mind it is just a hypothesis to explain a perplexing situation — goes like this:

Samantha West is a human being who understands English but who is responding with a soundboard of different pre-recorded messages. So a human parses the English being spoken and plays a message from Samantha West. It is IVR, but the semantic intelligence is being provided by a human. You could call it a cyborg system. Or perhaps an automaton in that 18th-century sense.

If you’re reading this, you must be wondering: WHY?!?!

Well, while Americans accept customer service and technical help from people with non-American accents, they do not take well to telemarketing calls from non-Americans. The response rates for outbound marketing via call center are apparently abysmal.

So, Samantha West could be the rather strange solution to this set of circumstances and technical capabilities.”


Planes may not eventually need pilots, but hijackers likewise may not have to board to perform their machinations. On the former topic, an excerpt from Stephen Pope at Flying:

“Honeywell advanced technology guru Bob Witwer gave an interesting talk in Las Vegas this week in which he discussed the future of air travel and posed the intriguing question of whether airliners, cargo planes and business jets years from now will have a need for pilots or, indeed, even cockpit windows.

If the thought of the captain of your airliner being a software app that lives in the avionics gives you pause, you’re not alone. Still, as we shift to a satellite-based NextGen operating environment where airplanes can be controlled by computers in 4-D – that is, having the capability of hitting a specific point in space at a precise time, every time – will airliners really need two pilots? Will they even need one?

The idea that’s quietly gaining traction is that the ‘pilots’ would sit in an air-conditioned room in some central location on the ground and perform certain necessary flight duties via a comm link. Of course, we’ve already witnessed the rise of unmanned aerial vehicles, which have been used as killing machines in the airspace over foreign nations and for law enforcement and other duties here at home. The next logical step, many say, is to take aircraft that are currently piloted by humans and replace the pilots with computers. 

‘It’s kind of hard for me to imagine why we wouldn’t use unmanned vehicles 10 or 20 years from now to carry cargo if the infrastructure allowed us to move aircraft safely without a pilot,’ said Witwer, who is vice president for advanced technology at Honeywell Aerospace.”


George Dvorsky of io9 has a fascinating post about a telemarketing robot programmed to lie and deceive. An excerpt:

“Recently, Time Washington Bureau Chief Michael Scherer received a phone call from an apparently bright and engaging woman asking him if he wanted a deal on his health insurance. But he soon got the feeling something wasn’t quite right.

After asking the telemarketer point blank if she was a real person or a computer-operated robot, she chuckled charmingly and insisted she was real. Looking to press the issue, Scherer asked her a series of questions, which she promptly failed. Such as, ‘What vegetable is found in tomato soup?’ To which she responded by saying she didn’t understand the question. When asked what day of the week it was yesterday, she complained of a bad connection (ah, the oldest trick in the book).”


Tags: ,

Paper is too useful to ever completely disappear–and some think the scenario is even more sanguine than that–but could spying concerns mean a comeback for the dead trees? I think not, but not everyone agrees.  From Michael Lewis at the Toronto Star:

“Eugene Kaspersky said governments and corporations had already begun to elevate security concerns but revelations of U.S. spying activity contained in documents leaked over the summer by former National Security Agency contractor Edward Snowden added a new sense of urgency.

‘Big enterprises were even talking about back-to-paper scenarios because of espionage attacks,’ Kaspersky said Wednesday after the company released its 2014 cyber threat forecast.

‘Enterprises, governments — they are really serious about extra levels of security, extra regulation, disconnecting their services from the Internet, maybe even getting some processes back to paper,’ Kaspersky said.

‘It’s a very visible step backward.'”

Tags: ,

As often as I’ve criticized Bud Selig and Joe Torre for not acting quickly enough to eliminate home-plate collisions from baseball, I should stop and praise them for finally eradicating the concussion-inducing crashes. It’s time for the sport to progress, and this step is a good one.

Football, on the other hand, like boxing, has no answer for what ails it. No helmet is going to stop brain injuries. Football is in trouble. From Joshua Shepherd’s Practical Ethics post about the moral role parents play in their children participating in contact sports:

“A number of ethical questions arise in connection with this growing awareness. (What should the governing bodies of sports leagues do to protect players? What do teams owe players in such sports? Is the decision to play such a sport, or to continue playing in spite of suffering a concussion, really autonomous? Should fans speak up about player protection, and if not, are they complicit in the harm done to players? And so on.) Here I want to consider one question that has received little attention. It involves the role of parents in fostering participation in high-impact sports.

Without parental encouragement, participation in such sports would dramatically decrease. Certainly, parental encouragement or discouragement can be trumped. In societies which highly value such sports, some adolescents would find a way to participate. But I will not consider here the (vexing) question of how best to respect an adolescent’s budding autonomy. Arguably, if an adolescent wants to participate in a high-impact sport, a parent should acquiesce. Whether that argument is plausible depends, in part, on the risks of playing the sport in question. The question I want to consider is the following: is it morally permissible for parents to encourage their children to play high-impact sports?”

Tags: , ,

I don’t know that we needed research to show that observing something and photographing that same thing affects our brains differently. Two distinct acts even from the same perspective will necessarily send disparate bits of information to the brain. Focusing your iPhone is not the same thing as focusing your brain sans iPhone. From Sarah Knapton at the Telegraph:

“Taking photographs at a birthday or a wedding has become as natural as blowing out candles or cutting the cake.

But our obsession with recording every detail of our happiest moments could be damaging our ability to remember them, according to new research.

A study has shown that taking pictures rather than concentrating fully on the events in front of us prevents memories taking hold.

Dr. Linda Henkel, from Fairfield University, Connecticut, described it as the ‘photo-taking impairment effect.’

She said: ‘People so often whip out their cameras almost mindlessly to capture a moment, to the point that they are missing what is happening right in front of them.’”•

____________________________________________

Edwin H. Land brings instant gratification to photography, 1948:

Tags: ,

I’ve read a thing or two in the Hollywood trades that made the British anthology series, Black Mirror, sound like it’s right up my alley. Charlie Brooker’s Twilight Zone-ish program looks at the dark side of all things digital, which is a favorite topic of mine. (Though the bright side of technology is equally a fascination.) The first two paragraphs from Andy Greenwald’s wonderfully written Grantland consideration of the soul-shattering show and how we now live inside a series of screens, which seem like mirrors until we realize, perhaps too late, that they may be something else:

“Midway through ‘Be Right Back,’ the soul-cleaving fourth episode of the British anthology series Black Mirror, I sought refuge in a second screen. It happens sometimes when I watch TV, usually when things get too emotional, too painful, too intense. The mind can’t wander, so the hands do, fiddling with pens and scraps of paper, drumming on the desk. Eventually — inevitably — I found myself lifting up my iPhone, my thumb moving circles across its screen as if it were a rosary. The mindless swiping of Candy Crush Saga didn’t help me process my feelings about ‘Be Right Back,’ didn’t make it any easier to see Hayley Atwell’s face shattering like a dropped wine glass. But I guess it didn’t hurt much, either. Distancing myself made the experience of watching seem less passive. It restored a flickering feeling of control. I couldn’t handle what was coming at me, so I threw up a wall to stop it.

Modern life is full of little walls like that, tricks we can pull to blunt unwanted or unexpected impact. There’s always a game just a click away. Or a photo. Or a ‘friend.’ It’s actually what ‘Be Right Back’ is about. The episode begins by toying with our natural need to be distracted, placated, and protected from the world before demonstrating, in disturbing ways, how the world is increasingly designed to meet that need. It’s about how we’re willing to submerge ourselves in the comforting warmth of denial right up to the moment reality sidles up beside us and rips our hearts out of our chests. So was it ironic or inevitable the way I was idly crossing striped candies when Atwell yelled at Domhnall Gleeson for not being fully present? (Gleeson played her boyfriend, or at least he had earlier in the episode. The specifics are both too confusing and too important to the overall experience to discuss here.) I was hovering on the edge of two screens, fully engaged in neither. Did that make me the viewer or the subject? Which one was the game and which was the drama? Was I consuming media or was the media consuming me?”•

__________________

The entire history of you:

Tags: ,
