Quentin Hardy



We can spend our time discerning patterns, but with the surfeit of information at our disposal, no one has that much time.

Quentin Hardy of the New York Times, who does reliably excellent reporting about technology, has penned a piece about where the relationship between AI and business may be headed, with an emphasis on recognizing, sorting, mapping and teaching. Whether this means we’ll require a new class of workers or merely new algorithms to make sense of the mountains of data, no one can say for sure, but someone or something will have to explain the new normal. Both carbon and silicon will likely be required, at least initially.

An excerpt:

Over the last decade, smartphones, social networks and cloud computing have moved from feeding the growth of companies like Facebook and Twitter, leapfrogging to Uber, Airbnb and others that have used the phones, personal rating systems and powerful remote computers in the cloud to create their own new businesses.

Believe it or not, that stuff may be heading for the rearview mirror already. The tech industry’s new architecture is based not just on the giant public computing clouds of Google, Microsoft and Amazon, but also on their A.I. capabilities. These clouds create more efficient and supple use of computing resources, available for rent. Smaller clouds used in corporate systems are designed to connect to them.

The A.I. resources [Google Compute Engine head Diane B.] Greene is opening up at Google are remarkable. Google’s autocomplete feature that most of us use when doing a search can instantaneously touch 500 computers in several locations as it guesses what we are looking for. Services like Maps and Photos have over a billion users, sorting places and faces by computer. Gmail sifts through 1.4 petabytes of data, or roughly two billion books’ worth of information, every day.

Handling all that, plus tasks like language translation and speech recognition, Google has amassed a wealth of analysis technology that it can offer to customers. Urs Hölzle, Ms. Greene’s chief of technical infrastructure, predicts that the business of renting out machines and software will eventually surpass Google advertising. In 2015, ad profits were $16.4 billion.

“In the ’80s, it was spreadsheets,” said Andreas Bechtolsheim, a noted computer design expert who was Google’s first investor. “Now it’s what you can do with machine learning.”•
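
As a back-of-the-envelope check on the Gmail figure above (my arithmetic, not the article’s):

    \[
    \frac{1.4 \times 10^{15}\ \text{bytes}}{2 \times 10^{9}\ \text{books}}
      = 7 \times 10^{5}\ \text{bytes}
      \approx 700\ \text{kB per book}
    \]

A long novel stored as plain text runs to a few hundred kilobytes, so the two-billion-books comparison is in the right ballpark.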



At some point this century, and probably sooner than later, sensors will live inside pretty much all manufactured objects, moving every last thing into an interconnected data-gathering and -crunching system. Part of the mission will be to make individual lives and entire cities more efficient, constantly upgrading, but much of it will be about consumerism, creating and selling “products that respond to their owner’s tastes,” as Quentin Hardy of the New York Times notes in his really smart article about technologist Adam Bosworth attempting to bring about “data singularity.” All the world will be a “smart object,” privacy will be compromised to an unprecedented degree, and there’ll be no way to opt out. The blessing will be mixed.

An excerpt:

Imagine if almost everything — streets, car bumpers, doors, hydroelectric dams — had a tiny sensor. That is already happening through so-called Internet-of-Things projects run by big companies like General Electric and IBM.

All those devices and sensors would also wirelessly connect to far-off data centers, where millions of computer servers manage and learn from all that information.

Those servers would then send back commands to help whatever the sensors are connected to operate more effectively: A home automatically turns up the heat ahead of cold weather moving in, or streetlights behave differently when traffic gets bad. Or imagine an insurance company instantly resolving who has to pay for what an instant after a fender-bender because it has been automatically fed information about the accident.

Think of it as one, enormous process in which machines gather information, learn and change based on what they learn. All in seconds.

“I’m interested in affecting five billion people,” said Mr. Bosworth, a former star at Microsoft and Google who now makes interactive software at Salesforce.com, an online software company that runs sales for thousands of corporations. “We’re headed into one of those historic discontinuities where society changes.”
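
The loop described in the excerpt, sensors reporting to remote servers that learn from the readings and send commands back, can be caricatured in a few lines of Python. This is a minimal sketch; the class, the threshold and the thermostat rule are my inventions, not anything from the article:

    # A toy version of the sensor-to-cloud-to-command loop described above.
    from statistics import mean

    class CloudService:
        """Stands in for the remote data center that learns from sensor data."""
        def __init__(self):
            self.history = []

        def ingest(self, reading):
            self.history.append(reading)

        def command(self):
            # "Learning" here is just a moving average of recent outdoor
            # temperatures; a real system would run far richer models.
            recent = self.history[-12:]
            if recent and mean(recent) < 15.0:   # cold weather moving in
                return "raise_thermostat"
            return "hold"

    cloud = CloudService()
    for outdoor_temp in [18.0, 16.5, 14.0, 12.5, 11.0]:   # simulated sensor feed
        cloud.ingest(outdoor_temp)
    print(cloud.command())   # -> "raise_thermostat"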


Algorithms may be biased, but people certainly are. 

Financial-services companies are using non-traditional data cues to separate signal from noise in determining who should receive loans. I’d think that in the short term such code, if written well, may be fairer. It certainly has the potential, though, to drift in the wrong direction over time. If our economic well-being is based on real-time judgments of our every step, then we could begin mimicking behaviors that corporations desire, and, no, corporations still aren’t people.

From Quentin Hardy at the New York Times:

Douglas Merrill, the founder and chief executive of ZestFinance, is a former Google executive whose company writes loans to subprime borrowers through nonstandard data signals.

One signal is whether someone has ever given up a prepaid wireless phone number. Where housing is often uncertain, those numbers are a more reliable way to find you than addresses; giving one up may indicate you are willing (or have been forced) to disappear from family or potential employers. That is a bad sign. …

Mr. Merrill, who also has a Ph.D. in psychology…thinks that data-driven analysis of personality is ultimately fairer than standard measures.

“We’re always judging people in all sorts of ways, but without data we do it with a selection bias,” he said. “We base it on stuff we know about people, but that usually means favoring people who are most like ourselves.” Familiarity is a crude form of risk management, since we know what to expect. But that doesn’t make it fair.

Character (though it is usually called something more neutral-sounding) is now judged by many other algorithms. Workday, a company offering cloud-based personnel software, has released a product that looks at 45 employee performance factors, including how long a person has held a position and how well the person has done. It predicts whether a person is likely to quit and suggests appropriate things, like a new job or a transfer, that could make this kind of person stay.
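
For a sense of how signal-based scoring of this sort works mechanically, here is a deliberately crude sketch. The feature names, weights and bias are invented for illustration; ZestFinance’s and Workday’s actual models are proprietary and far more elaborate:

    # A toy logistic score built from nonstandard signals. Every number
    # below is made up; only the idea (weighted signals squashed into a
    # probability) reflects the article.
    import math

    WEIGHTS = {
        "gave_up_prepaid_number": -1.2,    # the "bad sign" Merrill describes
        "years_at_current_address": 0.3,
        "months_in_current_job": 0.05,
    }
    BIAS = -0.5

    def approval_probability(applicant: dict) -> float:
        """Weighted sum of signals, squashed into (0, 1) with a logistic."""
        z = BIAS + sum(WEIGHTS[k] * applicant.get(k, 0) for k in WEIGHTS)
        return 1 / (1 + math.exp(-z))

    print(round(approval_probability({
        "gave_up_prepaid_number": 1,
        "years_at_current_address": 2,
        "months_in_current_job": 18,
    }), 2))   # -> 0.45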


It seems that drivers may surrender the wheel, but will they concede ownership as well? “We don’t think people will give up their own cars,” asserts Mercedes-Benz futurist Eric Larsen in a New York Times Q&A with Quentin Hardy. That isn’t nearly the most disputable thing he said during the interview.

When asked about electric vehicles, Larsen says that the internal-combustion model “isn’t broken for most people,” since fracking has kept gas prices low and refueling with gasoline requires of the owner only five minutes weekly. I suppose in a lower-case sense the model isn’t broken, but in the much bigger one, the one in which we’re putting ourselves in a very precarious position environmentally, it seems hopelessly broken for everyone.

An excerpt about changes to the automobile interior, which touches on another thorny issue–privacy:

Question:

What has changed inside the car itself?

Eric Larsen:

Screens have become more important. Will a driver’s screen get lots of upgrades like a phone app? If you have a five-year-old car now, people know it by looking at the sound system and the screen. Leased vehicles may be refurbished more often, as dealers look to make them seem newer. Cars may become more modular that way, and there won’t be model years in American cars the way there were.

There is more awareness in the controls. You can’t input long addresses into a navigational system while you’re driving. When a car knows it is at rest, it may allow you to put the seat back further, letting you work, sleep or watch TV from the driver’s seat.

But there’s also a tightrope of personalization and privacy. Companies can know how fast you drive, how tight you corner. We’ve already seen start-ups that tell how fast you’re driving and how you are braking by using the sensors in your phone. It can be a capability in the car itself. As you get into “pay as you drive” car businesses, that will become an issue. There are legal points that have to be worked out.•
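
Detecting driving style from phone sensors, as the start-ups Larsen mentions do, comes down to watching for sharp changes in measured speed. A toy sketch, with an assumed deceleration threshold:

    # Flag hard-braking events from a series of speed samples (m/s), taken
    # once per second. The -3.5 m/s^2 threshold is an assumption for
    # illustration, not a figure from the interview.
    HARD_BRAKE_MPS2 = -3.5

    def hard_brake_events(speeds_mps: list[float], dt: float = 1.0) -> list[int]:
        """Return the sample indices where deceleration crosses the threshold."""
        events = []
        for i in range(1, len(speeds_mps)):
            accel = (speeds_mps[i] - speeds_mps[i - 1]) / dt
            if accel <= HARD_BRAKE_MPS2:
                events.append(i)
        return events

    print(hard_brake_events([20.0, 19.5, 15.0, 10.5, 10.0]))   # -> [2, 3]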

 


Quantifying our behavior is likely only half the task of the Internet of Things, with nudging us the other part of the equation. I don’t necessarily mean pointing us toward healthier choices we wouldn’t otherwise make (which is dubious if salubrious) but placing us even more inside a consumerist machine.

Somewhat relatedly: Quentin Hardy of the New York Times looks at how the data-rich tomorrow may mostly benefit the largest technology companies. An excerpt:

This sensor explosion is only starting: Huawei, a Chinese maker of computing and communications equipment with $47 billion in revenue, estimates that by 2025 over 100 billion things, including smartphones, vehicles, appliances and industrial equipment, will be connected to cloud computing systems.

The Internet will be almost fused with the physical world. The way Google now looks at online clicks to figure out what ad to next put in front of you will become the way companies gain once-hidden insights into the patterns of nature and society.

G.E., Google and others expect that knowing and manipulating these patterns is the heart of a new era of global efficiency, centered on machines that learn and predict what is likely to happen next.

“The core thing Google is doing is machine learning,” Eric Schmidt, Google’s executive chairman, said at an industry event on Wednesday. Sensor-rich self-driving cars, connected thermostats or wearable computers, he said, are part of Google’s plan “to do things that are likely to be big in five to 10 years. It just seems like automation and artificial intelligence makes people more productive, and smarter.”


Feedback loops between humans and machines, a new type of conversation and one that will ultimately be conducted in hushed tones, are among the goals of the Internet of Things. Measuring our minutiae will lead to a smarter society, but there’ll really be no way to opt out. The opening of Quentin Hardy’s New York Times Q&A with IoT enthusiast Tim O’Reilly:

Question:

The way most companies sell it, the Internet of Things is about gaining efficiency from putting all kinds of devices online. What is wrong with that definition?

Tim O’Reilly: 

The IoT is really about human augmentation. The applications are profoundly different when you have sensors and data driving the decision-making.

Question: 

Can you give me an example?

Tim O’Reilly: 

Uber is a company built around location awareness. An Uber driver is an augmented taxi driver, with real-time location awareness. An Uber passenger is an augmented passenger, who knows when the cab will show up. Uber is about eliminating slack time and worry.

People would call it “IoT” if there was a driverless car, but it already is part of the IoT. You can measure, test and change things dynamically. The IoT is about the interpolation of computer hardware and software into all sorts of things.

Question: 

But the IoT isn’t just about one sensor in two-way contact with a remote cloud computing battery of servers, or a driver and a rider with a smartphone. There are going to be lots of different data sets, and lots of different feedback loops.

Tim O’Reilly: 

The characteristics are that things are contingent, in relationship with other data. They are on demand. They are load-balanced, and aware of other parts of the system. That is why you get things like congestion pricing. It’s a more context-oriented world, because there is better data.

Question:

Why do you think this isn’t better understood?

Tim O’Reilly: 

We’re not letting the IoT teach us enough about what is possible once you add sensors. There is a complex interplay of humans, interfaces and machines. A big question is, How do we create feedback loops from devices to humans?•
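
Congestion pricing, which O’Reilly offers as an example of a context-aware system, is at bottom a price that responds to measured load. A toy sketch with invented numbers:

    # Scale a base fare once measured demand exceeds capacity; cap the surge.
    def congestion_price(base_fare: float, vehicles: int, capacity: int) -> float:
        utilization = vehicles / capacity
        if utilization <= 1.0:
            return base_fare
        return round(base_fare * min(utilization, 3.0), 2)   # cap at 3x

    print(congestion_price(10.0, vehicles=800, capacity=1000))    # -> 10.0
    print(congestion_price(10.0, vehicles=1500, capacity=1000))   # -> 15.0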


Cloud robotics is an absolute must if driverless cars and the Internet of Things are going to take off, but it’s also an invitation to mayhem, heretofore unimagined acts of terrorism and war. In Quentin Hardy’s New York Times interview with Berkeley roboticist Ken Goldberg, various angles of the topic are analyzed. The opening: 

Question:

What is cloud robotics?

Ken Goldberg:

Cloud robotics is a new way of thinking about robots. For a long time, we thought that robots were off by themselves, with their own processing power. When we connect them to the cloud, the learning from one robot can be processed remotely and mixed with information from other robots.

Question:

Why is that a big deal?

Ken Goldberg:

Robot learning is going to be greatly accelerated. Putting it a little simply, one robot can spend 10,000 hours learning something, or 10,000 robots can spend one hour learning the same thing.

Question:

How long has this been around?

Ken Goldberg:

The term ‘cloud robotics’ was coined in 2010 by James Kuffner, who was at Carnegie Mellon and then went to Google. I had been doing robot control over the Internet since the mid-90s, with a garden people could connect to, then plant seeds or water their plants.

The cloud is different from my Internet ‘telegarden,’ though. The cloud can have all the computation and memory stored remotely. That means all of the endpoints can be lightweight, and there is a huge collective benefit. These robots can address billions of behaviors and learn how to do important things quickly.

Question:

What are some examples of this?

Ken Goldberg:

Google’s self-driving cars are cloud robots. Each can learn something about roads, or driving, or conditions, and it sends the information to the Google cloud, where it can be used to improve the performance of other cars.

Health care is also very promising: Right now radiation treatments involve putting a radioactive seed next to a tumor, using a catheter that has to push through other tissue and organs. The damage could be minimized if the catheter worked like a robot and had motion planning to avoid certain objects. Tedious medical work, like suturing a wound, might be done faster and better. Giving intravenous fluids to Ebola patients is difficult and risks contamination; some people are looking at ways a robot could sense where a vein is and insert the needle.

Another area is household maintenance, particularly with seniors. Robots could pick up clutter, which would help elderly people avoid falling and hurting themselves.•
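
Goldberg’s “10,000 robots for one hour” point comes down to pooling what each robot learns. A minimal sketch, assuming each robot reports a locally estimated parameter (say, a grip force that worked well) and the cloud averages the reports and pushes the result back to the fleet:

    # Pool per-robot estimates into one shared parameter. The numbers and
    # the "grip force" framing are illustrative assumptions.
    from statistics import mean

    def pool_estimates(local_estimates: list[float]) -> float:
        """Combine per-robot estimates into a single shared model parameter."""
        return mean(local_estimates)

    reports = [4.9, 5.3, 5.1, 4.8, 5.2]   # one entry per robot's hour of trials
    shared = pool_estimates(reports)
    print(f"parameter pushed back to the fleet: {shared:.2f}")   # -> 5.06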


The Internet of Things makes too much sense not to happen, but there has to be some sort of universal operating standard before machines can communicate coherently with each other and with us, before our health and homes can be quantified and the connectivity of computers can be duplicated in all objects. It will result in challenges (e.g., everything will be a target of hackers) but also real benefits. From a post by Quentin Hardy at the New York Times’ “Bits” blog:

Attention: Internet of Things. For better or worse, big boys are in the room.

A consortium of industrial giants, including AT&T, Cisco, General Electric, IBM and Intel said on Thursday that they would cooperate to create engineering standards to connect objects, sensors and large computing systems in some of the world’s largest industrial assets, like oil refineries, factories or harbors. The White House and other United States governmental entities were also involved in the creation of the group, which is expected to enroll other large American and foreign businesses.

‘I don’t think anything this big has been tried before’ in terms of sweeping industrial cooperation, said William Ruh, vice president of G.E.’s global software center. ‘This is how we will make machines, people and data work together.’

There are connections among all sorts of industrial assets, like sensors on turbines or soda machines that tell suppliers when they are running low on cola.

The means by which this ‘Internet of Things’ uses power and sends data around has been somewhat haphazard.

The group, called the Industrial Internet Consortium, hopes to establish common ways that machines share information and move data.•


I love the Internet and the information it brings me, but I’m not on Facebook or Twitter and I don’t have a smartphone, so I’ve obviously said “no” to certain things. But will the things I’ve said “yes” to be the same tomorrow? Will they be quietly remade by updating, constant updating? From “When Tech Turns Nouns Into Verbs,” Quentin Hardy’s New York Times blog post about a world in which the tools you use to measure also measure you, where things, simply put, change:

“We’re remaking the world so quickly that our language is breaking down.

Think about the phone you carry. You talk with people on it, but you can also open apps and transform it into a camera or chess board. As much as you talk on it, you use its Internet browser. In total daily usage, your phone is mostly pinging cellphone towers and Wi-Fi antennas, informing phone service providers, digital map makers and retailers of where you are.

Whatever this object is, it isn’t a phone in any conventional sense. And that may be a clue to a whole new way of thinking about the world around us.

The phone is a little connected computer — a device whose uses and meaning we continually explore and modify. It is by no means a phone in the historical sense. It is still a physical object, of course, but it is really a vehicle for one or another software-enabled experience. In an important sense, it is made to be contingent, changing with every download and update. That focus on the needs-driven experience means it behaves less like a static noun and more like an active verb.

This is becoming a commonplace across our connected world.”


Douglas Rushkoff, author of Present Shock, discussing the anarchic nature of the flow of time in the Digital Age in a New York Times interview conducted by Quentin Hardy:

Question:

You say we have ‘a new relationship with time.’ What is it, and why is that a bad thing?

Douglas Rushkoff:

What we’ve done has made time even more dense. On Facebook, your past comes into your present when someone from your second grade class suddenly pops up to send you a message, and your future is being manipulated by what Facebook knows to put in front of you next. Present shock interrupts our normal social flow.

It didn’t have to be this way. When digital culture first came along, it was supposed to create more time, by allowing us to shift time around. Somehow instead we’ve strapped devices to ourselves that ping us all the time.

Question:

Hasn’t time been collapsing for centuries? We moved from the rhythm of seasons to living by the clock in the Industrial Age. We’ve paced in front of the microwave for decades.

Douglas Rushkoff: 

Yes, but it has hit a point where we have lost any sense of analog time, the way a second hand sweeps around a clock. We’ve chosen the false ‘now’ of our devices. It has led to a collapse of linear narratives and a culture where you have political movements demanding that everything change, now. The horrible truth is we are linear beings; we can’t multitask, and we shouldn’t keep interrupting important connections to each other with the latest message coming in.

Question:

It’s a funny thing: the counterculture used to talk about ‘Be here now,’ and the need to chase after self-awareness by seeking the eternal present. What is the difference between that world of the “now” and this one?

Douglas Rushkoff:

People are seduced by signals from the world, but that is manipulation, not reality. Computers have learned more about us than we’ve learned about them.•


The military contractor Lockheed Martin is speeding ahead into the world of quantum computing, which has the potential to rewrite the rules of what such machines can do. From Quentin Hardy in the New York Times:

“Ray Johnson, Lockheed’s chief technical officer, said his company would use the quantum computer to create and test complex radar, space and aircraft systems. It could be possible, for example, to tell instantly how the millions of lines of software running a network of satellites would react to a solar burst or a pulse from a nuclear explosion — something that can now take weeks, if ever, to determine.

‘This is a revolution not unlike the early days of computing,’ he said. ‘It is a transformation in the way computers are thought about.’ Many others could find applications for D-Wave’s computers. Cancer researchers see a potential to move rapidly through vast amounts of genetic data. The technology could also be used to determine the behavior of proteins in the human genome, a bigger and tougher problem than sequencing the genome. Researchers at Google have worked with D-Wave on using quantum computers to recognize cars and landmarks, a critical step in managing self-driving vehicles.

Quantum computing is so much faster than traditional computing because of the unusual properties of particles at the smallest level. Instead of the precision of ones and zeros that have been used to represent data since the earliest days of computers, quantum computing relies on the fact that subatomic particles inhabit a range of states. Different relationships among the particles may coexist, as well. Those probable states can be narrowed to determine an optimal outcome among a near-infinitude of possibilities, which allows certain types of problems to be solved rapidly.”
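
The “range of states” in the excerpt is conventionally written as a superposition. A standard textbook formulation, not drawn from the article:

    \[
    |\psi\rangle = \alpha\,|0\rangle + \beta\,|1\rangle, \qquad |\alpha|^{2} + |\beta|^{2} = 1
    \]

Measuring such a qubit yields 0 with probability |α|² and 1 with probability |β|², and a register of n qubits carries 2ⁿ amplitudes at once, which is the property quantum machines exploit to narrow a vast space of possibilities quickly.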
