“It’s A Day That’s Here”

Facebook has been shown in repeated studies over the years to make its users unhappy, but there’s a bright side. For Facebook, that is.

Knowing when someone is sad is useful when trying to manipulate them into buying stuff or staying on the social network longer. It’s good for the bottom line, and it happens so quietly you don’t even notice it. That’s true of so much of our tacit agreement with Digital Age technology. The “free” things we receive carry very large hidden costs.

Mark Zuckerberg can go on all the listening tours he likes and bottle-feed cattle in photo ops, but the communications monster he’s raised feels like it was stuffed with an abnormal brain.

Two pieces follow about algorithms snaking through our society in dubious ways: one about Zuckerberg’s unhappiness machine acting unethically, the other about stealth algorithms used to sentence criminals.


From Jessica Guynn of USA Today:

SAN FRANCISCO — Facebook admits it didn’t follow its own policies when it showed at least one advertiser how to reach emotionally insecure and vulnerable teens.

But, it says, Facebook does not offer tools to target advertising to users based on their emotional state.

According to a report by The Australian, the social network shared a 23-page presentation with a bank that showed Facebook’s ability to detect when users as young as 14 are feeling emotions such as defeat, stress, anxiety or simply being overwhelmed.

“Anticipatory emotions are more likely to be expressed early in the week, while reflective emotions increase on the weekend,” according to the leaked Facebook presentation. “Monday-Thursday is about building confidence; the weekend is for broadcasting achievements.”

Facebook said sharing the research was an “oversight.” It also said the data was collected anonymously and was not used to target ads.•


From Adam Liptak of the New York Times:

When Chief Justice John G. Roberts Jr. visited Rensselaer Polytechnic Institute last month, he was asked a startling question, one with overtones of science fiction.

“Can you foresee a day,” asked Shirley Ann Jackson, president of the college in upstate New York, “when smart machines, driven with artificial intelligences, will assist with courtroom fact-finding or, more controversially even, judicial decision-making?”

The chief justice’s answer was more surprising than the question. “It’s a day that’s here,” he said, “and it’s putting a significant strain on how the judiciary goes about doing things.”

He may have been thinking about the case of a Wisconsin man, Eric L. Loomis, who was sentenced to six years in prison based in part on a private company’s proprietary software. Mr. Loomis says his right to due process was violated by a judge’s consideration of a report generated by the software’s secret algorithm, one Mr. Loomis was unable to inspect or challenge.
 
In March, in a signal that the justices were intrigued by Mr. Loomis’s case, they asked the federal government to file a friend-of-the-court brief offering its views on whether the court should hear his appeal.

The report in Mr. Loomis’s case was produced by a product called Compas, sold by Northpointe Inc. It included a series of bar charts that assessed the risk that Mr. Loomis would commit more crimes.

The Compas report, a prosecutor told the trial judge, showed “a high risk of violence, high risk of recidivism, high pretrial risk.” The judge agreed, telling Mr. Loomis that “you’re identified, through the Compas assessment, as an individual who is a high risk to the community.”•
