Monday 30 June 2014

News!! Facebook reveals news feed experiment to control emotions



It already knows whether you are single or dating, the first school you went to and whether you like or loathe Justin Bieber. But now Facebook, the world's biggest social networking site, is facing a storm of protest after it revealed it had discovered how to make users feel happier or sadder with a few computer keystrokes.

It has published details of a vast experiment in which it manipulated information posted on 689,000 users' home pages and found it could make people feel more positive or negative through a process of "emotional contagion".

In a study with academics from Cornell and the University of California, Facebook filtered users' news feeds – the flow of comments, videos, pictures and web links posted by other people in their social network. One test reduced users' exposure to their friends' "positive emotional content", resulting in fewer positive posts of their own. Another test reduced exposure to "negative emotional content" and the opposite happened.
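For illustration only, the following minimal Python sketch shows the kind of procedure described above: withholding a fraction of posts of one emotional tone from a feed and comparing the average tone of what remains. The sentiment() scorer, the omission rate and the sample posts are hypothetical stand-ins, not Facebook's actual system or the study's methodology (the published study scored posts with the LIWC word lists).

    import random

    def sentiment(text):
        # Hypothetical word-list scorer: +1 positive, -1 negative, 0 neutral.
        positive, negative = {"great", "happy", "love"}, {"sad", "awful", "hate"}
        words = set(text.lower().split())
        return (1 if words & positive else 0) - (1 if words & negative else 0)

    def filter_feed(posts, suppress="positive", omission_rate=0.3):
        # Randomly omit a share of posts whose tone matches `suppress`.
        kept = []
        for post in posts:
            tone = sentiment(post)
            matches = (tone > 0) if suppress == "positive" else (tone < 0)
            if matches and random.random() < omission_rate:
                continue  # withhold this post from the user's feed
            kept.append(post)
        return kept

    def mean_tone(posts):
        # Average tone of a set of posts; the study compared a measure like this
        # between the filtered groups and a control group.
        return sum(sentiment(p) for p in posts) / len(posts) if posts else 0.0

    # Usage: compare the tone of a feed before and after filtering out positive posts.
    feed = ["I love this!", "Feeling sad today", "Great news everyone", "Awful weather"]
    print(mean_tone(feed), mean_tone(filter_feed(feed, suppress="positive")))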

The study concluded: "Emotions expressed by friends, via online social networks, influence our own moods, constituting, to our knowledge, the first experimental evidence for massive-scale emotional contagion via social networks."

Lawyers, internet activists and politicians said this weekend that the mass experiment in emotional manipulation was "scandalous", "spooky" and "disturbing".

On Sunday evening, a senior British MP called for a parliamentary investigation into how Facebook and other social networks manipulated emotional and psychological responses of users by editing information supplied to them.

Jim Sheridan, a member of the Commons media select committee, said the experiment was intrusive. "This is extraordinarily powerful stuff and if there is not already legislation on this, then there should be to protect people," he said. "They are manipulating material from people's personal lives and I am worried about the ability of Facebook and others to manipulate people's thoughts in politics or other areas. If people are being thought-controlled in this kind of way there needs to be protection and they at least need to know about it."

A Facebook spokeswoman said the research, published this month in the journal Proceedings of the National Academy of Sciences in the US, was carried out "to improve our services and to make the content people see on Facebook as relevant and engaging as possible".

She said: "A big part of this is understanding how people respond to different types of content, whether it's positive or negative in tone, news from friends, or information from pages they follow."

But other commentators voiced fears that the process could be used for political purposes in the run-up to elections, or to encourage people to stay on the site by feeding them happy thoughts and so boosting advertising revenues.

In a series of Twitter posts, Clay Johnson, the co-founder of Blue State Digital, the firm that built and managed Barack Obama's online campaign for the presidency in 2008, said: "The Facebook 'transmission of anger' experiment is terrifying."

He asked: "Could the CIA incite revolution in Sudan by pressuring Facebook to promote discontent? Should that be legal? Could Mark Zuckerberg swing an election by promoting Upworthy [a website aggregating viral content] posts two weeks beforehand? Should that be legal?"

It was claimed that Facebook may have breached ethical and legal guidelines by not informing its users they were being manipulated in the experiment, which was carried out in 2012.

The study said altering the news feeds was "consistent with Facebook's data use policy, to which all users agree prior to creating an account on Facebook, constituting informed consent for this research".

But Susan Fiske, the Princeton academic who edited the study, said she was concerned. "People are supposed to be told they are going to be participants in research and then agree to it and have the option not to agree to it without penalty."

James Grimmelmann, professor of law at the University of Maryland, said Facebook had failed to gain "informed consent" as defined by the US federal policy for the protection of human subjects, which demands explanation of the purposes of the research and the expected duration of the subject's participation, a description of any reasonably foreseeable risks and a statement that participation is voluntary. "This study is a scandal because it brought Facebook's troubling practices into a realm – academia – where we still have standards of treating people with dignity and serving the common good," he said on his blog.

It is not new for internet firms to use algorithms to select content to show to users, and Jacob Silverman, author of Terms of Service: Social Media, Surveillance, and the Price of Constant Connection, told Wire magazine on Sunday that the internet was already "a vast collection of market research studies; we're the subjects".

"What's disturbing about how Facebook

went about this, though, is that they

essentially manipulated the sentiments

of hundreds of thousands of users

without asking permission," he said.

"Facebook cares most about two things:

engagement and advertising. If

Facebook, say, decides that filtering out

negative posts helps keep people happy

and clicking, there's little reason to

think that they won't do just that. As

long as the platform remains such an

important gatekeeper – and their

algorithms utterly opaque – we should

be wary about the amount of power and

trust we delegate to it."

Robert Blackie, director of digital at Ogilvy One marketing agency, said the way internet companies filtered information they showed users was fundamental to their business models, which made them reluctant to be open about it.

"To guarantee continued public

acceptance they will have to discuss

this more openly in the future," he said.

"There will have to be either

independent reviewers of what they do

or government regulation. If they don't

get the value exchange right then

people will be reluctant to use their

services, which is potentially a big

business problem."
