Project Proposal by Martin Stacey


Manipulating beliefs and behaviour via social media

Software

None

Covers

Ethics, psychology, social impact of the internet, social impact of technology

Skills Required

Interest in ethics and moral philosophy, interest in social impact of computers, interest in psychology, ideally some understanding of statistics

Challenge

Conceptual: ????  Technical: ??  Programming: none

Brief Description

How information is presented on social media systems such as Facebook, what information is selected for presentation, and how users then pass it on, might all have a significant influence on people's beliefs, attitudes and behaviour. What is it acceptable to control or alter, and how, when, and why?

This information can be manipulated for purposes of scientific research. In 2014, researchers at Facebook and Cornell University published a study finding evidence of large-scale 'emotional contagion': users transmitted positive or negative emotional messages to each other on Facebook, influenced by the emotional content of their manipulated news feeds. This got Facebook - and Cornell - a lot of adverse publicity for violating the ethical standards expected for research on human subjects. Cornell was forced to make a public statement that since its researchers had used data previously gathered by Facebook, they had not needed to take the study through ethics review.

This information might also be manipulated for commercial gain, or to persuade people to hold particular beliefs or attitudes, by selective filtering of information, by choice of descriptive words or framing contexts, and by outright distortion or lying - all familiar tactics used by businesses and political groups in other media. For instance, American political advertising frequently involves lying by implication, without making individual statements that are demonstrably false: if you're well informed about the facts you're not being told, you can see that you're being bamboozled, but all the voters who aren't... And the politicians being slandered can't scream "That's a lie!" - they can only hope that the voters will listen to a complicated explanation.

Systematic distortion and manipulation need not be under direct human control, or even intentional. In February 2018, The Guardian published an article reporting how YouTube's Up Next algorithm directs attention to extremist views, fake news and hate speech, systematically biasing the consumption of political content in pro-Brexit, pro-Trump, racist and Islamophobic directions, simply because extreme content is effective clickbait. There is a serious possibility that this has influenced close elections.
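
To see how such a bias can emerge with no human intent at all, here is a minimal sketch of an engagement-maximising recommender. This is not YouTube's actual algorithm: the items, the click probabilities and the epsilon-greedy strategy are all invented for illustration. The only assumption carried over from the article is that extreme content gets clicked more often than sober content.

    import random

    # Toy catalogue: (label, true probability that a user clicks).
    # Invented numbers; the assumption is just that extreme content
    # is more clickable than sober content.
    ITEMS = [("balanced report", 0.05), ("partisan take", 0.08),
             ("conspiracy video", 0.15), ("outrage clip", 0.20)]

    def simulate(steps=20_000, explore=0.05, seed=1):
        """Epsilon-greedy recommender that optimises only for clicks."""
        rng = random.Random(seed)
        shows = {label: 1 for label, _ in ITEMS}
        clicks = {label: 0 for label, _ in ITEMS}
        counts = {label: 0 for label, _ in ITEMS}
        for _ in range(steps):
            if rng.random() < explore:    # occasionally try anything
                label, appeal = rng.choice(ITEMS)
            else:                         # otherwise pick best CTR so far
                label, appeal = max(
                    ITEMS, key=lambda it: clicks[it[0]] / shows[it[0]])
            shows[label] += 1
            counts[label] += 1
            if rng.random() < appeal:     # did the user click?
                clicks[label] += 1
        return counts

    if __name__ == "__main__":
        counts = simulate()
        total = sum(counts.values())
        for label, n in counts.items():
            print(f"{label:>16}: recommended {n / total:.0%} of the time")

With these made-up numbers, the most provocative content ends up being recommended the overwhelming majority of the time, even though nothing in the code mentions politics: the bias is an emergent property of optimising for clicks.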

Variants

A project on how social media applications such as Facebook can be used to manipulate users' behaviour could have a number of different slants.

Resources

The short paper reporting the controversial Facebook study discussed above:

Kramer, A.D.I., Guillory, J.E. & Hancock, J.T. (2014) Experimental evidence of massive-scale emotional contagion through social networks. Proceedings of the National Academy of Sciences, 111, 8788-8790.

Sissela Bok's book Lying: Moral Choice in Public and Private Life would be a good place to start considering the morality of deception.

Lewis, P. (2018) 'Fiction is outperforming reality': how YouTube's algorithm distorts truth. The Guardian, 2 February 2018.

