
Research aims to stop users sharing inappropriate information on Facebook

Tue 7 Jul 2015

UK researchers are attempting to define methodologies and approaches that can warn social network users when they are likely to share inappropriate content. The paper Implicit Contextual Integrity in Online Social Networks [PDF], published yesterday by Natalia Criado of Liverpool John Moores University and Jose M. Such of Lancaster University, details research in which ‘simulated societies’ were created in an attempt to monitor and define the labyrinthine problem of what ‘appropriate’ data-sharing actually means in an environment with contexts as varied as ‘work’, ‘friends’ and ‘partners’.

The coding of an intervention mechanism was the least of the tasks facing the research team – the problem is really one of context and currency. The report notes that inadequate privacy controls on social networks such as Facebook can lead to the damaging dissemination of personal information to ‘contexts’ of a user’s network that should not be privy to it. It cites the example of a lesbian woman who was ‘outed’ and faced threats to sever family ties, the rise in behaviour-based divorce petitions that name Facebook activity as a cause, and damaging leaks associated with identity theft, social phishing, cyberstalking and cyber-bullying. The report says:

‘In all these examples, the specific context determines whether or not the exchange of information is appropriate — e.g., one may be willing to share her political affiliations with friends but not with workmates.’

The solution researched is the creation of an Information Assistant Agent (IAA), which observes and evaluates the end-user’s network of contacts and develops a model of relationship contexts based on previous and current engagement. At the point that the end-user has filled in a ‘new post’ or private message form, the IAA may intervene to flag that the intended content may be inappropriate for the context into which the user is about to send it (i.e. a particular individual or group).


The complex task of the research is to create an algorithm that can provide meaningful and accurate warnings about disclosure to multiple contexts, such as groups containing family members, friends and work colleagues alike, or – for instance – work colleagues who fall further into the category of ‘friends’ (perhaps even ‘partners’) than other apparently similar colleagues in a user’s friend list. The possible permutations of context are difficult to evaluate and predict:

‘For example, Alice may want to send a message to her family in which she tells about Mary’s pregnancy. Charlie is her brother-in-law. Charlie also works at the same company (i.e., Mary and Charlie share the work context) and, as a consequence, Charlie might reveal the information about Mary’s pregnancy at work. Thus, this message may entail the dissemination of unknown information at the work context.’

A further problem in evaluation is ‘decay’, wherein the initial assumptions the algorithm might make about a relationship between one user and another become altered by other factors over the course of time. What if you used to send sports links to a colleague who (the algorithm may note) has since become your boss? Or if you disclosed very freely to a partner who is now an ex-partner, yet still a correspondent? Or if you once confided a great deal of personal information to someone you no longer write to frequently?

In the last case, there is the possibility that events undocumented on Facebook have changed the nature of your relationship with that individual (e.g. they are now your boss but have not updated their Facebook employment status); that some of the later correspondence between the two of you was fractious in a way that the algorithm would need to recognise as a diminution of relations; or even that there is no problem at all – you’re both just ‘hanging back’ a little, waiting for enough news to build up to make new communication worthwhile.

In the case of the ‘outed’ lesbian woman cited earlier, any preventative measure within the network would need to be able to understand that a gay person who is outspoken about their lifestyle within the walled garden of their chosen contacts may not want that aspect of their identity disclosed to certain individuals (their family) – who in many other respects may be among their closest attachments.


In the example at hand, any IAA would need to be able to understand the significance of the word ‘out’ in the name of a Facebook group in order to protect the woman from undesired disclosure of her sexuality to her family when sharing that group with them.

The report further outlines the difficulty of distinguishing between ‘inappropriate’ and merely ‘unusual’ information, since both types of information share certain ‘flaggable’ characteristics, such as scarcity.

Tags: Facebook, news, research