Strasbourg, 15 September 2011
COMMITTEE OF EXPERTS ON NEW MEDIA
Measures to protect and promote respect for human rights with regard to social networking services1
Document prepared by the Secretariat
DRAFT RECOMMENDATION OF THE COMMITTEE OF MINISTERS TO MEMBER STATES
1. Social networking services are increasingly becoming an important part of people’s daily lives. They are a tool for expression but also for communication between individuals or for mass communication. This complexity gives them great potential to promote the exercise and enjoyment of human rights and fundamental freedoms, in particular the freedom to express oneself and to create and exchange content and communications.
2. Given their increasingly prominent role, social networking services and other social media services also offer great possibilities for enhancing the individual’s right to participate in political, social and cultural life. Bearing in mind Recommendation (2007)16 of the Committee of Ministers on the public service value of the Internet which states that the Internet and other ICT services have high public service value in that they serve to promote the exercise and enjoyment of human rights and fundamental freedoms for all who use them, greater efforts could be put into exploring how social networking services and other social media could act as a means to enhancing participation (especially of marginalised groups in society) and contributing to the strengthening of democracy and social cohesion.
3. The right to freedom of expression and information, as well as the right to private life and human dignity, may also be challenged on social networking services. These challenges may arise, for example, through lack of due process preceding the exclusion of users, insufficient protection of children2 and young people against the harmful behaviour of others, violation of other people’s rights, lack of privacy-protective default settings, and lack of transparency about the purposes for which personal data is being collected and processed.
4. Users of social networking services need to respect other people’s rights and freedoms. Media education is particularly important in the context of social networking services in order to make the users aware of their rights when using these tools. Media literacy should also help individuals to acquire the human rights values and behaviour necessary to respect other people’s rights and freedoms.
5. A number of co- and self-regulatory mechanisms have already been set up in some Council of Europe member states. It is important that these mechanisms respect procedural safeguards, in line with the right to a fair trial within a reasonable time and with the presumption of innocence.
6. The Committee of Ministers recommends that member states, in cooperation with private sector actors and civil society, develop and promote coherent strategies to protect and promote respect for human rights with regard to social networking services, in line with the European Convention on Human Rights (ETS No. 5), especially Article 8 (Right to respect for private and family life) and Article 10 (Freedom of expression) and with the Convention for the Protection of Individuals with regard to Automatic Processing of Personal Data (ETS No. 108), in particular by:
i. ensuring users are aware of possible challenges to their human rights on social networking services (in particular their freedom of expression and information and their right to private life and protection of personal data) as well as on how to avoid having a negative impact on other people’s rights when using these services;
ii. protecting users of social networking services from harm from other users while also ensuring all users’ right to freedom of expression and access to information;
iii. encouraging transparency about data processing, and in particular about the kinds of personal data that are being collected and the legitimate purposes for which they are being processed, including further processing by third parties;
iv. preventing the illegitimate processing of personal data;
v. encouraging providers of social networking services to set up co- or self-regulatory mechanisms;
vi. taking measures with regard to social networking services in line with the principles set out in the appendix to this recommendation;
vii. bringing these principles to the attention of all relevant public authorities and private actors, in particular social networking providers, and civil society.
I. Transparency as regards freedom of expression and access to information
1. Social networking services offer the possibility to both receive and impart information. Users can invite recipients on an individual basis, but in most cases the recipients are a dynamic group of people, sometimes even a “mass” of unknown people (all the members of the social network). In cases where users’ profiles are indexed by search engines there is potentially unlimited access to parts of or all information published on the profile.
2. It is important for participants to feel confident about imparting information and to know whether this information has a public or private character. In particular, children, especially teenagers, and other categories of vulnerable people need guidance in order to be able to manage their profile and understand the impact that the publication of information of a private nature could have, in order to prevent harm to themselves and others.
3. In cooperation with the private sector and civil society, member states should ensure that the users’ right to freedom of expression is guaranteed, in particular by:
i. informing users clearly about the difference between private and public communication and the possible consequences of unlimited access (in time and geographically) to their profile and communication;
ii. providing information about the core conditions of participating in the social networking service in a form and language that is geared to, and easily understandable by, the target groups of the social networking service;
iii. fostering awareness initiatives for parents and teachers to supplement information provided by the social networking service.
II. Appropriate protection of children against harmful content and behaviour
4. Freedom of expression includes the freedom to impart and receive information which may be shocking, disturbing and offensive and/or content that is unsuitable for particular age groups. In some cases, human dignity and the duty to respect and protect the rights of vulnerable groups may outweigh this right to freedom of expression.
5. Social networking services play an increasingly important role in the life of children, as part of the development of their own personality and identity, and as part of their participation in debates and social activities. There is in any case a need to protect children against the inherent vulnerability that their age implies. This does not, however, entail an obligation on social networking services to control, supervise and/or rate all content uploaded by its users. Parents should play a primary role in working with children to ensure that they are using the services in an appropriate manner.
6. Age-verification systems are often described as a possible solution for protecting children from content that may be harmful to them. However, there is no single technical solution for online age verification that does not infringe on other human rights and/or does not facilitate age falsification, thus causing greater risks than benefits to the children involved.
7. In cooperation with the private sector and civil society, member states should ensure users’ safety and protect their human dignity while also guaranteeing procedural safeguards and the right to freedom of expression and access to information, in particular by:
i. informing users what content is considered “illegal” according to legal provisions and what content or behaviour is considered “inappropriate” according to the core conditions of the social networking service;
ii. encouraging law enforcement bodies and social networking services to establish transparent mechanisms for cooperation and to promote qualified initiatives such as hotlines;
iii. ensuring that users have easy access to mechanisms for reporting inappropriate and illegal content or behaviour of other users to the social networking service provider;
iv. adopting other specific measures to prevent cyberbullying and cybergrooming, such as labelling and age-rating of content; however, age-differentiated access should be treated carefully, as a best effort that is based on age input provided by the children themselves;
v. ensuring that any decisions to block content should be taken in accordance with Recommendation (2008)6 of the Committee of Ministers to member states on measures to promote the respect for freedom of expression and information with regard to Internet filters and its guidelines;
vi. guaranteeing that blocking and filtering, and, in particular, nationwide general blocking or filtering measures, are only introduced by the state if the conditions of Article 10, paragraph 2, of the European Convention on Human Rights are fulfilled, and avoiding the general blocking of offensive or harmful content for users who are not part of the groups that a filter has been activated to protect. Instead, encouraging social networking services to offer adequate and transparent voluntary individual filter mechanisms may suffice to protect those groups.
III. Ensuring users’ control over their data
8. Social networking services process large amounts of personal data, including users’ profiling data and traffic data. Publishing personal data in a profile can lead to access by third parties, including, amongst others, employers, insurance companies, law enforcement agencies and the secret services.
In order for users to exercise control over their data, default settings must be privacy-protective, as already highlighted by a number of instruments adopted at both European and international level3. The interface must be clear and allow users to effectively exercise their rights.
9. Social networking services should not process personal data beyond the legitimate and specified purposes for which they have collected it. They should limit processing only to that data which is strictly necessary for the agreed purpose and for as short a time as possible. Social networking services must seek the informed consent of users if they wish to process new data about them, share their data with other categories of people or companies and/or use their data in other new ways. The user’s decision (refusal or consent) should not have any effect on the continued availability of the service to him or her. When allowing third party applications to access users’ personal data, the services must provide sufficiently multi-layered access to allow users to specifically consent to access to different kinds of data.
10. The default setting for users should be that access is limited to self-selected “friends” or contacts. Users should be able to make an informed decision to grant access to a larger public, in particular with regard to indexability by external search engines. The social networking service must offer adequate, refined possibilities to ‘opt in’ for (consent to) wider access. In case a user wants to widen access to all users of a social networking service or even globally, through indexability by external search engines, it must be clear - and the appropriate tools must be easily accessible - how they may restrict access again, including removal from archives and search engine caches. Photo-tagging suggestions through facial recognition should not be activated by default.
11. It is key that social networking services apply the most appropriate security measures to protect personal data against unlawful access by third parties. Such measures should include the site-wide encryption of the connection against eavesdropping. In the absence of applicable data-breach legislation, social networking services are encouraged to report personal data breaches to their users, to enable them to take preventive measures, such as changing their password and/or keeping a close eye on their financial transactions (where the providers are in possession of bank or credit card details). Providers can also strengthen security and users’ control of their personal data through the implementation of the privacy-by-design principle.
12. Users should be informed about the processing of their personal data, including the existence of, and means of exercising, their rights (i.e. access, rectification, erasure), in a clear and understandable manner, in language geared, where necessary, to the target audience. Users should be informed about possible challenges to their right to private life, not only in the social networking services’ core conditions, but every time such a challenge may arise, for example, when the users make information on their profile available to new (groups of) users or when they install a third-party application.
13. The practice of pseudonymous profiles offers both possibilities and challenges for human rights. In its Declaration on freedom of communication on the Internet (adopted on 28 May 2003), the Committee of Ministers stressed that “in order to ensure protection against online surveillance and to enhance the free expression of information and ideas, member States should respect the will of users of the Internet not to disclose their identity”. The right of being able to use an online pseudonym should be guaranteed both from the perspective of free expression of information and ideas and from the perspective of the right to private life. This right should not be confused with the possibility for the services to register the real identity of users, and implementation of this right should be accompanied by an effective control system for inappropriate behaviour such as complaint and report mechanisms, moderation, etc.
14. Users should always be able to withdraw their consent to the processing of their personal data. Before terminating their account, users should be able to easily and freely move the data they have uploaded to another service or device, in a usable format. Upon termination, all data from and about the users should be permanently eliminated from the servers of the social networking service.
15. Non-users of the social network may also be affected by the incautious disclosure of their personal data by users of the service, or by the use of their data by the social networking service itself. Non-users should thus have effective means of exercising their rights without having to become a member of the service and/or otherwise provide excessive personal data. Social networking providers should refrain from collecting and processing personal data about non-users, such as, for example, e-mail addresses and biometric data. Users should be made aware of the obligations they have towards other individuals and, in particular, that the publication of personal data relating to other people must respect the rights of those individuals.
16. In cooperation with the private sector and civil society, member states should ensure that users’ right to private life is protected, in particular by:
i. enforcing applicable privacy principles, especially that social networking services have privacy-friendly default settings that limit access to self-selected “friends” or contacts, that they apply the most appropriate security measures and that they ask for the informed consent of their users before they process new data about them, share their data with other categories of people or (categories of) companies and/or use their data in other new ways;
ii. ensuring that users are able to effectively exercise their rights by offering, amongst other things, a clear user interface, understandable and readily accessible information about the purposes of the data processing, and sufficiently multi-layered access for third parties;
iii. ensuring that users are informed about possible consequences of publishing personal data in a profile, as well as about possible legal access by third parties (including also e.g. law enforcement authorities);
iv. ensuring that users are informed about the need to obtain the prior consent of other people before they publish their personal data, including audio and video, in cases where they have widened access beyond self-selected friends or contacts;
v. guaranteeing that users must be able to completely delete their profile and all data stored about and from them in a social networking service; [this includes tools for parents to manage their children’s data];
vi. guaranteeing the possibility of using a pseudonym;
vii. ensuring that the processing of personal data stemming from the use of social networking services for law enforcement purposes is only carried out under an appropriate legal framework, or following specific orders or instructions from the competent public authority made in accordance with the law.
IV. Gender equality
17. Gender inequality may be perpetuated in social network services, in particular because those services may be used for gender-based violence and abuse (cyber stalking, sexual harassment, etc.). In cooperation with the private sector and civil society, and in particular associating women as a distinct stakeholder group, member states should ensure that gender equality is respected and that women’s right to participation in public life on the Internet is ensured, in particular by:
encouraging the adoption of self-regulatory standards aiming at preventing gender-based violence and discrimination, and enhancing respect for women’s dignity and their rights.
GUIDELINES FOR SOCIAL NETWORKING PROVIDERS
Social networking services provide a very important platform both for receiving and imparting information. They are therefore an important tool both for realising the human right to free expression as well as participation in social, cultural, economic [and in some cases even, political] life.
It is important for individuals using social networking services to feel confident about using these tools. They have to be sure that their right to private life will be protected when they use social networking services and that their personal data will not be misused. They also have to understand when the information they post online is no longer private correspondence but has become available to a large public.
It is equally important to recall that the exercise of freedom of expression carries with it duties and responsibilities, in particular as regards the protection of health and morals and the rights of all users. Social network providers are encouraged to ensure that users are protected from harmful content or actions such as cyberbullying.
Social network providers should promote and facilitate users’ wellbeing while respecting fundamental rights, in particular the right to freedom of expression and the right to private life and secrecy of correspondence.
Social network providers are therefore encouraged to take note of, discuss and do their utmost, including through self-regulatory standards, to comply with the following guidelines. These guidelines should be read and understood in connection with the relevant Council of Europe documents, in particular the [draft] Recommendation on measures to protect and promote respect for human rights with regard to social networking services [CMRec…].
Social network providers are invited to:
i. Inform users clearly about the core conditions of the service in a form and language that is appropriate to, and easily understandable by, the target group of the social networking site (for example, short videos or information in ‘plain language’).
ii. Inform users in particular about the difference between private and public communication and the possible consequences of unlimited access (in time and geographically) to their profile and communication.
iii. If possible, offer, or contribute to, awareness-raising initiatives for users, parents and teachers on the safe use of social networking services which is respectful of other people’s rights.
iv. Inform the user clearly about what content is considered “illegal” according to legal provisions and what content or behaviour is considered “inappropriate” according to core conditions of the social networking service.
v. Refrain from collecting and processing personal data about non-users, such as, for example, e-mail addresses and biometric data.
vi. Offer non-users effective means of exercising their rights when data relating to them are disclosed by users, without having to become a member of the service and/or otherwise having to provide excessive personal data.
vii. Ensure that users have easy access to mechanisms for reporting inappropriate and illegal content, or behaviour, of other users to the service provider.
viii. Adopt other specific measures to prevent cyberbullying and cybergrooming, such as labelling and age-rating of content. However, offering age-differentiated access should be treated carefully, as a best effort based on age input provided by the children themselves.
ix. Establish transparent mechanisms for cooperation with law enforcement bodies and promote qualified initiatives such as hotlines.
x. Ensure that any decisions to block content are taken in accordance with Recommendation (2008)6 of the Committee of Ministers to member states on measures to promote the respect for freedom of expression and information with regard to Internet filters and its guidelines.
xi. Ensure, in particular, that self-regulatory mechanisms set up to protect users from illegal and harmful content are effective, transparent, independent and accountable and give individuals the right to appeal decisions to block content.
xii. Respect applicable data protection principles: in particular, limit access by default to self-selected friends or contacts, apply the most appropriate security measures and have legitimate grounds for the processing of personal data for specific purposes, including further processing by third parties and use for behavioural advertising.
xiii. Ensure that photo-tagging suggestions through facial recognition are not activated by default.
xiv. Ensure transparent information for users about the management of their personal data, including the existence of, and the means of exercising, their rights, in a form and language that is appropriate for the target group of the social networking services, in particular children, especially teenagers, and other vulnerable people.
xv. Ensure that users are informed about the need to obtain the prior consent of other people before they publish their personal data, including audio and video, in cases where they have widened access beyond self-selected friends or contacts.
xvi. Make sure that users are able to exercise their rights in respect of their personal data and, in particular, completely delete their profile and all data stored about and by them in a social networking service [this includes tools for parents to manage their children’s data].
xvii. Ensure the possibility of pseudonymous profiles.
xviii. Adopt other specific measures to prevent gender inequality, gender-based violence and abuse.
1 Draft Recommendation of the Committee of Ministers to member states and Guidelines
2 Below the age of 18.
3 See Article 29 Data Protection Working Party, Opinion 5/2009 (12 June 2009); 30th International Conference of Data Protection and Privacy Commissioners, Resolution on Privacy Protection in Social Network Services (Strasbourg, 17 October 2008); International Working Group on Data Protection in Telecommunications (IWGDPT), “Rome Memorandum” (Rome, 3-4 March 2008).