
Strasbourg, 5 October 2010

MC-NM(2011)009_en

COMMITTEE OF EXPERTS ON NEW MEDIA

(MC-NM)

______

2nd Meeting
25 – 26 March 2010
Agora Building
Room G 05

______

Proposal for draft
GUIDELINES FOR SEARCH ENGINE PROVIDERS
______

 

I Transparency

1. There is a delicate balance between the commercial need for search engines to optimise results and thus generate advertising revenues and the necessity to protect private life and personal data. In order to target advertisements effectively, search engines try to gain as much insight as possible into the characteristics and context of each individual query. An individual’s search history contains a footprint of that person’s interests, relations and intentions and should therefore be treated as sensitive data. The treatment of personal data by search engines is becoming even more crucial given the explosion and proliferation of audiovisual data (digital images, audio and video content) and the increasing popularity of mobile internet access. Specialised people search engines, location-based services, the inclusion of user-generated images in general-purpose search indexes and increasingly accurate face recognition technologies are some of the developments that raise deep concerns about the future impact of search engines on human rights such as privacy and freedom of expression.

2. Most internet users are unaware of the extent to which personal data are collected and of the purposes for which these data are used. If they are not aware of this processing, they are unable to make informed decisions about it. They require accurate, easily accessible and easily comprehensible information, adapted to different age groups and levels of education.

3. The collection and processing of large amounts of these sensitive data for direct marketing purposes, for example in the form of behavioural targeting, poses a significant challenge to the right to privacy. Allowing the public to make an informed decision about such use is best guaranteed by asking for specific consent, instead of offering an opt-out. In practice, there is no easy strategy for users to opt out. If, for example, they manage to delete all cookies from their computer, they will also delete the ‘opt-out cookies’ that record their opt-out from behavioural targeting by the various advertising networks or publishers.
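The opt-out paradox described above can be illustrated with a minimal sketch. The cookie names and the `CookieJar` class below are purely hypothetical, chosen to show the mechanism, not any real advertising network's implementation:

```python
# Hypothetical sketch of the opt-out cookie paradox: clearing all cookies
# also clears the very cookies that recorded the user's opt-out choice.

class CookieJar:
    """Simplified stand-in for a browser's cookie store."""

    def __init__(self):
        self.cookies = {}

    def set(self, name, value):
        self.cookies[name] = value

    def clear_all(self):
        # A privacy-conscious user wipes every cookie at once.
        self.cookies.clear()


jar = CookieJar()
jar.set("tracking_id", "abc123")   # behavioural-targeting identifier
jar.set("optout_adnet1", "1")      # opt-out flag for one ad network (illustrative name)
jar.set("optout_adnet2", "1")      # opt-out flag for another ad network

jar.clear_all()

# The tracking identifier is gone, but so are the opt-out flags:
# tracking can silently resume on the next visit unless the user
# opts out again, separately, for every network.
assert "tracking_id" not in jar.cookies
assert "optout_adnet1" not in jar.cookies
```

Because browsers store the opt-out preference in the same mechanism used for tracking, the only robust consent model is the opt-in the paragraph above recommends.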

4. Some of the currently described purposes, such as ‘the development of new services’ or ‘the offering of personalised advertising’, are too broadly defined to offer an appropriate framework for judging the legitimacy of the purpose. Many large search engine providers also offer services on other platforms, such as social networking services and webmail. The large amounts of personal data collected through the search platform can be used to develop new services within the search platform, or on other platforms. It is of some concern that such new purposes for the processing of personal data could be added retroactively, without an informed decision by the users.

5. There is an equally delicate balance between the necessity for search engines to protect their business methods and to protect the service against abuse by, for example, search spammers and malevolent distributors of spyware on the one hand and the importance of being transparent about the process of selecting and ranking search engine results on the other hand. Lack of transparency about the selection and ranking of search results, the possible blocking or filtering of specific types of content and the lack of knowledge by the public about the functioning of search engines pose an equally significant challenge to the right to freely seek and access information.

6. Full disclosure of business methods may not be possible, given that the precise algorithms used may be highly relevant to competition. Full disclosure about the ranking of results could also make search engine services more vulnerable to abuse. For example, undue influence could be exercised by malevolent distributors of spyware, but also by search spammers and third parties with a legitimate commercial interest using search engine optimisation techniques.

7. By ranking results, search engines impose particular sets of values as regards the relevance and quality of information for the public. Commonly, the relevance of results is determined by a combination of the number of hyperlinks pointing to particular websites and the previous click-through rates of presented results. Individual search histories, as well as geographical indicators of the adequacy of results, are also often used. The right of access to information may be challenged by non-transparent prioritisation.
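To make the value judgments embedded in ranking concrete, here is a deliberately simplified sketch combining the two signals named above (link counts and click-through rates). The weights and the logarithmic damping are illustrative assumptions; real search engines combine far more signals with proprietary algorithms:

```python
import math

# Illustrative relevance score: a weighted mix of link popularity and
# historical click-through rate. Weights are assumptions for the sketch,
# not values used by any actual search engine.

def relevance_score(inbound_links, click_through_rate,
                    link_weight=0.6, ctr_weight=0.4):
    """Combine a link-popularity signal with a click-through-rate signal."""
    link_signal = math.log1p(inbound_links)  # diminishing returns on raw link counts
    return link_weight * link_signal + ctr_weight * click_through_rate


# A heavily linked site outranks a site that users click more often,
# purely because of how the weights were chosen.
results = {
    "site_a": relevance_score(inbound_links=10_000, click_through_rate=0.30),
    "site_b": relevance_score(inbound_links=50, click_through_rate=0.90),
}
ranked = sorted(results, key=results.get, reverse=True)
```

Even this toy formula shows how the choice of weights silently decides which information the public sees first, which is why the paragraph above calls for transparency about prioritisation.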

II Rights of users to control their data

8. As regards the data held on the search history of their users, search engines should respect the rights of users to access and, where appropriate, to correct or delete information held about them. These rights apply foremost to the data from authenticated users stored by search engines, including personal profiles. However, these rights also apply to non-registered users.

III Data minimisation

9. If personal data are stored, the retention period should be no longer than necessary for the specific purposes of the processing. As personal data could be deleted after the end of a search session, continued storage needs an adequate justification. For each purpose, a limited retention time should be defined. Moreover, the set of personal data to be retained should not be excessive in relation to each purpose. Following opinion WP148 of the EU data protection authorities, in any case the maximum retention period should not exceed 6 months.

IV Censorship

10. To ensure freedom of information the user should be able to access Web content without censorship or restrictions. With regard to systematic nationwide filtering or blocking at the request of public authorities, search engine providers should strive for transparency and foreseeability by the public. With regard to individual filtering requests by private parties and individuals, search engine providers must adhere to the principle of due process and provide access to redress mechanisms.

V Co- and self-regulation

11. Efforts at co- and self-regulatory adherence to human rights standards have been made at the national level, for example in Germany and in France, and on a global scale, for example by the Global Network Initiative.