Strasbourg, 5 October 2010

MC-NM(2011)009_en

 

COMMITTEE OF EXPERTS ON NEW MEDIA

(MC-NM)

______

2nd Meeting
25 – 26 March 2010
Agora Building
Room G 05

______

Proposal for draft
GUIDELINES FOR SEARCH ENGINE PROVIDERS
______

 

I Transparency

1. There is a delicate balance between the commercial need for search engines to optimise results and thus generate advertising revenues and the necessity to respect private life and protect personal data. In order to target advertisements properly, search engines try to gain as much insight as possible into the characteristics and context of each individual query. An individual’s search history contains a footprint of that person’s interests, relations and intentions and should therefore be treated as sensitive data. The treatment of personal data by search engines is becoming even more crucial given the explosion and proliferation of audiovisual data (digital images, audio and video content) and the increasing popularity of mobile internet access. Specialised people search engines, location-based services, the inclusion of user-generated images in general-purpose search indexes and increasingly accurate face recognition technologies are some of the developments that raise deep concerns about the future impact of search engines on human rights such as privacy and freedom of expression.

2. Most internet users are unaware of the extent of the collection of personal data and of the purposes they are being used for. If they are not aware of this processing they are unable to make informed decisions about it. They require accurate, easily accessible and easily comprehensible information, adapted to different age groups and levels of education.

    Search engine providers should give a comprehensive overview of the different specified, explicit and legitimate purposes for which they process personal data. It is key for search engine providers to explore innovative ways of presenting this information to the public, beyond the general terms and conditions.

3. The collection and processing of large amounts of these sensitive data for direct marketing purposes, for example in the form of behavioural targeting, poses a significant challenge to the right to privacy. Allowing the public to make an informed decision about such use is best guaranteed by asking for specific consent, instead of offering an opt-out. In practice, there is no easy strategy for users to opt out. If, for example, they manage to delete all cookies from their computers, they will also delete the ‘opt-out cookies’ that register their opt-out from behavioural targeting with the different advertising networks or publishers.

    Search engine providers must inform their (potential) users about the specific and legitimate purposes for which personal data are being processed. If data are being used for direct marketing purposes or the creation of profiles, consent of the users is clearly preferred over any opt-out strategy.
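The opt-out cookie problem described in paragraph 3 can be illustrated with a deliberately simplified toy model (all names, such as the `adnet_optout` cookie, are invented for illustration and correspond to no real browser or advertising-network interface): because the opt-out itself is stored as a cookie, the only bulk control most users know, deleting all cookies, silently re-enables tracking.

```python
# Toy model of the opt-out cookie paradox (paragraph 3).
# All names are illustrative; no real browser or ad-network API is used.

class Browser:
    def __init__(self):
        self.cookies = {}  # cookie name -> value

    def set_cookie(self, name, value):
        self.cookies[name] = value

    def clear_all_cookies(self):
        # The only bulk cookie control most users know about.
        self.cookies.clear()

def ad_network_tracks(browser):
    # The network tracks unless its own opt-out cookie is present.
    return browser.cookies.get("adnet_optout") != "1"

b = Browser()
b.set_cookie("adnet_optout", "1")    # user opts out of behavioural targeting
assert ad_network_tracks(b) is False

b.clear_all_cookies()                # user later deletes all cookies
assert ad_network_tracks(b) is True  # the opt-out is gone: tracking resumes
```

This is why the guideline above prefers prior consent over an opt-out: the opt-out mechanism fails precisely when users exercise the privacy control they are most familiar with.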

4. Some of the currently described purposes, such as ‘the development of new services’ or ‘the offering of personalised advertising’, are too broadly defined to offer an appropriate framework for judging the legitimacy of the purpose. Many large search engine providers also offer services on other platforms, such as social networking services and webmail. The large amounts of personal data collected through the search platform can be used to develop new services within the search platform or on other platforms. It is of some concern that such new purposes for the processing of personal data could be added retroactively, without an informed decision by the users.

    Search engine providers should seek informed consent from their users if they wish to use the personal data they have already collected for new processing purposes.

5. There is an equally delicate balance between, on the one hand, the necessity for search engines to protect their business methods and to protect their services against abuse by, for example, search spammers and malevolent distributors of spyware and, on the other hand, the importance of being transparent about the process of selecting and ranking search results. A lack of transparency about the selection and ranking of search results, the possible blocking or filtering of specific types of content and the public’s limited knowledge of how search engines function pose an equally significant challenge to the right to freely seek and access information.

6. Full disclosure of business methods may not be possible, given that the precise algorithms used may have a high relevance for competition. Full disclosure about the ranking of results could also result in increased vulnerability of search engine services to abuse of their services. For example, undue influence could be exercised by malevolent distributors of spyware, but also search spammers and third parties with a legitimate commercial interest using search engine optimisation techniques.

7. By ranking results, search engines impose particular sets of values as regards the relevance and quality of information for the public. Commonly, the relevance of results is determined by a combination of the number of hyperlinks pointing to particular websites and the previous click-through rates of presented results. Individual search histories are often used as well, along with geographical indications of the adequacy of results. The right of access to information may be challenged by such opaque prioritisation.

    As a vital service for the information society, search engine providers should be transparent about the criteria they apply to select and rank results.
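The signals listed in paragraph 7 can be sketched as a deliberately simplified scoring function (the weights and field names are invented; real ranking algorithms are far more complex and, as the text notes, usually undisclosed). The point of the sketch is that the weighting alone, which users never see, decides what appears first.

```python
# Simplified ranking sketch for paragraph 7. Weights and signal names
# are invented for illustration; real algorithms are undisclosed.

def relevance_score(page, weights):
    return (weights["links"] * page["inbound_links"]
            + weights["ctr"] * page["click_through_rate"]
            + weights["personal"] * page["matches_search_history"]
            + weights["geo"] * page["matches_user_region"])

pages = [
    {"url": "a.example", "inbound_links": 120, "click_through_rate": 0.02,
     "matches_search_history": 0, "matches_user_region": 1},
    {"url": "b.example", "inbound_links": 40, "click_through_rate": 0.10,
     "matches_search_history": 1, "matches_user_region": 0},
]

weights_a = {"links": 1.0, "ctr": 500.0, "personal": 30.0, "geo": 10.0}
ranked_a = sorted(pages, key=lambda p: relevance_score(p, weights_a),
                  reverse=True)

# An equally plausible weighting reverses the order entirely.
weights_b = {"links": 0.1, "ctr": 500.0, "personal": 30.0, "geo": 10.0}
ranked_b = sorted(pages, key=lambda p: relevance_score(p, weights_b),
                  reverse=True)
```

Under `weights_a` the link-rich page comes first; under `weights_b` the personalised page does. Neither ordering is visible to, or contestable by, the user, which is the transparency concern the guideline addresses.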

II Rights of users to control their data

8. As regards the data held on the search history of their users, search engines should respect the rights of users to access and, where appropriate, to correct or delete information held about them. These rights apply foremost to the data from authenticated users stored by search engines, including personal profiles. However, these rights also apply to non-registered users.

    In this context, search engine providers should apply their technological innovation capacity to find a meaningful solution to grant access to the search history held on non-registered users.

    Search engine providers should further develop tools that allow registered users to gain access to, and correct and delete data that have been collected in the course of the use of services, including a possible profile created for example for direct marketing purposes.

III Data minimisation

9. If personal data are stored, the retention period should be no longer than necessary for the specific purposes of the processing. As personal data could be deleted after the end of a search session, continued storage needs an adequate justification. For each purpose, a limited retention time should be defined. Moreover, the set of personal data to be retained should not be excessive in relation to each purpose. Following opinion WP148 of the EU data protection authorities, in any case the maximum retention period should not exceed six months.

    Given the sensitivity of search behaviour, search engine providers should determine adequate, non-excessive retention periods and irreversibly delete personal data once the limited retention period has expired.
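A retention rule along the lines of paragraph 9 can be sketched minimally as follows (the six-month ceiling follows opinion WP148; the record fields and the 183-day approximation of six months are illustrative assumptions): every stored query carries a timestamp, and anything older than the retention limit is dropped.

```python
# Minimal retention-policy sketch for paragraph 9. Field names are
# hypothetical; the six-month ceiling follows opinion WP148.
from datetime import datetime, timedelta

MAX_RETENTION = timedelta(days=183)  # roughly six months

def purge_expired(records, now):
    """Keep only records younger than the retention limit.

    In a real system the expired records would have to be irreversibly
    deleted or anonymised, not merely filtered from a list in memory.
    """
    return [r for r in records if now - r["stored_at"] < MAX_RETENTION]

now = datetime(2010, 10, 5)
records = [
    {"query": "q1", "stored_at": datetime(2010, 9, 1)},  # recent: kept
    {"query": "q2", "stored_at": datetime(2010, 1, 1)},  # stale: purged
]
remaining = purge_expired(records, now)
```

As the guideline stresses, the deletion must be irreversible: a filter like the one above is only meaningful if the expired records are actually destroyed on the underlying storage as well.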

IV Censorship

10. To ensure freedom of information the user should be able to access Web content without censorship or restrictions. With regard to systematic nationwide filtering or blocking at the request of public authorities, search engine providers should strive for transparency and foreseeability by the public. With regard to individual filtering requests by private parties and individuals, search engine providers must adhere to the principle of due process and provide access to redress mechanisms.

    Search engine providers must promote transparency about systematic nationwide blocking or filtering of certain types of content, adhere to the principle of due process when removing specific search results from their index, and provide access to redress mechanisms.

    Search engine providers should offer and continue to develop adequate individual filtering tools to allow their users to protect themselves, or in a family situation their children, against specific kinds of content, as well as filters that block or warn users against sites that apparently spread viruses, spyware and other kinds of malware.
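The distinction the guidelines draw between user-chosen content filters and malware warnings can be sketched as follows (the category labels, hostnames and malware list are invented examples; a real implementation would draw on a maintained safe-browsing feed): content matching a category the user has opted to block is removed, while suspected malware sites are flagged with a warning rather than hidden.

```python
# Illustrative per-user filtering sketch for paragraph 10. Category
# labels, hostnames and the malware list are invented examples.

MALWARE_SITES = {"badsite.example"}  # e.g. fed by a safe-browsing feed

def filter_results(results, blocked_categories):
    """Apply a user-chosen category filter and flag suspected malware."""
    out = []
    for r in results:
        if r["category"] in blocked_categories:
            continue                               # dropped by the user's own filter
        r = dict(r)
        r["warning"] = r["host"] in MALWARE_SITES  # warn rather than hide
        out.append(r)
    return out

results = [
    {"host": "ok.example",      "category": "news"},
    {"host": "badsite.example", "category": "news"},
    {"host": "adult.example",   "category": "adult"},
]
filtered = filter_results(results, blocked_categories={"adult"})
```

The design choice matters for freedom of information: the category filter is applied only because the individual user chose it, and the malware flag warns without suppressing the result, in line with the guideline’s emphasis on user control rather than provider-imposed censorship.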

V Co- and self-regulation

11. Efforts at co- and self-regulatory adherence to human rights standards have been made at a national level, for example in Germany and France, and on a global scale, for example, by the Global Network Initiative.

    In order for co- and self-regulation to produce meaningful results, search engine providers should adopt adequate sanctioning mechanisms.