Strasbourg, 7 November 2011
COMMITTEE OF EXPERTS ON NEW MEDIA
20-21 September 2011
Draft Recommendation of the Committee of Ministers to member states on the
protection of human rights with regard to search engines
SEARCH ENGINES PLAY A PIVOTAL ROLE IN THE INFORMATION SOCIETY
1. Search engines enable a worldwide public to seek, impart and receive information, ideas and other content, in particular to acquire knowledge, engage in debate and participate in democratic processes.
2. Recommendation CM/Rec(2007)16 of the Committee of Ministers to member states on measures to promote the public service value of the Internet underlines the importance of access to information on the Internet and stresses that the Internet and other ICT services have high public service value in that they serve to promote the exercise and enjoyment of human rights and fundamental freedoms for all who use them. The Committee of Ministers is convinced of the importance of search engines for the realisation of the value of the Internet and the World Wide Web for the public and considers it essential that search engines are allowed to freely crawl and index the information that is openly available on the Web and intended for mass outreach.
3. Suitable regulatory frameworks should give an adequate response to legitimate concerns regarding references by search engines to content created by others. Further consideration is needed of the extent and modalities of application of national legislation, including copyright law, to search engines, as well as of related legal remedies.
HUMAN RIGHTS AND FUNDAMENTAL FREEDOMS CAN BE CHALLENGED BY THE OPERATION OF SEARCH ENGINES
4. Search engines’ activity needs to take due account of fundamental rights, as the operation of search engines may bear on freedom of expression and, even more so, on the right to seek, impart and receive information, as well as on the right to private life and the protection of personal data. Such challenges may stem inter alia from the design of algorithms, de-indexing, partial treatment or biased results, market concentration, and a lack of transparency about the processes of selecting and ranking results.
5. The impact on private life may result from the pervasiveness of search engines or their ability to penetrate and index content which, while in the public space, was not intended for mass communication (or mass communication in aggregate), and from data processing generally and data retention periods. Moreover, search engines generate new kinds of personal data, such as individual search histories and behaviour profiles.
6. There is a need to protect and promote the values and merits of access, diversity, impartial treatment, security and transparency in the context of search engines. Media literacy and the acquisition of skills that enable users to have informed access to the greatest possible variety of information, content and services should be adapted having regard to Recommendation CM/Rec(2011)7 on a new notion of media.
7. The Committee of Ministers therefore, under the terms of Article 15.b of the Statute of the Council of Europe, recommends that member states, in co-operation with private sector actors and civil society, develop and promote coherent strategies to protect freedom of expression, access to information and other human rights and fundamental freedoms in relation to search engines, in line with the European Convention on Human Rights, especially Article 8 (Right to respect for private and family life) and Article 10 (Freedom of expression), and with the Convention for the Protection of Individuals with regard to Automatic Processing of Personal Data (ETS No. 108), in particular by:
FOSTERING TRANSPARENCY ABOUT THE WAY IN WHICH ACCESS TO INFORMATION IS PROVIDED, in order to ensure access to, and pluralism and diversity of, information and services, in particular regarding the criteria according to which search results are selected, ranked or removed.
REVIEWING SEARCH RANKING AND INDEXING OF CONTENT which, while in the public space, is not intended for mass communication (or for mass communication in aggregate). This could include ranking such content sufficiently low in search results so as to strike a balance between the accessibility of the content in question and the intentions or wishes of its producer (e.g. attaching different consequences to a distinction between content merely available in a public space and content published, i.e. content seeking broad dissemination). Default settings should be conceived with this objective in mind.
ENCOURAGING TRANSPARENCY IN THE COLLECTION OF PERSONAL DATA and the legitimate purposes for which they are being processed.
ENABLING USERS TO EASILY ACCESS, and, where appropriate, to correct or delete, their personal data processed by search engine providers.
DEVELOPING TOOLS TO MINIMISE THE COLLECTION AND PROCESSING OF PERSONAL DATA, including enforcing limited retention periods, adequate irreversible anonymisation as well as tools for the deletion of data.
ENSURING THAT SUITABLE LEGAL SAFEGUARDS ARE IN PLACE FOR ACCESS TO USERS’ PERSONAL DATA BY ANY PUBLIC OR PRIVATE ENTITY, thus securing the full enjoyment of the rights and freedoms enshrined in the European Convention on Human Rights (ETS No. 5).
ENCOURAGING SEARCH ENGINE PROVIDERS TO DISCARD SEARCH RESULTS ONLY WHERE OTHER RIGHTS OR INTERESTS OUTWEIGH THE RIGHT TO FREEDOM OF EXPRESSION. In this event, the user should be informed about the origin of the request to discard the results, subject to respect for the right to private life and the protection of personal data.
PROMOTING MEDIA LITERACY WITH REGARD TO THE FUNCTIONING OF SEARCH ENGINES, in particular on the processes of selecting, ranking and prioritising of search results and on the implications of the use of search engines on users’ right to private life and personal data.
CONSIDERING OFFERING USERS A CHOICE OF SEARCH ENGINES, in particular with regard to search outputs based on public value criteria.
PROMOTING TRANSPARENT CO- AND SELF-REGULATORY MECHANISMS FOR SEARCH, in particular with regard to the accessibility of illegal and harmful content and Council of Europe standards on freedom of expression and due process rights.
TAKING MEASURES with regard to search engines in line with the principles set out in the Appendix to this Recommendation.
BRINGING THIS RECOMMENDATION and its Appendix to the attention of all relevant public authorities and private actors.
Appendix to the Recommendation
I. HELPING THE PUBLIC MAKE INFORMED CHOICES WHEN THEY SEARCH
Context and challenges:
1. Search engines play a crucial role as one of the first points of contact in exercising the right to seek and access information, opinions, facts and ideas, as well as other content, including entertainment, on the global internet. Such access to information is essential to form one's personal opinion and to participate in social, political, cultural and economic life. Search engines are also an important portal for citizens' access to the mass media, including electronic newspapers and audiovisual media services.
2. There is concern that users are prone to using a very limited number of dominant search engines. This may pose challenges as regards access to, and diversity of, the sources of information, especially considering that the ranking of information by search engines is neither exhaustive nor neutral. In this regard, certain types of content or services may be unduly favoured.
3. The process of searching for information is strongly influenced by the way that information is arranged; this includes the selection and ranking of search results and, as applicable, the de-indexing of content. Most search engines provide very little or only general information about these matters, in particular about the way values are used to qualify a given result as the ‘best’ answer to a particular query.
4. While recognising that full disclosure of business models and methods or business-related decisions may not be appropriate because algorithms are highly relevant for competition and related information might also result in increased vulnerability of search engine services to abuse of their services (e.g. in the form of search manipulation), member states, in cooperation with the private sector and civil society, should:
PROMOTE GREATER TRANSPARENCY AS REGARDS GENERAL CRITERIA AND PROCESSES APPLIED TO THE SELECTION AND RANKING OF RESULTS. This should include information about search bias, such as in presenting results based on apparent geographic location or on earlier searches.
ENCOURAGE SEARCH ENGINE PROVIDERS TO CLEARLY DIFFERENTIATE BETWEEN SEARCH RESULTS AND ANY FORM OF COMMERCIAL COMMUNICATION, advertisement or sponsored output, including “own content” offers.
PROMOTE RESEARCH ABOUT THE DYNAMIC SEARCH ENGINE MARKET, to address issues including the public value dimension of search engine services, the increasing concentration of the search engine market, and the risk of abuse and manipulation of search results.
II. RIGHT TO PRIVATE LIFE AND THE PROTECTION OF PERSONAL DATA
Context and challenges:
5. Search engines process large amounts of personal data about the search behaviour of individuals, varying from cookies and IP addresses to individual search histories, as highlighted by a number of relevant texts already adopted at both European and international level.
6. An individual's search history contains a footprint which may reveal the person's beliefs, interests, relations or intentions. Individual search histories may also disclose sensitive data (revealing racial origin, political opinions or religious or other beliefs, or being related to health, sexual life, or criminal convictions) that deserves special protection under Article 6 of Convention 108.
7. The processing of personal data by search engines acquires an additional dimension due to the proliferation of audiovisual data (digital images, audio and video content) and the increasing popularity of mobile internet access. Specialised search engines aiming at finding information on individuals, location based services, the inclusion of user-generated images into general purpose search indexes and increasingly accurate face recognition technologies are some of the developments that raise concerns about the future impact of search engines on fundamental rights such as the right to private life, and its potential bearing on the exercise of freedom of expression or the right to seek, impart and receive information of one’s choice.
8. By combining different kinds of information on an individual, search engines create an image of a person that does not necessarily correspond to reality or to the image that a person would want to give of her or himself. The combination of search results creates a much higher risk for that person than if all the data published on the Internet remained separate. Even long forgotten personal data can resurface as a result of the operation of search engines. As an element of media literacy, users should be informed about their right to remove incorrect or excessive personal data from original web pages, with due respect for the right to information. Search engines should promptly respond to users’ requests to delete their personal data from (extracts of) copies of web pages that search engine providers may still store (“cache” and “snippets”) after the original content has been deleted.
9. Overall, it is vital to ensure compliance with the applicable privacy and data protection principles, starting from Article 8 of the European Convention on Human Rights and Article 9 of Convention 108 that foresee strict conditions to ensure that individuals are protected from unlawful interference in their private life and abusive processing of their personal data.
10. Member states (through the designated authorities) should enforce compliance with the applicable data protection principles, in particular:
ENSURING THAT REQUESTS FROM LAW ENFORCEMENT AUTHORITIES to search engine providers for users’ data are based on appropriate legal procedures and do not represent an undue burden for the providers in question.
ENSURING THAT THE COLLECTION OF PERSONAL DATA BY SEARCH ENGINE PROVIDERS IS MINIMISED. No user's IP address should be stored where it is not necessary for the legitimate purpose pursued and where the same results can be achieved by sampling or surveying, or by anonymising personal data. Innovative approaches promoting anonymous searches should also be encouraged.
ENSURING THAT RETENTION PERIODS ARE NO LONGER THAN STRICTLY NECESSARY for the legitimate and specified purposes of the processing. Search engine providers should be in a position to justify, with demonstrable reasons, the collection and the retention of personal data. Information in this connection should be publicly available and easily accessible.
ENSURING THAT SEARCH ENGINE PROVIDERS APPLY THE MOST APPROPRIATE SECURITY MEASURES to protect personal data against unlawful access by third parties. Such measures should include end-to-end encryption of the communication between the user and the search engine provider.
ENSURING THAT THE CROSS-CORRELATION OF DATA originating from different services/platforms belonging to the search engine provider is performed only if consent has been granted by the user for that specific service. The same applies to user profile enrichment exercises, as also stated in Recommendation CM/Rec(2010)13 on the protection of individuals with regard to automatic processing of personal data in the context of profiling. Search engines must clearly inform users upfront of all intended uses of their data (underlining that the initial purpose of such processing is to better respond to their search requests) and respect users’ rights with respect to their personal data.
ENCOURAGING SEARCH ENGINE PROVIDERS TO FURTHER DEVELOP TOOLS that allow users to gain access to, and correct and delete, data related to themselves that have been collected in the course of the use of services, including any profile created for example for direct marketing purposes.
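The data minimisation and anonymisation measures listed above can be illustrated with a minimal technical sketch. Assuming a provider logs client IP addresses, one common anonymisation practice is to truncate each address to a network prefix so that it no longer identifies a single user; the function name and the prefix lengths chosen here (/24 for IPv4, /48 for IPv6) are illustrative assumptions, not requirements of this Recommendation:

```python
import ipaddress

def anonymise_ip(addr: str) -> str:
    """Truncate an IP address to a network prefix so it no longer
    identifies an individual user (IPv4: keep /24, IPv6: keep /48)."""
    ip = ipaddress.ip_address(addr)
    prefix = 24 if ip.version == 4 else 48
    # strict=False lets us derive the network from a host address.
    network = ipaddress.ip_network(f"{addr}/{prefix}", strict=False)
    return str(network.network_address)

print(anonymise_ip("203.0.113.42"))         # 203.0.113.0
print(anonymise_ip("2001:db8:abcd:12::1"))  # 2001:db8:abcd::
```

Truncation of this kind is lossy and irreversible at the level of the stored record, which is why it is often paired with the limited retention periods the Recommendation calls for.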
III. FILTERING AND DE-INDEXING
Context and challenges:
11. A prerequisite for the existence of effective search engines is the freedom to crawl and index the information available on the Web. The filtering and blocking of internet content by search engine providers entails the risk of violating Article 10 of the European Convention on Human Rights with respect to the rights of providers and readers to distribute and access information.
12. Search engine providers should not be obliged to proactively monitor their services in order to detect possibly illegal web content. There may, however, be legitimate requests to search engine providers to remove specific web sources from their index, for example in cases where other rights outweigh the right to freedom of expression and information; the right to information cannot be understood as extending access to content beyond the intention of the person exercising her or his freedom of expression.
13. In many countries, search engine providers de-index or filter specific websites at the request of public authorities, to comply with legal obligations, or at their own initiative, for example in cases unrelated to the content of websites but rather to technical dangers, such as malware. Any such de-indexing or filtering should be narrowly tailored and reviewed regularly. In many other cases, requests for the de-indexing or filtering of specific web sources are filed by private parties and individuals.
14. Member states should:
ENSURE THAT ANY LAW, POLICY OR INDIVIDUAL REQUEST ON DE-INDEXING OR FILTERING is done with full respect of the right to freedom of expression and the right to seek, impart and receive information. The principles of due process and access to independent and accountable redress mechanisms should also be respected in this context.
ENSURE THAT ANY NECESSARY FILTERING OR BLOCKING IS TRANSPARENT TO THE USER. The blocking of all search results for certain keywords should not be included or promoted in co- and self-regulatory frameworks for search engines. As regards content that has been defined in a democratic process as harmful for certain categories of users, member states should avoid general de-indexation that renders such content inaccessible to other categories of users. In many cases, encouraging search engines to offer adequate voluntary individual filter mechanisms may suffice to protect those groups.
EXPLORE THE POSSIBILITY TO ALLOW DE-INDEXATION OF CONTENT which, while in the public space, was not intended for mass communication (or mass communication in aggregate).
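One long-established technical signal of a publisher's intentions regarding mass dissemination is the Robots Exclusion Protocol (robots.txt), which crawlers consult before indexing. As an illustrative sketch (the bot name, site and paths are hypothetical), a crawler honouring such a signal can be modelled with Python's standard library parser:

```python
from urllib import robotparser

# Hypothetical robots.txt a crawler might retrieve from a site whose
# owner wants some public content kept out of search indexes.
ROBOTS_TXT = """\
User-agent: *
Disallow: /private/
Disallow: /family-photos/
"""

rp = robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# The crawler checks each URL against the publisher's stated wishes
# before fetching and indexing it.
print(rp.can_fetch("ExampleBot", "https://example.org/news/article"))        # True
print(rp.can_fetch("ExampleBot", "https://example.org/family-photos/1.jpg")) # False
```

Mechanisms of this kind let content remain publicly accessible to direct visitors while staying outside mass-dissemination channels, which is the balance the provision above contemplates.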
IV. SELF AND CO-REGULATION
Context and challenges:
15. Self-regulatory initiatives by search engine providers aiming at protecting individuals’ fundamental rights should be welcomed. It is important to recall that all co- and self-regulation may amount to interference with the rights of others and should therefore be transparent, independent, accountable and effective, in line with Article 10 of the European Convention on Human Rights. A productive interaction between different stakeholders, such as state actors, private actors and civil society, can significantly contribute to the setting of standards protecting human rights.
16. Member states should:
ENSURE THAT CO-REGULATORY ARRANGEMENTS with the respective industry meet the requirements of the European Convention on Human Rights, in particular the right to due process.
WORK WITH THE INDUSTRY TO MAKE SURE that all self-regulatory arrangements meet the requirements of the European Convention on Human Rights, in particular the right to due process.
V. MEDIA LITERACY
Context and challenges:
17. Users should be informed and educated about the functioning of different search engines (search engine literacy) in order to make informed choices about the sources of information provided, in particular understanding that a high-ranking search result does not necessarily reflect the importance, relevance or trustworthiness of the source. As search engines play an increasingly important role with regard to the accessibility of media and information online, media and information literacy strategies should be adapted accordingly. Users should be made aware of the implications of the use of search engines, both with regard to personalised search results and with regard to the impact on their image and reputation of combined search results about them, as well as of the available tools for exercising their rights.
18. Member states should ensure that:
SEARCH ENGINE LITERACY BECOMES PART OF ANY NATIONAL MEDIA LITERACY strategy, including formal and informal national education strategies.
MEDIA LITERACY EMPOWERS users to manage their online identity.