Strasbourg, 15 September 2011

MC-NM(2011)14_en

COMMITTEE OF EXPERTS ON NEW MEDIA

(MC-NM)

______

5th Meeting
20-21 September 2011
Agora Building
Room G05


______

Measures to protect and promote respect for human rights with regard to search engines [1]

Document prepared by the Secretariat
______

DRAFT RECOMMENDATION OF THE COMMITTEE OF MINISTERS TO MEMBER STATES

1. Search engines play a central role as intermediaries in the information society by enabling a worldwide public to seek, impart and receive information and ideas, in particular to gain access to knowledge and expressions, engage in debate and participate in a democratic society.

2. Recommendation CM/Rec(2007)16 of the Committee of Ministers to member states on measures to promote the public service value of the Internet underlines the importance of access to information on the Internet and stresses that the Internet and other ICT services have high public service value in that they serve to promote the exercise and enjoyment of human rights and fundamental freedoms for all who use them. The Committee of Ministers is convinced of the importance of search engines for the realisation of the value of the Internet and the World Wide Web for the public and considers it important that search engines are allowed to freely index the information that is openly available on the Web.

3. This activity needs to take due account of fundamental rights as the operation of search engines may challenge the right to freedom of expression and information and the right to private life and protection of personal data, and possibly other human rights and fundamental freedoms. This may stem inter alia from the design of algorithms, blocking and discrimination of content, market concentration and lack of transparency about both the process of selecting and ranking results and about data processing and data retention periods. It is a fact that search engines generate new kinds of personal data, such as individual search histories. It is important that member states adopt strong legal safeguards for access to this information by both public entities and legally entitled private parties.

4. There is a need to protect and promote the values of access, diversity, security and transparency in the context of search engines. It is equally important to foster media literacy and the acquisition of skills that enable users to have access to the greatest possible variety of information and services.

5. In certain member states, co- and self-regulatory mechanisms have been set up to regulate the accessibility of illegal and harmful content through search engines.

6. The Committee of Ministers therefore recommends that member states, in co-operation with private sector actors and civil society, develop and promote coherent strategies to protect freedom of expression, access to information and other human rights and fundamental freedoms in relation to search engines in line with the European Convention on Human Rights (ETS No. 5), especially Article 8 (Right to respect for private and family life) and Article 10 (Freedom of expression) and with the Convention for the Protection of Individuals with regard to Automatic Processing of Personal Data (ETS No. 108), in particular by:

– fostering transparency about the way in which access to information is provided - in particular according to which criteria search results are selected, ranked or prioritised as well as whether certain search results have been removed - in order to ensure access to and pluralism and diversity of information and services;

– encouraging transparency about the way in which personal data are being collected and the legitimate purposes for which they are being processed;

– promoting the further development of tools to minimise collection and processing of personal data, including enforcing limited retention periods, adequate irreversible anonymisation as well as tools for the deletion of data;

– allowing users to easily access and, where appropriate, to correct or delete data collected by the search engine providers from and about them;

– ensuring that the principle of due process is adhered to when search results are removed from search indexes, and that access to redress mechanisms is provided, regardless of the origin of removal requests (governmental, co-regulatory or private);

– ensuring that the principle of due process is also adhered to before disclosure of individuals’ search records to both public entities and legally entitled private parties;

– promoting the development of specific knowledge in the field of media literacy about the functioning of search engines, in particular on the processes of selecting, ranking and prioritising of search results and on the implications that the use of search engines has on users’ right to private life and personal data;

– taking measures with regard to search engines in line with the principles set out in the appendix to this recommendation;

– bringing this recommendation and its appended principles to the attention of all relevant public authorities and private actors.

Appendix to the Recommendation

PRINCIPLES

I. Selection and ranking of information

1. Search engines play a crucial role as a first point of contact to freely seek and access information, opinions, facts and ideas on the global Internet. Such free access to information is essential to form one's own opinion and to participate in social, political, cultural and economic life.

2. The process of obtaining information is strongly influenced by the arrangement of the information, the selecting and ranking of search results and the blocking of content. Most search engines provide very little or only general information about the way results are selected and ranked, and about the values used to qualify a given result as the ‘best’ answer to a particular query. To enhance users’ ability to freely exercise and enjoy the right to freedom of expression and information, search engine providers should be transparent about the general criteria they apply to select and rank results, and also indicate individual bias, such as presenting results based on apparent geographic location or on earlier searches.
With respect to the display of results, it is important that the graphical presentation of content on the user’s screen clearly differentiates between search results and any commercial advertisements.

3. While recognising that full disclosure of business methods may not be appropriate, given that the precise algorithms used may have a high relevance for competition and that disclosure might also result in increased vulnerability of search engine services to abuse (search manipulation), member states, in cooperation with the private sector and civil society, are encouraged to:

      a. ensure transparency about the process of selecting and ranking results to allow the public to make informed decisions about their use of search engines.

      b. promote ongoing research into the dynamic search engine market, to address issues such as whether and to what extent search results are subject to manipulation or undue influence.

II. Transparency on ownership and the challenge of concentration

4. There is concern that concentration in the search engine market could challenge access to a diversity of information, in particular if one considers that the display and ranking of information by search engines is not exhaustive or neutral.

5. The general dependence on a small number of well-known search engines increases the concern that major search engines may be in a position to abuse their power.

6. Member states should:

    promote ongoing research into the dynamic search engine market, to address issues such as the increasing concentration of the search engine market and the extent to which this leads or might lead to abuse of market power.

III. Right to private life and the protection of personal data

7. Search engines process large amounts of personal data about the search behaviour of individuals, varying from cookies and IP addresses to individual search histories, as highlighted by a number of relevant texts already adopted at both European and international level [2]. An individual's search history contains a footprint which may include the person's beliefs, interests, relations and intentions. Individual search histories may also disclose sensitive data (revealing racial origin, political opinions or religious or other beliefs, or being related to health, sexual life, or criminal convictions) that deserve special protection under Article 6 of Convention 108. The processing of personal data by search engines is becoming even more crucial given the proliferation of audiovisual data (digital images, audio and video content) and the increasing popularity of mobile internet access. Specialised search engines aiming at finding information on individuals, location based services, the inclusion of user-generated images into general purpose search indexes and increasingly accurate face recognition technologies are some of the developments that raise concerns about the future impact of search engines on fundamental rights such as the right to private life and freedom of expression.

8. Consequently, it is vital to ensure compliance with the applicable privacy and data protection principles, starting from Article 8 of the European Convention on Human Rights and Article 9 of Convention 108 that foresee strict conditions to ensure that individuals are protected from unlawful interference in their private life and abusive processing of their personal data. Bearing this in mind, search engines should be in a position to respond to requests from law enforcement authorities for available users’ data on the basis of appropriate legal procedures.

9. The collection of personal data by search engine providers must be minimised. No user's IP address should be stored where it is not necessary for the legitimate purpose pursued and where the same results can be achieved by sampling or polling or by anonymising personal data.
Innovative approaches promoting anonymous searches should also be encouraged.
Search engine providers must delete or irreversibly anonymise personal data once they no longer serve the specified and legitimate purpose for which they were collected. The retention period should therefore be no longer than what is strictly necessary for the purposes of the processing and should be based on adequate justification.
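By way of illustration only, the following sketch (in Python, using hypothetical names such as SearchRecord and RETENTION_DAYS) shows one possible way of anonymising stored IP addresses and purging records once a defined retention period has expired; it is a simplified example of the kind of tool referred to above, not a prescribed implementation.

from dataclasses import dataclass
from datetime import datetime, timedelta
import ipaddress

RETENTION_DAYS = 90  # assumed limit; the actual period must be justified per purpose

@dataclass
class SearchRecord:          # hypothetical record of one query
    query: str
    client_ip: str
    timestamp: datetime

def anonymise_ip(ip: str) -> str:
    """Zero the host part so that the stored address no longer identifies a user."""
    addr = ipaddress.ip_address(ip)
    prefix = 24 if addr.version == 4 else 48
    return str(ipaddress.ip_network(f"{ip}/{prefix}", strict=False).network_address)

def purge_expired(records: list[SearchRecord]) -> list[SearchRecord]:
    """Irreversibly drop records older than the defined retention period."""
    cutoff = datetime.utcnow() - timedelta(days=RETENTION_DAYS)
    return [r for r in records if r.timestamp >= cutoff]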

10. It is key that search engine providers apply the most appropriate security measures to protect personal data against unlawful access by third parties. Such measures should include site-wide encryption of the connection to protect against eavesdropping.
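As a simplified illustration of site-wide encryption of the connection, the following sketch assumes a Flask-based web front end (an assumption made only for the example) and shows a common pattern: redirecting plain HTTP requests to HTTPS and sending an HSTS header so that browsers continue to use encrypted connections.

from flask import Flask, redirect, request

app = Flask(__name__)

@app.before_request
def force_https():
    # Redirect any unencrypted request to its HTTPS equivalent.
    if not request.is_secure:
        return redirect(request.url.replace("http://", "https://", 1), code=301)

@app.after_request
def add_hsts(response):
    # Ask browsers to keep using encrypted connections for one year.
    response.headers["Strict-Transport-Security"] = "max-age=31536000; includeSubDomains"
    return response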

11. Cross-correlation of data originating from different services/platforms belonging to the search engine provider may only be performed if consent has been granted by the user for that specific service. The same applies to user profile enrichment exercises, as also stated in Recommendation CM/Rec(2010)13 on the protection of individuals with regard to automatic processing of personal data in the context of profiling. Search engines must clearly inform users upfront of all intended uses of their data (underlining that the initial purpose of such processing is to better respond to their search requests) and respect users’ rights to readily access, correct or delete their personal data.

12. Consideration should be given to the fact that, by combining different kinds of information on an individual, search engines create a profile that does not necessarily correspond to reality. The combination creates a much higher risk for that person than if all the data published on the Internet remained separate. Even long-forgotten personal data can be brought back to the surface by search engines. Adequate tools should be explored in order to take into account the person’s interest in being fairly depicted and to protect the right to private life with due respect for the right to information. Users should have the possibility to have their data deleted from copies of web pages that search engine providers may still store (“cache”) after the original content has been deleted.

13. Member states (through the designated authorities) should:

    i. enforce compliance with the applicable data protection principles, in particular:

      a. ensuring that requests from law enforcement authorities to search engine providers for users’ data are based on appropriate legal procedures and only concern available data;

      b. ensuring that the retention period is no longer than what is strictly necessary for the legitimate purposes of the processing;

      c. encouraging search engine providers to further develop tools that allow users to gain access to, and correct and delete data related to them that have been collected in the course of the use of services, including a possible profile created for example for direct marketing purposes.

IV. Filtering and blocking

    14. A prerequisite for the existence of effective search engines is the freedom to crawl the information available on the Web. The filtering and blocking of internet content by search engine providers entails a risk of violation of Article 10 of the European Convention on Human Rights in respect of the rights of providers and readers to distribute and access information. Preferably, filtering or blocking should therefore take place at the end points of the network, at the request of users. Search engine providers should not be obliged to proactively monitor their services in order to detect possibly illegal web content. There may, however, be legitimate requests to search engine providers to remove specific web sources from their index, for example in cases where other rights outweigh the right to freedom of expression and information. In many countries, search engine providers block or filter specific websites at the request of public authorities, to comply with legal obligations, or at their own initiative, for example in the case of websites spreading spyware. Any such blocking or filtering should be narrowly tailored and reviewed regularly.

    15. In many other cases, requests for the blocking or filtering of specific web sources are filed by private parties and individuals. It is important that any law, policy or individual request for blocking or filtering is handled with full respect for the right to freedom of expression and the right to seek information. The principles of due process and access to independent and accountable redress mechanisms should also be respected in this context. Member states should:

i. ensure the freedom of search engines to crawl the information available on the Web and ensure that possible legislation on mandatory filtering and blocking of content by general purpose search engines is in accordance with Recommendation CM/Rec(2008)6 of the Committee of Ministers to member states on measures to promote the respect for freedom of expression and information with regard to Internet filters and its guidelines;

ii. guarantee that blocking or filtering mechanisms, in particular nationwide general blocking or filtering measures, are only introduced by the state if the conditions of Article 10, paragraph 2, of the European Convention on Human Rights are fulfilled. Any such filtering or blocking should be transparent to the user. Member states should avoid blocking, for users who are not part of the group that a filter has been activated to protect, content that has been defined in a democratic process as harmful. In many cases, encouraging search engines to offer adequate voluntary individual filter mechanisms may suffice to protect those groups.

V. Self- and co-regulation

    16. Self-regulatory initiatives by search engine providers aiming at protecting individuals’ fundamental rights should be welcomed. It is important to recall that all co- and self-regulation, as a form of interference, should be transparent, independent, accountable and effective. A productive interaction between different stakeholders, such as state actors, private actors and civil society, can significantly contribute to the setting up of standards protecting human rights.

    17. Member states should:

    ensure that all self-regulatory arrangements meet the minimum requirements of the European Convention on Human Rights, in particular the right to due process. Complaints mechanisms have to be transparent, effective, independent and accountable.

VI. Media literacy

    18. Users must be educated and made aware of the functioning of different search engines (search engine literacy) in order to make informed choices about the sources of information provided, and in particular to understand that a high ranking does not necessarily reflect the importance, relevance or trustworthiness of the source. As search engines play an increasingly important role with regard to the accessibility of media and information online, media and information literacy strategies should be adapted accordingly. Users should be made aware of the implications of the use of search engines for their personal data and of the available tools to exercise their rights.

    19. Member states should ensure that:

i. search engine literacy becomes part of the national media literacy curricula;

ii. media literacy is considered a priority for national education strategies both in formal and informal education.

GUIDELINES FOR SEARCH ENGINE PROVIDERS

I. Selection and ranking of information

1. There is a delicate balance between the necessity for search engines to protect their business methods and to protect the service against abuse by, for example, search spammers and malevolent distributors of spyware on the one hand and the importance of being transparent about the process of selecting and ranking search engine results on the other hand. Lack of transparency about the selection and ranking of search results, the possible blocking or filtering of specific types of content and the lack of knowledge by the public about the functioning of search engines pose an equally significant challenge to the right to freely seek and access information.

2. Full disclosure of business methods may not be possible, given that the precise algorithms used may have a high relevance for competition. Full disclosure about the ranking of results could also result in increased vulnerability of search engine services to abuse of their services. For example, undue influence could be exercised by malevolent distributors of spyware, but also search spammers and third parties with a legitimate commercial interest using search engine optimisation techniques.

3. By ranking results, search engines impose particular sets of values as regards the relevance and quality of information for the public. Commonly, the relevance of results is determined by a combination of the number of hyperlinks pointing to particular websites and the previous click-through rates of presented results; individual search histories and geographical indications of the adequacy of results are often also used for this purpose (a simplified illustration follows below). The right of access to information may be challenged by opaque prioritisation and by the use of biased criteria in search results.

      As a vital service for the information society, search engine providers should be transparent about the general criteria they apply to select and rank results, and also indicate individual bias, such as presenting results based on apparent geographic location or on earlier searches.
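The following simplified sketch (in Python, with hypothetical names and weights) illustrates the kind of scoring described in paragraph 3, combining inbound links and click-through rates and flagging a personalisation bias that should be indicated to the user; real ranking systems are far more complex.

from dataclasses import dataclass

@dataclass
class Candidate:                 # hypothetical search result candidate
    url: str
    inbound_links: int           # number of hyperlinks pointing to the page
    click_through_rate: float    # historical click-through rate, between 0 and 1
    matches_user_locale: bool    # example of an individual bias signal

def score(c: Candidate, personalised: bool) -> float:
    # Assumed weights for link popularity and click-through rate.
    base = 0.7 * min(c.inbound_links / 1000, 1.0) + 0.3 * c.click_through_rate
    if personalised and c.matches_user_locale:
        base *= 1.2              # personalisation bias; should be indicated to the user
    return base

def rank(candidates: list[Candidate], personalised: bool) -> list[Candidate]:
    return sorted(candidates, key=lambda c: score(c, personalised), reverse=True)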

II. Right to private life and the protection of personal data

4. There is an equally delicate balance between the commercial need for search engines to optimise results and thus generate advertising revenues and the necessity to respect the right to private life and to protect personal data. In order to properly target advertisements, search engines try to gain as much insight as possible into the characteristics and context of each individual query. An individual’s search history contains a footprint of that person’s beliefs, interests, relations and intentions and can include the most intimate details of a person’s life. The treatment of personal data by search engines is becoming even more crucial given the proliferation of audiovisual data (digital images, audio and video content) and the increasing popularity of mobile internet access. Specialised search engines aiming at finding information on individuals, location based services, the inclusion of user-generated images into general purpose search indexes and increasingly accurate face recognition technologies are some of the developments that raise deep concerns about the future impact of search engines on human rights such as the right to private life and freedom of expression. All these data can be used for different purposes, notably commercial purposes, and may also be requested by law enforcement authorities or national security services. Internet users should have confidence that search engines will respect their right to private life and that their data are processed according to Article 8 of the European Convention on Human Rights and to the Convention for the Protection of Individuals with regard to Automatic Processing of Personal Data (ETS No. 108).

      Search engine providers should ensure the implementation of privacy and data protection principles, in particular affording adequate protection to personally identifiable search records against illegitimate access.

5. Most internet users are unaware of the extent of the collection of personal data and of the purposes for which these data are being used. If they are not aware of this processing, they are unable to make informed decisions about it. They require accurate, easily accessible and easily comprehensible information, adapted to different age groups and levels of education.

      Search engine providers should give the user a comprehensive overview of the processing of personal data they carry out. It is key for search engine providers to explore innovative ways of presenting this information to the public, beyond the standard terms and conditions of the service.

6. The collection and processing of large amounts of personal data for direct marketing purposes, for example in the form of behavioural targeting, poses a significant challenge to the right to private life. Allowing the public to make an informed decision about such use is best guaranteed by asking for specific consent, instead of offering an opt-out. In practice, there is no easy strategy for users to opt out: if they manage, for example, to delete all cookies from their computer, they will also delete the ‘opt-out cookies’ that would register their opt-out from behavioural targeting by the different advertising networks or publishers (see the sketch below).

      Search engine providers must inform their (potential) users about the specific and legitimate purposes for which personal data are processed. If data are used for direct marketing purposes or the creation of profiles, consent of the users is clearly preferable to any opt-out strategy.
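A minimal sketch of a consent-first (opt-in) check, using hypothetical names, may illustrate the preference expressed above: recording consent server-side avoids the situation in which deleting cookies also deletes the very opt-out cookie that recorded the user’s refusal.

from typing import Optional

def targeting_allowed(user_consents: dict[str, bool], user_id: Optional[str]) -> bool:
    """Allow behavioural targeting only where explicit consent has been recorded."""
    if user_id is None:
        return False                          # unidentified users are never profiled
    return user_consents.get(user_id, False)  # default: no consent, no targeting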

7. Some of the currently described purposes for the processing of personal data, such as ‘the development of new services’ or ‘the offering of personalised advertising’, are too broadly defined to offer an appropriate framework to judge the legitimacy of the purpose. Many large search engine providers also offer services on other platforms, such as social networking services and webmail. The large amounts of personal data collected through the search platform can be used to develop new services within the search platform, or on other platforms. It is of some concern that such new purposes for the processing of personal data could be added retroactively, without an informed decision of the users.

      Search engine providers should seek informed consent from their users if they wish to use the personal data they have already collected for new legitimate processing purposes.

8. As regards the data held on the search history of their users, search engines should respect the rights of users to access and, where appropriate, to correct or delete information held about them. These rights apply foremost to the data from authenticated users stored by search engines, including personal profiles. However, these rights also apply to non-registered users.

      In this context, search engine providers should apply their technological innovation capacity to find a meaningful solution to grant access to the search histories held on non-registered users.

      Search engine providers should further develop tools that allow users to gain access to, and correct and delete data that have been collected in the course of the use of services, including a possible profile created for example for direct marketing purposes.

9. Reasonable efforts should be made to limit the processing of personal data to the minimum necessary. Innovative approaches to promote anonymous searches should be encouraged. If personal data are stored, the retention period should be no longer than necessary for the specific purposes of the processing. As personal data could be deleted after the end of a search session, continued storage needs an adequate justification. For each purpose, a limited retention time should be defined (a schematic illustration follows below). Moreover, the set of personal data to be retained should not be excessive in relation to each purpose.

      Search engine providers should limit the processed personal data to the minimum necessary.

      Given the sensitivity of search behaviour, search engine providers should determine adequate, non-excessive retention periods and irreversibly delete personal data once the limited retention period has expired.
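The following hypothetical sketch illustrates a per-purpose retention schedule of the kind described in paragraph 9; the purposes and periods shown are assumed examples, not recommended values.

from datetime import datetime, timedelta

RETENTION_BY_PURPOSE = {                       # assumed example values only
    "serving_results": timedelta(hours=1),
    "security_and_abuse_detection": timedelta(days=14),
    "aggregate_quality_statistics": timedelta(days=90),
}

def expired(purpose: str, collected_at: datetime, now: datetime) -> bool:
    """Data kept for an unknown purpose is treated as already expired."""
    limit = RETENTION_BY_PURPOSE.get(purpose)
    return limit is None or now - collected_at > limit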

III. Censorship

10. To ensure freedom of information, users should be able to access Web content without censorship or restrictions. With regard to systematic nationwide filtering or blocking at the request of public authorities, search engine providers should strive for transparency and foreseeability for the public. With regard to individual filtering requests by private parties and individuals, search engine providers must adhere to the principle of due process and provide access to redress mechanisms.

      Search engine providers must promote transparency about systematic nationwide blocking or filtering of certain types of content, adhere to the principle of due process when removing specific search results from their index and provide access to redress mechanisms.

      Search engine providers should offer and continue to develop adequate individual filtering tools that allow their users to protect themselves, or in a family situation their children, against specific kinds of content, as well as filters that block or warn users against sites that apparently spread viruses, spyware and other kinds of malware.
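As a simplified illustration of such individual filtering tools, the following sketch (with hypothetical names and lists) shows how a result could be hidden only when the user has activated a filter for a given category, while sites on a provider-maintained malware list are shown with a warning.

MALWARE_BLOCKLIST = {"malicious.example"}      # assumed provider-maintained list

def annotate_result(host: str, categories: set[str],
                    user_blocked_categories: set[str]) -> str:
    if host in MALWARE_BLOCKLIST:
        return "WARN"   # keep the result visible, but with an explicit malware warning
    if categories & user_blocked_categories:
        return "HIDE"   # hidden only because this user has activated the filter
    return "SHOW"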

IV. Co- and self-regulation

11. Efforts at co- and self-regulatory adherence to human rights standards have been made at national level and, on a global scale, by initiatives such as the Global Network Initiative.

      In order to produce meaningful results, search engine providers should set up effective, transparent, independent and accountable co- or self-regulatory mechanisms, ensuring appropriate sanctions and remedies in cases of breach of the provisions.


[1] Draft Recommendation of the Committee of Ministers to member states and Guidelines.

[2] Article 29 Working Party Opinion 1/2008 (4 April 2008); the 28th International Data Protection and Privacy Commissioners’ Conference Resolution on Privacy Protection and Search Engines (London, 2 and 3 November 2006).