COUNCIL OF EUROPE
COMMITTEE OF MINISTERS

______

RECOMMENDATION No. R (2001) 8

OF THE COMMITTEE OF MINISTERS TO MEMBER STATES
ON SELF-REGULATION CONCERNING CYBER CONTENT
(SELF-REGULATION AND USER PROTECTION AGAINST ILLEGAL
OR HARMFUL CONTENT ON NEW COMMUNICATIONS
AND INFORMATION SERVICES)

(Adopted by the Committee of Ministers on 5 September 2001,
at the 762nd meeting of the Ministers’ Deputies)

The Committee of Ministers, under the terms of Article 15.b of the Statute of the Council of Europe,

Considering that the aim of the Council of Europe is to achieve greater unity between its members for the purpose of safeguarding and realising the ideals and principles which are their common heritage;

Having regard to its Declaration on a European policy for new information technologies, adopted on the occasion of the 50th anniversary of the Council of Europe in 1999;

Recalling the commitment of the member States to the fundamental right to freedom of expression and information as guaranteed by Article 10 of the Convention for the Protection of Human Rights and Fundamental Freedoms, the supervision of whose application is entrusted to the European Court of Human Rights;

Reaffirming that freedom of expression and information is necessary for the social, economic, cultural and political development of every human being, and constitutes a condition for the harmonious progress of social and cultural groups, nations and the international community, as expressed in its Declaration on the Freedom of Expression and Information of 1982;

Stressing that the continued development of new communications and information services should serve to further the right of everyone, regardless of frontiers, to express, seek, receive and impart information and ideas, for the benefit of every individual and the democratic culture of any society;

Stressing that the freedom to use new communications and information services should not prejudice the human dignity, human rights and fundamental freedoms of others, especially of minors;

Recalling its Recommendation No. R (89) 7 concerning principles on the distribution of videograms having a violent, brutal or pornographic content, its Recommendation No. R (92) 19 on video games with a racist content, its Recommendation No. R (97) 19 on the portrayal of violence in the electronic media, its Recommendation No. R (97) 20 on “hate speech” and Article 4, paragraph a of the International Convention on the Elimination of All Forms of Racial Discrimination of the United Nations;

Bearing in mind the differences in national criminal law concerning illegal content as well as the differences in what content may be perceived as potentially harmful, especially to minors and their physical, mental and moral development, hereinafter referred to as “harmful content”;

Bearing in mind that self-regulatory organisations could, in accordance with national circumstances and traditions, be involved in the monitoring of compliance with certain norms, possibly within a co-regulatory framework, as defined in a particular country;

Aware of self-regulatory initiatives for the removal of illegal content and the protection of users against harmful content taken by the new communications and information industries, sometimes in co-operation with the state, as well as of the existence of technical standards and devices enabling users to select and filter content;

Desirous to promote and strengthen self-regulation and user protection against illegal or harmful content,

Recommends to the governments of member States:

1. to implement in their domestic law and/or practice the principles appended to this Recommendation;

2. to disseminate widely this Recommendation and its appended principles, where appropriate accompanied by a translation; and

3. to bring them in particular to the attention of the media, the new communications and information industries, users and their organisations, as well as regulatory authorities for the media and new communications and information services and relevant public authorities.

Appendix to Recommendation No. R (2001) 8

Principles and mechanisms concerning self-regulation
and user protection against illegal or harmful content
on new communications and information services

Chapter I – Self-regulatory organisations

1. Member States should encourage the establishment of organisations which are representative of Internet actors, for example Internet service providers, content providers and users.

2. Member States should encourage such organisations themselves to establish regulatory mechanisms within their remit, in particular with regard to the establishment of codes of conduct and the monitoring of compliance with these codes.

3. Member States should encourage those organisations in the media field with self-regulatory standards to apply these standards, as far as possible, to the new communications and information services.

4. Member States should encourage such organisations to participate in relevant legislative processes, for instance through consultations, hearings and expert opinions, and in the implementation of relevant norms, in particular by monitoring compliance with these norms.

5. Member States should encourage Europe-wide and international co-operation between such organisations.

Chapter II – Content descriptors

6. Member States should encourage the definition of a set of content descriptors, on the widest possible geographical scale and in co-operation with the organisations referred to in Chapter I, which should provide for neutral labelling of content which enables users to make their own value judgments over such content.

7. Such content descriptors should indicate, for example, violent and pornographic content as well as content promoting the use of tobacco or alcohol, gambling services, and content which allows for unsupervised and anonymous contacts between minors and adults.

8. Content providers should be encouraged to apply these content descriptors, in order to enable users to recognise and filter such content regardless of its origin.

Chapter III – Content selection tools

9. Member States should encourage the development of a wide range of search tools and filtering profiles, which provide users with the ability to select content on the basis of content descriptors.

10. Filtering should be applied by users on a voluntary basis.

11. Member States should encourage the use of conditional access tools by content and service providers with regard to content harmful to minors, such as age-verification systems, personal identification codes, passwords, encryption and decoding systems or access through cards with an electronic code.

Chapter IV – Content complaints systems

12. Member States should encourage the establishment of content complaints systems, such as hotlines, which are provided by Internet service providers, content providers, user associations or other institutions. Such content complaints systems should, where necessary for ensuring an adequate response against presumed illegal content, be complemented by hotlines provided by public authorities.

13. Member States should encourage the development of common minimum requirements and practices concerning these content complaints systems. Such requirements should comprise for instance:

a. the provision of a specific permanent Web address,

b. the availability of the content complaints system on a 24-hour basis,

c. the public provision of information about the legally responsible persons and entities within the bodies offering content complaints systems,

d. the public provision of information about the rules and practices of processing content complaints, including co-operation with law enforcement authorities with regard to presumed illegal content,

e. the provision of replies to users concerning the processing of their content complaints,

f. the provision of links to other content complaints systems abroad.

14. Member States should set up, at the domestic level, an adequate framework for co-operation between content complaints bodies and public authorities with regard to presumed illegal content. For this purpose, member States should define the legal responsibilities and privileges of bodies offering content complaints systems when accessing, copying, collecting and forwarding presumed illegal content to law enforcement authorities.

15. Member States should foster Europe-wide and international co-operation between content complaints bodies.

16. Member States should undertake all necessary legal and administrative measures for transfrontier co-operation between their relevant law enforcement authorities with regard to complaints and investigations concerning presumed illegal content from abroad.

Chapter V – Mediation and arbitration

17. Member States should encourage the creation, at the domestic level, of voluntary, fair, independent, accessible and effective bodies or procedures for out-of-court mediation as well as mechanisms for arbitration of disputes concerning content-related matters.

18. Member States should encourage Europe-wide and international co-operation between such mediation and arbitration bodies, open access of everyone to such mediation and arbitration procedures irrespective of frontiers, and the mutual recognition and enforcement of out-of-court settlements reached hereby, with due regard to the national ordre public and fundamental procedural safeguards.

Chapter VI – User information and awareness

19. Member States should encourage the development of quality labels for Internet content, for example for governmental content, educational content and content suitable for children, in order to enable users to recognise or search for such content.

20. Member States should encourage public awareness and information about self-regulatory mechanisms, content descriptors, filtering tools, access restriction tools, content complaints systems, and out-of-court mediation and arbitration.

* * *

Explanatory Memorandum

I. Introduction

1. The rapid development over the last decade of new communications and information technologies, in particular the Internet, at the global level has caused, on the one hand, an enormous increase in access to information and in mass communication, which has led to a public perception of society being defined by the level of accessible information under the concept of the so-called “information society”. This has opened up new opportunities for the exercise of freedom of expression and information across national frontiers.
 
2. On the other hand, this development has created uncertainty concerning the legal status of these new communications and information services. The distinction between private correspondence and mass communication has become blurred. Professional standards and codes of conduct drawn up for the periodical print media and for broadcasting seem to be bound to their particular type of media, which is distinct from such “media” as the Internet. National legal provisions on youth protection, pornography, the portrayal of violence, advertising, decency or copyright and neighbouring rights seem to be challenged by the global reach of new communications and information services, as well as by the question of where to allocate such services geographically in order to establish national jurisdiction.
 
3. Against this background, the Steering Committee on the Mass Media (CDMM) of the Council of Europe considered it appropriate for the Council of Europe to deal with the issue of illegal and harmful content on the new communications and information services from a self-regulation approach, with a view to protecting freedom of expression and information as well as other fundamental values. Substantive criminal law has not been a focus of this work.
 
4. Since self-regulation has become an important and recognised mechanism for the media in avoiding restrictive State legislation on the dissemination of information through the media, especially on matters of decency and moral values which differ widely among individuals and States, while ensuring respect of certain standards, some actors of the new communications and information services have taken initiatives for the creation of their own self-regulatory mechanisms.
 
5. The Ministers participating in the 5th European Ministerial Conference on Mass Media Policy on “the Information Society: a challenge for Europe” (Thessaloniki, 11-12 December 1997) referred in their Declaration to the development of new communications and information services by being “anxious in particular that this development contributes to the promotion of freedom of expression and information, artistic creation and exchanges between cultures, education and the participation of individuals in public life, while respecting and serving human rights, democratic values and social cohesion” and “welcoming the opportunities offered or anticipated in this regard by the new technologies and new communications and information services, while noting that there is a risk that these technologies and services may in certain circumstances be used to the detriment of and contrary to respect for human rights and human dignity, the protection of minors and basic democratic values”.
 
6. Against this background, the Action Plan agreed upon in Thessaloniki included the following activities:
 
a. “to encourage, in particular at the transnational level, self-regulation by providers and operators of the new communications and information services, especially content providers, in the form of codes of conduct or other measures, with a view to ensuring respect for human rights and human dignity, the protection of minors and democratic values, as well as the credibility of the media themselves”;
 
b. “to intensify work on the impact of new communications technologies and services on human rights and democratic values with a view to preparing within the framework of the Council of Europe any legal instruments or other measures which might be necessary to promote freedom of expression and information, especially across frontiers, and guarantee the protection of human rights and democratic values”;
 
c. “to study cases of misuse of the new technologies and new communications and information services for spreading any ideology, or carrying out any activity, which is contrary to human rights, human dignity and the fundamental rights of others, as well as to the protection of minors and to democratic values, and to formulate, where necessary, any proposals for legal or other action to combat such use”;
 
d. “to examine the opportunity and feasibility of establishing warning, co-operation and assistance procedures, including legal ones, in liaison with other authorities, with a view to undertaking concerted action against these forms of misuse at the widest possible level”; and
 
e. “to study the practical and legal difficulties in combating the dissemination of hate speech, violence and pornography via the new communications and information services, with a view to taking appropriate initiatives in a common pan-European framework”.
 
7. Furthermore, the Committee of Ministers, in its Declaration on a European policy for new information technologies (Budapest, 7 May 1999), urged the governments of member States, acting where appropriate with public and private partners, “to ensure respect for human rights and human dignity, notably freedom of expression, as well as the protection of minors, the protection of privacy and personal data, and the protection of the individual against all forms of racial discrimination in the use and development of new information technologies, through regulation and self-regulation, and through the development of technical standards and systems, codes of conduct and other measures”.
 
8. In the light of the Thessaloniki Action Plan, the Committee of Ministers gave the Group of Specialists on the impact of new communication technologies on human rights and democratic values (MM-S-NT) the mandate to analyse possible self-regulatory measures against illegal or harmful content.
 
9. When drawing up Recommendation No. R (2001) 8, the MM-S-NT took note of the various initiatives taken at the national and international levels on this matter and organised several hearings with relevant bodies from the Internet sector, such as the European Internet Service Provider Association (EuroISPA), the Internet Watch Foundation of the United Kingdom, the Freiwillige Selbstkontrolle Multimedia-Diensteanbieter (voluntary self-control of multi-media service providers) of Germany, the Internet Content Rating Association (ICRA), the Entertainment Software Rating Board, the Internet Hotline Providers in Europe (INHOPE) Association and Childnet International.
 
10. At the 762nd meeting of the Ministers’ Deputies on 5 September 2001, the present Recommendation was adopted by the Committee of Ministers, who authorised the publication of this Explanatory Memorandum.
 
II. General commentary
 
11. The Recommendation uses the term “new communications and information services” without defining these services. This term or similar variants are widely used, commonly referring to digital communications and information services, such as the Internet with its World Wide Web and e-Mail. Although the word “Internet” is commonly used as a generic term for new communications and information services, the express mention of the Internet is avoided by the Recommendation, because of the rapid and partly unpredictable technological development in this field and the possible limitation which might result from an exclusive reference to the Internet. In this respect, the word “new” highlights this recent and on-going development, although some aspects of this development might not be qualified as new in the near future. In the light of the descriptive nature of the term, member States may be more specific in accordance with their national circumstances and policies.
 
12. This Recommendation is addressed to governments of member States. Any Recommendation of the Committee of Ministers is an instrument of political commitment, and not a legally enforceable instrument. Through its adoption by the Committee of Ministers, it is addressed to all member States and does not require individual adhesion by them.
 
13. The means of implementing this Recommendation and its principles are not specified in the Recommendation. This enables States to select any appropriate means of implementation at their own discretion, taking into account their domestic law and practice, international commitments and national circumstances. In this regard, the preamble of the Recommendation refers to the International Convention on the Elimination of All Forms of Racial Discrimination of the United Nations of 1965.
 
14. Under Article 4 (a) of the International Convention on the Elimination of All Forms of Racial Discrimination, States Parties to this instrument “shall declare an offence punishable by law all dissemination of ideas based on racial superiority or hatred, incitement to racial discrimination, as well as all acts of violence or incitement to such acts against any race or group of persons of another colour or ethnic origin, and also the provision of any assistance to racist activities, including the financing thereof”. Since Article 4 does not specify any forms of expression, it applies to all types of media, including new communications and information services. The Recommendation shall not in any way limit or derogate from the obligations of States under this Convention or other international legal instruments.
 
15. The Recommendation does not seek to address directly the private sector, and in particular the new communications and information industry. It is up to member States to define any appropriate arrangements for ensuring that the private sector takes part in the implementation of this Recommendation.
 
16. Finally, member States are held “to disseminate widely this Recommendation and its appended principles, where appropriate accompanied by a translation”, as the dissemination of the Recommendation is a prerequisite for its proper implementation.
 
III. Commentary to the Recommendation
 
17. The specific recommendations or principles appear in the Appendix to the Recommendation. This Appendix is an integral part of the Recommendation itself. It is only for the sake of clarity that the individual “principles and mechanisms concerning self-regulation and user protection against illegal or harmful content on new communications and information services” are grouped in the Appendix.
 
18. The Recommendation addresses both illegal and harmful content. Illegal content should be understood as content in violation of national laws. The preamble of the Recommendation refers to harmful content as content perceived as being potentially harmful, especially to minors and their physical, mental and moral development, without necessarily being illegal.
 
19. The Recommendation uses the term “content” as a generic term for all kinds of content, including text, images, sound as well as interactive communications, since new communications and information services typically offer a combination of all of them.
 
Chapter I – Self-regulatory organisations
 
20. Self-regulation by the relevant actors of the new communications and information services may be facilitated by the fact that these actors have, for instance, organised themselves into associations. Article 11 of the European Convention for the Protection of Human Rights and Fundamental Freedoms (hereinafter referred to as: European Convention on Human Rights or ECHR) guarantees the right to freedom of assembly and association. Member States are thus bound not to restrict this freedom under Article 11 of the ECHR. Paragraph 1 of the Recommendation goes further by recommending that member States should encourage the establishment of such organisations. The actual means of encouragement remain at the discretion of each member State.
 
21. These organisations should be encouraged to establish self-regulatory mechanisms, such as codes of conduct. Member States might focus their encouragement in particular on those organisations which achieve a wide representation of their sector, i.e. which group together a sufficient number of members who might have certain technical or other possibilities of determining the content being made available on new communications and information services. Paragraph 1 of the Recommendation mentions, for example, Internet service providers, content providers and users.

22. Paragraph 2 of the Recommendation aims at the monitoring of compliance with self-regulatory mechanisms, such as codes of conduct. Such monitoring should lead to an encouragement of compliance, possibly reinforced by sanctions such as suspension of membership or the publication of violations of codes of conduct. Sanctions imposed under self-regulatory mechanisms must not take the form of penal sanctions.

23. Examples of codes of conduct can be found with the European Internet Service Provider Association ( http://www.euroispa.org ) or the Internet Hotline Providers in Europe Association ( http://www.inhope.org ). Beyond Europe, codes of conduct have been established, for instance, by the Internet Industry Association of Australia ( http://www.iia.net.au ), the Canadian Association of Internet Providers ( http://www.caip.ca ) and the Electronic Network Consortium of Japan ( http://www.nmda.or.jp/enc/guideline.html ). These examples are mentioned for information only.
 
24. Self-regulatory mechanisms such as codes of conduct have existed for the periodical print media, radio and television for many decades, and related sectors, such as the advertising sector, have developed their own standards. Paragraph 3 of the Recommendation encourages the application of those existing standards, as far as possible, to the new communications and information services. The International Chamber of Commerce / World Business Organisation has, for example, developed an advertising code for the Internet ( http://www.iccwbo.org/home/statements_rules/rules/1998/internet_guidelines.asp ).
 
25. Self-regulatory organisations bring together a resource of information, knowledge and experience in a rapidly developing and thus often “new” sector. They also constitute a forum for expressing the interests of the various actors and contribute to forming public opinion. Their participation in the preparation of relevant legislation can thus deepen the technical debate and facilitate wide acceptance of new standards. Member States are therefore encouraged, under paragraph 4 of the Recommendation, to involve these organisations. Such involvement might depend on the national circumstances and traditions, and member States have a wide discretion in this regard. One example is the creation of a national co-regulatory body in France in early 2001 by the French government (see: http://www.internet.gouv.fr/francais/textesref/pagsi2/lsi/coregulation.htm ); another is the numerous consultations conducted by the governments of other member States.
 
26. Self-regulatory organisations could also, in accordance with national circumstances and traditions, be involved in the monitoring of compliance with certain norms in a co-regulatory framework. The Recommendation does not specify particular concepts or models. National experiences have been acquired in this regard by the Voluntary Self-regulation of Multimedia Service Providers (FSM) in Germany ( http://www.fsm.de ), which was established in the light of specific German legislation on the regulation and self-regulation of multimedia services, as well as by the Australian Broadcasting Authority, which has been given the authority under national law to register codes of conduct voluntarily submitted by the Internet industry, to run an Internet hotline for content complaints and to apply content classification standards used in the film and television sector to Internet content complained of through their hotline ( http://www.aba.gov.au ).
 
27. Paragraph 5 of the Recommendation recommends Europe-wide and international co-operation between such organisations. Given the global nature of new communications and information services, national self-regulation is likely to be faced with the limits of its jurisdiction. In addition, co-operation and exchange of information and experience is helpful for avoiding internationally conflicting results.
 

Chapter II – Content descriptors
 
28. With the abundance of content within the new communications and information services, content recognition will become essential to help users search for content efficiently, as well as enabling them to filter out undesirable content. The use of a set of content descriptors by content providers will facilitate this. Hence, paragraph 6 of the Recommendation recommends that content descriptors be defined, which should “provide for neutral labelling of content which enables users to make their own value judgments over such content”. This will enable users to select content in a neutral way which respects their own moral values, and will allow such content descriptors to be used internationally in different cultures and over time.
 
29. In the field of print media, film, broadcasting and video games, content classifications have been applied widely at the national level for many years. Domestic laws typically require the classification of violent and pornographic content for those media and prohibit racist content. It will be recalled that the Committee of Ministers has adopted several Recommendations in this field: Recommendation No. R (89) 7 concerning principles on the distribution of videograms having a violent, brutal or pornographic content, Recommendation No. R (92) 19 on video games with a racist content, Recommendation No. R (97) 19 on the portrayal of violence in the electronic media and Recommendation No. R (97) 20 on “hate speech”. Furthermore, Article 7 of the European Convention on Transfrontier Television aims at limiting programme services which contain pornography or violence, incite racial hatred or are likely to impair the physical, mental or moral development of children and adolescents, while Article 15 restricts advertising and tele-shopping for tobacco products, alcoholic beverages and medicines.
 
30. During the preparation of this Recommendation, it was acknowledged that, at the national level, media content should preferably be qualified or classified in an identical way, irrespective of the actual type of media. New communications and information services were not regarded as operating in a legal vacuum or as a medium outside national legal systems. However, national legal systems are very different with regard to what constitutes illegal content, for example in the field of pornographic content. This Recommendation does not aim at harmonising national laws in this respect.
 
31. National laws for print media, film or broadcasting regulate, for example, violent or pornographic content. Most member states have also introduced limitations on media content promoting the consumption of tobacco or alcohol as well as on gambling services. Therefore, paragraph 7 of the Recommendation makes reference to these types of content.
 
32. New communications and information services have been and are used by paedophiles for contacting children, mainly through so-called “chat rooms”, i.e. web sites where users can post, access and exchange text or voice messages and images. Paragraph 7 of this Recommendation, therefore, includes the recommendation to indicate those services which allow for unsupervised and anonymous contacts between minors and adults, for example. The examples mentioned in paragraph 7 are neither exhaustive nor mandatory, but were considered as being the most relevant and important. Racist content is to be considered as an offence punishable by law under Article 4 of the International Convention on the Elimination of All Forms of Racial Discrimination of 1965.
 
33. Where content descriptors are used, users of new communications and information services can recognise the type of content concerned, either visually or technically through specific software. This enables users to filter content which they do not want to access, for instance, when searching content on the Internet through search devices. It also allows parental control over content accessed by children.
 
34. Content providers are encouraged to apply content descriptors themselves. Such descriptors have been developed at the international level, for example, by the Internet Content Rating Association ( http://www.icra.org ), which encourages content providers to voluntarily label their content themselves by using a set of content descriptors which are compatible with the Platform for Internet Content Selection (PICS) developed and made available freely by the World Wide Web Consortium ( http://www.w3c.org/PICS ). Internet users can thus set their own PICS-based filters for themselves or their children.
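As an illustration of the self-labelling mechanism described above, a PICS label was typically embedded in a page’s HTML header as a META element. The rating-service URL and the single-letter categories below (n, s, v, l, following the RSACi/ICRA pattern) are shown schematically, not as the exact ICRA vocabulary:

```html
<!-- Schematic example of a PICS 1.1 label embedded in a web page.
     The rating-service URL and the category letters (n, s, v, l)
     follow the RSACi/ICRA pattern but are given here only as an
     illustration, not as an exact ICRA rating vocabulary. -->
<meta http-equiv="PICS-Label"
      content='(PICS-1.1 "http://www.icra.org/ratingsv02.html"
               l gen true for "http://www.example.com"
               r (n 0 s 0 v 0 l 0))'>
```

A PICS-aware browser or filter reads this label and compares each category level against the thresholds set by the user.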
 
35. Instead of self-labelling by content providers, content can also be classified for users by a trusted third party of the users’ choice. Such a classification service, which is compatible with PICS, is offered to users by the Entertainment Software Rating Board ( http://www.esrb.org ), for example. Third party rating may also include the identification of illegal content, such as racist content, and can thus offer users, in particular parents, schools or public libraries, the possibility to protect themselves against racist content by filtering. The identification of racist web sites is offered, for example, by the Simon Wiesenthal Center ( http://www.wiesenthal.com ).
 
Chapter III – Content selection tools
 
36. The enormous and continuous increase in content made available on new communications and information services requires users to be capable of searching for and selecting the content which is of interest to them. Specific software programmes can be used for such content searches; these typically apply certain selection criteria and present search results in a hierarchical manner. In the same way, content can be filtered. Therefore, paragraph 9 of the Recommendation proposes encouraging the development of a wide range of search tools and filtering profiles based on content descriptors. This will enable users to choose the search tool or filtering profile of their choice when selecting content on the basis of content descriptors.
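A user-side filtering profile of the kind envisaged here can be sketched as follows. This is purely illustrative and not part of the Recommendation; the descriptor names (“violence”, “nudity”, “language”) and the numeric level scale are hypothetical:

```python
# Illustrative sketch of a user-side filtering profile that admits or
# blocks content on the basis of content-descriptor levels, in the
# spirit of PICS-style self-labelling. Descriptor names and level
# values are hypothetical.

# A filtering profile maps each descriptor to the maximum level the
# user is willing to see; descriptors absent from the profile are
# treated as unrestricted.
def is_admissible(content_labels: dict, profile: dict) -> bool:
    """Return True if every labelled descriptor stays within the
    maximum level set in the user's filtering profile."""
    return all(
        level <= profile.get(descriptor, float("inf"))
        for descriptor, level in content_labels.items()
    )

# A "child-safe" profile caps most descriptors at level 0, whereas
# an unrestricted profile would simply be an empty dictionary.
child_profile = {"violence": 0, "nudity": 0, "language": 1}

print(is_admissible({"violence": 0, "language": 1}, child_profile))  # True
print(is_admissible({"violence": 2}, child_profile))                 # False
```

The same comparison can serve both purposes mentioned in the text: ranking or excluding results in a search tool, and blocking pages outright in a parental-control filter.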
 
37. Article 10 of the European Convention on Human Rights guarantees the right to freedom of expression and information without interference by public authorities and regardless of frontiers, irrespective of the means of mass communication used. Paragraph 10 of this Recommendation therefore recommends that filtering of content on new communications and information services should be applied by users on a voluntary basis. Filtering systems can thus empower users to make qualified choices about the type of lawful content they wish to access, enabling, for example, parents or other persons or institutions having responsibility over children to decide what content should be accessible to those children.
 
38. Content on new communications and information services can also be filtered by content providers themselves through the use of conditional access systems, such as personal identification codes or passwords, encryption systems which require a decoder, or electronic codes or cards which have to be inserted before a given content can be accessed. Paragraph 11 recommends that member States should encourage the use of conditional access systems by content providers offering content which parents or the public may consider harmful to minors.
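A minimal provider-side conditional access check might look as follows. This sketch is illustrative only; the account name, code value and storage scheme are hypothetical:

```python
# Illustrative sketch of provider-side conditional access: content
# flagged as potentially harmful to minors is only served after a
# personal identification code is verified. Account names, codes and
# the storage scheme are hypothetical; a production system would use
# a salted key-derivation function such as hashlib.pbkdf2_hmac.
import hashlib

# Hashed access codes per account (stored, never the plain code).
_ACCESS_CODES = {"family-account": hashlib.sha256(b"4711").hexdigest()}

def may_access(account: str, code: str, restricted: bool) -> bool:
    """Grant access freely to unrestricted content; require a valid
    personal identification code for restricted content."""
    if not restricted:
        return True
    stored = _ACCESS_CODES.get(account)
    supplied = hashlib.sha256(code.encode()).hexdigest()
    return stored is not None and stored == supplied

print(may_access("family-account", "", restricted=False))     # True
print(may_access("family-account", "4711", restricted=True))  # True
print(may_access("family-account", "0000", restricted=True))  # False
```

Decoder-based encryption and smart-card systems mentioned in the paragraph follow the same principle: the gate is enforced by the provider before delivery, rather than by a filter on the user's side.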

Chapter IV – Content complaints systems
 
39. Illegal or harmful content made available on new communications and information services is generally accessible to a large number of users. Thus, public scrutiny over such content has led to many cases where illegal content was identified. In order to allow users to react to presumed illegal content, member States are recommended in paragraph 12 of the Recommendation to encourage the establishment of content complaints systems, such as hotlines.
 
40. Those content complaints systems could be provided, for example, by Internet service providers, content providers and user associations as well as other institutions, such as private associations in the field of child protection. In Europe, the Internet Hotline Providers in Europe Association ( http://www.inhope.org ) groups together a number of national hotlines. In countries where an adequate response to presumed illegal content is not offered by these private institutions, member States should provide content complaints systems operated by public authorities, such as special police authorities or public authorities with responsibilities for child protection matters.
 
41. Content complaints systems or hotlines offer users a tool for gaining confidence in new communications and information services, as well as in the complaints systems themselves and their processing of complaints. In order to avoid misinforming the public and to ensure a certain standard for content complaints systems, paragraph 13 of the Recommendation provides that member States should encourage the development of common minimum requirements and practices for these systems. These should comprise, for instance:

(a) the provision of a permanent web address, which users can remember precisely because it does not change;

(b) the availability of the content complaints system on a 24-hour basis, allowing users to reach it at any moment;

(c) the public provision of information about the legally responsible persons and entities within the bodies offering content complaints systems, which may help to achieve the transparency necessary for user confidence as well as public scrutiny over the content complaints system;

(d) the public provision of information about the rules and practices for processing content complaints, including co-operation with law enforcement authorities, in order to make users aware of the possible results of their complaints;

(e) the provision of replies to users concerning the processing of their complaints, in order to signal to users, possibly in an automated form, the receipt and processing of their complaints; and

(f) the provision of links to other content complaints systems abroad, which will enable users to lodge complaints with complaints systems in the country where the content concerned originates.
 
42. Paragraph 14 of the Recommendation provides that, for the proper functioning of content complaints systems and their co-operation with public authorities with regard to presumed illegal content, the competent public authorities, such as police and law enforcement authorities, should set up an adequate framework for such co-operation. Such a framework could include the practical requirements for providing a contact point for content complaints systems or the training of staff in dealing with complaints concerning content on new communications and information services.
 
43. Paragraph 14 mentions specifically that member States should define the legal responsibilities and privileges of bodies offering content complaints systems when accessing, copying, collecting and forwarding presumed illegal content to law enforcement authorities. This is necessary because domestic laws in member States might have made it an offence punishable under criminal law to access, copy, collect and forward illegal content, in particular child pornography. Content complaints bodies would otherwise commit a criminal offence if they processed a complaint about child pornography, for instance.
 
44. Content complaints may very often concern content originating from a country other than the one in which the content complaints body has its seat. Therefore, paragraph 15 of the Recommendation provides that Europe-wide and international co-operation between content complaints bodies should be fostered.
 
45. Finally, member States are recommended in paragraph 16 of the Recommendation to undertake all necessary legal and administrative measures for developing transfrontier co-operation between their relevant law enforcement authorities with regard to complaints and investigations concerning presumed illegal content from abroad. It is a logical consequence that law enforcement authorities should co-operate across national borders, especially when co-operation between content complaints bodies of different States has been recommended above. The framework for such co-operation may depend on the bilateral or multilateral agreements concluded by each member State, for example in the field of judicial assistance as well as recognition and enforcement of foreign judgments. In this context, member States might refer to the European Convention on Mutual Assistance in Criminal Matters of 1959 and its Additional Protocol of 1978.
 
Chapter V – Mediation and arbitration
 
46. Content-related complaints may easily lead to legal disputes. Such disputes typically face the problem of jurisdictional uncertainty of domestic courts, as well as problems caused by the fact that the parties to the dispute may be in different and possibly very distant countries. Therefore, paragraph 17 of the Recommendation proposes the creation of voluntary, fair, independent, accessible and effective bodies or procedures for out-of-court mediation, as well as mechanisms for arbitration of disputes concerning content-related matters.
 
47. Paragraph 18 of the Recommendation recommends Europe-wide and international co-operation between such mediation and arbitration bodies, open access by everyone to mediation and arbitration procedures irrespective of frontiers, and the mutual recognition and enforcement of out-of-court settlements reached thereby, with due regard to national ordre public and fundamental procedural safeguards.
 
48. In this context, reference shall be made to the European Convention providing a Uniform Law on Arbitration of 1966, the European Convention on International Commercial Arbitration of 1961 and the Council of Europe Agreement of 1962 relating to the application of this Convention, as well as Recommendation No. R (99) 19 of the Committee of Ministers concerning mediation in penal matters.
 
Chapter VI – User information and awareness

49. Chapter III of this Recommendation deals with content selection tools, such as search tools and filtering profiles, which prevent access to harmful content. These content selection tools can also be used to select content with certain qualifications, such as governmental content, educational content and content suitable for children. Paragraph 19 of the Recommendation, therefore, recommends encouraging the development of quality labels for content on new communications and information services.
 
50. The users of new communications and information services can benefit from the development of self-regulatory mechanisms, technological tools for content selection, content complaints systems and out-of-court mediation and arbitration procedures only if they have been made aware of them and possess sufficient knowledge about their use. Therefore, member States should encourage public awareness and information in this regard under paragraph 20 of the Recommendation. Such information should be accessible to all, for instance through educational institutions or public libraries. Information about filtering harmful content might especially be addressed to parents. In this context, member States may also take account of Recommendation No. R (99) 14 on universal community service concerning new communications and information services, which recommends that information and training for all in these services should be developed.