Internet intermediaries play an increasingly important role in modern societies. Their actions influence the choices we make, the way we exercise our rights, and how we interact. The market dominance of some places them in control of principal modes of public communication. What are the roles they play? How do they impact human rights, democracy and the rule of law? What are their corresponding duties and responsibilities? The Council of Europe has developed human rights-based guidelines to help member states address this challenge.
The term ‘internet intermediaries’ commonly refers to a wide, diverse and rapidly evolving range of service providers that facilitate interactions on the internet between natural and legal persons. Some connect users to the internet, enable processing of data and host web-based services, including for user-generated comments. Others gather information, assist searches, facilitate the sale of goods and services, or enable other commercial transactions. Importantly, they may carry out several functions in parallel, including those that are not merely intermediary. Internet intermediaries also moderate and rank content, mainly through algorithmic processing, and they may perform other functions that resemble those of publishers. As a result, different regulatory frameworks can apply, respectively, to their intermediary roles and to their other functions.
Shared obligations of states and internet intermediaries
It is primarily the obligation of states to ensure that the laws, regulations and policies applicable to internet intermediaries effectively safeguard the human rights and fundamental freedoms of users. At the same time, however, and in line with the UN Guiding Principles on Business and Human Rights, internet intermediaries have the responsibility to respect the internationally recognised human rights of their users and of third parties affected by their activities. States and intermediaries therefore have to work together. The greater the impact and the potential damage that internet intermediaries can inflict on rights, the greater the precautions they must take when developing and applying their terms and conditions of service, community standards and codes of ethics.
Any request, demand or other action by public authorities addressed to internet intermediaries must, in line with the European Convention on Human Rights, be prescribed by law, be exercised within the limits of the law and constitute a necessary and proportionate measure in a democratic society. All regulatory frameworks, including co-regulatory or self-regulatory frameworks, should have effective oversight mechanisms. The process of enacting legislation applicable to internet intermediaries should be transparent and inclusive. The relevant legislation should be accessible and foreseeable, should clearly define the powers granted to public authorities, and should be interpreted, applied and enforced without discrimination.
Safeguards for freedom of expression
Any demand to internet intermediaries to restrict access to content shall meet the conditions set out in Article 10 of the European Convention on Human Rights. State authorities should carefully evaluate the possible, including unintended, impact on freedom of expression and should apply the least intrusive means. State authorities should also ensure that effective redress mechanisms are made available and adhere to applicable procedural safeguards. Any interference by intermediaries with the free flow of information and ideas should be based on clear and transparent policies and be limited to specific legitimate purposes, as determined by law, through an instruction by a competent authority, or in accordance with the company's own content-restriction policies or codes of ethics. When restricting access to content in line with their own content-restriction policies, intermediaries should do so in a transparent and non-discriminatory manner.
Use of personal data
Intermediaries should not disclose personal data to a third party unless required to do so by law, or unless requested to do so by a judicial or other competent authority, subject to judicial review, which has determined that the disclosure is consistent with the law. Intermediaries should limit the processing of users' personal data to what is necessary to achieve a clearly defined purpose, which is explicitly and proactively communicated to all users. The processing of personal data shall be based on the free, specific, informed and unambiguous consent of the user with respect to the specific purpose, or on another legitimate basis laid down in the Convention for the Protection of Individuals with regard to Automatic Processing of Personal Data (Convention 108).
States should guarantee access to effective judicial and non-judicial procedures against all alleged violations of the European Convention on Human Rights in the digital environment by internet intermediaries or third parties.
All terms of service agreements and policies shall be publicly available in clear, plain language and accessible formats. Users shall be notified in advance of all changes to relevant policies. The process of drafting and applying terms of service should be transparent, accountable and inclusive. Intermediaries should provide meaningful public information about the operation of automated data processing techniques, including the operation of algorithms that facilitate searches based on user profiling. Intermediaries should regularly publish transparency reports that provide clear and meaningful information on all restrictions on the free and open flow of information and ideas.
A committee of experts on internet intermediaries (MSI-NET) was set up in 2016 under the supervision of the Steering Committee on Media and Information Society of the Council of Europe (CDMSI) to work on these questions.
The MSI-NET prepared the:
- Recommendation CM/Rec(2018)2 of the Committee of Ministers to member States on the roles and responsibilities of internet intermediaries (also available in Macedonian), adopted by the Committee of Ministers in March 2018.
The committee also issued a:
- Study on the human rights dimensions of automated data processing techniques and possible regulatory implications (Algorithms and Human Rights).
By the end of 2019, the interdisciplinary Committee of experts on human rights dimensions of automated data processing and different forms of artificial intelligence (MSI-AUT) will prepare a draft recommendation on the human rights impacts of algorithmic decision-making processes in the public and private sectors.
The findings of this report demonstrate just how difficult it can be for internet users to understand, and thereby consent to, the terms of service of online platforms, and to make fully informed decisions on issues that affect their human rights, such as content-restriction policies and the processing of personal data.
The Council of Europe commissioned the Swiss Institute of Comparative Law to carry out a comparative study on the filtering, blocking and take-down of illegal content on the internet in the Organisation's 47 member states. The study describes and assesses not only the legal framework but also the relevant case law and practice in the field. It is divided into two main parts: country reports and comparative considerations.
The applicant had been the subject of a defamatory online comment, which had been published anonymously on a blog. The applicant made a civil claim against the small non-profit association which ran the blog, claiming that it should be held liable for the third-party comment. The applicant complained to the Court that by failing to hold the association liable, the authorities had failed to protect his reputation and had violated his right to respect for his private life. The Court held that the complaint was without merit. In cases such as this, a balance must be struck between an individual’s right to respect for his private life, and the right to freedom of expression enjoyed by an individual or group running an internet portal. In light of the circumstances of this case, the national authorities had struck a fair balance when refusing to hold the association liable for the anonymous comment.