Legal instruments to combat racism on the internet
V. SOFT LAW
5.1.2. Codes of Conduct - Mechanism of self-regulation
5.1.3. General terms and conditions of Providers
5.1.4. Governmental Registration Boards and Hotlines
5.1.5. Instruments to trace illegal contents: filtering, rating, labelling
5.2.1. Action Plan on the safer use of Internet
5.2.1.1. Creating a European network of hotlines
5.2.1.2. Encouraging self-regulation and codes of conduct
5.2.1.3. Developing filtering and rating systems
5.2.1.4. Encouraging awareness actions
5.3.2. The Netherlands
5.3.3.1. Code of conduct
5.3.3.2. Tasks and intentions of the FSM-Beschwerdestelle (Complaints Office)
5.3.4.1. Is netiquette legally binding?
5.3.6. United Kingdom
5.3.6.1. Code of Practice for ISPs
5.3.6.2. Technical Aspects of identifying authors of racist material
Because the Internet is such a new and unique medium, people are having difficulty establishing rules for its use. Out of sheer necessity, the users of the Net have, over the period of time since the network was born, tended toward certain rules of network conduct. This code of network ethics has been given many names over the years - the one that has seemed to stick, however, is "netiquette", a conjunction formed from "network etiquette".
The interesting and unique thing about netiquette, in contrast to a hard-and-fast system of rules, is that it allows room for interpretation. From the point of view of an Internet User2, netiquette can be seen as a corollary of the Gentleman's Rule: "An Internet User, while using the Internet, shall conduct himself as a Gentleman and Responsible Citizen." There is nothing to stop someone from abusing the network. As with our daily interactions with those around us, we must face the consequences of our behaviour. If years of network use have produced anything resembling a system of order, it is surely embodied in what is referred to as netiquette.
The netiquette rules as such are very vague and do not specifically mention racism.3 The rules which are closest to our issue might be that nobody shall use a computer to harm other people and one shall use the computer in ways that show consideration and respect. Racial discrimination on the Internet would therefore violate netiquette.
Netiquette covers not only rules for maintaining civility in discussions, but also special guidelines unique to the electronic nature of forum messages. For example, netiquette advises users to use simple formats because complex formatting may not appear correctly for all readers. In most cases, netiquette is enforced by fellow users who will vociferously object if one breaks a rule of netiquette.
Internet Providers are beginning to integrate netiquette into their contracts. For example, a large telecommunications company in Switzerland called Swisscom specifically referred to netiquette in its description of the "bluewindow" Internet Access Services:
"C.1.4 The customer accepts the rules listed in the netiquette (inter alia spamming, mail bombs, transmission of unwanted e-mail advertising) and shall comply with them." (Bluewindow)4
The same provisions may be found in the general terms and conditions of the Austrian Provider Eunet.5
When netiquette is incorporated by reference in the general terms and conditions, it becomes part of the contract and its violation constitutes a breach of contract. The fact that Internet Providers simply refer to netiquette without any further explanation or link to a detailed description implies that they presume the meaning of the word "netiquette" to be common knowledge – a presumption which may or may not be prudent. Before incorporating the precepts of netiquette into a contract, Internet Providers should provide an explanation or at least a link to a description of netiquette. We found an English description of the netiquette guidelines on the website of ISPA Netherlands.6
For the industry to contribute effectively to restricting the flow of illegal and harmful content, it is also important to encourage enterprises to develop a self-regulatory framework through cooperation between them and the other parties concerned. This means that Providers should give no access or hosting to illegal sites. The self-regulatory mechanism should provide a high level of protection and address questions of traceability. Codes of Conduct are internal arrangements of the Providers, who do not directly cooperate with the police. Some Internet Service Provider Associations have installed a hotline where illegal contents can be reported to the Providers, and in this field they sometimes cooperate with the police (see point 5.3. for examples).
In some countries the contracts between Providers and their clients - who buy space in or access to the Internet - are governed by general terms and conditions which incorporate Codes of Conduct. In the case of a breach of the Code of Conduct there is also a breach of the contract with the foreseen consequences (removal, closure, etc). Unified general terms and conditions for all Providers do not yet exist. There are different types of references concerning illegal contents on the net. Some of the General terms and conditions refer to illegal contents without any specification, others specify the prohibited acts such as racism, revisionism, or child pornography:
"User Guidelines for sunrise internet services (sunrise switzerland)7
Legal and illegal use:
… You are under obligation not to use the services provided for committing, or causing to be committed, punishable offences and to take suitable measures to prevent illegal use by your employees or members of your household. This applies in particular to illegal games of chance, money laundering, the publication or making accessible of depictions of violence, so-called hard pornography, incitement to crime or acts of violence, disturbance of religious and cultural freedom, or racial discrimination."
Some include a citation to the specific articles of the relevant laws.
The blue window Internet Access Services - Service Description for
C.1. Information content
C.1.2. In particular, the following illegal information content may not be transmitted or made accessible via the customer's access:
Depiction of violence as defined in Art.135 of the Swiss Penal Code
Pornographic texts, photographs and depictions as defined in Art. 197 Clauses 1 and 3 of the Swiss Penal Code
Racial discrimination as defined in Art. 261bis of the Swiss Penal Code
Incitement to violence as defined in Art. 259 of the Swiss Penal Code
Instruction or incitement to criminal offences or other encouragement of the same
Illegal games of chance (in particular in the scope of the Lottery Act)
Information which infringes copyright, related protection rights or intellectual property rights of third parties.
We found no Code of Conduct with more specific references. This is in fact a weak point of this type of self-regulatory mechanism. The more precisely the Codes of Conduct and the general terms and conditions are defined, the more aware users are that they are violating the law and that their acts are punishable.
In the interest of completeness, we must discuss the limits of Codes of Conduct in general terms and conditions. A very recent case concerns the Provider Yahoo. On 23 February 2000, Yahoo America was accused by the American Anti-Defamation League of not respecting its own Charter of Codes of Conduct concerning illegal racist content on the net. Unlike Yahoo France, Yahoo America did not remove the site where Nazi objects are sold at auction.9 One reason for the American Provider's reticence might be the more liberal approach to freedom of speech in the U.S. (see infra).
An effective way to restrict circulation of illegal material is to set up a network of centres (known as hotlines) which allow users to report content which they come across in the course of their use of the Internet and which they consider to be illegal.10 There are several types of institution to which illegal content may be reported, such as a governmental registration board, a hotline (often installed by Providers or ISPAs), or NGOs which also run hotlines. Responsibility for prosecuting and punishing those responsible for illegal content remains with the national law-enforcement authorities, while the hotlines aim at revealing the existence of illegal material with a view to restricting its circulation. Differences in national legal systems and cultures must also be respected; this means that in different countries some instruments are more easily accepted than others. In some countries an NGO is the first point of contact concerning illegal contents; in other countries the Providers' hotlines are frequently used. Based on the experience with hotlines concerning child pornography, we would like to stress the importance of not having too many different contact points for reporting illegal contents, in order to allow for a comprehensive overview of the subject and to avoid having various institutions or entities working in parallel.
5.1.5. Instruments to trace illegal contents: filtering, rating, labelling11
To promote safer use of the Internet, it is important to make the content easier to identify. This can be done through a rating system which describes the content in accordance with a generally recognised scheme (for instance, where items such as sex or violence are rated on a scale) and by filtering systems which empower the user to select the content he/she wishes to receive. Ratings may be attached by the content provider or provided by a third-party rating service. There are a number of possible filtering and rating systems. However, their level of sophistication is still low and none has yet reached the "critical mass" where users can be sure that content in which they are interested and content which they wish to avoid will be rated appropriately and that perfectly innocuous content will not be blocked.12
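The mechanism just described – a label that rates content per category and a user-side filter that blocks anything above a chosen ceiling – can be sketched in a few lines. The category names, levels and user profile below are purely illustrative and are not drawn from any actual rating system:

```python
# Minimal sketch of rating-based filtering (hypothetical categories and levels).
# A label maps rating categories to severity levels; a user profile sets the
# maximum level acceptable per category. Categories absent from the profile
# are treated as having a ceiling of 0 (nothing rated above zero gets through).

def is_blocked(label, profile):
    """Return True if any rated category exceeds the user's ceiling."""
    return any(level > profile.get(category, 0)
               for category, level in label.items())

# A site's (hypothetical) label and a user's tolerance profile:
site_label = {"violence": 2, "language": 1}
user_profile = {"violence": 1, "language": 3}  # accepts violence only up to level 1

print(is_blocked(site_label, user_profile))  # blocked: violence 2 exceeds ceiling 1
```

The sketch also illustrates the "critical mass" problem mentioned above: the decision is only as good as the labels, so unrated or misrated content is either blocked wholesale or passed through unchecked.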
The labelling13 of a web site is a voluntary step by the publisher of the content or any other operator. It consists of labelling the content of the pages which the site contains and classifying them in various categories. This labelling/classification, which proceeds from the principle that information should be provided about the information, is designed to allow the end user of the computer to filter the contents to which he has access, whether he does this himself (i.e. by deciding not to consult the web pages whose labels do not appeal to him) or by means of purpose-designed software. Such is the nature of self-regulation that the labelling of web sites appears to play an essential role: by this process the user becomes responsible for the contents which he wishes to receive.
In the United States14, two series of initiatives are worth mentioning because they are particularly well known.
The platform of the Recreational Software Advisory Council (RSAC), a non-profit-making association sponsored by the largest firms in the Internet market (IBM, Microsoft, Dell, Disney Online etc.) seeks, in particular, to divide websites into categories according to the types of public. At present the RSAC has classified approximately 50,000 sites, using as labelling criteria violence, sex, language and nudity.
SafeSurf is an organisation set up by Ray Soular and Wendy Simpson in 1995 to protect children on the Internet. A number of factors are taken into consideration for the purpose of labelling: profanity; heterosexual themes; homosexual themes; nudity; violence; sex, violence and profanity; intolerance, glorifying drug use; other adult themes; and gambling.
In Germany the eco (electronic commerce forum)15 acts as a spokesperson and representative for the Internet industry. In 1996 they created a working group called ICTF (Internet Content Task Force) which specialised in scanning and rating of Newsgroups with illegal and harmful content, including racist content. The Providers can denounce Newsgroups which seem to contain illegal material.16
The described instruments should not only protect Internet users from being confronted with racist content on the net, but should also restrict the active searching for racist material via search engines. As an example, let us return to Yahoo: upon typing the keyword "nazi", Yahoo France's search engine will produce only scholarly works on Nazism, whereas Yahoo America's search engine continues to provide references to racist sites.
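In very simplified form, the difference between the two search engines can be pictured as a result filter applied before the hit list is returned. The blocklist, the contextual allowlist and the matching rule below are entirely hypothetical; real search engines use far more elaborate techniques:

```python
# Hypothetical sketch: suppress search hits containing blocked terms, while
# letting material through when a blocked term appears alongside a term from
# a scholarly-context allowlist. Purely illustrative word lists.

BLOCKLIST = {"nazi"}                                   # terms that trigger suppression
ALLOW_CONTEXT = {"history", "scholarly", "research"}   # contextual exemptions

def filter_hits(hits):
    """Keep a hit if it contains no blocked term, or if a blocked term
    appears together with an allowed contextual term."""
    kept = []
    for title in hits:
        words = set(title.lower().split())
        if not (words & BLOCKLIST) or (words & ALLOW_CONTEXT):
            kept.append(title)
    return kept

hits = ["buy nazi memorabilia",
        "scholarly history of nazi Germany",
        "gardening tips"]
print(filter_hits(hits))  # drops only the first hit
```

Even this toy example shows why such filtering is fragile: the distinction between commercial propaganda and scholarly material rests on crude lexical cues.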
5.2.1. Action Plan on the safer use of Internet17
The Action Plan shall cover a period of four years from 1 January 1999 to 31 December 2002. The financial framework for the implementation of the Action Plan for the period from 1 January 1999 to 31 December 2002 has been set at ECU 25 million.
The action lines, in conjunction with the Recommendation on protection of minors and human dignity, are a means of implementing a European approach to safer use of the Internet, based on industry self-regulation, filtering, and rating and awareness. Strong support has been expressed for this approach at the level of the European Parliament and by the Council and Member States, as well as in the wider European context of the Bonn Declaration agreed to by Ministers from 29 European States.
So far, hotlines exist only in a limited number of Member States. Their creation needs to be stimulated so that operating hotlines cover the Union both geographically and linguistically. Mechanisms for exchange of information between the national hotlines, and between the European network and hotlines in third countries, need to be put in place.
In order for this network to develop its full potential, it is necessary to improve cooperation between industry and law-enforcement authorities, ensure Europe-wide coverage and cooperation, and increase effectiveness through exchange of information and experience.
This action will take the form of a call for proposals for participating organisations (20-25) to establish a European network of hotlines, and links between this network and hotlines in third countries, develop common approaches and stimulate transfer of know-how and best practice.
The participating organisations will be supported by a cross-section of industry actors (access and service providers, telecom operators, national hotline operators) and users. They will have to demonstrate a forward-looking and innovative approach, in particular in their relationship with national law-enforcement authorities.
In view of the transnational nature of communications networks, the effectiveness of self-regulation measures will be strengthened, at the European Union level, by coordination of national initiatives between the bodies responsible for their implementation.
Under this action line, it is foreseen to develop guidelines at the European level for codes of conduct, to build consensus for their application, and support their implementation. This action will be carried out through a call for tender to select organisations that can assist self-regulatory bodies to develop and implement codes of conduct. In connection with the establishment of Codes of Conduct, a system of visible "quality-Site Labels" for Internet Service Providers will be encouraged to assist users in identifying providers that adhere to Codes of Conduct. Measures will be taken to carefully monitor progress. This will be done in close coordination with the promotion of common guidelines for the implementation, at the national level, of a self-regulation framework as advocated by the Council Recommendation on Protection of Minors and Human Dignity.
Uptake of rating systems by European content providers and users remains low. The measures under this action line will focus on demonstrating the potential and the limitations of filtering and rating systems in a real world environment, with the objective of encouraging the establishment of European systems and familiarising users with their use. Filtering and rating systems must be internationally compatible and interoperable and developed with full cooperation of representatives of industry, consumers and users18.
In this context we want to mention the INCORE19 project (Internet Content Rating for Europe), funded as a preparatory action to this EU Action Plan, whose aim is to install a system which describes the contents of websites. Funded by the European Commission (DG XIII) and hosted by Microsoft and UUnet, its members are experts from the European Commission and representatives of private enterprises and of private and public lobbying groups.
Awareness is also the necessary complement of the described Action lines, since the actions of industry to implement self-regulation and filtering and rating will bear fruit only if users and potential users are aware of them.
The European Parliament has called for the implementation of a European campaign and an information and awareness action programme, to be funded by the EU budget, to inform parents and all people dealing with children (teachers, social workers, etc.) on the best way (including technical aspects) to protect minors against exposure to content that could be harmful to their development, so as to ensure their well-being.
European action, on the basis of actions undertaken by the Member States, will contribute to reinforcement of synergy, in particular through exchange of information and experience. The Action Plan will initiate awareness actions that will build on the dissemination of information from access providers to customers, and also develop material for use in the education sector.
EuroISPA is the pan-European association of the Internet Services Providers associations of the countries of the European Union. The association was established when a number of such ISP associations signed the EuroISPA Memorandum of Understanding on 6 August 1997 in Brussels. On 10 September 1997 the signatories to the MOU met again and signed the agreement that formed EuroISPA EEIG, thereby creating the largest association of ISPs in the world.
From the aims and objectives:
"EuroISPA is being established to achieve several important purposes. First, to protect and promote the interests of Europe as a whole within the global Internet, securing for Europe a premier position in the key industry of the new Millennium. Secondly, to help deliver the benefits of this new technology of liberation and empowerment to individuals, while at the same time meeting the legitimate concerns of parents and others responsible for the weaker members of society. Thirdly, to encourage the development of a free and open telecommunications market, something of great benefit to society as a whole but essential to the healthy development of the Internet. And finally, to promote the interests of our members and provide common services to them where these cannot be had elsewhere."
At this time the EuroISPA members are Austria, Belgium, Denmark, France, Germany, Ireland, Italy, the Netherlands, Spain, Finland and the United Kingdom.
In Austria there is a Public Registration Board at the Ministry of the Interior (Polizeiliche Meldestelle im Innenministerium) where anyone can report contents which he/she comes across in the course of using the Internet and considers to be illegal. This Registration Board works closely together with the private hotline of the ISPA (Internet Service Providers Austria)21. This cooperation functions very well and is based on an informal agreement between the ISPA and the Registration Board, who installed a so-called Hotline Beirat consisting of representatives of the public and private hotlines. The two institutions exchange Internet addresses with illegal and, especially, racist content in order to eliminate those websites.
Furthermore, the ISPA has developed recommended Codes of Conduct22 for its members. Since the great majority of Austria's Providers are members of the ISPA, the acceptance of this self-regulation mechanism is very high. "To reach such a high level of application of self-regulation mechanisms it is necessary to follow an active information policy," says Karl Hitschmann, a member of the ISPA's management.
The Austrian Internet Provider EU-Net Austria23, for example, went one step further by implementing a paragraph concerning netiquette and illegal contents in its general terms and conditions. EU-Net's clients are therefore bound to respect the legal norms punishing the dissemination of Nazi propaganda; otherwise the contract can be cancelled.
Another reason why all these mechanisms of self-regulation are working quite well is the fact that the Providers fear a certain responsibility for illegal contents. As long as there is no clear definition under the law of who is responsible for illegal contents on the net, the role of the Providers will remain an active one.
As a non-governmental organisation we want to mention "helping hands"24, which has also installed an anti-racism hotline and is actively cooperating with the Discrimination Hotline Internet in the Netherlands (see infra).
5.3.2. The Netherlands25
In the Netherlands a self-regulatory mechanism has been installed between the police and the Providers. Probably comparable with the Austrian institution described above, the "Meldpunt Discriminatie Internet" (Discrimination Hotline Internet, DHI) deals with racist content on the Internet; anyone can report sites with illegal contents there. On its site we could find a link to the Austrian NGO "helping hands".
DHI is a project of the Magenta foundation. The hotline is advised by the Anti Discrimination Bureau Amsterdam (MDA) and the National Attorney Discrimination Expertise Centre, and supported by the Dutch branch organisation for Internet Providers (NLIP), the Ministry of Justice and the Ministry of Internal Affairs. The DHI was founded after an increase in racist and discriminatory statements on the Internet.
By sending a warning/request to remove the material, DHI tries to decrease the amount of racist and discriminatory statements on the Dutch part of the Internet. When material is not removed, DHI files an official complaint with the Dutch Police.
The Association for Voluntary Self-Monitoring of Multimedia Service Providers26 was established with the following aims. The preamble of its Code of Conduct dated from the 9 July 1997 provides:
"The Association for the Voluntary Self-Monitoring of Multimedia Service Providers ("Freiwillige Selbstkontrolle Multimedia-Diensteanbieter e.V."; FSM in short) wishes to make its contribution toward strengthening the freedoms of Service Providers and protecting the valid interests of users and the general public, in particular against race discrimination and the glorification of violence, and to act on the basis of self-responsibility in order to strengthen protection for youth. Any form of censorship will be rejected.
The Association for the Voluntary Self-Monitoring of Multimedia Service Providers wants to encourage Service providers to join in order to make them abide by the principles of the Code of Conduct and punish any violations of this code."
Up to now, approximately 300 German enterprises have agreed to accept the Code of Conduct of the FSM, and new supporters join the association almost every day. In the first year of its existence the FSM had to deal with 200 complaints. One can therefore say that the FSM is the most widely accepted online self-regulation institution in Germany.27
Principles of conduct - Impermissible content 28
"The members of the Association for the Voluntary Self-Monitoring of Multimedia Service Providers shall take all actions, within the scope of legally determined responsibility and to the extent actually and legally possible and reasonable, to ensure that content which is unlawful or impermissible, in particular pursuant to
a) § 130 of the StGB (Incitement to hatred and violence against segments of the population (or minority groups) or publishing insults against them in such a manner as to endanger the peace or to expose them to scorn or contempt);
b) § 130a of the StGB (Incitement to commit crimes);
c) § 131 of the StGB (Depiction of acts of violence, instigation to racial hatred);
… is neither provided nor switched for use."
Anyone is entitled to complain to the Complaints Office of the Association "Voluntary Self-Control for Multimedia Service Providers" with respect to contents which are available on the Internet or on any other networks or via online services. Complaints which are received by a member may be forwarded to the Association.
The commissioner at the Complaints Office (Commissioner) shall be responsible for the initial review of complaints received. In addition, complaints shall be treated by the Complaints Office and by its Chairperson in accordance with §§ 5 et seq of the Beschwerdeordnung where a decision-making procedure is laid down and on the basis of the Code of Conduct instituted by the Association.
The FSM-Beschwerdestelle is not competent to deal with complaints concerning the contents of individual communications, for example insults or pornographic contents communicated by e-mail. It is also not competent concerning the contents of Newsgroups. Under the conditions of § 6a of the Beschwerdeordnung, the head of the FSM-Beschwerdestelle can inform the competent state institutions.
A collaboration with the Police authorities or the public Prosecutor’s Office is in principle excluded. In exceptional cases, if there is a strong suspicion of danger for life and health of the persons shown on the net, especially in cases of child pornography, the competent authorities are informed.
The AFA (Association des Fournisseurs d'Accès et de Services Internet – Association of Internet Access and Service Providers) is made up of the following providers of Internet access and/or services: AOL Bertelsmann France, Cegetel, CompuServe France, FranceNet29, France Pratique, France Telecom Interactive, Grolier Interactive, Imaginet, Infonie, Internet Way, 9 Telecom, Business-Village, chello France, Club-Internet, France Explorer, Freesbee, Isdnet, Lokace, Lyonnais Cable, Magic Online, Uunet France, Wanadoo, World online France and Yahoo! France. The Practices and Uses of January 199830 contain a first code drawn up by its members. There is no express reference to racist sites, but only to the concept of netiquette (see point I.1 of the Practices and Uses).
The AFA has also opened a Point de contact (contact point) to help react to what are presumed to be illegal contents on the Internet. AFA Point de Contact provides information on the criminal provisions applicable to paedophilia and incitement to racial hatred and helps users to understand what they can do when they find illegal contents of that type via the Internet.
Since November 1999 a preliminary inquiry into the establishment of a joint Internet regulatory body has been in progress within the Government Information Service of the Prime Minister of France. Joint regulation represents a combination of market regulation, regulation by the community of users and regulation by law. The joint regulatory body must act with complete independence, but there is no question of establishing an independent administrative authority with competence for the Internet. The main areas for such a body are above all the ethics of the contents, consumer protection and the code of conduct of the actors. The body could be a forum for reflection and information, it could encourage self-regulation and participate in combating illicit contents31.
Reference has already been made to netiquette in the context of a dispute before the courts. In the Estelle Hallyday case the court of first instance considered that “the host is under an obligation to ensure that those he accommodates observe proper moral standards, that they comply with the ethical rules governing the web ...”32. Netiquette was therefore referred to, but it was used in a strange way: compliance is not a matter for the user, who is none the less the theoretical addressee of this primarily moral text, but for the professional responsible for the user. If netiquette continues to be extended in this way, it will be necessary to debate the precise content of netiquette, which at present is particularly vague. Although its use in a civil context is not necessarily a cause for alarm, it might seem more unusual if the provisions of netiquette should be used as an argument in criminal proceedings33.
Several types of contract may well take netiquette into account for the purpose of inserting it into the list of obligations of one or both parties. Netiquette then becomes legally binding, in the same way as an annex setting out general conditions or a reference in the contract to a special clause.
However, the contracts which refer to netiquette are frequently standard-form contracts. That is the case, in particular, of contracts for the provision of Internet access, which are certainly standard-form contracts (it is frequently possible to peruse the contract only after a subscription has been taken out) and which sometimes contain a clause on netiquette, either express or, more commonly, implied. Thus in the general conditions of “Wanadoo”, the access provider service of France Telecom, it is stated that the user must take note of the fact “that the community of Internet users has developed a code of conduct and that any person in breach of that code may be excluded from access to the Internet ...”34. Or again, the Internet access provider does not accept responsibility until the user has taken a positive step to “be familiar with the codes of conduct, uses and rules of behaviour which from time to time are disseminated on the Internet for that purpose”35.
Canadian case-law provides a very interesting example of a case where the court took netiquette directly into consideration; the case was Ontario Inc. v Nexx Online Inc. (Supreme Court of Ontario, Case 1267632/1999). It was summarised as follows by Lionel Thoumyre in the electronic review Juriscom.Net36.
This is the first Canadian decision in a case concerning unsolicited mail (“junk mail” or “spam”) and the implementation of the rules of netiquette.
An Internet service provider in Toronto, Nexx Online, decided to close the accommodation account of its customer, company 1267632 Ontario Inc., which operated the site Beaverhome.com. The following reasons were given: since 31 March 1999 Beaverhome.com had sent more than 200,000 unsolicited messages each day through the services of another service provider. This practice is deemed to be contrary to the rules of the well-known netiquette, to which the accommodation contract expressly referred. The customer none the less considered that Nexx Online was not justified in disconnecting its site and decided to sue for breach of contract.
On examining the terms of the accommodation contract, Judge Janet Wilson pointed out at the outset that there was no obvious clause prohibiting Nexx Online’s customer from distributing unsolicited commercial messages. However, she cited two contractual clauses in favour of the defendant:
1. the customer agrees to observe netiquette. This clause is drafted as follows: “Account Holder agrees to follow generally accepted ‘netiquette’ when sending e-mail messages or posting newsgroup messages ...”;
2. a second clause provides that the customer may have to agree to new contractual provisions being added by Nexx Online (with the option of a refund should he refuse).
The president of Nexx Online informed the customer in August 1998 that unsolicited commercial e-mail could not be distributed through his services.
The significance of the judgment lies essentially in Judge Wilson's argument, which has the effect of conferring legal force on the rules of netiquette by means of the contract37. The judge then concludes that sending unsolicited advertising e-mail is clearly in breach of the emerging principles of netiquette, unless the service provider has expressly allowed it.
Finally, Judge Janet Wilson has no hesitation in finding that the defendant acted in breach of the terms of the contract, in so far as the contract refers the customer to the requirement to comply with the principles of netiquette. Thus the practice of spamming, contrary to the code of ethics in force on the Network, justified disconnecting the Beaverhome.com site: "31 (…) I conclude that sending unsolicited bulk commercial e-mail is in breach of the emerging principles of Netiquette, unless it is specifically permitted in the governing contract. As the rules of Netiquette govern the parties' Contract, the plaintiff is in breach of its terms justifying disconnection of service. Secondly, in the alternative, Nexx is permitted to add terms to the Contract precluding a Nexx client sending unsolicited bulk e-mail directly, or through a third party. If the plaintiffs do not concur with the new term, they are entitled to a rebate of the pro-rated balance of the Contract price, and the defendant is entitled to disconnect service. The defendant has agreed to repay the prorated balance owing under the Contract from April 5, 1999 to August 5, 1999."
It should be observed that in Quebec netiquette may be binding on contracting parties, even in the absence of clauses expressly referring thereto, on the basis of Article 1434 of the Civil Code: “A validly made contract is binding on those who concluded it not only in respect of what they have expressed but also in respect of what follows from it according to its nature and in accordance with custom, equity or the law”.
The Code of Conduct of the ISPA in Belgium38 includes an obligation to observe the law in general and to set up a contact point where illegal contents may be reported.
“The police shall set up a Contact Point39 to receive any complaints relating to any illegal or immoral activity (sexual activity, pornography, paedophilia – although this list is not exhaustive), racism and xenophobia, the negation of genocide, the provocation or encouragement of criminal acts, criminal association, games and lotteries, drugs or similar substances (for example sites offering for sale substances prohibited in Belgium) ... this list is not exhaustive.”
By virtue of the ISPA's very wide coverage of the Internet industry at retail level, the most important item of soft law in the United Kingdom is its Code of Practice, last updated on 15.01.1999.40 This code is contractually binding upon all ISPA members. It mentions racism on the Internet in point 2, headed "General Requirements". According to point 2.2, sub-headed "Decency", member ISPs are obliged to use their best endeavours to ensure that their services and promotional material do not contain material which incites racial hatred or otherwise promotes or facilitates practices which contravene British law. However, this obligation is expressly formulated so as to exclude "Third Party Content". We have received confirmation that the ISPA does not make ISPs in any way responsible for material created by others which they are hosting on their servers. Anyone, including the ISPs themselves, who is unhappy with the contents of any hosted websites or Usenet postings should pass the relevant information to the IWF (Internet Watch Foundation41) to be dealt with.42 The most important element of the Code of Practice, for the purposes of combating illegal material on the Internet, is therefore point 5, headed "IWF", which obliges member ISPs to comply with "take-down notices" issued by the IWF.
The IWF intends to play a more active role specifically concerning racism on the Internet. Under plans announced by the IWF Chairman and its Assistant Chief Executive and supported by the British Minister for Small Business and E-Commerce, Internet sites in the United Kingdom which publish criminally racist material are to be targeted for the first time by the IWF43.
In May 1999, LINX (London Internet Exchange) published a document setting out the "Best Current Practice on Traceability".44 It reflects neither the legal requirements nor the existing practice of the majority of British ISPs. Instead, it sets out goals for ISPs to aim at, so as to improve their ability to trace the source of any material inappropriately placed on the Internet (illegal material, spam, falsely labelled material) and to identify hackers or fraudsters operating over the Internet. This is also relevant to racist sites, as it helps to establish who put the content on the net.
An association of providers and other communications operators, the ANFoV (Associazione per la convergenza nei servizi di comunicazione) has adopted a code of self-discipline45, which has been in force since 1 January 1998. This code lays down procedures for reporting illicit contents, sets up a Self-disciplinary committee and provides for the application of penalties (in particular Articles 13 and 15 to 17). Thus far no racist sites have been reported. The code is not generally applicable, however, but must be accepted by the providers belonging to the association.
A draft self-regulation code intended to have a wider scope was prepared in 1997 by a working group consisting of the Associazione Italiana Internet Providers (AIIP), the Italian member of EuroISPA, and other organisations and associations of providers. The first draft is published on the AIIP’s home page46. It has been the subject of lengthy discussion and a number of amendments have been made: a more recent version, dated 5 March 1998, has been published by the electronic review Interlex47. However, the participants in the working group have been unable to agree on a definitive text, which explains why the ANFoV decided to adopt its own code (see above) and why the text is still at the draft stage.
According to this draft, access providers and contents providers undertake to remove from their servers any manifestly illicit or offensive content (cf. Article 11). The code provides for the establishment of a “self-regulatory board” empowered to take decisions on the implementation of the code and also to impose penalties (Articles 18 to 20). An appeal against the board’s decisions will lie to a committee responsible for implementing the code (Article 21).
According to the information received from the AIIP, the principles forming the basis of the code are observed by the members of the association. However, the control procedure has not been implemented.
In September 1995 a working group formed within the Federal Office of Justice and consisting of senior officials (from, inter alia, the criminal law division, the Federal Institute for Intellectual Property, the Federal Officer for Data Protection and the Federal Office for Information Technology), undertook to investigate the most appropriate ways of combating abuse on the Internet. The experts’ report, which was delivered six months later, was categorical on at least one point: the Swiss legislative machinery should not be reinforced but self-discipline should be encouraged among the operators, in particular access providers, by means of official recommendations48.
This working group deserves recognition for not having merely formulated a general strategy but for having made eleven specific recommendations to access providers. These are essentially based on two types of measure: the blocking of illicit data and contractual restrictions. Thus:
(a) an access provider which has clear evidence that illicit data are being conveyed on its network must take the necessary measures to block consultation of such data; this applies not only to violent, pornographic or racist contents but also to contents which infringe copyright or similar rights;
(b) the access provider must reserve the right under the terms of the contract to cancel the subscription contract of any customer who disseminates illicit contents or allows such contents to be consulted through his connection; similarly, the contract will state that customers are required to observe copyright and similar rights.
For example, passages containing these recommendations have been included in the general conditions of the provider Sunrise49. Sunrise is very active in the area of self-regulation and has set up an e-mail address where cases of breach of the regulations can be reported. In addition, Sunrise has specialists who look for sites which do not comply with their general conditions. Where possible, they block sites with illegal contents.
Finally, there is the initiative known as “Aktion Kinder des Holocaust” in Basel50, which requests Internet providers to block racist sites. Although Sunrise, Datacom and Swisscom-section IP Plus comply with such requests, Swissonline and CompuServe do not do so and do not block Internet sites without a court order.
On 22 December 1999 the Commission for Information Technology (IT-kommissionen) submitted a proposal to the Government for the establishment of an ombudsman for ethics on the Internet, on the model of the ombudsmen already in office in Sweden (cf., for example, the ombudsman for the press or the ombudsman against discrimination). The idea is to promote dialogue with the various actors on the Internet in order to combat illicit contents effectively. At the same time, the Commission states that it is not in favour of drawing up codes of conduct or making recommendations in that regard. The ombudsman, who would be chosen from among persons of integrity enjoying the respect of the actors on the Internet, should be supported in his work by an ad hoc committee which would determine questions of principle. The ombudsman would have no power to take decisions.
The Government have not yet reached a decision on that proposal51.
1. Jougleux Philippe, « La criminalité dans le cyberespace », Thèse de droit des Médias, 1999, p. 127 et seq.; Shea Virginia, Core Rules of Netiquette, Albion Books, San Francisco, 1994, http://www.albion.com/netiquette/book; a description of netiquette in English is also available on the www site of the Netherlands ISPA http://www.nlip.nl/frames/frame2bi.htm : click on the item “netiquette”
2. http://jade.wabash.edu/wabnet/info/netiquet.htm Interpretation by WABnet, the Wabash College Digital Information System, Indiana.
3. http://www.fau.edu/netiquette/net/ten.html The Net: User Guidelines and Netiquette, by Arlene Rinaldi. The Ten Commandments for Computer Ethics, from the Computer Ethics Institute:
1.) Thou shalt not use a computer to harm other people.
2.) Thou shalt not interfere with other people's computer work.
3.) Thou shalt not snoop around in other people's files.
4.) Thou shalt not use a computer to steal.
5.) Thou shalt not use a computer to bear false witness.
6.) Thou shalt not use or copy software for which you have not paid.
7.) Thou shalt not use other people's computer resources without authorization.
8.) Thou shalt not appropriate other people's intellectual output.
9.) Thou shalt think about the social consequences of the program you write.
10.) Thou shalt use a computer in ways that show consideration and respect.
4. See The blue window Internet Access Services - Service Description for "HighWay" http://www2.bluewindow.ch/info/index_e.html see point C.1.4
5. http://www.kpnqwest.at/services/agb.shtml see point 8.4: "The contracting party acknowledges the need to comply with 'netiquette'. Should complaints about the contracting party be brought to KPNQwest from the Internet, KPNQwest shall be entitled, in the event of repetition, to terminate the connection and the contractual relationship with immediate effect. Furthermore, the time required to process the complaints shall be charged to the contracting party at the hourly rate usually invoiced by KPNQwest at the relevant time."
6. http://www.nlip.nl/frames/frame2bi.htm http://www.nlip.nl/index.html Homepage of NLIP: click “beleid+informatie”, then “netiquette”, then “RFC 1855”.
7. http://www.sunrise.ch/en/gen_ter.htm see point 1.2; http://www.ispa.at/ click ISPA Verhaltensrichtlinien (code of conduct); http://www.kpnqwest.at/services/agb.shtml see point 8.11, general terms and conditions of business and delivery of Eunet EDV-Dienstleistungs-Gesellschaft m.b.H.
8. http://www2.bluewindow.ch/info/index_e.html see C 1.2, http://www.fsm.de/english/kodex/index.html see point 2.
9. http://www.zdnet.fr/actu/inte/a0013375.html “Yahoo.com does not always comply with its anti-racist charter”, ZDNet France, Internet Society, 11 March 2000.
10. http://www.fsm.de/bes/form/index.html “Beschwerdeformular” (complaints form) of the association “Freiwillige Selbstkontrolle Multimedia-Diensteanbieter” in Germany; http://hotline.ispa.at “Formular” (form) of the Internet Service Providers Austria
11. For fuller information on filtering systems and the adaptation of the legislative framework of the information society, see the following sites: http://www.csa.fr/avecflash.htm and http://www.csa.fr/html/dos125.htm
12. An example of filtering by self-regulation concerning hypertext links: http://www.droit.umontreal.ca/~farassef/cipertexte
13. http://www.csa.fr/html/dos125.htm : see p. 18.
14. http://www.csa.fr/html/dos125.htm : see p. 19
16. Information from Mr Summa of eco (telephone call, 14 March 2000)
17. Action Plan on Promoting Safer Use of the Internet, Decision No 276/1999/EC of the European Parliament and of the Council of 25 January 1999 adopting a Multiannual Community Action Plan on promoting safer use of the Internet by combating illegal and harmful content on global networks. http://www.echo.lu/home.html
18. http://www.csa.fr/avecflash.htm INCORE working group
19. Description in comparison to other countries (esp. USA, France) http://www.csa.fr/html/dos125.htm Adaptation of the legislative framework of the information society: the response of the CSA (Conseil Supérieur de l'audiovisuel) to the Government’s guidelines, La Lettre du CSA n° 125 - February 2000, France.
22. http://www.ispa.at/ click ISPA Verhaltensrichtlinien (code of conduct)
23. http://www.kpnqwest.at/services/agb.shtml see point 8.4 and 8.11
24. http://www.helpinghands.at/Default.htm click English, click links: there you will find a link to the Magenta foundation in the Netherlands (see organisations against racism)
25. http://www.nlip.nl ISPA Netherlands
31. Preliminary inquiry into the establishment of a joint Internet regulatory body, under the presidency of Christian Paul, Deputy for la Nièvre, November 1999/March 2000, http://www.internet.gouv.fr/francais/index.html : click on “recherche” and type in the keyword “corégulation”.
32. Paris Regional Court, 9 June 1998, Estelle v Valentin and Daniel, available at legalis.net
33. Jougleux Philippe, La criminalité dans le cyberspace, Thèse de droit des Médias, 1999, p. 132.
34. General conditions available at the following address: http://www.wanadoo.fr/wanadoo_et_moi/offre/html/conditions_occ/html See Article 6.
35. General conditions of “Club-Internet”, the Internet service of Grolier Interactive, at: http://www.cybertheque.fr/conditions.html See point 3.1.5
36. Interview on self-regulation with Professor Pierre Trudel, in the “Professionels” space on Juriscom.net, case summarised by Lionel Thoumyre; http://www.juriscom.net/jurisca/spamca.htm
37. First, the court inferred the unwritten netiquette rules on spamming from a series of documents: an article by an American author (John Levine, “Why is spam bad?”: http://spam.abuse.net/spambad/html) and four United States judgments (Cyber Promotion Inc. v American Online Inc., E.D. Pa., Nov. 4, 1996; CompuServe Inc. v Cyber Promotions Inc., S.D. Oh., Feb. 3, 1997; Parker v C.N. Enterprises, Tex. Travis County Dist. Ct., Nov. 10, 1997; Cyber Promotions Inc. v Apex Global Information Services Inc., E.D. Pa., Sept. 30, 1997).
38. http://www.ispa.be/fr/c040201.html see point 3.3 of the Code of Conduct.
39. http://www.ispa.be/fr/c040202.html Protocol on collaboration to combat illicit acts on the Internet: “... 2. An Internet user may report any content which is presumed to be illicit via e-mail (email@example.com) directly to the central judicial contact point, or contact his ISP”.
41. The Internet Watch Foundation (IWF) http://www.iwf.org.uk/about/about.html was launched in late September 1996 by PIPEX founder Peter Dawe to address the problem of illegal material on the Internet, with particular reference to child pornography. It is an independent organisation to implement the proposals jointly agreed by the government, the police, the two major UK service provider trade associations, ISPA and LINX, and Mr Dawe. Science and Technology Minister Ian Taylor welcomed the proposals as "a major industry-led initiative to reassure the public and business that the Internet can be a safe and secure place to work, learn and play."
42. Mr. Nicholas Lansman, ISPA representative.
43. "Watchdog moves to curb racist websites“, The Guardian, 30 January 2000.
44. It is available online at http://www.linx.net/noncore/bcp/illegal-material-bcp.html
48. Le Nouveau média interroge le droit, report by an inter-departmental group on the questions of criminal law, data-protection law and copyright raised by the Internet (http://www.admin.ch/bj/infrecht/internet/inbearbf.htm).
49. http://www.sunrise.ch/gen_ter.htm see 1.2, 6.3 and 6.9
51. The proposal can be accessed at http://www.itkommissionen.se/skrivels/sk991222.html