Internet censorship

Internet censorship is the control or suppression of what can be accessed, published, or viewed on the Internet. It may be enacted by regulators or undertaken on the censors' own initiative. Individuals and organizations may engage in self-censorship for moral, religious, or business reasons, to conform to societal norms, due to intimidation, or out of fear of legal or other consequences.

The extent of Internet censorship varies from country to country. While most democratic countries have moderate Internet censorship, other countries go as far as to limit access to information such as news and to suppress discussion among citizens. Internet censorship also occurs in response to or in anticipation of events such as elections, protests, and riots. An example is the increased censorship due to the events of the Arab Spring. Other areas of censorship include copyrights, defamation, harassment, and obscene material.

Support for and opposition to Internet censorship also varies. In a 2012 Internet Society survey 71% of respondents agreed that "censorship should exist in some form on the Internet". In the same survey 83% agreed that "access to the Internet should be considered a basic human right" and 86% agreed that "freedom of expression should be guaranteed on the Internet". According to GlobalWebIndex, over 400 million people use virtual private networks to circumvent censorship or for increased user privacy.


Overview

Many of the challenges associated with Internet censorship are similar to those for offline censorship of more traditional media such as newspapers, magazines, books, music, radio, television, and film. One difference is that national borders are more permeable online: residents of a country that bans certain information can find it on websites hosted outside the country. Thus censors must work to prevent access to information even though they lack physical or legal control over the websites themselves. This in turn requires the use of technical censorship methods that are unique to the Internet, such as site blocking and content filtering.

Views about the feasibility and effectiveness of Internet censorship have evolved in parallel with the development of the Internet and censorship technologies:

  • A 1993 Time Magazine article quotes computer scientist John Gilmore, one of the founders of the Electronic Frontier Foundation, as saying "The Net interprets censorship as damage and routes around it."
  • In November 2007, "Father of the Internet" Vint Cerf stated that he sees government control of the Internet failing because the Web is almost entirely privately owned.
  • A report of research conducted in 2007 and published in 2009 by the Berkman Center for Internet & Society at Harvard University stated that: "We are confident that the [ censorship circumvention ] tool developers will for the most part keep ahead of the governments' blocking efforts", but also that "...we believe that less than two percent of all filtered Internet users use circumvention tools".
  • In contrast, a 2011 report by researchers at the Oxford Internet Institute published by UNESCO concludes "... the control of information on the Internet and Web is certainly feasible, and technological advances do not therefore guarantee greater freedom of speech."
  • Dr Shashi Tharoor, speaking in a quarterly lecture series organized by the Centre for Public Policy Research, an India-based think tank, stated that with the avenues for expressing our views and opinions having been amplified manifold by digital media, freedom of expression comes with immense responsibility.

Blocking and filtering can be based on relatively static blacklists or be determined more dynamically based on a real-time examination of the information being exchanged. Blacklists may be produced manually or automatically and are often not available to non-customers of the blocking software. Blocking or filtering can be done at a centralized national level, at a decentralized sub-national level, or at an institutional level, for example in libraries, universities or Internet cafes. Blocking and filtering may also vary within a country across different ISPs. Countries may filter sensitive content on an ongoing basis and/or introduce temporary filtering during key time periods such as elections. In some cases the censoring authorities may surreptitiously block content to mislead the public into believing that censorship has not been applied. This is achieved by returning a fake "Not Found" error message when an attempt is made to access a blocked website.
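
As an illustration of the last point, the sketch below (Python, hypothetical domain names, simplified logic) shows how a blacklist-based filter might return an ordinary-looking "Not Found" error for blocked hosts rather than an explicit block page, so that users cannot tell censorship has been applied:

```python
# Minimal sketch of blacklist-based filtering that disguises blocking as a
# routine error. The domain names are hypothetical illustrations.

BLACKLIST = {"blocked.example.org", "banned.example.net"}  # static blacklist

def filter_request(host: str) -> tuple:
    """Return an (HTTP status, body) pair for a requested host."""
    if host in BLACKLIST:
        # Mimic an ordinary error so the user cannot tell the site was censored.
        return 404, "Not Found"
    return 200, "<html>original content</html>"

print(filter_request("blocked.example.org"))  # (404, 'Not Found')
print(filter_request("news.example.com"))     # (200, '<html>original content</html>')
```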

Unless the censor has total control over all Internet-connected computers, such as in North Korea (which employs a national intranet that only privileged citizens can access) or Cuba, total censorship of information is very difficult or impossible to achieve due to the underlying distributed technology of the Internet. Pseudonymity and data havens (such as Freenet) protect free speech using technologies that guarantee material cannot be removed and that prevent the identification of authors. Technologically savvy users can often find ways to access blocked content. Nevertheless, blocking remains an effective means of limiting access to sensitive information for most users when censors, such as those in China, are able to devote significant resources to building and maintaining a comprehensive censorship system.

The term "splinternet" is sometimes used to describe the effects of national firewalls. The verb "rivercrab" colloquially refers to censorship of the Internet, particularly in Asia.


Content suppression methods

Technical censorship

Approaches

Internet content is subject to technical censorship methods, including:

  • Internet Protocol (IP) address blocking: Access to a certain IP address is denied. If the target Web site is hosted on a shared hosting server, all websites on the same server will be blocked. This affects IP-based protocols such as HTTP, FTP and POP. A typical circumvention method is to find proxies that have access to the target websites, but proxies may be jammed or blocked, and some Web sites, such as Wikipedia (when editing), also block proxies. Some large websites such as Google have allocated additional IP addresses to circumvent the block, but the block was later extended to cover the new addresses. Due to challenges with geolocation, geo-blocking is normally implemented via IP address blocking.
  • Domain name system (DNS) filtering and redirection: Blocked domain names are not resolved, or an incorrect IP address is returned via DNS hijacking or other means. This affects all IP-based protocols such as HTTP, FTP and POP. A typical circumvention method is to find an alternative DNS resolver that resolves domain names correctly, but domain name servers are subject to blockage as well, especially via IP address blocking. Another workaround is to bypass DNS entirely if the IP address is obtainable from other sources and is not itself blocked, for example by modifying the hosts file or by typing the IP address instead of the domain name in the URL given to a Web browser (a minimal sketch of this workaround appears after this list).
  • Uniform Resource Locator (URL) filtering: URL strings are scanned for target keywords regardless of the domain name specified in the URL. This affects the HTTP protocol. Typical circumvention methods are to use escaped characters in the URL, or to use encrypted protocols such as VPN and TLS/SSL.
  • Packet filtering: TCP packet transmissions are terminated when a certain number of controversial keywords are detected. This affects all TCP-based protocols such as HTTP, FTP and POP, but search engine results pages are more likely to be censored. Typical circumvention methods are to use encrypted connections, such as VPN and TLS/SSL, so that the filter cannot inspect the content, or to reduce the TCP/IP stack's MTU/MSS so that less text is contained in any given packet.
  • Connection reset: If a previous TCP connection is blocked by the filter, future connection attempts from both sides can also be blocked for some variable amount of time. Depending on the location of the block, other users or websites may also be blocked, if the communication is routed through the blocking location. A circumvention method is to ignore the reset packet sent by the firewall.
  • Network disconnection: A technically simpler method of Internet censorship is to completely cut off all routers, either by software or by hardware (turning off machines, pulling out cables). This appears to have been the case on 27/28 January 2011 during the 2011 Egyptian protests, in what has been widely described as an "unprecedented" Internet block. About 3500 Border Gateway Protocol (BGP) routes to Egyptian networks were shut down from about 22:10 to 22:35 UTC on 27 January. This full block was implemented without cutting off major intercontinental fibre-optic links, with Renesys stating on 27 January, "Critical European-Asian fiber-optic routes through Egypt appear to be unaffected for now." Full blocks also occurred in Myanmar/Burma in 2007, Libya in 2011, and Syria during the Syrian civil war. A circumvention method could be to use a satellite ISP to access the Internet.
  • Portal censorship and search result removal: Major portals, including search engines, may exclude web sites that they would ordinarily include. This renders a site invisible to people who do not know where to find it. When a major portal does this, it has a similar effect as censorship. Sometimes this exclusion is done to satisfy a legal or other requirement, other times it is purely at the discretion of the portal. For example, Google.de and Google.fr remove Neo-Nazi and other listings in compliance with German and French law.
  • Computer network attacks: Denial-of-service attacks and attacks that deface opposition websites can produce the same result as other blocking techniques, preventing or limiting access to certain websites or other online services, although only for a limited period of time. This technique might be used during the lead up to an election or some other sensitive period. It is more frequently used by non-state actors seeking to disrupt services.
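
As referenced in the DNS filtering item above, the following sketch (Python, hypothetical host name and address, plain HTTP only) illustrates the hosts-file-style workaround of connecting to a known IP address directly while still sending the original Host header so that a shared server can select the intended site:

```python
# Sketch of bypassing DNS resolution when the real IP address of a site is
# already known from another source. Host name and address are hypothetical.
import socket
import urllib.request

KNOWN_ADDRESSES = {              # equivalent of local hosts-file entries
    "news.example.org": "203.0.113.10",
}

def resolve(host: str) -> str:
    """Prefer a locally known address; otherwise fall back to the system
    resolver, which may be subject to DNS filtering or redirection."""
    return KNOWN_ADDRESSES.get(host) or socket.gethostbyname(host)

def fetch(host: str, path: str = "/") -> bytes:
    ip = resolve(host)
    # Connect to the IP directly, but keep the original Host header so a
    # shared hosting server can serve the intended virtual host. This simple
    # approach covers plain HTTP; HTTPS additionally involves SNI and
    # certificate checks.
    req = urllib.request.Request(f"http://{ip}{path}", headers={"Host": host})
    with urllib.request.urlopen(req, timeout=10) as resp:
        return resp.read()

# Example usage (hypothetical):
# page = fetch("news.example.org", "/index.html")
```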

Over- and under-blocking

Technical censorship techniques are subject to both over- and under-blocking, since it is often impossible to block exactly the targeted content without also blocking permissible material or allowing some access to the targeted material, thereby providing more or less protection than desired. For example, automatic filtering of sexual words in content intended for children, configured to block the word "cunt", has been known to block the Lincolnshire place name Scunthorpe. Another example is blocking the IP address of a server that hosts multiple websites, which prevents access to all of the websites rather than just those that contain content deemed offensive.
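
A minimal sketch (Python, hypothetical word list) of the kind of naive substring matching that produces this over-blocking, alongside a slightly less aggressive whole-word variant:

```python
# Naive keyword filter illustrating over-blocking: matching raw substrings
# rather than whole words also blocks innocuous text such as place names.
import re

BLOCKED_WORDS = {"cunt"}  # hypothetical, minimal word list

def naive_filter(text: str) -> bool:
    """Block if any banned string appears anywhere in the text."""
    lowered = text.lower()
    return any(word in lowered for word in BLOCKED_WORDS)

def word_boundary_filter(text: str) -> bool:
    """Less aggressive variant: match whole words only."""
    lowered = text.lower()
    return any(re.search(rf"\b{re.escape(word)}\b", lowered)
               for word in BLOCKED_WORDS)

print(naive_filter("Scunthorpe United"))          # True  -> over-blocking
print(word_boundary_filter("Scunthorpe United"))  # False -> place name passes
```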

According to a report produced in 1997 by the gay rights group GLAAD, many 1990s-era Internet censorship software products prevent access to non-pornographic LGBT-related material.

Use of commercial filtering software

Writing in 2009, Ronald Deibert, professor of political science at the University of Toronto and co-founder and one of the principal investigators of the OpenNet Initiative, and, writing in 2011, Evgeny Morozov, a visiting scholar at Stanford University and an Op-Ed contributor to the New York Times, explain that companies in the United States, Finland, France, Germany, Britain, Canada, and South Africa are in part responsible for the increasing sophistication of online content filtering worldwide. While the off-the-shelf filtering software sold by Internet security companies is primarily marketed to businesses and individuals seeking to protect themselves and their employees and families, it is also used by governments to block what they consider sensitive content.

Among the most popular filtering software programs is SmartFilter by Secure Computing in California, which was bought by McAfee in 2008. SmartFilter has been used by Tunisia, Saudi Arabia, Sudan, the UAE, Kuwait, Bahrain, Iran, and Oman, as well as the United States and the UK. Myanmar and Yemen have used filtering software from Websense. The Canadian-made commercial filter Netsweeper is used in Qatar, the UAE, and Yemen. The Canadian organization Citizen Lab has reported that Sandvine and Procera products are used in Turkey and Egypt.

On 12 March 2013 in a Special report on Internet Surveillance, Reporters Without Borders named five "Corporate Enemies of the Internet": Amesys (France), Blue Coat Systems (U.S.), Gamma (UK and Germany), Hacking Team (Italy), and Trovicor (Germany). The companies sell products that are liable to be used by governments to violate human rights and freedom of information. RWB said that the list is not exhaustive and will be expanded in the coming months.

In a U.S. lawsuit filed in May 2011, Cisco Systems is accused of helping the Chinese Government build a firewall, known widely as the Golden Shield, to censor the Internet and keep tabs on dissidents. Cisco said it had made nothing special for China. Cisco is also accused of aiding the Chinese government in monitoring and apprehending members of the banned Falun Gong group.

Many filtering programs allow blocking to be configured based on dozens of categories and sub-categories such as these from Websense: "abortion" (pro-life, pro-choice), "adult material" (adult content, lingerie and swimsuit, nudity, sex, sex education), "advocacy groups" (sites that promote change or reform in public policy, public opinion, social practice, economic activities, and relationships), "drugs" (abused drugs, marijuana, prescribed medications, supplements and unregulated compounds), "religion" (non-traditional religions, occult and folklore, traditional religions), .... The blocking categories used by the filtering programs may contain errors leading to the unintended blocking of websites. The blocking of DailyMotion in early 2007 by Tunisian authorities was, according to the OpenNet Initiative, due to Secure Computing wrongly categorizing DailyMotion as pornography for its SmartFilter filtering software. It was initially thought that Tunisia had blocked DailyMotion due to satirical videos about human rights violations in Tunisia, but after Secure Computing corrected the mistake, access to DailyMotion was gradually restored in Tunisia.

Organizations such as the Global Network Initiative, the Electronic Frontier Foundation, Amnesty International, and the American Civil Liberties Union have successfully lobbied some vendors such as Websense to make changes to their software, to refrain from doing business with repressive governments, and to educate schools that have inadvertently reconfigured their filtering software too strictly. Nevertheless, regulations and accountability related to the use of commercial filters and services are often non-existent, and there is relatively little oversight from civil society or other independent groups. Vendors often consider information about which sites and what content are blocked to be valuable intellectual property that is not made available outside the company, sometimes not even to the organizations purchasing the filters. Thus, by relying upon out-of-the-box filtering systems, the detailed task of deciding what is or is not acceptable speech may be outsourced to the commercial vendors.

Non-technical censorship

Internet content is also subject to censorship methods similar to those used with more traditional media. For example:

  • Laws and regulations may prohibit various types of content and/or require that content be removed or blocked either proactively or in response to requests.
  • Publishers, authors, and ISPs may receive formal and informal requests to remove, alter, slant, or block access to specific sites or content.
  • Publishers and authors may accept bribes to include, withdraw, or slant the information they present.
  • Publishers, authors, and ISPs may be subject to arrest, criminal prosecution, fines, and imprisonment.
  • Publishers, authors, and ISPs may be subject to civil lawsuits.
  • Equipment may be confiscated and/or destroyed.
  • Publishers and ISPs may be closed or required licenses may be withheld or revoked.
  • Publishers, authors, and ISPs may be subject to boycotts.
  • Publishers, authors, and their families may be subject to threats, attacks, beatings, and even murder.
  • Publishers, authors, and their families may be threatened with or actually lose their jobs.
  • Individuals may be paid to write articles and comments in support of particular positions or attacking opposition positions, usually without acknowledging the payments to readers and viewers.
  • Censors may create their own online publications and Web sites to guide online opinion.
  • Access to the Internet may be limited due to restrictive licensing policies or high costs.
  • Access to the Internet may be limited due to a lack of the necessary infrastructure, deliberate or not.

Major web portal official statements on site and content removal

Most major web service operators reserve to themselves broad rights to remove or pre-screen content, sometimes giving only a vague general list of reasons for removal, or no specific list at all. The phrases "at our sole discretion", "without prior notice", and "for other reasons" are common in Terms of Service agreements.

  • Facebook: Among other things the Facebook Statement of Rights and Responsibilities says: "You will not post content that: is hateful, threatening, or pornographic; incites violence; or contains nudity or graphic or gratuitous violence", "You will not use Facebook to do anything unlawful, misleading, malicious, or discriminatory", "We can remove any content or information you post on Facebook if we believe that it violates this Statement", and "If you are located in a country embargoed by the United States, or are on the U.S. Treasury Department's list of Specially Designated Nationals you will not engage in commercial activities on Facebook (such as advertising or payments) or operate a Platform application or website".
  • Google: Google's general Terms of Service, which were updated on 1 March 2012, state: "We may suspend or stop providing our Services to you if you do not comply with our terms or policies or if we are investigating suspected misconduct", "We may review content to determine whether it is illegal or violates our policies, and we may remove or refuse to display content that we reasonably believe violates our policies or the law", and "We respond to notices of alleged copyright infringement and terminate accounts of repeat infringers according to the process set out in the U.S. Digital Millennium Copyright Act".
    • Google Search: Google's Webmaster Tools help includes the following statement: "Google may temporarily or permanently remove sites from its index and search results if it believes it is obligated to do so by law, if the sites do not meet Google's quality guidelines, or for other reasons, such as if the sites detract from users' ability to locate relevant information."
  • Twitter: The Twitter Terms of Service state: "We reserve the right at all times (but will not have an obligation) to remove or refuse to distribute any Content on the Services and to terminate users or reclaim usernames" and "We reserve the right to remove Content alleged to be [copyright] infringing without prior notice and at our sole discretion".
  • YouTube: The YouTube Terms of Service include the statements: "YouTube reserves the right to decide whether Content violates these Terms of Service for reasons other than copyright infringement, such as, but not limited to, pornography, obscenity, or excessive length. YouTube may at any time, without prior notice and in its sole discretion, remove such Content and/or terminate a user's account for submitting such material in violation of these Terms of Service", "YouTube will remove all Content if properly notified that such Content infringes on another's intellectual property rights", and "YouTube reserves the right to remove Content without prior notice".

  • Wikipedia: Content within a Wikipedia article may be modified or deleted by any editor as part of the normal process of editing and updating articles. All editing decisions are open to discussion and review. The Wikipedia Deletion policy outlines the circumstances in which entire articles can be deleted. Any editor who believes a page doesn't belong in an encyclopedia can propose its deletion. Such a page can be deleted by any administrator if, after seven days, no one objects to the proposed deletion. Speedy deletion allows for the deletion of articles without discussion and is used to remove pages that are so obviously inappropriate for Wikipedia that they have no chance of surviving a deletion discussion. All deletion decisions may be reviewed, either informally or formally.
  • Yahoo!: Yahoo!'s Terms of Service (TOS) state: "You acknowledge that Yahoo! may or may not pre-screen Content, but that Yahoo! and its designees shall have the right (but not the obligation) in their sole discretion to pre-screen, refuse, or remove any Content that is available via the Yahoo! Services. Without limiting the foregoing, Yahoo! and its designees shall have the right to remove any Content that violates the TOS or is otherwise objectionable."

Circumvention

Internet censorship circumvention comprises the processes used by technologically savvy Internet users to bypass the technical aspects of Internet filtering and gain access to otherwise censored material. Circumvention is an inherent problem for those wishing to censor the Internet because filtering and blocking do not remove content from the Internet, but instead block access to it. Therefore, as long as there is at least one publicly accessible uncensored system, it will often be possible to gain access to otherwise censored material. However, circumvention may not be possible for non-tech-savvy users, so blocking and filtering remain effective means of censoring the Internet access of large numbers of users.

Different techniques and resources are used to bypass Internet censorship, including proxy websites, virtual private networks, sneakernets, and circumvention software tools. Solutions have differing ease of use, speed, security, and risks. Most, however, rely on gaining access to an Internet connection that is not subject to filtering, often in a different jurisdiction not subject to the same censorship laws. According to GlobalWebIndex, over 400 million people use virtual private networks to circumvent censorship or for an increased level of privacy. The majority of circumvention techniques are not suitable for day-to-day use.
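
As one simple illustration of the proxy approach mentioned above, the sketch below (Python, hypothetical proxy address) routes requests through an HTTP proxy assumed to sit outside the filtered network; a real deployment would more commonly rely on a VPN or another encrypted tunnel, which also hides the requested URLs from keyword filters along the path:

```python
# Sketch of fetching a page through a proxy located outside the filtered
# network. The proxy address is a hypothetical placeholder.
import urllib.request

PROXY = "http://198.51.100.7:8080"   # hypothetical proxy in another jurisdiction

opener = urllib.request.build_opener(
    urllib.request.ProxyHandler({"http": PROXY, "https": PROXY})
)

def fetch_via_proxy(url: str) -> bytes:
    """Request the URL through the configured proxy instead of directly."""
    with opener.open(url, timeout=15) as resp:
        return resp.read()

# Example usage (hypothetical):
# page = fetch_via_proxy("http://news.example.org/")
```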

There are risks to using circumvention software or other methods to bypass Internet censorship. In some countries, individuals who gain access to otherwise restricted content may be violating the law and, if caught, can be expelled, fired, jailed, or subject to other punishments and loss of access.

In June 2011 the New York Times reported that the U.S. is engaged in a "global effort to deploy 'shadow' Internet and mobile phone systems that dissidents can use to undermine repressive governments that seek to silence them by censoring or shutting down telecommunications networks."

Another way to circumvent Internet censorship is to physically go to an area where the Internet is not censored. In 2017 a so-called "Internet refugee camp" was established by IT workers in the village of Bonako, just outside an area of Cameroon where the Internet is regularly blocked.


Common targets

There are several motives or rationales for Internet filtering: politics and power, social norms and morals, and security concerns. Protecting existing economic interests is an additional emergent motive for Internet filtering. In addition, networking tools and applications that allow the sharing of information related to these motives are themselves subjected to filtering and blocking. And while there is considerable variation from country to country, the blocking of web sites in a local language is roughly twice that of web sites available only in English or other international languages.

Politics and power

Censorship directed at political opposition to the ruling government is common in authoritarian and repressive regimes. Some countries block web sites related to religion and minority groups, often when these movements represent a threat to the ruling regimes.

Examples include:

  • Political blogs and web sites
  • Lèse majesté sites, sites with content that offends the dignity of or challenges the authority of a reigning sovereign or of a state
  • Falun Gong and Tibetan exile group sites in China or Buddhist, Cao Dai faith, and indigenous hill tribes sites in Vietnam
  • Sites aimed at religious conversion from Islam to Christianity
  • Sites criticizing the government or an authority in the country
  • Sites that comment on political parties that oppose the current government of a country
  • Sites that accuse authorities of corruption
  • Sites that comment on minorities or LGBT issues

Social norms

Social filtering is censorship of topics that are held to be antithetical to accepted societal norms. In particular, censorship of child pornography and of content deemed harmful to children enjoys very widespread public support, and such content is subject to censorship and other restrictions in most countries.

Examples include:

  • Sites that include hate speech inciting racism, sexism, homophobia, or other forms of bigotry
  • Sites seen as promoting illegal drug use (Erowid)
  • Sex and erotic, fetishism, prostitution, and pornographic sites
  • Child pornography and pedophile related sites (see also CIRCAMP)
  • Gambling sites
  • Sites encouraging or inciting violence
  • Sites promoting criminal activity
  • Communist symbols and imagery in Poland, Lithuania, Ukraine, Latvia, Moldova, and Hungary
  • Nazi and similar websites, particularly in France and Germany
  • Sites that contain blasphemous content, particularly when directed at a majority or state supported religion
  • Sites that contain defamatory, slanderous, or libelous content
  • Sites that include political satire
  • Sites that contain information on social issues or "online protests, petitions and campaigns"

Security concerns

Many organizations implement filtering as part of a defense in depth strategy to protect their environments from malware, and to protect their reputations in the event of their networks being used, for example, to carry out sexual harassment.

Internet filtering related to threats to national security that targets the Web sites of insurgents, extremists, and terrorists often enjoys wide public support.

Examples include:

  • Blocking of pro-North Korean sites by South Korea
  • Blocking sites of groups that foment domestic conflict in India
  • Blocking of sites of the Muslim Brotherhood in some countries in the Middle East
  • Blocking Wikileaks
  • Blocking sites such as 4chan thought to be related to the group Anonymous

Protection of existing economic interests and copyright

The protection of existing economic interests is sometimes the motivation for blocking new Internet services such as low-cost telephone services that use Voice over Internet Protocol (VoIP). These services can reduce the customer base of telecommunications companies, many of which enjoy entrenched monopoly positions and some of which are government sponsored or controlled.

Anti-copyright activists Christian Engström, Rick Falkvinge and Oscar Swartz have alleged that censorship of child pornography is being used as a pretext by copyright lobby organizations to get politicians to implement similar site blocking legislation against copyright-related piracy.

Examples include:

  • File sharing and peer-to-peer (P2P) related websites such as The Pirate Bay
  • Skype
  • Sites that sell or distribute music, but are not 'approved' by rights holders, such as allofmp3

Network tools

Blocking the intermediate tools and applications of the Internet that can be used to assist users in accessing and sharing sensitive material is common in many countries.

Examples include:

  • Media sharing websites (e.g. Flickr and YouTube)
  • Social networks (e.g. Facebook and Instagram)
  • Translation sites and tools
  • E-mail providers
  • Web hosting sites
  • Blog hosting sites such as Blogspot
  • Microblogging sites such as Twitter and Weibo
  • Wikipedia
  • Censorship circumvention sites
    • Anonymizers
    • Proxy avoidance sites
  • Search engines such as Bing and Google - particularly in Mainland China and Cuba

Information about individuals

The right to be forgotten is a concept that has been discussed and put into practice in the European Union. In May 2014, the European Court of Justice ruled against Google in Costeja, a case brought by a Spanish man who requested the removal of a link to a digitized 1998 article in La Vanguardia newspaper about an auction for his foreclosed home, for a debt that he had subsequently paid. He initially attempted to have the article removed by complaining to Spain's data protection agency, the Agencia Española de Protección de Datos, which rejected the claim on the grounds that the article was lawful and accurate, but accepted a complaint against Google and asked Google to remove the results. Google sued in Spain and the lawsuit was transferred to the European Court of Justice. The court ruled in Costeja that search engines are responsible for the content they point to, and that Google was therefore required to comply with EU data privacy laws. Google began compliance on 30 May 2014, and on that day received 12,000 requests to have personal details removed from its search engine.

Index on Censorship claimed that "Costeja ruling ... allows individuals to complain to search engines about information they do not like with no legal oversight. This is akin to marching into a library and forcing it to pulp books. Although the ruling is intended for private individuals it opens the door to anyone who wants to whitewash their personal history....The Court's decision is a retrograde move that misunderstands the role and responsibility of search engines and the wider internet. It should send chills down the spine of everyone in the European Union who believes in the crucial importance of free expression and freedom of information."


Around the world

As more people in more places begin using the Internet for important activities, there is an increase in online censorship, using increasingly sophisticated techniques. The motives, scope, and effectiveness of Internet censorship vary widely from country to country. The countries engaged in state-mandated filtering are clustered in three main regions of the world: east Asia, central Asia, and the Middle East/North Africa.

Countries in other regions also practice certain forms of filtering. In the United States state-mandated Internet filtering occurs on some computers in libraries and K-12 schools. Content related to Nazism or Holocaust denial is blocked in France and Germany. Child pornography and hate speech are blocked in many countries throughout the world. In fact, many countries throughout the world, including some democracies with long traditions of strong support for freedom of expression and freedom of the press, are engaged in some amount of online censorship, often with substantial public support.

Internet censorship in China is among the most stringent in the world. The government blocks Web sites that discuss the Dalai Lama, the 1989 crackdown on Tiananmen Square protesters, the banned spiritual practice Falun Gong, as well as many general Internet sites. The government requires Internet search firms and state media to censor issues deemed officially "sensitive," and blocks access to foreign websites including Facebook, Twitter, and YouTube. According to a recent study, censorship in China is used to muzzle those outside government who attempt to spur the creation of crowds for any reason--in opposition to, in support of, or unrelated to the government. The government allows the Chinese people to say whatever they like about the state, its leaders, or their policies, because talk about any subject unconnected to collective action is not censored. The value that Chinese leaders find in allowing and then measuring criticism by hundreds of millions of Chinese people creates actionable information for them and, as a result, also for academic scholars and public policy analysts.

Internet censorship can also be opposed before international bodies: for example, a forthcoming study argues that "Internet censorship is open to challenge at the World Trade Organization (WTO) as it can restrict trade in online services".

Reports, ratings, and trends

Detailed country by country information on Internet censorship is provided by the OpenNet Initiative, Reporters Without Borders, Freedom House, and in the U.S. State Department Bureau of Democracy, Human Rights, and Labor's Human Rights Reports. The ratings produced by several of these organizations are summarized in the Internet censorship by country and the Censorship by country articles.

OpenNet Initiative reports

Through 2010 the OpenNet Initiative had documented Internet filtering by governments in over forty countries worldwide. The level of filtering in 26 countries in 2007 and in 25 countries in 2009 was classified in the political, social, and security areas. Of the 41 separate countries classified, seven were found to show no evidence of filtering in all three areas (Egypt, France, Germany, India, Ukraine, United Kingdom, and United States), while one was found to engage in pervasive filtering in all three areas (China), 13 were found to engage in pervasive filtering in one or more areas, and 34 were found to engage in some level of filtering in one or more areas. Of the 10 countries classified in both 2007 and 2009, one reduced its level of filtering (Pakistan), five increased their level of filtering (Azerbaijan, Belarus, Kazakhstan, South Korea, and Uzbekistan), and four maintained the same level of filtering (China, Iran, Myanmar, and Tajikistan).

Freedom on the Net reports

In the 2011 edition of Freedom House's report Freedom on the Net, of the 37 countries surveyed, 8 were rated as "free" (22%), 18 as "partly free" (49%), and 11 as "not free" (30%). In their 2009 report, of the 15 countries surveyed, 4 were rated as "free" (27%), 7 as "partly free" (47%), and 4 as "not free" (27%). And of the 15 countries surveyed in both 2009 and 2011, 5 were seen to be moving in the direction of more network freedom (33%), 9 moved toward less freedom (60%), and one was unchanged (7%).

The 2014 report assessed 65 countries and reported that 36 countries experienced a negative trajectory in Internet freedom since the previous year, with the most significant declines in Russia, Turkey and Ukraine. According to the report, few countries demonstrated any gains in Internet freedom, and the improvements that were recorded reflected less vigorous application of existing controls rather than new steps taken by governments to actively increase Internet freedom. The year's largest improvement was recorded in India, where restrictions to content and access were relaxed from what had been imposed in 2013 to stifle rioting in the northeastern states. Notable improvement was also recorded in Brazil, where lawmakers approved the bill Marco Civil da Internet, which contains significant provisions governing net neutrality and safeguarding privacy protection.

Reporters Without Borders (RWB)

RWB "Internet enemies" and "countries under surveillance" lists

In 2006, Reporters Without Borders (Reporters sans frontières, RSF), a Paris-based international non-governmental organization that advocates freedom of the press, started publishing a list of "Enemies of the Internet". The organization explains that "all of these countries mark themselves out not just for their capacity to censor news and information online but also for their almost systematic repression of Internet users." In 2007 a second list of countries "Under Surveillance" (originally "Under Watch") was added.

When the "Enemies of the Internet" list was introduced in 2006, it listed 13 countries. From 2006 to 2012 the number of countries listed fell to 10 and then rose to 12. The list was not updated in 2013. In 2014 the list grew to 19 with an increased emphasis on surveillance in addition to censorship. The list has not been updated since 2014.

When the "Countries under surveillance" list was introduced in 2008, it listed 10 countries. Between 2008 and 2012 the number of countries listed grew to 16 and then fell to 11. The list was last updated in 2012.

RWB Special report on Internet Surveillance

On 12 March 2013, Reporters Without Borders published a Special report on Internet Surveillance. The report includes two new lists:

  • a list of "State Enemies of the Internet", countries whose governments are involved in active, intrusive surveillance of news providers, resulting in grave violations of freedom of information and human rights; and
  • a list of "Corporate Enemies of the Internet", companies that sell products that are liable to be used by governments to violate human rights and freedom of information.

The five "State Enemies of the Internet" named in March 2013 are: Bahrain, China, Iran, Syria, and Vietnam.

The five "Corporate Enemies of the Internet" named in March 2013 are: Amesys (France), Blue Coat Systems (U.S.), Gamma International (UK and Germany), Hacking Team (Italy), and Trovicor (Germany).

BBC World Service global public opinion poll

A poll of 27,973 adults in 26 countries, including 14,306 Internet users, was conducted for the BBC World Service by the international polling firm GlobeScan using telephone and in-person interviews between 30 November 2009 and 7 February 2010. GlobeScan Chairman Doug Miller felt, overall, that the poll showed that:

Despite worries about privacy and fraud, people around the world see access to the internet as their fundamental right. They think the web is a force for good, and most don't want governments to regulate it.

Findings from the poll include:

  • Nearly four in five (78%) Internet users felt that the Internet had brought them greater freedom.
  • Most Internet users (53%) felt that "the internet should never be regulated by any level of government anywhere".
  • Opinion was evenly split between Internet users who felt that "the internet is a safe place to express my opinions" (48%) and those who disagreed (49%). Somewhat surprisingly, users in Germany and France agreed the least, followed by users in a highly filtered country such as China, while users in Egypt, India and Kenya agreed more strongly.
  • The aspects of the Internet that cause the most concern include: fraud (32%), violent and explicit content (27%), threats to privacy (20%), state censorship of content (6%), and the extent of corporate presence (3%).
  • Almost four in five Internet users and non-users around the world felt that access to the Internet was a fundamental right (50% strongly agreed, 29% somewhat agreed, 9% somewhat disagreed, 6% strongly disagreed, and 6% gave no opinion). And while there is strong support for this right in all of the countries surveyed, it is surprising that the United States and Canada were among the top five countries where people most strongly disagreed that access to the Internet was a fundamental right of all people (13% in Japan, 11% in the U.S., 11% in Kenya, 11% in Pakistan, and 10% in Canada strongly disagree).

Internet Society's Global Internet User Survey

In July and August 2012 the Internet Society conducted online interviews of more than 10,000 Internet users in 20 countries. Some of the results relevant to Internet censorship are summarized below.

Transparency of filtering or blocking activities

Among the countries that filter or block online content, few openly admit to or fully disclose their filtering and blocking activities. States are frequently opaque and/or deceptive about the blocking of access to political information. For example:

  • Saudi Arabia and the United Arab Emirates (UAE) are among the few states that publish detailed information about their filtering practices and display a notification to the user when attempting to access a blocked website. The blocked websites are mostly pornographic or considered contrary to the respective state and/or to Islam.
  • In contrast, countries such as China and Tunisia send users a false error indication. China blocks requests for a banned website at the router level and a connection error is returned, effectively preventing the user's IP address from making further HTTP requests for a varying time, which appears to the user as a "time-out" error with no explanation. Tunisia has altered the block-page functionality of SmartFilter, the commercial filtering software it uses, so that users attempting to access blocked websites receive a fake "File not found" error page. (A rough probe sketch distinguishing these behaviours appears after this list.)
  • In Uzbekistan users are frequently sent block pages stating that the website is blocked because of pornography, even when the page contains no pornography. Uzbek ISPs may also redirect users' requests for blocked websites to unrelated websites, or to sites similar to the banned websites but with different information.
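
The following rough sketch (Python, illustrative heuristics, hypothetical test URL) shows how a probe might distinguish the behaviours described above: an explicit block page, a fake error page for a page known to exist, or a silent connection failure or time-out:

```python
# Rough censorship-probe sketch. The classification heuristics and the test
# URL are illustrative assumptions, not a definitive measurement method.
import urllib.error
import urllib.request

def probe(url: str, timeout: float = 10.0) -> str:
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            body = resp.read(2048).decode("utf-8", errors="replace").lower()
            if "blocked" in body or "prohibited" in body:
                return "explicit block page"
            return "reachable"
    except urllib.error.HTTPError as err:
        # A 404 for a page known to exist may be a fake "not found" error.
        return f"http error {err.code} (possibly a fake error page)"
    except (urllib.error.URLError, TimeoutError, OSError):
        # Dropped or reset connections surface as time-outs or socket errors.
        return "connection failed or timed out"

print(probe("http://news.example.org/known-good-page"))  # hypothetical URL
```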

Arab Spring

See also: Internet Censorship in the Arab Spring, 2011 Egyptian Internet shutdown, and Free speech in the media during the Libyan civil war

During the Arab Spring of 2011, media jihad (media struggle) was extensive. Internet and mobile technologies, particularly social networks such as Facebook and Twitter, played important new and unique roles in organizing and spreading the protests and making them visible to the rest of the world. An activist in Egypt tweeted, "we use Facebook to schedule the protests, Twitter to coordinate, and YouTube to tell the world".

This successful use of digital media in turn led to increased censorship including the complete loss of Internet access for periods of time in Egypt and Libya in 2011. In Syria, the Syrian Electronic Army (SEA), an organization that operates with at least tacit support of the government, claims responsibility for defacing or otherwise compromising scores of websites that it contends spread news hostile to the Syrian government. SEA disseminates denial of service (DoS) software designed to target media websites including those of Al Jazeera, BBC News, Syrian satellite broadcaster Orient TV, and Dubai-based Al Arabiya TV.

In response to the greater freedom of expression brought about by the Arab Spring revolutions in countries that were previously subject to very strict censorship, Reporters Without Borders moved Tunisia and Egypt from its "Internet enemies" list to its list of countries "under surveillance" in March 2011, and in 2012 dropped Libya from the list entirely. At the same time, there were warnings that Internet censorship might increase in other countries following the events of the Arab Spring. However, in 2013 the Libyan communications company LTT blocked pornographic websites, and even blocked the family-filtered videos of ordinary websites such as Dailymotion.


References

This article incorporates licensed material from the OpenNet Initiative web site.


External links

  • Censorship Wikia, an anti-censorship site that catalogs past and present censored works, using verifiable sources, and a forum to discuss organizing against and circumventing censorship.
  • "Index on Censorship", web site for the London-based organization and magazine that promotes freedom of expression.
  • Internet censorship wiki, provides information about different methods of access filtering and ways to bypass them.
  • "Online Survival Kit", We Fight Censorship project of Reporters Without Borders.
  • "Media Freedom Internet Cookbook" by the OSCE Representative on Freedom of the Media, Vienna, 2004.
  • Discussion of global net filtering, Berkman Center for Internet & Society, Harvard, March 2008.
  • How to Bypass Internet Censorship, also known by the titles: Bypassing Internet Censorship or Circumvention Tools, a FLOSS Manual, 10 March 2011, 240 pp.
  • "How to bypass internet censorship: The current state of internet censorship", The Times of India, 14 November 2013.
  • "Free Speech in the Age of YouTube" in the New York Times, 22 September 2012.

Source of article: Wikipedia