PICT3011 – Cyber Security in Practice Research Essay

January 27, 2025

Is censorship a tool used by governments to protect their citizens or a means to restrict their freedom? Evaluate the role of censorship and justify your claims.

The challenging and contentious topic of censorship raises the question of how restricting individuals’ freedoms can be reconciled with ensuring their safety. Censorship’s function can change based on the government and its objectives. While some contend that governments employ censorship as a tool to safeguard their citizens, others see it as a way to curtail their freedom. It is critical to consider a variety of viewpoints and supporting evidence when assessing the function of censorship.

For this paper we will first define what we mean by censorship and look briefly at its history and use. We will then offer case studies that cover the roles in question, protection versus restriction, as well as censorship evasion. We conclude that censorship has been used by governments for as long as governments have existed; more recently, the advent of the internet has spawned various programs of digital censorship. Censorship is used by repressive regimes as well as democratic ones, and in fact it is the censorship programs of some democracies that seem the most insidious. While oppressive regimes will censor access to news or dissident information, they often see this as a method of maintaining stability within the country. Democratic governments regularly frame their censorship programs as crime-fighting efforts, most often against Child Abuse Material (CAM) or terrorism. This framing instantly puts anyone objecting to the programs on the side of “pedophiles and terrorists.” Unfortunately, once implemented these programs are subject to scope and function creep, which can result in mass surveillance and the criminalisation of innocent people. Tools such as Tor can be used to evade censorship, and in some cases the discovery of these tools will lead to censorship programs backfiring on the government implementing them.

The English term ‘censorship’ goes back to the office of the Censor established in Rome in 443 BC (Anastaplo, 2023). However, censorship was not invented by the Romans or Greeks. Social and political constraints on speech, writing, and theater were prevalent in many ancient civilizations, from the limitations woven into Chinese ideography to the taboos and traditions maintained regarding symbolic meaning in societies such as ancient Sumeria and Egypt (Moore, 2016). For this paper we will use the definition provided by the Oxford Dictionary of Media and Communication: “any regime or context in which the content of what is publicly expressed, exhibited, published, broadcast, or otherwise distributed is regulated or in which the circulation of information is controlled,” or “a regulatory system for vetting, editing, and prohibiting particular forms of public expression,” or, even more generally, “the practice and process of suppression or any particular instance of this” (Oxford Reference, 2023).

The modern version of this definition emphasizes the institutional use of control and makes a distinction between legally administered regimes and contexts on the one hand and private instances on the other, underlining the complexity of public expression as a subject (Moore, 2016). This emphasis sets it apart from previous definitions, such as those found in other Oxford dictionaries, which show how the word’s Latin roots shaped the concept of the “censor” as a lone operator. The OED’s 1974 definition of “censor” was “An official whose duty it is to inspect books, journals, plays, etc., before publication, to ensure that they contain nothing immoral, heretical, or offensive or injurious to the State,” a usage that dates back to 1644. In addition, a 1914 reference to “one who censors private correspondence (as in time of war)” is included (Oxford English Dictionary, 2023).

Censorship is not limited to state actors. It is not unreasonable to argue that publishers may also have an influence on the kinds of cultural expression that are created and made available, just as businesses and organizations have the power to control the news that is reported to the public. This is how media oligopolies enact private censorship. A notable example of a modern public-private partnership is Google.cn, the China-specific version of Google’s widely used search engine. Under official guidance but “charged to draw the line for itself,” a multinational corporation voluntarily assists in the implementation of the Chinese government’s internet censorship system (Moore, 2016).

In the internet age, digital censorship (the control, regulation, and restriction of information and communication on digital platforms, including the internet, social media, and other technologies) has become a major topic of concern. Digital censorship involves various measures implemented by governments, organizations, or individuals to limit or manipulate the flow of information and control access to certain content. It can also involve online surveillance, as the two serve intertwined roles in the cyber landscape (Earl et al., 2022). With the help of dual-use technologies such as deep packet inspection, operators can monitor content and internet traffic such as email and browser usage. Similar methods can also be used to monitor social media. Traditional surveillance typically targets specific individuals or smaller groups, but digital tools enable surveillance at the internet backbone or internet service provider (ISP) level, so it can happen at a large scale. When it is done at this level, regimes gain extraordinary access to social movements, information on public discontent, and even the actions of state employees whose corruption or incompetence may be fueling unrest (Xu, 2021).
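To make the mechanism concrete, the Python sketch below illustrates keyword-based content inspection, the simplest form of the deep packet inspection described above. The keyword list and payloads are hypothetical, and real systems operate on raw network traffic at far greater scale and sophistication.

    # Minimal sketch of keyword-based content inspection (hypothetical terms).
    BLOCKED_KEYWORDS = {"protest", "dissident"}

    def inspect_payload(payload: bytes) -> bool:
        """Flag a plaintext payload if it contains a blocked keyword."""
        text = payload.decode("utf-8", errors="ignore").lower()
        return any(keyword in text for keyword in BLOCKED_KEYWORDS)

    print(inspect_payload(b"Meeting to organise a protest tonight"))  # True
    print(inspect_payload(b"Weather forecast for tomorrow"))          # False

The same matching logic, applied at the ISP or backbone level rather than on a single machine, is what makes surveillance at this scale possible.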

One viewpoint on censorship holds that governments employ it as a tool to safeguard their populace. This point of view contends that censorship is required to uphold social order, safeguard national security, and stop the spread of objectionable or dangerous content. Censorship is seen as a way to protect society’s welfare because some ideas or information can be harmful. For instance, Ford and Wajcman (2017) discuss how Wikipedia’s infrastructure can support prejudice and false information. They contend that censorship is a useful tool for addressing these problems and ensuring the public has access to inclusive and accurate information. Ironically, the Chinese government frames its use of censorship as a means of maintaining stability by stopping the organization of collective action that might endanger the government (King et al., 2013). China invests a lot of money in blocking websites from other countries in an effort to stop the spread of uncensored material that might be interpreted as regime-threatening. This implies that the purpose of censorship is to manage the dissemination of information and shield citizens from potentially dangerous or destabilising content (Chen and Yang, 2019).

While many tend to concentrate on the use of surveillance in autocratic settings, Western democracies such as the United States and Britain also deploy significant surveillance capabilities. Like autocrats, they employ surveillance for preventive measures, including quelling or restricting protest. Snowden’s information leaks indicate that this also includes monitoring of domestic and international telecom networks (Gellman and Soltani, 2013). When it comes to online censorship in democracies, the United Kingdom (UK) has led the way since the advent of the modern internet. Since the 1990s, the UK has created various regulations that have been influential globally but raise real questions about legitimacy, accountability, and transparency. These patterns include targeting intermediaries, using the national platform to encourage “voluntary” self-regulation, and promoting automated censorship tools such as web blocking (McIntyre, 2018). These measures, when made public, are usually framed as efforts to protect the populace from such things as Child Abuse Material (CAM), terrorism recruitment and organization, and extreme pornography.

The blocking of dubious websites raises a number of issues. Almost invariably, decisions about which URLs to include on the list are made without consulting the website owner (IWF, 2015b). Because the URL List is not available to the public and notice to the site is not required at the time of blocking, ISPs are free to disregard the recommendation, and many have resorted to using fake error messages in place of the block pages that are meant to notify users that a page has been blocked (McIntyre, 2018). Similarly, although the advice is to block only the particular URL on the list, some ISPs have turned to the less complex and less expensive methods of IP address blocking or DNS poisoning.
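As a rough illustration of why the choice of method matters, the hypothetical Python sketch below contrasts blocking granularities: a URL-list check affects a single page, whereas DNS poisoning (and, similarly, IP address blocking) takes out every page under the same domain or server. All blocklists and addresses here are invented; the real URL List is not public.

    from urllib.parse import urlparse

    URL_BLOCKLIST = {"http://example.org/banned-page.html"}  # one specific page
    DNS_BLOCKLIST = {"example.org"}                          # an entire domain

    def blocked_by_url(url: str) -> bool:
        return url in URL_BLOCKLIST

    def blocked_by_dns(url: str) -> bool:
        return urlparse(url).hostname in DNS_BLOCKLIST

    # URL-level blocking stops only the listed page...
    print(blocked_by_url("http://example.org/innocent-page.html"))  # False
    # ...whereas domain-level (DNS) blocking takes the innocent page with it.
    print(blocked_by_dns("http://example.org/innocent-page.html"))  # True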

The introduction of URL blacklists opens the door to automatic surveillance. Under such a system, attempts to access blocked URLs are automatically reported to police (Heal, 2017). In this instance, a system designed to shield users from unintentional access to CAM has been repurposed to potentially criminalize those users. This use of a URL blacklist highlights concerns regarding function creep and the convergence of censorship and surveillance.

Another system described as a ‘game changer’ in the fight against CAM is the Image Hash List. Microsoft PhotoDNA signatures and MD5 hashes are used to create ‘digital fingerprints’ of CAM, which hosting providers can use to identify, delete, or block uploads of CAM images (McIntyre, 2018). Hash matching presents fresh possibilities for wide-ranging censorship. Hash matching systems have the potential to fundamentally alter the dynamics of regulation by allowing the automated identification and removal of already available CAM and limiting the spread of new content by preventing images from being uploaded. With the help of this tool, an image only needs to be classified once, relieving analysts of a repetitive task and freeing up more time for identifying new images and proactive searching (IWF, 2015a).
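A minimal sketch of how such hash matching could work is given below, using plain MD5 only; PhotoDNA is a proprietary perceptual hash and is not reproduced here, and the hash list and upload check are hypothetical.

    import hashlib

    # Hypothetical hash list; real lists are distributed to providers, not published.
    KNOWN_HASHES = {"5d41402abc4b2a76b9719d911017c592"}

    def md5_of_file(path: str) -> str:
        """Compute the MD5 'digital fingerprint' of a file, reading it in chunks."""
        digest = hashlib.md5()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(8192), b""):
                digest.update(chunk)
        return digest.hexdigest()

    def upload_allowed(path: str) -> bool:
        """Reject the upload if the file's fingerprint appears on the hash list."""
        return md5_of_file(path) not in KNOWN_HASHES

Note that an exact hash such as MD5 matches only bit-identical files, which is one reason perceptual signatures like PhotoDNA, designed to survive resizing and re-encoding, are used alongside it.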

Hash lists are already used by major providers: Google scans all messages sent via Gmail (McCormick, 2014), Microsoft uses PhotoDNA to scan every file uploaded to OneDrive (Microsoft, 2023), and several people have been arrested after using these services (Gibbs, 2014). In addition to being disproportionate in and of itself, this kind of indiscriminate surveillance is also easily co-opted for other kinds of content, resulting in function creep. Surveillance of private communications and file scanning could become more commonplace in the context of CAM and be used as a springboard for surveillance for other reasons (McIntyre, 2018).

Although circumventing censorship can be challenging, individuals and groups can do so by employing a range of techniques and tools. Potential strategies include using I2P, Freenet, and Tor. Because they offer decentralized, encrypted, and anonymous internet communications, these services can be used to send and receive sensitive or secret information for both good and ill.

For brevity, this paper will focus on the Tor network. The most well-known dark net product is undoubtedly Tor (The Onion Router). The free software includes a web browser that resembles Mozilla Firefox and can be used to access the dark net, offering even inexperienced users a high degree of anonymity. Tor addresses are made up of random-looking keys ending in “.onion”, which the browser enables users to access. For example, the address “http://zqktlwiuavvvqqt4ybvgvi7tyo4hjl5xgfuvpdf6otjiycgwqbym2qad.onion/” leads to “The Hidden Wiki,” an index of .onion sites. This page will not open in a standard browser such as Microsoft Edge or Chrome, but it will be rendered just like any other webpage in a browser that has Tor enabled. Using a browser that supports Tor has the added benefit of anonymizing your internet activity as you browse. This is achieved by relaying your communication through multiple “nodes”, with layered encryption concealing both your data and your IP address.
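The layered encryption behind this can be sketched conceptually as follows, assuming the third-party Python cryptography package is installed. This is not the real Tor protocol; it only illustrates why each relay can peel off one layer without being able to read the layers beneath it.

    from cryptography.fernet import Fernet

    # One symmetric key per relay in a three-hop circuit (illustrative only).
    relay_keys = [Fernet.generate_key() for _ in range(3)]  # entry, middle, exit

    def wrap(message: bytes, keys) -> bytes:
        """Encrypt for the exit relay first, then wrap outward toward the entry relay."""
        for key in reversed(keys):
            message = Fernet(key).encrypt(message)
        return message

    def peel(ciphertext: bytes, key: bytes) -> bytes:
        """Each relay strips exactly one layer using its own key."""
        return Fernet(key).decrypt(ciphertext)

    onion = wrap(b"GET http://examplehiddenservice.onion/", relay_keys)
    for key in relay_keys:          # entry -> middle -> exit
        onion = peel(onion, key)
    print(onion)                    # the request emerges in full only at the final hop

Because no single relay holds more than one key, none of them can link the sender to both the content and the destination at the same time.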

Tor was first developed in 2002 by the US Naval Research Laboratory. Development has since been taken over by The Tor Project Inc., a nonprofit organization that provides the user-friendly browser packages (Mansfield-Devine, 2014). There are ongoing rumors, and some supporting evidence, suggesting that the Tor network is still connected to US intelligence services (Levine, 2018). That said, leaked NSA (National Security Agency) documents suggest breaking into the Tor network is incredibly challenging (Landau, 2014). The Russian Ministry of Internal Affairs has also released a government tender worth millions of rubles for anyone who can develop a dependable method of breaking Tor’s anonymity (Mansfield-Devine, 2014).

Tor was first created as a secure channel of communication for American intelligence services to use with their operatives abroad. It was then made available to the public as a free service in order to promote unrestricted internet access in circumstances where online censorship is severe or where the fear of persecution deters people from accessing information deemed illegal (Moore and Rid, 2016). The National Science Foundation and other US government organizations still provide the lion’s share of funding to the Tor Project. According to McKim (2012), these organizations justify this support because Tor “provides potential life-saving online security and privacy in places – such as Iran and Syria – where political dissidents are often dealt with harshly”.

The use of Tor is often associated with criminality, as its hidden services include online drug markets and hacker forums, among other things. However, this perception is misleading: as Biryukov et al. (2014) show, many hidden services support freedom of speech, human rights, and access to information that is restricted in nations with authoritarian regimes. They discovered that the total number of hidden services relating to illicit activities was almost equal to that of hidden services relating to other topics, such as message boards where drug users can obtain guidance on responsible use and victims of domestic abuse can gather to receive support (Bancroft and Scott Reid, 2017). These figures indicate that the percentage of services devoted to the most disturbing topics, such as pedophilia and murder, would be extremely small (Greenberg, 2015).

On September 29, 2014, after three days of pro-democracy protests in Hong Kong, the Chinese government blocked access to Instagram from within Mainland China. The demonstrations were against the Chinese government’s proposed electoral reforms. Images of the protests were circulating on Instagram, leading many to theorize that the block was designed to prevent any potential disruption from spreading from Hong Kong to Mainland China. Instagram usage decreased significantly as a result of the block. The Great Firewall prevented about half of the Instagram users in China from accessing the platform. The other half, however, discovered ways to get around the restrictions, mainly by using Virtual Private Networks (VPNs) (Hobbs and Roberts, 2018). As new users began using VPNs, they were also able to access additional political content on Twitter and Wikipedia. For instance, following the Instagram block, views of Chinese-language Wikipedia pages about contentious subjects like Tiananmen Square and Chinese Communist Party leaders increased significantly. Within a day of joining Twitter after the Instagram block, newcomers in China started talking about the protests in Hong Kong (Hobbs and Roberts, 2018). This example demonstrates the unintended consequence of giving many “normal” citizens an abrupt incentive to get around censorship and thus gain access to websites and information that many of them had never seen before or had not previously been interested in (Hobbs and Roberts, 2018).

In this paper we have provided a definition of censorship, briefly examining its history and use. Using case studies from China, the US, and the UK, we examined the roles in question, protection versus restriction, as well as censorship evasion. We showed that censorship has been used by governments throughout history and that the internet has given rise to programs of digital censorship. We highlighted the use of censorship by repressive regimes as well as democratic ones, and the fact that it is the censorship programs of some democracies that seem the most insidious. Finally, we demonstrated that tools such as Tor can be used to evade censorship and that in some cases the discovery of these tools will lead to censorship programs backfiring on the government implementing them.

REFERENCES

Anastaplo G (2023) Censorship. Available at: https://www.britannica.com/topic/censorship (accessed 20 October 2023).

Bancroft A and Scott Reid P (2017) Challenging the Techno-Politics of Anonymity: The Case of Cryptomarket Users. Information, Communication & Society 20(4): 497–512.

Biryukov A, Pustogarov I, Thill F, et al. (2014) Content and Popularity Analysis of Tor Hidden Services. arXiv preprint. DOI: 10.48550/arxiv.1308.6768.

Chen Y and Yang DY (2019) The Impact of Media Censorship: 1984 or Brave New World? American Economic Review. DOI: 10.1257/aer.20171765.

Earl J, Maher TV and Pan J (2022) The Digital Repression of Social Movements, Protest, and Activism: A Synthetic Review. Science Advances 8(10): eabl8198.

Ford H and Wajcman J (2017) ‘Anyone Can Edit’, Not Everyone Does: Wikipedia’s Infrastructure and the Gender Gap. Social Studies of Science. DOI: 10.1177/0306312717692172.

Gellman B and Soltani A (2013) NSA Infiltrates Links to Yahoo, Google Data Centers Worldwide, Snowden Documents Say. Washington Post. Washington D.C.

Gibbs S (2014) Microsoft Tip Led Police to Arrest Man over Child Abuse Images. Available at: https://www.theguardian.com/technology/2014/aug/07/microsoft-tip-police-child-abuse-images-paedophile (accessed 20 October 2023).

Greenberg A (2015) No, Department of Justice, 80 Percent of Tor Traffic Is Not Child Porn. Available at: https://www.wired.com/2015/01/department-justice-80-percent-tor-traffic-child-porn/ (accessed 25 October 2023).

Heal C (2017) ICAlert Launches to Safeguard Schools Against Online Child Abuse. Available at: https://swgfl.org.uk/magazine/icalert-launches-to-safeguard-schools-against-onli/ (accessed 20 October 2023).

Hobbs WR and Roberts ME (2018) How Sudden Censorship Can Increase Access to Information. American Political Science Review 112(3): 621–636.

IWF (2015a) Hash List “Could be Game-Changer” in the Global Fight Against Child Sexual Abuse Images Online. Available at: https://www.iwf.org.uk/news-media/news/hash-list-could-be-game-changer-in-the-global-fight-against-child-sexual-abuse-images-online/ (accessed 27 October 2023).

IWF (2015b) URL List Policies Procedures and Processes. Available at: https://www.iwf.org.uk/media/3dvhyepa/url-list-policies-procedures-and-processes_.pdf (accessed 20 October 2023).

King G, Pan J and Roberts ME (2013) How Censorship in China Allows Government Criticism but Silences Collective Expression. American Political Science Review. DOI: 10.1017/s0003055413000014.

Landau S (2014) Highlights from Making Sense of Snowden, Part II: What’s Significant in the NSA Revelations. IEEE Security & Privacy.

Levine Y (2018) Surveillance Valley: The Secret Military History of the Internet. Cambridge, MA: Perseus Books.

Mansfield-Devine S (2014) Tor Under Attack. Computer Fraud & Security 2014(8): 15–18.

McCormick R (2014) Google Scans Everyone’s Email for Child Porn, and it Just Got a Man Arrested. Available at: https://www.theverge.com/2014/8/5/5970141/how-google-scans-your-gmail-for-child-porn (accessed 10 October 2023).

McIntyre TJ (2018) Internet Censorship in the United Kingdom: National Schemes and European Norms. In: Edwards L (ed.) Law, Policy and the Internet. Oxford, UK: Hart Publishing.

McKim JB (2012) Walpole Company’s Anonymity Software Aids Illicit Deals. The Boston Globe. Available at: https://www.bostonglobe.com/business/2012/03/08/walpole-company-anonymity-software-aids-elicit-deals/n1icZ1d30WjvUmmQqS7vjM/story.html (accessed 28 October 2023).

Microsoft (2023) About Our Practices and Your Data. Available at: https://blogs.microsoft.com/datalaw/our-practices/ (accessed 20 October 2023).

Moore D and Rid T (2016) Cryptopolitik and the Darknet. Survival 58(1): 7–38.

Moore N (2016) Censorship. In: Oxford Research Encyclopedia of Literature. Oxford University Press.

Oxford English Dictionary (2023) censor, n., sense 2.e. Available at: https://www.oed.com/dictionary/censure_n?tl=true.

Oxford Reference (2023) censorship. Available at: https://www.oxfordreference.com/view/10.1093/oi/authority.20110803095558166.

Xu X (2021) To Repress or to Co-opt? Authoritarian Control in the Age of Digital Surveillance. American Journal of Political Science 65(2): 309–325.
