Articles 6-12: Personal Rights. Co-authored by Klaus Stoll and Prof Sam Lanfranco. [1]
Internet governance, like all governance, needs to be founded on guiding principles from which all policy making is derived. There are no better fundamental principles to guide our policy making than those of the Universal Declaration of Human Rights (UDHR). This article is Part 3 of a series exploring the UDHR as a guide and template for digital governance and digital citizenship. [2] We discuss UDHR Articles 6 through 12 and address topics such as fundamental digital values, cyberlaw, policymaking and the role of tribunals in digital governance. [3]
Articles 6 and 7 are closely connected, and we discuss them together.
Article 6: Everyone has the right to recognition everywhere as a person before the law.
Article 7: All are equal before the law and are entitled without any discrimination to equal protection of the law. All are entitled to equal protection against any discrimination in violation of this Declaration and against any incitement to such discrimination.
Article 6 lays the foundation for personal rights as “recognition as a person before the law.” It “recognizes the existence of the individual as a human being with distinct needs, interests, and opinions.” [4] This is “a prerequisite to all other rights of the individual.” [5]
Everyone, Everywhere, Equally
“Everyone, everywhere, equally” is referred to in what follows as the 3e’s. Articles 6 and 7 stress the universality of personal rights and make no distinctions based on race, religion, culture, or gender orientation. Personhood is a broad concept under the UDHR, and the pressing task now is to formally bring digital personhood [6], personal data and “constructed personas” under its protection.
Universality and inclusivity without geographical limitations are fundamental characteristics of cyberspace and the load-bearing pillars of digital citizenship. [7] The 3e’s are “separate but inseparable” as foundational principles of the UDHR and should formally underpin digital citizenship. No citizen should be denied access to, or the protection of, the law, or be forced to give up these fundamental rights regarding their digital data and digital personas in the Internet ecosystem. [8]
“The concept of “sharing” is the DNA of the whole Internet.
Cyberspace has over time developed from being the way to access computers, to share files and become the space to assemble, to express, share and promote ideas and to defend and pursue them as an individual or group without the limitations of space and to a certain extent the limitations of language and culture.
It leads to a “win-win-situation” and does not know losers. If the concept of sharing is ignored or substituted by a 20th century “zero-sum game” with winners and losers, the risk is high, that in an interconnected world at the end of the day everybody is a looser. This is a fundamental lesson from the 50 years of Internet history, which should not be forgotten.” [9]
DNS Values
The 3e’s also apply, in a technical sense, in the operation of the Internet’s Domain Name System (DNS). [10] When the DNS resolves digital address queries, it makes no distinctions and serves everyone, everywhere, equally. The DNS reflects and upholds the most important principle when it comes to the application of law: do not discriminate. [11] The DNS is more than a technical innovation; its operation inherently embodies respect for rights in cyberspace and exhibits integrity in human communication within a trusted system. Attempts to weaken the universality of the DNS through alternative roots, national segments and closed spaces diminish our rights as persons and reduce the role of the Internet ecosystem as a venue for building our shared digital humanity.
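To make the non-discrimination point concrete, here is a minimal sketch in Python (standard library only; the helper function and the domain name are purely illustrative) of the sense in which a name query is answered the same way no matter who issues it: the resolver does not ask who the querier is, where they are, or why they ask.

```python
# Minimal sketch: a DNS name query returns the same public answer for any client.
# The domain name below is purely illustrative; any registered name behaves the same way.
import socket


def resolve(name: str) -> list[str]:
    """Return the addresses the resolver reports for `name`.

    The resolution process does not ask who is querying, from where, or why:
    the same question yields the same answer for everyone, everywhere.
    """
    infos = socket.getaddrinfo(name, None)
    seen, addresses = set(), []
    for _family, _type, _proto, _canonname, sockaddr in infos:
        addr = sockaddr[0]
        if addr not in seen:  # deduplicate while preserving order
            seen.add(addr)
            addresses.append(addr)
    return addresses


if __name__ == "__main__":
    print(resolve("example.org"))
```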
Recognition
Recognition before the law means not only recognition of a human being as a person and a citizen, but also the recognition of the specific circumstances in which one resides, here as digital personas and digital citizens. Laws, regulations and behavior concerning digital existence must respect the (global) borderless residence of our digital beings and treat our digital data and personas as part of our being.
Cyber Law [12]
Cyber law is any law that is applicable in cyberspace. Treatment under the law draws on both codified law and case law. Jurisprudence under digital governance is in its infancy and will develop further over time. So will case law and legal processes regarding issues such as intellectual property, trademark infringement, domain name disputes, cybersquatting, and e-commerce practices. The development of laws on cybercrimes such as hacking, identity theft, cyberbullying, malware, spyware, phishing and pharming, and the development of policies for acceptable practices and behavior in the digital political, economic and cultural spheres, are likewise in their infancy. [13] It is essential that work in such areas has a firm understanding of, and grounding in, the principles embedded in the UDHR as they apply to persons, personal data, and personas in the cyberspaces of the Internet ecosystem. Laws and regulations must address problems as narrowly as possible and avoid unintended consequences for the free flow of information on the Internet.
At present, a general consensus exists that existing international laws and treaties do apply in cyberspace. The discussion focuses more on “how they apply” rather than “if they apply.” [14] Some governments consider existing treaties and national laws adequate. Others see the need to create new laws specific to cyberspace. [15] Given the global nature of the Internet ecosystem, national and regional cyberspace law will often have a relevance that goes beyond the limits of territory-based sovereignty. [16] While states may agree that cyber law is always subject to and guided by national laws and international agreements, we are at the early stages of designing national cyber laws, and of achieving some degree of global harmony across digital policies. The global nature of the Internet ecosystem will likely involve intensive discussion around international agreements regarding digital rights and the meaning of global digital citizenship. There is also a risk in the global online environment that the first country to regulate can, by virtue of being the “first comer,” essentially impose legal rules and potential liabilities on the rest of the world. [17]
Digital Governance and Policy Making
With the explosive growth of the Internet, we observe growing efforts by some nation-states to question the applicability of existing international law in cyberspace. In 2004, the United Nations mandated the “UN Group of Governmental Experts on Developments in the Field of Information and Telecommunications in the Context of International Security” (GGE). [18] The mandate of the GGE was “to consider existing and potential threats in the sphere of information security and possible cooperative measures to address them.” Six GGEs failed to produce a consensus report in 2017 and managed only to outline, but not establish, a global digital agenda and the general principle that international law applies in cyberspace.
Fundamental differences in how states see the role of law in cyberspace became obvious in 2018, when the UN adopted two new resolutions: one mandated another GGE as a continuation of the previous GGEs, with a mandate that included taking “into account the assessments and recommendations” of the previous group, [19] and the other established, in parallel, an “Open-Ended Working Group on Developments in the Field of ICTs in the Context of International Security” (OEWG). The two bodies have overlapping and sometimes contradictory agendas, working methods and remits. [20]
The two UN groups are a manifestation of the large conceptual differences between member states when it comes to cyber law and digital governance. One group of states prioritizes its own sovereignty and protection against perceived cyberthreats and undesired outside influences on its internal affairs. The other group of states prioritizes personal information security over cybersecurity and puts the integrity of the Internet ecosystem, and how digital information is processed, at the top of its agenda. These conflicting tendencies make it very difficult, and unlikely, that a consensus on international or global policies and practices will form.
Another process was recently created within the UN framework during the 74th UN General Assembly. A resolution, initiated by Russia, was adopted to establish an open-ended ad hoc intergovernmental committee of experts “to elaborate a comprehensive international convention on countering the use of information and communications technologies for criminal purposes.” [21] Although its purported intent is to counter cybercrime, this has to be seen as another attempt to hijack larger Internet governance processes under the guise of security.
Wolfgang Kleinwaechter summarized the efforts around digital policy making so far: “In the 2000s, there was a more or less ideological battle between “isms”—multistakeholderism vs. multilateralism—which produced more controversy than progress. In the 2010s, it was widely recognized that both concepts could co-exist… But as the UN Panel has outlined, for the 2020s, this will not be enough. The next generation of Internet Governance will need much more inclusive processes where multilateralism and multistakeholderism have to be treated as two sides of one coin.” [22]
Governing the Killer Robots
Cyberspace is viewed as another theatre of war. Governments have quickly identified the potential of digital technologies for military use in defense of their countries. Since 2014, a group of governmental experts has been negotiating over “Lethal Autonomous Weapon Systems” (LAWS) under the Convention on Certain Conventional Weapons (CCW). [23] Its recommendations could have a great influence on policymaking for cyberspace. The assessment of the legality of AI-based killer robots differs widely between states: some want to ban them like chemical weapons, others support the use of these murderous digital innovations, but none underestimate the impact of digital technologies on warfare. Given the significance of the issue and the explosive expansion of military drone use, it is surprising how little discussion about killer robots takes place in the general global debate.
Digital Policy Making Mechanisms and Tribunals
There exists no specific judicial body for cyberspace. States are trying to fill the void and extend their sovereignty into cyberspace by making the activities of their citizens in cyberspace subject to nation-based laws, some with elements of extraterritoriality. These efforts can only result in inadequate applications of law and expressions of justice: they do not consider the special characteristics of cyberspace and transpose territorially based concepts of law and justice into a digital realm of borderless, universal and inclusive activities and behavior. There is an urgent need to define cyber law, establish mechanisms of enforcement, and create dispute resolution tribunals, all developed through legitimate policy-making processes. To establish legitimate cyberlaws and create competent tribunals, digital citizens must be empowered and engaged in policy-making processes.
Digital Policy Making Mechanisms
Over the years there has been a growing, and increasingly confusing, number of initiatives to establish policy-making mechanisms for cyberspace. [24] Some are UN based. [25] Some are based on national efforts. [26] Some have been initiated by other stakeholders. [27] The number of initiatives is confusing and seems to be limited only by the number of special interests represented in digital governance. Their common characteristic is that they are created as instruments to ensure that one group’s specific interests prevail over those of another: profit over privacy; national interests over global brotherhood; short-term political gains over long-term common good; the list is as long as there are special interests seeking protection.
Another characteristic these initiatives share is their claim to be inclusive and open to all stakeholders while insisting that the common good is uppermost in their minds. This is strongly reminiscent of the proclamation of the pigs who control the government in George Orwell’s novel Animal Farm: “All animals are equal, but some animals are more equal than others.” [28] The parallel exposes the hypocrisy of self-appointed and self-empowered policy-making bodies, bodies that proclaim the absolute equality of all digital citizens but in practice preserve power and maintain the privileges of a small elite.
Enabling Recognition
Some of the reasons why effective digital governance has so far not been established are contained in the wording of Article 6. Recognition before the law now requires the equal recognition of everyone’s rights, everywhere, in the context of one’s life across inseparable literal and digital realms. [29]
What is needed are digital governance models that recognize and enable the equal participation of everyone, from everywhere, in open policy-making processes. The multistakeholder model comes nearest to that ideal, but it can be corrupted through artificial access barriers such as funding. How can civil society stakeholders be effective and independent when they are often the only ones who act as volunteers and depend on funding from other stakeholders to enable their participation? [30] How can trade associations represent all their members when their policymaking is dictated by the interests of their top corporate members? To explore and eliminate the influences of institutional corruption, “...we needed to think about the ways in which systems of incentives, or economies of influence, might advance or deter a collective objective.” [31] The function of digital governance is not just policy-making, but also to create the conditions for it through unprecedented efforts of awareness, engagement and capacity building, with the goal of establishing the 3e’s (everyone, everywhere, equally). Like any good judge, the engagement processes must be neutral and even-handed towards all stakeholders.
Without Discrimination
Article 7 puts a lot of emphasis on protecting citizens against discrimination that stems from violations of their basic human rights. It goes even further by condemning not only active discrimination but also “incitement to such discrimination.” Many exploitative digital business models discriminate and, as discussed in Part 2 of this series, reduce one’s digital engagement to little more than digital slavery. They enable and promote the exploitation of personal data, including surveillance, data mining and constructed digital personas, and contribute to the end result of discrimination.
As long as cyberspace is without just and effective governance mechanisms, it falls to the state to protect the rights of its citizens, including their rights as digital citizens. [32] That does not give states the right to discriminate against the rights of digital citizens by passing laws that are at odds with the notions of rights found in the UDHR, nor to pursue special-interest policies at the cost of the public good. The global nature of cyberspace places limits on the effective reach of states over the rights and obligations of digital citizens, who are literal residents within the boundaries of a state and, at the same time, digital residents of a global space.
Competent Tribunals
Article 8: Everyone has the right to an effective remedy by the competent national tribunals for acts violating the fundamental rights granted him by the constitution or by law.
Article 8 presumes the existence of competent national tribunals and citizens’ rights based in law. So far, no such competent tribunals exist for cyberspace, either nationally or for the global Internet ecosystem. Cyberspace lacks appropriate tribunals where a digital citizen can pursue an effective remedy to digital rights issues. There are attempts to establish cyber tribunals, including tribunals operating at the global level, but they tend to focus on issues of special, often commercial, interest such as intellectual property and domain name disputes. [33] They lack due process mechanisms for end users and struggle to reach the level of legitimacy needed to defend general principles and to serve a multistakeholder constituency base. [34]
Article 8 also states that tribunals must be competent. According to Article 14 of the “International Covenant on Civil and Political Rights” (CCPR), for a tribunal to be competent it must be independent and impartial. Independence means a clear separation from the powers of state, the expertise of actual judicial officers, and the independence of tribunal members from third-party support such as funding. [35] The emergence of pseudo-tribunals, expedited “takedown” courts and other fast-track processes can trample on user rights and diminish the role of traditional judicial tribunals.
None of the past, current or proposed digital governance mechanisms and related tribunals qualifies as “competent.” [36] They are not clearly separated from the institutions they “judge,” and/or they depend for their maintenance on support from those same institutions. [37]
In the context of the Internet ecosystem and cyberspace, multistakeholderism is an important principle that should be deployed as a basic principle of digital governance. As it is currently implemented, however, it is neither independent nor impartial.
Under the guidance of the UDHR, nation-states are currently the first line of defense against what should be codified as violations of the fundamental rights of digital citizens. In the absence of broad policy engagement, these remedies will always be partial and imperfect. Digital governance’s urgent task is the development of competent and impartial policy processes and of competent cyber laws and tribunals.
What Does “Arbitrary” Mean in Cyberspace?
Article 9: No one shall be subjected to arbitrary arrest, detention, or exile.
Article 9 indirectly confirms the right of a competent tribunal (including future competent tribunals under cyber law) to order arrest, detention or exile, if it is not arbitrary. A charge by a competent tribunal may or may not be arbitrary. An arbitrary charge or punishment may appear random, but there is always an underlying cause or reason, a “private agenda.” For example, a locus of power (state, president, party, dictator) may use the instruments at its disposal (including the police, the judiciary, or, now, the communications authority) to intimidate its citizens and embed fear through digital exile and random arrests and punishment.
As aggrieved persons, such as ethnic and gender community groups, resort to digital means to express their concerns, expose problems, and mobilize for action, a more targeted but still arbitrary two-level prosecution is increasingly likely to follow. Individuals and whole communities can experience “digital exile” in cyberspace through the growing practice of authorities suspending both Internet access and cell phone service. Detention and arbitrary arrest can follow, based on activities labeled unwelcome in cyberspace by non-accountable governments.
In cyberspace, in the absence of policies and regulations, corporate power (social media, search engines, apps) has wide scope to introduce terms-of-use rules that must be followed by its subscribers and users. At issue is what justifies, or challenges, the legitimacy of the restrictions and data use policies embedded in the user rules. Rules can be justified when they fairly benefit all digital users and are built on multistakeholder engagement and agreement. If the motive is to enhance corporate power, and users don’t agree with the rules, there is little they can do. Disagreeing can result in suspension, exile and “digital death” on the platform. Digital providers, and any other digital stakeholder, should not be able to arbitrarily introduce rules, or demand the use of unjustified standards and norms, that harm the rights of users by exclusion or by threatening sanctions.
Interference with the rights of digital citizens is justified, and not arbitrary, when digital technologies are used to harm others. Digital technologies are used for hacking, identity theft, cyberbullying, phishing and pharming (often applying age-old criminal techniques like blackmail and extortion). [38] Digital technologies whose use is justified under normal circumstances, such as encryption and Virtual Private Networks (VPNs), can also be used to avoid detection and punishment in the exercise of cybercrime.
Arbitrary acts and overly broad laws can impose the blunt-force remedy of removing an individual’s, or an entire household’s, access to Internet services, which results in a form of digital exile. Terminating users’ Internet access, along with content filtering and “stay down” regimes imposed without due process protections for citizens, results in unjust digital exile.
Avoiding arbitrary acts in cyberlaw will require intense dialogue and consultation in competent digital policy making processes.
Tribunal Characteristics
Article 10: Everyone is entitled in full equality to a fair and public hearing by an independent and impartial tribunal, in the determination of his rights and obligations and of any criminal charge against him.
Article 10 describes the characteristics of the tribunal a citizen is entitled to: equality, fairness, independence, impartiality. The role of the tribunal is restricted to the determination of rights and obligations and of any criminal charge.
There is an inherent conflict between national interests based on sovereignty over physical territories and the digital citizen’s presence in virtual and borderless cyberspace. Attempts by nation-states to assert their territory-based sovereignty in cyberspace and over one’s digital citizenship violate Article 10, especially if such attempts take place outside the normal judicial systems. For example, administrative agencies that grant their employees judicial-like powers to exercise broad take-down and injunctive powers over Internet users violate users’ right to full, fair and equal treatment by an independent and impartial tribunal. [39]
Presumed Innocence but Assumed Guilt
Article 11: (1) Everyone charged with a penal offence has the right to be presumed innocent until proved guilty according to law in a public trial at which he has had all the guarantees necessary for his defense.
Article 11: (2) No one shall be held guilty of any penal offence on account of any act or omission which did not constitute a penal offence, under national or international law, at the time when it was committed. Nor shall a heavier penalty be imposed than the one that was applicable at the time the penal offence was committed.
As discussed above, some states are trying to mitigate the current lack of cyberlaw with their own legislation. [40] In doing so, it is important that they afford citizens the right to due process and the right to be presumed innocent until the high bar of being proven guilty is met. States have the power to regulate access to cyberspace, digital content, applications, and activities through legislation and/or prescribed technical barriers. Suspensions of Internet and cell phone access have become increasingly common, as has intensive surveillance by the state and by commercial interests.
There is a pressing need for multistakeholder dialogue in the formulation of policies, and for oversight in policy implementation. States must take care in how they approach cybercrime and security, and must avoid presuming misuse just because the possibility of misuse exists. All legislation needs to be evidence-based, rights-sensitive and motivated by the public interest and the common good. It cannot be based on bolstering strategic economic and political self-interests at the expense of digital rights.
It cannot be stressed enough that there is a need for an engaged discussion among all digital stakeholders about what principles need to be honored (with regard to the UDHR) and what proper policies need to be enacted for dealing with the issues surrounding digital technologies. Data collection, storage, and surveillance, and AI-driven digital persona assembly open countless questions about digital rights and digital obligations.
When the drafters of the UDHR penned Article 11, memories of Nazi Germany were still fresh. In Nazi Germany, millions were charged and assumed guilty for racial, political or economic reasons, based on a shaky foundation of unjust laws. Processes and judgments took place behind closed doors and without due process. [41] The drafters had lengthy discussions on the second paragraph of Article 11. [42] A ban on retroactive law was part of many constitutions at the time, and the pressing question was whether that meant that the recent Nuremberg trials of Nazi criminals were illegal. [43] At the time of drafting, it was not generally accepted that leaders who had so deeply and widely abused human rights in the name of their ideologies could be held responsible and face an international tribunal.
Today, in a much less charged environment, we are facing a reverse ordering of the process. States are drafting national legislation about the rights and obligations of persons or entities engaged in global digital behavior. The questions of adequate policy development engagement, appropriate legislation, and global reach are acute today with regard to policies governing behavior in the local and global digital spaces of the Internet ecosystem. [44]
While the vision of the Internet was one of “free and open to all,” self-interested stakeholders have used the language of “digital disruption” and “unregulated innovation” to engage in digital business practices at the expense of digital rights. [45] In so doing, they enabled and established digital exploitation and slavery. Going forward, the quest is for legislation and regulations that balance the protection of digital rights with incentives for promoting innovation.
One important question is whether past offenses against human rights in cyberspace should go unpunished, or uncompensated, because of a lack of legislation in place at the time. Do those whose personas have been harmed receive no compensation while the perpetrators continue to build empires based on their ill-gotten gains? Are we to forgive and forget, or should we aim for international digital crimes tribunals and restitution? These issues are not explored here.
Crimes of Omission
Many of the violations of UDHR rights in cyberspace rest not only on acts of commission but also on acts of omission. In order to enable and sustain predatory digital business practices, the technological tools necessary to protect the privacy and security of digital citizens have not been implemented. Much of data security has to do with the holder of the data ensuring that it is not accessed by competitors or cybercriminals. Even the form and content behind the “consent button” or opt-out options on most applications are dense and, in most cases, exploitative in the extreme. Users are ill-informed about the real-life consequences of data use and persona assembly when they push the ever-present “I Agree” button on a digital application or website. They have no idea of the rights they are giving away or the forces to which they are being exposed.
Where Is the Line?
Article 12: No one shall be subjected to arbitrary interference with his privacy, family, home or correspondence, nor to attacks upon his honor and reputation. Everyone has the right to the protection of the law against such interference or attacks. [46]
Never has there been a technology, short of slavery or incarceration, more suitable for interference with a person’s privacy. Other than slavery, there has never been a process or technology that made the systematic invasion of a person’s privacy, and the exploitation of the person, the foundation of its business model. At the same time, never has there been a technology with more promise to benefit humankind and help to solve some of its central problems, here through the collection and analysis of anonymized data. Also, never has there been a technology with the potential to “violate rights, undermine privacy, polarize societies and incite violence.” [47] With the rise of digital technologies, the private sector, for its own purposes, joins the state and other entities in the surveillance of citizens. The collection and processing of personal data is the basis of the digital business practices that feed the revenue and growth of the globe’s largest digital and, increasingly, non-digital entities.
The drafters of the UDHR could not have foreseen the importance and relevance of the extension of privacy beyond one’s limited personal data, the access by others to family, work and community data, or the emergence of data-intensive technologies such as social media, email and web browsing and the resulting data mining and digital persona assembly.
By the word “arbitrary,” Article 12 (just as Article 9) does not treat the right to privacy as absolute: interference with the right to privacy must be necessary, legitimate and proportional. Nor did the drafters foresee personal data and constructed digital personas as property that could be traded, bought or sold without the owner’s permission, a practice akin to being lured or seduced into selling oneself into digital slavery. The questions that arise here need to be asked at both the legal and ethical levels.
The Special Role of Privacy in Cyberspace
Privacy assumes new importance in cyberspace. In a pre-digital world, privacy was easier to maintain, and the right to privacy was respected by laws and regulations; laws were in place for cases where it was necessary to infringe on it. Much of one’s time spent on business, community and social activities on multiple fronts went unnoticed and unnoted, offering little scope for information retrieval. Even forensics tended to work with minimal data sets. The emergence of cyberspace has opened previously unknown possibilities of mass data accumulation and surveillance by various parties. Every keystroke, finger swipe on a cell phone, location reading and footstep is collected, stored, identified and tagged. This opens a wide field for behavioral manipulation, surveillance capitalism and endless exploitation. Commercial misuses of personal data and the construction of use-specific personas take place without meaningful user permission and consent and without oversight.
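As a purely illustrative sketch (the device identifier, event types and values below are hypothetical), the following few lines show the mechanism in miniature: routine events, each keyed to an identifier and tagged by type, accumulate into a behavioral profile without any action by, or visibility for, the person they describe.

```python
# Hypothetical sketch: routine events keyed to a device identifier accumulate
# into a behavioral profile the user never sees.
from collections import defaultdict

# Each tuple is (device_id, event_type, value), as an app or site might log it.
events = [
    ("device-42", "location", "55.75,37.61"),
    ("device-42", "search", "back pain remedies"),
    ("device-42", "location", "55.76,37.62"),
    ("device-42", "purchase", "sleep aid"),
]

# Group every logged event by identifier and tag; this accumulated record is
# the raw material for targeting, scoring and persona construction.
profiles = defaultdict(lambda: defaultdict(list))
for device_id, event_type, value in events:
    profiles[device_id][event_type].append(value)

print({tag: values for tag, values in profiles["device-42"].items()})
```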
How such mass data should be treated is very much a subject for discussion. What should be protected as private depends on proposed use models. What one party sees as mass surveillance, social engineering and manipulation, other parties see as the imperative and opportunity to prevent crime, heal the sick, or support innovation. Even more technical applications, such as artificial intelligence, can make decisions about which among the neediest in society receive social welfare benefits. Yet AI is also seen as being in the service of solving humankind’s most pressing problems. Digital technology puts who does what with what data at the center of a society’s concerns about personal and social integrity and progress. The resulting agenda of concerns should be set at the center of society’s policy discussions.
Fighting the Bias: Technological Solutions for Technological Problems?
Encryption ensures the privacy of personally generated data, but it does not solve the privacy dilemma, for two reasons: much of one’s data is generated beyond one’s direct engagement, and authorities will always insist on the need for back-door access to data for reasons of security. One example is Apple’s refusal to help the US DOJ unlock two iPhones used by the Saudi shooter who killed three people at a Navy base in Pensacola, Florida; recent developments show that a company’s principled stance does not mean that it will last under pressure. [48] Like encryption, anonymized personal data requires that the digital citizen trust and authorize some competent digital governance institution, based in the state, the private sector or otherwise, to oversee and control the processes. This is further complicated by the fact that such “anonymized” data can easily be reconstructed using just a few known data points. AI-enhanced techniques, facial recognition software, and other tools can even more easily reconstruct identifiable identities for various uses. Neither the literal nor the digital citizen has control over the algorithms used or the intended uses of such digital personas. While this is a technology-enabled problem, there is no technology-enabled solution to either the data privacy or the digital persona issue. The solutions will always require a blend of policy-based governance and the development of trust around acceptable social norms of behavior by all involved in the cyberspaces of the Internet ecosystem.
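The claim that “anonymized” data can be reconstructed from a few known data points can be illustrated with a minimal, hypothetical sketch: the records, names and fields below are invented, but joining a de-identified dataset against a handful of publicly known quasi-identifiers (such as ZIP code, birth year and sex) is the standard linkage re-identification pattern.

```python
# Hypothetical sketch: re-identifying an "anonymized" record by joining it
# against a few quasi-identifiers knowable from public sources.
anonymized_health_records = [
    {"zip": "10001", "birth_year": 1975, "sex": "F", "diagnosis": "diabetes"},
    {"zip": "10001", "birth_year": 1982, "sex": "M", "diagnosis": "asthma"},
    {"zip": "94105", "birth_year": 1975, "sex": "F", "diagnosis": "hypertension"},
]

# A handful of data points about a known person (e.g. from voter rolls or social media).
known_person = {"name": "Jane Doe", "zip": "94105", "birth_year": 1975, "sex": "F"}

matches = [
    record for record in anonymized_health_records
    if (record["zip"], record["birth_year"], record["sex"])
    == (known_person["zip"], known_person["birth_year"], known_person["sex"])
]

if len(matches) == 1:
    # A unique match links the "anonymous" diagnosis back to a named individual.
    print(f"{known_person['name']} -> {matches[0]['diagnosis']}")
```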
We have identified universality and inclusivity as fundamental characteristics of cyberspace and the need for digital governance models that recognize and enable the participation of everyone, from everywhere, in open policy-making processes. We looked at the need for and the attributes of independent and competent tribunals, but every political or technological solution will always require human trust. Trust involves mutual understanding and respect, attitudes that flow from open engagement in dialogue among diverse stakeholder digital citizens.
Is it surprising to see how fundamental principles of the UDHR, such as universality and all-inclusiveness, find their reflection in fundamental principles for governing digital technologies? In many respects, we are confronting the same issues as were confronted in the drafting of the UDHR, albeit in the cyberspaces of the Internet ecosystem. The goal is to be able to translate these principles into the policies and practices that govern behavior in the digital era.
Cyberspace will always have contested areas, and the decisions around policies and practices will be ongoing, much as ongoing case law enriches the meaning and understanding of legislated laws and policies. Fundamental principles are always navigational aids and aspirational. For the next steps on the way to enshrining fundamental principles for the rights of our digital citizenship, we have to establish legitimate mechanisms for digital governance, inclusive mechanisms created by and for “We the people…” in both a literal and a digital sense.
In Part 4 of this series, we will discuss Articles 13-17 as they apply to political rights and will explore topics such as the notions of the responsible and accountable cyberstate and empowered digital citizenship.
[1] The authors contributed this article solely in their personal capacity, to promote discussion around the UDHR, digital rights and digital citizenship. The authors would also like to thank Sarah Deutsch for her valuable contributions to the article. The authors can be reached at [email protected] and [email protected].
[2] Parts 1 and 2 are available at: http://www.circleid.com/posts/20191210_internet_governance… and http://www.circleid.com/posts/20200106_internet_governance…
[3] This series of articles is presented a bit like preparing the foundation for a house; here the house is the “house of regulations and rights” in the digital age. An understanding of the desired digital rights, and of the pitfalls of policy and regulation, is required to build a sturdy and relevant platform of digital rights.
These articles are also a contribution to the upcoming 75th UN UDHR anniversary and a start of an Internet ecosystem-wide discussion around digital rights and policy development. Comments are welcomed. (Send comments with “UDHR” in the subject line to [email protected].) Comments will be used to update this digital rights discussion in subsequent articles. The goal is to kickstart progress toward a much-needed International Covenant on Digital Civil and Political, Economic, Social and Cultural Rights.
[4] Margaret Edith Brett, The Right to Recognition as a Person before the Law and the Capacity to Act under International Human Rights, page 9, LLM in International Human Rights Law, Irish Centre for Human Rights National University of Ireland, Galway August 2012, https://www.chiark.greenend.org.uk/~chrisj/Right…
[5] Geraldine Van Bueren, The International Law on the Rights of the Child (Martinus Nijhoff 1995), 40; Manfred Nowak, U.N. Covenant on Civil and Political Rights: CCPR Commentary (2nd revised edn, N.P. Engel 2005), 369.
[6] We have discussed our digital persona in Part 1: “The advances brought by digital technologies have created a new multi-faceted dimension to our digital personas. Our physical persona is something that is gifted to us by our birth. Our digital personas are created by digital technologies. In parallel to our physical persona, with very few exceptions, all people are simultaneously acquiring multiple digital personas. They consist of digital data constructs (personas) that are linked to our unique literal being as a human. Often attached to these digital personas are human or machine-imposed value judgments affecting one’s real world reputation, personal information, credit or risk worthiness, and other indicia affecting basic human welfare.” For further information see section “The Digital Persona” in part 1.
[7] This area is more critical in the face of greater use of digital/Internet shutdowns to deal with domestic issues.
[8] We have discussed this before in Part 1 under the heading “Liberty”.
[9] Wolfgang Kleinwächter, Internet Governance Outlook 2020: The Next Generation of Players and Problems Is Coming, CircleID, http://www.circleid.com/posts/20200107_internet_governance_outlook…
[10] For further information about the DNS see: https://en.wikipedia.org/wiki/Domain_Name_System and https://www.icann.org.
[11] This is one of the main distinctions between the DNS search and the use of search engines where algorithms always contain elements of human bias, discretion and ambiguity.
[12] The topic of cyber law, and its future development, is young, broad and important. It merits careful attention so that it does not codify regulations that impact adversely on the digital rights of persons.
[13] For more information on the international context of cybercrimes treaties and initiatives see footnote 29 below.
[14] For example, one of the major issues between States is the application of International law to Cyber Warfare. See: Tallinn Manual on the International Law Applicable to Cyber Warfare. en.wikipedia.org/wiki/Tallinn_Manual
[15] The British Computer Misuse Act 1990 is an early example; for further information see: https://en.wikipedia.org/wiki/Computer_Misuse_Act_1990. A recent example is the EU General Data Protection Regulation (GDPR); for further information, see: https://ec.europa.eu/info/law/law-topic/data-protection_en
[16] For example see Forbes Magazine article, “15 Unexpected Consequences Of GDPR”, Aug 15, 2018, https://www.forbes.com/sites/forbestechcouncil/2018/08/15/15-unexpected-consequences-of-gdpr/#7b3a9e0894ad
[17] The European Union’s GDPR could be seen as a prime example.
[18] For further Information see: https://www.un.org/disarmament/ict-security/
[19] See https://digitallibrary.un.org/record/799853 “In its resolution, the General Assembly requested that a group of governmental experts be established in 2014, on the basis of equitable geographical distribution, to continue to study, with a view to promoting common understandings, existing and potential threats in the sphere of information security and possible cooperative measures to address them, including norms, rules or principles of responsible behaviour of States and confidence-building measures, the issues of the use of information and communications technologies in conflicts and how international law applies to the use of information and communications technologies by States, as well as the concepts aimed at strengthening the security of global information and telecommunications systems. The Group was also asked to take into account the assessments and recommendations of a previous Group (see A/68/98). The Secretary General was requested to submit a report on the results of the study to the Assembly at its seventieth session.”
[20] For further information see footnote 13 above. The GGE experts met in a closed-door format with no observers permitted. The work of the GGE is further limited by the mandate of the General Assembly, which “mandates the work of the GGEs squarely in the realm of international security and disarmament, and thus not as a technical exercise”. The GGE also “decided that the issues not under the purview of the First Committee - such as espionage, Internet governance, development and digital privacy - are not the focus of the Groups work”.
The Open-Ended Working Group started in June 2019 and is open to all UN member states. The OEWG holds consultative meetings with other stakeholders from the private sector, civil society and academia, which can also apply to attend the meetings. The OEWG addresses six substantive issues: 1. Existing and potential threats; 2. International law; 3. Rules, norms and principles; 4. Regular institutional dialogue; 5. Confidence building measures; and 6. Capacity building. The aim of the OEWG is to develop reports on a consensus basis.
[21] https://digitallibrary.un.org/record/3831879
[22] Wolfgang Kleinwächter, Internet Governance Outlook 2020: The Next Generation of Players and Problems Is Coming, CircleID, http://www.circleid.com/posts/20200107_internet_governance_outlook_2020…
[23] For further information see: https://www.unog.ch/80256EE600585943/...
[24] For example: The Global Commission on the Stability of Cyberspace (GCSC), https://cyberstability.org/; the Geneva Internet Platform, https://www.giplatform.org/
[25] For example the World Summit on the Information Society (WSIS), https://www.itu.int/net/wsis/; or the Internet Governance Forum (IGF), https://www.intgovforum.org/multilingual/
[26] For example the Paris Peace Forum, https://parispeaceforum.org/; or the Geneva Dialogue, https://genevadialogue.ch/. One of the more enlightened and promising initiatives is the Internet & Jurisdiction Policy Network, https://www.internetjurisdiction.net/
[27] For example the Internet Corporation for Assigned Names and Numbers (ICANN), https://www.icann.org/ (domain name industry); the World Economic Forum (WEF), https://www.weforum.org/ (neo-liberal capitalism); Tim Berners-Lee’s “Contract for the Web”, https://contractfortheweb.org/; the Cybersecurity Tech Accord, https://cybertechaccord.org/; the Cyber Peace Institute (CPI), https://cyberpeaceinstitute.org/; and the Global Forum on Cyber Expertise (GFCE), https://www.thegfce.com/.
[28] See: https://en.wikipedia.org/wiki/Animal_Farm
[29] A good example of the distinctness but inseparability of the physical and digital is climate change. The infrastructures of the digital realm require vast amounts of energy, resulting in greenhouse gases. Digital technologies such as AI, blockchain and the cloud require large amounts of energy and contribute to pollution. Potentially, these same technologies can become major factors in overcoming the problems they cause. The achievement of the UN’s Sustainable Development Goals depends to a large extent on the strategic deployment of digital technologies, the same digital technologies that, improperly used, will aggravate the underlying problems.
[30] We have discussed this problematic before in the Quo Vadis ICANN article, see: http://www.circleid.com/posts/20181211_quo_vadis_icann/
[31] Lawrence Lessig, Foreword: “Institutional Corruption” Defined, see: https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2295067. Indeed, institutional corruption is a field that requires urgent exploration in the context of cyberspace and Internet governance, and the works of Lessig provide a very good starting point. On Lawrence Lessig see: https://en.wikipedia.org/wiki/Lawrence_Lessig
[32] States have the responsibility to ensure the rights of their citizens. Article 12 of the Convention on the Rights of Persons with Disabilities can serve as a model. See: https://www.un.org/development/desa/disabilities/convention-on-the…
[33] Rights Protection Mechanisms (RPMs) & Dispute Resolutions Procedures (DRPs). For further information, see: https://www.icann.org/resources/pages/rpm-drp-2017-10-04-en
[34] We have already identified a number of these general principles, for example the 3e’s and the dual “separate but inseparable” character of cyberspace.
[35] See: https://www.ohchr.org/EN/ProfessionalInterest/Pages/CCPR.aspx
[36] For some of the latest proposal on digital governance mechanisms see section 3.3 of “The age of digital interdependence”, Report of the UN Secretary-General’s High-level Panel on Digital Cooperation, https://www.un.org/en/pdfs/DigitalCooperation-report-for web.pdf
[37] The current “ombudsman” model deployed by ICANN, although useful and skillfully deployed, is flawed for the very same reasons. See: https://www.icann.org/ombudsman/
[38] In December of 2019, the UN General Assembly approved a resolution to create a new international convention on cybercrime. The resolution was approved by a narrow margin, with many fearing it will allow for crackdowns on freedom of expression. See: https://thehill.com/policy/international/476109-un-gives-green-light-to-draft-treaty-to-combat-cybercrime and https://news.yahoo.com/un-backs-russia-internet-convention-alarming-rights-advocates-011310327.html. This is only one of a number of relevant initiatives and treaties on cybercrime: the Council of Europe (Strasbourg) Budapest Convention, https://www.coe.int/en/web/cybercrime/the-budapest-convention; the work on cybercrime by the UN Office on Drugs and Crime (UNODC, Vienna), including the (existing) Open-Ended Working Group on Cybercrime: https://www.unodc.org/unodc/en/cybercrime/index.html and their comprehensive study on cybercrime: https://www.unodc.org/documents/organized-crime/cybercrime/CYBERCRIME_STUDY_210213.pdf; and finally the Shanghai Cooperation Organisation (SCO), which has formalized cooperation among its eight member countries (including India, China, Pakistan and Russia) on issues of cybercrime: https://ccdcoe.org/organisations/sco/ (Thanks to Nigel Hickson for providing this list and links in an email of 12/31/2019.)
[39] This example applies to both the CASE Act where the US Copyright Office is intending to create a special copyright take down tribunal, but it also may apply in Europe because the EU Copyright Directive mandates that Member States create “out of court redress mechanisms” to settle copyright disputes. In the US, some of these forced arbitration provisions have been found to be unconscionable and violate citizens’ due process rights.
[40] What Does California’s New Data Privacy Law Mean? Nobody Agrees, Natasha Singer. See: https://www.nytimes.com/2019/12/29/technology/california-privacy-law.html
[41] For more information, see: https://www.un.org/en/sections/universal-declaration/drafters-universal-declaration-human-rights/index.html
[42] “The UDHR was being drafted just after the Nuremberg war crimes trial had ended, with a similar trial still under way in Tokyo. Article 11’s respect for the presumption of innocence was agreed on quickly. The drafters struggled over the wording of the second paragraph. They were concerned that a ban on retroactivity could be used as an argument that the Nuremberg trials had been illegal. They had tried for “crimes against peace” and “crimes against humanity” which previously did not exist in national laws.” From: Universal Declaration of Human Rights at 70: 30 Articles on 30 Articles - Article 11, https://www.ohchr.org/EN/NewsEvents/Pages/DisplayNews.aspx
[43] For further information see: https://en.wikipedia.org/wiki/Nuremberg_trials
[44] The reference to the UDHR and the Nuremberg Trials does not compare the incomparable horrors of the Holocaust with the issues around digital rights, due process and digital exploitation. We nevertheless can and should take history as lessons learned and inspiration. We can use it as a navigational and interpretive aid in a new circumstance, the global Internet ecosystem, where the local, the global and the personal present complex policy challenges. We are in a time when digital exploitation and exclusion are pressing issues. We can look to the UDHR for what it says to us today, and for how we can react and prevent cyberspace from being used for repression, exploitation and exclusion.
See also footnote 19 in Part 2 on digital slavery.
[45] One of the most prominent and insistent proponents of the permissionless innovation ideology, Vint Cerf, had to admit its limitations: “All the openness led to what many of us call permissionless innovation, all of which was very satisfying for me, watching this grow in a very organic way. There is only one small little detail that had not penetrated my thinking in the early stages and that’s: What happens when the general public gets access?” Vint Cerf, US IGF Washington 2017, https://www.youtube.com/watch?v=J4HxqfJK13I
[46] Article 12 of the UDHR finds its equivalent in Article 17 of the “International Covenant on Civil and Political Rights.”
The subject of digital privacy has been the subject of several discussions over the years. In 1988, the United Nations Human Rights Committee, in General Comment No. 16 on Article 17 of the International Covenant on Civil and Political Rights (ICCPR), which covers the right to privacy, family, home and correspondence and the protection of honor and reputation, demanded that state surveillance be governed by clear and precise law that safeguards a citizen’s right to privacy. United Nations General Assembly Resolution 68/167, on “the right to privacy in the digital age,” was passed on December 18, 2013.
Additional actions were taken within the UN framework. Resolution 68/167 included a request by the General Assembly that the High Commissioner for Human Rights prepare a report on the right to privacy. The Human Rights Council, based on its decision 25/117, held a panel discussion on the right to privacy in the digital age in 2014. In 2015, based on its resolution 28/16, the Council also appointed a Special Rapporteur on the right to privacy for a three-year period. We should also mention the recent Report of the UN Secretary-General’s High-level Panel on Digital Cooperation, “The Age of Digital Interdependence,” https://www.un.org/en/digital-cooperation-panel/. All this effort shows that whilst the UN takes the subject of digital privacy seriously, it has no powers to ensure and implement the right to privacy beyond making recommendations and hoping for the compliance of states and the private sector.
[47] page 12, Report of the UN Secretary-General’s High-level Panel on Digital Cooperation.
[48] Joseph Menn, Reuters, 21 January 2020, Exclusive: Apple dropped plan for encrypting backups after FBI complained – sources, https://www.reuters.com/article/us-apple-fbi-icloud-exclusive/exclusive-apple-dropped-plan-for-encrypting-backups-after-fbi-complained-sources-idUSKBN1ZK1CT