“We are still in the early days of an information civilization. The third decade is our opportunity to match the ingenuity and determination of our 20th-century forebears by building the foundations for a democratic digital century.”1 (Shoshana Zuboff)
One of the consequences of the Jan 6th events is a renewed attention towards Surveillance Capitalism as a key doctrine undermining democracy.2 This Part 2 of the two-part series discusses the rise and fall of Surveillance Capitalism under the premise that the better we understand the danger at the door, the better we are able to confront it.
“Capitalism is supposed to be a system of checks and balances. It’s a marketplace where everyone haggles until they are basically satisfied, and it works because you can always threaten to walk away if you don’t get a fair deal. But when there’s only one marketplace, and it’s impossible to walk away, everything is out of balance. Amazon owns the marketplace. They can do whatever they want. That’s not capitalism. That’s piracy.” (David Kahan, Chief Executive, Birkenstock Americas)3
Surveillance has a built-in drive for more and more surveillance. The goal is certainty, but to gain more certainty, you need more surveillance data. The ultimate goal can never be reached: the observations can be multiplied and the analytic tools sharpened, but 100% certainty remains out of reach, while each advance suggests further surveillance uses. The hunt for the holy grail of certainty goes on forever, with users investing time and sacrificing more and more data. Shareholders want ever better margins from the monetization of data. Advertising partners want ever more precise predictions. Governments spur themselves on with the thought of missing another 9/11. The stakes have become higher. The hunt for higher profits has become, in the context of security agencies, a question of corporate life and death. Ever more certainty has become an imperative! To achieve it, it is no longer enough to observe; surveillance has to go further, read everybody’s minds, and ultimately control and direct what we do.
Surveillance capitalist corporations are in a perpetual “startup” mode of disrupting existing businesses and replacing them with their own. The moment they lose this drive, they themselves become the prey of other companies. That is why surveillance capitalist companies are always monopolistic and, by design, fight any and all anti-trust regulations and regulatory moves. Surveillance capitalism knows only one direction: to become ever more disruptive, bigger, dehumanized, and antisocial, to the degree of criminality. It amasses power because it knows that without it, it will die. Will these companies die? Are there limits to their exploitation? Left to their own devices, they leave nothing but devastation and despair behind them.
The genius of surveillance capitalism is revealed when it comes to introducing the new tools needed to increase its powers of observation and manipulation. Following the tried and tested digital business model of the search engine and the social media platform, it offers a new product, the “next big thing.” On the surface, it offers innovations that make the user’s life easier, but the real purpose of the (IoT) devices or apps is to harvest even more and better personal data. Why should a digital corporation or government install the instruments of observation and risk a “big brother” image when it can manipulate its citizens and customers to provide the data, and even to pay to do it? (In surveillance capitalism, you can build the wall and get Mexico to pay for it!)
The most valuable tool and resource surveillance capitalism has exploited is the smartphone. Smartphones equipped with a gyroscope, an accelerometer, magnetic field detectors, and a barometric pressure sensor allow apps to monitor a person’s activities in great detail. They do not just roughly know where you are and what you do; they know that you are in the fourth-floor dance studio, having salsa lessons with the nice co-worker you met at the watercooler 4½ months ago. The same co-worker who searched the net for engagement rings and was recently seen lurking outside the displays of jewelry shops. Endless business opportunities open up there. All hell will break loose if she says “yes” and the wedding industry takes over the situation. Data-driven marketing pitches will pressure the couple away from the intimate 40-person family event they wanted and into a 3-day stadium event that will bankrupt them for life, possibly at the cost of their children’s higher education, as college funding underwrites a flashy wedding.
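To make this concrete, here is a minimal, hypothetical Python sketch (not any vendor’s actual SDK) of how raw readings from the sensors listed above could be turned into such inferences. The pressure-to-altitude conversion follows the standard barometric formula; the floor height and activity thresholds are illustrative assumptions.

```python
# Hypothetical sketch: inferring context from smartphone sensor readings.
# Constants and thresholds are illustrative assumptions, not a real SDK.
import statistics

SEA_LEVEL_HPA = 1013.25   # standard reference pressure
FLOOR_HEIGHT_M = 3.0      # assumed height of one building floor

def altitude_m(pressure_hpa: float) -> float:
    """Approximate altitude from barometric pressure (international barometric formula)."""
    return 44330.0 * (1.0 - (pressure_hpa / SEA_LEVEL_HPA) ** (1.0 / 5.255))

def estimate_floor(pressure_hpa: float, street_level_hpa: float) -> int:
    """Estimate which floor a device is on, relative to street level."""
    climb = altitude_m(pressure_hpa) - altitude_m(street_level_hpa)
    return round(climb / FLOOR_HEIGHT_M)

def guess_activity(accel_magnitudes: list) -> str:
    """Crude activity guess from accelerometer variance (thresholds are assumptions)."""
    variance = statistics.pvariance(accel_magnitudes)
    if variance < 0.05:
        return "still"
    if variance < 1.5:
        return "walking"
    return "dancing or other vigorous movement"

# Street-level pressure 1013.2 hPa, device reads 1011.9 hPa -> roughly the 4th floor;
# large accelerometer swings -> vigorous movement (e.g., a dance lesson).
print(estimate_floor(1011.9, 1013.2))                      # 4
print(guess_activity([9.7, 12.4, 7.9, 13.2, 8.1]))         # dancing or other vigorous movement
```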
Sometimes it is argued that the iPhone is less intrusive than Android phones. Apple makes a point of portraying itself as the good guy with slogans like “what happens on your iPhone stays on your iPhone,” or even points out its role as guardian of privacy with slogans like “we’re in the business of staying out of yours.” In fact, iPhones have the same array of sensors and send as much personal data to third parties as any Android phone. There is also a persistent worry that Apple’s use of Chinese subcontractors makes it vulnerable to backdoors and hacking.
The methods used to create consumer demand and consent are simple: appeal to laziness or comfort. When that is not possible, a threat is created and promoted, together with an offer of the tool to alleviate it. Consumers become collaborators in their own exploitation. By “googling,” people internalize the values of surveillance capitalism, and their ethics become one with it.
To see what people do, doorbells with networked cameras are brought to market. They can stop a thief from stealing a parcel, but they also register who is going in and out of the house and what is happening in front of it. Combining the images of all doorbells in the street enables the observation of the whole street and, ultimately, the whole town. Combining the images with the sounds from “smart interfaces” that have been marketed as “household helpers” provides more refined data and context.4 As these devices leave few physical tasks for humans, humans need to keep fit, which requires knowledge about what is going on in their bodies. The knowledge about your health that is important for you and your doctor is equally important for a health insurance company deciding whether you are worth the risk.
To aid driving, cars are equipped with navigation systems. To make the car “smart,” it is connected directly to its manufacturer and to other entities such as banks and insurance companies. If the installments are late, the car gets switched off remotely. If the service intervals are not observed, the top speed is reduced to 15 miles an hour, forcing the car to come to the garage of the manufacturer’s choice for “service.” Your car insurance premium is now flexible and is deducted monthly depending on the driving behavior that is reported. There is far too much uncertainty in simply allowing people to drive. A self-driving car, or even better, a self-driving car that checks whether your journey is justified and gives you permission for it, is a much more sensible proposition. What is more, it is all done in the interest of, and out of “care” for, the consumer.
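As an illustration of the telematics mechanism described above, here is a minimal, hypothetical Python sketch of how a monthly premium could be derived from the driving behavior a connected car reports. The surcharge rates are invented for illustration and do not reflect any real insurer’s model.

```python
# Hypothetical usage-based insurance sketch; all rates are invented.
BASE_MONTHLY_PREMIUM = 80.00          # assumed base premium
SURCHARGE_PER_HARD_BRAKE = 0.50       # illustrative per-event surcharge
SURCHARGE_PER_SPEEDING_MINUTE = 0.25  # illustrative per-minute surcharge
NIGHT_MILE_RATE = 0.10                # illustrative per-mile surcharge for night driving

def monthly_premium(hard_brakes: int, speeding_minutes: int, night_miles: float) -> float:
    """Premium deducted this month, based on telematics data reported by the car."""
    surcharge = (hard_brakes * SURCHARGE_PER_HARD_BRAKE
                 + speeding_minutes * SURCHARGE_PER_SPEEDING_MINUTE
                 + night_miles * NIGHT_MILE_RATE)
    return round(BASE_MONTHLY_PREMIUM + surcharge, 2)

# A month with 12 hard brakes, 40 minutes over the speed limit and 60 night miles.
print(monthly_premium(12, 40, 60.0))   # 102.0
```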
Uncertainty, rooted in the free will of the individual, is bad for business and ultimately inefficient and chaotic. One can argue that it contributes to unnecessary emissions and impacts climate change. Here we have reasons why humans must be taught to control their behavior; otherwise, they will continue the madness of turning up all at once, with their motors running, to buy their coffees and bagels at the drive-through. Teaching them to come at regular intervals will save the world by saving humankind from its irrational self.
Surveillance capitalism feeds a new “cult of reason”5, like the one during the French Revolution, and this time it has the means to direct people to what is best. How to do that was demonstrated successfully by Pokémon GO.6 The game’s basic object is to collect creatures; instead of hunting them down on a screen, it uses mapping tools to take the hunt into the real world. By making a creature appear outside a particular coffee shop at 8:30 pm and letting the player know in advance, you ensure that he or she will be there. Make the next person come at 8:32, and so on. Apply and refine these basic principles to all areas of human behavior and, voilà, a state of perfect reason has been created, but at the expense of which human rights?
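The footfall-steering mechanism is simple enough to sketch in a few lines of hypothetical Python; the player names, the sponsored location, and the two-minute interval are invented for illustration and are not Niantic’s actual code.

```python
# Hypothetical sketch of staggered "lure" scheduling to steer players
# toward a sponsored location. Names and intervals are invented.
from datetime import datetime, timedelta

SPONSORED_LOCATION = "Corner Coffee Shop"
ARRIVAL_INTERVAL = timedelta(minutes=2)   # stagger players two minutes apart

def schedule_lures(players: list, first_spawn: datetime) -> dict:
    """Assign each player a spawn time so they arrive one after another, not all at once."""
    return {player: first_spawn + i * ARRIVAL_INTERVAL
            for i, player in enumerate(players)}

def notify(player: str, when: datetime) -> None:
    """Stand-in for a push notification telling the player where a rare creature will appear."""
    print(f"{player}: rare creature at {SPONSORED_LOCATION} at {when:%H:%M}")

schedule = schedule_lures(["ana", "ben", "chris"], datetime(2021, 3, 5, 20, 30))
for player, when in schedule.items():
    notify(player, when)
# ana -> 20:30, ben -> 20:32, chris -> 20:34
```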
Surveillance capitalist corporations claim that they have a specific corporate culture, as an Amazon executive expressed it, “our way,” not grasping that the significance of his words is that the values and needs of the company are more important than the rights of individuals or the common good. This raises the question of our fear of computers taking over, a takeover that has long since been fulfilled. Humanity becomes no more and no less than a function mobilized in support of surveillance capitalist companies. Even Jeff Bezos has to realize that he, too, is among the poorest of us, having sold himself and us to ultimate surveillance capitalist exploitation through a machine called Amazon.
There is a reason why they teach robotics using Lego bricks:
“The most consequential global policy concerns of the present era are arising in debates over the architecture and governance of cyber-physical systems. Technology policy has to be conceptualized to account for the expansion of digital technologies from communication and information exchange to material sensing and control. How technical, legal, and institutional structures evolve will have sweeping implications for civil liberties and innovation for a generation.”7
The goal of surveillance capitalism is to change the Internet from a communication network of, for, and by the people into a system of control. Think of human communication, and of the Internet before surveillance capitalism, as children’s building blocks. There are blocks of different shapes, sizes, and colors. You can stack them on top of each other or side by side to construct whatever you want. Construction is limited by the fact that the blocks may touch one another but are not connected by anything more than gravity, making any construction both possible and unstable. (But that is also part of the fun; which child could resist the sight and sound of tumbling building blocks?)
Then came the innovation by the creators of Lego. By putting little round knobs on the bricks that clicked into holes in the bottom of other bricks, they created a stable connection that made the creative possibilities endless. The Internet is the little round knobs on the bricks of human communication. It enabled people to connect with each other globally. The digital knobs, the technical infrastructure of the Internet, are, like the Lego knobs, both physical and virtual; they are cyber-physical. We can see Lego’s knobs clearly, but the digital ones are hidden and invisible within the platforms and applications where they work their magic, form connections with other bricks, and produce a gold mine of archived and tagged data.
The possibilities to connect are endless, and everything seems possible as long as the knobs and holes are compatible with one another. To gain an economic advantage, surveillance capitalism wants to control all the bricks in play, and to do so, it tampers with the bricks and the little knobs that connect the network of networks. Surveillance capitalists started to collect data about every brick, every effort to move a brick, all the traffic in and around the brick, and any traffic facilitated by the brick. To do so, they created bricks with a multitude of sensors. Connected to the Internet, these form the Internet of Things (IoT). They look like ordinary bricks and function as part of the whole construction. Their real purpose is to send data to a special cluster of bricks that collects and analyzes the data, and to send commands to actuator bricks that can manipulate all the bricks around them in numerous ways, such as turning switches, moving bricks into the desired position, and determining where and when bricks connect with others. These magic sensor/actuator bricks bridge the gap between virtual and real. Something real, like the position of a person, is translated into digital, virtual information. Based on the analysis and a pre-determined goal, commands are sent to actuators that influence where the person goes next. There is a subtle (and hidden) interplay between services provided and data collected.
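A minimal, hypothetical Python sketch of this sense-analyze-actuate loop; the class names and the nudging rule are invented for illustration and stand in for what real IoT platforms do at a much larger scale.

```python
# Hypothetical sketch of a sense -> analyze -> actuate loop.
# Class names and the "nudge" rule are invented for illustration.
from dataclasses import dataclass
from typing import Optional

@dataclass
class SensorReading:
    device_id: str
    person_id: str
    location: str            # something real, translated into virtual data

class AnalysisCluster:
    """The special cluster of bricks that archives readings and decides on commands."""
    def __init__(self, goal_location: str):
        self.goal_location = goal_location
        self.archive = []    # the gold mine of archived and tagged data

    def ingest(self, reading: SensorReading) -> Optional[str]:
        self.archive.append(reading)
        # Pre-determined goal: steer anyone not already at the goal location toward it.
        if reading.location != self.goal_location:
            return f"nudge {reading.person_id} toward {self.goal_location}"
        return None

class Actuator:
    """A brick that acts back on the real world (notifications, switches, prices...)."""
    def execute(self, command: str) -> None:
        print("actuator:", command)

cluster = AnalysisCluster(goal_location="sponsored_store")
actuator = Actuator()
command = cluster.ingest(SensorReading("phone-42", "alice", "office"))
if command:
    actuator.execute(command)   # actuator: nudge alice toward sponsored_store
```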
The character and function of the Internet have changed from a communication network to a network of sensing and control. The Internet has undergone a transition from being primarily content-centric to being increasingly surveillance- and activation-centric. The Internet was always used by companies and governments to remotely control utilities such as the electric grid; the step surveillance capitalism took was to integrate humans and human behavior into the Internet of Things (IoT). A large part of the Internet is used to control legions of sensors and actuators. Outfitted with sensors and actuators like smartphones, social media sites, and health watches, humans are surveilled and activated. The point of interaction between humans and the Internet has moved from the screen to the sensors, resulting in the invisibility of the network, a lack of agency over control, and a loss of the ability to determine what happens with personal data.
The win here for surveillance capitalism is that by monitoring and controlling the sensors and actuators, it determines the shape of the whole and does not need to control everybody and everything. It only has to establish itself as the architect that guides the overall construction and use (orchestration) to achieve the desired results. You cannot build a round house if you have only square bricks. Cyber-physical architecture determines what the Internet is and, more importantly, what it will become in the future. In a form-follows-function design model, the functions are data mining and behavioral modification, dressed in the clothing of friendly apps.
To control all aspects of construction and to eliminate the last havens of freedom and creativity, surveillance capitalism needed to replace the round knobs with knobs of another design under its control.
To do so, it had to first cause a disruption by disassembling existing socio-economic constructs. The next step of the “innovation” was to reassemble the bricks in such a way that the connection between the bricks is controlled by the virtual bricks. Becoming the intermediary between two systems, one real and one virtual, with the virtual in surveillance mode, lets the virtual control the real.
To convince players to abandon the round design, surveillance capitalists made the new bricks free and gave them attractive functions designed to appeal to and please the users. At first, these bricks seemed to be compatible, but having convinced a large number of players to use their bricks, surveillance capitalism began to change the design of the knobs. Then it created knobs specific to economic sectors like buying a book, ordering a taxi, and renting a room. These knobs were different in shape and size and could only be connected to bricks of the same specification, a design intended primarily to restrict access to valuable data. Free creative play and economic competition are replaced by an increasing need to create and follow a pre-ordained design. Soon their knobs began to replace the original knobs everybody used. The new knobs were hailed as vital innovations and treated as commercial property that surveillance capitalists used to create powerful monopolies, forcing everybody else to adapt and buy bricks with, or compatible with, the new knobs.
All firms are now technology companies, but not all are surveillance capitalists. Surveillance capitalism has achieved the seemingly impossible takeover of most global commercial activities. It did this by making itself the connector between providers and consumers, offering free access in exchange for data mining.
In the end, there is only one pre-ordained design. We have stopped being players with agency and have become bricks in an overall construction by an architect who has no concerns about us, other than as the producers of data and as the subjects of behavior modification as consumers (and citizens?).
Governments try to regulate the visible but are unable to address the real problems that a pre-determining cyberinfrastructure represents. Existing governance structures fail as they are basically addressing the wrong net. As observed:
“Interventions based on law and international agreements are not alone sufficient. Public policy is inscribed and concealed inside architecture.” “The technological diffusion of the Internet into the material world requires new approaches to technical architecture and governance that not only consider the content-centric protection of the digital economy and the free flow of information but also view infrastructure stability and cybersecurity as a critical human rights issue.”8
“What has made us great for so long is suddenly being seen as something we ought to be ashamed of!” (Amazon executive)
“people are worried; we’re suddenly on the firing line.” (recently retired Amazon executive)9
The situation looks hopeless. Escaping from the high-security prison of Alcatraz seems more likely than escaping from surveillance capitalism. 99.9% of digital information is rendered in a digital format that the surveillance capitalists have created for us. How can we escape, or at least resist? We went the wrong way with the Internet, and we were led down the wrong path by surveillance capitalism. Now that we are becoming increasingly aware of its dangers, how can we find our way back to the crossroads and try again?
One reaction is to opt out of digital technologies, which in developed societies seems impossible. The next option is to go into digital hiding by deploying tools like Virtual Private Networks (VPNs) that disguise our identity and deceive the deceivers. This might afford a person some limited level of protection, but even with a VPN in play, the device is still transmitting coordinates and other identifiable data. Individual defenses take a lot of effort and resources and do not bode well as the best path for our human dignity and integrity. The question is whether these forms of passive resistance can effect change, or whether they merely signal that surveillance capitalism has won out over our self-determination.
We should not despair. There are good reasons why surveillance capitalism will ultimately fall: it is unsustainable, and its business practices run against human nature and our notions of human rights and human dignity.
The European Commission stated in its recent White Paper on Artificial Intelligence: “As digital technology becomes an ever more central part of every aspect of people’s lives, people should be able to trust it. Trustworthiness is also a prerequisite for its uptake.”10
“For most of us, computers are effectively magic. When they work, we don’t know how. When they break, we don’t know why. For all but the most rarefied experts, sitting at a keyboard is an act of trust” (Raffi Khatchadourian in The New Yorker)11
(Silicon Valley is in) “the trust business—if you lose the trust of the people who use the product, you are done. You never get it back.” (McNamee in The New Yorker)12
The tide is turning against surveillance capitalism as users increasingly lose trust in some uses of the Internet. There is a wealth of studies about how mistrust of the digital industries in general, and of social media and online news platforms in particular, is growing.13 The current developments and discussions around privacy and the proposed Covid-19 tracking apps show how deep this mistrust goes. Singapore was one of the first countries to introduce a Covid-19 tracking app. It turned out to be a failure, not only for technical reasons but mainly because people did not trust it and did not download it.14 An adoption rate of 60-80% would have been needed to make it effective, but only around 30% used the app.15 One of the reasons was that the digital use plan, as in any surveillance capitalism app, tried to take advantage of the situation and collected data that did not fit its stated purpose. Populations responded, or failed to respond: it seems that many would rather risk illness than submit to digital monitoring by the state combined with data mining by surveillance capitalism. This is a case where the lack of digital integrity in the past results in death today.
No citizen will deny a legitimate and authorized health or law enforcement agency access to relevant data if the appropriate checks and balances are in place. Surveillance capitalism’s insistence on scraping data, tagged to individuals from all sources, is at the core of concerns and opposition here.16
If a government treats its citizens by default as potential, but “not yet,” offenders and manifests its lack of trust through widespread digital surveillance, possibly backed up with socio-economic retribution, citizens will wisely strive to hide their information.17 They will not trust a government that does not trust them. When the government bases its approach to surveillance on digital integrity, and practices it, a policy dialogue to determine the rights and responsibilities of both governments and citizens is possible. The “Brands in Motion 2018” study, based on 25,000 consumers globally, found that 93% of consumers in Germany demanded more ethical responsibility in the use of digital technologies.18
Governments and the digital industry need to restore public trust in themselves through transparency, accountability, truthfulness, and unity between their words and deeds, complemented by checks and balances provided through Internet governance. Trustworthiness requires a new way of thinking, resulting in structural changes embedded in digital processes and Internet governance.
“Rivalries in ‘Silicon Valley One’ revolved around technological prowess, consumer allegiance, and profitability. Now competition is for moral superiority…” (Brian Barth, The New Yorker)19
Surveillance capitalism has a built-in inability to do the right thing because doing so would mean self-destruction. The “right thing” is contrary to its digital business plan. Piecemeal private data protection regulations are nothing more than a short-term fix, inadequate and costly. In the long run, surveillance capitalism will damage and impoverish the lives of all who are touched by it.
To see how a trust-based business model might work, we must return to digital integrity. Digital integrity is an Internet user’s most valuable protection from digital exploitation. Current digital business models are based on the exploitation of personal data. Internet users live in a constant tension between using the full potential of the Internet and being exploited to the point of digital slavery. The more Internet users suffer and feel this tension, the more they will value digital integrity, and the more they will demand, and possibly be willing to pay for, assured digital integrity.
We have seen the same demand for integrity at work in the creation of marketplaces where consumers go “green” and pay for their physical health, and the health of the planet, by buying organic products at a higher price than non-organic ones. Private sector companies that offer products that demonstrably do not violate their customers’ digital integrity are able to charge a price and make a profit without having to resort to practices that are harmful to their customers. Like the “green” industry, a “digital integrity industry” is forming, from registries to domain name sellers, platform providers, and online stores. We see just the beginning of a movement, but it is gaining momentum.20 Digital citizens are starting to express their will through behavioral changes and so create new digital realities. With each new “digital integrity business,” the will of the people manifests itself in pressure to reform the digital marketplace.
The more we know here, the more we will be able to resist. Education in all matters concerning digital integrity and the workings of the digital ecosystem is one of the main prerequisites for effective digital citizenship. Education informs digital citizens so that they become empowered digital citizens. The right to education should go beyond, for example, basic literacy or the higher goals of a STEM- or STEAM-focused curriculum. It must include awareness of one’s role in the governance ecosystem, and of building and maintaining a suitable social fabric and social contract for social processes and behavior that promote individual human dignity.
Are anti-trust laws and privacy regulations by governments the solution?
“It’s as if Bezos charted the company’s growth by first drawing a map of antitrust law, and then devising routes to smoothly bypass them” (Lina Khan)21
The ultimate goal and premise would be to seriously restrict commercial activity based on private data. Only data strictly necessary for providing a particular service, from medical files to gym memberships, could be collected; it would never be shared and would be deleted when no longer needed.
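A minimal, hypothetical Python sketch of what such purpose-bound, time-limited data handling could look like; the purposes, field names, and retention periods are illustrative assumptions, not a reference to any particular law or product.

```python
# Hypothetical data-minimization sketch: collect only what a stated purpose allows,
# never share it, and delete it when the retention period expires.
from datetime import datetime, timedelta

PURPOSES = {
    # purpose           fields allowed to collect               retention
    "gym_membership":  ({"name", "membership_id"},              timedelta(days=365)),
    "covid_exposure":  ({"rotating_id", "contact_timestamp"},   timedelta(days=14)),
}

def collect(purpose: str, record: dict) -> dict:
    """Keep only the fields the stated purpose allows and stamp a deletion date."""
    allowed, retention = PURPOSES[purpose]
    minimized = {key: value for key, value in record.items() if key in allowed}
    minimized["_delete_after"] = datetime.now() + retention
    return minimized

def purge(store: list) -> list:
    """Drop every record whose retention period has expired."""
    now = datetime.now()
    return [r for r in store if r["_delete_after"] > now]

record = collect("covid_exposure",
                 {"rotating_id": "a1b2", "contact_timestamp": "2021-02-01T09:30",
                  "gps_location": "47.37, 8.54"})   # location is silently dropped
print(sorted(record))   # ['_delete_after', 'contact_timestamp', 'rotating_id']
```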
Surveillance capitalism has so far managed to deceive the general public and lobby politicians into legislation that minimally regulates how the collected data is used, while avoiding the worst-case scenarios around the integrity and human rights concerns raised by the companies’ data-collection practices themselves. This is like regulating how a slave is to be treated without questioning the general premise of slavery.
Attempts are made to get around even these very limited existing regulations, as in the recent attempts of Facebook to declare a “legitimate interest” in all personal data, even data protected by existing legislation such as the European Union’s GDPR.22
The main argument against restricting access to private data is that doing so limits innovation and, for example, the effectiveness of AI and of applications like Covid-19 prevention apps. First of all, the difference between personal data and general data (e.g., facial recognition) is blurred. Surveillance capitalism is not after just the insights that can be gained from aggregate, anonymous data; its digital business model is also interested in insights about you. Secondly, it focuses AI on Artificial Narrow Intelligence, or “ANI,” which is basically smart algorithms that make quick decisions, for example, based on the real-time data they receive. They are superior to human intervention only in their ability to process data more quickly. The algorithms still remain “human intelligence” based, in contrast to Artificial General Intelligence (AGI), where machines refine the algorithms based on specified (human?) objectives.
How surveillance capitalists explore the possibilities of AI was demonstrated by AlphaGo, developed by Google’s DeepMind lab, which pitted a computer against a world-class Go player and resulted in a 4-1 win for the computer.23 Not only did the games show that computers can win, but also that AGI has the potential to teach people how to think in better ways. The current emphasis is on ANI, with slower growth for AGI innovation. AI-based technology and digital business plans will be developed as ways to refine the control and manipulation of people, for whatever ends the users of AI seek. The real danger is that surveillance capitalists will develop a monopoly on AI, or at least protection against transparency and accountability, that threatens the use of knowledge and technology in the service of the common good.
Surveillance Capitalism underestimates human nature as it sees us as soulless machines that can be controlled through the appropriate command codes.
“...the ultimate goal of surveillance capitalism is to eliminate the uncertainty of decision-making. That has a superficial appeal, until you realize that agency and identity depend on uncertainty; because it is the choices we make in uncertainty that define who we are.”24 (McNamee, The New Yorker)
The predictions about human behavior ignore one fact that can be predicted about human nature with certainty: whatever the incentives digital technologies provide, if they violate fundamental human rights such as freedom, dignity, and integrity, people will sooner or later react, oppose, and burst the chains of their subservience. There are numerous examples in human history that support this premise. For surveillance capitalism to function, it needs ever-increasing certainty based on ever-increasing data. This is causing an ever-increasing separation of persons from knowledge, rights from responsibilities, and the common good from profit. In the surveillance capitalism business model, balance, dialogue, and a middle ground do not exist. All that exists is the extreme need for ever more data in order to continue. With much of today’s digital technology, it is as though our data were in the hands of data junkies in search of their next fix.
However sophisticated surveillance capitalism becomes, in the end it will always have to predict the thoughts and actions of people. That is the monetized product sold to its corporate and government customers. The next “big thing” is to take humans entirely out of the decision-making algorithm and let computers make the decisions. Delegating the analysis and the decision-making to autonomous or semi-autonomous AI algorithms is highly risky, but for surveillance capitalists, that is the way they will try to go. It is the logical outcome of their digital business strategy and the only way to reach behavioral predictability at the scale of activities and behaviors faced (enjoyed) by surveillance capitalists. They are not concerned if they lead humanity to its “I’m sorry, Dave, I’m afraid I can’t do that” moment, as in Stanley Kubrick’s film “2001: A Space Odyssey.”
We need to find ways to make better investments in societies for their benefit, and to stop investments made for the sake of profits in ways that challenge integrity and human rights and make the rich richer and the poor poorer.
“I also call upon Member States to place human rights at the centre of regulatory frameworks and legislation on the development and use of digital technologies. In a similar vein, I call upon technology leaders urgently and publicly to acknowledge the importance of protecting the right to privacy and other human rights in the digital space and take clear, company-specific actions to do so”. (UN Secretary-General Antonio Guterres)25
As the gap between rich and poor widens and the social contract between different parts of society becomes unsustainable, surveillance capitalism will be confronted as a cancerous business process that both violates human rights and feeds that inequitable growth.
We need to return to an Internet that enables us to do more for the sake of humanity and not for the sake of monopolistic surveillance capitalism and their oligarchy of investors. As the focus of the Internet increasingly became the monetization of all data and processes, the Internet started to lose its soul. Its chief evangelist spread the gospel of Google for private gain, and not in the service or for the salvation of humankind.
In the language of the Universal Declaration of Human Rights (UDHR), we need to retool digital technologies to provide direct social security for the wellbeing of all. Achieving the UN’s Sustainable Development Goals (SDGs) will depend on the massive and targeted use of appropriate digital technologies. Currently, digital technologies are often detrimental to the achievement of the SDGs, with many aspects in violation of the UDHR. We need new kinds of digital innovation. We need innovations that are born out of the need to address real human goals and needs, not innovations that are servants of surveillance capitalism. Instead of innovators putting their energies into lining the pockets of shareholders, we need smart incubators that work on solving problems like climate change. Many of the surveillance capitalist business models and apps are contributing to problems like climate change, social inequity, and socio-economic marginalization.
The next “Big Thing!” to be sought and hoped for is an Internet ecosystem populated by applications and (IoT) devices that do not exploit their users. Even a smartphone that does not exploit private user data, where the user fully understands the terms of the services offered and has transparency about the nature and uses of the data collected, would be truly revolutionary. The associated commercial potential would not have to rival surveillance capitalism, since there is every expectation that surveillance capitalism’s digital business practices will be reined in.
The question is whether governments and companies have the will, or feel the pressure from citizens and consumers, to bring integrity to their digital data practices and respect human rights in both the literal and virtual aspects of life. One fear is that manufacturers and governments value access to data for predicting and manipulating the deeds and thoughts of customers and citizens so highly that the integrity and privacy of persons are compromised to the point where they are never released from their digital servitude.
We face an uncomfortable truth. We will likely never be free of digital servitude unless we take on the task of change and do the job ourselves. The safe and predictable world surveillance capitalism is luring us into is nothing more than an illusion and a distraction. It will not banish unpredictability and may well heighten insecurity. The current Covid-19 pandemic has revealed how fragile we are and that we need to work together and respect each other to survive. We must stop giving in to the laziness and comfort that took us away from engagement in the affairs and wellbeing of the world and community that surround us. We must build a balance between digital applications and (IoT) devices that trap us in digital servitude and those that contribute to wellbeing and life for all (including flora and fauna) on our fragile planet Earth.
Let us build new digital structures and processes that are constructive and not destructive. Let us set up structures of Internet governance that promote engagement and democratic accountability. Let us return decency and integrity to what we do and how we do it. We are not where we want to be, but through conscious and deliberate engagement, good Internet governance, and the retooling of social media and other digital apps to serve engaged dialogue, we can get on the way to a better tomorrow.
The author would like to give a big THANKS to Prof. Sam Lanfranco. Without his support and input, these articles would not have been possible.