On 24-27 April, a 33-year-old international organisation of ICT organisations will convene a meeting in London under ETSI auspices after a four-year hiatus. Known today as the GSC, or Global Standards Collaboration organisation, it began its existence in the Spring of 1990 at Fredericksburg, Virginia, as an umbrella mechanism for all of the world’s ICT standards bodies to collaborate on “high interest subjects.” Given that all these organisations compete against each other to a significant extent, the challenge for the venue has always been to avoid acting as a cartel. Given my own past GSC roles, the event inspired a reflection with colleagues on the history, the major changes occurring over more than three decades, the common enduring issues, and the new opportunities as well as challenges faced today. Ultimately, the name of the organisation raises the question—is GSC collaboration really possible?
The initial Fredericksburg meeting was convened at the invitation of the management of what was then the T1 Committee of the Exchange Carrier Standards Association (ECSA), together with U.S. government agencies, who were collectively interested in establishing what was described as an “international standards summit.” In 1990, the institutions, technologies, regulation, and provisioning of telecommunication and internet services were changing rapidly both in the U.S. and globally. The old provisioning regimes were disappearing and global opportunities emerging—sparked in some measure by international treaty changes at the ITU, GATT/WTO, and European Union. These changes folded into the standards-making venues and processes. The convenors were largely motivated by mitigating impediments to those changes. The mantra at the time was taken from the Bob Dylan song, “The Times They Are a-Changin’.”
The ITU Secretary-General Pekka Tarjanne was asked to chair the policy session. As his Chief of Telecommunication Regulations and principal strategist, I was asked to represent him. Tarjanne had just assumed the Secretary-General role and was intensely interested in evolving the ITU focus and processes. (My involvement in the subject matter dated to ten years earlier as a senior technology advisor at the FCC dealing with international policy—work that included following global standards activity and required the use of open-source intelligence assets to get access to them.) As chair, I reached out to colleague Vint Cerf, who was the Internet Architecture Board chair and a former DARPA program manager, to join the panel, expand the discussions and perspectives, and broaden the summit membership. His principal message was the mantra of the IAB—produce standards for running code.
The second meeting of the group was hosted in 1991 at the new ETSI headquarters in Sophia Antipolis by the new Director-General, Karl-Heinz Rosenbrock. ETSI was also the extremely active home for the development of GSM mobile standards. The summit organisation became the InterRegional Telecommunications Standards Conference, and Rosenbrock—like Tarjanne—was intensely interested in evolving standards-making activities. As the former lead of ISDN development at Deutsche Telekom’s labs, he was pleased to announce that ETSI standards would be freely available online to greatly facilitate their use.
Tarjanne that year facilitated the GSC standards availability objectives by signing an agreement with a small group of computer science graduate students headed by one of the legendary network discovery technology experts and a public interest advocate—who, in three weeks and at no cost, converted all of the ITU CCITT standards to downloadable files on an FTP server and made them discoverable and available to the world. The access metrics were overwhelming—exceeding paywall sales by many orders of magnitude—and led the team who accomplished the work to cheekily comment that there was “a certain pent-up demand.” Rosenbrock at ETSI then took the lead in becoming the first standards body to make its standards, including those for the GSM mobile network, available for free on a web server.
As the summit evolved, it formed subgroups to deal with common issues, called itself the Global Standards Collaboration organisation, and migrated the secretariat between ETSI and ITU. High on the list of common topics of interest were standards availability, radio-related activities, IPR and antitrust concerns, and, a decade later, lawful interception. Initially, for a couple of years, the GSC activity also engaged the internet community.
The participation was not significantly expanded. As the years passed, the meetings degraded into a series of high-level presentations in which the attendee bodies promoted themselves. Most meetings ended with the adoption of a relatively meaningless “communiqué.” In 2011, the group published a list of 34 “resolutions” on a variety of generic standards subjects. The next year, in 2012, the group somewhat oddly produced a list of 19 “high interest subjects” with designated “prime PSOs (Participating Standards Organisations)” for each subject. In 2014, it created a kind of charter termed “GSC Principles” that effectively closed off the organisation with a set of requirements, including that any new members be “agreed by consensus among the existing members,” and it solicited none.
The last GSC in 2019 in Montreux consisted of only 12 attendee body presentations—ARIB, ATIS, CCSA, ETSI, IEC, IEEE, ISO, ITU, TIA, TSDSI, TTA, TTC—plus presentations on four themes—smart sustainable cities, managing and using data, artificial intelligence, and AI applications.
Over the past 33 years, almost everything relating to ICT standards has changed profoundly. Chief among these changes is an explosion in the standards-making ecosystem itself. A recent effort in ETSI to enumerate standards venues related to the EU’s proposed Cyber Resilience Act identified more than 750 of them—consisting of many different ICT communities clustered around different kinds of institutional arrangements, technology platforms, service provisioning, operating timeframes, and working styles. For some of the original GSC members, the roles, constituencies, and relevance of their work have diminished significantly. 3GPP, with ETSI as the secretariat and GSMA as an ancillary industry collaboration organisation, emerged as a massive standards-making body dwarfing all others.
ICT technology and provisioning has shifted from dedicated, stand-alone networks, services, and devices to a global virtualised mesh focused on tailored content delivery, discovery mechanisms, constantly evolving applications, and mobility. Platform coding communities of 100 million developers worldwide have emerged as enormous de facto ICT standards bodies—some with Artificial Intelligence bots assisting the work. This ICT tectonic shift has especially important implications for standards bodies, as a product or service specification is manifested as code running on blades at cloud data centres and network devices that is adapted for users in a microsecond—often by an AI agent. The code for global systems can be changed and tailored for customers and operations support instantly.
The “avoiding duplication” basis for collaboration, once advanced by GSC, harkens back to a bygone era when everything was hard-wired and has largely disappeared. The assertion often had anti-competitive overtones as some players sought to advance their products in standards bodies they controlled.
Over the three decades, the ability to rapidly create, evolve, and disseminate standards online, together with encodings that effect improvements and eliminate vulnerabilities to enhance resilience, has become especially important. 3GPP evolves its entire set of standards as releases on two- to three-year life cycles of requirements studies and implementing specifications.
The regulatory environment whiplashed—shifting from a world of monopoly PTTs, to unfettered globalization, to arrays of extensive regional and national legal requirements and constraints. Today, the European Union has more than 150 legal instruments applicable to every aspect of ICT provisioning and use. Regulatory-driven standards have undergone a similar cycle and have now become a favoured approach together with an ensemble of mechanisms to achieve compliance. The U.S. has even implemented wholesale market constraints based on the origin of the product or service with intensive focus on supply chains—eliminating any notion of an open global market.
ICT security—which was a minor consideration 33 years ago in dedicated infrastructures—is now paramount as cybersecurity threats and all manner of cybercrime have grown exponentially and proliferated. Cyberwarfare is a reality. The chimera of trusted networks, services, and devices has given way to a Zero Trust model based on risk-management tolerance.
Last but not least, the role of government standards-making bodies has undergone a similar whiplash—especially relating to cyber security. In 1990, the U.S. National Security Agency had, over some years, worked with industry to produce the Secure Data Network System (SDNS) standards, which were manifested through NIST, CCITT, ISO, MITRE, and IETF. At the same time, most other countries manifested their standards through the governmental bodies that provided public telecommunication services and coordinated through CCITT and CCIR. As the years passed, governments largely relied on private-sector standards bodies as part of what became known as a public-private standards policy. Today, however, ENISA has emerged as a major international ICT standards body implementing far-reaching new EU law. NIST has also assumed a quasi-regulatory standards stature leveraged internationally.
Notwithstanding the profound changes over the past three decades, most substantive GSC issues are persistent.
Who gets recognized to participate. The original GSC concept of an open, expansive organisation of standards organisations of all flavours has unfortunately not been pursued. What ensued was a handful of legacy bodies plainly focused on promoting themselves in a dramatically changing ICT world rather than dealing with substantive standards-making issues. The most successful organisation is ETSI because of its adaptability, large industry-driven constituency, freely available and well-versioned standards with permanent URIs and ETSI Forge code expressions, continuing awareness and outreach to other bodies, role as secretariat for 3GPP, and status as both a global and a European normative standards body. However, ETSI has also been caught in the sweeping change and rapidly expanding universe of standards bodies that includes huge global developer ICT organisations.
The notion advanced decades ago of “accredited” standards bodies or recognized SDOs always seemed aimed at disparaging other bodies. It remains counterproductive, if not disdainful and anti-competitive. Acting as an ICT standards cartel is not a good position to be in.
The GSC has no real value as it presently exists. The few individual members each pursue their own opportunities in the ever-shifting competitive standards marketplace and their collective support of 3GPP. The GSC’s continued existence and fundamental value proposition are dependent on expanding membership to include as many ICT standards bodies as possible—of all kinds, including those serving specific product and service developer communities as well as governmental standards bodies—and on dealing with the issues raised below.
ICT turf. In today’s ICT world, most standards bodies serve constituencies with a shared set of motivations, which is key, as well as shared perspectives and working styles. The resulting diversity inevitably produces standards that effectively compete in the marketplace—a result that has long proven its value. GSC should not be designating any one organisation as “prime” or its standards as the basis for marketplace offerings. Diminishing “duplication” of work on standards is a nice theoretical construct, but it is ultimately wrong-headed and fraught with anti-competitive implications and hazards. Furthermore, in the new virtualised ICT world, where code at a cloud data centre manifests tailored services, the “duplication avoidance” argument goes away. With open, competitive cloud platforms and provisioning, users can have the choice of running any trusted standards-based code.
Standards Availability. When the first GSC meeting occurred 33 years ago, one of the principal issues was the placement of paper-based standards behind paywalls and the transition to network-based access. It was evident then, and has been proven since through the success of many standards bodies, that freely available online standards are extremely beneficial to the effectiveness of any organisation in the ICT field. Fortunately, over the past three decades, the practice of maintaining paywalled standards has largely disappeared from most, but not all, bodies.
Those who continue the use of paywalls must contrive egregious devices to maintain a practice that faces continuous legal challenges. Additionally, where one standards body may want to reference relevant standards produced by another body, it may choose not to do so if the standard sits behind a paywall, because the standard cannot be sufficiently reviewed and referencing it does a disservice to members by forcing them to pay. A standards body or government agency that references a paywalled standard effectively becomes its sales promotion agent.
Even worse from a public policy and juridical perspective is the practice of leveraging regulatory mandates in order to sell referenced standards behind paywalls—which in many jurisdictions has been held unlawful. The GSC can help urge regulatory authorities and standards bodies alike to express requirements and standards in OSCAL, which NIST, CIS, and ETSI have begun to use and which the EU has funded for meeting new regulatory mandates, in order to facilitate an open marketplace in standards and collaboration among ICT bodies.
The topic of standards availability also includes collaboration on common attributes such as discoverability, permanent URIs, versioning, formats, and languages for not only a published standard, recommendation, guideline, or profile but also the ancillary expressions and code in ASN.1, XML, JSON, or other encodings. Those ICT standards bodies that still use paywalls have no permanent URIs or discoverability mechanisms and rely instead on diverse “sales points” that attempt to sell the standard for whatever the market will bear, sometimes demanding over 13 dollars per page for downloadable European normative ICT standards.
Last, and perhaps most importantly, as noted earlier, ICT technology and provisioning has shifted from dedicated, stand-alone networks, services, and devices to a global mesh of virtualised capabilities focused on tailored content delivery, constantly evolving applications, and mobility. Product or service specifications are manifested as code at cloud data centres and on network devices, instantly tailored and evolved for users. The ability to rapidly evolve and disseminate standards online, together with encodings to effect improvements and eliminate vulnerabilities to enhance resilience, has become increasingly important today—with an effective means of providing notice of what are, in effect, “standards patches” still missing. The continued use of ICT standards hidden behind sales-point paywalls using glacial development processes is a cyber security vulnerability.
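By way of illustration only, the following Python sketch shows the kind of minimal, machine-readable record a standards body could publish for each deliverable to support the attributes discussed above: a permanent URI, a version history, and links to ancillary code expressions. The field names and the example URI are invented for the sketch and do not reflect any body's actual schema.

```python
from dataclasses import dataclass, field

@dataclass
class StandardRecord:
    """Hypothetical machine-readable metadata for one published standard."""
    identifier: str            # body-assigned document number
    title: str
    version: str               # current version label
    permanent_uri: str         # stable URI that never changes across revisions
    encodings: dict = field(default_factory=dict)   # format -> download URI
    superseded_versions: list = field(default_factory=list)

# Illustrative example only; the document number, URI, and encodings are invented.
example = StandardRecord(
    identifier="XS 101 001",
    title="Example Widget Interface",
    version="2.1.1",
    permanent_uri="https://standards.example.org/xs/101001",
    encodings={
        "pdf": "https://standards.example.org/xs/101001/2.1.1/xs101001.pdf",
        "asn1": "https://standards.example.org/xs/101001/2.1.1/xs101001.asn",
        "json-schema": "https://standards.example.org/xs/101001/2.1.1/xs101001.schema.json",
    },
    superseded_versions=["2.1.0", "2.0.0"],
)

print(example.permanent_uri, example.version)
```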
Means of collaboration. One of the most profound changes occurring as the world of ICT virtualisation has scaled is the expression of both standards and even regulatory requirements as code. The evolution began slowly decades ago with the expression of standards as ASN.1 encodings with unique identifiers and trust mechanisms. Although the concept emerged in the CCITT in the 1980s, ETSI pioneered the implementations for multiple services and applications—making the code downloadable at permanent URIs. The IETF and OASIS followed—expanding to XML and JSON and providing for interoperability among those encodings and newly emerging ones. For some ICT standards organisations, the encoded expressions of the standards became the basis for collaboration. A few organisations, such as the Center for Internet Security, took this a step further and instantiated standards like the Critical Security Controls as hardened cloud operating system images and provided expression mappings among different cybersecurity frameworks.
Two years ago, several standards bodies and security agencies collaborated with NIST on an effort to apply such expressions to regulatory requirements and named it OSCAL—Open Security Controls Assessment Language. The platform enables an open, interoperable, and highly flexible ecosystem among the government or industry bodies establishing requirements and those producing the implementing ICT technical standards. In today’s world of virtualised networks, devices, and services spanning multiple jurisdictions and contexts, where intelligent running code is making decisions, OSCAL has become the ICT standards collaboration lingua franca. ETSI’s own Forge server now supports a diverse set of expressions for its standards, including OSCAL.
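For readers unfamiliar with what "requirements as code" looks like in practice, the short Python sketch below walks an OSCAL catalog in its JSON form and lists control identifiers and titles. It assumes a local file named catalog.json following the published OSCAL catalog layout (a top-level "catalog" object containing "metadata", "groups", and nested "controls"); it is a minimal reading aid under those assumptions, not a full OSCAL toolchain.

```python
import json

def iter_controls(node):
    """Recursively yield (id, title) for every control in an OSCAL catalog fragment."""
    for control in node.get("controls", []):
        yield control.get("id"), control.get("title")
        yield from iter_controls(control)   # controls may nest further controls
    for group in node.get("groups", []):
        yield from iter_controls(group)     # groups may nest groups and controls

# Assumes a local OSCAL catalog exported as JSON, e.g. a security-controls catalog.
with open("catalog.json", encoding="utf-8") as f:
    catalog = json.load(f)["catalog"]

print("Catalog:", catalog["metadata"]["title"])
for control_id, title in iter_controls(catalog):
    print(f"  {control_id}: {title}")
```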
An important new global need has emerged for a consistent, universal, rapid means for discovery of and access to the code expression sites of ICT standards and requirements bodies for collaboration among standards organisations, developers, and users. A more universal GSC organisation could facilitate its creation.
Running Code and testing. The admonition to those assembled at the first meeting to focus on “running code” has remained an ideal quest over the years. It has become increasingly important, and organisations such as 3GPP, ETSI, or the IETF, which are running-code oriented, also host plug-tests and hackathons to constantly improve their specifications. Indeed, the entire massive global mobile communications network and services infrastructure community advances the specifications in numbered versions similar to computer operating systems. Yet some standards bodies—especially for cybersecurity—produce standards that are little more than generic admonitions for ineffective certification purposes and rely on regulatory mandates for a quantum of relevance.
Cyber Security information sharing. An increasing number of ICT standards organisations of all kinds are adopting cyber security mechanisms related to their own standards, such as the reporting of discovered vulnerability information. It is critically important that such vulnerabilities be disclosed to appropriate global, regional, and national vulnerability authorities, including the responsible standards body itself, and shared. The most widespread reporting format is CVE (Common Vulnerabilities and Exposures)—ideally with CVSS (Common Vulnerability Scoring System) scoring. ETSI, 3GPP, GSMA, and IETF have been leaders in establishing Coordinated Vulnerability Disclosure mechanisms for their own standards. However, no collaboration has occurred among all the diverse organisations, nor does a discovery and trusted access mechanism exist for sharing the information.
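As a hedged illustration of what shared vulnerability reporting can look like at the data level, the Python fragment below assembles a minimal advisory entry around a CVE identifier and derives the qualitative severity band from a CVSS v3.x base score using the published rating thresholds. The record layout and the placeholder CVE number are assumptions for the sketch, not any body's actual exchange format.

```python
def cvss_v3_severity(base_score: float) -> str:
    """Map a CVSS v3.x base score to its qualitative severity rating."""
    if base_score == 0.0:
        return "None"
    if base_score <= 3.9:
        return "Low"
    if base_score <= 6.9:
        return "Medium"
    if base_score <= 8.9:
        return "High"
    return "Critical"

# Minimal, illustrative advisory record; field names and values are placeholders.
advisory = {
    "cve_id": "CVE-2023-0000",                 # placeholder identifier
    "affected_standard": "XS 101 001 v2.1.1",  # hypothetical standard reference
    "cvss_base_score": 8.1,
    "summary": "Example weakness in the specified key-exchange profile.",
}
advisory["severity"] = cvss_v3_severity(advisory["cvss_base_score"])

print(f"{advisory['cve_id']} ({advisory['severity']}): {advisory['summary']}")
```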
Proactive continuing standards activity discovery. Most viable standards bodies today are industry contribution driven and constantly evolving. GSC annual briefings of work or formal liaison communications between standards bodies are a generally wasteful, if not counterproductive, practice. Almost all effective ICT standards bodies make their work openly accessible online.
What is needed are consistent mechanisms for constant open, proactive discovery and analysis at the working level to see what other groups are doing at any moment and the extent to which the work can be leveraged or gaps discovered.
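One way to picture such a mechanism, sketched in Python under assumed conditions: if each body published a machine-readable listing of its active work items at a well-known URL, a working group could poll those listings and diff them against a local cache to spot new or changed items. The URLs and JSON layout below are hypothetical; no such common convention exists today, which is precisely the gap.

```python
import json
import urllib.request

# Hypothetical well-known work-item listings; no such common convention exists today.
FEEDS = {
    "body-a": "https://standards.example.org/.well-known/work-items.json",
    "body-b": "https://sdo.example.net/.well-known/work-items.json",
}

def fetch_items(url: str) -> dict:
    """Return {item_id: title} from a (hypothetical) JSON work-item listing."""
    with urllib.request.urlopen(url, timeout=10) as resp:
        items = json.load(resp)
    return {item["id"]: item["title"] for item in items}

def diff_against_cache(body: str, current: dict, cache: dict) -> list:
    """List items that are new or retitled since the last poll."""
    previous = cache.get(body, {})
    return [(i, t) for i, t in current.items() if previous.get(i) != t]

cache: dict = {}  # in practice, persisted between polls
for body, url in FEEDS.items():
    try:
        current = fetch_items(url)
    except OSError:
        continue  # listing unavailable; skip this body for now
    for item_id, title in diff_against_cache(body, current, cache):
        print(f"{body}: new or changed work item {item_id}: {title}")
    cache[body] = current
```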
Antitrust. The potential use of standards development processes by participant members to create standards that stifle competition is a common concern in all ICT standards bodies. Both competition and judicial authorities have become increasingly focused on this anti-competitive behavior, and in some cases, exacted penalties. In the past few years, the legal counsel of standards bodies have sought to collaborate on developing good antitrust policies and practices. ETSI and the ITU have been leaders in this area. GSC could play a useful role in maintaining collaboration and adopting antitrust best practices for ICT standards bodies.
Intellectual Property Rights. ICT-related patents constitute an enormous hidden standards universe that exists against the backdrop of standards body activities. The availability in the past few years of free, global, online patent search capabilities for standards-making features has revealed the enormous size of this universe and provides the ability to discover undisclosed patent holdings. Many companies face a constant challenge of whether to give away their standards IPR to a standards body or rely on patent protection instead.
Patent-related litigation is an enormous hidden business expense that has an adverse effect on standards activities. For example, IPR contributed to a standards body by a member can be converted into a patent by another party. In other instances, a member may contribute IPR for a standard that was already the subject of one of its own patents, without notice, and subsequently attempt to extract unfair licensing fees. For many years, GSC maintained a standing committee of experts to collaborate on dealing with these issues and develop best practices. It deserves to be continued.
Membership promotion and engagement. Participation in any standards body is resource intensive, and in a highly competitive ICT standards world, potential members are constantly examining where they place those resources—including the rare experts who can effectively participate. Companies, government agencies, and other institutions also weigh the alternatives—obtain IPR protection, publish your own product specification online, and get it into the marketplace as fast as possible. Become your own standards body. In a virtualised ICT world, this alternative is feasible and attractive if you are large enough or can get subsidized. A second alternative is to form a specialised product group consortium and keep the costs low.
As a result of this significantly changed ecosystem, most standards bodies today are constantly examining ways to enhance their value to existing and potential members. Those with high entry costs or other barriers to effective participation, such as mandatory attendance at multiple meetings, or that offer a minimal value proposition, are facing attrition. Those bodies still maintaining paywalled standards are in a particularly difficult situation because the participatory costs, coupled with the enormous prices charged for the standards, significantly diminish use and impede their review and evolution—as well as participation. GSC can lead the way for transparency in standards use via download metrics.
Membership engagement also increasingly includes efforts to enhance diversity and involvement of small and medium-sized companies as well as academic institutions and even individual experts willing to contribute their knowledge. New regulatory regimes, such as those emerging in the EU, place special emphasis on enhancing inclusion and diversity.
Standards organisation governance. Standards body governance is a perennial issue for most larger, generic ICT standards organisations. The topic is entwined with who has the right to approve standards products, the processes used, the fees paid, facility and secretariat finances, and leadership positions.
The organisations with significant challenges are those of an intergovernmental character where approval lies in the hands of national government officials, those with membership based entirely on continuing physical attendance, and those that are part of larger organisations with high overhead and dependent on publishing revenue. These bodies often spend endless unproductive hours and even dedicated conferences trying to game the structure and remit of the organisation to maximize the perceived benefit of influential participating entities. The ITU-T still does it every four years—as it did a hundred years ago. All ICT standards organisations today face governance challenges and can significantly benefit from some measure of collaboration on beneficial common approaches that extend their value proposition.
Standards virtual meeting platforms. The COVID period of the past few years advanced the use of virtual meeting platforms more dramatically than at any other time in history. Most bodies now use the platforms extensively in differing combinations with physical meetings. The practice is especially attractive to participant members for dealing with resource constraints involving travel and expert personnel. Additionally, there is a generational factor: younger employees who possess valuable technical expertise seek more control over their lifestyles. In some ICT standards-making environments today, the virtual platforms, document repositories, and code are merged together. The shift here has occurred so rapidly that there has been little, if any, dialogue among standards bodies concerning this changed participatory paradigm, much less collaboration on it. Clearly, good virtual meeting capabilities are rapidly becoming a necessity for any ICT standards body that wants to remain valued.
Identifier and parameter registration authorities. Most standards bodies produce specifications that, to varying degrees, require obtaining or registering unique identifiers and parameters to implement the specification in a product or service. The availability of these identifiers and parameters—their registration and the network-based availability of the registration information—is essential not only for implementation but, in many cases, for resilience and trust as well. In some cases, the identifiers are subject to treaty requirements and made available through third parties in national governments or industry. Identifiers like domain names constitute billion-dollar industries with their own governance mechanisms. The availability and management of the identifiers can also give rise to significant antitrust, IPR, cybersecurity, and regulatory issues. Standards bodies like the CA/B Forum exist at the critical crossroads of identifiers, trust tables, and running code on nearly every network device and many applications.
In most cases, the management of identifier registrations and availability of the information is done by the standards body. In some cases, it is outsourced to a third party. The activity is one of the most important components of successful ICT standards-making, yet there are no common standards or collaboration among the bodies on discovery, trust methods, or availability.
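A minimal sketch, assuming an entirely hypothetical registry, of the kind of uniform lookup interface that is missing across bodies today; the identifiers, registrants, and references below are placeholders.

```python
# A minimal, hypothetical in-memory registry of identifiers and parameters,
# illustrating a uniform lookup that no common standard currently provides.
REGISTRY = {
    # identifier              (registrant,            defining standard)
    "1.3.6.1.4.1.99999":      ("Example Corp",        "XS 101 001"),   # placeholder enterprise OID
    "urn:example:profile:1":  ("Example Consortium",  "XS 202 002"),   # placeholder URN
}

def lookup(identifier: str):
    """Return (registrant, defining standard) for a registered identifier, or None."""
    return REGISTRY.get(identifier)

print(lookup("1.3.6.1.4.1.99999"))
```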
Standards secretariat operations. Every standards body has some kind of secretariat for operational support and legal necessity. They may be permanent and large, shared, or temporary. The secretariats play significantly varying roles, but they all have a common set of challenges relating to the nature of the support, personnel, archives, legal requirements, and contact information. Here also, the many hundreds of ICT standards body secretariats—most of which operate globally—continue to exist independently of each other, with all manner of networking among them ranging from relatively omniscient by design, like ETSI, to utterly insular. Somewhat incredibly, no mechanism exists to register, discover, and contact ICT standards secretariats.
Regulatory and globalisation environment changes. The regulatory and globalization norms for providing ICT products and services have changed and will continue to change dramatically. In 1990, most ICT networks and services were relatively fixed and primitive and either provided by government agencies and coordinated through ITU bodies or heavily regulated. Efforts through the GATT/WTO to open up global ICT market opportunities were occurring—based on the standards recognition of two legacy bodies who heavily lobbied to ensure their existence as a kind of standards cartel. That world is long gone.
Today, the quest in most nations and regions for a measure of resiliency and national security in an interconnected, virtualized ICT world has resulted in a Zero Trust environment which has significantly adversely impacted the old global open market norms. Supply chain management is paramount.
Coupled with this fundamental regulatory-market paradigm shift is the adoption of regional and national normative mandates that range from wholesale denial of market entry based on national origin to the EU’s expansive array of 150 different ICT Directives, Regulations, and Decisions with extraterritorial applicability. One proposed new regulation for cyber resilience descends to the essentially impossible level of identifying standards for “digital elements” in every ICT product and service—physical and virtual—and applying compliance requirements.
Standards bodies today that ignore these fundamental shifts and fail to be cognizant of the regulatory requirements do so at the risk of irrelevance. Effective means for identifying these requirements, mapping them to standards activities, and leveraging tagging mechanisms like the EU’s European Legislation Identifier (ELI) coding are essential tools in the global standards ecosystem today.
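As a small illustration of such mapping, the Python sketch below associates ELI identifiers for EU legal acts with the standards work believed to address them. The ELI URI pattern shown follows the published ELI convention; the specific standards mappings are invented placeholders, not an authoritative cross-reference.

```python
# Hypothetical mapping from EU legal acts (identified by ELI URIs) to standards work
# believed to address them; the standards references are placeholders for the sketch.
ELI_TO_STANDARDS = {
    # Regulation (EU) 2016/679 (GDPR)
    "http://data.europa.eu/eli/reg/2016/679/oj": ["XS 303 003", "XS 303 004"],
    # A further, hypothetical regulation
    "http://data.europa.eu/eli/reg/2024/0000/oj": ["XS 404 001"],
}

def standards_for(eli_uri: str) -> list:
    """Return the standards tagged against a given legal act, if any."""
    return ELI_TO_STANDARDS.get(eli_uri, [])

for eli, standards in ELI_TO_STANDARDS.items():
    print(eli, "->", ", ".join(standards))
```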
Government ICT standards bodies. One of the more significant recent developments, operating as an adjunct to the regulatory and globalisation environment changes, is the re-emergence of major government ICT standards bodies. The two most prominent today are ENISA and NIST, and they significantly disrupt the existing competitive private-sector ICT standards marketplace. Both have begun promulgating their own standards with regulatory force and effect—even though the expertise exists primarily in industry. And, when private-sector ICT standards body specifications are included by reference (as has occurred for cloud cyber security standards and frameworks), the references tend to point primarily to a single private ICT standards organisation with paywalls.
If the past many decades of ICT standards activity have taught us anything, it is that reliance on single standards bodies does not produce the best standards. NIST and the EU’s Horizon 2020 programme, to their credit, have recently embarked on a meta-layer solution for enabling an open, competitive, adaptable market for ICT standards known as OSCAL (Open Security Controls Assessment Language).
Collaboration on the challenges emerging from all of the significant enduring issues enumerated above provides an ample basis for the existence of a Global Standards Collaboration organisation and discussion at the London meeting in April. No global venue for collectively considering these issues and challenges exists, much less finding solutions across the entire ecosystem. Given the competitive ICT standards environment today, no single existing organisation or small subgroup could serve as that venue.
Cerf’s admonition to the first meeting 33 years ago concerning “running code” has, in many ways, become the most fundamental predicate for successful standards-making today. Indeed, running code is the entire basis of the huge platform developer standards organisations—where the collaboration is code based. The de facto ICT standards world is routing around the legacy bodies.
However, given the GSC’s 33-year history and its current existence as a closed organisation of 14 legacy standards bodies largely promoting themselves, it will be difficult to change its direction to be fully open, as originally intended, and to deal with the significant substantive issues facing all the many ICT standards bodies today. That is the GSC meta challenge.
Even if GSC23 does nothing, the occasion—including this paper, which now has several contributors and potential widespread distribution—provides the opportunity for a frank dialogue on the history of the entire ecosystem, together with the issues, needs, challenges, and possible ways forward (or not). Over the past half century, in which I have been active in many of these organisations in different contexts, as well as writing their histories going back to the Chappe “optical internet” of 1791, the core issues have remained unchanged—even as the participant views have swung between extremes. It may be that global ICT standards body “Brownian Motion” is the inherent condition of the universe.