Last week I pointed out a potential problem with the user experience if, as envisioned, a large number of new generic Top-Level Domains (gTLDs) are added to the root at the same time. The problem has nothing to do with the new gTLDs themselves. Rather, it's the lack of updated procedures and communication campaigns aimed at application and software vendors. The objective would be to alert them in time and equip them to swiftly update their programs (email, form fill and the like) to recognize and handle new gTLDs, so that the user experience is consistent with existing TLDs.
To date, I have received ten comments confirming that the problem does in fact exist. Some comments focused on whether a company, an application vendor, or even a service provider has the right to block certain TLD extensions. That is in fact a different question. Companies, service providers and potentially even some governments, if they have their way, do and will continue to block certain TLD extensions and even specific web addresses. That's a conscious decision, targeted at specific extensions. The problem I'm concerned about is software and application providers who, for lack of information and updated procedures, would unwittingly end up blocking a slew of new gTLDs.
The other issue I mentioned was the incentive “for application developers to update systems and applications in a timely manner.” By “timely manner,” I mean within a timeframe that coincides with the launch of the first batch of new gTLDs. Within days or even months of the launch of a new TLD, there will likely be very little content behind the domain names registered. Many registrants will not yet have configured their nameservers or updated their MX records. All this means there is very little resolution compared to an established TLD. So trying to make a case to the application providers based on user needs will ring hollow, at best. Plus, it is they, not the new registries, who will have to bear the cost of any development and QA.
So my challenge remains: “What is the most efficient way to communicate the changes and their timing to application developers soon enough that the first registrants of the new TLDs, and their end-users, will have the same experience they have now with a well-established TLD?” It seems to me, and to some of the commenters, that one option is for ICANN to coordinate a program within the “Four Month Communication Window” to communicate the upcoming changes and their implications to the software and application developer community. But there is also a need for an updated set of clear and efficient procedures. Is this something the IETF can and should cover? If not, then who?
If new gTLDs are to be launched by next year, there isn’t much time.
And lastly, what are the right incentives for the developer community to follow through within the timeline that best corresponds to the launch window? Should the incentives be designed to offset cost? To target the largest applications? Or to reward the “early adopters”? And who should provide them?
Having clarified the problem as an unwitting omission of most new gTLDs, as opposed to a conscious exclusion of specific TLDs, I hope it’s clear that the problem is about preserving the credibility and confidence of registrants and end-users in all of us as an industry. It will neither be seemly nor sufficient to point fingers at software and application providers when it becomes apparent to the end-user that many of the new gTLDs simply will not work the same way the existing ones do.
Given studies that demonstrate that the first few months of a TLD launch are critical, this is just one of the many hurdles (and one of the relatively minor ones) that new TLDs will need to address to raise their odds of success. While it’s encouraging that my previous posts caught the attention of many industry insiders, the next year may find many new teams in the TLD marketplace. Do they have the bandwidth to foresee such issues? For those hoping this new TLD frontier is the land of plenty, we should not only hope, but also prepare.
Another option might be to have the registries work directly with ICANN on the communication process. I also don’t really see these issues being remedied in just a couple of months via a communication campaign. The registries should therefore consider soliciting feedback from those who purchase their domains. Any issues brought to their attention could be routed to the respective entities for resolution, and any issues not remedied within a certain period of time could then be made public for all to see.
This problem is larger than what registries or ICANN alone can do. Application vendors, software providers and web-based service providers don’t really pay attention to the TLD space, and don’t care about making changes unless it becomes critical.
I spent the first five years of running .INFO (the first non-three-letter TLD in the world) working with many vendors to get their systems to accept .INFO email addresses as valid. Ten years in, that problem still has not gone away.
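The kind of rejection Ram describes often comes down to a hard-coded validation pattern. Here is a hedged sketch of the historically common regex style (illustrative only, not any particular vendor's code), which caps the final label at three letters and so rejects .INFO addresses:

```python
import re

# A once-common email pattern that hard-codes the TLD at 2-3 letters,
# so .info (and every longer new gTLD) is rejected as "invalid".
NAIVE_EMAIL = re.compile(r"^[^@\s]+@[A-Za-z0-9-]+(\.[A-Za-z0-9-]+)*\.[A-Za-z]{2,3}$")

def looks_valid(address: str) -> bool:
    """Apply the naive pattern; returns False for perfectly real .info addresses."""
    return NAIVE_EMAIL.fullmatch(address) is not None

print(looks_valid("user@example.com"))   # True
print(looks_valid("user@example.info"))  # False -- only because .info has 4 letters
```

A vendor fix is trivial once they know it's needed (widen or drop the `{2,3}` quantifier); the hard part, as the post argues, is telling them in time.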
Alexa, great article, as always. I think that the topic is a good one, and I’ll take a moment to tack a comment on here with a request to the registry community.
We have technology innovators like Ram Mohan and countless others who work with the community to elevate awareness of the changes and evolution of the namespace.
ICANN seems to have a strong emphasis on Universal Acceptability, so that the new TLDs of all types are well known and prepared for.
I am also constantly impressed that Yahoo, Microsoft, Google, Time Warner, T-Mobile, BT, Verizon, and other companies that develop applications or hold a stake in Internet naming are present at the ICANN meetings, asking highly sophisticated questions that indicate a strong grasp of the complexities and interoperability of the naming system with applications.
More TLDs, en masse, alter the known universe of TLDs for applications.
The root zone, with the exception of some churn in the ccTLD space, has predominantly been static. As Ram points out, one issue to overcome was the expansion of character length. Although .ARPA (and .NATO, before it was migrated under .INT) were longer than three characters, the rule for almost two decades for validating a TLD was: 2 characters = ccTLD, 3 characters = gTLD, and more than 3 characters = bogus, filter it out.
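That two-decade rule of thumb fits in a few lines. This is a sketch of the heuristic described above, not any particular product's code, but it shows why everything longer than three characters simply got filtered:

```python
def classify_tld_by_length(tld: str) -> str:
    """The long-standing length heuristic for validating a TLD:
    2 chars = ccTLD, 3 chars = gTLD, anything else = bogus."""
    tld = tld.strip(".").lower()
    if len(tld) == 2:
        return "ccTLD"
    if len(tld) == 3:
        return "gTLD"
    return "bogus"  # filtered out -- exactly what breaks .info, .museum, and new gTLDs

for t in ["uk", "com", "info", "arpa"]:
    print(t, "->", classify_tld_by_length(t))
```

Any application still carrying this assumption will silently discard an entire batch of new gTLDs on day one.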
That’s if you simply work from the IANA root list of TLDs.
To turn that up a notch, the world is not even that simple. As you expand into the ccTLD space, there’s also the matter of how many dots (e.g., in the United Kingdom, registrations go under .co.uk and are not allowed directly under .uk), what is appropriate as a subdomain, and at what level. ccTLDs vary widely in their registration policies. There are still, for example, legacy delegations to cities or schools in the US that sit at the fourth level or deeper per RFC 1480 and are still operating.
There’s such a depth of complexity in and around how the domain system works out in the real world beyond the registries, and application developers don’t necessarily have any kind of early warning or heads-up about what is coming down the path.
Or if they do, there’s really not an authoritative source. But the registries are not all known to the application developers, and vice-versa.
So….
I saw this coming and stepped up. It is my privilege to have been selected to serve as a volunteer with the Mozilla Foundation on a very ambitious and practical thing called the Public Suffix List, or PSL for short. http://publicsuffix.org.
The folks at Mozilla had an initiative under way that I have been able to contribute to with my wealth of contacts in the Registry realm, helping to bridge these universes a bit better.
The PSL is actually used by a humbling number of derivative works (search engines, anti-spam, security, programming languages, etc.) as an authoritative list of what is or is not a TLD. The value exists because the list is a central place to update, and the updates trickle out to numerous beneficial consumers in the application developer space.
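The idea behind those derivative uses can be shown with a small sketch. This is a deliberately simplified, hedged illustration: the real list is downloaded from publicsuffix.org and includes wildcard (`*`) and exception (`!`) rules that this toy version omits, and the handful of hardcoded rules below stand in for the full list.

```python
# Toy excerpt of PSL-style rules; the real list at publicsuffix.org is far
# larger, and it is the registries who keep their entries current.
RULES = {"com", "uk", "co.uk", "info", "museum"}

def public_suffix(domain: str) -> str:
    """Return the longest rule matching the end of the domain
    (toy version: no wildcard '*' or exception '!' rules)."""
    labels = domain.lower().split(".")
    for i in range(len(labels)):          # longest candidate suffix first
        candidate = ".".join(labels[i:])
        if candidate in RULES:
            return candidate
    return labels[-1]  # PSL convention: an unlisted TLD is its own suffix

def registrable_domain(domain: str) -> str:
    """The public suffix plus one more label, e.g. example.co.uk."""
    suffix = public_suffix(domain)
    labels = domain.lower().split(".")
    keep = suffix.count(".") + 2          # suffix labels plus one registrable label
    return ".".join(labels[-keep:])

print(registrable_domain("www.example.co.uk"))    # example.co.uk
print(registrable_domain("foo.bar.example.com"))  # example.com
```

This is why a central, registry-maintained list matters: no length heuristic or "last label" guess can tell a browser that co.uk is a suffix while example.com is a registration, and every new gTLD is one more entry the consumers of the list pick up automatically.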
It is trusted because the registries themselves add and update their entries directly, and a group of people validate and approve the changes.
And registries are updating and maintaining their entries because of the trickle-out benefit from the central repository.
The catch is, not all registries know of the benefit or even existence of the PSL.
I am working on elevating the profile of the PSL and increasing the value to the community from making sure it is updated by registries.
This is thankless volunteer work that will help everyone’s new TLDs work with the browsers and other software, but I felt it was a good way to contribute in a meaningful way to the community while we all wait for the new TLDs.
Adding new TLDs will increase the workload of maintaining the list, but ‘the juice is worth the squeeze’. The value that new TLDs bring to the community at large will be that much greater if others step up now and look at what we can do, proactively and collaboratively, to make new TLDs work in applications.
Registries, please do your part and update your information at the PSL website once you are root listed in the IANA list.
I will be at the ICANN meeting in Singapore and am glad to explain or clarify and aid the community on this important work.
It is a good opportunity for me to give back to this great industry and advance / enhance universal acceptability.
Hello all
and thanks for replying and commenting. So far, we all seem to agree that this problem is indeed brewing, and that if many TLDs are added while things are left in their current state, the user experience will be poor. And based on the issues Ram says .INFO continues to experience, many of the new TLDs may not see relief even in the next five years. Jothan, I am not very familiar with the project you mentioned (http://publicsuffix.org), but from the description it seems the closest thing to a solution I have heard thus far.
I do not think any single body will be able to resolve it, but someone should take the coordinating role and identify who needs to know what, and when. Does that role belong to ICANN? Or ISOC? Or someone else? ICANN’s role is “coordination of the Internet naming system,” and ISOC’s is “ensuring use of the internet for the benefit of people” and “provid[ing] leadership in addressing the issues that confront the internet.” Both seem likely candidates, but neither has stepped up. Even if the coordination role is defined merely as spreading the word about the http://publicsuffix.org project, some responsible organization needs to take up the mantle.
Jothan, I applaud you for volunteering for PSL, but you alone cannot spread the word, even if the venue is at an ICANN event.
Too kind, and a very much appreciated kudos. I needed something to do while we all wait for Godot.
I accept, and also I pay the homage forward to the others on the PSL who started the effort.
Amen. But alas, this is all volunteer work, so it does not have the budget it should to properly market and spread awareness.
That said, ICANN seems very interested in, and quite active on, the topic of universal acceptability, and I think there are areas of nexus between the PSL effort and the universal acceptability program and the associated outreach that can be of mutual benefit. This will certainly be discussed in Singapore at the ICANN meeting.
So many good-works initiatives for the community are waiting on the ambiguities about the start dates of the new TLD program to be resolved; I remain cautiously optimistic that the process moves forward on June 20.