
The Blurr-Cade Proposal on Root Zone Oversight

Becky Burr (a former NTIA official) and lobbyist Marilyn Cade have made a proposal to create a multilateral working group to oversee updates to the root zone file.

I would characterize the Burr-Cade proposal as a “small step for mankind and a giant step for the US”, to paraphrase Neil Armstrong. The main merit of the proposal is that it looks like something the USG might be willing to follow.

Several people suggested there should be no governmental oversight at all, but that does not look realistic, given the huge economic and political interests that can lie behind ICANN decisions. Historically, governments have always been involved in foreign economic policy decisions (the WTO, for example) and would be blamed by their people if they were not. ICANN is yet another such process.

Like it or not, there are sovereignty issues linked to ccTLDs. There is no way one could exclude governments from the decision process.

Regarding the composition of this oversight group, I would say that the European representation should be reviewed. It would not be accepted by the EU that the UK, itself an EU member, gets a seat of its own. I would rather expect two seats for the EU: one permanent and one rotating with the EU presidency. The third seat should go to non-EU countries.

The authors of the proposal show they have no clue about regional political weightings. Rather, they suggest American-friendly countries. It would be wiser to let regional governmental councils (the African Union, the Organization of American States, etc.) designate their representatives.

All in all, I would personally support this proposal as a starting point for discussions. It is, incidentally, close to what I suggested in a reply to a WSIS questionnaire last year: that oversight of the root should be handled by a sub-committee of the GAC.

Update: Becky Burr will discuss her proposal during the ICANN Strategy Committee Consultations today at 9:00am (Marina del Rey time). There will be a live audio feed.

By Patrick Vande Walle, All around Internet governance troublemaker

JFC Morfin  –  Jul 23, 2006 12:14 AM

I am afraid this proposition is utopian, and I do hope the USG will resist it. It proposes an unworkable ICANN private UN to take over control of the management of a file that serves the US interest only so long as the US interest is not hurt. It just adds ICANN bureaucracy.

In addition, this will not work. The current NTIA file is of interest only because it is the NTIA file. What matters is the US-backed stability of this file, while the root server system concept cannot survive linguistic TLDs, aliases, and keywords.

The urgent need is for the IETF to document the post-RSS ISP management procedures, in order to get credible keyword management and TLD stability. The status quo is what preserves the DNS right now (however, I object to it because it is not stable, is anti-innovative, and does not push the IETF to document the next steps).

Let us not get trapped by our own joke. Were the root server system to stop, I think the Internet would hardly notice. But if the NTIA root file’s credibility were affected, the many who have never been told how well the DNS works with various root files and naming systems would get confused - with a dramatic impact on economies.

Milton Mueller  –  Jul 24, 2006 1:59 PM

Just a simple correction. The proposal was written and developed by Ms. Burr; Marilyn Cade seems to have endorsed it but did not participate in drafting it as far as I know.

Milton Mueller  –  Jul 24, 2006 2:21 PM

A more substantive disagreement. I think you, like many people, confuse two distinct things.

1. The importance of Internet policies and the need, or even the inevitability, of governments playing a role in how those policies and rules are made.

2. Control over modifications of the Root Zone File (RZF).

Let’s create a sharp distinction between the two. Governments can have 1 without 2. Since ICANN is a global organization and needs to be accountable, governments can and should establish global rules that help to make it accountable. For example, if ICANN or its successor abuses its authority, breaks its own rules, cheats, steals, etc., it needs to be held accountable. Governments need to work out how to apply competition policy and law, and trade rules, to ICANN. That’s all legitimate government business.

But that does not mean that governments need to have or should have some kind of veto power over modifications of the root zone file. It seems to me that giving governments—any one or any collection of them—some kind of final veto power over the RZF is just asking for trouble. The Burr proposal tries to deal with this by saying that governments can intervene only to protect technical stability and security. But this is like telling a fox he can only eat one of the chickens he is guarding when the stability and security of the farm is threatened. The fox will always want to eat the chickens, and will use any excuse he can to define something as a threat to the farm’s stability and security.

Governments and their representatives are not likely to have any clue as to what RZF changes affect the technical security and stability of the Internet. But they will know how their political interests are affected. They will want to control or affect the RZF for political reasons, not technical ones.

So I am unimpressed with your statement that it is “unrealistic” to have no government oversight of the RZF at all. This is a typically naive internet technical person’s view of governments and law. As someone who has studied how governments regulate industries and how they relate internationally, I think it is highly “unrealistic” to think that broadening control from the US to more governments will accomplish anything constructive. Governments need to leave DNS administration and policy in the Internet community’s hands and focus on how that self-governing structure can be made accountable to internationally established rules. The distinction is perhaps subtle but it is vital that we understand it.

Patrick Vande Walle  –  Jul 25, 2006 6:01 AM


I do not think the two points are mutually exclusive. Yes, ICANN needs to be accountable to the world community, which is much larger than its current stakeholders, who, for the most part, have a business to protect.

The Burr proposal is saying that the working group can intervene “for the limited purpose of determining whether or not the proposed addition, deletion, or change creates an unreasonable risk to the stability or security of the DNS and/or the Internet”. We know that most of the updates to the RZF are purely technical housekeeping. I think we agree that, to cover that part, the working group would need to rely on technical experts, not government officials. However, since the decision of the working group would be collegial, it would be better accepted by governments than the current unilateral process.

Governments could become a major source of instability for the DNS and the Internet if they started mandating the use of alternative roots by law. This would happen if they felt frustrated by what they saw as an unfair process. Adding collegiality to the decision may help a lot.
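For context, “using an alternative root” requires no protocol change at all: for a recursive resolver it is a one-stanza configuration choice. A minimal BIND sketch (the hints file path is hypothetical):

```
// named.conf fragment: replace the standard root hints with an
// alternative root's server list (the file path is hypothetical).
zone "." {
    type hint;
    file "/etc/bind/alternative.root.hints";
};
```

A government mandate would only need to require ISPs to ship such a file, which is what makes the scenario above a realistic threat rather than a theoretical one.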

I see the Burr proposal as a step out of the current, unsatisfying situation. Let’s give it a try. There may be other steps later to correct some weaknesses of the model that may appear over time.

And of course, it may be that a few years from now the name-to-address translation system will have evolved to the point where it no longer requires a single coordinated root to function effectively. Or that we will not need name-to-address translation at all.

Kieren McCarthy  –  Jul 25, 2006 8:05 PM

There is already a government in overall charge of the root zone file, so can we honestly expect to get to a position where there isn’t governmental influence in the future?

I think the solution is to provide governments with the mechanism to oversee the file but set it up in such a way that the governments have to thrash out all the problems between themselves first. That way, stalemate means things continue as they are.

I think a good model would be something along the lines of what is being proposed for a reformed UN Security Council. You have a core of countries and a rotating selection of other countries according to region. The veto system means that even if two countries are at loggerheads - as the US and Russia were - the worst that can happen is stalling.

Paul Twomey revealed on Friday that ICANN has asked Hans Corell - the UN’s legal counsel since 1994 - to look at possible models for ICANN to get it away from the US company status it lives in at the moment (presumably Carl Bildt had something to do with it as he’s on the ICANN Pres Strat Committee and both of them are Swedish).

I suspect Mr Corell knows a damn sight more than I do about the UN Security Council and getting governments to live within your organisation without dragging it down. And I suspect he has also been asked to come up with a usable government oversight mechanism.

But we shall see if any of this is even discussed at the NTIA meeting tomorrow.


Milton Mueller  –  Jul 26, 2006 12:11 AM

Kieren and Patrick:

The turn taken by this dialogue is sad. There are many good points about the Burr proposal. They call upon the US to issue assurances that they won’t abuse their unilateral power. They call upon it to renew its commitment to privatization. They call upon the US to lead by example. Somehow, you’ve taken the most poorly thought-out, unworkable element of the plan and gotten all enthusiastic about it.

Governments don’t need and shouldn’t have control of the RZF. You don’t solve the problem of ICANN oversight by giving them this power. It is the most misdirected attack on a problem since the US invaded Iraq because of Osama bin Laden’s attack on New York and DC.

Kieren points out that we already have a government overseeing the root. Indeed, we do. And what has been the result? Increasing exploitation of that control for political purposes. Increasing demands by other governments for similar forms of control. You want to bless this situation as fundamentally correct? You want to institutionalize it and make it permanent? Why? Those governments don’t want to join in the fun because they are sticklers for technical stability. Either they want to exploit that control for their own political purposes or, at best, counter what they see as US power. As I said before, governments are interested in this only for political reasons, not technical ones. So why are you proposing to give them a technical administrative task? We need to get out of that dysfunctional dynamic completely.

Kieren, let’s have a conversation in Washington about the UN Security council. Before you hold it up as a good model, do some reading. Find out about how it really works, what kinds of things get dragged into it and how they get handled. Ask yourself why everyone in the UN believes that it needs to be reformed, and yet it still cannot be reformed. Understand what the security council is about: War. Force. Competition for power. That’s what governments do, see? Is that the kind of dynamic you want at the center of the Internet? I would hope that someone who aspires to the ICANN Board would think more deeply about this problem. 

Patrick: you say that the working group can intervene “for the limited purpose of determining whether or not the proposed addition, deletion, or change creates an unreasonable risk to the stability or security of the DNS and/or the Internet”. This made me laugh out loud. Do you think governments will be bound by Burr’s good intentions? Tell me: who do you appeal to, what do you do, when governments intervene (as one already has in the .xxx case) for purely political reasons and insist there is a “stability” rationale? Are you aware of the kind of crimes the USG has gotten away with recently by appealing to “security”? What makes you think things will suddenly be different?

Patrick goes on to say: “We know that most of the updates to the RZF are purely technical housekeeping. I think we agree that, to cover that part, the working group would need to rely on technical experts, not government officials.” No, my friend. That is precisely wrong. As I have said a thousand times before, governments are driven by politics and power, just as businesses are driven by the profit motive. Asking the world’s governments to safeguard the technical stability of the internet via the root zone file is like asking a muscle-bound bouncer at a bar to be the arbiter of which vintages of wine are the tastiest. The task doesn’t match the person tasked. If you put governments in control of RZF changes, you are telling them that they are in charge and that no technical changes can be made without their approval. And their approval will hinge on political considerations. It’s asking for trouble.

The only long-term, stable solution to DNS administration is to delegate it to a private, internationally accountable and inclusive governance structure. Governments need to channel their power into the areas where it belongs: rule-bound, lawful oversight of that governance structure’s fairness and accountability. Making ICANN accountable, in procedure and substance, to the public interest. We must insist that governments take up this task, and not cave in to their demands for the much easier, but less productive task of getting veto power over the global RZF.

Patrick Vande Walle  –  Jul 26, 2006 6:07 AM


You say that the Burr proposal “calls upon the US to issue assurances that they won’t abuse their unilateral power. They call upon it to renew its commitment to privatization”. Unfortunately, unilateral assurances are not enough if there are no control and appeal procedures. You cited the .XXX case, and it is the best example that unilateral guarantees do not work.

I bet a large international working group would have approved .XXX by a majority vote, shielding the process from the local political agenda of one specific country.

I think the parallel made by some with the UN Security Council is unfortunate. Given the Bush administration’s current rhetoric about the UN model, it will not buy a solution modelled on the SC. Indeed, we are just trying to make the Internet work in the best interest of all; we are not trying to prevent WW III. The SC model gives veto power to its permanent members. The Burr proposal is silent about the decision process within that WG. I would not expect anyone within the group to have a right of veto; it should rather be majority voting.

While I agree that, in the end, ICANN should be solely responsible for changes to the RZF, we are not there yet. ICANN does indeed need to be accountable to a larger community. It needs an international status of some sort, with proper representation of all stakeholders with equal rights. It should also demonstrate that it can take decisions independently.

Until that happens, such a working group could actually relieve ICANN of difficult political decisions (like the .XXX case) that are outside the scope of its mandate, letting it focus on its main business.

Kieren McCarthy  –  Jul 26, 2006 8:09 AM

I wasn’t referring to the UN Security Council as it is; I thought I had made it clear I was referring to the proposed new models for the UN Security Council.

The reason why should be obvious: people have been reviewing how the existing model fails and have been devising methods by which a larger number of governments can be involved while, at the same time, the power given to each is reduced.

I know full well that the words “United Nations” cause tremors of hatred in the US, and that the words “UN Security Council” have the same impact in the EU, but this site should be somewhere options can be considered openly and on merit. I’m supposed to be the one, as a journalist, who has Pavlovian reactions.

The fact is, however, that the Net community needs to come up with a model that everyone can broadly agree on because the current model is rapidly outliving its usefulness.

Ideally, I agree, governments would not be allowed access to or control over the root zone file, but I have yet to be persuaded there is a real route to that position, so some pragmatism is needed. In that sense, creating a system whereby governments are forced to arrive at their own consensus before they can have an impact on the Internet means that negative influence is minimised.

The reality is also that ICANN under Paul Twomey is granting governments more plentiful and more powerful routes into ICANN processes. I think that if you gave governments their own mechanism at the top, they would realise that they don’t actually want to get involved in the vital technical aspects of the Internet.

The other actual advantage of having governments at the top is that it lops off the top level of power for ICANN, and so reduces the attraction of ICANN as a non-technical political battle ground.

ICANN as it exists at the moment should not be allowed full control of the DNS and the root zone file, because it is still too easily controlled by too few people. And that argument was made by just about everyone in the NTIA consultation comments.

Which means that there will probably have to be an interim governance mechanism until the Net finally matures and then the whole thing is shifted another 10 years down the line to a third system.

If the three of us, all coming from the same liberal, free-market mentality, can’t find agreement on a model, there is a big risk that majority consensus will be formed elsewhere on a far more prohibitive and damaging system.


Seiiti Arata Jr.  –  Jul 26, 2006 2:07 PM

Regarding the problem identified here of how to define what constitutes a threat to the security and stability of the Internet, and thus how to prevent that argument being used as an excuse for abuse of power (whether by a single government, a group of governments, or even a multistakeholder group):

How can we legitimately claim that we face a threat to Internet (DNS included) security and stability?

Do you think that classifying what really constitutes a threat to the Internet should be the role of the academic/scientific community? Other ideas?

This problem of calls for action in the face of threats to the Internet is not unique to the Burr proposal and will likely be a recurring issue.

Karl Auerbach  –  Jul 30, 2006 4:56 AM

I find the proposal deeply flawed, for the same reasons that the original actions of NTIA and its creation of ICANN are flawed - they are based on a fantasy about DNS.

The only thing that makes a root zone file “authoritative” is that someone sets up a root server and says that his/her copy should be published with the authoritative-answer bit set.

We can have a root zone published by NTIA or ICANN, but if root server operators choose to use some other file, or to make changes to that file, they are completely free to do so.
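Karl's point can be made concrete: the authoritative-answer (AA) bit is just a flag in the DNS message header that the answering server sets for zones it has chosen to load; nothing in the protocol verifies where that zone data came from. A minimal sketch (flag values per RFC 1035; the helper function is illustrative, not part of any standard API):

```python
# DNS header flag masks (RFC 1035, section 4.1.1)
QR = 0x8000  # message is a response
AA = 0x0400  # the "authoritative answer" bit

def is_authoritative(flags: int) -> bool:
    """True if a DNS response's header flags have the AA bit set."""
    return bool(flags & AA)

# A root server sets AA itself for whatever "." zone file it loaded;
# the protocol never checks *which* file that was.
print(is_authoritative(0x8400))  # response with AA set: True
print(is_authoritative(0x8180))  # recursive resolver's cached answer: False
```

In other words, "authoritative" is a claim the server makes about itself, which is exactly why the root zone's authority rests on operator consensus rather than on anything technical.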

The problems we faced in 1997 were twofold.

First, quite simply, nobody was accountable for the reliable, accurate, and unbiased operation of the upper tiers of the DNS hierarchy.

Second, Network Solutions had a deep, and expensive, lock on most of the generic top-level domains.

Have we fixed either of these problems?

As to the first - No.  Nobody today, particularly not ICANN, is accountable for the 24x7x365 operation of the upper tiers of DNS.  Nobody, particularly not ICANN, is accountable that upper tier DNS servers quickly answer DNS query packets with reply packets containing accurate answers.  Nobody, particularly not ICANN, is accountable to ensure that the upper tiers of DNS are operated so that there is no prejudice for or against any source of query packets or for or against any query packet contents.

In other words, if the upper tiers of DNS - the root zone and TLD zones - were to be affected by some sort of problem, ICANN could, and probably would, hold up its hands and say “not my problem, man”.

Sure, we have an incredible group of root server operators - but there is no institutional structure that binds them (pun intended) to continue to operate with the integrity that they have demonstrated to date.

Indeed, some of the root servers, such as those operated by the US Military, are operated by people who are required to serve US policy, even if that means data mining DNS queries for purposes of “national security” and manipulation of DNS responses to derail those who are perceived as “enemies” of the US.

As to the second problem - Verisign is even more entrenched in its position today than its predecessor was in 1997. And even though prices have dropped somewhat, we see that, due to “fiat” registry fees from ICANN - fees several hundred times the actual registry costs - consumers of domain names are not only paying hundreds of millions of dollars in excess fees but are also subsidizing free domain names for speculators.

All the Cade/Burr proposal does is further the fantasy that Internet stability can be obtained by minor tweaks to one corner of DNS: the registration business and ccTLD updates. That is about as likely to provide Internet stability as fixing an electric company’s billing system would be to ensuring that the power to a city stays on.

What is needed is a return to basic principles: to decide precisely what jobs need to be done, and then to create institutions, public or private, with only those powers they need to get those jobs done. I’ve written on this several times, such as in my 2004 paper, Governing the Internet: A Functional Approach.

Kieren McCarthy  –  Jul 30, 2006 12:52 PM

It’s not a bad idea, the Baby Bell concept.

As I read your paper, you’re talking about four different bodies: one for IP allocation, one for root server management, one for DNS management, and one to cover interconnections, protocols and standards.

The pluses, as I see them, are that because each body has a small, technically defined role, you avoid the “government of the Internet” problem that ICANN has been having. And since the illusion of Net control is lost, the power politics are too.

The problem though is that I think we’re beyond the point where such a system can be put in place. Unless someone at the top of the DoC suddenly has an epiphany and pushes the whole concept through, the day-to-day realities will mean something different has to be produced.

The issue lies with governments. Governments now know all about the Internet and what’s more they have very good reasons to want to get involved because the Internet is no longer just a network of computers, it is a vital part of a national infrastructure.

The other problem is that the levels of trust in the US government at the moment are low - in some cases, non-existent. So any proposal that the DoC comes up with that doesn’t give foreign governments some say is unlikely to work.

A big problem with the four-body idea as well is that ICANN’s biggest problem, aside from representation and accountability, is being a US company. It has to become an international body to a) make it more legitimate and b) get away from the US legal system. If you have four bodies, then you have four times the problems.

I honestly think that the most important element of any new Internet governance model will be that it has written in it a recognition that the whole model will need to change further down the line.

As long as it has that then when the Internet does finally settle, the best model won’t be thrown out because of unnecessary ties to the existing approach.


JFC Morfin  –  Jul 31, 2006 1:04 AM

Karl, Kieren,
let us be realistic: the need is to ensure stability where stability is needed - at the resolver level. Today this level sits mostly with private networks and ISPs, and it will sit more and more with individual users. This means the need is to make sure that - if they want it - users can get, check, and trust the naming, handle, aliasing, and keying system information they want to use (probably neither monitored by foreign intelligence nor subject to the actions of foreign digital forces).

Along with the reasons ICANN acknowledges as legitimate (the existing g/s/language/ccTLDs), there is a need for millions of TLDs (if there is “.cat” there must be “.fra”, “.ita”, etc., and all of them in all the languages in use). That is the minimum prerequisite of the Multi (lingual, technology, national, lateral, modal, etc.) Internet. This CANNOT be supported by the solutions the IETF currently documents. But IMHO it can fairly well be supported by the tools we have, if they are intergoverned in the appropriate way.

So, we face an easy-to-understand choice. Either we document this correctly, and we can probably keep things stable, permitting everyone - starting with the USA - to protect their interests. This is compartmentalization, or whatever name you want for the networks of the network of networks. Or we do not document it, or do it poorly, and the different countries, communities, stakeholders, etc. will do it on their own, in confusion, and often against the USA. This is fragmentation.

For the good reasons you give, this will happen as soon as two significant stakeholders or guru developers think their overall interest differs from that of the USA (in several cases one can accept a single alternative proposition, never three conflicting approaches). China can already (partly) be counted as one, the USA as another.

The USA must accept that the interest of the world is NOT the same as the US unilateral interest; however, the interest of the world is to see the US interest correctly addressed multilaterally, because we are all in the same global digital ecosystem. In a global, hence unique, system there can only be all-winners or all-losers situations. This is why ICANN must be American and stop pretending to be the ICUNN. ICANN must be the missing US NIC and start representing US interests, fostering similar representation abroad. If the NTIA root file is good, people will use it. If its root server system is useful, ISPs, people, and countries will keep coordinating with it (yet not necessarily feeding its logger with their intelligence leaks).

If we cannot document the way we permit a user to make sure his DNS, handle, local alias, and keyword-system resolver is secure and consistent - with ICANN, or China, or Europe, or whatever - in his language, others will do it, probably in various different ways, adding to the complexity of our InterNAT and to e-commerce confusion (due to the resulting added difficulties in authentication).
