|
After voting on the Comcast order today, Kevin Martin and his Democratic Party colleagues issued press releases telling us how they saved the Internet from Comcast’s discriminatory practices, but they’ve failed to release the actual order they adopted and subsequently re-wrote. Commissioner McDowell wasn’t allowed to see the revised order until 7:00 PM the night before the meeting. Rumor has it that high-level spin doctors are still trying to remove all the rough edges, inconsistencies, and factual errors. So once again, critics of the Commission’s apparent over-reach are left sparring with a shadow puppet. And I’m glad to see that Susan Crawford agrees that the Commission has exceeded its mandate.
The press releases are inconsistent and incoherent, and there’s no reason to expect the order to be any better. They display a significant lack of understanding of the technical and factual issues in this case. The Commission’s press release refers to a hitherto unknown right of network applications to be treated “equally”, without sourcing it or explaining it:
The Commission concluded that Comcast’s network management practices discriminate among applications rather than treating all equally and are inconsistent with the concept of an open and accessible Internet…
While Comcast claimed that it was motivated by a desire to combat network congestion, the Commission concluded that the company’s practices are ill-tailored to serve that goal for many reasons: they affect customers who are using little bandwidth simply because they are using a disfavored application; they are not employed only during times of the day when congestion is prevalent; the company’s equipment does not target only those neighborhoods suffering from congestion; and a customer may use an extraordinary amount of bandwidth during periods of network congestion and will be totally unaffected so long as he does not utilize an application disfavored by Comcast.
And at the same time Martin endorses the practice of raising the priority of delay-sensitive applications like VoIP:
We do not tell providers how to manage their networks. They might choose, for instance, to prioritize voice-over-IP calls. In analyzing whether Comcast violated federal policy when it blocked access to certain applications, we conduct a fact-specific inquiry into whether the management practice they used was reasonable. Based on many reasons, including the arbitrary nature of the blocking, the lack of relation to times of congestion or size of files, and the manner in which they hid their conduct from their subscribers, we conclude it was not.
We do not limit providers’ efforts to stop congestion. We do say providers should disclose what they are doing to consumers.
So it’s OK to prioritize VoIP. That means it’s OK to de-prioritize everything that’s not VoIP, and the only way to determine which is which is by inspecting packets to see what protocol they carry. But the Commission says you can’t do that:
For example, Professor David Reed of the Massachusetts Institute of Technology, widely respected as one of the architects of the Internet, said that “[n]either Deep Packet Inspection nor RST Injection”—Comcast uses both to manage its network—“are acceptable behavior.”
Leaving aside the fact that Reed hasn’t been active in network engineering for over twenty years, one man’s personal opinion is not the law. Reed’s religious notions about right and wrong are inconsistent with Martin’s assertions about what’s permissible and what’s not. Deep Packet Inspection is how you see whether a given packet is carrying VoIP traffic or not. Internet packets aren’t hidden in envelopes; they’re a one-dimensional series of bytes that are all out in the open, like postcards, so there’s nothing nefarious about this. If we can’t tell what application or protocol a packet belongs to, we can’t prioritize it.
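For readers unfamiliar with how “inspecting packets” identifies an application, here is a minimal, hypothetical sketch of the kind of heuristic a classifier might apply to a UDP payload. The header fields come from the RTP specs (RFC 3550/3551), but the function names and thresholds are illustrative assumptions, not Comcast’s actual DPI logic:

```python
def looks_like_rtp(udp_payload: bytes) -> bool:
    """Heuristic: does this UDP payload look like an RTP voice packet?

    RTP (RFC 3550) headers begin with a 2-bit version field that is
    always 2, and the second byte carries a 7-bit payload-type field;
    the static audio payload types in RFC 3551 (PCMU=0, GSM=3,
    G.729=18, ...) fall in the range 0-23.
    """
    if len(udp_payload) < 12:            # RTP header is at least 12 bytes
        return False
    version = udp_payload[0] >> 6        # top two bits of the first byte
    payload_type = udp_payload[1] & 0x7F
    return version == 2 and payload_type < 24

def classify(udp_payload: bytes) -> str:
    """A prioritizer might then mark matching packets for expedited
    forwarding and leave everything else as best-effort."""
    return "voice (expedite)" if looks_like_rtp(udp_payload) else "best effort"
```

The point of the sketch is simply that prioritizing VoIP requires reading bytes past the IP and UDP headers, i.e., exactly the inspection Reed calls unacceptable.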
Martin says the FCC does not tell providers how they may manage their networks, but that’s the whole point of the exercise.
And the Commission remains confused about the impact of Comcast’s management on the ability of its customers to access content on the Internet. Comcast doesn’t interfere with customers’ ability to download content of any kind using BitTorrent or any other protocol, but the Commission’s press release says otherwise:
The Commission concluded that the end result of Comcast’s conduct was the blocking of Internet traffic, which had the effect of substantially impeding consumers’ ability to access the content and to use the applications of their choice. The Commission noted that the record contained substantial evidence that customers, among other things, were unable to share music, watch video, or download software due to Comcast’s misconduct.
In fact, no such thing is happening on the Comcast network, and it never has. Comcast does not interfere with BitTorrent downloads on its network; in fact, it prevents BitTorrent seeding from interfering with BitTorrent downloading and actually improves the performance of the application the Commission says it’s “disfavoring.”
Nobody has complained that you can’t download, and downloading is how you access the content of your choice. The actual complaint is that the amount of bandwidth Comcast allocates to BitTorrent acting as a file server is not enough. This can have an effect, typically a small one, on the ability of other people, especially those outside the Comcast network, to download the Comcast customer’s content, but there’s no “freedom to run a file server from your home.”
Throughout this whole debacle, I’ve repeatedly tested whether I could download movies and software using BitTorrent on the Comcast network, and at no time has it been blocked. There have been periods during which BitTorrent seeding was slowed, times when I wasn’t trying to download a file, just serving files to others, and that’s it.
So what we have is this: under no rational interpretation can it be considered a good day for the Internet.
I’ll have further comment when I can see the actual order.
The FCC’s actions today were illegal: the Commission not only exceeded its authority but directly violated Federal law. The law, at 47 USC 230(b), says that the Internet must be “unfettered by Federal or State regulation.” Ironically, the 3-2 majority quoted from the same section during their meeting, but INTENTIONALLY SKIPPED the part that forbids them to regulate. This is deceptive and disingenuous. They also declined to fine Comcast and ordered it to do what it was already doing, in the hope that this would forestall a court challenge to their illegal, inconsistent, and arbitrary ruling and leave it standing as precedent. This is indeed a sad day for the Internet.
No. As the Commission writes, customers were unable to “share,” i.e. upload, and this is a disfavoring limitation.
Why not? It surely should be free for anyone to provide content to the Internet; that is one of the fundamental principles the Internet was built on: end-to-end connectivity.
How about natural gas? Electricity? Gasoline? Whaddaya mean, they cost money to produce?
Joakim,
The reasons are simple. One is technical: the upstream is typically more constrained in bandwidth than the downstream (towards the subscriber) on broadband platforms, and the very asymmetric nature of Internet traffic, where (typically, at least for now) more data is pulled down than sent upstream, forces restrictions on what types of systems are allowed at the home (e.g., no web servers).
Second, the business model for broadband is based on heavy oversubscription. It is the only way to keep prices low, around the $40/month price point. If you allow web servers at homes, or P2P like BitTorrent, the business model blows up. The ISPs may be able to upgrade a bit and absorb some of the cost, but ultimately it would get passed through to the consumer. The more broadband traffic patterns shift (because of BitTorrent, etc.) towards symmetric patterns or heavy uploads as the norm, the more broadband will have to resemble leased lines to the home. And that means you will pay more until the last (first?) mile is worked out, which it may never be; it’s always been the tough part, and that hasn’t changed in 100 years. Hopefully one day we’ll all have fiber to the home with GigE running over it, or maybe WiMax, but I’ve been wishing for that for years and I still can’t get Verizon FiOS to my home despite living just outside of DC.
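The oversubscription arithmetic described above can be sketched with some hypothetical numbers (the channel capacity, subscriber count, and advertised rate here are all assumptions for illustration, not any real operator’s figures):

```python
# Hypothetical figures, purely to illustrate oversubscription arithmetic;
# real DOCSIS plant numbers vary widely by operator and era.
UPSTREAM_CAPACITY_MBPS = 30.0   # one shared upstream channel (assumed)
SUBSCRIBERS = 300               # homes sharing that channel (assumed)
ADVERTISED_UP_MBPS = 2.0        # advertised per-subscriber upstream rate

def upstream_demand(active_fraction: float) -> float:
    """Aggregate upstream demand if this fraction of subscribers
    transmits at the full advertised rate simultaneously."""
    return SUBSCRIBERS * active_fraction * ADVERTISED_UP_MBPS

# Capacity sold vs. capacity owned: 300 * 2 / 30 gives a 20:1 contention ratio.
contention = SUBSCRIBERS * ADVERTISED_UP_MBPS / UPSTREAM_CAPACITY_MBPS

# Bursty web traffic, a few percent active at once, fits comfortably...
light = upstream_demand(0.04)   # 24 Mbps of the 30 available

# ...but steady seeding turns many subscribers into constant uploaders,
# and demand blows past the shared channel's capacity:
heavy = upstream_demand(0.50)   # 300 Mbps demanded of the 30 available
```

With these assumed numbers, the same plant that comfortably serves bursty users is oversubscribed tenfold once half the subscribers upload continuously, which is the business-model pressure the comment describes.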
I’m not trying to be funny, but if you need to run commercial services from home, such as a web site or e-commerce site, you should consider a leased line and the associated cost. (A T1, if you can even get one delivered to your home, averages about $450/month for the local loop, plus whatever ISP charges you pay on top. The further you get from the CO, the more the local loop costs.) Broadband is meant for the masses, where the statistical averages play out and allow a reasonably good service to be deployed for a reasonable price. I have Comcast and have had Verizon and Starpower DSL in the past. All are reasonably priced and perform well. I hope neither price nor performance changes for the worse because of P2P and the FCC’s ruling. This could easily force ISPs to usage-based models. I’ve seen such models in the past, and in my opinion they would be unfavorable to many subscribers. My monthly cost would probably go up (and I don’t share music).
Dan, the asymmetry of last-mile technologies only impacts the individual subscriber. If I choose to run a web server or a P2P app that uses all the upstream capacity of my connection, that only affects me. So asymmetric technologies only help operators limit upstream traffic. When/if we get symmetric last-mile technologies (don’t count on WiMAX), the problem will be much bigger, as the aggregated traffic in the transport network will increase significantly. I agree that overprovisioning is used and is probably necessary for many business models (and I do not have a solution for this), but I really believe that the network should not discriminate among different types of traffic and should not disallow me from running applications of my choice. I run non-commercial web, FTP, and mail services, as well as IPv6 tunneling, etc., from home. Low-bandwidth use should not be a problem for anyone.
No one is necessarily stopping you from running a web server, but if it suddenly generates tons of traffic, it could have an impact, so the ISPs at least want to discourage it. As for the asymmetry, it affects more than just the individual subscriber in a DOCSIS broadband architecture: all subscribers on the same head end are affected. You may be thinking of DSL, which, while asymmetric, is a direct shot from the home to the DSLAM and aggregator. (And let’s not say that providers have to choose DSL over cable; obviously they are using their legacy infrastructure.) As for the aggregated traffic in the transport network, it will increase a lot, but long-haul bandwidth has come down so far in price that it’s not the issue anymore. You can get dark fiber for what you used to pay for a T3, and light up gobs of bandwidth over it. Prices have gone up recently, but the price per Mbps is still more or less commoditized.
And just to be sure: I don’t use BitTorrent/Napster/KaZaa P2P, nor do I run a web site hosted at home, but I do have multiple VoIP lines and use Skype periodically (sometimes simultaneously), occasionally watch a streaming Netflix movie, along with fairly heavy Internet usage, and all of those apps (some asymmetric, some uni-directional, some symmetric) work quite well over Comcast.
…and end-users allowed to run servers both compete with operator services and increase complexity.
Yeah, I was thinking of DSL; I haven’t been much into cable networks (not as much cable is deployed here in Sweden as in the US). I can also add that wireless technologies have the same problem with a shared last mile (in each cell), and for wireless the cost of the last mile will probably be higher.
The questionable decision-making and procedural practices that Richard spotlighted are, unfortunately, characteristic of this particular iteration of the FCC. Since 2004, the commission’s activities and operations have become highly politicized. The politics and procedural game-playing under the Martin administration are the antithesis of open, democratic decision-making. This is a significant change from the FCC I have covered as a reporter for nearly 20 years.
The current chair, Kevin Martin, has ties to traditional telcos, but that’s not the issue. So have others in his position. What is unprecedented, however, is the lengths to which he will go to push that industry’s interests to the detriment of others.
Martin makes a big show of concern for the consumer. But only VoIP and cable companies, both significant rivals to telcos, are singled out for criticism on the grounds that they have not been sufficiently attentive to consumers.
Martin has focused his most alarming political efforts on VoIP. So great is Martin’s apparent antipathy toward the new technology that, since he took office early in 2005, FCC staff have been unable to attend public events, such as the VON and Internet Telephony conferences, that have to do with VoIP.
Worse, he has used 911 as a cudgel to attack VoIP rather than trying to resolve the incompatibilities between the old and new technologies. Who can forget his dog-and-pony show of VoIP 911 “victims” at an FCC meeting in 2005? But what has he done to advance much-needed updates, including migration to IP, for the 911 system? Nothing, despite the efforts of several groups in the emergency services community, which have developed detailed plans for that upgrade.
When a consumer issue threatened telcos, Martin played a different tune. The FCC recently considered a pro-consumer action against telcos—namely the decision prohibiting telcos from leaning heavily on customers who want to switch service to competitors. Only Martin opposed it.
The current situation with regard to Comcast and network traffic is typical of how Martin’s FCC has bungled tough decisions. The issue is a difficult one, and it needs careful negotiation. Consumers are rightly concerned about the possibility that their access to competitive services on the Internet might be restricted. On the other hand, network operators have to make sure that the network continues to function smoothly. The FCC’s response has been to slam Comcast without getting even the basic facts correct.
The network-traffic issue needs an even-handed, nonpolitical approach that can meet the needs of both consumers and network operators. Unfortunately, this FCC doesn’t have a prayer of doing that. It long ago lost all ability to deal with the tough questions effectively.
At this point one can only hope that commission members will delay major actions until the Bush administration is over and this FCC is gone. Some of its decisions are likely to be overturned anyway. Fortunately, we have the model of past FCCs, under both Democratic and Republican administrations, that did not make heavy-handed politics the centerpiece of every decision. Let’s hope that FCC reappears after the next election.
If you sell me the ability to download at a set speed, give me the bandwidth to download at that speed. Don’t tell me I can download at that speed for eight days each month and then get shut down, or you will shut me down; and, oh, by the way, there’s a higher-speed package, for even more money, so I can run up against the same limit even sooner. I know: I have received an over-usage phone call, and when I asked if there was a package that would increase my limit, I was told there was no “upgrade” package unless maybe I could get a T1 line in my area. I live in the country and cannot even get DSL. I don’t want to rant, but since Comcast bought Suscom, my old provider, my bill has gone up and my service has gone down; I never had a problem with Suscom’s limits.