Ahh, so the telecom incumbents have come up with a “new” idea for the Internet—usage-based pricing. That’s right, more usage (for things like VOIP and video especially) means more costs to operate the network, so users should pay by the bit, or some similar metric. It’s all so logical!
But wait a minute. I thought what sparked the consumer Internet revolution was the fact that ISPs didn’t charge by the minute, but offered flat-rate monthly fees. And what catalyzed the boom in cellular usage here in the US was the shift from heavily usage-based pricing to the largely flat rates we see today. This “new idea” is actually the oldest one in the telecom book. Even with respect to the Internet, the debate over flat vs. usage pricing is a decade old. It was at the heart of the “modem tax” debate back when I was at the FCC. At that point, the telcos were complaining about dial-up Internet access, not broadband video, yet the argument was the same. They lost that battle, but really they (and users) won, because usage and innovation skyrocketed.
Experimenting with new pricing and business models is all well and good; there’s nothing written in stone about the current fee structure for broadband access, for example. But be wary any time you hear that “economics” dictates an “efficient” consumer pricing model. Sure, as a fundamental rule, prices in competitive markets are related to marginal cost. However, local access is still not truly a competitive free market, and cost numbers in telecom are inherently fuzzy and manipulable.
Given the option, users tend to vote with their dollars. If the pricing model doesn’t reflect the value they get from the application or service, they use it less. With the kind of rhetoric we’re hearing today, we should be wary of network owners’ pricing decisions choking off the future growth and development of Internet-based services.
Usage-based pricing is a bad idea. Users hate watching the meter, and will be deterred from doing things because they know it’s running. However, broadband providers have been forced into metering by the recent FCC ruling, which slapped Comcast for imposing implicit limits on customers’ usage and preventing bandwidth hogging.
Bandwidth costs money—lots of it, in rural areas. (Rural ISPs pay as much as $325 per Mbps per month for their bandwidth.) But even if it’s less expensive in a particular area, there has to be SOME limit on how much users can consume, or some of them (the big downloaders and P2Pers) will hog it, degrading other users’ service. One way to limit it is to charge a flat rate and hold the bandwidth hogs back; that’s what Comcast did. But the FCC said, no, Comcast couldn’t do that. So, Comcast had to start imposing caps. It really had no choice. Kevin, you’re going to have a say in who gets on the FCC and what it does during this administration. Perhaps you can pick someone who doesn’t make bad policy just because it affords him a chance to slap cable providers (as was the case with Kevin Martin) but who understands that ISPs do have to meet payroll. They will be as consumer-friendly as they can (they need to compete for customers), but if the FCC doesn’t allow them to be consumer-friendly, the results will not be good.
Why wait until late 2008 to respond to this article from early 2006?
Comcast was slapped for selectively discriminating against particular P2P applications. Their actions may have prevented a certain class of bandwidth hogging to some degree, but the impact was limited to the upstream channels of certain specific P2P applications. It was this "discriminatory and arbitrary practice" [FCC 08-183] which formed the crux of the matter in the Comcast case, particularly given that the practices were not disclosed to customers. (On the contrary, Comcast initially denied that they were blocking access to any application or throttling any traffic -- a misleading half-truth at best.) You're clamouring for change on the basis of a revised account of history. Nobody has been "forced into metering", as you say. The FCC ruled against unreasonable application-specific interference, not bandwidth management in general.

It's relevant to comment on Kevin's 2006 posting because now, in 2008, he's on the Obama transition team and will be instrumental in determining (among other things) who will chair the FCC. As for the FCC decision against Comcast: it appears that you are the one engaging in revisionist history. As was reported in Comcast's filings with the FCC just last month, Comcast was not, as you claim, "selectively discriminating against particular P2P applications." It was throttling all P2P, and with good reason; keeping P2P in check is vital to ensuring quality of service for those of us who aren't doing illegal downloading of pirated music and movies.

If bandwidth hogs are allowed to congest the network, the rest of us wind up with slow service and higher prices. Comcast was doing the right thing. So, why was it slapped down? Because Kevin Martin, who wants to be a telco exec when he steps down as FCC chair, was "paying forward" to the telcos by beating on the cable companies.
The use of RST packets is a longstanding and very effective method of administratively terminating connections. Of course, in the case of P2P, it doesn't stop the transfer; it just cuts off some of the hydra's heads. (BitTorrent and many other P2P programs effectively hack the network by opening up many connections, seizing priority over legitimate traffic.) Comcast had the right to terminate ALL P2P traffic, because it is contrary to its terms of service. But it was being nice; it only limited P2P when the network got congested. And because its equipment knew which P2P malware it was dealing with, it could do it gracefully and effectively, just as a virus checker can best deal with a virus if it has a pattern for it. As for the "parade" you are whining about: maybe it's because there are a lot of people out there who know that Comcast was right.
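For readers unfamiliar with the mechanism: a TCP connection can be torn down administratively by injecting a segment with the RST flag set and a sequence number the endpoint will accept. Here is a minimal sketch using scapy, purely for illustration; the addresses, ports, and sequence number are placeholders, and this is not a description of any particular vendor's equipment.

```python
# Illustrative sketch of administrative connection termination via TCP RST
# (placeholder addresses, ports, and sequence number; needs raw-socket privileges).
from scapy.all import IP, TCP, send

# A device that sits in the path of a flow sees its current sequence numbers,
# so it can craft a reset that the receiving endpoint will accept as genuine.
rst = IP(src="198.51.100.10", dst="203.0.113.20") / TCP(
    sport=6881,        # apparent source port (remote P2P peer)
    dport=51413,       # destination port (local client)
    flags="R",         # RST: abort the connection immediately
    seq=123456789,     # must fall within the receiver's expected window
)
send(rst, verbose=False)
```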
The carriers are barking up the wrong tree with monthly aggregate data limits. Whatever the good ways to deal with pathological user behavior may be, this isn’t one of them. First, spreading a typical monthly limit out over the roughly 2.6 million seconds in a month produces average bandwidth utilization generously under 100 kilobits per second. Can’t call that “hogging,” can we? The real problem is peak loads, isn’t it? So what load can a normal PC or Mac generate on a broadband connection? Many of us have gigabit Ethernet ports on our machines, but have you ever seen anything close to that rate on a real link? In practice you’d hardly ever see more than a single megabit sustained upstream for more than a few seconds. And I’ve never seen anything close to Comcast’s claimed “up to 12 Mbps” download either. The industry talk is all FUD, a smokescreen for the desire to control user access to content. If ISPs want to deal with gamers and other hackers trying to louse up the system, why not say so, and put the appropriate words in the service agreements, rather than making 99% of their customers mad at them?
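For concreteness, here is the back-of-the-envelope arithmetic behind that claim, as a sketch assuming a hypothetical 30 GB monthly cap (actual cap figures vary by provider):

```python
# What average sustained rate does a monthly data cap actually permit?
# Hypothetical numbers, for illustration only.

SECONDS_PER_MONTH = 30 * 24 * 60 * 60      # ~2.59 million seconds in a 30-day month
cap_gigabytes = 30                          # hypothetical 30 GB monthly cap
cap_bits = cap_gigabytes * 1e9 * 8          # decimal gigabytes -> bits

avg_rate_kbps = cap_bits / SECONDS_PER_MONTH / 1e3
print(f"Average sustained rate under a {cap_gigabytes} GB cap: {avg_rate_kbps:.0f} kbit/s")
# -> roughly 93 kbit/s, i.e. well under 100 kbit/s, as noted above
```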
It happened in my neighborhood. A group of kids went on a downloading spree and everyone's access was bogged down all day, because P2P keeps running 24 x 7. It took complaints and a letter from the cable company to the kids' parents to clear up the problem. When users run P2P, they also cost the provider more for bandwidth than they are paying for. Our cable company puts it right in its terms of service that running servers and P2P is prohibited. Blocking P2P is the right thing to do, but if providers cannot do this, metering is the only way to discourage it. And it has nothing to do with access to content. Comcast was not looking at what was being transferred via P2P, though if it had I am sure it would have found that all of it was illegal. It was pushing back on P2P, and that is a good thing.