SAN JOSE, November 7 – Emboldened by their summertime victory against Comcast, advocates of network neutrality said Thursday that the next front in the battle for the principle would be against wireless carriers that make “unreasonable” network management decisions.
In a panel discussion on managing wireless networks at the Wireless Communications Association conference here, Free Press Policy Director Ben Scott and Google Telecom Counsel Richard Whitt said that the FCC’s Net neutrality principles would bar discrimination over wireless networks—while conceding that those networks are, for the time being, more bandwidth-constrained than wired networks.
Wireless networks “are not different,” said Scott. “We made this mistake in the 1996 Telecom Act, and regulated different technologies under different rules, and we are paying the price.”
Wireless networks are only different to the extent that bandwidth constraints might make it harder for the FCC to prove that a particular network-management technology was “unreasonable,” said Scott.
The top lobbyist for AT&T and a vice president of the wireless industry association CTIA appeared to accept the new reality: that their wireless services will be closely scrutinized for signs of Net neutrality violations.
Net neutrality refers to the principle that carriers should be barred from blocking or throttling particular applications, from prioritizing or de-prioritizing certain applications (as with Comcast’s restrictions on peer-to-peer file sharing using BitTorrent), or from promising expedited delivery of internet traffic to favored content providers.
“It is fair to say that wireless is different,” said Christopher Guttman-McCabe, vice president of regulatory affairs for CTIA.
“We absolutely do prioritize things affected by latency, like voice,” said Guttman-McCabe. Such prioritization on the network—even though it might run afoul of the FCC’s Net neutrality rules if on a wired network—was absolutely required to ensure quality telephone calls for consumers, he said.
AT&T’s “biggest concern is [that] the wireless network is built in a granularly shared network, cell-by-cell,” said Jim Cicconi, senior vice president of external and legislative affairs for AT&T. “You can overwhelm a cell by having too many people in the same cell, [as when] everyone is trying to call home [in traffic] at the same time.”
In such an environment, throttling wireless movie downloads to protect voice conversations is clearly justified, said Cicconi.
“Our customers expect to have a certain level of quality in their usage. It is one of the reasons that we have to prioritize traffic in the cell. We are not trying to balance them for the company’s advantage, except insofar as customers will leave us” if they have bad service, he said.
Whitt agreed that such conduct was acceptable “as long as the activities taking place are designed [in] a completely neutral way [across] applications or traffic, and they are not tilting one way or the other for competitive advantage.”
“There is some concession to the point that at least for now, maybe only temporarily, there are some limits in terms of what can be done with those networks,” Whitt said.
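To illustrate the kind of application-agnostic prioritization Guttman-McCabe and Cicconi describe, here is a minimal sketch in Python. The traffic classes and scheduling logic are illustrative assumptions, not any carrier’s actual scheduler: the point is simply that latency-sensitive traffic such as voice is dequeued ahead of bulk transfers without the scheduler ever asking which application or provider sent the packet.

    # A minimal sketch (assumed classes, not a real carrier implementation) of
    # class-based scheduling: voice is served before bulk transfers, and the
    # sender's identity is never consulted.
    import heapq
    import itertools

    # Hypothetical traffic classes: lower number = higher scheduling priority.
    PRIORITY = {"voice": 0, "interactive": 1, "bulk": 2}

    _counter = itertools.count()  # tie-breaker keeps FIFO order within a class
    queue = []

    def enqueue(packet, traffic_class):
        """Queue a packet by traffic class only."""
        heapq.heappush(queue, (PRIORITY[traffic_class], next(_counter), packet))

    def dequeue():
        """Transmit the highest-priority packet waiting in the cell."""
        _, _, packet = heapq.heappop(queue)
        return packet

    enqueue("movie-chunk-1", "bulk")
    enqueue("voip-frame-1", "voice")
    print(dequeue())  # -> "voip-frame-1": voice goes first under congestion

Whether such class-based scheduling stays on the right side of the FCC’s principles is exactly the question the panelists were debating; the sketch only shows what “neutral across applications” could look like in practice.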
Originally posted at DrewClark.com. Drew Clark is executive director of BroadbandCensus.com.
There really isn’t much difference. All telecom services are built on some sort of oversubscription model, some more extreme than others depending on the underlying technologies and resources available. Oversubscription is important to ensure that the business model can be maintained, that services can be deployed at reasonable consumer prices within the confines of the technology and the service provider’s operating budget, and that adequate (if not excellent) service can be provided.
Net Neutrality is a nice concept, but it can’t be taken to the extreme. There is some practicality that limits service deployment and determines pricing. Compromise is needed, particularly in extreme circumstances such as the Comcast/BitTorrent case (or other P2P file sharing over broadband services), where a small percentage of users can overwhelm and destroy service for everyone else, putting other customers and the service provider at risk. In an ideal world it would be nice if things were left wide open and everyone did exactly what they wanted to do, but that’s not a practical reality. Indeed, the risk is significantly clearer in the much more limited wireless world, where we still sometimes get fast busies, calls that do not go through, or calls dropped during cell handoffs because of congestion. The demand for Internet access and data on mobile devices stresses the model further. But every telecom service has gone through such growing pains, and application and traffic demand always seems to be just a step ahead.
We have to be careful. The Comcast case has already had the negative (in my opinion) result of pushing carriers toward consumption-based models, which can lead to the very same limitations (just in another way) and higher prices for the majority of the user community.
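To make the oversubscription arithmetic concrete, here is a small illustrative calculation in Python. All of the figures are assumptions chosen for the example, not any provider’s real numbers:

    # Back-of-the-envelope sketch of an oversubscription model (assumed figures).
    cell_capacity_mbps = 100.0      # assumed shared capacity of one cell/link
    subscribers = 500               # users sold service on that shared capacity
    advertised_rate_mbps = 5.0      # rate each subscriber is sold

    ratio = subscribers * advertised_rate_mbps / cell_capacity_mbps
    print(f"Oversubscription ratio: {ratio:.0f}:1")          # 25:1

    # If only a typical fraction of users are active at once, the model holds...
    active_fraction = 0.04
    active_users = subscribers * active_fraction
    print(f"Per-active-user share: {cell_capacity_mbps / active_users:.1f} Mbps")  # 5.0

    # ...but a handful of heavy P2P users running flat out change the picture.
    heavy_users = 10
    print(f"Share if {heavy_users} users saturate the cell: "
          f"{cell_capacity_mbps / heavy_users:.1f} Mbps each, nothing left over")

The example shows why a small group of continuously transmitting users can exhaust capacity that was provisioned on the assumption of intermittent use.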
The “network neutrality” crusaders obviously aren’t willing to unlimber their wallets and put down millions of dollars for wireless spectrum. Spectrum is expensive, and downloaders and P2Pers will eat it alive if they are not kept in check. Letting them run wild isn’t “neutrality,” it is partiality. And, yes, we will see caps and rate increases if providers cannot do the sensible thing and throttle P2P.