Succinctly: Cox has decided that it knows best which of your network activities is "time sensitive" and which ones are not. And it intends to force its ideas on your use of the bandwidth for which you've paid.
Unsurprisingly, reaction has been negative. Om Malik of GigaOm sez:
Who is Cox to decide that a certain FTP transfer is not time sensitive, or that some software update is not time sensitive? More importantly, why should consumers trust cable companies, whose record of giving customers the short end of the stick is pretty well known?... Unfortunately, as long as we have this comfortable duopoly in the broadband market, we the broadband consumers are going to have to suffer from these kinds of practices as we don't have much of a choice. Hopefully a post-Kevin Martin FCC will be more citizen-friendly, and will act promptly against Cox and other traffic shapers.
The Free Press, epicenter of the net neutrality battle in Congress, similarly remarks:
"The lesson we learned from the Comcast case is that we must be skeptical of any practice that comes between users and the Internet.
"As a general rule, we're concerned about any cable or phone company picking winners and losers online. These kinds of practices cut against the fundamental neutrality of the open Internet. We urge the FCC to subject this practice to close scrutiny and call on Cox to provide its customers with more technical details about exactly what it's doing."
This will clearly be the first contentious issue to come before the new FCC, and it is surely no accident that the strategy is announced at the moment when the agency is being reorganized and will find reacting quickly difficult.
Hovering in the background of this story is a series of failures on the part of Comcast, the nation's largest, and hence most visible, cable company, to extend the industry's privileged position in regard to regulation. (Cable companies have historically been much more lightly regulated than their competitors, the telephone companies.) What Comcast failed to secure was the "right" to inspect your bits and to discriminate against bits it didn't like, especially P2P protocols. In doing so it ran up against the long-established ideals of common carriage. A common carrier is not allowed to discriminate in what it carries: it is not allowed to charge some loads of coal more than others, nor to give some customers privileged service by delivering their coal first. Comcast was asserting the right to treat some bits differently based on the protocols that governed them.
The FCC came down on Comcast, and in the ensuing back and forth Comcast, and the cable industry with it, got a huge black eye in public opinion, as is evidenced by the quotes above.
Cox steps into it
So Cox is deliberately taking up the network neutrality fight by declaring a new policy, hoping to do a better job of it for the industry than the #1 guy. What's not so well known, but was cited in the story-breaking AP account, is that Cox has also been doing exactly what Comcast has been castigated for attempting:
Comcast is fighting the FCC's ruling in court, but has abandoned its congestion management system in favor of one that doesn't discriminate between different types of traffic. It has also abandoned secrecy and revealed details on how the new system works.
Tests conducted by the Max Planck Institute for Software Systems in Germany indicated last year that Cox was using the same discriminatory network management system that Comcast employed then. Cox never revealed the details of its system but said it used "protocol filtering," a principle also used by Comcast.
Further testing by the Max Planck Institute indicated that Cox cut back sharply on its use of the old congestion system in August, and that it was shut down by January.
Cox, apparently, is not willing to follow Comcast in a shift away from discrimination based on protocol to one based on the particular customers who actually use the most bandwidth. Instead it is trying to recast the issue in terms of "time sensitive" and "time insensitive" categories of protocols. (Cox, by all accounts, uses the same equipment, from a company called Sandvine, that Comcast has: a technology that engages in deep packet inspection to try to discern the protocols used to transfer bits, among other traits.) But whereas Comcast has been remarkably open about what it is trying to do since being spanked by the FCC and public opinion, Cox appears firmly set on a path of continued obfuscation and misdirection. In its FAQ on the topic it says:
Our past practices were based on traffic prioritization and protocol filtering. This new technique is based on the time-sensitive nature of the Internet traffic itself, and we believe it will lead to a smoother Internet experience with fewer delays.
This is nonsense. Honestly. Past practices are present practices. The FAQ directs your attention to the categories of protocols that Cox has created (time sensitive and insensitive) and away from the raw fact that Cox has simply grouped some protocols into one pile and some into another and is now discriminating against more than one protocol at a time. Protocols in the disfavored pile include P2P (the one that got Comcast in trouble), Usenet, and FTP. All of this requires deep packet inspection, with Cox examining your data to determine what's in it, then deciding what is and isn't important and slowing down the traffic it thinks isn't important enough to get speedy service.
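To see how little has actually changed, it helps to sketch the shape of the policy in code. Sandvine's actual classifier is proprietary and its details are not public, so everything here is a hypothetical illustration: the protocol names, bucket assignments, and scheduling rule are assumptions drawn only from Cox's own description of "time sensitive" vs. "time insensitive" traffic.

```python
# Illustrative sketch only: after deep packet inspection identifies a
# packet's protocol, the policy reduces to sorting protocols into two
# piles and delaying one pile during congestion. All names hypothetical.

TIME_SENSITIVE = {"voip", "gaming", "web", "streaming"}
TIME_INSENSITIVE = {"p2p", "usenet", "ftp", "software_update"}

def priority(protocol: str) -> str:
    """Bucket a protocol the way Cox's FAQ describes."""
    if protocol in TIME_INSENSITIVE:
        return "deprioritized"
    return "normal"  # time-sensitive and unclassified traffic pass untouched

def schedule(packets: list, congested: bool) -> list:
    """During congestion, deliver 'normal' traffic first; delay the rest."""
    if not congested:
        return list(packets)
    normal = [p for p in packets if priority(p["protocol"]) == "normal"]
    delayed = [p for p in packets if priority(p["protocol"]) != "normal"]
    return normal + delayed  # disfavored packets wait at the back of the queue
```

Whether the piles are labeled "protocol filtering" (the old practice) or "time sensitivity" (the new one), the mechanism is the same: the classifier decides, not the customer.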
The confusing discussion that will ensue...
I expect that all this will be recast by commentators, as soon as they get it together today, as a fork in the road between "caps" a la Comcast and "management" a la Cox. No. The real issue is congestion: the slowdown that comes when too many users are clogging the network, usually in the local "last mile." Comcast is essentially telling some of its biggest users that they are using an unfair amount and "capping" their use at 250 gigs a month in the hope that this will make congestion bearable. Cox, in contrast, is saying that some protocols deserve better service and is slowing others in an attempt to make its congestion less visible by passing the slowdown off to less important (in its view) uses.
What the first wave of commentators will be ignoring is that Cox also caps usage. It is much less transparent about how much use incurs its wrath: the cap apparently differs by location and even then is not applied consistently. (In Las Vegas, for instance, the cap is 60 gigs on its 12 Mbps tier, much more restrictive than Comcast's more highly publicized and decried version.) So while Cox may be the newest villain in the protocol arena of the network neutrality fight, it cannot be cast as a hero by those who are disturbed about the implications of bandwidth caps. You can rest assured that if Cox succeeds in its current strategy, Comcast will follow it into using both caps and "management" to restrict its users. The industry is not offering an either/or; this is an "in addition to" effort.
But the confusing discussion around caps and management will serve the incumbents as a whole by presenting the policy community with a false choice between two objectionable "solutions" to the problem of congestion. There is a third choice, a better choice, that doesn't involve picking either of the industry's favorite children.
A Third Way
Commentators (and policy-makers) would be better served by focusing on the actual problem: The real issue is congestion. The real solution is to directly address the undersupply of bandwidth that is the root cause of congestion. A congested network is, almost by definition, one which is under-engineered and so cannot handle the traffic demands that its users put on it. The real solution to the real problem is to fix that...to put in place a network which can handle the traffic and one which can easily, quickly, and cheaply be upgraded to handle downstream increases in demand.
It is no accident that I find it easy to reject the choice offered by Cox and Comcast; that is largely a product of where I happen to live. Lafayette's citizens are in a good position to see that there is a solution which doesn't involve choosing between false alternatives presented by incumbents that seek concessions from the public in order to advance their interests instead of actually taking on the costly and admittedly risky business of fixing what is broken. Lafayette's new network, built explicitly to provide the capacity the community believed was necessary for its future, is about to take on its first customers.
There are ways for the country as a whole to address the bandwidth/congestion issue directly without simply building a new network as Lafayette and other impatient communities have done. Regulators could do something as simple as setting standards on the advertising that currently allows companies to grossly overstate the amount of bandwidth they can reliably provide; buying a 12 meg tier seldom means that you can reliably get 12 megs. Simply require a truth-in-advertising standard of some sort: say that you have to actually be able to provide the advertised speed 98% of the time and that you must monitor and report your performance on a node-by-node basis to the FCC. If you fail to provide such speeds, then you must rebate to your customers a percentage of their bill for the months in which the undersupply occurred. Performance standards like this used to be de rigueur for telephones back in the days before deregulation. They motivated the phone company to build the world's best phone system.
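The proposed standard is simple enough to state as arithmetic. A minimal sketch, assuming per-node speed samples over a month: the 98% threshold comes from the proposal above, while the 25% rebate fraction is purely an illustrative number, not part of any actual rule.

```python
# Hedged sketch of the truth-in-advertising rule proposed above.
# The 98% threshold is the article's proposal; the 25% rebate
# fraction is an assumed example value.

ADVERTISED_MBPS = 12.0       # e.g. the 12 Mbps tier discussed above
COMPLIANCE_THRESHOLD = 0.98  # must deliver advertised speed 98% of the time
REBATE_FRACTION = 0.25       # hypothetical: rebate 25% of the month's bill

def compliant(samples: list, advertised: float = ADVERTISED_MBPS) -> bool:
    """True if this node delivered the advertised speed often enough."""
    ok = sum(1 for s in samples if s >= advertised)
    return ok / len(samples) >= COMPLIANCE_THRESHOLD

def monthly_rebate(samples: list, bill: float) -> float:
    """Rebate owed to a customer on this node for the month."""
    return 0.0 if compliant(samples) else bill * REBATE_FRACTION
```

The point of the sketch is that compliance is mechanically checkable from node-by-node monitoring data, which is exactly why mandatory reporting to the FCC would give the rule teeth.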
The almost inevitable consequence of this and other regulatory methods of demanding better service (why not make symmetrical up and down speeds a service standard?) would clearly and cleanly set a framework for rational behavior on the part of the incumbents. [An aside: for a good, current, take on why competition is never enough in some situations see Harold Feld's latest.]
What you could expect would be rational economic behavior that would drive
- telephone companies to follow Verizon's lead in building out a new FTTH network capable of dealing with today's demands. Verizon is unquestionably the smartest actor in the field. It is now well understood that Verizon has succeeded in what was initially judged a risky venture by the capital markets.
- the cable companies to push fiber much closer to the home and to restructure their use of bandwidth to supply fewer channels and more switched digital and raw bandwidth to customers.
- areas with poor competitors in these categories, or which don't want to rely on national policy to secure their futures, to build their own FTTH networks. As Lafayette has done.