Post ‘net neutrality’ internet needs new measurement tools, Princeton experts tell policymakers
John Sullivan, Office of Engineering Communications
For much of the past decade, fierce political battles over the internet have involved concerns that the fastest access would go only to those with the greatest ability to pay. In testimony last week in Washington, however, a Princeton professor said measuring such performance is no longer so simple. On the internet, speed no longer rules.
Speaking before the Federal Trade Commission’s hearing on internet competition and consumer protection, Nick Feamster said changes to the structure of the internet require new methods to measure online services. Speed alone, he said, is no longer the best benchmark.
“Oversimplification is going to lead to wrong conclusions,” Feamster, a professor of computer science, told the audience during the March 20 conference in Washington, D.C.
Princeton experts including Nick Feamster, a computer science professor, told the Federal Trade Commission last week that we need new standards for measuring internet performance rather than simply relying on speed. Feamster is deputy director of Princeton’s Center for Information Technology Policy. Photos courtesy of the Federal Trade Commission
Feamster, who is also deputy director of the Center for Information Technology Policy, later explained that internet speeds have vastly increased overall and most traffic now consists of video. At the same time, content is delivered to consumers in ways that are fundamentally different than they were five to 10 years ago.
“The real issue here is that the dynamics are changing because of faster speeds and shifting market structure,” Feamster said in an interview. “We have to figure out how to help protect consumers in light of the changing technical dynamics.”
The conference featured speakers and panel discussions that raised issues related to consumer protection on the internet, especially in light of recent changes. Tithi Chattopadhyay, the center’s associate director, joined a panel discussion on technological developments and market structure. A former director of Wisconsin’s state broadband office, Chattopadhyay discussed internet competition and commerce and spoke of states’ roles in the market.
In his address, Feamster spoke about efforts to measure and oversee consumers’ access to information on the internet. In the past, much of this oversight has concerned the speed at which information moved across the internet to consumers. Discussions over download speed and the potential for slowing or throttling downloads still play a central role in political discourse. But Feamster said that the internet has evolved to the point where speed is extremely difficult to measure and often is not a good gauge of consumer experience.
“The methods of delivering traffic are a lot different than they were even five years ago,” Feamster said.
In its early stages, the internet was simpler, although not actually simple. Companies or people created content and uploaded it to their internet service provider. The information then moved through a series of large autonomous networks until it reached the consumer’s internet service provider, which downloaded it to the user.

Now, the map is much more complex. Companies called content delivery networks, which specialize in moving large amounts of data across sections of the internet, have emerged to handle much of the network traffic. Their traffic is often not segregated in a way that would let a predatory service provider single it out for throttling; instead, these networks often place their machines inside the large networks that make up the internet’s backbone. At the same time, corporations that create a lot of content have merged with internet companies. Most of those companies also rely on cloud computing providers to store, manage and deliver data directly to consumers. Someone watching a movie on their laptop or phone might be relying on half a dozen companies to deliver the data.
“To deliver content to a consumer, there are a lot of parties who come to the table,” Feamster said.
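Readers can glimpse this chain of parties themselves. The short Python sketch below, a minimal illustration rather than a diagnostic tool, resolves a website’s hostname with the standard library and prints the DNS aliases behind it; for large media sites, the canonical name frequently belongs to a content delivery network or cloud provider rather than the publisher. The hostname used here is a placeholder.

```python
# A minimal sketch, not a production tool: resolving a hostname often
# reveals DNS aliases that point to a CDN or cloud provider rather than
# the publisher's own servers.
import socket

def show_delivery_chain(hostname: str) -> None:
    """Print the canonical DNS name and addresses behind a hostname."""
    canonical, aliases, addresses = socket.gethostbyname_ex(hostname)
    print(f"requested : {hostname}")
    for alias in aliases:
        print(f"alias     : {alias}")
    print(f"canonical : {canonical}")  # often a CDN or cloud hostname
    for addr in addresses:
        print(f"address   : {addr}")   # servers that may sit inside an ISP's network

if __name__ == "__main__":
    # "www.example.com" is a placeholder; with a large media site, the
    # canonical name frequently ends in a CDN domain rather than the site's own.
    show_delivery_chain("www.example.com")
```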
Each of those players, he said, could be responsible, even unwittingly, for slowing download speed. Without detailed knowledge of the delivery system, it’s extremely difficult to identify the choke point, which could range from a large network player to a home router or an older-model phone.
“You might be tempted to say, ‘I buy access from an ISP [internet service provider], it’s clearly their fault,’” he said. “The problem is that it is really tough to observe this from the edge of the network as a user.”
Even if the slowdowns were easily identifiable, he said, speed is not always the best or most reliable measure of network performance. At high download speeds, on the order of gigabits per second, it’s very difficult to accurately measure download rates. For many applications, high download speed is not the most important technical limitation. The efficiency of an app, such as a phone’s web browser, might be the limiting factor. For conference calls, consumers might be more interested in the steadiness with which data arrives; variation in that timing is a quality experts refer to as jitter. Gamers might be more interested in latency, which is the delay in response between their device and a gaming server.
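As a rough illustration of two of those metrics, the Python sketch below times repeated TCP handshakes to a server and reports average latency along with a simple jitter estimate, here taken as the average change between consecutive measurements (RFC 3550 defines a smoothed variant). The host and port are assumptions for illustration; real measurement tools use dedicated probes rather than handshake timing.

```python
# A minimal sketch, not a measurement standard: it times repeated TCP
# handshakes to estimate latency and jitter. Host and port are
# illustrative assumptions.
import socket
import statistics
import time

def probe_latency(host: str, port: int = 443, samples: int = 10) -> list[float]:
    """Return round-trip TCP handshake times in milliseconds."""
    times = []
    for _ in range(samples):
        start = time.perf_counter()
        with socket.create_connection((host, port), timeout=2):
            pass  # handshake complete; close the connection immediately
        times.append((time.perf_counter() - start) * 1000)
    return times

if __name__ == "__main__":
    rtts = probe_latency("example.com")
    print(f"latency (mean): {statistics.mean(rtts):.1f} ms")
    # A simple jitter estimate: average absolute difference between
    # consecutive delay samples (RFC 3550 specifies a smoothed version).
    deltas = [abs(a - b) for a, b in zip(rtts, rtts[1:])]
    print(f"jitter (mean delta): {statistics.mean(deltas):.1f} ms")
```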
To better measure consumer satisfaction, Feamster said, regulators need to look at many factors that affect performance and not just rely on speed. Better measurements will allow regulators “to better answer questions about the products that are delivered to consumers.”