Make Better Funding Decisions with Accurate Broadband Network Data: A Guide for Federal, State and Local Governments [White Paper]

State and federal officials are charged with spending billions of dollars in funding to improve broadband availability, particularly in rural areas. While many yearly budgets had already earmarked money for broadband development projects before COVID-19, the pandemic has highlighted deep digital divides at a time when the public is more reliant than ever on the internet for work, education and other essential services.

Federal, state and local governments need accurate data on broadband availability and network performance to allocate this funding where it will serve the most constituents. This data drives budget and spending decisions, and historically a significant portion of these funds has been misdirected because those decisions relied on bad data.

In this new Ookla® white paper, we share a case study where misleading data from Measurement Lab (M-Lab) led a U.S. Congressional office to an incomplete picture of broadband performance in Upstate New York. The white paper also includes a guide to the key considerations a savvy policymaker should take when evaluating network data on which to base funding decisions.

Using broadband network data to understand — and close — the digital divide

The shift to working and learning from home has underscored the need for high-speed connectivity across the entire country. Many households are trying to do much more with their internet connections than they ever have before. As more family members in a household use an internet connection for teleconferencing or distance learning, their need for internet speeds will go beyond the FCC minimum guidelines of what constitutes a broadband connection: 25 Mbps download speed and 3 Mbps upload speed.

In the U.S., legislators whose districts include rural areas have long been aware of the “digital divide” created by a lack of broadband access, and of the economic and educational opportunities rural communities miss out on because of it. Urban dwellers usually have access to high-speed connections at or near their homes, though broadband service can still be cost-prohibitive for some. Their rural counterparts face the additional challenge of little or no connectivity in their area. For commercial internet service providers (ISPs), there is a tipping point below which population density is too low to make investments in high-speed internet infrastructure profitable.

Broadband funding efforts are often focused on closing this digital divide by targeting the most under-served communities for investment and development.

The dangers of using bad data to prioritize broadband funding

The white paper explores a case study where inaccurate network performance data created an incomplete picture of broadband access in Upstate New York. In August 2020, the office of Congressman Anthony Brindisi, New York, District 22, U.S. House of Representatives, released a report highlighting the lack of broadband service across the district. New York’s District 22 (NY-22) is large, and its population is fairly evenly split between city and country life, with 57.5% living in urban areas and 42.5% living in rural areas. As in so many rural regions of the U.S., broadband has not yet reached all constituents in NY-22.

The report provided valuable insights gleaned from constituents’ direct feedback on their connectivity, and the congressman’s office made excellent recommendations on how the district should approach improving broadband access. However, the rest of the report was based on network performance test results that painted an inaccurate picture of what many constituents were actually experiencing in the district. The presented results greatly underestimated the speeds ISPs were delivering throughout most of the study area while overestimating them in others. The report’s speed figures came exclusively from tests taken with M-Lab.

The speeds measured by Speedtest® for the same areas during the same time period are dramatically higher in most cases, indicating that some constituents can already achieve network speeds that meet FCC minimums and that additional infrastructure investment in those areas may be a lower priority. By relying on numbers that indicate lower speeds than reality, the congressman’s office runs the risk of targeting areas for funding that already have adequate broadband service. Resources are limited, and these funds should be allocated to areas that lack the connectivity needed to meet the FCC’s minimum of 25 Mbps download speed and 3 Mbps upload speed.

The table below shows comparisons of the median download and upload throughputs for the twenty ZIP codes specified in the report as having the “worst” speeds within NY-22. Looking at Ookla and M-Lab data side by side, you can see that M-Lab vastly under-reported the network throughput in every single “worst” ZIP code in the congressional report.
[Chart: median download and upload speeds for the 20 slowest ZIP codes in NY-22, Ookla vs. M-Lab]

The ZIP code showing the least amount of difference between Ookla and M-Lab data was 13803 (Marathon), where M-Lab’s recorded median was 5.5 Mbps and the median from Ookla data was 14.5 Mbps. This means the typical speed in Marathon measured by Ookla’s Speedtest was over two and a half times as fast as the median captured by M-Lab. On the other end of the scale, in Whitney Point, M-Lab’s recorded median was 0.9 Mbps while Ookla measured a median of 71.6 Mbps, almost eighty times faster.
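
As a quick sanity check, the ratios cited above can be reproduced from the medians themselves:

```python
# Ratios between the Ookla and M-Lab medians cited above (values in Mbps).
marathon_ratio = 14.5 / 5.5        # ZIP 13803 (Marathon)
whitney_point_ratio = 71.6 / 0.9   # Whitney Point

print(round(marathon_ratio, 1))       # -> 2.6 (over two and a half times)
print(round(whitney_point_ratio, 1))  # -> 79.6 (almost eighty times)
```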

Contrary to M-Lab’s data, Ookla data determined that 12 of the listed ZIP codes met the FCC minimum threshold of 25 Mbps download and 3 Mbps upload, with two additional ZIP codes falling just below the thresholds.
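
The eligibility check behind this comparison is straightforward. Below is a minimal sketch assuming the FCC’s 25/3 Mbps benchmark; the upload figures in the examples are hypothetical placeholders, not values from the report:

```python
# FCC minimum benchmark for fixed broadband.
FCC_DOWN_MBPS = 25.0
FCC_UP_MBPS = 3.0

def meets_fcc_minimum(down_mbps: float, up_mbps: float) -> bool:
    """Return True if both download and upload medians meet the FCC minimum."""
    return down_mbps >= FCC_DOWN_MBPS and up_mbps >= FCC_UP_MBPS

# Illustrative checks (upload values are assumed for the example):
print(meets_fcc_minimum(71.6, 10.0))  # True  -- served by FCC's definition
print(meets_fcc_minimum(14.5, 2.0))   # False -- below both thresholds
```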

A policymaker looking at M-Lab’s data alone might incorrectly assume that every single listed ZIP code in the district is wildly underserved. In this case, funding may be allocated to areas that already have adequate broadband service, leaving underserved constituents without connectivity.

When bad data leads to underserved communities

In a few outlying ZIP codes, the speeds measured by Ookla were actually much slower than those measured by M-Lab. Below, the “best” ZIP codes in NY-22 as reported by M-Lab are compared to Speedtest results.

While the majority of M-Lab’s data vastly under-reported network speeds, one example stood out where its data looked questionable in the opposite direction: the very rural town of New Berlin (13411). M-Lab results showed a median download speed of 103.5 Mbps, and the median upload speed of 102.6 Mbps looked too good to be true. If these measurements were accurate, they would represent outstanding service for such an isolated community. The M-Lab data names New Berlin’s ZIP code the fastest in the entire district, which may have come as a shock to the residents there.
[Chart: the fastest ZIP codes in NY-22 as reported by M-Lab, compared to Speedtest results]

Ookla’s results for the New Berlin ZIP code show a strikingly different picture: a median download speed of 18.5 Mbps and a median upload speed of 3.3 Mbps. While the upload number meets the FCC minimum, the download certainly does not. If ZIP codes are used to determine eligibility for broadband funding, the M-Lab results would wrongly indicate that the area around New Berlin does not need broadband infrastructure assistance.

While reporting data aggregated by ZIP code is common among network testing providers like M-Lab, Ookla does not recommend using ZIP codes as an arbitrary boundary for measuring broadband performance.

ZIP codes were created for a single purpose: to deliver mail efficiently along linear routes. While an urban ZIP code may contain several neighborhoods in the same city, rural ones can encompass several small communities many miles apart from one another. A ZIP code’s name does not reflect every community it serves; it is usually taken from the community that hosts the postal facility.
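
The aggregation pitfall can be illustrated with a toy example: when one rural ZIP code spans a fiber-served village and outlying farms, the ZIP-level median describes neither community. All numbers below are hypothetical:

```python
from statistics import median

# Hypothetical speed tests (Mbps) from one rural ZIP code spanning two
# very different communities: a village near the fiber-served postal
# facility, and outlying farms on fixed wireless or DSL.
village_tests = [92.0, 88.5, 95.1]
outlying_tests = [3.2, 4.1, 2.8]

all_tests = village_tests + outlying_tests
print(median(all_tests))  # a single figure that represents neither community
```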

The disparities between network data providers

Federal, state and local policymakers need the most accurate, comprehensive network data available when deciding where to spend broadband funding. However, not all network testing providers are created equal.

To accurately measure the download speed of an internet connection, a testing application such as Ookla’s Speedtest or M-Lab’s Network Diagnostic Tool, running on the end user’s machine, connects to dedicated testing servers and requests as much data as possible. The testing application then measures how much data it receives from the servers during a fixed period of time (usually 10 or 15 seconds).
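
The throughput calculation itself is simple arithmetic: bits received divided by elapsed time. Here is a minimal sketch of that calculation (not Ookla’s or M-Lab’s actual implementation):

```python
def throughput_mbps(bytes_received: int, elapsed_seconds: float) -> float:
    """Convert a byte count and a duration into megabits per second."""
    bits = bytes_received * 8
    return bits / elapsed_seconds / 1_000_000

# Example: 150 MB received during a 10-second test window.
print(throughput_mbps(150_000_000, 10.0))  # -> 120.0 (Mbps)
```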

Each test requires a large enough data transfer to fully saturate the network connection and measure its full throughput capability. As ISPs roll out high-speed connectivity such as optical fiber to the home, this requirement only grows: these connections can handle speeds between 1 and 2 Gbps, roughly 40 to 80 times the minimum broadband speed of 25 Mbps.
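
To see why faster connections make saturation harder, consider how much data a test must transfer to keep a link busy for a typical measurement window. This is a rough back-of-the-envelope sketch:

```python
def megabytes_to_saturate(link_mbps: float, seconds: float) -> float:
    """Data (in megabytes) needed to keep a link fully busy for `seconds`."""
    return link_mbps * seconds / 8  # divide by 8: megabits -> megabytes

# A 10-second test at the FCC minimum vs. a gigabit fiber connection:
print(megabytes_to_saturate(25, 10))    # -> 31.25 MB
print(megabytes_to_saturate(1000, 10))  # -> 1250.0 MB
```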

Some network testing providers, however, do not have adequate testing infrastructure to account for normal demand on the network, and thus are incapable of accurately measuring peak network speed.

Since M-Lab is a Google partner, search engine results drive traffic its way for performance testing. This is not because it is the best test, but because of the relationship between the two organizations. In fact, M-Lab’s testing infrastructure is extremely limited in a way that produces inferior testing outcomes. Currently, M-Lab lists fewer than 60 servers across the entire United States on its infrastructure map (with no servers shown in Alaska, Hawaii or the U.S. territories).

The Speedtest Server Network™ was purpose-built to manage a global scale of volume, with testing servers strategically located in every country and most major population centers. We have over 12,000 servers in the network, with more than 1,600 in the United States and 68 servers in New York State alone.
[Map: Speedtest vs. M-Lab server locations in the U.S. and New York State]

When a user takes a test through M-Lab, the test measures the speed between the user’s device and a single — and often distant — server. When data travels between the user’s device and a distant server it may have to traverse many network “hops” (when a packet of data is passed from one network segment to the next) to get there. The additional lag time this introduces to the test results can negatively impact the user’s perception of the local network’s performance. If the server being used for that specific test is also trying to run many other tests at the same time, it may not have sufficient capacity to provide an accurate result. If there are multiple users simultaneously testing their high-speed connection, the tests might consume all the available throughput from a single test server, thus denying other users the capacity required to measure their own connection. Simply put, M-Lab’s infrastructure is insufficient for internet performance testing in the modern era.
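
The impact of a distant server can be quantified with the classic bandwidth-delay relationship: a single TCP stream cannot exceed its window size divided by the round-trip time. The window size and RTT values below are illustrative assumptions, not measurements of any provider’s servers:

```python
def max_tcp_throughput_mbps(window_bytes: int, rtt_seconds: float) -> float:
    """Upper bound on single-stream TCP throughput: window size / round-trip time."""
    return window_bytes * 8 / rtt_seconds / 1_000_000

# A 64 KiB TCP window over a 10 ms RTT to a nearby server:
print(max_tcp_throughput_mbps(65536, 0.010))  # ~52.4 Mbps
# The same window over an 80 ms RTT to a distant server:
print(max_tcp_throughput_mbps(65536, 0.080))  # ~6.6 Mbps
```

Under these assumptions, the same connection can appear eight times slower simply because the test server is farther away.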

Learn how bad data can negatively impact government funding

There are billions of dollars of federal, state and local government funding at stake — not to mention the educational opportunities and livelihoods of millions of constituents. It is critical that policymakers vet their data sources to fully understand the broadband landscape in their jurisdictions — and prioritize spending to best serve their most vulnerable constituents.

Download the full white paper to learn the five considerations every policymaker should take into account when evaluating data sources for their broadband funding decisions.
