Missouri House Special Interim Committee on Broadband Development Issues Its Final Report


The Missouri House Special Interim Committee on Broadband Development has issued its final report and recommendations for improving access to broadband and broadband applications. The Committee was appointed last year by House Speaker Rob Vescovo, and included Representatives Louis Riggs (Chair), Cyndi Buchheit-Courtway, Bishop Davidson, Travis Fitzwater, Jay Mosely, Wes Rogers, and Travis Smith.

The Report summarizes findings from at least 11 public meetings and testimony from over 40 witnesses; together with appendices and transcripts, it runs more than 500 pages. The Report addresses issues of internet access, connection speed, and affordability, as well as the need to improve adoption of internet-based applications for online education, telehealth, precision agriculture, workforce development, and entrepreneurship.

While acknowledging that the state has made some progress over the past several years (moving up from 41st to 32nd in the FCC state ranking for broadband access), the Committee concluded that “there is still a tremendous amount of work to do in order to move Missouri from below the middle of the pack into the Top 10 states in the country.” To illustrate the point, the Report noted that Missouri ranks 44th out of 50 states in home use of fixed broadband and 15th in the nation for households with no internet access at all.

Several recommendations were made to improve on these statistics, including the creation of legislative committees in the Missouri House and Senate dedicated exclusively to broadband expansion and oversight, along with a “Broadband Development Council” to enhance stakeholder engagement, ensure accountability, and provide meaningful public oversight. As part of this effort, the Committee called for a publicly accessible internet testing and mapping resource that would show actual internet connection speeds in real time.

More funding for the Missouri Broadband Office within the Department of Economic Development was recommended, both to increase amounts available through the state’s broadband infrastructure matching grant program over the next three years and to provide additional staff to improve oversight of internet providers that participate in this program. The Committee recommended increasing the connectivity speeds in the state’s definition of broadband, so that public funding would be available in areas lacking connectivity at speeds of at least 100 Mbps download and 20 Mbps upload, and modifying the definition so that those standards automatically adjust in conjunction with future increases in the federal standard. At the same time, the Committee acknowledged that some public funding support should remain available for lower connection speeds at extremely remote last-mile locations, until technological advances permit these to be phased out.

Finally, the Report recommended legislation to encourage and streamline deployment of broadband, including the use of government-owned structures and broadband assets to expand service to homes and businesses through participation in public-private partnerships. Specific recommendations included an overhaul of right-of-way access, streamlined resolution of utility make-ready and pole attachment cost disputes, and the institution of “Dig Once” policies requiring more efficient and cost-effective installation of broadband infrastructure.

Several of these recommendations appear in legislation proposed in the Missouri General Assembly this session. For example, Senate Bill 981 changes the definition of broadband, and Senate Bill 990 addresses part of the make-ready and pole replacement cost issue.

Are Low Earth Orbit (LEO) Satellites the Answer to Missouri’s High-Speed Internet Access Problem?


(De-mystifying Some of the Technology Behind Your Internet Connection)

By Marc McCarty

When I talk to people around the state about bringing high-speed internet access to underserved communities, invariably someone mentions “the SpaceX Starlink thing that Elon Musk is building.” Musk is not the only person interested in building low-earth orbit satellite (LEO Satellite) networks. Amazon (Project Kuiper), OneWeb, and Telesat (Lightspeed) all have LEO Satellite projects planned and in some cases in limited operation. But like electric cars and passenger-capable spacecraft, SpaceX’s “Starlink” product seems to be ahead of the competition, and it’s difficult to match Elon Musk for the “cool factor.” After all, who else has the moxie to launch a cherry-red electric roadster past the orbit of Mars, just to prove he can do it?

So, is Starlink, or some similar satellite technology that delivers high-speed internet from outer space, the answer to the high-speed internet access problem in Missouri and other states?

It’s certainly worth asking the question, because over the next several years the federal government, along with states and localities, is likely to provide private and public internet service providers (ISPs) more than $60 billion to help fund the deployment of various types of broadband internet infrastructure, all in an effort to finally bring high-speed internet access to every location in America that could reasonably need it. With that much money at stake, getting the best value for the public’s investment will be critical. Past experience has taught valuable lessons on the need to deploy public funds wisely for internet infrastructure investments.

While fiber-optic cable seems to be the one infrastructure technology most capable of supplying high-speed internet service that can expand to meet future consumer and business needs, it is not without drawbacks. The cost of installing fiber to each residence and business can be more than $20,000 per mile even in rural areas with few obstructions, and recently, wait times for delivery of fiber-optic cable and equipment have been well in excess of a year! The promise of connecting to the internet today, or within a few months, at speeds well in excess of the FCC’s current definition of “broadband,” using a small antenna and some electronics that cost about $500, is very appealing, even if the monthly cost for the service is a bit more than fiber-optic cable or other wired internet options.

But what is LEO Satellite Internet? How does it work? What are the prospects that it might actually be the key component in efforts over the next few years to bridge the digital divide?

A Short Primer on Satellite Internet

At the outset, it’s important to distinguish LEO Satellite Internet from geostationary earth orbit satellite internet (“GEO Satellite Internet”). Both technologies are “fixed” wireless internet, because they both access the internet by transmitting a signal through the air (and outer space) to a “fixed” antenna located at or near the customer’s premises.

However, this is where the differences begin. GEO Satellites orbit the earth at 22,236 miles above the equator. At this altitude a satellite completes one orbit every 24 hours, matching the earth’s rotation, and because the orbit lies directly above the equator, to a ground observer the satellite appears to remain “fixed” at a single point in the sky.

In Missouri, most folks with a clear view of the southern sky can subscribe to GEO Satellite Internet from ViaSat or HughesNet. These companies provide internet service using three or four large satellites positioned over the equator with a line-of-sight view of North America. On the ground, a subscriber with a small antenna and a clear view of the southern sky can focus on one of these satellites to transmit data from that location up to the satellite, and from there back down to an earth-based antenna connected to a traditional earth-based internet connection operated by the satellite internet service provider.

LEO Satellite Internet also transmits internet signals to and from a fixed antenna located at the customer’s residence, but LEO Satellites orbit much closer to the earth. These orbits vary, but generally are around 150-600 miles above the earth. At this altitude, to a ground-based observer the satellite appears to cross the sky from horizon to horizon in just a few minutes. Once the satellite passes below the horizon or behind some other obstruction, the signal carrying the internet data is lost. For this reason, LEO Satellite Internet requires a “fleet” of satellites trailing one another in predetermined orbits. In this way at least one satellite is always within line of sight of the antenna at the customer’s home or business. As the connection with one satellite is lost, another comes within range and takes over the communication.
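Where do the “24 hours” and “a few minutes” figures come from? The orbital period follows from Kepler’s third law, T = 2π√(a³/μ). Here is a short Python sketch of the calculation; the 340-mile LEO altitude is my own assumed example within the 150-600 mile range mentioned above, and the physical constants are standard published values.

```python
import math

MU_EARTH_KM3_S2 = 398_600.4418  # earth's gravitational parameter (km^3/s^2)
EARTH_RADIUS_KM = 6_371         # mean radius of the earth (km)
MILES_TO_KM = 1.609344

def orbital_period_hours(altitude_miles: float) -> float:
    """Kepler's third law for a circular orbit: T = 2*pi*sqrt(a^3 / mu)."""
    a_km = EARTH_RADIUS_KM + altitude_miles * MILES_TO_KM  # orbital radius
    period_seconds = 2 * math.pi * math.sqrt(a_km**3 / MU_EARTH_KM3_S2)
    return period_seconds / 3600

# GEO altitude is the figure from this article; the LEO altitude is assumed.
print(f"GEO (22,236 mi): {orbital_period_hours(22_236):.1f} hours per orbit")
print(f"LEO (340 mi): {orbital_period_hours(340) * 60:.0f} minutes per orbit")
```

The GEO orbit works out to about 24 hours, which is why the satellite appears parked in the sky, while the LEO orbit takes roughly an hour and a half, so any one satellite is overhead for only a few minutes.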

While GEO Satellite Internet is widely available and may seem ideal for many areas, particularly isolated rural areas with a good view of the southern sky, in practice it hasn’t been all that popular. Complaints include throttled download speeds once monthly data caps are exceeded, unreliable signal in bad weather (snow and/or heavy rain), and high monthly subscription costs. For the most part, these shortcomings are a direct result of limitations imposed by physical laws governing the transmission of data, as well as the capacity of the satellite to handle subscriber demand for the service needed to run new internet applications. One of the reasons LEO Satellite Internet has received so much attention is that it may be able to work around some, but probably not all, of these limitations.

A Little Science

Understanding the physical laws that apply to the internet and working to minimize those limitations is the domain of scientists and engineers. Trying to talk their language to provide even a high-level overview of the challenges they face can quickly result in a bunch of “technical gobble-de-goop.” However, others have developed some analogies that can help illustrate these basic concepts, and even nontechnically trained folks like me usually can follow them. Analogies like these do not provide answers on how best to “close the digital divide.” However, they can help in understanding the issues and challenges associated with different types of internet infrastructure, and they may help all of us, and especially our public officials, make more informed choices of the technologies most appropriate for publicly funded investment.

How Big is the Pipe?

The role of all internet infrastructure is to transport “data” from one physical location to another. Data is a series of 1’s and 0’s arranged in a specific sequence. An internet-connected device can decode this sequence and convert it into usable information. Things like email, texts, video calls or movies, audio recordings, and large computer files (even this article) all can be converted into data, transmitted through the internet to another location, and then converted back into a usable format. Those wanting to learn a little more about how the internet works, and how it came into existence, might enjoy this narrated presentation.

For purposes of understanding how different types of internet infrastructure work, it is useful to think of data moving through internet infrastructure as similar to water flowing through a pipe or a tube at a constant speed. Just as the amount of water that theoretically can move through a pipe in a given period of time will increase or decrease in relation to the diameter of the pipe that carries it, the amount of data that can be transferred through internet infrastructure will vary depending on the type of infrastructure used.

In other words, in a given period of time, a pipe that is 6 inches in diameter won’t transport as much water as a pipe that is 12 inches in diameter. The same concept holds true for data moving through the internet. Certain technologies can be engineered to carry more data (more information) each second than others. This theoretical capacity to transfer data is called “bandwidth.” When we talk about internet service with download speeds of up to 25 Megabits per second, we really are saying that theoretically the technology being used to connect to the internet has the bandwidth – the capacity to move – 25 million “bits” of data (a megabit) through the internet each second. Different technologies (both wired and wireless) have different theoretical capacities to move data – different “bandwidths.”
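To make megabits per second concrete, here is a quick back-of-the-envelope sketch in Python (my own illustration, not from the article; the 250-megabyte file is an arbitrary example) converting a file size into the time needed to move it at different connection speeds. Note that file sizes are usually quoted in bytes, and each byte is 8 bits.

```python
def transfer_time_seconds(file_megabytes: float, speed_mbps: float) -> float:
    """Ideal time to move a file at a given bandwidth (1 byte = 8 bits)."""
    file_megabits = file_megabytes * 8
    return file_megabits / speed_mbps

# A hypothetical 250 MB video file at three different connection speeds.
for speed_mbps in (25, 100, 1000):
    seconds = transfer_time_seconds(250, speed_mbps)
    print(f"{speed_mbps:>5} Mbps: {seconds:6.1f} seconds")
```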

Which technology has the highest theoretical bandwidth (the pipe with the greatest diameter)? Currently, first prize goes to fiber-optic cable. Paired with the right equipment, fiber-optic cable now carries data at rates measured in the trillions of bits (Terabits) each second. To put that in perspective, a “trillion” is equal to a million millions!

Wireless internet infrastructure technologies such as satellite internet move data by transmitting an electromagnetic signal (similar to that used for TV or radio signals) through the air (or through outer space). Wireless internet can achieve a very high “bandwidth” as well, measured in the billions of bits (or Gigabits) per second (a thousand-million bits per second).

Of course, wireless signals do not use a physical wire or cable of any type. But the “pipe” analogy can still apply once you understand that signals of different frequencies have different “theoretical bandwidths.” In general, the higher the frequency of the signal, the more information it can transmit each second: the higher the theoretical bandwidth, the wider the diameter of the pipe.
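For readers who want the textbook version of the “wider pipe” idea, the Shannon-Hartley theorem caps the error-free data rate of any channel at C = B·log2(1 + S/N), where B is the channel width in hertz and S/N is the signal-to-noise ratio; higher carrier frequencies matter largely because they make wider channels practical. A minimal sketch (the channel widths and the 10 dB signal-to-noise figure are illustrative assumptions, not the specifications of any actual provider):

```python
import math

def shannon_capacity_mbps(channel_width_mhz: float, snr_db: float) -> float:
    """Shannon-Hartley ceiling: C = B * log2(1 + S/N), returned in Mbps."""
    snr_linear = 10 ** (snr_db / 10)  # convert decibels to a plain ratio
    bits_per_sec = channel_width_mhz * 1e6 * math.log2(1 + snr_linear)
    return bits_per_sec / 1e6

# Wider channels (easier to find at higher carrier frequencies) carry more.
for width_mhz in (20, 250, 2000):
    ceiling = shannon_capacity_mbps(width_mhz, snr_db=10)
    print(f"{width_mhz:>5} MHz channel at 10 dB SNR: {ceiling:8,.0f} Mbps ceiling")
```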

If the transmitted signal has a high enough frequency, if the sender and receiver are within range and have large enough antennas, and if there is an unobstructed line-of-sight between sender and receiver, wireless internet technologies theoretically can transfer more than enough data to meet the requirements of current household internet applications. This is important, because it usually is much cheaper, at least initially, to install wireless internet networks that transmit data through the air than it is to bury or hang fiber-optic cable or other types of wired internet infrastructure.

Theoretical Capacity and Practical Capacity

Leaky Pipes

You may be wondering why I keep referring to the “theoretical” capacity of a wired or wireless internet connection to transfer data. The pipe analogy can help here as well. It may not have occurred to you, but the diameter of the pipe is not the only thing that determines how much water a pipe can carry. Why? Well, the pipe might have a leak or two, and if the holes are large enough, or there are too many of them, you could end up losing quite a bit of the water.

Much the same holds true for internet infrastructure. It turns out that if you are moving data through the air (or outer space) wirelessly, as you attempt to “increase the diameter of the pipe” (by increasing signal frequency), your “pipe” tends to get much leakier. You tend to lose more and more data the higher the frequency used to transmit the signal carrying the data. Of course, there is no pipe to leak. But data is lost because when signal frequency is raised to the levels needed to transmit at bandwidths measured in the billions of bits per second, the signal cannot penetrate solid objects, and even snow or heavy rain will degrade and, in some cases, interrupt the signal.
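Engineers put rough numbers on this “leakier pipe” effect with the free-space path loss formula, FSPL(dB) = 20·log10(d) + 20·log10(f) + 92.45, with distance d in kilometers and frequency f in gigahertz; rain, snow, and walls add further loss on top of it. A sketch of the idea (the 550 km path and the three frequencies are my own illustrative choices, not the parameters of any particular satellite system):

```python
import math

def free_space_path_loss_db(distance_km: float, frequency_ghz: float) -> float:
    """Free-space path loss between isotropic antennas, in decibels."""
    return (20 * math.log10(distance_km)
            + 20 * math.log10(frequency_ghz)
            + 92.45)

# The same satellite-to-ground path gets "leakier" at higher frequencies.
for f_ghz in (2, 12, 30):
    loss_db = free_space_path_loss_db(550, f_ghz)
    print(f"{f_ghz:>3} GHz over 550 km: {loss_db:.0f} dB of path loss")
```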

Scientists and engineers have found many ways to compensate for this “leaky pipe” problem, such as increasing the size and efficiency of the antenna or using special techniques to improve the efficiency of data transmission, but it is not possible for GEO Satellite Internet, LEO Satellite Internet, or any earth-based wireless technology to entirely overcome the issue. Building walls, tree leaves, heavy snow, and strong rains all will either disrupt the signal entirely or degrade and reduce the actual amount of information (data) transferred and received each second.

This same “data leaking” issue exists for “wired” internet infrastructure. For example, copper Ethernet cables can potentially carry up to 10 Gigabits of data per second, but only over a distance of 200 feet or less. After that, much of the data is lost (bandwidth is reduced) and eventually the connection is disrupted entirely. Coaxial cable can move data further without significant “leakage.” Again, however, the technology that has the least amount of “leakage” of data over longer distances is fiber-optic cable. It can transmit data without significant signal loss for distances up to 50 miles without refreshing and retransmission.      

GEO Satellite Internet — Too Long of a Run of Pipe …   

Taking the pipe example one step further, there are other physical laws that particularly impact the usefulness of GEO Satellite Internet. If water is flowing through a pipe, it’s obviously going to take some time to get from one end to the other and, of course, the longer the run of pipe, the longer it will take. The same principle applies for data moving through the internet, but it’s not noticeable most of the time because, unlike water making its way through your garden hose, bits of data move through copper wire, fiber-optic cable, the air, and outer space much faster: up to 186,000 miles each second, the speed of light.

Nevertheless, it does take some time. This delay is measured in intervals of one-thousandth of a second (milliseconds or “ms”), and the technical term used to describe the interval is “latency.” Latency normally is tested by sending data from one computer to a remote server and back again through the internet network. This is sometimes called “pinging a server,” and the resulting “ping” is the recorded interval for data to complete the round trip. If you’d like to try this yourself, the Missouri Broadband Resource Rail has an internet speed test you can use to measure your connection’s bandwidth (uploading and downloading data, measured in “Mbps”) and signal latency (measured in “ms”).
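If you’d rather measure latency yourself in code, a rough stand-in for a ping is to time how long a TCP connection to a remote server takes to open, as in this Python sketch (the host is an arbitrary example, and a true ICMP ping requires raw sockets or the operating system’s ping utility):

```python
import socket
import time

def rough_rtt_ms(host: str, port: int = 443, samples: int = 5) -> float:
    """Approximate round-trip latency by timing TCP handshakes, in ms."""
    timings = []
    for _ in range(samples):
        start = time.perf_counter()
        with socket.create_connection((host, port), timeout=5):
            pass  # opening the connection costs roughly one round trip
        timings.append((time.perf_counter() - start) * 1000)
    return min(timings)  # the fastest sample is closest to the true RTT

print(f"Approximate round-trip time: {rough_rtt_ms('example.com'):.0f} ms")
```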

High latency isn’t that big a deal if you are sending or receiving email, watching a movie over the internet, or otherwise doing something that doesn’t require real-time two-way communication. But if you are playing an interactive video game, conducting a video conference call, or managing internet-connected devices remotely, low latency becomes much more important. Most recently, Congress has limited grant funding for internet infrastructure to those technologies capable of latency below 100 milliseconds (1/10th of a second).

Significant latency has been a major drawback for GEO Satellite Internet, and it’s one that simply cannot be overcome through engineering. Moving data from a computer in Moberly, Missouri to St. Louis and back again using GEO Satellite Internet is going to take a minimum of 480 milliseconds (approximately half a second) because of the distance involved (at least 22,236 miles, traveled 4 times). Of course, latency will be longer than this, because in practice the signal will not travel a direct route between two computers up to outer space and back, but instead will be routed through earth-based network infrastructure, and it will take additional time to navigate these switches and relays on earth.

Latency is not nearly as much of a concern with LEO Satellite Internet, simply because the signal need only travel a few hundred miles, up to the satellite and back. Of course, that still results in some signal delay, but early reports from SpaceX’s Starlink show that LEO Satellite Internet can achieve latency well below 100 milliseconds.
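Both of those round-trip floors are easy to reproduce: a request and its response must each travel up to the satellite and back down, so the minimum latency is four times the orbital altitude divided by the speed of light. A quick check in Python (again, the 340-mile LEO altitude is an assumed example):

```python
SPEED_OF_LIGHT_MILES_PER_SEC = 186_000

def minimum_latency_ms(altitude_miles: float) -> float:
    """Physical floor on round-trip latency: four satellite legs at light speed."""
    return altitude_miles * 4 / SPEED_OF_LIGHT_MILES_PER_SEC * 1000

print(f"GEO (22,236 mi): {minimum_latency_ms(22_236):.0f} ms minimum")  # just under 480 ms
print(f"LEO (340 mi): {minimum_latency_ms(340):.1f} ms minimum")        # a few ms
```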

How Well Does LEO Satellite Internet Work?

To date, SpaceX’s Starlink is the only operational LEO Satellite Internet service designed for and available, even on a limited basis, to individual household users. The service is limited to certain specific locations with adequate satellite coverage, but more are being added, and it is already available in some areas of Missouri. The latest speed and performance tests for early adopters of Starlink’s better-than-nothing “beta” service have been positive. The latest test data compiled by Speedtest confirms that the connection is far better than GEO Satellite Internet, but it also seems to illustrate the challenges that must be overcome if LEO Satellite Internet is to play more than a limited role in closing the digital divide.

The Speedtest results from July-September 2021 showed that, on average, connection speeds for LEO Satellite Internet were more than 4 times faster than those achieved by GEO Satellite Internet and over 3 times faster than the minimum standard for broadband service set by the FCC in 2015 (25 Mbps download). However, the service was still 35% slower than the average for other fixed wired connections tested and, potentially more troubling, the average connection speed declined by nearly 10 Mbps (from 97 Mbps to 87 Mbps) from the results reported by subscribers earlier in 2021.

In rural communities that currently lack any internet access other than GEO Satellite Internet or a first-generation DSL connection, the “better than nothing” service offered by Starlink is much faster than the alternatives, but based on the latest tests, it falls below the levels Congress requires to qualify for grant funding. The Speedtest article speculates that the decline in service experienced over the course of 2021 may be the result of an increase in the number of Starlink subscribers: too many subscribers all trying to access the satellite network at the same time.

Why would more customers result in a slower connection? Resorting once again to the “water flowing through a pipe” example, just as a pipe can handle only a finite amount of water passing through it each second, internet infrastructure can transfer only a limited amount of data each second. When too many internet-connected devices in too many homes, schools, and businesses are all trying to access the internet at the same time, there are two choices: either the network stops working (it crashes), or the individual users, on average, see a decline in bandwidth and/or increased latency for their connection.
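A toy model makes the trade-off plain: divide a fixed capacity evenly among however many users are active at once. Every number below is invented for illustration, and real networks rely on scheduling and statistical multiplexing that this sketch ignores:

```python
def per_user_mbps(shared_capacity_gbps: float, active_users: int) -> float:
    """Evenly split a link's capacity among simultaneously active users."""
    return shared_capacity_gbps * 1000 / active_users

# A hypothetical satellite beam with 20 Gbps serving one geographic area.
for users in (100, 500, 2000):
    print(f"{users:>5} active users: {per_user_mbps(20, users):6.1f} Mbps each")
```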

To prevent inadequate capacity from becoming a problem as the number of subscribers expands or their average use of the internet increases, an LEO Satellite Internet provider will need to reduce or limit the number of subscribers it serves, launch more satellites, upgrade and replace its satellites with models that can transfer more data (have a higher bandwidth), or perhaps use a combination of all three of these approaches. How many satellites might be needed to complete an LEO Satellite network that can serve consumers in all parts of the United States? As of late 2021, Starlink had fewer than 2,000 satellites operating in orbit. As of November 2021, it reported that it had 140,000 customers in 20 countries around the world. The FCC has granted SpaceX a license to operate up to 4,808 satellites. However, SpaceX recently said that to reach its desired network capacity, serving customers across the United States and around the world with a network designed to provide capacity of 1 to 10 Gigabits per second, it needed a license from the FCC to operate nearly 30,000 satellites (more than 15 times the number it currently has on station). The fate of that request is uncertain, as questions and objections have been raised both to the location of satellite orbits and to the risk of interference with other wireless communications.

Additionally, LEO satellites cannot operate indefinitely. Each satellite is estimated to have a useful life of approximately 7-10 years. This would seem to suggest that even after getting the full fleet of satellites in orbit, SpaceX would need to continue to launch more than 3,000 satellites a year just to maintain that network. Those costs presumably would need to be covered through monthly subscription fees or by permanent government operating subsidies (or both) in order for the company to earn a reasonable profit.
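That replacement arithmetic is easy to verify: holding a constellation steady means replacing, on average, the entire fleet once per satellite lifetime.

```python
def annual_replacement_launches(fleet_size: int, lifetime_years: float) -> float:
    """Satellites per year needed just to hold a constellation steady."""
    return fleet_size / lifetime_years

# The fleet size and 7-10 year lifetimes are the figures cited in this article.
for lifetime in (7, 10):
    rate = annual_replacement_launches(30_000, lifetime)
    print(f"30,000 satellites with a {lifetime}-year life: {rate:,.0f} per year")
```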

If 30,000 satellites were launched and operating, could LEO Satellite Internet serve the 19 million or more Americans estimated by the FCC to be without adequate high-speed internet? In late 2020, SpaceX received a preliminary grant award from the FCC of over $900 million in exchange for the company’s commitment to make internet service available to at least 643,000 locations in census tracts located in 35 states. The award required reliable delivery of at least 100 Mbps of bandwidth to each location. Competitors and others were skeptical of SpaceX’s ability to meet these requirements within 6 years as required by the terms of the grant, and provided the FCC with a study estimating that even with 12,000 satellites in orbit (the number then planned) the network could not serve this many locations at the required bandwidth level. However, the concern voiced about the specific grant may be academic, as the FCC has questioned whether many of the service locations SpaceX initially was awarded actually qualify for grant funding at all.

A research report commissioned by Congress last year catalogued many of these and other concerns, and concluded: “it is unclear—due to unknown factors such as the ability to reach fiber-like speeds, what the competition landscape may look like, or if LEO satellite broadband service will be affordable—whether the inclusion of LEO satellite broadband providers would help address the digital divide through their participation in federal broadband [grant funding] programs.”

No “One Size Fits All” Solution

So, is LEO Satellite the answer?

No, it isn’t. But in some sense the same really is true of fiber-optic cable, coaxial cable, twisted copper, fixed wireless, and every other infrastructure available to deliver internet service today. There really is no “one size fits all” solution to the high-speed internet access problem, here in Missouri or in any other state. Different technologies, or different “mixes” of technologies, likely will be needed to bridge the digital divide over the next several years. Each technology has strengths and weaknesses that make it most appropriate for some installations and applications but not others. In addition to theoretical and practical capacity (bandwidth) and latency, other important characteristics include engineering difficulties, the cost of installation, ongoing maintenance and operating costs, and the time needed to plan, design, and install the network.

Future-proofing the Internet Infrastructure

Yet perhaps among the more important considerations is the ability to increase network capacity to adapt to future increases in the demand for high-speed internet service. The electrification of rural America nearly a hundred years ago provides a useful analogy here. Electrical service installed in homes by the Rural Electrification Administration would be woefully inadequate to meet the requirements of modern homes and businesses; in most cases the service initially installed did not have the capacity to power one wall outlet in each room of the home. It was, however, adequate for the times, since the modern electrical appliances we now use were not widely available and most had not even been invented. Crucially, the electric power infrastructure that connected the homes and businesses anticipated these future needs and could be upgraded over time to adapt to the increased demand for electric power.

We’ve seen the same pattern of ever-increasing demand today with the development of new internet applications and the proliferation of new internet-connected devices for homes and businesses. Together, these are feeding consumer demand for internet networks capable of delivering higher bandwidth and lower latency. This is reflected in the criteria used by the Federal Communications Commission and other federal funding programs. An internet connection capable of transmitting 1/5 of 1 Megabit of data per second (0.2 Mbps) was considered a “broadband connection” before 2010, when the standard was increased to 4 Mbps (a 20-fold increase), and yet again to 25 Mbps in 2015. Today, even that standard is widely viewed as far too low. Last year Congress set the standard for federal grants to include only those networks offering service of at least 100 Mbps, 500 times what was deemed sufficient only 12 years ago!
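Those fold increases follow directly from the thresholds themselves, as this small sketch shows (the values are the ones cited in this article):

```python
# Federal "broadband" download thresholds cited in this article, in Mbps.
standards_mbps = {"pre-2010": 0.2, "2010": 4, "2015": 25, "2021 grants": 100}

baseline = standards_mbps["pre-2010"]
for era, mbps in standards_mbps.items():
    fold = mbps / baseline
    print(f"{era:>12}: {mbps:>6} Mbps ({fold:,.0f}x the pre-2010 standard)")
```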

The point here is that if the public is going to help fund broadband infrastructure, that infrastructure not only should be able to meet the needs of households and businesses over the 3-4 years that it will realistically take to plan, engineer, fund, and deploy it, it also needs to be able to expand to meet future demand over the next few decades. Certain technologies (fiber-optic cable being the most obvious) clearly have a proven capacity to expand far beyond the needs of any applications now contemplated. Others, such as the standard copper telephone lines used to deliver digital subscriber line (DSL) connections, or GEO Satellite Internet, seem to have hit the limit of our ability to engineer ways around the constraints imposed by their physical properties. While these internet technologies might still work for some applications, they seem clearly unsuited for a long-term, publicly funded investment that needs to have lasting value over several decades. Still others, like LEO Satellite Internet, present closer questions. While the technology seems to hold some promise, the engineering behind wide-scale deployment seems problematic.

However, even taking the concerns and issues associated with LEO Satellite Internet into account, it would be a mistake to count it out as a useful technology. That should be evident from the substantial continued private investment being made by SpaceX, Amazon, OneWeb, and Telesat. Those companies could not raise substantial capital from private investors if there were no realistic market for LEO Satellite Internet service. The technology available and in use today for LEO Satellite Internet (both hardware and software) will continue to improve, and this likely will result in efficiencies and improved performance for earth-based antennas, the LEO satellites, and the rockets used to launch them. LEO Satellite Internet networks may play an important role in expanding earth-based 5G mobile phone and internet service, establishing temporary high-speed internet service in extremely remote locations, providing high-speed internet connections for container ships and cruise ships at sea, and keeping international commercial airline passengers “connected.” Since the market will be world-wide, it’s likely LEO Satellite Internet will be an appropriate technology for some individual consumers in remote parts of the world as well.

That said, wired technologies such as fiber-optic cable also continue to improve, and even though initial installation costs may be high, the cost and complexity of expanding service to meet future demand likely will be far lower, as will long-term operating costs. The point here is that even if LEO Satellite Internet is an appropriate solution for some consumers and applications, that is no reason to minimize the technological constraints that seem to make it inappropriate for wide-scale deployment in communities that need access now, particularly when other existing technologies can deliver superior levels of service today and in the future.